Right now, someone, somewhere, could be giving you a score: a number that estimates how likely you are to break the law, tells police whether they should zero in on you, and could forever change your life and the lives of those around you. "It destroyed everything." And this someone might not be a human being, but a computer. So how do machines try to predict crime, and is it okay to do that?

From DW, this is Techtopia, and in this episode we are investigating how police use technology to predict where crimes will take place and who will commit them. We hear from officials about why they use the systems, and from experts who issue dire warnings: "They do embed a lot of historical discrimination." And we explain how the programs attempt to forecast the future. If that sounds like science fiction to you, well, listen to this:

"I feel like I just got a stamp on my head that says: he's a criminal, watch out."

This is Damien. When he was 14 years old, software flagged him to local police as a future criminal. "I committed a street robbery, and the reason I was put on the list was that I committed another one while on probation. My name is Damien Sardjoe and I live in Amsterdam. I was hanging with the wrong people, but you know, I didn't care. I didn't think of the consequences."

So Damien did break the law, but he says that what happened after his conviction made it hard for him to get back on the right path. "What would have helped me is someone who understood what I was going through. I don't think a computer system is human enough to decide if you are going on the list or not."

That's how it all started. After Damien was convicted for the two robberies, a computer program estimated that he was likely to break the law again. That's why authorities put him on a watch list called Top 600, and they kept him on there after he had served his sentence.

"They actually didn't tell me what it meant. I just got my own sort of detective, and he told me that if I committed any crime from that moment on, his would be the first face I'd see in the morning when they came to pick me up to arrest me. I got ID'd all the time, for no reason. You know, you could be trying your best not to be a criminal, and they would point at the smallest thing you do: what are you doing in this neighborhood? I did feel trapped. I did feel like I was put in a box with the Top 600 logo on it and just put away. I didn't see a way out."

And being on the list wouldn't only have an impact on Damien's life. It would also bring his brother, who had no
criminal record, into conflict with the police. And it would eventually lead his mother to take on the entire system. Confused? No worries, we'll get to the bottom of that. But first, let's start by looking into how the technology works.

At its heart, there's a big promise: to stop crime before it's even committed. This is done by combing through masses of data. Some programs focus on potential crime scenes: if they find links between previous incidents, say that they all occurred at a certain time and place, or even during the same weather conditions, they recommend that police zero in on those locations. Other programs predict what they consider to be potential offenders: they scan the criminal and personal histories of individuals for risk factors, such as who was arrested and how often, or even who dropped out of school, and come up with a list of who's likely to break the law.

But how common is it for police to use those systems? Well, it's actually more common than you might think. Over the last decade, the use of predictive policing has spread all around the world. Police in over 50 countries, from western democracies like France to authoritarian nations like China, now deploy the programs.
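The person-based programs described a moment ago boil down to scoring individuals on risk factors and flagging whoever crosses a threshold. Here is a deliberately simplified sketch of that idea; the factor names, weights, and threshold are invented for illustration and do not come from any real system.

```python
# Toy sketch of a person-based "risk score". All weights and the
# threshold below are hypothetical, not taken from any deployed system.
RISK_WEIGHTS = {
    "prior_arrests": 2.0,           # each recorded arrest adds to the score
    "convictions": 3.0,
    "dropped_out_of_school": 4.0,   # 1 if true, 0 if false
    "known_to_youth_care": 2.5,
}

def risk_score(person: dict) -> float:
    """Weighted sum over whichever risk factors appear in a record."""
    return sum(w * person.get(factor, 0) for factor, w in RISK_WEIGHTS.items())

def flag_for_watchlist(people: list[dict], threshold: float = 10.0) -> list[str]:
    """Return the names of everyone whose score meets the threshold."""
    return [p["name"] for p in people if risk_score(p) >= threshold]

records = [
    {"name": "A", "prior_arrests": 3, "convictions": 2},            # score 12.0
    {"name": "B", "prior_arrests": 1, "dropped_out_of_school": 1},  # score 6.0
]
print(flag_for_watchlist(records))  # only "A" crosses the threshold
```

Note how crude this is: the score only sees what is already in the record, which is exactly why critics worry about what data goes in.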
This has given rise to a multi-billion-dollar industry, with companies from small startups to big corporations working on the programs. But those companies only build the software. The decision to use it is made by policy makers, often together with law enforcement, and we wanted to find out why.

"In 2011, the then-new mayor, Eberhard van der Laan, took office, and in that period of time Amsterdam had to deal with a lot of armed robberies and violence on the streets. That caused great worries to us, and also to the inhabitants of Amsterdam. My name is Herman Bolhaar. I'm the National Rapporteur on Trafficking in Human Beings and Sexual Violence against Children. In 2011, I was a chief prosecutor in Amsterdam. We talked about new ways of dealing with these problems, and then the idea of focusing on the hard core of offenders came up, and the idea of the Top 600 was born. We got together and came to the conclusion that we should do more to analyze what was really going on, and try to see what we could do to be more preventative. Because if you prevent it from happening, then there are no victims and no perpetrators, right?"

Over a decade later, the Top 600 is still up and running, along with a second list called Top 400, which was introduced in 2015.

"Well, the Top 400 is the little brother of the Top 600. My name is Martin Skippers. I'm the chief information officer of the public order and safety department of the city of Amsterdam. Top 600 are our proven criminals; Top 400 are those at high risk of heading in that direction."

To recap: the city of Amsterdam now has two lists, one called Top 600, for those who've already been convicted of crimes, and a second called Top 400, for teenagers deemed likely to become criminals. For example, to get on the Top 600 you would have to have been arrested for high-impact crimes a couple of times in the last couple of years. If the software finds that a person has also been convicted more than once, it recommends putting them on the Top 600. For the Top 400, the software looks at more than just the criminal records of teenagers, checking also if they have a history of domestic abuse, if they have serious problems at school, or if they've ever been on the radar of youth care.

"These are all criteria, say from one to ten, and you have to fit, say, at least seven, and then you will be put on the Top 400 or Top 600 list. The aim is, on the one side, solving a societal problem, a safety problem, making the city safer, but also trying to create a little bit of a better life for a portion of our population that never really had the chances or the opportunities."
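The selection rule Skippers describes is essentially a checklist: a fixed set of criteria, and a person is listed once their record matches a minimum number of them. A minimal sketch of that rule, using invented placeholder criteria and an invented minimum rather than the city's actual list:

```python
# Checklist-style listing rule. The criteria names and the minimum are
# made-up placeholders; the real Top 400/600 criteria are not public here.
TOP400_CRITERIA = [
    "arrested_for_high_impact_crime",
    "repeat_arrests",
    "history_of_domestic_abuse",
    "serious_problems_at_school",
    "known_to_youth_care",
]

def meets_listing_rule(record: set[str], minimum: int = 3) -> bool:
    """True if the record matches at least `minimum` of the criteria."""
    matched = sum(1 for criterion in TOP400_CRITERIA if criterion in record)
    return matched >= minimum

teen = {"repeat_arrests", "serious_problems_at_school", "known_to_youth_care"}
print(meets_listing_rule(teen))  # matches 3 criteria -> True
```

A rule like this is easy to state, which is one reason Amsterdam can publish how listing decisions are made; the harder question, as the rest of this episode shows, is whether the underlying data is fair.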
That's the strategy behind the programs in Amsterdam: first, use technology to identify those most at risk of breaking the law, and then combine strict oversight by police with support from social workers to prevent them from committing crimes. So much for the theory. Now the question is: does it work?

"When Damien was 14, I had heard of the Top 600 list, but it didn't say much to me. I had just read about it, that it would help your son to get back on track. My name is Diana Sardjoe, and I'm the mother of Damien. It didn't work out for us as well as we thought. It destroyed everything, and we're not a family anymore. It destroyed it; it pulled us apart. They put his older brother, who is one year older, on the Top 400 list, just out of, yeah, precaution. Because they say, yeah, if one of the kids is on the 600, that will most likely influence the other kids in the family. It's like a self-fulfilling prophecy, you know? You put him on the list, he's not a criminal yet, but he will get there. So they labeled him, and because of that he thinks: okay, is that what they think of me? Then I will act that way."

What Diana says about her two sons here sums up the core controversy over predictive policing. Ask supporters, and they tell you that the technology can help build a society where crime is nipped in the bud. But opponents say the systems do the opposite, and often target the most vulnerable members of society, pushing them even further to the edges. The programs, for instance, often flag low-income communities or minority neighborhoods as alleged hotspots, prompting the police to patrol those areas more than others. This in turn generates even more data, and sets off a vicious circle of discrimination, flagging these areas over and over again. Similarly, the programs tend to single out low-income people and minorities as potential offenders.

But why is that? Why do computers replicate the kind of biases we know from the analog world?

"Existing police data, whether in New Delhi, the US, Germany, or Australia, embed a lot of historical discrimination. They embed systemic problems such as racism, or casteism, or even sexism. Now we're taking all of this data, which is seemingly objective and correct but in reality quite biased and discriminatory, especially towards minority populations, and we take that as being the ground truth on which the computer learns what patterns exist."
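The vicious circle just described, where more patrols produce more recorded incidents, which in turn attract more patrols, can be illustrated with a toy simulation. Both neighborhoods below have the same underlying crime rate; the only difference is how much historical data is on file. All numbers are invented for illustration.

```python
import random

random.seed(0)
TRUE_CRIME_RATE = 0.1            # identical in both neighborhoods
recorded = {"A": 30, "B": 10}    # "A" starts with more arrests on file
TOTAL_PATROLS = 100

for year in range(10):
    total_recorded = sum(recorded.values())
    for hood in recorded:
        # Patrols are allocated in proportion to past recorded arrests...
        patrols = round(TOTAL_PATROLS * recorded[hood] / total_recorded)
        # ...and more patrols mean more crime gets observed and recorded,
        # even though the true crime rate is the same everywhere.
        observed = sum(random.random() < TRUE_CRIME_RATE for _ in range(patrols))
        recorded[hood] += observed

print(recorded)  # "A" ends up with far more recorded arrests than "B"
```

The data ends up "confirming" that neighborhood A is the hotspot, not because more crime happens there, but because more crime is looked for there.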
"You know, I always learned one thing in school: the computer doesn't make mistakes, people do. It's what you put in it that comes out. And years later, now that I'm older, that somehow came to mind again. I said: yeah, but if they feed the computer wrong data, then we can get a whole thing that isn't right. You need to take care of poverty, good education, and equal chances."

Her criticism is that with this program you are fighting the symptoms rather than the root cause of crime. "Maybe, yeah, sometimes. Every case of the 600 is very different. With some, you can only fight the symptoms; with some, you don't even get to the symptoms. You know, sometimes you just really don't get far at all. And sometimes you get closer, you can get deeper, and you can actually help with the root causes."

So who's right: the city officials, who say that their program helps prevent crime, or Damien's mother Diana, who says that it does the opposite? Well, let's take a look at what the research says. Does predictive policing work?

"We don't know. We don't know. We did a broad literature review of publications about predictive policing, to investigate to what extent we know whether predictive policing works, but also to what extent it results in adverse effects. And what we found is that there is actually very little empirical evidence. It's more ideological than empirically based. The interesting thing is that predictive policing is introduced so widely, as if it were very effective, whereas we argue that there really is no strong basis for this assertion."

But public awareness of the risks of predictive policing is growing, and we've identified two trends in our investigation. One is that after a series of police killings ignited a debate over systemic racism and how to reform policing in the US, police departments from Santa Cruz to Los Angeles abandoned their predictive policing efforts. Which raises the question: could that pushback, that first trend, become a global trend?

"I would want to say yes, but unfortunately my answer is very much a no. In countries that don't have strong data protection legislation, or, you know, where governments are more open to experimenting with technologies on people, we see that predictive policing is actually on the rise."

The second trend we've noticed is that many companies have moved away from forecasting who will commit crimes, to where they will take place. This, they say, makes their systems less prone to discrimination. But are they?

"No, they are just as prone to discrimination. Place-based predictive policing systems essentially take a bunch of data, whether that's historical arrest data, how many people rent in a given neighborhood, or how close you are to drug counseling centers, and use all that input to guess where they think crime will happen. And what's been found time and time again is that it simply sends police to the various places that they've already policed in the past."

So if experts tell us that predictive policing is prone to discrimination, then what does the law say about its use?

"Governments are deploying tools without any sort of law in place to allow individuals that are
affected by those technologies to have appropriate remedies. There needs to be more exposure about what tools are being used, but there also need to be avenues for challenging not only the decisions that are being made by those computers, but also, in certain instances, an opportunity to abandon the use of the technology entirely."

In 2020, Damien's hometown of Amsterdam introduced an online registry listing all algorithms used by its public administration. "We want to show people: this is what we do. So if you're on the Top 600, you can always look up in that register, very simply, what it is that we did with your data and how we came to the conclusion to put you on the list or not."

And speaking of Damien: did he ever get off that list? "I did get off the list. It took me three years."

You broke the law. Weren't you just paying a fair price for that? "Yeah, I think I already paid my price when I got sentenced for, you know, the things I've done. I had my house arrest, my evening curfew, you know, I even had some fines. I think, or I know, that I already learned my lesson from that. I expected my mom to turn her back on me, because, you know, I felt like I was the black sheep. What I did was the worst and could not be forgiven. But she actually stood beside me."

"I was the first mother who stepped out in public, in the newspaper, full name, full face, and told about my experience with the Top 600 list. I was the only one who was speaking up, because the others were afraid to speak up. I think it also helped that I went public that he got off the list after this."

And when Diana realized that there were many other families like hers, she decided to take her fight to the next level and set up a foundation. "I started at my kitchen table, you know. I got money together. In the beginning, nobody would listen to me, but here, after three years, I'm still here, and I'm helping mothers all over the country now. Things are changing. Slowly, but they're changing, step by step."

And yet, police departments around the world continue to use software to predict crimes and who will commit them. Which brings us back to our key question: when is it okay to use predictive policing?

"It has to be clear how the police make certain choices about the allocation of their resources. We need to know what kind of data are being used by these algorithms, what kind of rules are being coded into these algorithms, and we have to ensure that the police do not work in a discriminatory manner."

"Predictive policing should never be used. It has been shown time and time again to be discriminatory. Technology is not always the answer to problems. Sometimes the answer doesn't lie in technology, but maybe in rethinking institutions, in rebuilding institutions, in figuring out where institutions have gone wrong."

Predictive policing, our investigation showed us, remains one of the most controversial technologies of our time. Whether or not to use it is up to lawmakers to decide. But all the experts we talked to agreed that if they do decide to use it, clear rules are needed, and it's time to debate what those rules should look like. Unlike early systems, like the one that flagged Damien, newer ones make it increasingly difficult, or even impossible, to understand how exactly they come up with their predictions. Predictions that, as we've seen, can have life-altering consequences. And when it comes to decisions like that, what do you think: should a machine have the last word?