
Oh Brother, it's Big Brother!

By: Nyles Geiger

Image: lydia_shiningbrightly/Flickr (1)


What is Predictive Policing?

Police feed data into machine learning tools for four main functions(3):

- Allocate police resources efficiently

- Identify individuals deemed likely to commit crimes

- Determine when and where crimes are likely to occur

- Identify patterns in crimes


Pattern recognition

Crimes can be related: they may be carried out by the same person(s) or may share the same modus operandi(1) (modus operandi = a criminal's characteristic method of carrying out a crime). These characteristics can include the means of entry (front door, back door, window), the time of the crime, the type of property (apartment, single-family house), and proximity to other break-ins(1). They are used to help predict when certain criminals (burglars, muggers, kidnappers) will commit their corresponding crimes.


As stated by Innefu, "Machine learning tools can compare various crimes easily and generate a similarity score. These scores can then be used by the software to try and determine if there are common patterns"(3). This is already being implemented by the New York Police Department and has been used to crack cases(2).
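
To make the idea concrete, here is a minimal sketch of how a pairwise similarity score might be computed from the characteristics listed above. The records, feature weights, and cutoffs are invented for illustration; deployed systems are proprietary and far richer.

    import math

    # Hypothetical burglary records built from the characteristics above.
    crime_a = {"entry": "back door", "property": "apartment", "hour": 2,
               "lat": 40.71, "lon": -74.00}
    crime_b = {"entry": "back door", "property": "apartment", "hour": 3,
               "lat": 40.72, "lon": -74.01}

    def similarity(a, b):
        """Return a 0..1 score; higher means more likely the same pattern."""
        score = 0.35 * (a["entry"] == b["entry"])         # same means of entry
        score += 0.25 * (a["property"] == b["property"])  # same property type
        # Time of day: full credit within the hour, fading out by 6 hours.
        dt = min(abs(a["hour"] - b["hour"]), 24 - abs(a["hour"] - b["hour"]))
        score += 0.20 * max(0.0, 1 - dt / 6)
        # Proximity: full credit when co-located, fading out by ~5 km.
        km = math.dist((a["lat"], a["lon"]), (b["lat"], b["lon"])) * 111  # rough degrees-to-km
        score += 0.20 * max(0.0, 1 - km / 5)
        return score

    print(f"similarity = {similarity(crime_a, crime_b):.2f}")  # ~0.90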


Predictive analytics


Predictive analytics is the use of machine learning models to predict where potential crimes may occur.


When the system identifies a trend of a crime being committed in a particular area, police can allocate resources to that area to proactively manage the situation and prevent crimes from occurring(4). These areas are known as "hotspots". A hotspot map consists of areas where a crime is predicted to happen with a certain probability. An example is a heat map, where the "temperature" of a spot indicates how likely a crime is to occur.
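
As a rough sketch of how such a map could be built, the following bins historical incident locations into a coarse grid and ranks cells by their share of past incidents. The coordinates, cell size, and scoring are invented for illustration; deployed systems add time decay, smoothing, and other covariates.

    from collections import Counter

    # Invented historical incident locations (latitude, longitude).
    incidents = [(40.712, -74.006), (40.713, -74.005), (40.712, -74.004),
                 (40.730, -73.990), (40.731, -73.991), (40.750, -73.980)]

    CELL = 0.005  # grid cell size in degrees, roughly 500 m

    def cell_of(lat, lon):
        """Snap a coordinate to its grid cell."""
        return (int(lat / CELL), int(lon / CELL))

    counts = Counter(cell_of(lat, lon) for lat, lon in incidents)

    # A cell's "heat" is its share of all past incidents -- a crude
    # stand-in for the predicted probability of the next incident.
    total = sum(counts.values())
    for cell, n in counts.most_common(3):
        print(f"cell {cell}: {n} incidents, heat {n / total:.2f}")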



Faults


Bias

According to the US Department of Justice, you are twice as likely to be arrested if you are Black than if you are white. In general, minorities are arrested (and convicted) disproportionately for nearly every crime in America(7). These biased records then end up in the datasets used to train the programs that assist police. The result is a system where bad data leads to bad policing, which leads to more bad data.
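
A toy simulation makes the feedback loop concrete. Below, two neighborhoods have identical true crime rates, but one starts with more recorded arrests; patrols follow the records, and the records then drift further from the truth each year. All numbers are invented for illustration.

    # Two neighborhoods with the SAME true crime rate, but neighborhood A
    # starts over-represented in the arrest records.
    true_rate = {"A": 10, "B": 10}   # actual crimes per day: identical
    recorded = {"A": 60, "B": 40}    # historical arrest records: biased

    for year in range(1, 6):
        total = sum(recorded.values())
        shares = {h: recorded[h] / total for h in recorded}
        norm = sum(s ** 2 for s in shares.values())
        for hood in recorded:
            # Squaring the share gives the better-recorded neighborhood a
            # more-than-proportional slice of patrol time.
            patrol = shares[hood] ** 2 / norm
            # A slice of each neighborhood's yearly crime gets recorded,
            # scaled by how much patrol attention it receives.
            recorded[hood] += round(365 * true_rate[hood] * 0.3 * patrol)
        share_a = recorded["A"] / sum(recorded.values())
        print(f"year {year}: A = {share_a:.0%} of recorded crime (truth: 50%)")

Running this, neighborhood A's share of recorded crime climbs from 60% toward roughly 86% over five years, even though both neighborhoods commit crime at exactly the same rate.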


Crime Reports vs. Arrest Data

- “The problem with victim reports is that Black people are more likely to be reported for a crime than white” (Heaven)

- People in communities with high levels of police mistrust are less likely to report crimes

- This leads many to believe that data mining and machine learning (DMML) models should not be used anywhere there is a history of policing prejudice.


Ineffective Measurements

In recent years, especially with the buzz around AI, many companies have thrown their wallets at anyone promising to improve their processes with machine-learning magic, and millions of dollars have been wasted on products that simply don't work. In 2020, Santa Cruz banned the use of predictive models in policing; this came after the LAPD had paid $20 million over the course of nine years to use Palantir’s predictive technology(8).


Along with Santa Cruz, a number of cities, including New Orleans, have banned government use of predictive policing programs. As Jason Williams, a New Orleans councilman, said at the time, “Right now we’re talking about using a flawed technology on our private citizens that we’ve learned is already in place in the City of New Orleans and used against citizens”(9).


Faulty Measurements

One example I found extremely alarming is that of an innocent elderly man who was jailed with no hard evidence. All it took was being accused by a particular algorithm:

"Williams was jailed last August, accused of killing a young man from the neighborhood who asked him for a ride during a night of unrest over police brutality in May. The key evidence against Williams didn’t come from an eyewitness or an informant; it came from a clip of noiseless security video showing a car driving through an intersection, and a loud bang picked up by a network of surveillance microphones. Prosecutors said technology powered by a secret algorithm that analyzed noises detected by the sensors indicated Williams shot and killed the man. The 65-year-old sat behind bars for nearly a year before a judge dismissed the case against him last month at the request of prosecutors, who said they had insufficient evidence.(5)"


Does ML have a place in Policing?



In a perfect world, these algorithms would accurately predict the actions of real people, and the data used to train them would accurately reflect the real world.


In actuality, current models are inaccurate and susceptible to bias. The algorithms in place today produce some usable information, but they are far too inaccurate for police to rely on. Too much bias is fed into the models, harming both the effectiveness of police and the communities overrepresented in the data.


Technology can't solve people problems on its own. The machines we build don't have brains and can't do magic; they simply attempt to relate pieces of information. They don't solve our problems, they advance the systems into which they are born. This means that predictive policing products built in a discriminatory system will only further that discrimination. As Matthew Guariglia of the EFF said, "Technology Can’t Predict Crime, It Can Only Weaponize Proximity to Policing"(8).






Sources:









