I've spent about three years researching algorithmic crime mapping, predictive policing, etc., and wrote my MS thesis on the topic. So I'm going to lay out some things I learned in my research that I think are relevant as many call for police accountability and reform (1/)
Police both enable and are enabled by our capitalist power structure. In addition to protecting the property of the wealthy, the criminal 'justice' system allows politicians to funnel unbelievable amounts of money to companies (tech, arms, vehicles) with little oversight.
These techs don't work; they reinforce racial biases at every level. Ex: ProPublica's analysis showed COMPAS falsely flags black defendants as high risk at nearly twice the rate of white defendants. But trying to get them removed through the court system doesn't work. COMPAS is still in use even after Loomis v. Wisconsin.
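The disparity behind that finding is a gap in false positive rates between groups. A minimal sketch of how that metric is measured, using made-up numbers (not real COMPAS data) chosen to mirror the rough 2:1 ratio:

```python
# Hypothetical illustration -- NOT real COMPAS data. The ProPublica
# critique centers on false positive rates: defendants flagged
# "high risk" who did NOT go on to reoffend.

def false_positive_rate(records):
    """records: list of (flagged_high_risk, reoffended) booleans."""
    false_pos = sum(1 for flagged, reoffended in records
                    if flagged and not reoffended)
    actual_neg = sum(1 for _, reoffended in records if not reoffended)
    return false_pos / actual_neg

# Invented data: both groups have the same reoffense rate,
# but group A gets flagged far more often when innocent.
group_a = ([(True, False)] * 45 + [(False, False)] * 55 +
           [(True, True)] * 50 + [(False, True)] * 50)
group_b = ([(True, False)] * 23 + [(False, False)] * 77 +
           [(True, True)] * 50 + [(False, True)] * 50)

print(false_positive_rate(group_a))  # 0.45
print(false_positive_rate(group_b))  # 0.23
```

Same underlying behavior, nearly double the rate of being wrongly labeled high risk. That's the shape of the disparity, independent of what's inside the model.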
Algorithmic models identify hotspots & lead to more police being in an area. When cops know they're in a hotspot, they're more likely to take biased action against anyone they see. Because of racial police bias, these encounters can easily escalate to the atrocities we are seeing so often.
It goes beyond poor (black-box) algorithms and a lack of user training: the data is inherently biased. For as long as police have existed in America, they've unfairly targeted black individuals and neighborhoods. This is reflected in the data used by these models and propagates up through every layer built on it.
Naturally, more crime is recorded in an area when cops are sitting around looking for, and expecting to find, crime, as is the case in 'hotspots.' This creates a self-reinforcing data loop. The data only gets worse, and it started out bad.
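A toy simulation of that loop (all numbers invented, deliberately oversimplified): two districts with identical true crime, where the only difference is a biased starting count.

```python
# Toy model -- every number here is invented for illustration.
# Two districts with IDENTICAL true crime, but district 0 starts
# with more recorded incidents due to historical over-policing.
true_crime = [10, 10]   # actual incidents per period, same in both
recorded = [30, 10]     # biased historical counts seed the model

for period in range(20):
    # "hotspot" policy: send the patrol wherever records are highest
    hotspot = 0 if recorded[0] >= recorded[1] else 1
    # crime only enters the data where police are looking for it
    recorded[hotspot] += true_crime[hotspot]

print(recorded)  # [230, 10]
```

The initial 3:1 disparity in the records becomes 23:1, even though nothing about actual crime differs between the districts. The model never gets a chance to learn it was wrong, because it controls where the next round of data comes from.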
Fixing this isn't a matter of tech companies making the algorithms work better. You're still using racially biased data. And as long as your data is bad, your model can never be good.
People at every level know these tools don& #39;t work or have embraced ignorance. Predictive policing practices let cops shift accountability & line corporate pockets, which keeps politicians who sign these contracts happy.
Throughout my time working on predictive policing, I've brought up the data pitfall a lot, and the curious ask what I would suggest to fix it. I reply, perhaps jokingly, that the only way to fix it is a hard reset, a hard stop on police. But it's not a joke. I'm dead serious.
Crime data has always been biased. Because the very concept of crime and criminality has *always* been biased in America. No matter how much you try to clean it, future data will always carry the scars of that bias unless you break the cycle completely.
Defund the police. Defund the tech companies who have sold their souls with no intention of change. Build a brand new system free from the past with an express goal to never repeat it. Anything less, I fear, is a half-measure, doomed to be undone.

(/thread)
You can follow @KatyCantSpell.