1/ Did You Know That The High Policing Rate Of Blacks In The USA Is Partly Driven By An Artificial Intelligence?
#Thread #AI

One of the reasons for the high rate of policing of black communities in certain US cities is algorithmic bias.
2/ Firstly, how do we understand bias in terms of algorithms? Algorithms are just math and code, but they are created by people and trained on data that is itself produced by people, so biases that exist in the real world are mimicked or even exaggerated by AI systems.
3/ So, an algorithm does not become biased on its own. It inherits its bias from its programmers; this is called algorithmic bias. That bias is then amplified by data bias, i.e., bias in the data the algorithm is fed. Now, an algorithm is biased when it produces
4/ outcomes that are systematically less favourable to individuals of a certain social group. For example, say there is an algorithm responsible for screening job applications and recommending the most suitable profile to the employer.
5/ Let us also assume that in that company, 2 employees with Takam as their last name previously left, and another 3 employees named Takam were fired last month for misconduct. Now, if a job applicant bears the name Takam, the algorithm will count this
6/ against them, even though they might be the best candidate for the job. The same mechanism applies to other attributes like race, gender, sexual orientation, or even age, as the sketch below shows.
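To make the mechanism concrete, here is a minimal Python sketch of such a hiring scorer. Everything in it is invented for illustration (the names, the history, the 50/50 scoring rule); it is not any real hiring system.

```python
# Toy sketch: a scorer that blends skill with historical outcomes per last
# name will penalize everyone who shares a name with past leavers.

from collections import defaultdict

# Hypothetical company history: (last_name, stayed_at_company)
history = [
    ("Takam", False), ("Takam", False),                    # 2 resigned
    ("Takam", False), ("Takam", False), ("Takam", False),  # 3 fired
    ("Smith", True), ("Smith", True), ("Ngo", True), ("Diaz", True),
]

# "Training": per-name retention rate learned from the data.
stats = defaultdict(lambda: [0, 0])            # name -> [stayed, total]
for name, stayed in history:
    stats[name][0] += stayed
    stats[name][1] += 1

def score(applicant_name, skill):
    """Blend skill with the historical retention rate for this name."""
    stayed, total = stats.get(applicant_name, [1, 2])  # neutral prior for unseen names
    retention = stayed / total
    return 0.5 * skill + 0.5 * retention

# A Takam applicant with a HIGHER skill score still ranks below a weaker rival.
print(score("Takam", skill=0.9))  # 0.45 -> penalized by surname alone
print(score("Smith", skill=0.6))  # 0.80
```

The surname acts exactly like a protected attribute here: the model never "intends" to discriminate, it simply reproduces the pattern in its training data.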
7/ Coming back to our main topic: in the USA, black communities have historically been policed more than any other community, for whatever reason. Records of this policing have been kept, and this data has been fed to an algorithm to help the police predict crime.
8/ This software is called PredPol (for "predictive policing"), developed by a company of the same name. The company offers predictive policing through a machine learning algorithm built on three data points: crime type, crime location, and crime date/time. As you can see,
9/ from the outset the software will de facto treat black people as having a high disposition to commit crime (because of the historical data it was fed). It will then send police officers to patrol black-dominated communities. During this process,
10/ petty crimes will be recorded and fed back to the algorithm. The point is that, over time, the algorithm will mainly receive data from heavily policed black communities, creating a vicious cycle. The more patrolling there is in black communities, the less there is in
11/ other communities; fewer petty crimes (or crimes in general) get recorded there, and therefore less data from non-black communities is fed into the algorithm. And the cycle continues, as the simulation below illustrates.
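Here is a minimal Python simulation of that cycle. The numbers and the greedy "patrol the hot spot" rule are assumptions made for illustration, not PredPol's actual model; the point is only the feedback structure.

```python
# Two neighbourhoods with the SAME true crime rate; only the historical
# records differ. Patrols go where records point; crime is only recorded
# where patrols go.

import random
random.seed(0)

TRUE_CRIME_RATE = 0.3               # identical in both neighbourhoods
recorded = {"A": 30, "B": 10}       # historical bias: A was policed more

for day in range(365):
    for _ in range(20):             # 20 patrols per day
        # Prediction step: send the patrol where past records point.
        area = max(recorded, key=recorded.get)
        # Feedback step: crime is only recorded where an officer is present.
        if random.random() < TRUE_CRIME_RATE:
            recorded[area] += 1

total = recorded["A"] + recorded["B"]
print({a: f"{n} ({n / total:.0%})" for a, n in recorded.items()})
# -> A ends up with ~99% of all records, despite equal true crime rates.
```

Under this greedy rule, B is never patrolled again, so it never produces a single new record: the initial imbalance decides where every patrol goes, and the data only ever confirms the prediction.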
12/ PredPol claims that no demographic, ethnic, or socioeconomic information is ever used in its algorithm, thus eliminating privacy or civil rights violations. However, this claim rests on the assumption that the data behind the algorithm is also free of demographic, ethnic, or
13/ socioeconomic information, which is highly unlikely: in a segregated city, location alone can stand in for race (see the sketch below). In one related case, machine learning, the Los Angeles Police Department's criminal data, and an outdated gang territory map were used to automate the classification of "gang-related" crimes.
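A small sketch of that proxy problem, with invented numbers: remove the race column entirely, and a "race-blind" risk score keyed on ZIP code still reproduces the disparity, because where housing is segregated the ZIP code stands in for race.

```python
from collections import defaultdict

# Invented history: (zip_code, recorded_arrests). ZIP 90001 was over-patrolled,
# so it has more *recorded* arrests -- not necessarily more crime. No race
# column appears anywhere in the inputs.
history = [("90001", 9), ("90001", 8), ("90001", 10),
           ("90210", 2), ("90210", 3), ("90210", 1)]

totals = defaultdict(lambda: [0, 0])          # zip -> [arrest_sum, count]
for zip_code, arrests in history:
    totals[zip_code][0] += arrests
    totals[zip_code][1] += 1

# "Race-blind" risk score: average recorded arrests per ZIP.
risk = {z: s / n for z, (s, n) in totals.items()}
print(risk)  # {'90001': 9.0, '90210': 2.0}

# Two otherwise identical people from these two ZIPs receive very different
# scores -- the demographic signal was never removed, only renamed.
```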
14/ An algorithm is a product of the social realm that creates it. Algorithmic culture arises from the role algorithms play in sorting, ranking, and hierarchizing, which is itself an expression of human thought and the human condition. Therefore, it is important that
15/ we stay critical of AI systems instead of just accepting that "the computer said so".
You can follow @CedricVilliams.