1 - Some machines use algorithms like training guides to learn how to complete tasks as data comes in over time. #AlgorithmicBias #CodedBias
2 - These machines then use what they learn and make big decisions in people’s lives like:
who gets hired or fired.
who receives proper medical treatment.
who is targeted in police investigations.
3 - Sometimes throughout the process of teaching a machine to make decisions, societal biases can creep in and encode racism, sexism, ableism, or other forms of harmful discrimination.
This is what we call #AlgorithmicBias.


4 - Robert Williams, a 43-year-old Black man from a suburb of Detroit, was wrongfully arrested in front of his daughters and held for 30 hours after facial recognition software misidentified him. #AlgorithmicBias #CodedBias https://www.metrotimes.com/news-hits/archives/2021/04/14/black-man-wrongfully-arrested-based-on-false-facial-recognition-match-sues-detroit-police
5 - Daniel Santos, a teacher in Houston, was FIRED after an "automated assessment tool" failed to count his caring, qualitative work with students, undervaluing his performance and labeling him a "bad teacher." #AlgorithmicBias #CodedBias https://www.chron.com/news/houston-texas/education/article/Houston-ISD-settles-with-union-over-teacher-12267893.php
6 - Some have said #AlgorithmicBias is just a technical problem, and that better data will fix it.
But how systems are used is just as important as how well they work!
7 - The more people know and understand #AlgorithmicBias, the more we can upend the harm being created.
Join us in the fight:
http://bit.ly/ajlnewsletter-signup
