I was tweeting about an entirely different thread with the exact same situation not even a few hours ago.
When folks talk about technological racism and how algorithms are not coded with any thought to black & brown ppl whom they may negatively affect, this is what they mean https://twitter.com/bascule/status/1307440596668182528
Like... not even joking. Here’s the other thread, concerning a black teacher on Zoom, where someone else randomly discovered the same thing when he posted to Twitter. This isn’t a “Twitter only” thing.
You wouldn’t know a black person was involved if you didn’t click the pictures. https://twitter.com/jongraywb/status/1307428697629229056
Reliance on algorithms coded to not recognize black folks as ppl (or single them out) has done nothing but harm. Yet folks think that mountains are being made of molehills when they are not.
Best example: using algorithms for “predictive policing”. https://www.google.com/amp/s/www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/amp/
Like... nobody watched Minority Report and learned the lesson from Tom Cruise jumping around everywhere? Really?
There’s no excuse for these algorithms to bend to this racist a slant unless the people coding them carry their own ingrained biases into the work.
So y’know — maybe they should look at themselves before touting this as the next best thing.
Also, let’s be honest — the next best thing almost always winds up being “the next best thing for folks who want to weaponize it” at this point
Just b/c you can doesn’t mean you should.