Artificial Intelligence can be a bitch.

Here are 6 high-profile projects that failed miserably and made the respective companies look really foolish:

🧵👇
1⃣ Back in 2015, a software engineer reported that Google Photos was classifying his black friends as gorillas.

The algorithm powering the service was unable to properly classify some people of color 🤦!

Here is the story: https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai

👇
2⃣ Back in 2016, Amazon had to scrap its AI recruiting tool after discovering that the system had taught itself that male 👨 candidates were preferable, and penalized every resume that pointed to a female 👩 candidate.

Here is the story: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
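
How does a system "teach itself" something like that? Here's a minimal sketch, with entirely synthetic data and a made-up feature (nothing from Amazon's actual system): train a classifier on historically biased hiring decisions, and it learns a negative weight on a gender-correlated signal all by itself.

```python
# Toy illustration of how biased labels produce a biased model.
# All data is synthetic; "mentions_womens_club" is a made-up feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5_000

years_experience = rng.integers(0, 15, n)
mentions_womens_club = rng.integers(0, 2, n)  # e.g., "women's chess club"

# Biased historical decisions: hiring tracked experience, but resumes
# with the gender-correlated token were often passed over anyway.
hired = (years_experience > 7) & ~((mentions_womens_club == 1)
                                   & (rng.random(n) < 0.7))

X = np.column_stack([years_experience, mentions_womens_club])
model = LogisticRegression().fit(X, hired)

# The model reproduces the historical prejudice: a strongly negative
# weight on the gender-correlated feature.
print(dict(zip(["experience", "womens_club"], model.coef_[0].round(2))))
```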

👇
3⃣ The COMPAS system, used in the US to assess a criminal's likelihood of reoffending, incorrectly labeled black defendants as high-risk far more often than white defendants.

It was a scandal.

Here is the story: https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm
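
The heart of ProPublica's analysis was comparing error rates across groups: who gets wrongly flagged as "high risk"? Here's a minimal sketch of that kind of check. The numbers below are made up; see their article for the real methodology.

```python
# Comparing false positive rates across two groups on synthetic data.
import numpy as np

def false_positive_rate(y_true, y_pred):
    """Share of people who did NOT re-offend but were flagged high-risk."""
    negatives = y_true == 0
    return (y_pred[negatives] == 1).mean()

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 2_000)  # 1 = actually re-offended
group = rng.integers(0, 2, 2_000)   # 0/1 stand in for two demographic groups

# Simulated risk labels that over-flag group 1's non-reoffenders.
flip_up = (group == 1) & (y_true == 0) & (rng.random(2_000) < 0.4)
y_pred = np.where(flip_up, 1, y_true)

for g in (0, 1):
    m = group == g
    print(f"group {g} FPR:", round(false_positive_rate(y_true[m], y_pred[m]), 2))
```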

👇
4⃣ In 2016, Microsoft created a chatbot named Tay, which really quickly turned into a Holocaust-denying racist 👿.

Microsoft had to take the chatbot off Twitter and never brought it back.

Here is the story: https://www.bbc.com/news/technology-35902104
5⃣ In 2018, MIT researchers found that three commercial gender-classification systems (from IBM, Microsoft, and Megvii) were up to 99% accurate for light-skinned men but misclassified up to 35% of dark-skinned women.

Just think about that difference!

Here is the story: https://www.newscientist.com/article/2161028-face-recognition-software-is-perfect-if-youre-a-white-man/
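
The broader lesson: a single aggregate accuracy number can hide enormous per-group gaps, which is exactly why the MIT study evaluated each demographic separately. Here's a hedged sketch of that kind of disaggregated evaluation, on synthetic predictions:

```python
# Why aggregate accuracy hides bias: evaluate per group, not just overall.
# Groups, labels, and error rates below are all synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic test set: 90% group A, 10% group B.
group = np.where(rng.random(10_000) < 0.9, "A", "B")
y_true = rng.integers(0, 2, 10_000)

# Simulate a classifier that is 99% accurate on A but only 65% on B.
correct = np.where(group == "A",
                   rng.random(10_000) < 0.99,
                   rng.random(10_000) < 0.65)
y_pred = np.where(correct, y_true, 1 - y_true)

print("overall:", (y_pred == y_true).mean())  # looks great (~0.96)
for g in ["A", "B"]:
    mask = group == g
    print(f"group {g}:", (y_pred == y_true)[mask].mean())
```
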
6⃣ A few months ago, in May 2020, Microsoft’s MSN came under fire after it mistakenly paired an article on Little Mix singer Jade Thirlwall with a photo of her bandmate Leigh-Anne Pinnock, both of whom are mixed-race.

Here is the story: https://hivelife.com/microsoft-ai-msn-racial-bias/

👇
There's something at play in every one of these cases: "algorithmic bias."

This is a phenomenon that occurs when an algorithm produces results that are systemically prejudiced due to erroneous assumptions in the machine learning process.

Basically: garbage in, garbage out.
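
To make that concrete, here's "garbage in, garbage out" as a tiny runnable experiment (entirely synthetic data, a sketch rather than a recipe): underrepresent a group in the training set and the model fails that group; rebalance the data and the gap largely disappears.

```python
# A model trained on data that barely represents group B works fine for
# group A and badly for B; rebalancing narrows the gap (here by
# compromising between the groups -- real fixes need better features too).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

def sample(n_a, n_b):
    """Group A's true boundary is x > -1; group B's is x > +1."""
    g = np.r_[np.zeros(n_a, dtype=int), np.ones(n_b, dtype=int)]
    x = rng.normal(size=g.size)
    y = (x > np.where(g == 0, -1.0, 1.0)).astype(int)
    return x.reshape(-1, 1), y, g

def per_group_accuracy(model, X, y, g):
    pred = model.predict(X)
    return [round((pred == y)[g == k].mean(), 2) for k in (0, 1)]

X_test, y_test, g_test = sample(5_000, 5_000)

# 98% group A in training: the model fits A's boundary, B pays for it.
skewed = LogisticRegression().fit(*sample(9_800, 200)[:2])
print("skewed  :", per_group_accuracy(skewed, X_test, y_test, g_test))

# 50/50 training data: the gap between the groups shrinks dramatically.
balanced = LogisticRegression().fit(*sample(5_000, 5_000)[:2])
print("balanced:", per_group_accuracy(balanced, X_test, y_test, g_test))
```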

👇
Studying ethics and the impact of bias when implementing Artificial Intelligence is paramount to achieving the results we want as a society.

Take a look at this TED Talk that @_jessicaalonso_ shared with me. It's pretty revealing: https://www.youtube.com/watch?v=UG_X_7g63rY&feature=youtu.be

👇
I personally need to do better when thinking about how the solutions I build impact society and how to avoid biases that could undermine their usefulness.

I encourage you to do the same.