New from @JeffHorwitz & me: Facebook spent years studying its role in polarization, according to sources and internal documents. One internal slide laid out the issue like so: “Our algorithms exploit the human brain's attraction to divisiveness.” https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499
The concern predates the election. A 2016 slide showed extremist content thriving in more than one-third of large German political groups. “64% of all extremist group joins are due to our recommendation tools.” “Our recommendation systems grow the problem.”
The “common ground” team at Facebook wasn’t trying to change people’s minds. They were trying to stop the vilification of the other side. And they came up with different product ideas to tone down the rhetoric on Facebook.
But top executives shelved the effort. Policy exec Joel Kaplan considered the initiative “paternalistic,” per sources. The proposed changes also disproportionately affected right-leaning pages & risked slowing growth. More here: https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499
Kaplan spoke on the record for this story & defended his vetting process (code: Eat Your Veggies). For more background, here’s our profile of Kaplan from Dec. 2018. https://twitter.com/dseetharaman/status/1076857085608902657