New:
We scraped 138 mostly private groups on Facebook, then ran basic search terms across our database of 2.6 million posts/comments. Our analysis shows that Facebook somehow still fails at finding pretty low-level, straightforward racist slurs (1/4) https://br.de/hassmaschine
We also trained a machine-learning algorithm (developed by Facebook) on, among other things, Nazi imagery. Loads of hits. Given that this was a random selection of groups and an "easy to find" approach, it shows how big a problem this still seems to be (2/4)
Our investigation builds on reporting by @dseetharaman & @JeffHorwitz. Groups play a huge role for Facebook; its algorithms were tweaked to favor their visibility. Our source told us he joined almost all of these groups at the suggestion of Facebook's recommendation algorithm https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499 (3/4)
One of the main problems: If groups consist of thousands of people who think torturing Muslims is fine, and if admins post Hitler memes, posts won't get flagged. Dissent gets squashed. As Facebook prioritizes groups, some of them seem to be a hotbed of radicalization. (4/4)