We talked about this in the Rabbit Hole QAnon episode, but it's still wild to me that part of what fueled the rise of QAnon was Facebook trying to fix its misinformation problem after 2016 by boosting groups with "meaningful social interaction" over pages. https://www.nytimes.com/2020/05/28/podcasts/rabbit-hole-qanon-conspiracy-theory-virus.html
So many problems of social media platforms are related to their inability to imagine what might become "meaningful" or "relevant" in users' lives, under the right political and social conditions. It's a failure of empathy, encoded in algorithms and distributed at scale.
Was just reminded of this column from July 2017 (three months before the first Q post) wondering aloud whether an outcome of Facebook's Groups push would be "sorting like-minded people into closed echo chambers and sheltering them from divergent views."

https://www.nytimes.com/2017/07/16/business/behind-the-velvet-ropes-of-facebooks-private-groups.html