Alright, because the documentary REALLY did not get into what the Facebook emotional contagion experiment was, I'm going to do a quick rundown of it and some broader issues with it.

If you want the academic version, here's my article: https://ctlj.colorado.edu/wp-content/uploads/2016/06/v2.final-Schroeder-4.14.16-JRD.pdf
So it's called the Facebook emotional contagion experiment because TECHNICALLY it was run on Facebook, by Facebook (this ignores that it was designed by others, but we'll get to that). Essentially, it was testing users to see if the types of posts they saw affected their emotions.
The only qualifications to be in the experiment were 1) you used Facebook in English and 2) you had logged on in the week before the experiment.

They ran this on almost 700,000 users and did not filter for state, country, age, etc.
(Also, people in the experiment were not informed, did not consent, and were NEVER told whether they were in it or not. SUPER COOL.)
People in the "positive" group had their News Feeds adjusted so that posts with negative keywords (bad, nasty, hate, sad, etc.) did not appear. The reverse applied to the "negative" group (posts with positive keywords removed). There was also a neutral control group.
Facebook then tracked each subject's engagement and posting frequency, along with whether the subject's own posts used more negative or positive keywords, to see whether people were affected by the altered content.
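For the curious, here's a minimal sketch of those two steps in Python. The keyword lists, function names, and all-or-nothing removal are my own simplifications (the actual study classified words with the LIWC word lists and, as I understand it, omitted matching posts probabilistically rather than always), and obviously none of this is Facebook's real code.

```python
# Hypothetical sketch of the experiment's two moving parts:
# 1) filtering a feed by keyword sentiment, and
# 2) scoring a subject's own posts afterwards.
# Keyword lists and logic are illustrative simplifications only.

NEGATIVE_WORDS = {"bad", "nasty", "hate", "sad"}
POSITIVE_WORDS = {"good", "great", "love", "happy"}


def contains_any(post, words):
    """True if the post contains any of the given keywords."""
    tokens = {t.strip(".,!?").lower() for t in post.split()}
    return bool(tokens & words)


def filter_feed(posts, condition):
    """Drop negative-keyword posts for the 'positive' condition,
    positive-keyword posts for the 'negative' condition,
    and leave the control condition untouched."""
    if condition == "positive":
        return [p for p in posts if not contains_any(p, NEGATIVE_WORDS)]
    if condition == "negative":
        return [p for p in posts if not contains_any(p, POSITIVE_WORDS)]
    return list(posts)  # control group sees the feed unchanged


def score_subject_posts(posts):
    """Outcome measure: how much the subject posted at all, and how often
    their own posts used positive vs. negative keywords."""
    return {
        "num_posts": len(posts),
        "positive": sum(contains_any(p, POSITIVE_WORDS) for p in posts),
        "negative": sum(contains_any(p, NEGATIVE_WORDS) for p in posts),
    }
```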
Does that seem manipulative or invasive? Unethical? Deeply irresponsible in case they were negatively manipulating, say, kids or people with mental health issues?

Hold on.
So the experiment was actually designed by university researchers from Cornell University and the University of California, San Francisco. They designed the experiment, handed it off to Facebook to execute, and then analyzed and published the results.
Because Facebook performed the actual algorithmic manipulation and data collection, the researchers claimed they hadn't "performed" the experiment (despite DESIGNING IT and PUBLISHING THE RESULTS).
This is key because academic human subject research has requirements - namely, the Common Rule and Institutional Review Boards (IRBs). IRBs review proposed experiments to identify risks, weigh them against the expected benefits, and mandate safeguards or changes.
Often this means requiring some form of notice and consent from subjects, adjusting the subject pool to exclude high-risk populations, or sometimes fully nixing an experiment if it can't be done safely or ethically.
But because Facebook did the actual subject manipulation and data collection, the researchers claimed it didn't have to go through an IRB because it wasn't THEIR experiment and Facebook isn't required to meet the same standards.

This loophole is known as IRB laundering.
Other arguments Facebook made to defend the experiment were:
-this was A/B testing - just normal product testing, def not to purposely mess with people (no)
-this was to address allegations that Facebook made people depressed and actually promoted user safety (nope)
Also, this probably violated the EU-US Safe Harbor framework (in place at the time) and child data protection laws. And, despite this coming up in arguments later, data use for research wasn't added to Facebook's Terms of Use until four months after the experiment took place.
Basically, this whole thing highlighted a massive gap in protection for people subject to experiments - private companies just do not have the same requirements for experimenting on human subjects.
Frankly, I would be happy if all human subject experiments, by any individuals or companies, had to meet Common Rule requirements. I'm still not sure why we don't just do this (beyond companies arguing that it would be complicated and also they don't want to).
Also, the actual results of the experiment showed what is to me (I am not a scientist and not trained in reading these things) a nearly imperceptible increase in negativity in the negative group and in positivity in the positive group.
Genuinely, the effects are so small that, even if they count as statistically significant with a sample that large, I don't know whether they mean anything in practice. The study is still out there, if anyone more in this area wants to check: http://www.pnas.org/content/111/24/8788.full.pdf
Btw, Facebook engages in lots of research and publishes a good chunk of it publicly, if you ever want to see the range of stuff they get into: https://research.fb.com/ 

To be clear, I'm very glad they are transparent about it, even if there are other major issues.