I had the privilege of deposing before the Delhi Government's Committee on Peace & Harmony this week, on Facebook's role in the Delhi riots and allegations of political bias. What I said:

1. My recommendations:
a. Allegations that Facebook failed to take down hate speech SHOULD be investigated by an independent agency (1/n)
b. Central/State governments may consider a law regulating large social networks, in a manner that doesn't harm Intermediary Liability protections for others. We need a middle path.
c. We need transparency from FB about decision making (2/n)
regarding the implementation of community standards, the role of local teams, and access to information. This needs to be open to public scrutiny.
d. FB has to improve the implementation of its community standards. Given their scale (and that of YouTube & Twitter), this needs to be expected of them (3/n)
2. FB is powerful: FB has the ABILITY to benefit a political party. We have anecdotes, not statistical evidence. We are dependent on the benevolence of platforms not to harm democracy, or enable other governments to do so. (4/n)
Large platforms have to be held to a higher level of scrutiny, and to expectations of transparency and neutrality, because of their power.

3. FB has a bias towards power: As ProPublica reported in 2017, FB has some categories of users that it treats as protected, including white men. (5/n)
FB tends to lean towards power, and there's inaction there. We've seen that in Myanmar (where FB had to apologise), in the Philippines, and now allegedly in India. FB serves governments that can block it. The Caravan story suggests that (6/n)
FB censored memes related to PM Modi, protests against government policies, and people who shared these. A bias towards power doesn't mean a bias towards a party. Ankhi Das made a recommendation that was biased towards power and towards reducing the business threat from the government. (7/n)
We need more evidence to indicate a political bias. Every company has a bias towards power: every industrialist gives great ratings to the budget. Ankhi Das will recommend what is best for Facebook as a business.
4. FB needs to do better: Groups with hate speech (8/n)
are still active on FB. FB needs to do better; they have the resources to do so. FB's community standards and their implementation keep changing. There is a clear lack of consistency here. The role of local teams isn't clear. No clarity on how the algorithms operate. (9/n)
No clarity on whether there are India-specific community standards, or how the chain of command works. It's also likely that global teams rely on the understanding of local teams.
FB has a history of "move fast and break things". They act only when there's a major problem. (10/n)
WhatsApp didn't fix things after the Muzaffarnagar riots; it acted only after @rsprasad repeatedly summoned Chris Daniels. FB responds when pulled up, or when the US media highlights things. It did not act on the Myanmar story until the NYT report. (11/n)
5. Moderation requires human intervention: The scale of moderating content is huge, so both algorithms and humans are involved. While we want hate speech taken down, we also don't want censorship of free speech. Algorithms are not great judges of content, (12/n)
and their lack of effectiveness was evident during the pandemic: when human moderators were asked to stay home, algorithms censored legitimate speech. In the past, algorithms censored the Napalm Girl photo. (13/n)
6. The Oversight Board has insufficient power: As per @chinmayiarun, FB's Oversight Board cannot look into the implementation of local laws, only content takedowns. It doesn't have the power to review content that stayed up despite being hate speech. (14/n)
7. Intermediary Liability protections: These protect platforms from liability, but also allow them to impose their own community standards in the interest of their community. They're responsible, but we can't hold them liable. We need to be careful here, (15/n)
because changes in Intermediary Liability could hurt small platforms. We need a middle path between Intermediary Liability protections and their removal.
8. Law enforcement also has a responsibility: If hate speech stayed up and there was incitement to violence, law enforcement also (16/n)
needed to enforce the law. We also need to protect end-to-end encryption, to protect privacy.

Also, I disclosed at the start of the meeting that FB has been a sponsor of many MediaNama events, along with other sponsors. (17/n)
We disclose all sponsors for each event. Sponsors do not have a say in event editorial, MediaNama editorial, or my positions on issues. If they try to, they can leave.
Please watch the deposition here: (after Paranjoy Guha Thakurta's, which ran for around 40-45 minutes)
You can follow @nixxin.