This is a strong, important & timely decision, esp in light of Facebook's enforcement mistakes around #ResignModi yesterday.

There are some really important recommendations here. Facebook has 30 days to respond, and that'll be worth watching.

A few things worth highlighting /1 https://twitter.com/OversightBoard/status/1387685869398732801
The case concerned a video critical of Modi & the BJP. Facebook removed it under its Dangerous Orgs policy. It was a mistake: after the Board selected the case, but before it decided, fb admitted it got it wrong and reversed the decision. /2
This is not the first time the Board has pressed ahead with a case anyway. It's telling Facebook "you can't moot a case by simply reversing your decision when we pick it up. You wouldn't have found the mistake otherwise." /3
This was a case of human error. The video was flagged for terrorism; a human looked at it and removed it. Facebook said the length of the video (17 mins) and the complexity of the content could have contributed to the mistake: reviewers don't always have time to watch videos in full. /4
The user's account was automatically locked, and they did not get a review. Facebook said this was because of reduced appeal capacity during COVID. The Board says "yeah, we get that, unusual times etc. but you really need to get back to full capacity asap" /5
The Board was very conscious of the political context. It expressed concern about mistakes that especially impact minority language speakers or religious minorities, and noted that the political context in India right now "underscores the importance of getting decisions right" /6
The Board asked Facebook about possible communications from Indian authorities; Facebook refused to answer (on the basis that the info was not "reasonably required," or for legal reasons). It's possible fb is legally restricted from responding, but still (!) /7
The Board acknowledges that "mistakes are inevitable when moderating content at scale" but that without knowing error rates it's impossible to tell from one case whether this is a systemic problem or a one-off. /8
The recommendations, then.

1. Fb should translate its Community Standards into Punjabi because, you know, 30 million ppl speak it in India and more around the world.

Uh... INSANE this had to come from the Board, but here we are. Geez, fb. /10
2. Fb should restore human review to pre-pandemic levels asap, while protecting health of staff.

Yes. And (this is me now) fb in its response really should disclose how far off those levels it is, and its timeline and plan for getting back to full capacity. /11
(me still) This has been going on for a while now, and looks set to continue in certain areas for even longer. There's only so long a company with the resources of fb should be able to keep pleading "pandemic" when the "pandemic" also makes adequate CoMo even more important /12
The Board says fb shd do this by "making this info viewable by country and language" & "underscores that more detailed transparency will help the public spot areas where errors are more common, incl potential specific impacts on minority groups, & alert fb to correct them." /14
These recommendations are targeted, strong, and important. They are not binding. How fb responds in 30 days is the most critical part of this process (& also the part ppl tend to pay least attention to). /15
There's a lot of (in many respects, justified!) skepticism of the Board. It can't fix everything. But these recommendations show why I'm still hopeful it can make some meaningful impact. If fb plays ball. /16
Okay, I think that's it from me. The decision is worth reading.

Your move, Facebook. /18
You can follow @evelyndouek.