Today @LEAFNational publishes a report I researched & authored, examining whether & how to apply liability to digital platforms for tech-facilitated gender-based violence, abuse, and harassment (TFGBV): https://www.leaf.ca/publication/deplatforming-misogyny/ Overview thread of key highlights: 1/x
Platformed TFGBV has characteristics distinguishing it from "traditional" gender-based violence & abuse (GBV): it is frictionless, unbounded by physical space, collapses social context across life spheres, is socially gamified, networked, & distributed, & it normalizes + escalates GBV IRL.
Platform companies themselves exacerbate this problem bc their business models, design features ("affordances"), and content moderation policy decisions prioritize "growth" & "engagement" (and appeasing conservative politicians) above protecting historically marginalized groups.
Platforms' content moderation policies & decisions have also been too reactive & selective, responding mainly to public or political pressure, & hypocritically rely on "free speech" rhetoric to defend TFGBV while not applying that conviction to the expression of marginalized users.
Conclusion: No to further relying on industry self-regulation. But platform liability schemes have been rife with sloppy drafting, misunderstanding of both pre-existing law *&* the Internet, & ignorance of the *good* the Internet enables (& has enabled) for historically marginalized groups.
So what to do? First, I looked at the current Canadian landscape. What user wrongdoing do we currently hold platform companies liable for? Copyright infringement (of COURSE 🙄), defamation, and "illicit activity" in Quebec. But there's more!
Platforms could technically be held liable for criminal offences like NCDII (non-consensual distribution of intimate images) or criminal harassment if they meet the elements of the offence or are "party to an offence" (s 22), or under provincial NCDII laws that are silent on this point.
Platform liability is a spectrum: No liability but legal obligations (via courts & statutes) to assist victims / survivors / those impacted <--> Limited up to full liability upon notice + inaction <--> Direct liability if sufficiently involved & user's act is already illegal.
Conclusion: There is nothing in Canadian law that holds digital platform companies liable for TFGBV, specifically. Just a sparse patchwork of laws that could maybe apply to *some* things and haven't yet been tested in court.
What most interests me: Laws that didn't even contemplate digital platforms or TFGBV at all, could be the way forward. Hold platforms *institutionally* liable for systemic harm & discrimination using e.g., human rights statutes, corporate negligence, commercial host liability.
I also examined intermediary liability & platform regulation in USA (CDA 230, FOSTA-SESTA, @daniellecitron / @benjaminwittes proposal, @cagoldberglaw's Herrick v Grindr) UK (Online Harms), EU (E-Commerce, CoC, DSA), Germany (NetzDG), Aus (eSafety, SAVMA), NZ (HDCA, Christchurch).
Lessons: LISTEN TO THOSE IMPACTED (FOSTA-SESTA); good evaluation data is critical (no one knows if NetzDG worked?!); systemic harm is next frontier (UK "super-complaints" for systemic harm, and EU DSA requirement of periodic assessment & mitigation of systemic risk by platforms).
Now for the crux of it all: BUT WHAT ABOUT FREEDOM OF EXPRESSION / THE CONSTITUTION / USER RIGHTS? Something that became clear to me over the course of this project is that, along the way (thanks, copyright lobbyists & law enforcement!), "digital/human rights" came to mean free expression & privacy, period.
Fortunately, the Supreme Court of Canada (SCC) knows, & knew, better. The right to equality is—just like free expression & privacy—a fundamental & constitutional human right guaranteed by the Charter & enshrined in intl human rights law. Marginalized users need all 3 to have any.
The SCC & Federal Court of Appeal (FCA) found criminal & civil laws prohibiting hate speech, incl hate speech online, to be constitutional. This is because upholding equality requires proportionate limitations on free expression. Esp *gestures vaguely at entire Internet & world*
To be clear: PLATFORMS ARE NOT PUBLISHERS OR SPEAKERS. There is no "transitive property" from hate speech laws to platform liability. But the *underlying principles & reasoning* are more relevant & applicable today than ever, esp in platformed TFGBV context.
It is actually amazing (and depressing) how entirely on point decades-old SCC decisions remain in articulating some of the most pressing contemporary problems we have today (Whatcott, Taylor, Keegstra + FCA Lemire). Some excerpts in particular were heartbreakingly prophetic.
What makes hate speech constitutional to outlaw is near-identical, if not actually identical, to the traits & impacts of platformed TFGBV. But not every user necessarily can or should be liable for individual posts; it's the critical mass that requires a legal response, making this an institutional problem.
SCC also emphasized the importance of analyzing rights & limits on those rights *in context*. Platformed TFGBV goes against whole point of free expression and violates its core values (pursuit of truth, indiv self-fulfillment, participation in democracy).
Jane Bailey made the underrated point that, for the historically marginalized, private abuse can be as devastating as state abuse. If you *only* have the state to fear, not other people, that is a privileged position. State action to stem private abuse is appropriate. The SCC said this too.
But state intervention must be done thoughtfully, w/ nuance, driven by intersectional feminist principles & substantive equality. i.e., Not what we've seen Heritage doing!!! (Was segueing to Recs but hit max tweets & launch panel starting!! to be cont'd.) https://www.leaf.ca/news/new-leaf-report-tfgbv/
Okay. I did not know until today that you could hit a max tweet count in one thread!! Here are the Recs as promised—just a few b/c this thread is already so long (cont'd from this tweet: https://twitter.com/Cyn_K/status/1387424496991281154):
1) A legal regime truly addressing TFGBV must focus on that, at most including other systemic oppressions. Do not jeopardize the constitutionality of our protection by 'bundling' TFGBV w/ other, disparate issues (e.g., disinformation, copyright, legacy media funding, terrorism).
Bundling sets the whole framework up for failure (constitutional & practical); at worst, it will further state encroachment on privacy, free expression, & equality from the other direction, & it allows other interests to piggyback on equality to advance industry agendas.
2) We call for a TFGBV-specialized *expert* regulator w/ dual mandate: a) Adjudicative / remedies / enforcement and b) Training / education / research. Incl dispute resolution, audits, orders & AMPs (administrative monetary penalties), supporting frontline community orgs working in (TF)GBV, training police.
3) The "enabler" provision from the Copyright Act should be adapted to apply specifically to purpose-built platforms dedicated to generating & disseminating TFGBV, e.g., NCDII & hate speech-dedicated forums. ***This is not an endorsement of anything in the copyright context***
4) Above all, centre substantive equality & intersectional feminist principles. Must listen to / defer to those impacted by & experts in TFGBV & the nexus of systemic oppression, human rights, equality, AND tech law/policy. My personal test: were you devastated by the tumblr ban y/n
Link to full report w/ Exec Summ & Recs also posted as indiv docs. I have so many more thoughts, incl on writing this report as a full-on "classic" digital rights / Internet freedom advocate, & ACKNOWLEDGEMENTS of so many ppl. But all in here somewhere! ➡️ https://www.leaf.ca/publication/deplatforming-misogyny/