There is a factual error in this story that we need to discuss as researchers.

White supremacists didn’t “learn from Russia.” In fact, US groups have long used the affordances of social media to impersonate activists, journalists, educators, and others. https://twitter.com/vekstra/status/1290615663183171584
For some reason, disinformation researchers keep reiterating this same line that white supremacists have adopted “the Russian playbook” and other such tropes.

That historical inaccuracy makes it very hard to see how domestic actors use platforms’ features to their advantage.
Very few researchers actually study the words and actions of white supremacists. Instead, many glean a research insight and parrot it back to the media without checking whether it’s empirically true or in what limited situations the claim applies.
But framing the history of online disinformation as a “foreign issue” lets US tech companies off the hook for their own responsibility in design.

If we admit that tech platforms were manipulated by US political actors long before 2016, then what “fix” is needed?
We can do better as a field to understand the politics of our own research, lest we allow tech companies and politicians to use our work as a shield for their desired outcomes.
And yes, I know I am annoying about this single point and often bring it up, but it pervades so much of how students and journalists conceptualize social change, so I can’t ignore it.