We are totally unprepared for the effects recommendation-engine-fueled radicalizing memes are having on society.
Back in the early days of reddit, the community noticed that certain types of content could exploit a fast feedback loop, motivating behaviors that would rocket the content to the front page.
"Vote up if your cat is a jerk"

"Test post please ignore"

"Urgent! We need your help now to prove that..."

"Vote up if you're tired of vote up if posts"

These memes win by competing unfairly. They provide extrinsic motivation to share. They prey on curiosity. They spark strong emotions.

It's an old story, preceded by chain letters, chain emails, and their analogues in pretty much every written communication medium that has ever existed.
Think about this through the lens of contagion. There have always been harmful ideas that spread virally, but over the past decade their impact seems to have accelerated to a runaway pace. Why?

What's happening right now is twofold.
First, social media serves as a laboratory for humans to develop and refine catchy ideas. And then they escape.

Experienced denizens of online communities by now have been exposed to a raft of these memes and have developed resistance to them. But the broader internet has not.
So — through chance or on purpose — when a particularly powerful meme leaps from a thoroughly jaded community like 4chan to your extended family's Facebook feed, there's a dramatic increase in potency.
Communities like 4chan and reddit have developed defenses and skepticism towards most kinds of catchy ideas. They do so because they see this stuff all. the. time. It bores them. Everyone's already seen these things because they spread the fastest and farthest.
Many communities even specifically create guidelines to outlaw the most prevalent patterns, such as "vote up if" posts.

Your extended Facebook circle, however? They have built up no such antibodies.
Which leads to the second problem: recommendation algorithms try to predict content that will elicit a measurable reaction from viewers.

What content is the best suited for that? The most potent, most viral, most manipulative memes.
We have built automated systems to identify and further promote the most effective memes to populations that don't have a built-up skepticism of them.

This content is highly contagious, self-perpetuating, and causes harm to its host and others. We are spreading disease.
Society is NOT prepared to deal with this. We don't even have a vocabulary for this kind of problem yet. This is an illness we don't know the name of.

History shows the awful repercussions of large groups of people with no developed resistance being exposed to new contagion.
The people who are most affected don't perceive this as a threat, which makes it particularly difficult to communicate with them about ideas that bend their thought process, their very critical thinking, toward spreading further.
When internet communities grappled with this in the mid 2000s, they ran headlong into the tension inherent in banning kinds of speech. It is incredibly loaded to call these ideas a kind of illness, but I know of no better term that conveys their contagiousness and malignancy.
We have to create a vocabulary for this.

We have to bridge poisoned communication lines and help loved ones understand what the internet is doing to them.

We have to increase awareness and literacy in the general public about — I can't believe I'm saying this — memes.
You can follow @chromakode.