Only a few years ago, Google highly ranked:

- Lots of untrue, anti-vax propaganda
- Inaccurate medical info
- Hyper-partisan political content
- Conspiracy theories about climate change, the Holocaust, flat earth, & more

A thread about how that changed.
Trigger warning: the catalyst was a school shooting.

Specifically, the Sandy Hook elementary school shooting, in which a gunman massacred 20 first-grade students. Six- and seven-year-olds. It's almost too terrifying to comprehend; I choke up even thinking about it.
The awfulness of that event was compounded by the rise of an Internet-spread conspiracy theory.

For those unfamiliar, NY Mag did an in-depth piece on the horrifying hoax, and how Google's search results were the starting point for many of its amplifiers: https://nymag.com/intelligencer/2016/09/the-sandy-hook-hoax.html
Mass shootings like Sandy Hook spurred Google's engineers to change how they weighted:

Popularity signals (links, clicks, engagement, etc)
VS.
Accuracy signals (factual info, trustworthy sources, etc)

Excellent reporting on that here: https://www.theguardian.com/technology/2019/jul/02/google-tweaked-algorithm-after-rise-in-us-shootings
You'll no longer find misinformation or disinformation in searches that previously surfaced a lot of it (e.g. https://www.google.com/search?q=sandy+hook+shooting)
Tragically, in the medical world, Facebook & Google's efforts around this have been too little, too late.

The spread of anti-vax conspiracy content in search results and social media correlates directly (in both time series and geography) with the rise in Western vaccine hesitancy.
Google's trying, and they deserve to be commended for stepping up here. But Pandora's box had already been opened.

I'm hopeful that thanks to these algorithmic shifts toward accurate, truthful content > engagement-optimized conspiracy hoaxes, it'll be harder for future misinfo to spread.
You can see what this has done to sites with medical misinfo like Mercola ( https://twitter.com/CyrusShepard/status/1143412325925265408).

The same is true for many sites whose accuracy and trustworthiness in sectors like medicine, finance, politics, history, etc. are subpar (or intentionally misleading).
A good example is the "disinformation dozen," to whom more than 65% of all vaccine misinformation on Facebook & Twitter can be directly attributed: https://www.cbsnews.com/news/vaccine-disinformation-social-media-center-for-countering-digital-hate-report/

Google has demoted *all* of these sites (best I can tell).
If you're in these sectors, it might pay to ask if your content fits the criteria Google's now requiring.

(NOTE: I'm not in SEO! If interested, you should follow folks like @lilyraynyc, @Marie_Haynes, & @dr_pete who are far more knowledgeable on these topics)
IMO, weighting of these factual / content quality signals happens on a scale and depends on the search query.

In some sectors ("capybara memes") there's not much need for trustworthy sources > high-engagement ones. In other sectors ("are vaccines safe?") it really matters.
(Quick update)

Twitter has already removed two replies to the thread above that made false claims about vaccines (those tweets appeared in my mentions, but showed the "removed" message when clicked).

So, Google's not alone in making progress here!