2:

Using the technical abbreviation for the virus, SARS-CoV-2 (Severe Acute Respiratory Syndrome Coronavirus 2), the study concludes:
3:

“The population prevalence of SARS-CoV-2 antibodies in Santa Clara County implies that the infection is much more widespread than indicated by the number of confirmed cases. Population prevalence estimates can now be used to calibrate epidemic and mortality projections.”
4:

How much more widespread do they think the virus is? In their Abstract’s Results they write:

“These prevalence estimates represent a range between 48,000 and 81,000 people infected in Santa Clara County by early April, 50-85-fold more than the number of confirmed cases.”
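As a quick sanity check on the arithmetic (assuming the fold range comes from simply dividing the estimated infections by the confirmed-case count), these figures imply roughly 950–960 confirmed cases in the county at the time:

```python
# Sanity-check the implied confirmed-case count behind the "50-85-fold" claim:
# estimated infections / fold = confirmed cases.
low_est, high_est = 48_000, 81_000   # estimated infections (from the study)
low_fold, high_fold = 50, 85         # reported fold range

implied_low = high_est / high_fold   # high estimate / high fold
implied_high = low_est / low_fold    # low estimate / low fold

print(f"Implied confirmed cases: {implied_low:.0f}-{implied_high:.0f}")
```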
5:

As Dr. Ioannidis explained in his video interview, this means that a very large number of people who were either asymptomatic or had mild symptoms have already had COVID-19 (which is the technical abbreviation for the disease, Coronavirus Disease 2019, not the virus)...
6:

...and this in turn means that the actual Death Rate from COVID-19 is much, much lower than we can measure based on Confirmed Cases, perhaps very similar to that of the seasonal flu.
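To see why a larger infection count drives the implied fatality rate down, here is a sketch with purely hypothetical death and confirmed-case counts (the 100 and 1,000 figures below are illustrative assumptions, not numbers from the study):

```python
# Illustrative only: the implied infection fatality rate (IFR) falls as the
# estimated number of infections grows, even though deaths stay fixed.
deaths = 100          # hypothetical death count, for illustration
confirmed = 1_000     # hypothetical confirmed cases, for illustration
infections_low, infections_high = 48_000, 81_000  # study's estimated range

cfr = deaths / confirmed             # naive rate based on confirmed cases
ifr_high = deaths / infections_low   # IFR if infections are at the low end
ifr_low = deaths / infections_high   # IFR if infections are at the high end

print(f"Rate based on confirmed cases: {cfr:.1%}")
print(f"Rate based on estimated infections: {ifr_low:.2%}-{ifr_high:.2%}")
```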
7:

OK, that’s the story of the study in a nutshell. And I hope that the study turns out to be true, so I have a bias. But...

The study has not gone unchallenged by heavyweight experts in this field. That’s what the article in my original tweet is about.
8:

This is what science is supposed to do. There is supposed to be give-and-take and even debate, sometimes passionate disputing, between academics as part of this process.
9:

But (a) this usually does not happen in such a public way because (b) not everything about the Stanford study has conformed to the standard scientific process.
10:

The article was rushed into publication as an “unrefereed preprint” & was therefore 𝘯𝘰𝘵 peer reviewed, so it wasn’t assessed by a panel of expert “referees” to “identify weaknesses in its assumptions, methods, and conclusions” for correction before final publication.
11:

The reason they went public with the study before peer review seems obvious: it could have important ramifications for current & urgent public policy decisions.
12:

But that reason is a blade that cuts two ways: what if it influences public policy for the worse because it doesn’t have the benefit of having potential errors corrected during the peer review process?
13:

I am not an expert in these things. Therefore, when the heavyweights jump into the ring to duke this kind of issue out, I stay on the other side of the ropes. If you’re not an expert, I recommend you do the same. Just sayin’.
14:

In the following tweets I will publish information some of you might find helpful—or overwhelming, depending on your background, personality, and how much coffee you’ve had today. Let me know if I missed anything significant.
15:

Point Cited in Favor of Stanford Study:

An earlier study by USC in LA drew similar conclusions, using a smaller but more representative sample that included more minority groups & recruited subjects through a market research firm instead of a Facebook ad.
16:

Criticisms of Stanford Study Included:

1. Sampling problem #1: Using Facebook introduces a “consent problem” because it could have attracted people who thought they were exposed and wanted to be tested.
17:

2. Sampling problem #2: Using Facebook introduced a “self-selection bias” that undermines much of the study’s estimate of the virus’s prevalence.
18:

3. Test-kit problem: Although it was the best available at the time, & although the study adjusted for this problem, it used a test kit that was not FDA-approved & is known to yield a high “false positive” rate.
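Why a high false-positive rate matters so much here: when true prevalence is low, even a small specificity shortfall can produce more false positives than true positives. A minimal sketch (the sensitivity/specificity numbers below are assumptions for illustration, not the study’s values):

```python
# Illustration of how imperfect test accuracy distorts the apparent
# (test-positive) rate when true prevalence is low.
def apparent_rate(prevalence, sensitivity, specificity):
    true_pos = prevalence * sensitivity            # infected & test positive
    false_pos = (1 - prevalence) * (1 - specificity)  # healthy & test positive
    return true_pos + false_pos

# Assumed values, for illustration only.
prev = 0.005   # true prevalence: 0.5%
sens = 0.80    # test sensitivity
spec = 0.985   # test specificity -> 1.5% false-positive rate

rate = apparent_rate(prev, sens, spec)
print(f"Apparent positive rate: {rate:.3%}")
```

With these assumed numbers, the apparent rate is about 1.9%, and most of it comes from false positives, which is why critics pressed on how the specificity adjustment was done.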
19:

4. Estimation problem: The study’s “confidence intervals” (the range around a measurement that conveys how precise the measurement is) do not reflect a careful approach.
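For readers unfamiliar with the term, here is what a basic confidence interval for a sample proportion looks like, using the simple normal approximation (the 50-positives-out-of-3,330 figures are assumptions for illustration; the critics’ point is that a careful interval for this study would also have to account for uncertainty in the test’s accuracy, which this simple version ignores):

```python
# A 95% confidence interval for a sample proportion (normal approximation).
import math

positives, n = 50, 3_330        # assumed sample counts, for illustration
p_hat = positives / n           # point estimate of the positive rate
se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
z = 1.96                        # ~95% normal quantile

lo, hi = p_hat - z * se, p_hat + z * se
print(f"Estimate {p_hat:.2%}, 95% CI ({lo:.2%}, {hi:.2%})")
```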
20/X:

5. Allegation of political bias: 3 of the study’s authors, Jay Bhattacharya, Eran Bendavid, & John Ioannidis, had previously published opinions that the pandemic does not justify universal quarantines or drastic economic sacrifices.
21/X:

Responses from Stanford Study Authors Include:

1. They’ll issue a detailed appendix addressing “constructive comments and suggestions.”
2. They’ll issue an expanded version of the study (but is this the study + appendix?).
3. “The results remain very robust,” said Dr. Ioannidis.
22/X:

Planned Studies Include:

1. May: UC Berkeley to test 5,000 in E. Bay.
2. MON (4/20?): UCSF to test 1,680 in Bolinas, CA.
3. SAT: UCSF to test 5,700 in SF’s Mission District.
4. Results expected soon from surveys in China, Australia, Iceland, Italy & Germany.
23:

Contributors to the Stanford Study:

1. Eran Bendavid, Stanford U School of Med.
2. Bianca Mulaney, Stanford U School of Med.
3. Neeraj Sood, Sol Price School of Public Policy, USC
4. Soleil Shah, Stanford U School of Med.
5. Emilia Ling, Stanford U School of Med.
24:

6. Rebecca Bromley-Dulfano, Stanford U School of Med
7. Cara Lai, Stanford U School of Med
8. Zoe Weissberg, Stanford U School of Med
9. Rodrigo Saavedra-Walker, Health Education is Power, Inc., Palo Alto CA
25:

10. Jim Tedrow, The Compliance Resource Group, Inc., Oklahoma City OK
11. Dona Tversky, Dept of Psychiatry & Behavioral Sciences, Stanford U School of Medicine
12. Andrew Bogan, Bogan Associates, LLC, Palo Alto CA
13. Thomas Kupiec, ARL BioPharma, Inc., Oklahoma City OK
26:

14. Daniel Eichner, Sports Medicine Research & Testing Lab, Salt Lake City UT
15. Ribhav Gupta, Dept of Epidemiology & Population Health, Stanford U. School of Medicine
27:

16. John P.A. Ioannidis, Dept of Medicine, & Dept of Epidemiology & Population Health, Stanford U School of Medicine
17. Jay Bhattacharya, Dept of Medicine, Stanford U School of Medicine, Stanford CA
28:

List of Critics of the Stanford Study, as Cited by Lisa M. Krieger/The Mercury News:

1. Andrew Gelman, a professor of statistics and political science and director of the Applied Statistics Center at Columbia University.
2. Erik van Nimwegen of the University of Basel
29:

3. Marm Kilpatrick, an infectious disease researcher at the University of California Santa Cruz
4. Statistician John Cherian of D. E. Shaw Research, a computational biochemistry company
5. Biostatistician Natalie E. Dean of the University of Florida
30:

6. Lonnie Chrisman, chief technical officer, Lumina Decision Systems, Los Gatos
7. Alan M. Cole, Senior Economist, U.S. Congressional Joint Economic Committee, Republicans (Note: I had to look up Cole’s credentials; the article only supplied one of his tweets.)
You can follow @ronhenzel.