We have to go through this again? Really? Fine.

This is a silly idea. That doesn't mean it isn't dangerous. https://twitter.com/WIRED/status/1325433641804070913
This isn't science, it's pseudoscience at the very best. Forget about Popper and Kuhn and even Foucault: this is nothing more than guesswork. They don't even try to disguise that it's based on guesses in the article (though, I'm sure, their A-round pitch deck avoids the word).
They may respond "Sometimes guesses are good! That's Popper's point, right? We make estimates and, if they're falsified, we can either improve the theory or reject it."

This point has the benefit of being only mostly wrong, which is not as good as being mostly dead.
Look, it's fine to accept the idea that we manifest emotions in gait. It's actually a widely accepted notion, and we sense it intuitively. Look at the stick figures in the original post: all the "happy" one needs are paint cans and "Stayin' Alive" in the background. We get it.
But it's also something that we're conditioned to see *and* something that the article itself framed for you. Would you have assumed that the stick figure was "happy" if it had not already been tagged that way? You might've thought "cocky" or "energetic" or "Vince McMahon."
The "sad" one is the same. Strip away "SAD" from the image and play it at the same pace as "happy" and you'd think perhaps neutral, or at least not sad. Remember, too, that not *everyone* manifests sadness in the same way. It isn't always blatant.
The obvious problems are obvious: these oversimplify human emotion, blatantly disregard disability, almost certainly have major bias problems (given that the sample data was what a handful of people thought someone else's emotions "looked like"), etc etc etc.
The secondary problems are there too, and not at all hard to spot. @shoshanazuboff and @julie17usc have written convincingly of the legal/economic system that makes human behaviour into surplus for fuelling what Cohen calls "Informational Capitalism," in effect: the data economy.
More deeply, the problem is one of decontextualisation and the annihilation of nuance. Humans are notoriously hard to predict (ask a pollster). For decades, some have worked to identify how we think and act in order to better reduce risk and liability. @FrankPasquale
Tech and data companies have had two decades to perfect the art, but it's clear that no analytic tool is clever or fast enough to predict human behaviour consistently. And so an elegant solution appears: don't try to figure out human behaviour, channel it.
By creating frameworks about how to act and what is permissible (frameworks that are all built by lawyers, of course), you remove the last remaining doubt about human behaviour and make predictability and conformance not only useful, but profitable and consistent.
It's self-sustaining. You "need" to walk this way at work. An ad shows you what shoes you "need" to support your back based on how you need to walk. You don't buy the shoes; your rating at work drops. A resulting micro-change to your credit score: now you can't afford the shoes.
That's not Black Mirror -- this is the system as it exists, or will exist within the next year or so. It all begins with the transformation of our basic humanity (walking, having feelings) into grist for the mill of predictability. This is behaviourism made sacramental.
This is why vigilance matters. It's not always a Boston Dynamics killer robot dog -- sometimes, the most serious of things are presented in the silliest of ways.
You can follow @PrivacyLaw_JJW.