"The microphone on the Amazon Halo Band isn’t meant for voice commands; instead it listens to your voice and reports back on what it believes your emotional state was throughout the day." I have a whoooooole extra-spicy batch of questions! 1/n https://twitter.com/backlon/status/1298969366139228163
Per article: "It picks up on the pitch, intensity, rhythm, and tempo of your voice and then categorizes them into 'notable moments'..."
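To make the article's claim concrete: "pitch" and "intensity" are ordinary signal-processing quantities. Below is a toy Python sketch (nothing to do with Amazon's actual pipeline, which is not public) of estimating those two features from audio, using a synthetic sine wave as a stand-in "voice". The autocorrelation pitch tracker and the 220 Hz test tone are my own illustrative choices.

```python
import numpy as np

def estimate_pitch(signal, sr):
    """Toy autocorrelation-based fundamental-frequency estimate."""
    sig = signal - signal.mean()
    # Full autocorrelation; index len(sig)-1 corresponds to lag 0.
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    # Search only lags in a plausible voice range (50-500 Hz).
    min_lag, max_lag = sr // 500, sr // 50
    lag = min_lag + np.argmax(corr[min_lag:max_lag])
    return sr / lag

def rms_intensity(signal):
    """Root-mean-square amplitude, a crude loudness proxy."""
    return float(np.sqrt(np.mean(signal ** 2)))

sr = 8000
t = np.arange(4000) / sr                       # half a second of audio
voice = 0.5 * np.sin(2 * np.pi * 220.0 * t)    # synthetic 220 Hz "voice"
pitch = estimate_pitch(voice, sr)              # ~220 Hz
loudness = rms_intensity(voice)                # ~0.354 for a 0.5-amplitude sine
```

The point of the sketch: these numbers are easy to extract, and everything contentious happens *after* extraction, in the step that maps them onto emotion labels.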

So, what's in your training dataset? What's the gender, race, age split? What dialects were included? 2/n
Turns out @backlon asked this! Amazon's rep gave no info on the demographic makeup of the dataset, but did say that the feature was trained on American English, and "if you have an accent," it will be less accurate.

I...
Before I shed my human form and swallow Amazon whole, let's break down this bog-standard linguistic discrimination together!
3/n

1. You used American English. Which dialect? Because sometimes phrases that read as angry/aggressive in one dialect are totally normal in another! 4/n
2. Define "accent" for me. Do you realize that everyone has an accent? Do you realize that lots of accents in American English are completely opaque to other American-English speakers? My dad puts earl inis ian-jin blahc, which confuses about 90% of people 100% of the time. 5/n
3. No, really, what's your gender/race/age/disability spread? A product that purports to give emotional feedback to the user using Fancy Technology that is ONLY trained on the voices of Hearing white dudes aged 20-40 is going to do material harm to people who are not that. 6/n
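The standard way to catch exactly this failure is disaggregated evaluation: report accuracy per demographic group, not just one aggregate number. A toy Python sketch with invented labels (not data from any real product) shows how an overall score can hide a group the model simply doesn't work for:

```python
# Hypothetical tone-classifier results: (true_label, predicted_label)
# pairs, keyed by group. All numbers are invented for illustration.
predictions = {
    "in_group":  [("calm", "calm"), ("calm", "calm"),
                  ("tense", "tense"), ("tense", "tense")],
    "out_group": [("calm", "tense"), ("calm", "tense"),
                  ("tense", "tense"), ("calm", "tense")],
}

def accuracy(pairs):
    """Fraction of pairs where the prediction matches the true label."""
    return sum(t == p for t, p in pairs) / len(pairs)

overall = accuracy([pair for pairs in predictions.values() for pair in pairs])
per_group = {g: accuracy(pairs) for g, pairs in predictions.items()}
# overall is a respectable-looking 0.625, but out_group accuracy is 0.25 --
# the model mostly just calls that group "tense".
```

If a vendor can't (or won't) show the per-group numbers, assume they look like `out_group`.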
AN EXAMPLE! I gave a talk at a Big Deal Place a few years ago as part of a panel. It went well!

Then one of the other panelists got up to give his talk. It was on his app, which analyzes your voice and gives you feedback so you can take control of your day/life/etc. 7/n
He gave a bunch of examples of recorded voices run through the app, each of which told you whether the speaker was sad/anxious/happy. The whole idea behind the app is that there are inflectional constants in English, and you can use them to your advantage.

Great! Except no! 8/n
Lots of the recorded examples were ambiguous. Yes, a wavery voice can mean sadness or anxiety. It can also mean that you have a physical condition that causes a vocal tremor. Or you just have _a wavery voice_!

We did that small-talk thing afterwards. He pulled out his app. 9/n
"I ran your talk through my app, and it told me you were anxious, apprehensive, and worried," he said. "I thought you might want to work on your speech delivery."

Reader: I tore out his heart and ate it backstage.

Not really, but I did actually laugh at him! 10/n
See, here is my terrible secret: I can't project my voice _for shit_. So when I give a talk with a bad microphone, my voice can sound strained. Because it is! It gets squeezed.

Do not tell me to get vocal training! You are missing my point! 11/n
I laughed and told him that I _wasn't_ any of those things; that I just suck at projecting my voice and I had a great, relaxed time.

*I* had the knowledge to know his tech was bad. One of the other panelists didn't. He told her the same thing; it ruined the day for her. 12/n
Which leads me to my final question:
4. If your training data is bad, then the result will be bad. With a product like this, that has real-world consequences for people. Will you require liability waivers to use your shitty tech, or will you pull it and do better? 13/FIN
(This rant brought to you by Too Much Packing!, the new moving-house game from Mattel!)
You can follow @KoryStamper.