Here's @revanhoe and @adanielescu talking about Gender-Inclusive Conversational AIs (and parallels to other biases).

This sounds like it will be a complicated and nuanced, yet important, topic.
(I can't keep up - I'm sorry!)

We have evidence of gender bias:
- Devices work better for men
- Personalities display gender stereotypes
- Harmful behavior
"[A]t least 5% of interactions [with chatbots] were unambiguously sexually explicit."

When interactions are flirtatious or worse, the chatbots often encourage this.
So why is it harmful that our chatbots are female-presenting?

Because they reinforce the stereotype of "barking orders at women".
But we don't need to gender our things.

We have an opportunity to do better...

(cue @adanielescu)
Most people associate a gender with the device based on the voice they hear. But a non-binary voice is sometimes heard as "strange".

There isn't a straightforward solution. But this is part of it.
Gendering a voice is about more than just pitch; it also involves intonation, word choice, etc.
The notion of differing word choices is *very* interesting to me. I'm wondering how much we should vary replies based on the context of the person asking... or if there is a more neutral way to phrase them. But it sounds like we're not there yet. (Can't find a handle for Sharone, if they have one.)
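A quick sketch of that "more than pitch" point, assuming Google Cloud Text-to-Speech (the voice name and pitch value here are just illustrative): SSML can nudge pitch, but intonation patterns and word choice live in how the reply itself is written.

```python
# Hypothetical sketch, assuming Google Cloud Text-to-Speech: pitch is only one
# dimension of a "gendered" voice. The phrasing ("Sure thing!" vs. "Acknowledged.")
# is a word-choice decision no synthesizer parameter can make for us.
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()

# SSML lets us shift pitch by a couple of semitones; intonation and wording
# are still up to whoever writes the response.
ssml = """
<speak>
  <prosody pitch="-2st">Sure thing! Adding milk to your shopping list.</prosody>
</speak>
"""

response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(ssml=ssml),
    voice=texttospeech.VoiceSelectionParams(
        language_code="en-US",
        name="en-US-Wavenet-D",  # illustrative voice name, not a recommendation
    ),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.MP3
    ),
)

with open("reply.mp3", "wb") as out:
    out.write(response.audio_content)
```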
Big takeaways:
- Be deliberate in our thinking about gender and voice agents
- Be active in discouraging sexual harassment
- Be inclusive to capture everyone's needs and represent them
I know I thought long and hard about which voice to use when working on @VodoDrive. I initially went with a male-sounding voice, since I didn't want the stereotypical image of a boss yelling at a secretary for numbers, but I have plans to make that highly tailorable.
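A minimal sketch of what "highly tailorable" could look like, again assuming Google Cloud Text-to-Speech; the preset names are placeholders, not actual @VodoDrive code:

```python
# Hypothetical sketch of a user-selectable voice, not the actual @VodoDrive implementation.
from google.cloud import texttospeech

# Placeholder presets; real voice names would come from the TTS provider's voice list.
VOICE_PRESETS = {
    "lower": "en-US-Wavenet-D",
    "higher": "en-US-Wavenet-F",
}

def voice_for_user(preference: str) -> texttospeech.VoiceSelectionParams:
    """Let each user pick the voice instead of shipping a single default."""
    name = VOICE_PRESETS.get(preference, VOICE_PRESETS["lower"])
    return texttospeech.VoiceSelectionParams(language_code="en-US", name=name)
```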