Here’s How a Genderless Virtual Assistant Is Undoing Gender Bias in Artificial Intelligence


Although it might have seemed like something out of The Jetsons a decade ago, many of us now casually hold up our smartphones to ask Siri a question. These days, intelligent virtual assistants — or artificial intelligence (A.I.) assistants — are commonplace. Powered by A.I. systems, these assistants emulate human interaction and complete tasks for us. For example, Microsoft’s Cortana can sift through search engine results and set reminders; nameless, voice-activated assistants read off our in-vehicle GPS directions; and Amazon’s ever-popular Alexa can emcee your living room entertainment setup.

Needless to say, everyone from Apple to BMW is investing in this tech, making virtual assistants rather ubiquitous. "We’re at the point now where these artificial intelligence-driven interactions are more than just a feature – they’re the products themselves," Phil Gray, executive vice president of business development at Interactions, writes. "Consumers are comfortable speaking with ‘a robot’ when the productivity and convenience value is there." That is, asking Siri to look something up no longer holds that Jetsons-esque novelty — nor is it the latest party trick. Instead, voice-activated assistants are expected.

Some of the most popular intelligent virtual assistants — Siri, Cortana and Alexa — have something else in common: Their default English-language voices all belong to, or are modeled on, women. The woman behind Apple’s Siri is veteran voice actor Susan Bennett. Cortana, which is based on a synthetic intelligence character from Microsoft’s Halo franchise, is voiced by Jen Taylor, just as she is in the video game series. And although Alexa isn’t any real person’s voice — it’s generated using text-to-speech (TTS) and artificial intelligence technology — Amazon’s assistant has, arguably, the most traditionally gendered name of all the popular A.I. assistants out there. And that raises several red flags.
