In a project for Copenhagen Pride, a Danish team has produced Q, a gender-neutral AI voice that could be used in virtual assistants such as Siri or Cortana.
The team is responding to concerns that using a female voice as the default for these services reinforces gendered and sexist stereotypes, since users habitually order the assistant around. This work has opened up the debate on gendered voices in virtual assistants, allowing a much-needed examination of the potential harms that particular software decisions can have on society.
This example leaves us wondering: why are virtual assistants female by default? Is this not cause for concern? Does it highlight problematic perceptions in society, and should the tech industry aim to address them?