UNESCO has raised the alarm with a report accusing Alexa, Siri and the others of being tools that spread the image of a meek and submissive woman


«Max, have you lost your mind? That's the third time you've asked me the same thing: you need some phosphorus!» Who knows: tomorrow the voice assistants spreading through our homes might respond just like this, with brutal frankness, irony or even an angry voice, to an improper or badly worded request. It would be curious, of course, but it would also be a step forward from the subdued, monotone, submissive answers produced today by the software with which Amazon, Apple, Microsoft and Google power their electronic butlers: Alexa, Siri, Cortana and Google Home. Three out of four have female names and, in most cases, a woman's voice, one that always responds politely and with accommodating words, even when people address these machines with arrogance, uncivil expressions or outright insults. In America, if you tell Alexa "you're a whore", the reply you get is "thanks for the feedback".

It may seem a mere curiosity, a minor technical flaw, especially where these devices are still rare. But in several parts of the world the voice assistant is already widespread, above all among young people. In America it is predicted that within five years people will talk to these synthetic voices in their homes more than to their own families. The impact on social habits and on language therefore risks being considerable: as has happened before since the start of the digital age, we will notice the shift in our habits only after the fact.

This is why UNESCO has decided to raise the alarm with a report accusing Alexa, Siri and the others of being sexist tools that spread the image of a meek and submissive woman. It is the tip of the iceberg of a phenomenon that is hard to see but has been known for some time: the algorithms and digital code that we blindly accept as objective and neutral are in fact shaped by the prejudices of the people who built them. In the case of voice assistants, the companies chose female voices (only in Arab countries, in Holland and in France do male ones prevail) because market research suggested it, but also because 85-90 percent of their programmers are white men. Google has now begun to change its tune, randomly alternating different voices. Maybe one day the tones will change too.