
‘Sexist’ Amazon Alexa can’t answer Lionesses question

Amazon has been accused of sexism after its Alexa voice assistant was unable to respond to a question about the Lionesses' semi-final victory at the Women's World Cup.

When asked on Wednesday "for the result of the England-Australia football match today", it said there was no match.

"This was an error that has been fixed," an Amazon spokesperson said.

Academic Joanne Rodda – who alerted the BBC – said it showed "sexism in football was embedded in Alexa".

Dr Rodda, a senior lecturer in psychiatry at Kent and Medway Medical School with an interest in artificial intelligence (AI), said she had only been able to get an answer from Alexa when she specified that it was women's football she was interested in.

"When I asked Alexa about the women's England-Australia football match today it gave me the result," she told the BBC.

The BBC was able to replicate what she had found on Alexa.

Responding to Amazon saying it had remedied the situation, Dr Rodda told the BBC it was "pretty sad that after almost a decade of Alexa, it's only today that the AI algorithm has been 'fixed' so that it now recognises women's World Cup football as 'football'".

Image caption: Amazon admits its device made an error (source: Getty Images)

Amazon told the BBC that when a customer asks Alexa a question, information is pulled from a variety of sources, including Amazon, licensed content providers, and websites.

It added that it had automated systems which use AI to understand the context and pull out the most relevant information, but the systems got it wrong in this case.

The firm said it expected the systems to get better over time, adding that it has teams dedicated to helping prevent similar situations in the future.

Dr Rodda also questioned the extent to which the problem had actually been fixed, saying she still found similar problems with the Women's Super League.

"Out of interest, I just asked Alexa who Arsenal football team are playing in October," she said.

"It replied with information about the men's team, and wasn't able to give an answer when I asked specifically about women's fixtures.

"It highlights how heavily embedded societal biases are within AI," Dr Rodda said.
