
‘Sexist’ Amazon Alexa can’t answer Lionesses question

Amazon has been accused of sexism after its Alexa voice assistant was unable to respond to a question about the Lionesses' semi-final victory at the Women's World Cup.

When asked on Wednesday "for the result of the England-Australia football match today", it said there was no match.

"This was an error that has been fixed," an Amazon spokesperson said.

Academic Joanne Rodda – who alerted the BBC – said it showed "sexism in football was embedded in Alexa".

Dr Rodda, a senior lecturer in psychiatry at Kent and Medway Medical School with an interest in artificial intelligence (AI), said she had only been able to get an answer from Alexa when she specified that it was women's football she was interested in.

"When I asked Alexa about the women's England-Australia football match today it gave me the result," she told the BBC.

The BBC was able to replicate what she had found on Alexa.

Responding to Amazon saying it had remedied the situation, Dr Rodda told the BBC it was "pretty sad that after almost a decade of Alexa, it's only today that the AI algorithm has been 'fixed' so that it now recognises women's World Cup football as 'football'".

Image source: Getty Images. Image caption: Amazon admits its device made an error.

Amazon told the BBC that when a customer asks Alexa a question, information is pulled from a variety of sources, including Amazon, licensed content providers, and websites.

It added that it had automated systems which use AI to understand the context and pull out the most relevant information, but the systems got it wrong in this case.

The firm said it expected the systems to improve over time, adding that it has teams dedicated to helping prevent similar situations in the future.

Dr Rodda also questioned the extent to which the problem had actually been fixed, saying she still found similar problems with the Women's Super League.

"Out of interest, I just asked Alexa who Arsenal football team are playing in October," she said.

"It replied with information about the men's team, and wasn't able to give an answer when I asked specifically about women's fixtures.

"It highlights how heavily embedded societal biases are within AI," Dr Rodda said.
