Amazon's Alexa Spreads Misinformation, Citing Fact-Checking Site
Amazon's voice assistant Alexa has been giving incorrect answers on a range of topics while wrongly attributing them to the fact-checking charity Full Fact, prompting efforts to resolve the issue.
- Full Fact discovered that Alexa was giving false information on topics like MPs' expenses and the Northern Lights, incorrectly citing their fact checks.
- Amazon has acknowledged the errors and said it is working on a fix; the issue may already be resolved, as the incorrect responses could not be reproduced in later testing.
- Examples of misinformation included claims about UK diplomatic relations, NHS waiting list figures, and comments falsely attributed to Mike Tyson.
- The problem appears to stem from Alexa misreading the 'claim' portion of a fact check, the statement being examined, as the verified conclusion, which risks undermining trust in fact-checking sources.
- Other virtual assistants largely did not reproduce these errors: Apple's Siri gave no comparable false answers, while Google Assistant provided one misleading response.