A woman whose private conversation with her husband was recorded by their Amazon Alexa and sent to a friend by email without their knowledge said she feels "invaded" by the worrying intrusion.
Luckily for the Oregon couple, the discussion picked up by the hands-free devices placed around their home went no further than the subject of hardwood flooring.
But Danielle, who did not want to give her last name in local press reports, said that although the conversation was not highly personal, she nevertheless felt her privacy had been compromised by what Amazon said was a series of unfortunate events.
Every room in the family home in Portland is wired with the devices, which control the home’s heat, lighting and security system. Amazon said Alexa misinterpreted the conversation as a set of demands which led to the conversation being packaged up and sent to a seemingly random contact.
During the conversation in question, Danielle received a call from one of her husband's employees, warning the couple: "Unplug your Alexa devices right now! You're being hacked."
"At first, my husband was, like, ‘no you didn’t!’ And the (recipient of the message) said ‘You sat there talking about hardwood floors.’ And we said, ‘oh gosh, you really did hear us.’"
She unplugged all of her devices and contacted Amazon, which sent engineers to her house. Danielle said they apologised to her profusely.
However, she said the device did not audibly advise her it was preparing to send the recording, which it is programmed to do.
Amazon has given an explanation for the strange event, insisting its devices do not listen to customers unless they are "woken up" by the word “Alexa.”
According to the company, in this instance an "unlikely" series of events occurred, which led to the device thinking the pair wanted their conversation about hardwood floors recorded and sent to their friend.
A spokesperson explained: "Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request.
"At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’
"Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely."
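The four-step chain Amazon describes — misheard wake word, misheard "send message" request, misheard contact name, misheard "right" — can be sketched as a simple state machine. The sketch below is purely illustrative: the function name, the matching rules, and the sample conversation fragments are assumptions for demonstration, not Alexa's actual implementation.

```python
# Illustrative sketch of the misrecognition chain Amazon describes.
# The matching logic and sample fragments are hypothetical, not Alexa's
# real speech pipeline; each innocent fragment advances the chain one step.

WAKE_WORD = "alexa"

def process_background_audio(fragments, contacts):
    """Walk speech fragments through the wake -> intent -> contact -> confirm chain."""
    state = "idle"
    recipient = None
    for fragment in fragments:
        # Crude tokenisation standing in for real speech recognition.
        words = [w.strip(".,!?") for w in fragment.lower().split()]
        if state == "idle":
            # Step 1: a word in background speech sounds like "Alexa".
            if any(WAKE_WORD in w for w in words):
                state = "awake"
        elif state == "awake":
            # Step 2: subsequent speech is heard as a "send message" request.
            if "send" in words and "message" in words:
                state = "awaiting_recipient"  # Alexa asks "To whom?"
        elif state == "awaiting_recipient":
            # Step 3: background speech matches a name in the contact list.
            for name in contacts:
                if name.lower() in words:
                    recipient = name
                    state = "awaiting_confirmation"  # Alexa asks "[name], right?"
        elif state == "awaiting_confirmation":
            # Step 4: background speech is interpreted as "right".
            if "right" in words:
                return f"message sent to {recipient}"
    return "no message sent"

result = process_background_audio(
    ["that Alexander rug would match the hardwood",
     "send a message to the installer about the floors",
     "Kevin quoted us on the hardwood last week",
     "right, the oak ones"],
    contacts=["Kevin"],
)
print(result)  # prints "message sent to Kevin"
```

The point of the sketch is that no single step needs to be badly wrong: four individually plausible misrecognitions, each gating the next, are enough to send a private conversation to a contact.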
Many have been suspicious of the company encouraging customers to place listening devices in every room of their homes.
However, Amazon maintains this event was a malfunction, and not proof that Alexa is always “listening.”
In the past, the company has filed patent applications for more invasive listening devices, which record continuously. One such application included an algorithm that would analyse when people say they "love" or "bought" something.
This patent included a diagram in which two people having a telephone conversation were given separate targeted advertisements after they hung up.
There were also worries in 2016 when scientists found that voice assistants could be woken up by sounds unintelligible to humans.
The group found that commands could be hidden in white noise, with the device switching on and visiting websites without any human instruction.
In May, these findings went further, with researchers claiming they could embed commands directly into recordings of music or spoken text.
This could mean that as a human listens to music, the voice assistant may hear an instruction to send a message, or add a product to a shopping list.