Voice Assistant Devices

From Privacy Wiki
Short Title: Amazon, Google and Apple Voice Assistants Recording Users’ Conversations Without Them Knowing
Location: Global
Date: Early 2020
Taxonomy: Surveillance, Identification
Information: Communication, Identifying
Threat Actors: Google, Apple, Amazon
Affected: Users of voice assistants
High Risk Groups:
Secondary Consequences:

A study found that voice assistants from Amazon, Google, and Apple regularly wake up when they hear words similar to their wake words.


By design, smart voice assistants such as Amazon’s Alexa, Google Assistant, or Apple’s Siri have microphones that are always on, listening for so-called wake words. They are supposed to start recording only after hearing those cues.

However, a study in early 2020 found that the devices also regularly “wake up” after hearing words merely similar to their wake words. In these cases, the devices record and store audio that was never supposed to be captured, and the companies’ employees can listen to the resulting files. Users are unaware that recordings of their conversations exist, yet personal, often sensitive information is easy to identify from such files, making this a form of Surveillance.
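The false-wake behavior described above can be illustrated with a toy sketch. This is not any vendor’s real pipeline (production detectors score audio with acoustic models, not text similarity); the wake word, threshold, and string-similarity stand-in below are all illustrative assumptions. The point is that any detector tuned leniently enough to never miss the real wake word will also fire on similar-sounding words, and everything after a trigger gets recorded.

```python
# Toy sketch of wake-word gating. Assumption: string similarity stands in
# for a real acoustic wake-word score; names and threshold are illustrative.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.7  # lenient on purpose, so near-misses still trigger


def similarity(heard: str, wake: str) -> float:
    """Crude stand-in for an acoustic wake-word confidence score."""
    return SequenceMatcher(None, heard.lower(), wake.lower()).ratio()


def should_record(heard: str) -> bool:
    """The device starts recording only once this returns True."""
    return similarity(heard, WAKE_WORD) >= THRESHOLD


print(should_record("alexa"))    # True: the exact wake word
print(should_record("alexia"))   # True: a similar word causes a false wake
print(should_record("weather"))  # False: unrelated speech is ignored
```

Raising the threshold reduces false wakes like “alexia” but makes the device miss genuine wake words more often; the studies reported in this article suggest vendors tune toward the lenient side of that trade-off.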

Another issue is that recordings, including ones individuals do not know exist, are systematically listened to by humans: employees of Apple, Google, and Amazon, for the purpose of improving the service. This allows Identification of the individuals whose information is recorded, because the audio often includes names, home and work addresses, and private conversations.

Apple, Google, and Amazon do not explicitly disclose this practice in their consumer-facing privacy documentation.

Risk Statistics

Laws and Regulations