A new Bloomberg report says that thousands of Amazon employees around the world listen to voice recordings captured from Amazon Echo owners. The employees are part of Amazon’s effort to improve how the company’s virtual assistant, Alexa, responds to voice commands.
The recordings are listened to, transcribed, annotated, and fed back into the software by a mix of contractors and full-time Amazon employees in locations ranging from Boston to Costa Rica, India, and Romania. The reviewers work nine hours a day, and each can parse as many as 1,000 audio clips per shift.
The process was described by seven people who have worked on the program, all of whom signed nondisclosure agreements barring them from speaking publicly about it.
While most of the work is best described as mundane, employees have also listened to more private recordings, such as a woman singing in the shower or a child screaming for help. Employees sometimes share recordings in an internal chat room, either to get help deciphering a word or because they find a clip amusing.
Sometimes reviewers hear recordings they find upsetting, or possibly criminal; two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.
Bloomberg says the recordings don’t include a user’s full name or address, but they are associated with an account number, the user’s first name, and the device’s serial number.
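For illustration only, a record in such an annotation pipeline might carry roughly this shape; the field names below are hypothetical, based solely on the details Bloomberg describes, and are not Amazon’s actual schema.

```python
# Hypothetical sketch of the metadata Bloomberg says accompanies a clip.
# Field names and values are illustrative assumptions, not Amazon's real schema.
annotation_task = {
    "account_number": "A1B2C3D4E5",        # internal account identifier
    "first_name": "Alex",                  # user's first name only
    "device_serial": "G090LF1234567890",   # Echo device serial number
    "audio_clip": "clip_000123.wav",       # the recording to transcribe/annotate
    # Notably absent, per the report: full name and address.
}
```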
Amazon told Bloomberg that there are measures in place to protect users’ identities, providing the following statement:
“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.
We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”
Alexa users do have the option to stop their voice recordings from being used to improve the service, but many may not be aware that the option exists.
It should be noted that Amazon is not alone in using recordings to improve its service. Apple also has employees who listen to Siri queries to ensure the information delivered lines up with the user’s request. Siri recordings are stripped of identifiable information and stored under a random identifier.
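As a rough illustration of that kind of pseudonymization (the specifics of Apple’s pipeline are not public), a system might swap the user’s identity for a random identifier before a clip is stored for review. The function and store below are assumptions made for the sketch, not anyone’s actual implementation.

```python
# Minimal sketch of storing a voice clip under a random identifier instead of
# the user's identity. Purely illustrative; not Apple's or Google's actual code.
import uuid

def pseudonymize(clip_bytes: bytes, store: dict) -> str:
    """Store the clip under a random ID with no link back to the user."""
    random_id = uuid.uuid4().hex   # random identifier replaces any user info
    store[random_id] = clip_bytes
    return random_id

# The reviewer-facing store holds only random IDs and audio data.
review_store: dict = {}
clip_id = pseudonymize(b"\x00\x01\x02", review_store)
```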
Google has a similar team that reviews audio snippets from Google Assistant to improve the product’s responses. It likewise removes personally identifiable information and goes a step further by distorting the audio.
If you are concerned about your Alexa-enabled product, open the Alexa app on your device and go to “Settings,” then “Alexa Account,” “Alexa Privacy,” and “Manage How Your Data Improves Alexa.” There you can control how Amazon uses your voice recordings.