Did you know that Amazon Alexa could be activated by the sounds you make when you have sex?
The adoption rate of smart speakers in our homes has risen sharply over the last few years; it is estimated that more than a quarter of households in the U.S. and one in five British homes have one. They offer an excellent virtual assistant service, from delivering news and weather information to offering quick access to multimedia and even acting as a central controller for other smart home devices. Their future looks assured in the modern way of life, and technologists are already looking at how they can be integrated into more essential services such as healthcare and welfare support in the community. But is there a darker side to this technology that might have an impact on our privacy, and do they ‘listen’ to our more intimate moments?
In this feature we take a look at the recent reports that Amazon’s Alexa (and other virtual assistants) are ‘snooping’ on our sex lives. We’ll also find out what is being done to protect our privacy and what we can do to prevent this kind of ‘eavesdropping’ on our intimate moments.
Virtual Assistants: Are They Listening To Everything?
Okay, so it’s not just Amazon Alexa that this applies to but all popular virtual assistant devices including (but not limited to) Apple’s Siri and Google Assistant.
But are they really listening to everything we do?
The short answer is no….but yes. In order to deliver a fast response to your inquiries, virtual assistants are required to listen constantly so they can deliver their service when you need them. However, they should only be listening for a wake word, such as ‘Alexa’ or ‘Hey Siri’, before processing a question or command.
With voice-activated technology, all sorts of sounds can trigger them into action. From dogs barking to kids laughing and, of course, the noises we make when we are having sex.
And it’s not just accidental triggering of Alexa that is causing a stir. With the launch of products like the Lovense remote app, which lets couples operate teledildonics remotely using Amazon’s virtual assistant, we are using these digital personal assistants in ways that give them access to a lot of sensitive voice commands and potentially very personal information.
But, it’s just a box of wires, so no problem, right?
All of the companies behind these devices offer their service on the basis that you accept their terms and conditions, most of which have included (or still do include) the right for them to monitor recordings for quality control and to help improve their service.
Now, that’s where the privacy issue that has people riled comes in.
Virtual Assistants & Privacy Issues
This year, Amazon, Apple and Google admitted that, in order to grade their services for quality control, anonymized recordings from a sample of their users are analysed by human technicians, often subcontractors.
The media coverage followed reports made by a whistle-blower who was working for a Bulgarian subcontracting company whose job was to monitor huge quantities of Alexa recordings. The analyst reported to a UK national newspaper, The Sun, that she had even heard what sounded like a sex attack.
An unnamed source has also revealed to the media that Apple uses similar teams to help its software ‘learn’, and that it was common for sensitive information to be heard during the analysis of recordings.
Countless discussions between doctors and patients, sexual encounters and even criminal activities were all among the recordings made when Siri was triggered accidentally. Unlike a smart speaker, Siri is often activated on a mobile phone or an Apple Watch. Both have extremely sensitive activation, and even the sound of a zipper can trigger the microphone. In the case of the Apple Watch, which can be set to activate when you raise your wrist to speak, the number of mistaken recordings is even higher.
As well as recordings of couples having sex, information about medical issues, drug deals and private conversations have also been reported to have been analysed by technicians.
Whilst companies like Amazon stress that each clip may be only a few seconds long, it is estimated that as many as 1,000 snippets a day are heard by each employee performing this ‘grading’ process. Of course, some recordings are much longer and can include highly confidential data and what should be secure information.
Google, Amazon and Apple routinely use this practice to help train their software for better voice recognition and provide more accurate command responses.
Concerns are being raised about the high staff turnover at these ‘listening posts’, which are situated in countries like India and Costa Rica as well as in the U.S. and Central Europe. According to one contractor, security vetting at some of these locations is not robust, and given the level of detail being reported, this could represent a significant potential for misuse.
And, despite calls for better safeguarding and privacy controls, there is still a lot of confusion about what is being done to prevent this leak of personal and intimate data.
How Can You Stop Alexa Listening In?
So, what’s being done to prevent companies like Amazon, Apple and Google from observing us in this way?
Firstly, all three of the major tech companies behind smart speakers and virtual assistants are very keen to point out that all recordings they use for the purposes of quality control and testing are anonymized. This (apparently) means that there is no way that subcontractors and technicians can identify users. In addition, all employees working on recordings are bound by strict non-disclosure contracts.
Secondly, both Apple and Google have reportedly suspended the ‘grading’ process; Google has stopped processing EU customers’ recordings, and Apple has implemented the suspension worldwide.
It remains to be seen what Google will ultimately do with its virtual assistant, but Apple has agreed to include a way for customers to opt out of this mechanism.
Whilst Amazon did not suspend any of its customer recording processes, it did implement an option for users to protect their privacy.
To activate this setting, users of Amazon’s Alexa will need to log into their accounts and head to the Alexa Privacy page. From there, find the option labelled ‘Help improve Amazon services and develop new features’. By disabling this setting, your Alexa should no longer share any conversations it is party to, whether triggered accidentally or not.
As we know, when it comes to technology and the internet, you have to be proactive if you value your privacy. So the best advice, and the only concrete solution, is this: if you don’t like the idea of anyone listening in on your most intimate moments, simply switch the device off or temporarily disable the microphone.
Featured image via Pexels.