78 million smart speakers were sold across the world last year, and analysts say the devices could be the next big money-spinner for tech companies as smartphone sales start to level off. We all love the convenience of asking Siri a question, telling Google to play our favourite music or asking Alexa for a recipe. But as convenient as they are, many people have serious concerns about their privacy when using these devices, and it turns out that they could be right.
Who is really listening?
As you talk to your Amazon Alexa smart speaker in the privacy of your own home, the last thing you would expect is for that conversation to be transcribed by a total stranger in an office block in the Romanian capital, Bucharest. Yet that is exactly what happens on a daily basis. A major investigation by the US news agency Bloomberg found that Amazon were transcribing thousands of recorded clips from their smart speakers every day. The company say it is a small fraction of one per cent, but that is still a huge number of recordings. Apple and Google have also admitted to the practice, with Google saying around 0.2% of all clips were reviewed.
Why are they listening?
Knowing that the big companies are listening is enough to make you want to switch your speaker off. By listening to your daily conversations, your smart speaker could work out how you will vote, what you want to buy so they can send you adverts, or even who you think are favourites to win the World Series so they can change the odds. But just because they could do all these things doesn’t mean they are doing them.
The tech giants claim that they are only using the information to help train their smart speakers and improve their performance. Look carefully at the terms and conditions and you’ll find you have consented to exactly that, although you may not have been explicitly told that real people would be listening. Nonetheless, Germany’s Data Protection Commissioner, Johannes Caspar, still described these systems as ‘highly risky’ from a personal privacy perspective.
Wake words
In theory, your smart speaker is only listening when you use a wake word, such as ‘Hey Siri’, ‘Hey Google’ or ‘Alexa’. However, the Bloomberg report said that analysts were finding as many as 10% of their recordings had been triggered without an obvious wake word. This can easily happen by accident, thanks to everyday phrases such as the French ‘avec sa’, meaning ‘with his’ or ‘with her’, or the names Alex and Alexa. Recordings were also found to be triggered by loud TVs and other background noise. Just think how many times your iPhone pipes up when it thinks you have said ‘Siri’.
The good news is that the data analysed by Apple lacks any personally identifiable information and is only linked to a random number. However, Amazon admitted that its voice recordings were associated with an account number, the serial number of the device used and even the user’s first name.
What can you do?
How you respond to this will depend on how paranoid you are and whether you think you have anything to hide. With a Google device, you can listen back to any recordings and delete them, as well as turn off web and app tracking in settings. Amazon devices do not let you opt out of human reviews altogether, although you can opt out of certain features by saying no to ‘help develop new features’ in settings.
At the end of the day, we should not be surprised that the smart devices we brought into our homes to listen to our every command are actually listening to us. What’s more, if the companies are using this data to improve their systems and software, then this will only benefit us in the future. The only question that remains is who else could get hold of our private conversations, and what they could do with that information if they did?