Apple Contractors 'Regularly Hear Confidential Details' on Siri Recordings, Report Says (theguardian.com)
Alex Hern, reporting for The Guardian: Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or "grading," the company's Siri voice assistant, the Guardian has learned. Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world.
They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with, and whether Siri's response was appropriate. Apple says the data "is used to help Siri and dictation ... understand you better and recognise what you say." [...] Apple told the Guardian: "A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The company added that a very small random subset, less than 1% of daily Siri activations, is used for grading, and those used are typically only a few seconds long. Further reading: Google Contractors Are Secretly Listening To Your Assistant Recordings; and Amazon Workers Are Listening To What You Tell Alexa.
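The "grading" pipeline the summary describes (a small random subset of activations, reportedly under 1%, passed to reviewers with the Apple ID stripped off) can be sketched roughly as follows. This is a hypothetical illustration, not Apple's actual implementation; the names and the exact rate are assumptions.

```python
import random

# Assumed sampling rate for illustration; the article only says "less than 1%".
GRADING_RATE = 0.01

def select_for_grading(requests, rate=GRADING_RATE, rng=random):
    """Return the subset of requests sampled for human review.

    Each request is sampled independently, so roughly `rate` of the total
    volume reaches reviewers. The account ID is dropped before the clip
    is handed over (as Apple claims it is).
    """
    sampled = []
    for req in requests:
        if rng.random() < rate:
            # Dissociate the clip from the user's account before review.
            sampled.append({"audio": req["audio"], "account_id": None})
    return sampled

requests = [{"audio": f"clip-{i}", "account_id": i} for i in range(100_000)]
graded = select_for_grading(requests)
print(len(graded))  # roughly 1,000 of 100,000
```

Note that per-request sampling like this is exactly why "it's only 1%" is cold comfort: with hundreds of millions of daily activations, 1% is still millions of clips in human ears.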
Eventually people are going to notice (Score:5, Insightful)
We keep a camera, microphone and GPS tracking device on our person nearly 24/7/365. You couldn't invent a better surveillance network if you tried.
Re: (Score:1)
"We"? What do you mean "We", white-eyes?
Re: (Score:2, Insightful)
"What Orwell failed to predict is that we'd buy the cameras ourselves, and that our biggest fear would be that nobody was watching."
-- Keith Lowell Jensen @keithlowell [twitter.com]
Re: (Score:2)
Re:Eventually people are going to notice (Score:4, Insightful)
You misunderstand how this works. You record everyone. Then if ever for some reason they need to nail you to a tree, all they have to do is go back in the archives and find something you're guilty of - we are all guilty of something.
"If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." -- Cardinal Richelieu
Free software can fix this problem. (Score:2)
Speak for yourself; some of us don't. And I'm not just talking about Richard Stallman, who, as former Free Software Foundation lawyer Eben Moglen noted in virtually every one of his talks, was right again.
The root of the problem isn't carrying around such a device, it's that the device is run on proprietary software—software you cannot trust. Proprietary software is often malware [gnu.org] and this w
Re: (Score:2)
The issue isn't preventing the leak of data once collected (and thus going on about the terms of service), it's allowing the user to choose which data is collected, and by whom. The data isn't "stored in 200 different drive arrays in different data centers", the data is coming from the mouths of people near the computer. Thus the issue is whether the computer collects that data to begin with. The freedom to determine how the user's computer runs requires free software.
Increasingly, you can NOT disable the assistant. (Score:4, Informative)
Re: (Score:2)
At least with Android, you can log in to Google and see everything that it's ever recorded. If Google is being forthright, this is a pretty good amount of transparency. You correctly have figured out that the only way to opt out is to not use the feature at all. I don't use it on my phone, but I do like the Google Home smart speakers at home and it is quite entertaining to see what it has recorded "accidentally".
Re: (Score:1)
At least with Android, you can log in to Google and see everything that it's ever recorded.
If that's the case, why do I get Google reminders on my phone to book a hotel for my stay (because I just got an email with travel-related details) when, according to this web site you're talking about, there is no data on me?
Or why do I get a Google reminder on my phone that the 0% promotional period on my credit card is about to end, even though Google insists they have no data on me?
I would like to think that they're telling me the truth with all of those blank pages, but their actions suggest otherwise.
Re: (Score:2)
Hi, I think you are talking about them tracking you with cookies while we are talking about recordings kept from their voice products. Different topics.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Have you tried to nuke it from orbit? Just to be sure?
How to disable Google Assistant (Score:1)
Anonymous to avoid karma whoring
Nonfree software never puts users in real control. (Score:2)
There's no way to be sure this will do the job. It's easy for proprietors to ignore user settings and have the software do whatever the proprietor wants that software to do. We can't be sure of what a proprietor will do in the future based on what they've chosen to do or not do thus far. The underlying power is the real issue, the core injustice—users aren't in control of their devices (regardless of technical talent or willingness) if that device runs nonfree software (software that doesn't respect a
Re: (Score:3)
(Samsung Galaxy S5 owner here)
Don't worry: there is enough unremovable bloatware on it that if you say "Okay Google" it takes between 10 and 30 seconds to turn on the microphone, which is usually enough time for the request to time out and cancel. It will instantly display the white border indicating that Google Assistant has started, but it won't actually be listening yet.
Alternative: Turn off the "okay google" prompt in the options. I changed mine to require you to hold down the home button for 2 se
Re: (Score:2)
However, at least in the past, they would allow you to disable the assistant. I.e., you could disable "Hey Google" or whatever keyphrase the device was listening for. I just got a Samsung Galaxy 5e with the latest Android on it. I looked and looked for a way to disable the assistant, and it appears that it's now impossible.
It is entirely possible and takes 2 seconds on iOS with the latest phones and the latest iOS software. And it'll automatically disable the assistant on your watch if you have an Apple Watch, too.
Re: (Score:2)
Google Assistant is part of the Google launcher. Either use a different launcher, or go the Assistant screen, tap your profile icon and use the settings to disable voice activation or the assistant entirely.
Re: (Score:2)
You can take microphone permission away from the Google app. The down side is that you can then never use voice search. The up side is that hotword detection stops working, too, even with the search assistant on your launcher screen.
I'd prefer they just bring back the option to turn it off in Q, though
Hell F no!!! (Score:1)
"...understand you better and recognise what you say."
This is the LAST thing I want something in my house doing. These "tech companies" can take this BS and shove it up their asses!!
What I want to know is... (Score:1)
Who wears their Apple Watch while having sex? Are you REALLY that desperate to get your 10,000 steps in that you're going to record your action in the bedroom as well?
Re: (Score:2, Funny)
Who wears their Apple Watch while having sex? Are you REALLY that desperate to get your 10,000 steps in that you're going to record your action in the bedroom as well?
Re: (Score:3)
Who wears their Apple Watch while having sex?
I have a couple responses.
A) The moment came on too quickly to bother removing the watch.
B) The watch was removed but dropped close enough that the microphone picked up everything.
Note also that Siri exists on other Apple products like the iPhone, MacBook, and HomePod. There are plenty of reasons any of these devices might be in microphone range of where one is having sex, other than being attached to one's wrist.
Re: What I want to know is... (Score:2)
Re: (Score:2)
The article seemed to mention that the Apple Watch was the biggest offender when it came to accidental Siri activations.
It would be amusing to see if any of these recordings have made it out into the wild... it would probably do more to convince people to take their privacy seriously than a written article would.
Re: (Score:2)
I don't wear an Apple watch (Seiko Kinetic SCUBA if you really care), but there are situations where one might leave their watch on while having sex:
This is what the customers expect anyway (Score:1)
Re: (Score:2)
I'd regularly download hundreds of recordings each day to listen at home to evaluate the performance of th
Re:This is what the customers expect anyway (Score:5, Insightful)
Before any person buys one of these microphone search things, they always ask themselves how much they care if strangers can hear everything in their lives. The ones who proceed are the ones who are ok with it. It's completely opt in, ergo: non-story. People are allowed to fuck themselves over and that's not going to change no matter how much the left tries to be everyone's nanny.
You overestimate the knowledge and thought of most consumers. Talk to people for 5 minutes and you'll quickly see that the idea that someone else may listen to their Siri/Google query doesn't occur to them. Even when you make that clear, they still need help understanding the implications of that.
The average consumer buys because it has some specific feature they want or out of brand loyalty. Greater implications of features they may not even be aware of aren't remotely part of the equation.
Re: (Score:2)
they still need help understanding the implications of that.
Oh this should be good. Please do tell me what are the implications of a random contractor hearing some out of context recording? Is it for all those times I shout my bank name, login and password across the room while calling my first pet by name to come get dinner?
Re: (Score:2)
they still need help understanding the implications of that.
Oh this should be good. Please do tell me what are the implications of a random contractor hearing some out of context recording? Is it for all those times I shout my bank name, login and password across the room while calling my first pet by name to come get dinner?
Thank you for proving my point.
Your narrow vision about this one case is part of the problem. It's not just this one thing, it is the sum of all the little things that are known and not known. People seem to be in a mad dash to post as much of their life to the internet as possible. We have cameras in our house and outside watching us. We have fitness trackers and GPS devices tracking our movements. We have devices that notice when we are home or not to determine what our AC/heat should be set at. We have c
Re: (Score:1)
I poop at work, so people hear that anyway.
Re: (Score:2)
I hope you step into the bathroom first.
how much you want to bet (Score:2)
Hard button for Siri (Score:3)
I would prefer Apple put a hard button on for Siri. Only when it is on should the phone listen for "Hey Siri." I can use that when driving and want to talk without holding the phone. If the button is off, it should only listen locally (no data should be uploaded) and ask the user if they want to turn on Siri listening; only if the user says yes should it start listening. This two-factor confirmation could prevent virtually all unintended listening. Once I was having a fun conversation with a friend when Siri heard a harassment-type statement and told them to contact law enforcement or call 911.
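The two-factor scheme proposed above (a physical switch gating the wake word, plus an explicit user confirmation before listening is enabled) could be sketched like this. All names and behaviors here are hypothetical; this is just a model of the proposal, not how any shipping assistant works.

```python
class GatedAssistant:
    """Hypothetical sketch of the proposed two-factor listening scheme:
    a hardware switch must be on before the wake word is honored, and
    while the switch is off, nothing leaves the device at all.
    """

    def __init__(self):
        self.switch_on = False  # state of the physical hard button
        self.uploads = []       # audio that actually left the device

    def hear(self, utterance, user_confirms=False):
        """Process audio and report what the assistant does with it."""
        if "hey siri" not in utterance.lower():
            return "ignored"
        if self.switch_on:
            # Switch is on: the wake word works normally and the query
            # may be sent to the server for processing.
            self.uploads.append(utterance)
            return "handled"
        # Switch is off: handle the wake word locally only, and ask the
        # user whether to enable listening; upload nothing unless they agree.
        if user_confirms:
            self.switch_on = True
            return "listening enabled"
        return "local prompt only"

assistant = GatedAssistant()
print(assistant.hear("Hey Siri, call mom"))  # "local prompt only"
assistant.switch_on = True
print(assistant.hear("Hey Siri, call mom"))  # "handled"
```

The key property is that accidental activations while the switch is off never reach the network, which is exactly the class of recordings the contractors in the article were reportedly hearing.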
Re: (Score:2)
I have the "Listen for 'Hey Siri'" setting turned off, so it only triggers when I hold down the phone/watch trigger button.
Not sure if that's possible on newer iPhones though. It'd be pretty annoying if you can't disable "Hey Siri" on, say, the iPhone X.
Re: (Score:2)
I have the "Listen for 'Hey Siri'" setting turned off, so it does only trigger when I hold the phone / watch trigger button down.
Not sure if that's possible on newer iPhones though. It'd be pretty annoying if you can't disable "Hey Siri" on, say, the iPhone X.
I can confirm that it takes exactly 2 seconds to disable Siri on the latest iPhone with the latest iOS. Disabling it on your phone automatically disables it on your watch. Your voice recordings may persist if you have dictation enabled; otherwise they are deleted from all Apple servers as soon as you turn off Siri and dictation.
Re: (Score:2)
I can confirm that it takes exactly 2 seconds to disable Siri on the latest iPhone with the latest iOS.
Right, but on that latest iPhone is there a way to disable "Hey Siri" voice activation while still being able to manually trigger Siri, should you want to use it on occasion?
On the older iPhones, like my 6S, this is possible - I can have "Hey Siri" turned off, but still get Siri by holding down the Home button for a second or two. But the new phones don't have a Home button...
Re: (Score:2)
I can confirm that it takes exactly 2 seconds to disable Siri on the latest iPhone with the latest iOS.
Right, but on that latest iPhone is there a way to disable "Hey Siri" voice activation while still being able to manually trigger Siri, should you want to use it on occasion?
On the older iPhones, like my 6S, this is possible - I can have "Hey Siri" turned off, but still get Siri by holding down the Home button for a second or two. But the new phones don't have a Home button...
Yes, it works the same on all of the iOS devices. On the Face ID phones, you just hold the lock button instead of the home button.
Re: (Score:3)
Re: (Score:2)
Thank you, that's good to know.
"Don't tell Siri..." (Score:2)
Person: Don't tell Siri
iPhone: ding ding...
Hey Siri (Score:2)
I'm having sex!
Re: (Score:2, Informative)
"Would you like me to search the web for 'Jergens' and 'Kleenex'?"
Slippery Slope? (Score:5, Interesting)
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or "grading," the company's Siri voice assistant, the Guardian has learned.
Emphasis mine. And...
User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
For now. How long before a law is passed that requires companies to report any illegal activity discovered? How long before a disgruntled employee who's been saving all sorts of embarrassing conversations uses them to blackmail people, and/or the company? Or worse, just dumps their ill-gotten conversations on a public website for anyone to browse and mock?
In short, saving these queries for any length of time is a really monumentally bad idea and we need to tell companies we want them to stop doing this. I'm all for personal assistants, but saving anything said to that assistant is pretty much an invasion of privacy. Many consumers probably have no clue that everything they say to their phone is being saved and reviewed by grunts somewhere.
Stop being naive (Score:2)
Poop in the pool (Score:1)
Activate Siri on a loop and every time it dings, play a recording saying something completely nuts. Or fart noises, people screaming, sounds of the Taliban, or whatever. If Snapple reports it to the authorities, they out themselves. You show the authorities the device setup. Hilarity ensues. Then you're shipped to Guantanamo.
Big Tech's Newspeak (Score:1)
Big Tech has redefined privacy to mean whom they share the data they collect with. It should be about what they are collecting, with explicit consent required to do so.
At apple (Score:1)
Security is just another lie we tell to get you to buy our crap.
Vote With Your Dollars (Score:2)