Policy Groups Ask Apple To Drop Plans To Inspect iMessages, Scan for Abuse Images (reuters.com)

More than 90 policy and rights groups around the world published an open letter on Thursday urging Apple to abandon plans for scanning children's messages for nudity and the phones of adults for images of child sex abuse. From a report: "Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter, which was first reported by Reuters. The largest campaign to date over an encryption issue at a single company was organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT). Some overseas signatories in particular are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.
  • Irrelevant now (Score:4, Insightful)

    by taustin ( 171655 ) on Thursday August 19, 2021 @12:22PM (#61708041) Homepage Journal

    It doesn't matter if Apple backs away from this, at this point.

    They have publicly announced that they can do this. Therefore, governments that want to use it to control politically inconvenient subjects (looking at you, China) will mandate that they do so, and probably mandate that they do so secretly.

    So, in the end, as one would expect from the Law of Unintended Consequences, the protests will result in all of the bad things that can be done with this being done, and secretly, and none of the good that is intended. Caused by the privacy advocates.

We know that the basic tools for doing the hash have been in iOS since at least 14.3, per other stories. Is that code accessible to non-Apple apps?

      If so, the problem already exists in the wild, and may exist already in the App Store.

    • none of the good that is intended

There never was any "good" intended. This was a (possibly misconceived) PR move by Apple to shed their encryption's image as a safe haven for terrorists and pedophiles. Problem is, if your trust chain is broken so you can catch the bad guys, it's broken for everybody else too.

    • China didn't need a public announcement to know Apple can do this. They didn't need protests to ask them to do it secretly. Why all the mental gymnastics to shift the blame to those darned privacy advocates? I suppose it's someone else's fault Apple decided to manufacture their stuff there, too.

    • They have publicly announced that they can do this.

      Meh.

      It was always obvious that they could do this, and a lot more. They can give Xi Jinping an account which allows him to look at all photos in all iCloud accounts if they want to. It wouldn't shock me if there's some bit of their privacy policy statement that can even be interpreted to allow that... creative interpretation of language can take you a long way.

      The fact that they've announced a willingness to do this more privacy-preserving sort of scanning (even if it's maybe not as privacy-preserving a

    • "as one would expect from the Law of Unintended Consequences, the protests will result in all of the bad things that can be done with this being done,"

      I agree and one of the bad things that will be done is for this to be weaponized.

      Whatever threshold Apple sets, 30, 40, whatever, someone will figure out how to upload that many illegal pictures to your phone. Apple will find them and report them. I don't see any way Apple can avoid this short of a major policy change, such as warning the phone owner that t

  • Just don't use iMessage. There are dozens of better alternatives.

I don't use Apple as my mail provider precisely because they can do this sort of thing. And supposedly they did, at some point, block delivery of email that contained certain keywords. If mail is encrypted with the key on the user's phone or computer, then Apple shouldn't be able to do this. End-to-end encryption both in transit and at rest for email, chat, video, etc. should be the default. The fact they don't do that tells me not to use their service. And law enforcement can subpoena user account informatio
If mail is encrypted with the key on the user's phone or computer, then Apple shouldn't be able to do this. End-to-end encryption both in transit and at rest for email, chat, video, etc. should be the default. The fact they don't do that tells me not to use their service.

        I don't think this works the way you think it works. Apple isn't sending your mail and attachments in plaintext to be scanned on their servers... The scanning happens on your device, comparing a hash of an image on your device to a library of hashes of child porn. Non-encrypted data never leaves your phone, and there's end-to-end encryption for just about everything [apple.com].

        There may be reasons to criticize this technology and policy, but "Apple's sending my mail in plaintext!" isn't one of them.
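        The basic shape of that on-device check is easy to sketch. A minimal toy, using SHA-256 purely to keep it self-contained; Apple's actual NeuralHash is a perceptual hash designed to survive resizing and re-encoding, and the real system hides the match result inside an encrypted voucher rather than returning a boolean:

```python
import hashlib

# Hypothetical blocklist of known-bad image digests (hex). Apple's real system
# uses a perceptual "NeuralHash", not SHA-256; SHA-256 is used here only to
# keep the sketch self-contained and runnable.
KNOWN_BAD_HASHES = {
    # SHA-256 of the empty string, standing in for a real entry
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_matches_blocklist(image_bytes: bytes) -> bool:
    """Hash an image on-device and compare against the local blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(image_matches_blocklist(b""))         # True (demo entry)
print(image_matches_blocklist(b"holiday"))  # False
```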

    • by ytene ( 4376651 )
I think you may be misunderstanding Apple’s announcement. Happy to be corrected if I am wrong, but the way I read it, if you take a photograph with a camera in your iPhone or iPad, it will be scanned by Apple’s servers as it gets transferred into your iCloud account.

      The way they announced it, they were not giving you any choice in the matter.

      This has nothing to do with iMessage.
Of course the best advice is simply not to use Apple products, but you aren't forced to upload your pictures to iCloud, are you?

        • Oh? And whose product *would* you use?

Notice how all the pushback and criticism of Apple here is from privacy advocates and civil rights organizations. Not one government that I've heard of, at any level, has stood up for its citizens and told Apple: "No. You will NOT violate our citizens' privacy by spying on their cameras."

          Moreover; if Google or Facebook or Microsoft or Amazon or whoever could honestly say... or were confident enough in their implementation that they could lie and be confident about not ge

it will be scanned by Apple’s servers as it gets transferred into your iCloud account

No. It will be scanned by the user’s device, which will then generate a cryptographic voucher: combined with enough other matching vouchers, it can be used to decrypt the content; if the image isn’t a match, it does nothing. That way they can still say that your pictures are encrypted before being uploaded to Apple’s servers and advertise how (supposedly) great their privacy is compared to the competition.
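        The "vouchers that combine" part is a threshold scheme. Here is a toy sketch of just that idea using textbook Shamir secret sharing; Apple's announced construction (threshold private set intersection with per-image "safety vouchers") is far more elaborate, but this shows why fewer than the threshold number of matches reveals nothing:

```python
import random

# k-of-n Shamir secret sharing over a prime field: the server can only
# reconstruct the decryption key once it holds at least `threshold` shares,
# i.e. once enough images have matched.
PRIME = 2**127 - 1  # a Mersenne prime, plenty for a demo secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for the account's voucher-decryption key
shares = make_shares(key, threshold=30, count=100)  # one share per matching image
print(recover(shares[:30]) == key)  # True: 30 matches unlock the key
print(recover(shares[:29]) == key)  # (almost surely) False: one short reveals nothing
```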

      • Apple has been HORRIBLE at explaining this new 'feature' (actually two features). Your description of this merges the two functions into one not-quite-correct function (don't worry, a lot of people have done this).

Feature #1: Scan iCloud for hashes that match known bad images (What is a 'bad' image? And who decides?). This one is easy to circumvent by not using iCloud (which I never have and never will). THIS function has nothing to do with iMessage.

        Feature #2: When an image is sent via iMessage, yo

That isn't true--this isn't just about iCloud or iMessage; the point is to have access to scan images on your phone. They're not supposed to be analyzing the photos you take if you're not using iCloud, but a lot of people don't even realize that they are using it. Riddle me this: how are they going to prevent you from uploading child abuse photos to their service without first analyzing the image to make a hash to compare?
          • I keep reading Apple's description of these new features, and they seem to contradict themselves. When I read their website, the following line led me to believe that only photos in iCloud are being scanned:

            new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos.

            So, they are comparing image hashes of images stored in iCloud, not your phone, right? Except:

            Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

            So, they are scanning on your phone, and NOT in iCloud? Have I mentioned that Apple is awful at describing how they are going to subvert our privacy?
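            Both quoted statements can be true at once if the comparison runs on the device but only as a step in the iCloud Photos upload path. A sketch with made-up function names (and SHA-256 again standing in for NeuralHash):

```python
import hashlib

KNOWN_CSAM_HASHES = {"deadbeef"}  # stand-in for the on-device hash database

def on_device_match(image_bytes: bytes) -> dict:
    """Step 1: runs locally, before anything leaves the phone."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    # The real "safety voucher" is cryptographic; Apple can't read it until
    # the match threshold is crossed. A plain dict keeps the sketch simple.
    return {"matched": digest in KNOWN_CSAM_HASHES}

def upload_to_icloud_photos(image_bytes: bytes) -> None:
    """Step 2: the scan is a gate on the upload path, not a background sweep."""
    voucher = on_device_match(image_bytes)
    print("uploading encrypted photo + voucher:", voucher)

# Photos that never enter this upload path are never scanned under the
# announced design -- hence all the "just don't use iCloud Photos" advice.
upload_to_icloud_photos(b"some photo bytes")
```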

    • Just don't use iMessage. There are dozens of better alternatives.

      Problem is, all messaging platforms besides SMS require that both communicating parties be on the same platform.

SMS has the same requirement.
You need a phone number from a mobile phone carrier, and it doesn't work on PCs and tablets without cellular connectivity.
Although there are some workarounds, such as Google Voice, it's still far from ideal. You can lose your number when you change country (or even worse, carrier), and messages sent to another country are not always free. The simple fact that the carrier has the ability to bill per message means SMS is crap.

        • by ls671 ( 1122017 )

SMS works fine with my VoIP providers. No cellular connectivity needed at all; I can send and receive with anybody from a PC or tablet.

...they haven't got a clue what actually is going to happen. If you don't want your images scanned (not your phone!), quit storing your porn in iCloud. Only an idiot would do that anyway.

Most people are idiots. Or at least, in cases like this, they behave like idiots. If you need a citation, just read some more comments here, or log into whatever social media and read the comments. It shouldn't take long to convince you how high the percentage of idiots is vs. non-idiots.

      • People are idiots for... not wanting their phone scanned for god knows what, in real time? Is that what you're saying?
        • You need to do some research. Apple is NOT scanning your phone. It will only scan images uploaded to iCloud. Read much? Sheesh. No wonder the Trump cult is so popular.

          • by Rujiel ( 1632063 )

            Maybe I can help your stupid ass learn to RTFA. You literally need an update on your phone to receive this "feature", which completely eviscerates your bullshit

            https://www.google.com/amp/s/a... [google.com]

            "Cloud-based photo storage systems and social networking sites already scan for child abuse imagery, but that process becomes more complex when trying to access data stored on a personal device.

Apple's system is less invasive in that the screening is done on the phone, and "only if there is a match is

        • No, I was saying most people are idiotic enough to store their porn in iCloud. It was a direct response to the statement made above.

I'm on the side of thinking Apple is making a really atrociously dumb move here. This will be abused. 100%. And if people keep themselves informed, Apple will lose customers over it.

  • Apple's motivation (Score:5, Interesting)

    by Dan East ( 318230 ) on Thursday August 19, 2021 @12:34PM (#61708083) Journal

I have a hunch about Apple's motivation, but Apple would never disclose this publicly. I think that government / law enforcement keeps pressuring Apple on moral grounds that their messaging and devices are too secure. Apple won't give the FBI access for this and that, and that can be used against Apple in a moral sense, in that they are protecting the worst of the worst (child pornographers). Yes, privacy is also a moral high ground, but the flip side of that is the ability to investigate and prosecute criminals to at least some extent.

    Thus, by coming up with some (theoretically) privacy-safe method of patrolling for child pornography, Apple could get at least one government vector of leverage off of them. All of these big companies (Facebook, Google, Apple, Microsoft) would much rather self-police than have government intrusion or laws that stifle them in some way. So if they see that government action is on the horizon, they'll try to stay a step ahead and use technology to get that focus off of them.

This is exactly it. I was incorrectly linking to the EARN IT Act of 2020 [wikipedia.org] last week as "proof" that Apple and others are legally required to do this sort of work, but that act hasn't yet been passed (a detail I had failed to note). Even so, the industry surely sees the legislation working its way through the system, so they know which way the winds are blowing. Putting practices like these in place is a sure way to take the wind out of Washington's sails and ensure that Congress is less likely to bring

      • National Security Letters are secret. We have no idea of the pressure being applied.

Also, caution: whatever Apple can read, they can also write. It is not uncommon to plant evidence [reason.com] when they can't find any.

    • by dfghjk ( 711126 )

      Don't share any of your hunches. First, this has nothing to do with "moral grounds", either government "moral grounds" or Apple "moral grounds" as if either of those exist. Second, this has nothing to do with Apple devices being "too secure". Instead, it has to do with Apple dealing with government pressure to intrude on people's privacy. Citizens have protections from intrusion from the government so government has private business do it for them. If Apple doesn't play ball, government makes doing bus

      • by ljw1004 ( 764174 )

        Don't share any of your hunches. First, ...

        On the contrary, Dan East, please do continue to share your hunches. I thought your hunch was interesting and I learned from it (even if I don't agree with it). While dfghjk made valid points and valid nits, they didn't touch the central plank of your hunch which is that this is a way for Apple to reduce government pressure - through the gamble that they can offer up this right now, and presumably get reduced (presumably US) govt pressure for a while until the US govt turns more authoritarian. The reason I

      • Don't share any of your hunches.

        Yet you go right ahead and feel free to provide your opinion regarding my opinion.

        I don't appreciate or respect your "worst of the worst" characterization

        And I don't care whatsoever about your feelings regarding my post, even if you sobbed a little because of it.

    • by gweihir ( 88907 )

      It is pretty clear Apple is bowing to pressure. That is the only plausible reason why they are not backing down after this massive backlash. Hence they have already been compromised and extending the search parameters is just a question of time. This also means their assurances they will keep this limited are completely worthless.

  • Just stop buying Apple products. It's all snob appeal anyway.
    • by dfghjk ( 711126 )

      It's not all snob appeal, but it might be worthy of consideration. The problem with that "solution" is that there is no reason to believe that other options do not do the same sorts of things. Regarding cell phones, Google has been doing this kind of crap already.

We once had some reason to believe that Apple championed its customers' privacy. That is no longer the case, but it doesn't mean anyone else does.

They’re still better about it than most. They don’t sell their customers’ data, unlike Google, FB, Amazon, etc., but your data is still available by warrant. The hash/match scheme they’ve introduced has (minor, probably ineffective) safeguards, and that concerns me. But then again, if you think that Amazon photos aren’t analyzed for ad targeting and worse, I’ve got a bridge to sell you (see section 1.1 of the Amazon Photos TOS).

        Apple is taking a lot of heat for this move, but people don

Apple should add several safeguards to prevent abuse of the CSAM system instead:
    - Have a public process for adding/removing/suspending hashes; this will ensure nation states do not misuse the system by adding their own hashes
    - Have some sort of public appeal process to deal with false positives
    - Set up some way to compensate for any damages caused by a false positive detection
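    For the first suggestion, one hypothetical shape is a Certificate-Transparency-style append-only log, so additions, removals, and suspensions are at least publicly auditable. Nothing like this exists in Apple's announced design; this is only a sketch of what a "public process" could mean mechanically:

```python
import hashlib
import json

def chain_hash(prev_head: str, entry: dict) -> str:
    """Each log entry commits to the previous head, so published history
    cannot be rewritten without every auditor noticing."""
    payload = prev_head + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

head = "0" * 64  # genesis value
log = []
for entry in [  # made-up example entries
    {"op": "add", "hash": "abc123...", "source": "NCMEC", "date": "2021-08-19"},
    {"op": "suspend", "hash": "abc123...", "reason": "appeal pending"},
]:
    head = chain_hash(head, entry)
    log.append((entry, head))

# Anyone who recorded yesterday's head can verify that today's log extends,
# rather than replaces, the history they already audited.
print(head)
```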
    • Have a public process for adding/removing/suspending hashes ..

How do you, as a member of the public, know what the original image behind a particular hash is, in order to judge whether that hash is worthy of being added or removed?

.. this will ensure nation states do not misuse the system by adding their own hashes.

What's to stop some nation state from compelling Apple to add new hashes under a gag order, under threat of being expelled from said nation state? Even if the hash list were public, what's to prevent Apple f

Apple doesn’t control the known CSAM hashes; that’s all handled by third parties, which itself introduces vulnerabilities.

It’s unclear how many matches trigger account review. Apple says that it’s a “threshold,” but that can mean anything. At least the nature of the hashing algorithm makes accidental collisions extremely rare (adversarial images are another issue, but those are more of a curiosity than a threat). Reporting is only done after human review, although what the review co

    • by gweihir ( 88907 )

They cannot. Obviously they are implementing this thing only because they could not withstand government pressure. So why would they be able to withstand pressure to leave out safeguards against abuse? They are compromised. They are just trying to keep a semblance of their honor intact for marketing reasons.

  • The people who have the keys to your OS have always been able to do whatever you can do on your device. There is no scenario where this feature will compromise your privacy any more than is already possible. This hand-wringing by people who should know better smells of ulterior motive.
    • Bingo. At least Apple has a published process. Amazon, Google, FB, etc? Not so much. Their TOS do not limit what they can do with your data at all, and they get away with it because they’re opaque.

  • This is the single biggest mistake Apple has made in the history of the company. Yes, even bigger than hiring John Sculley.

    Apple has spent years cultivating a privacy first image, and they threw it all away in one fell swoop.

    I'm having a hard time understanding the rationale behind this decision. It seems to go directly against everything they have been working towards.

  • Oh dear, does the writer think they're looking for flesh tones? As opposed to looking for files that have the same hash as known bad content?

A bit more sophisticated than looking for flesh tones, but very different from looking for files which have the same hash as known bad content.

      The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

      When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

      Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

      Source: Apple [apple.com].
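      The quoted flow is simple enough to mock up. Apple has not published its classifier, so looks_explicit() below is a placeholder with an arbitrary demo rule, and the UI steps are print() stubs; the point is that everything runs locally and nothing in this path is reported to Apple:

```python
def looks_explicit(image_bytes: bytes) -> bool:
    """Placeholder for Apple's on-device ML model (arbitrary demo rule)."""
    return len(image_bytes) > 0 and image_bytes[0] == 0xFF

def receive_photo(image_bytes: bytes, is_child_account: bool,
                  parental_alerts_enabled: bool) -> None:
    if is_child_account and looks_explicit(image_bytes):
        print("photo blurred; warning and resources shown to child")
        child_views_anyway = True  # pretend the child taps through the warning
        if parental_alerts_enabled and child_views_anyway:
            print("parents notified")  # parents, not Apple, not law enforcement
    else:
        print("photo displayed normally")

receive_photo(b"\xff\xd8\xff", is_child_account=True, parental_alerts_enabled=True)
```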

  • by timmyf2371 ( 586051 ) on Thursday August 19, 2021 @03:50PM (#61708647)

Like many people, I am very much in the dislike camp. In my case, I have recently acquired a Samsung device (you can take me at my word, or don't..) and am in the process of learning how to use it; in due course I will start backing up folders and files directly to my NAS.

    With that said, I am curious as to why Apple is so insistent that this won't and can't be expanded to other use cases, as well as why those in favour of the concept aren't in favour of expansion.

    Let's start by saying that child sex abuse is a heinous crime. I am in favour of criminalising those who abuse children and subjecting them to the full force of the law. However, child sex abuse is not the only crime that causes harm, and detecting images of such acts after they have taken place only serves to criminalise the viewers; if the images exist then the child has already been abused.

    I'd like to explore the ways in which the two technologies which Apple is deploying could be used to great benefit. I may well stray into the use of technologies which aren't in scope of this initiative, but which do or will exist in the near future.

    Revenge Porn

The act of sharing nudes or explicit videos after the end of a relationship affects both children and adults. There are also mobile apps such as Snapchat where people share suggestive or nude pictures with the expectation that these will 'self-destruct'.

With the neural hashing technology, we could see the process become more secure and eliminate cases of revenge porn. When a relationship ends or when a self-destructing message is shared, a hash of the 'cancelled' images could be added to iOS devices worldwide, thus preventing the sharing or viewing of these private images.

    The same principle could be used for images which present a national security concern. The exact definition would vary for each country, but expected results could well include photos of prisoners in orange jump suits, videos of bombs hitting hospitals or even photographs of tanks parading in city squares.
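    To make the mechanical point concrete before moving on (purely hypothetical code; Apple has announced nothing like per-purpose lists): once devices accept pushed hash-list updates, nothing in the mechanism itself is specific to child abuse imagery. Both of the use cases above are just extra entries in the same feed.

```python
# Hypothetical device-side blocklist fed by pushed updates. The mechanism is
# content-agnostic: whoever controls the feed controls what a billion devices
# flag, whatever the original justification was.
device_blocklist = set()

def push_update(hashes, reason):
    device_blocklist.update(hashes)
    print(f"{len(hashes)} hashes added ({reason})")

push_update(["9f2c..."], reason="known CSAM (the announced use)")
push_update(["77ab..."], reason="'cancelled' revenge-porn images")
push_update(["c01d..."], reason="photos of tanks parading in city squares")
```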

    Missing Children

    Child abduction is not a new thing. We have seen photos on milk cartons, AMBER alerts and other such initiatives which have varying rates of success.

    Apple says that there are one billion iPhones in use worldwide, so let's do a modern take of what the search for missing children could look like.

We know that the technology to scan for particular faces exists in iOS, because the photo gallery helpfully groups all photos of a person's face together. We also know that iMessage will acquire this new feature which can scan images for particular content types.

So let's marry the two together: in the minutes after a child abduction, the parent can upload as many photos of the victim as they have. Apple will then push an alert to all the iOS devices around the world. Hopefully someone, somewhere has taken or will take a photo where the missing child happens to be in the background, and boom: we get a time and location for the police to act.

    Fugitives

    The same principle as outlined for missing children could apply here, except this time with images of criminals uploaded by law enforcement.

Naturally, we would need to trust law enforcement to use this feature correctly, but if we could quickly identify their appearance in the background of images, we could get a location for any and all fugitives, including those suspected or convicted of violent crimes, or even those who committed a minor crime such as shoplifting or drug use.

    Unlawful Protests

    The concept of an unlawful protest has started to make its way to the western world. Even the UK, which purports to be a western democracy, has introduced laws around curbing protests and there is even the concept of needing to apply to the police if you wish to march.

    The concept of face identification, which does exist today, could be used alongside the technology deployed to the one billion iOS devices out there in the world. Those who dar

    • by xalqor ( 6762950 )

      Your ideas are essentially mass surveillance using all private property as government sensors. Someone taking a photo at a park will not know which faces the government is looking for, or why.

      Their device will eventually just report on everything it captures, because someone will make the case that if we only start looking for criminals after their crime, we're losing valuable intelligence. Better if we have a complete profile on the whereabouts and activities of everyone all the time so when they do someth

      • This is precisely part of the point I was trying to get across... I did mention at the top of my post that I am against any of this and have actually moved away from Apple, and I do talk about the slippery slope argument towards the end. I apologise if I wasn't clear enough but the purpose was to try and demonstrate the ease of moving from acceptance of (a), to accepting (b) and eventually realising that we have reached (z).

        I'm no marketing expert, but even with the way I described my ideas, I suspect a lar

        • by xalqor ( 6762950 )

          I read your post again. Here are the two parts that I think caused me to ignore the disclaimer and think you were taking the opposite position:

          I'd like to explore the ways in which the two technologies which Apple is deploying could be used to great benefit.

          And:

          It is illogical to create this technology and insist that it will only be used for one narrow field, as that would waste its potential.

          Some people might call that a slippery slope. I encourage other readers, who are also against this technology, t
