Apple Confirms It Will Begin Scanning iCloud Photos for Child Abuse Images (techcrunch.com) 135

Apple will roll out a technology that will allow the company to detect and report known child sexual abuse material to law enforcement in a way it says will preserve user privacy. From a report: Apple told TechCrunch that the detection of child sexual abuse material (CSAM) is one of several new features aimed at better protecting the children who use its services from online harm, including filters to block potentially sexually explicit photos sent and received through a child's iMessage account. Another feature will intervene when a user tries to search for CSAM-related terms through Siri and Search.

Most cloud services -- Dropbox, Google, and Microsoft to name a few -- already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM. But Apple has long resisted scanning users' files in the cloud by giving users the option to encrypt their data before it ever reaches Apple's iCloud servers. Apple said its new CSAM detection technology -- NeuralHash -- instead works on a user's device, and can identify if a user uploads known child abuse imagery to iCloud without decrypting the images until a threshold is met and a sequence of checks to verify the content are cleared. News of Apple's effort leaked Wednesday when Matthew Green, a cryptography professor at Johns Hopkins University, revealed the existence of the new technology in a series of tweets. The news was met with some resistance from some security experts and privacy advocates, but also users who are accustomed to Apple's approach to security and privacy that most other companies don't have.
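To make the threshold idea concrete, here is a minimal Python sketch of on-device matching that only flags an account once the number of database hits crosses a threshold. Every name and number below is a hypothetical placeholder; Apple's actual NeuralHash and safety-voucher protocol are cryptographically far more involved than a plain set lookup.

```python
# Minimal sketch of threshold-based matching -- hypothetical, not Apple's protocol.
# Assumes `known_hashes` holds perceptual-hash values of known CSAM supplied by a
# child-safety organization, and `photo_hashes` are hashes computed on-device for
# photos queued for upload to iCloud.

MATCH_THRESHOLD = 30  # made-up value: hits required before anything is escalated

def count_matches(photo_hashes: list, known_hashes: set) -> int:
    """Count how many of the device's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_escalate(photo_hashes: list, known_hashes: set) -> bool:
    """Nothing is decryptable or reviewable until the match count crosses the
    threshold, mirroring the 'until a threshold is met' language in the summary."""
    return count_matches(photo_hashes, known_hashes) >= MATCH_THRESHOLD
```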

    • https://www.apple.com/child-sa... [apple.com]

      This is going to be baked in the OS, and appears to be situated around reporting already known images sent or received by a user.

      Basically it's an anti virus scanner for CSAM that calls the authorities when a hash matches.

      • by quenda ( 644621 )

        This is going to be baked in the OS, and appears to be situated around reporting already known images sent or received by a user.

        Basically it's an anti virus scanner for CSAM

        Good analogy. But if this runs locally on your device, it must be either downloading a massive hash-table of known CP, or uploading a hash of all your photos.
        Even if just a few megabytes of hashes per phone, that is petabytes total. What is it supposed to achieve?
        Catching some sad old perves circulating vintage known CP who are dumb enough to keep it in iCloud? It might result in some suicides and prison terms, but is there any evidence it will actually protect kids?

        An AI that detected *new* CP would be more effective, b
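The "petabytes total" arithmetic above is easy to check under some assumed numbers (these are guesses for illustration, not figures from Apple):

```python
# Back-of-the-envelope only; every number here is an assumption, not from Apple.
known_images   = 100_000          # assumed size of the known-CSAM hash database
bytes_per_hash = 32               # assumed size of one hash entry
active_devices = 1_000_000_000    # order-of-magnitude estimate of iPhones in use

per_device_mb = known_images * bytes_per_hash / 1e6                     # ~3.2 MB per phone
total_pb      = known_images * bytes_per_hash * active_devices / 1e15   # ~3.2 PB

print(f"{per_device_mb:.1f} MB per device, {total_pb:.1f} PB shipped in aggregate")
```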

        • I bet they send a hash of all of your photos. Then they can track your photos and where they're sent via the hashes. What a wonderful way to add even more surveillance!
          • by Sloppy ( 14984 )

            No need to speculate; they say it:

            Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices.

        • Maybe it's just not that big? Like, perhaps there's a known 'corpus' of creep shots they find repeatedly in raids; hash against that and you'll catch the majority of it, or something.

    • by gweihir ( 88907 )

      That is basically the motivation behind this. The illegal images claim is just a nice, convenient pretext (a.k.a. "lie"). Of course, they claim to "preserve user privacy", but that is also just an obvious lie.

  • I have no illegal things in iCloud... but I keep everything encrypted anyway so *shrug*
    • Re:Encrypted (Score:5, Interesting)

      by cygnusvis ( 6168614 ) on Thursday August 05, 2021 @03:16PM (#61660319)
      LOL, I actually read the article after I commented... and the fact that Apple has the ability to decrypt my files basically means that encryption is useless. I will just not keep files in iCloud *shrug*.
      • Try reading it again. They're scanning on the phone, before it is encrypted.

        • They are doing that, but then they can also decrypt the images, presumably to allow them to report you to law enforcement.

          • They are doing that, but then they can also decrypt the images, presumably to allow them to report you to law enforcement.

            You presume incorrectly. Apple has always held the encryption keys for iCloud Photos because iCloud Photos are sharable via the web (i.e. they have to have the keys). Other web-based photo sharing services hold the keys as well, and most of them are scanning for child porn too. After all, recent legislation makes them liable for it [wikipedia.org], so while Apple has been getting a lot of attention, it's actually been happening silently at all of the major services for quite a while now.

            • I spoke out of turn and would like to correct my mistake.

              Apple has up to this point had the keys for content that you make publicly available via the web, but not for all of your photos that you upload to iCloud Photos. I was in error to suggest otherwise, because those have historically been encrypted in a manner that Apple cannot decrypt.

              That's still largely unchanged today. Apple doesn't have any way to decrypt typical photos (unless you enable web access, as already mentioned). The big change here is th

        • Try reading it again.

          I suggest following your own advice.

        • So chewing up battery then? Got it.

      • Didn't everyone already know this? I mean, when the FBI was trying to force them to open up that California terrorist's phone, Apple told them forcing an iCloud backup would give them access - and told them how to do that. (But the FBI was more interested in making political hay, so they did their own thing.)

    • Perhaps Apple simply want to deter users from putting stuff on their servers that could incur liability to Apple if Sec. 230 is killed.

  • by Anonymous Coward

    Maybe the editors here should start scanning for dupes.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      The editors are too busy double checking the contents of their iCloud

    • by sabri ( 584428 )

      Maybe the editors here should start scanning for dupes.

      It's not a dupe. The previous post mentioned scanning your phone for content. That's a big difference from scanning their cloud servers.

      Don't upload your illegal shit to other people's computers. Sounds like a Crime 1-oh-1 lesson.

    • We'll have world peace, honest global government by consensus, and reverse global warming before that happens.

  • Didn't they just do that a few hours ago... Dupe

  • Taken at face value, it looks like this feature is limited to things that connect to a child's account. Though I am skeptical of this technology's effectiveness in fulfilling its purpose, it is not nearly as far-reaching as initially reported here.

    If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!

    • by AmiMoJo ( 196126 )

      Seems pointless to limit it to children's accounts. It's just a database of known image checksums, and children are not likely to be sharing known child abuse images.

      The tech detecting unknown images makes more sense for children, but I'd be surprised if it actually worked. All previous attempts have failed.

    • If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!

      Might explain why Android doesn't have an iCloud. Too much of a headache for too little benefit.

      • If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!

        Might explain why Android doesn't have an iCloud. Too much of a headache for too little benefit.

        Google Drive is out there, samizdat, and they already scan for the illegal shit. They were doing it before Apple too. The summary even notes that.

        • It's not a mandatory part of Android unlike iCloud.

          • It's not a mandatory part of Android unlike iCloud.

            • Where did you get the idea that iCloud is mandatory? I've had iPhones for a decade now, and this is news to me, because I've opted out.

            • Back when I had iDevices, they'd turn it on and take data whenever there was an update or refresh. Even if you checked after every use, they'd already have taken some or all of it, rendering any privacy ambitions moot. Remember when cloud accounts were compromised and celebrities had personal images there despite not turning on cloud storage?
              • Back when I had iDevices, they'd turn it on and take data whenever there was an update or refresh. Even if you checked after every use, they'd already have taken some or all of it, rendering any privacy ambitions moot. Remember when cloud accounts were compromised and celebrities had personal images there despite not turning on cloud storage?

                I know they ask if I want to turn on iCloud. I have turned it on to see what it's all about. I'm not certain how the celebrities were compromised; I suspect they had no idea what they were doing and turned it on without thinking. I can't find anything in iCloud the times I've checked it out. Found nothing. Turned it off. Anyhow, I don't have any nekkid pictures on my phone.

    • Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life [eff.org]

      We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

  • by logicnazi ( 169418 ) <gerdesNO@SPAMinvariant.org> on Thursday August 05, 2021 @03:23PM (#61660367) Homepage

    This is not fucking ok. I'd much rather Apple give up on end-to-end encryption than add fingerprinting for illegal content to end devices. As always, the danger is that the most disgusting, awful thing will be used as an excuse to violate a general principle, and once you've crossed that line there is no natural stopping point.

    First, it will inevitably be abused. Someone will figure out how to make a seemingly fine photo match and send it around. Someone will figure out how to excerpt enough of the non-illegal content of an image registered in their abuse database and stick it in another image to get it recognized.

    Also, once you say that you can force-load a program for checking for illegal content onto a phone, where does it stop? With recent advances in machine learning you could learn to identify drug sales in text messages with decent probability... or tax fraud or whatever. Maybe not perfectly, but surely with sufficient accuracy to justify a warrant. And what if the Chinese government demands they also report back fingerprints of mentions of Tiananmen or whatever?

    • by I75BJC ( 4590021 )
      There are reports of cell phones, tablets, and computers being searched by federal government agents, and of Grandma's photos of her grandchildren resulting in Grandma being harassed and, in some cases, ending up on the sexual predator list. That happens with human beings who have at least some capacity for judgement. How much worse will it be when AI is used? AI does not have the discernment of even a poorly functioning human in areas requiring actual moral judgement.

      "It's for the children!"
      • by logicnazi ( 169418 ) <gerdesNO@SPAMinvariant.org> on Thursday August 05, 2021 @07:31PM (#61661463) Homepage

        TBF they aren't actually considering using AI per se to identify the photos. Rather, they use a fingerprinting system and a giant database of child porn images (somehow this is maintained or allowed by US law enforcement for this purpose), so they aren't training an AI to guess whether unseen images are child porn but rather checking whether the user has an image that's on the list.

        However, remember that the fingerprinting system needs to be robust against both reencoding/recompression and resizing/backgrounds/etc. The fact that it's *not* intelligent means it has no way of knowing whether the component of the image it is recognizing actually displays abuse or not.

        So I'm expecting that someone is going to find an image on the child porn list with a large section of purely mundane content (e.g. some furniture or crap in the background) and will put that into some meme or innocuous photo and send it around. I'm kinda hoping someone does this relatively soon with a popular meme or something just to illustrate how bad an idea this is.

        But once you are doing some fingerprinting which can be tricked it's not clear why this is different in kind from using ML to probabilistically detect new images or other illegal activity.
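For readers unfamiliar with how such fingerprints survive re-encoding and resizing, here is a toy difference hash (dHash) in Python using Pillow. It is not NeuralHash, just a standard perceptual-hash technique that illustrates both the robustness and the collision risk described above.

```python
# Toy perceptual hash (dHash) -- for illustration only, NOT Apple's NeuralHash.
# Requires Pillow: pip install Pillow
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    """Downscale to (hash_size+1) x hash_size grayscale, then record whether each
    pixel is brighter than its right-hand neighbour. Recompression, mild resizing,
    and small edits tend to leave most of these bits unchanged."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes (lower means more similar)."""
    return bin(a ^ b).count("1")

# Two encodings of the same picture usually land within a few bits of each other,
# which is also why a benign region excerpted from a listed image could, in
# principle, drag an otherwise innocuous composite close to a database entry.
```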

    • by suss ( 158993 ) on Thursday August 05, 2021 @03:48PM (#61660517)

      Like they don't already do that for their Chinese masters... It's the price of doing business over there.

      • I don't think they do. The way China enforces its viewpoint controls online isn't usually by trying to make evasion completely impossible or by catching every instance of someone talking about something, so I kinda doubt they are demanding this from Apple already. Indeed, I think this kinda porous approach makes the system more effective and ominous, because it means that most normal users can evade or escape the inevitable overblocking, but it discourages mentioning it and ensures that high profile individua

    • by AmiMoJo ( 196126 )

      The boat sailed long ago on forced loading. Apple controls iOS, no side loading, no replacing the OS with your own, and no uninstalling Apple's apps.

      Most iPhone users have a folder full of Apple crapps that they don't want to use but can't uninstall.

      • Privacy and control are different issues. Apple hasn't been telling users they will have control over their phones and what is on them. Indeed, they've been touting their walled garden and forced defaults/options as benefits protecting users (no worry about side loading malware) etc..

        But, at the same time, they've been doing a major PR (and I believe lobbying) push on privacy talking up new features like private browsing as well as end-to-end encryption of messages and phone encryption. They've even been

    • by tlhIngan ( 30335 )

      Well, it runs on your own phone, for starters. It's not Apple running it on iCloud.

      I'm not convinced it works on arbitrary photos - it likely only works against the existing databases, albeit with a bit of "AI" to get around trivial edits meant to screw up the hash.

      After all, it's a bit difficult to determine if a photo with a bit too much flesh tone is legit, so I'd have real doubts Apple is scanning photos. Instead they'd be scanning downloaded images and likewise when those images are shared with others.

      You have t

    • by mjwx ( 966435 )

      This is not fucking ok. I'd much rather Apple give up on end-to-end encryption than add fingerprinting for illegal content to end devices. As always, the danger is that the most disgusting, awful thing will be used as an excuse to violate a general principle, and once you've crossed that line there is no natural stopping point.

      Told you I did, listen you did not (perhaps not you in particular, but Apple fanboys in general that you may or may not be, so select as appropriate).

      For years Apple fanboys have crowed that Apple cares about their privacy, Apple protects them from law enforcement, Apple does no wrong whilst Google spies and snitches. Well here's a big mug of "I fucking told you so" because I told you that Apple does it, they're just being less honest about it.

      Now Google may advertise but it's unobtrusive and definit

  • Baby pictures (Score:5, Insightful)

    by kackle ( 910159 ) on Thursday August 05, 2021 @03:35PM (#61660441)
    Someone here mentioned how his mother liked to show his baby pictures which included him standing naked next to the bathtub. It's interesting because he finds it embarrassing (pardon the pun), his mom thinks it's cute, most everyone else doesn't care at all but a pedophile might find it arousing. So, the perception is truly in the eye of the beholder. Now Apple/others will stick their opinion into the mix. I hope a lot of mothers don't get inconvenienced/hurt by their decision-making AI. That can't happen, right?
    • by AmiMoJo ( 196126 )

      This actually happened in the UK. An artist put on an exhibition that included a photo of her children nude at the beach. The newspapers found out and went nuts. It was at the height of a paedophile panic.

      Actually, come to think of it, there's a similar situation in Japan: Takako Kido sometimes photographs nude children (usually with their mothers), and I follow her on Instagram.

      https://www.instagram.com/taka... [instagram.com]

    • ... known child abuse imagery ...

      Is this checking an on-device photo against a database of hashes of known images? Given the "NeuralHash" label, it suggests flagging any hash that is numerically 'similar' to a known hash. The "Neural" part also suggests it will attempt to identify nipples and other body parts, then demand an Apple contractor look at your photo to confirm. Does it escalate because a child isn't wearing clothes or it meets some definition of sexual grooming?

      ... child sexual abuse material ...

      What does that mean? Parents frequently take photos of their fam
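On the "numerically similar" question raised above: an exact cryptographic hash is useless for this because one changed pixel changes the digest completely, so any workable system has to do some kind of nearest-match lookup. A naive version of that check might look like the Python below (the distance threshold is made up; none of this is from Apple's published design).

```python
import hashlib

# Exact matching breaks on any edit: flip one pixel and the SHA-256 digest changes.
def exact_match(image_bytes: bytes, known_digests: set) -> bool:
    return hashlib.sha256(image_bytes).hexdigest() in known_digests

# A similarity check instead asks whether a perceptual hash lands "close enough"
# (in Hamming distance) to any known entry. MAX_DISTANCE is a made-up tuning knob.
MAX_DISTANCE = 10

def similar_match(phash: int, known_phashes: list) -> bool:
    return any(bin(phash ^ k).count("1") <= MAX_DISTANCE for k in known_phashes)
```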

    • Someone here mentioned how his mother liked to show his baby pictures which included him standing naked next to the bathtub. It's interesting because he finds it embarrassing (pardon the pun), his mom thinks it's cute, most everyone else doesn't care at all but a pedophile might find it arousing. So, the perception is truly in the eye of the beholder.

      The problem is that the pedophile doesn't actually need porn to find kiddies arousing. We had a case here in which a guy was tried and convicted, and all he had on his computer was non-porn images of kids. He obviously had a problem; I think his original issue was trying to make a date with an underage girl in a chat room who was, of course, a cop. But the images were used as part of the evidence to convict him.

    • So, the perception is truly in the eye of the beholder.

      Acrotomophilia - sexual attraction to amputees.
      Coulrophilia - sexual attraction to clowns.
      Chremastistophilia - sexual arousal from being robbed.
      Dendrophilia - sexual attraction to trees.
      Hoplophilia - sexual attraction to guns (see also Texan).
      Mucophilia - sexual attraction to mucus.
      Toxophilia - sexual attraction to the sport of archery.

      If you can think it up, someone has wanked over it.

    • You're an idiot and NotEmmanuelGoldstein is apparently correct.

      The word "known" is used at least twice, meaning if I'm 10 and post a photo of my junk it's new, not "known".

      The national center for missing and exploited children maintains a set of hash values for known abuse images, and I would assume this is some sort of TinEye-like similarity algorithm which can identify cropped, color manipulated, or otherwise altered photos which have already been determined to be illegal.

      That can't happen, to answer your

      • Except your computer gets hacked and the pic you took gets passed around and ends up in the exploited children database. Then you get busted.

    • This all depends - the Apple system will check your photos (on your phone) against a database of known image signatures. Thus, merely having a bare-backside picture on your phone means nothing at all.

      However, if, as you describe, the picture is (say) leaked to a paedo user group, they might share it around. One of them gets caught, and all their images, including this one, end up in the signatures database.

      One would hope that the human reviewers would be able to tell the difference between an embar

    • by mjwx ( 966435 )

      Someone here mentioned how his mother liked to show his baby pictures which included him standing naked next to the bathtub. It's interesting because he finds it embarrassing (pardon the pun), his mom thinks it's cute, most everyone else doesn't care at all but a pedophile might find it arousing. So, the perception is truly in the eye of the beholder. Now Apple/others will stick their opinion into the mix. I hope a lot of mothers don't get inconvenienced/hurt by their decision-making AI. That can't happen, right?

      I don't think they'll be targeting infants (although they may, Apple have a history of getting things horribly wrong) rather images of prepubescent or adolescent children.

      So the School photographer, the parent/aunt/grandparent who may have pictures of an 11 yr old girl in a swimming cossie on sports day... because as you've intimated, false positives never happen and if you've been branded a paedo in the US (where Apple is located) that pretty much strips you of any argument beyond the courtroom (and eve

    • This exact issue came up immediately when I was talking about these features with my wife. We try to be careful with the photos we take, but there have been some candid photos of naked babies over the years. We haven't shared any of those with others, but even if we had Apple would not have flagged any of those photos for reporting, so far as I can tell.

      Just to walk through the details since there are several related features that overlap in terms of how they operate, let's say I take a photo of a naked tod

  • Any halfway intelligent person who knowingly has illegal content will know better than to store it on someone else's server. ...so what is this actually doing?

    • Scanning local files.
    • by nightflameauto ( 6607976 ) on Thursday August 05, 2021 @04:01PM (#61660617)
      • Virtue Signaling (THINK OF THE CHILDREN!)
      • Utilize the above to justify doing something most users wouldn't like if it were common practice.
      • Feature creep, because once it's there it WILL be "enhanced" to include document scans for illegal activities.
      • Make it seem common practice.
      • Welcome to the future. Please do not wrong-think, or you will be reported.

      Seriously, the entire goal here is to open the door just a tiny little crack into privacy invasion in the name of doing some vague good, then slowly creep the door further open with each iteration after until it's wide open and we're all being watched 24/7 by the corporate overlords, with direct reporting to authorities when we do something awful like let a kid run naked through the house after a bath or contemplate taking a substance currently considered illegal.

      This may not be the definition of the slippery slope, but it's a damn fine model of it.

    • After reading a few articles, I think they are trying to identify instances of grooming. An older groomer with a collection of known bad images will send them via Messenger to a child they are grooming and ask for similar pictures in return. The child, who is more likely to have automatic iCloud backups enabled, will receive known bad images (which then get backed up into iCloud where they are scanned) and produces their own images to return (which are again backed up into iCloud and scanned).

      Apple to sc [aljazeera.com]

  • What's next, is Apple's AI gonna rat out everyone with the Scorpions' Virgin Killer album [wikipedia.org] on their iPhone?
  • Surely there will be some permission setting somewhere to toggle this - or perhaps a notification to ask:

    Apple would like to scan your photos
    for naughtiness - everything is
    performed on-device and no data
    is sent to Apple or shared with third
    parties, except if we deem it to be
    unacceptable material.

    Allow - Deny

  • If Apple actually succeeds in implementing this, I predict that they'll end up with excellent blackmail material on some powerful people in industry, government and the Catholic church. That will certainly improve their chances of getting favorable legislation or deals. It's like they took a page from the Russians' Trump control manual.
  • Every time someone wants to do something shady, it's "for the good of the children". They will scan through all your stuff, and if you object you are automatically labeled "pro child abuse". Start by at least making it illegal to beat your kids... can't understand how that is still allowed in most of the USA...
  • For those who have not followed through, there are serious distinctions:

    1. The earlier article was about the iPhone being scanned for photos.
    2. This article is about iCloud being scanned for photos.
    3. The earlier article was assumed to be in error by many comments.
    4. This article is confirmed by Apple.

    It is true that the word 'Apple' appears in both articles.
    No other connection.

    • So basically, Apple says they will scan your iCloud photos. Apple did not confirm or deny if they will perform local scanning on your phone.
  • If you're using iCloud, you should already be assuming Apple is looking through absolutely everything you put there. This goes for every other cloud service as well, obviously. Apple might just be slightly more despicably evil than other companies, but the difference isn't big.

  • by davecotter ( 1297617 ) <.me. .at. .davecotter.com.> on Thursday August 05, 2021 @04:47PM (#61660829)

    i've created a change.org petition.

    if you care about this issue, you can sign it here: http://chng.it/4wFSfVgPgL [chng.it]

    -dave

  • Five years ago Apple had execs who understood, or at least listened to people who understood, that there was no such thing as "encryption breaking AND privacy". The CEO of Google ranting in public about Apple being able to charge a premium for privacy was gold; Apple was doing just that, and it made them money. Boohoo.

    Now the CEO of Apple is some fossilized brain specimen that thinks porn filters work, when any competent programmer could tell him they don't, and can't wrap his head around the idea that ma
  • Apple network admins always have the best porn!
  • Remember when taking photos of your baby on a bear skin rug or taking a bath was considered normal parental doting, not evidence of pedophilia?
    • That was before hysterical neo-Victorian moral panic gripped the US. I hope to live long enough to see it destroyed by events, ANY events.

  • From the TechCrunch article:

    "NeuralHash uses a cryptographic technique called private set intersection to detect a hash match without revealing what the image is or alerting the user." Why NOT warn the user. So if someone innocently downloads what turns out to be illegal material they are referred to law enforcement rather than warned on the spot?

    "Apple said that there is a one in one trillion chance of a false positive, but there is an appeals process in place in the event an account is mistakenly
  • Today, on It's The Mind, we examine the phenomenon of Deja Vu
    https://www.dailymotion.com/vi... [dailymotion.com]

  • Good luck with that. Once Apple's brilliant algorithm misidentifies your picture of a water lily as child abuse, you will be banned for life, and you can call an 800 number and speak to a robot if you don't like it. But, hey, look at the bright side -- you'll be able to speak to the robot in Spanish if you want to.

  • As usual, first it is "For the Children" but it won't be long before it is "For the Politicians".

    I have a friend who works for Apple who deals with this sort of stuff and he outright acknowledges that they have kiddie porn on their servers and so technically they are in possession and Apple as a company is in violation of the law. This, of course, probably holds for just about any company that has a repository of user supplied content.

  • Will they need to arrest all the employees who volunteered to work on the project?
