
In Internal Memo, Apple Addresses Concerns Around New Photo Scanning Features (9to5mac.com) 101

Sebastien Marineau-Mes, a software VP at Apple, talks about the company's upcoming controversial photo scanning features in an internal memo to employees: Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.

Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintains Apple's deep commitment to user privacy.

We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built. And while a lot of hard work lies ahead to deliver the features in the next few months. [...]



  • by Sebby ( 238625 ) on Friday August 06, 2021 @09:46AM (#61663541)

    Surely there will be some permission setting somewhere to toggle this - or perhaps a notification to ask:

    Apple would like to scan your photos
    for naughtiness - everything is
    performed on-device and no data
    is sent to Apple or shared with third
    parties, except if we deem it to be
    unacceptable* material.

    *we reserve the right to define/change
    "acceptable" at any time.

    Allow - Deny

    This feels like the start of a very slippery slope.

    • Dang it!

      I guess I'm in good company(?) [slashdot.org]

      • by saloomy ( 2817221 ) on Friday August 06, 2021 @10:13AM (#61663721)
        No one misunderstands what Apple is doing. They wrote software that runs on our devices and spies on us. Fucking asshats.
        • But but... Noooes! This is different: it's for the children!

          Next project on the agenda: scan your device for terrorism material. Because terrorism is horrible and anything goes against terrorism. Just like anything goes for the children.

          • by saloomy ( 2817221 ) on Friday August 06, 2021 @11:36AM (#61664065)
            You know what is worse than murder? Do you know what is worse than raping children? Do you know what is worse than terrorism?

            Subverting freedoms of others. Coercion or authoritarianism. Hitler, the worst person in human history didn't rape children. He didn't personally kill. He stole freedom from others. The most free country at the time marshaled an unstoppable force to destroy his government. His government spied, threatened, robbed, and genocided. They robbed freedom from millions.

            I am sorry, but someone has to say it. Raped children, dead innocents, sometimes the bad guy gets away… it is all worth it, because freedom is worth everything. Many people have what Lincoln called the last full measure of devotion that you might have freedom. Freedom is everything. Freedom to be secure in your person and effects is paramount. What Apple is doing is wrong. Even if it saves a child from rape or murder. It is not worth giving up freedom.
            Many people gave* what Lincoln called the last full measure of devotion…. Typo
            You know what is worse than murder? Do you know what is worse than raping children? Do you know what is worse than terrorism?

              Murdering child-raping terrorists? That sounds pretty bad.

              But not to worry: at least we can take comfort in knowing that murdering child-raping terrorist iPhone users will be caught and dealt with swiftly by the Apple political correctness police.

            You know what is worse than murder? Do you know what is worse than raping children? Do you know what is worse than terrorism?

              Subverting freedoms of others. Coercion or authoritarianism.

              All of those things are coercion or authoritarianism.

            • Will that be your defence in court when Apple scans your iPhone?
                Stupid argument. Not having something to hide doesn't negate the need for privacy. I want to be in control of my data, and I want to control nobody else's data. I do not want to guess if one of my pictures is sent to a reviewer because some AI that I have not audited decided to silently upload my photo to some reviewer. This sort of technology will be expanded and we will never know the next time something like Watergate happens because the govt goons will have caught Deep Throat. Privacy has a useful place i
                • Not an argument. I was just picturing you going full-on Mel Gibson, child-rapey, Godwin's law in the name of freedumb in a court of law. That'd be something I'd pay to see.
    • Allow - Deny

      Nah - they'll probably use a dark pattern for this, because "children", or "courage"; so something more like:

      Allow - OK

    • this gives every thinking person (I know, small subset) the firm knowledge that apple does not care one whit about actual privacy; they want headlines and brownie points (uhm, should I have chosen a different term? hmmmm.)

      invasive spying is invasive spying, period.

      fuck apple. there has never been a valid reason to believe anything they say, that you can't directly verify yourself. we don't have access to chips or code, so really we can't trust a damned thing they say.

      like politicians, when someone says som

      • Totally agree (Score:5, Insightful)

        by SuperKendall ( 25149 ) on Friday August 06, 2021 @11:40AM (#61664087)

        this gives every thinking person (I know, small subset) the firm knowledge that apple does not care one whit about actual privacy

        I totally agree.

        In the past I've defended Apple's record on privacy because they have done a lot of things that truly help user privacy.

        On this news of Apple scanning photos for illicit content, I was waiting for more evidence Apple itself was actually considering it - and here it is.

        Without doubt Apple can no longer be considered to be truly supportive of user privacy.

        What a shame, because where else is left to turn?

        At the very least I will no longer use Apple's camera app if at all possible, or store photos where iOS can find them.

        Not because I have anything to hide, but because they have no right to look.

          At the very least I will no longer use Apple's camera app if at all possible, or store photos where iOS can find them.

          All you have to do to avoid scanning is to turn off iCloud Photos.

          And it doesn't scan for photos that might look like child porn. It compares photos to a database of known child porn images.

          These articles explain the system in more detail:

          https://www.macrumors.com/2021... [macrumors.com]

          https://www.macrumors.com/2021... [macrumors.com]

          Do I like it? No. But it does look like Apple has still tried to strike a privacy balance in several ways…

          • While I agree that seems to be the way the thing works -- scanning iCloud content against known images, who the hell takes existing CP from the net and routes it to iCloud? It doesn't make a lot of sense.
            • While I agree that seems to be the way the thing works -- scanning iCloud content against known images, who the hell takes existing CP from the net and routes it to iCloud? It doesn't make a lot of sense.

              One of the "Sharing" options if you Click on an image is to "Save to Photos".

              At that point, if iCloud Photos is Enabled, it is automagically copied up to the User's iCloud Storage. IIRC, At that point (prior to Encryption) is where this (on-device) "comparison" takes place.

      • by gweihir ( 88907 )

        Indeed. My take is this is probably a deal with some law&order, aehm, "people", to avoid something even more intrusive, but once the capability is there it is very easy to use for other stuff. That is essentially assured to happen. The "children" argument is bogus.

    • by Anonymous Coward on Friday August 06, 2021 @10:31AM (#61663811)

      Slippery slope? This is a flume ride with a frictional coefficient approaching zero and descent angle approaching 90 degrees.

    • Funny you think Apple would actually give you a choice in how their hardware/software work. Apple is all about user choice.
  • Seriously. Loading up a bunch of iPhones with a random sampling of images, a few of which are actual kiddie porn, and testing to see if this catches at least X% of the kiddie porn sounds like a dangerous idea.

    • by calih71282 ( 8485049 ) on Friday August 06, 2021 @10:13AM (#61663719)
        There is a government-provided dataset of child pornography images, seized in actual investigations and convictions, that is used to train the system.
    • by Samantha Wright ( 1324923 ) on Friday August 06, 2021 @10:13AM (#61663723) Homepage Journal
      RTFA. The phone downloads a database of hashes for known images and checks your messages, albums, etc. against them. If the database is never updated, it'll only detect consumers, not producers; if it is frequently updated, then it will only detect producers whose work is distributed widely enough to end up in the database. Testing its efficacy would be more about seeing how many filters it takes to defeat the hashing algorithm, since it has zero chance of catching new material.
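
      To make that matching step concrete, here is a minimal sketch of what an on-device lookup against a downloaded hash database could look like. The hash function, database format, and photo container are illustrative assumptions on my part, not Apple's actual NeuralHash pipeline:

      import hashlib

      def image_hash(image_bytes: bytes) -> str:
          # Stand-in for a perceptual hash; a real system would use a hash
          # designed to survive recompression, resizing, and similar edits.
          return hashlib.sha256(image_bytes).hexdigest()

      def find_matches(photos: dict[str, bytes], known_hashes: set[str]) -> list[str]:
          # Return the names of photos whose hash appears in the known-image database.
          return [name for name, data in photos.items()
                  if image_hash(data) in known_hashes]
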
      • Resize, crop (Score:5, Insightful)

        by WoodstockJeff ( 568111 ) on Friday August 06, 2021 @10:32AM (#61663821) Homepage

        Not to mention that such hashes could be easily defeated by resizing or changing the cropping of pictures, changing their hash. Depending on how the source material is hashed, just adding a border would probably prevent a match.

        This will, of course, mean that older phones will be slower, and need to be upgraded.

        • They use a neural net to handle all sorts of obfuscations. So it's a neural hashing technique. It would possibly identify new CP.
          • by nagora ( 177841 ) on Friday August 06, 2021 @11:43AM (#61664115)

            They use a neural net to handle all sorts of obfuscations. So it's a neural hashing technique. It would possibly identify new CP.

            Or, you know, be wrong.

            Have fun explaining how a neural net works to your friends and neighbours after you're arrested for owning child porn that was actually a picture of the Pink Panther's car or something.

            • You joke, but my partner actually got suspended from Facebook for a day, for uploading a picture of a new (at the time) Samsung phone. These neural detection algorithms are not infallible, and you never know what will trigger a false positive.

              • by nagora ( 177841 )

                You joke, but my partner actually got suspended from Facebook for a day, for uploading a picture of a new (at the time) Samsung phone. These neural detection algorithms are not infallible, and you never know what will trigger a false positive.

                Well, I wasn't joking. Mistakes like this can happen even with humans monitoring or just looking at photos being printed in a shop. Once it's mentioned on the local news, your life is fucked no matter what happens at appeal.

          • by DrYak ( 748999 ) on Friday August 06, 2021 @11:50AM (#61664151) Homepage

            So it's a neural hashing technique. It would possibly identify new CP.

            Then there are two risks:

            - Generating bonkers false positives.
            Think of Facebook, whose adult filter wrongly recognized an elbow as boobs [cnet.com].
            It might accidentally flag the wrong things as naked children (think of the Manneken Pis in Brussels, Belgium [wikipedia.org] - well, actually DO NOT think about him; depending on where you live, you might be accused of wrongthink paedophilia [telegraph.co.uk]).
            Which brings me to the next risk:

            - Technically not entirely incorrect but judicially wrong.

            Even if Apple's neural net is perfectly good at recognizing children's skin and only children, there's another matter:
            the legality of those pictures varies by jurisdiction.

            It might be that any picture of a naked kid, no matter the circumstances, is considered illegal in some more puritanical regions of the globe (can someone from the US confirm?).
            In a lot of European jurisdictions, nudity per se isn't illegal. Context and intent do matter. Spencer Elden's picture on Nirvana's album cover definitely shows him naked, and definitely displays his penis, but it would never be considered obscene in Europe.
            As another example, in Japan, only actual real human beings are considered. So if adult, consenting mangakas are drawing lolicon and consenting adult consumers are reading it, no actual real-world child was harmed, no matter what the imaginary stories in the pictures depict.

            This above situation could lead to false positives, leading to false accusations, leading potentially to people getting kicked out of their backup storage, despite never having broken any actual CP law where they live and no children having been actually harmed.

            • I have to say that if no one told me that those were her elbows, I would have only seen boobs as well. It took me a bit to sort out the picture, and I'm not an AI.

          • They use a neural net to detect nudity being sent via Messages to children's devices and then blur it out behind a warning message, and even then, only if the parent of the child opted-in to the use of that feature. Apple doesn't get a report when that happens, nor does law enforcement. I'm not seeing any mention of them relying on neural networks for any of the flagging that leads to them getting a report. All of those systems seem to be based on matching against the hashes of known child porn.

          • by gweihir ( 88907 )

            No. It can only identify pictures it knows.

            • by gweihir ( 88907 )

              Well, apparently things are fuzzier than that. Hash lists are pretty much restricted to known pictures, and some tolerate some modifications. It is unclear what this thing by Apple really is. Apple apparently claims it can identify "nude pictures" it has not seen before. That would make it far, far more dangerous, because it may well misidentify things and it carries a lot more active functionality. For example, it is very easy to quietly slip other types of pictures in there to find and really hard to find

        • by _xeno_ ( 155264 )

          Not to mention that such hashes could be easily defeated by resizing or changing the cropping of pictures, changing their hash.

          Not how it's hashed. It's hashed using what Apple's calling "NeuralHash" [apple.com] (which is a name several other projects already use, because of course it is, because Apple always manages to reuse names). Their example shows that changing an image from full color to black and white doesn't change the hash. Resizing also shouldn't change the hash for the most part, until you start making it small enough to remove details.

          Cropping might change the hash if it's sufficiently cropped.

          Depending on how the source material is hashed, just adding a border would probably prevent a match.

          Probably depends on the size of the
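
          As a rough illustration of why a perceptual hash tolerates edits like grayscale conversion or resizing, here is a sketch using the open-source imagehash library as a stand-in for NeuralHash. The filename is made up, and this is not Apple's algorithm, just the general idea of perceptual hashing:

          from PIL import Image
          import imagehash  # pip install pillow imagehash

          original = Image.open("photo.jpg")     # hypothetical example file
          grayscale = original.convert("L")       # drop the colour information
          resized = original.resize((original.width // 2, original.height // 2))

          h = imagehash.phash(original)
          # Subtracting two hashes gives the Hamming distance between them;
          # small distances mean "perceptually the same image".
          print(h - imagehash.phash(grayscale))   # typically near 0
          print(h - imagehash.phash(resized))     # typically near 0
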

          • But basically it's looking for "features" in the photo and the hash is based on that.

            Wrong!

            It is looking for entire image matches to a high degree of certainty.

            And it doesn't set a Flag until you cross a threshold of "hits", and then, it doesn't disable your AppleID until you cross a threshold of Flags.

            So, it seems quite conservative in its design, actually.
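
            A toy sketch of that two-stage logic; the threshold values and the account actions are assumptions for illustration, not Apple's published numbers:

            MATCH_THRESHOLD = 30   # assumed count of database matches before anything is flagged
            FLAG_THRESHOLD = 2     # assumed count of flags before the account is actioned

            def evaluate(match_count: int, existing_flags: int) -> str:
                # Individual matches do nothing on their own; only crossing
                # the first threshold raises a flag.
                if match_count < MATCH_THRESHOLD:
                    return "no action"
                flags = existing_flags + 1
                # Even then, nothing happens to the account until the
                # second threshold of flags is crossed.
                if flags < FLAG_THRESHOLD:
                    return "flagged for review"
                return "account disabled pending review"
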

      • I'm more interested to know what they do with false positives.
      • by Agripa ( 139780 )

        RTFA. The phone downloads a database of hashes for known images and checks your messages, albums, etc. against them. If the database is never updated, it'll only detect consumers, not producers; if it is frequently updated, then it will only detect producers whose work is distributed widely enough to end up in the database. Testing its efficacy would be more about seeing how many filters it takes to defeat the hashing algorithm, since it has zero chance of catching new material.

        So Apple says, but would you trust them not to lie now? At best they have arranged plausible deniability while facilitating government spying.

        • There's nothing to be gained from bothering to entertain such intermediate slivers, quanta, and shades of doubt and paranoia. Apple has famously been uncooperative with governments, even forcing the FBI to go to a third party to crack phones for them [adn.com]. Until you see an actual headline that confirms a material change in that stance, all you're doing is fearmongering.

          But let me show you how to do it properly: the spooks don't need to see inside your phone to spy on you. They control the networks [wikipedia.org]. Building back

          • by Agripa ( 139780 )

            There's nothing to be gained from bothering to entertain such intermediate slivers, quanta, and shades of doubt and paranoia. Apple has famously been uncooperative with governments, even forcing the FBI to go to a third party to crack phones for them [adn.com].

            And Google was uncooperative, until it was discovered that law enforcement had access to the unencrypted links between their data centers.

            The government *needs* Apple to appear to be uncooperative so that people will trust them with their data, putting it right where the government can get to it. Another example of this is the Electronic Communications Privacy Act of 1986, which appears to make unencrypted email private so people will trust it, while in practice it is not. The same thing can be said fo

      • by idji ( 984038 )
        And any rogue government just gives Apple a pile of documents that they "insist" need to be added to the CSAM hashes, if Apple wants to keep doing business in their country. Perhaps even photos of Winnie the Pooh, which highly offends the Chinese Dictator-for-Life.
        • You mean this [theguardian.com]? Or this [washingtonpost.com]? Nation states with a coherent anti-free-speech ethos do not need Apple's cooperation nor its API to extend their information sanitation policies into iPhones. Be more worried about these dumbasses [wired.com] doing it.
  • by cfalcon ( 779563 ) on Friday August 06, 2021 @09:47AM (#61663555)

    This only makes sense if they are adding end-to-end encryption to their cloud. Right now, they scan every file you upload to their cloud server side.
    Moving that scanning to client side isn't going to be lucrative for them, especially if, as they have stated, they will ONLY use it to scan images that go up to their cloud (if you don't trust them on that, well, that's reasonable, but I'm going to pretend that's not a concern).

    So what does Apple get out of this second scan? I think their plan is to add real encryption, with a key known only to the USER and not to Apple. This would make their servers impossible for them to scan, and they would have to rely on this client side scan to make that possible (the option with no scan is technically possible today and some services offer it, but Apple would probably get hammered by various governments if they switched that on).

    So that's my guess- either they intended to add full encryption to the cloud (and this idea came from that), or they still intend to do so (and this idea precedes that).

    It's still gonna have hash collision and false positives, but given that these costs are externalized to innocent customers, Apple is willing to pay them.

    • You can encrypt your data client side before sending it to the cloud. This second scan would scan that data before you encrypt it.
  • That justifies anything /s

    Apple needs to read the comments to the previous article. There's no point in repeating all the reasons this is an abhorrent idea.

  • by _xeno_ ( 155264 ) on Friday August 06, 2021 @09:52AM (#61663591) Homepage Journal

    Ever since Apple announced that they feel a pressing need to scan iPhone users phones for what apparently is now called "CSAM," I've been waiting for the news to drop of some law enforcement agency connecting iPhone users to a large ring sharing those images.

    That's the only reason I can think of why they'd be doing this. Because they know they're about to be hit with massive negative publicity for helping child abusers stay hidden from law enforcement, and are desperately trying to get out in front of it.

    • by leonbev ( 111395 )

      The depressing thing is that this new kiddie porn scanning technique is probably using a backdoor in iCloud's encryption that was actually developed for the Chinese government (or some other authoritarian regime) to track their citizens. I have a feeling that this is more of a cover story for why that backdoor exists to begin with.

      • by Cinder6 ( 894572 )

        Why would Apple need to use a backdoor? They own the software, and this scanning is done before it enters iCloud.

    • That's the only reason I can think of why they'd be doing this.

      It might have something to do with a new law saying they're liable for it if it's happening on their platform: https://en.wikipedia.org/wiki/... [wikipedia.org]

  • This feels like they're treating everybody as criminals, and will only consider you "innocent" once they've scanned your device. Just like the Border Patrol does.

    • by King_TJ ( 85913 )

      That's exactly what they're doing. Society no longer relies on parents to parent their kids and protect them from harm. We expect "big tech" to do it for us now, and with a sweeping, blanket method of forcing everybody using an iPhone to participate in constant verification that their photos comply with the law.

  • Did Apple recently acquire Georgia, Hawaii and Puerto Rico?
  • The developer who had to train the AI model on what to look for probably did not have a pleasant job. Hope that process was done with respect.
    • There is a government-provided dataset of child pornography images, seized in actual investigations and convictions, that is used to train the system. Other companies use the same dataset for training.
  • From the article, it seems like they've got a list of hashes of known child pornography. They're comparing that list with files stored on iCloud. If you're storing a lot of these images on iCloud, they notify the police. I'm OK with that. I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there. I'm a bit concerned about this, but if I were Apple I wouldn

    • As they say, "The road to hell is paved with good intentions."

    • Re: (Score:3, Insightful)

      by kiliheg491 ( 8480339 )
      Now repeat what you said, but replace "child pornography" with another phrase. Try it with the phrase "anti-government material".
    • > they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there

      The downside of mislabeling a dog photo is small; the downside of mislabeling an innocent photo as CP is great. These models produce both false positives and false negatives. There is a trade-off between the two, and where the threshold is set is up to Apple.
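
      A trivial sketch of that trade-off, with made-up confidence scores: the same scores classified at two different thresholds trade false negatives for false positives:

      def classify(scores, threshold):
          # Anything at or above the threshold is treated as a match.
          return [s >= threshold for s in scores]

      scores = [0.20, 0.55, 0.80, 0.95]   # hypothetical model confidences
      print(classify(scores, 0.5))   # lower threshold: more matches, more false positives
      print(classify(scores, 0.9))   # higher threshold: fewer false positives, more misses
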
    • by _xeno_ ( 155264 )

      From the article, it seems like they've got a list of hashes of known child pornography. They're comparing that list with files stored on iCloud. If you're storing a lot of these images on iCloud, they notify the police.

      For now, yes, it only triggers if it finds a match with known imagery. It's not attempting to classify the images or recognize new ones, it's only looking for existing flagged images.

      For now.

      I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there.

      And this is where the real threat is. If Apple is OK with scanning for matching images, just wait until the police demand they leverage their facial recognition technology for known criminals. Both sides of the aisle should be worried: why not use it to look for images that look like they were taken by 1/6 protestors? W

    • I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there.

      And it doesn't work very well. If you've got a decent size library of photos, there's sure to be more than a few false positives. I just searched for "candle" and among the photos that are, in fact, of candles - there's also a picture I took of my turntable playing a record.

  • Teens are going to switch to Samsung so they can keep sexting their boyfriends/girlfriends.
    • No they're not; as always, once Apple does a thing, so will everyone else.
      • by ghoul ( 157158 )
        I am actually surprised Apple did this before Android did it. It's about the only thing in the last 10 years that Apple has done before Android, and I have worked for Apple for the last 11 years. Apple's philosophy is never to be at the bleeding edge: let others do the bleeding, and once we see the issues, then release a more polished version. Apple has held back ready tech many times to let Android go first and do the bleeding.
          • Making their own hardware comes to mind... The touch phone in general comes to mind. If you deny that Apple is the market leader, then you're simply an Apple hater.
          • by ghoul ( 157158 )
            Last 10 years. Since Jobs died, Apple has been a very different company. It's not like they can't be the first to release something; they don't want to be. They would rather let Android go first and take all the bad press when a new feature doesn't work as expected, then come in the next year with a more polished version. It's called customer delight.
  • by dfghjk ( 711126 ) on Friday August 06, 2021 @10:24AM (#61663783)

    Keeping children safe is NOT an important mission for Apple. For a parent, yes. For law enforcement, yes. For Apple, no. It is, at most, a minor aspect of the features they provide.

    Worse, when an executive starts out with such a bold lie to justify what the company is doing, you know the rest is bullshit.

    • If they wanted to keep my children safe they would ensure their privacy rather than trawling through their stuff and reporting it to some authority.

  • Surprised? (Score:5, Insightful)

    by Fuzi719 ( 1107665 ) on Friday August 06, 2021 @10:33AM (#61663833)
    Apple already turns over data to the Chinese government. Their claims of "privacy" are nothing but a smokescreen and marketing.
    • This will be a seminal moment. This is how companies DIE.

      Mark my words, even when their sales go down the basement because users start leaving in droves they'll keep at it because "We must... we must do it for THE CHILDREN!!"

      Apple management will be completely powerless to stop the demise, and the company will disappear into oblivion, just like it was about to do in the late '90s before miracle worker Steve Jobs came along.
  • by dlleigh ( 313922 ) on Friday August 06, 2021 @10:50AM (#61663915)

    How soon until bad guys start surreptitiously sending offending images to the iPhones of people they don't like? Apple will do the rest of the work for them.

    Operating systems at least try to keep malware at bay, but images? Consumer devices are designed to suck those up from the web, social media, e-mail, texts, and what have you.

    And some tech savvy cretin with too little regard for humanity is going to code up a way to disguise the images so that the recipient never sees them, but the bit stream will match hashes in the government database.

    Good luck explaining all that to a jury.

  • I don't want your explanations, I want you to immediately STOP this idiocy or you can kiss your sweet green behind goodbye. Apple will see a huge exodus of users who don't like the company scanning their stuff.

    I was actually starting to buy into the whole "privacy" focused Apple PR, but this charade has made me change my mind. If Apple believes its buyers care more for children's virginity than their privacy they're sadly mistaken or just misinformed.
    • The problem is: where will you go? Android has its own share of privacy issues (for many a reason to buy Apple in the first place), and a lot of the specialized "privacy phones" have turned out to be either full of holes, or law enforcement honeypots.
    • Apple will see a huge exodus of users who don't like the company scanning their stuff.

      That is extremely unlikely.

  • I really wish users would stop treating technology like it's magic. I've said this before and I'll say it again. If you bought a nightstand from Walmart, the typical citizen would not consent to Walmart coming in every day to rifle through the contents of this nightstand to see if there was anything on their "objectionable" list. Now, if the nightstand magically transported the contents to a warehouse, that'd be understandable. But the company that you bought from or made the device going through your thing
  • by VeryFluffyBunny ( 5037285 ) on Friday August 06, 2021 @01:23PM (#61664457)
    ...grinning like a bald baby while his giant penis rocket thrusts into the sky?
  • What if your wife or husband is a little person. Will your pictures interacting with them get flagged and viewed by Apple certified peepers? Sorry Apple, you've jumped the shark this time. Your company is just as creepy as Zuckerberg's now.
    • What if your wife or husband is a little person. Will your pictures interacting with them get flagged and viewed by Apple certified peepers? Sorry Apple, you've jumped the shark this time. Your company is just as creepy as Zuckerberg's now.

      It doesn't work that way, and you should know that by now.

      Troll.

  • by PinkyGigglebrain ( 730753 ) on Friday August 06, 2021 @03:32PM (#61664947)

    Keeping children safe is such an important mission

    Do justifications like these for any action raise red flags for anyone else, or is it just me?

  • Shades of the 80's and "Think of the children!"... yeah we heard that from the politicians so much, they thought of the children back then all right and that is why we're so economically screwed now.
