Apple

Apple Warns Staff To Be Ready for Questions on Child-Porn Issue (bloomberg.com)

Apple has warned retail and online sales staff to be ready to field questions from consumers about the company's upcoming features for limiting the spread of child pornography. From a report: In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children. The tech giant also said it will address privacy concerns by having an independent auditor review the system.

Earlier this month, the company announced a trio of new features meant to fight child pornography: support in Siri for reporting child abuse and accessing resources related to fighting CSAM, or child sexual abuse material; a feature in Messages that will scan devices operated by children for incoming or outgoing explicit images; and a new feature for iCloud Photos that will analyze a user's library for explicit images of children.
Further reading: Apple's child protection features spark concern within its own ranks.

Comments Filter:
    • by Anubis IV ( 1279820 ) on Friday August 13, 2021 @05:03PM (#61689857)

      Google/Android next?

      Google already does it. Google made over 540,000 reports of child porn to law enforcement in 2020 alone [missingkids.org]. Same for Facebook (over 20 million reports). And plenty of others too.

      People seem to be unaware that Congress carved an exception out of Section 230 last year [wikipedia.org]. Whereas these companies are typically protected against liability for content their users share, the EARN IT Act puts companies on the hook for child porn if they fail to police their platforms for the stuff. Facebook, Google, and others saw the writing on the wall and added these sorts of protections to their cloud services a while ago.

      Apple is late to the game and is sticking to on-device checks against hash fingerprints of previously seized child porn, so they're easily the least invasive, despite the reporting that's been going on this week (much of which was incorrect, but given Apple's abysmal handling of the subject I don't blame reporters for their frequent misreporting of what was going on).
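
      To make "on-device checks against hash fingerprints" concrete, here is a minimal toy sketch of the matching step. All names are made up and this is not Apple's code; in particular, Apple's fingerprint (NeuralHash) is a perceptual hash rather than the plain file hash used here.

          import hashlib

          def fingerprint(image_bytes: bytes) -> str:
              # Toy fingerprint: SHA-256 of the raw bytes. Apple's NeuralHash is a
              # perceptual hash, so re-encoded or resized copies of a picture still match.
              return hashlib.sha256(image_bytes).hexdigest()

          # Toy stand-in for the on-device fingerprint database (the real one is
          # derived from NCMEC's list of known images).
          KNOWN_BAD_FINGERPRINTS = {fingerprint(b"toy stand-in for a known bad file")}

          def should_flag_for_review(image_bytes: bytes) -> bool:
              # On-device check, run only on photos being uploaded to iCloud Photos;
              # photos that match nothing in the database are never flagged.
              return fingerprint(image_bytes) in KNOWN_BAD_FINGERPRINTS

          print(should_flag_for_review(b"some vacation photo"))                # False
          print(should_flag_for_review(b"toy stand-in for a known bad file"))  # True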

      • The nightmare EARN IT Act hasn't been passed yet. Are you thinking of FOSTA/SESTA? The first line of the Wikipedia page you linked to says "The EARN IT Act of 2020 (Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020) is proposed legislation..."

        Hopefully the EARN IT Act never passes, because it would deal a devastating blow to internet privacy, which is already teetering on the edge of non-existence. Some versions of it would result in a ban on non-backdoored encryption.
    • Yep. Everybody copies Apple.

      Privacy? HA! Not in the near future. You have to be searched up and down because you may be a criminal, and you will continue to be nagged about website cookies and that "your privacy means so fucking much to us!"

      Orwellian doublethink and other such nightmares are coming to pass, and the entire world population is just something to be fucked around with to make rich, powerful, bad men even richer, more powerful, and worse.

      Roll over dog! Good boy!

  • day one, audit says everything is fine
    day two, the government censors your Winnie the Pooh pics

    • [Grainy B&W TV ad of a runner zooming through a dystopia of marching worker drones who sit to watch Big Brother lord over them. She carries a sledgehammer, but gets caught and tackled to the ground.]

      Voice over: "And you will see why 2022 will be exactly like 1984."

      Congratulations, Apple. You lived long enough to become the villain, as you defined it.

  • by Lost Race ( 681080 ) on Friday August 13, 2021 @02:22PM (#61689165)

    Apple's naughty image detector is not about protecting children, it's about protecting Apple. They just don't want that shit on their cloud servers, too much liability for them.

    • If that were the sole motivation, they could have just done it without saying anything at all. No, this was marketed as a "protect the children" feature, and Apple did not anticipate this backlash. Who could be against stopping child porn?
    • by tlhIngan ( 30335 ) <slashdot.worf@net> on Friday August 13, 2021 @03:23PM (#61689443)

      Apple's naughty image detector is not about protecting children, it's about protecting Apple. They just don't want that shit on their cloud servers, too much liability for them.

      Exactly. Though it's actually the law in the US that hosting providers can't carry that stuff, and Apple is one of the last providers to actually do this - every other provider around already has scanners using the same databases, and Apple was dragging their feet for years.

      The scanner runs locally on images that you upload to iCloud. If it detects a match, the details (but not the image) are sent to Apple, who can check the copy on iCloud to see if it's a false positive; if it isn't, the image is removed from their servers, but not from your device.

      But it doesn't matter if you disable the upload to iCloud - every other photo sharing site has to comply as well, so switching to, say, Dropbox or Google Photos or some other service will have the same result.
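
      Roughly, the flow described above has the shape sketched below. Every name, type, and parameter here is invented for illustration, not Apple's API; the one grounded detail is that Apple's published design requires some number of matches before any human review happens.

          from dataclasses import dataclass
          from typing import Callable, Optional

          @dataclass
          class SafetyVoucher:
              # The "details, but not the image" attached to a flagged upload.
              account_id: str
              matched_fingerprint: str

          def upload_photo(account_id: str, image_bytes: bytes,
                           known_fingerprints: set,
                           fingerprint: Callable[[bytes], str]) -> Optional[SafetyVoucher]:
              # On-device step: only photos actually being uploaded to iCloud get
              # fingerprinted and compared; a match attaches a voucher to the upload.
              fp = fingerprint(image_bytes)
              return SafetyVoucher(account_id, fp) if fp in known_fingerprints else None

          def review_account(vouchers: list, threshold: int) -> str:
              # Server-side step: human review only after an account accumulates enough
              # vouchers; a confirmed match removes the iCloud copy, not the device copy.
              if len(vouchers) < threshold:
                  return "no action"
              return "human review of the flagged iCloud copies"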

      • But it doesn't matter if you disable the upload to iCloud - every other photo sharing site has to comply as well, so switching to, say, Dropbox or Google Photos or some other service will have the same result.

        To your point, Dropbox made nearly 21,000 reports to law enforcement last year, and Google made over 540,000 [missingkids.org]. Of the 21.4 million reports made in total last year, Facebook had the lead with 20.3 million. In contrast, Apple only made 265, which is low enough that it makes me think they only reported cases that were brought to their attention by others (e.g. pedophiles whose photos were discovered after they turned on public sharing of photos via the web).

        As you suggested, however, Apple is legally required t [wikipedia.org]

        • As far as I can tell, the EARN IT Act has not yet been signed into law, so they're not legally required.
    • All I can think of is this bit from Bill Burr https://youtu.be/z2phiQeq6M8?t... [youtu.be]
  • by psergiu ( 67614 ) on Friday August 13, 2021 @02:30PM (#61689193)

    Millions of pedophiles arrested in Europe, all Christian Orthodox churches shut down by police after Apple notified authorities of the horrifying child pornography sessions called "baptism" where infants as young as a few months were stripped naked in front of perverse adults taking pictures of their helpless bodies.

    • Will Apple also flag pictures of Jewish child torture sessions known as bris?

      I'm guessing religious nutters of all denominations will not be using iPhones to keep photographic memories of their weird rituals.

      • Will Apple also flag pictures of Jewish child torture sessions known as bris?

        If that happens, Apple's campus will probably get destroyed by that Jewish Space Laser I was hearing about.

        • If that happens, Apple's campus will probably get destroyed by that Jewish Space Laser I was hearing about.

          Destroyed? Doubtful. Probably just take a bit off the top and call it a day.

      • by xalqor ( 6762950 )

        You mean circumcision [mayoclinic.org]? Yeah it would be weird to photograph that. Also it's weird to take photos and videos during childbirth. So what are you going to use instead of your iPhone?

        • You compare barbaric circumcision with childbirth?! Sure, childbirth isn't pretty, but that's the point: it's a reminder, so you reconsider making another one. Without a recording you only remember the feel-good part of it.

          • by xalqor ( 6762950 )

            When I'm thinking about what to do, I'm less concerned about how difficult something is, and more about whether it's worth the effort and risks involved.

            There are only three ways to solve our overpopulation problem on Earth: 1) decrease reproduction rate, 2) increase death rate, and 3) increase emigration rate.

            Since we don't have any viable way to send people to live off planet, #3 is out for a while. Increasing death rate by any means is usually considered a bad thing by many people, whether it's criminal,

      • by ELCouz ( 1338259 )
        Thanks for reminding us how barbaric religion can be. FFS let the child decide later on.
        • The success and indeed proliferation of religion literally depends on the child not thinking.

      • "Will Apple also flag pictures of Jewish child torture sessions known as bris?"

        They actually take pictures of that?! That stuff is hard to look at. Not something a normal person would keep as a memory. Can only hope the poor kid doesn't find that content.

        "I'm guessing religious nutters of all denominations will not be using iPhones to keep photographic memories of their weird rituals"

        The iPhone is considered a status symbol, so I think many will risk it.

        I can only assume they already have pictures uploaded.

    • by _xeno_ ( 155264 )

      No, because that's not what that part of the system does. It only works for detecting specific "bad images."

      Part of the problem is that two features that are very similar but work very differently "leaked" at the same time. (Hadn't heard about the Siri feature before but that's not really a privacy issue - it's just a new set of keywords that triggers a new response.)

      Basically, there are two different things Apple is doing that invade privacy in two very different ways:

      The first is an ML model that can dete

      • What's the system that they 'trained on 250,000 CSAM images from the NCMEC database'? Training a machine learning algorithm isn't matching known images, and you wouldn't train a general porn identifier on CSAM. Either they somehow screwed up their own press releases or something fishy is going on with what is doing what.
        Also since the program runs locally and scans on device before upload, Apple would have to comply with government requests to scan other folders besides iCloud uploads.
        • Forgot to mention, the known images DB is far larger than 250k images, so it couldn't be that type of error either.
        • by _xeno_ ( 155264 )

          What's the system that they 'trained on 250,000 CSAM images from the NCMEC database'?

          You're going to have to find the quote on that because I have no idea where you got that.

          The NeuralHash thing they do based on the NCMEC database is trained, as far as I can tell from their tech documents [apple.com], on photos in general. It's basically designed to be able to tell if two pictures are the same picture and that's it.

          They then run that database against the trained ML model to generate the hashes. As best I can tell from their summary, the hashes are literally hashes of the neural network after running an
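
          To illustrate the shape of that (per the technical summary, it is roughly an embedding network followed by locality-sensitive hashing): near-duplicate images land on the same bit string, so matching against the known-image database becomes a plain table lookup. Everything in the toy below - the "network", the sizes, the hyperplanes - is made up; it is not NeuralHash.

              import numpy as np

              rng = np.random.default_rng(0)
              HYPERPLANES = rng.normal(size=(96, 128))  # fixed random hyperplanes (toy LSH step)

              def embed(image: np.ndarray) -> np.ndarray:
                  # Stand-in for the neural network: in the real system a CNN maps the
                  # image to a float descriptor that barely changes under resizing,
                  # cropping, or re-encoding.
                  return image.astype(np.float64).reshape(-1)[:128]

              def perceptual_hash(image: np.ndarray) -> str:
                  # Project the descriptor onto fixed hyperplanes and keep only the
                  # signs, giving a short bit string; similar descriptors give
                  # identical bits.
                  bits = HYPERPLANES @ embed(image) > 0
                  return "".join("1" if b else "0" for b in bits)

              # Two near-identical images hash the same, so the database check is just
              # an equality test on bit strings.
              img = rng.integers(0, 256, size=(16, 8)).astype(np.float64)
              print(perceptual_hash(img) == perceptual_hash(img + 0.01))  # True (almost always)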

    • Oh, just wait until they start analyzing everyone's archived pictures from Burning Man. There is no shortage of boob, butt, vag, and penis there, even if only in the background of the subject. (I have two albums for every Burn I attended... a main album and a work-safe one. A cursory glance and guesstimate would indicate that between 1/5 and 1/3 are non-work-safe. And I *don't* go around looking for the naked people.) Naked people don't usually have any place to carry ID. Pretty much no one at Burni

      • Reminds me of something the county sheriff (similar to police chief) did where I used to live... two teenage girls were standing on the side of the highway, flashing their breasts at everyone driving past. The sheriff, driving in the opposite direction, saw someone snapping a photo while driving by, recognized the girls and knew they were only 16, so spun around, flipped on his lights, chased down the driver that took the photo, and charged him with possession of child pornography. The charge was dropped, b
      • That's not how the software works. It doesn't identify genitals that appear to be those of underage kids. The software uses indexing (or "fingerprinting") technology that identifies known images of child sexual abuse. If you're going to throw out hypotheticals, it's probably a good idea to read up on what it is that you're talking about and how it works.
  • The Applephiles will whine a while, then be first in line to buy the next iWhatever.

    Count on it.

  • The classic Coppertone ad might well be in there - one I won't link to, for the sake of those using Apple devices. lol :)
  • ... a new feature for iCloud Photos that will analyze a user's library for explicit images of children.

    This includes Apple employees too -- right?

    (I'd hate to be the Apple employee that gets caught with CP in their iCloud or on their iPhone. Talk about irony.)

  • by Otis B. Dilroy III ( 2110816 ) on Friday August 13, 2021 @04:32PM (#61689735)
    Federal government pressures Apple
    Apple says hey, we already have the technology in place to spot child porn
    We can easily modify it to locate any image that you (the Feds) want to find

    Federal government pressures Amazon
    Amazon says hey, we have a nationwide network of video spy devices already in place
    Amazon wins NSA contract.

    Federal government pressures Facebook
    Facebook says hey, we can identify every anti-vaxxer militant racist on the planet.
    We also have algorithms and data sets that let us predict who might become an anti-vaxxer militant racist in the future.

    Federal government pressures ...
  • Does this infect you only if you are an iCloud user?
    • Supposedly that is the case, but who can really believe that they won't also scan on the phone if you don't use iCloud?
  • Can you put the kiddie porn automatically on Amazon Photos instead?

    Asking for a friend.

  • a feature in Messages that will scan devices operated by children

    Um, anybody know how they are going to tell an adult-operated device from a child-operated device?

  • I think there is an angle most comments are missing: how easy it becomes to destroy someone's life. Consider two scenarios.

    Scenario 1: 1. You have access to someone's phone for a few seconds; 2. While it is locked, take pictures of child porn with it (for instance, of pictures stored on your own phone); 3. Put the victim's phone back. When they unlock it, the pictures you took will be copied to the gallery and uploaded to iCloud without the victim's knowledge.

    Scenario 2: 1. You know (or assume) your victim has auto-downloading of pictures enabled in some IM app; 2. Send incriminating pictures at a time the victim is unlikely to be using the phone. The photos get copied to the gallery; 3. Delete those photos in the chat for both you and the victim. The victim will only see some erased messages in your chat. By the time they notice the pictures in their gallery, it will be too late, and it will be up to them to prove they are not guilty.

    I can come up with other scenarios, and people better versed in security will too.
  • Apple must have trained their models on a database of child porn, and their developers must have used child porn when constructing those models (I can't imagine a more repulsive job unless it's working on snuff videos), so is Apple in illegal possession of child porn?
    • Apple must have trained their models on a database of child porn, and their developers must have used child porn when constructing those models (I can't imagine a more repulsive job unless it's working on snuff videos), so is Apple in illegal possession of child porn?

      Apple didn't create the database of hashes of child sex abuse images [theverge.com]. The National Center for Missing and Exploited Children did. Apple just compares hashes of images uploaded to their servers to hashes from known images of child sex abuse. So no - Apple is not and was never in illegal possession of illegal images, at least not for the purposes of creating this database.

  • They are essentially comparing the images against a list of SHA256 hashes of known kiddie porn on the device, so that's just a few milliseconds of hash creation and a table lookup before sending the image to iCloud. When it is submitted to iCloud and flagged as kiddie porn, a human will verify the image and then whatever takes place takes place. The part I'm wondering about is how the Apple human is looking at the flagged photos. I was under the impression that all of my iCloud files are encrypted and even Apple can't
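
    The "few milliseconds" part is easy to sanity-check with a toy benchmark like the one below, using plain SHA-256 as a stand-in. Apple's NeuralHash runs a small neural network, so the hashing step costs more than this, but the lookup itself is still a constant-time set membership test.

        import hashlib
        import time

        # Toy database of 200,000 fake "known" hashes and a ~3 MB stand-in photo.
        known_hashes = {hashlib.sha256(str(i).encode()).hexdigest() for i in range(200_000)}
        photo = bytes(3_000_000)

        start = time.perf_counter()
        digest = hashlib.sha256(photo).hexdigest()
        flagged = digest in known_hashes
        elapsed_ms = (time.perf_counter() - start) * 1000

        print(f"hash + lookup: {elapsed_ms:.2f} ms, flagged={flagged}")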
