Apple

We Built a CSAM System Like Apple's - the Tech Is Dangerous (washingtonpost.com) 186

An anonymous reader writes: Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple's own employees have been expressing alarm. The company insists reservations about the system are rooted in "misunderstandings." We disagree.

We wrote the only peer-reviewed publication on how to build a system like Apple's -- and we concluded the technology was dangerous. We're not concerned because we misunderstand how Apple's system works. The problem is, we understand exactly how it works.

Our research project began two years ago, as an experimental system to identify CSAM in end-to-end-encrypted online services. As security researchers, we know the value of end-to-end encryption, which protects data from third-party access. But we're also horrified that CSAM is proliferating on encrypted platforms. And we worry online services are reluctant to use encryption without additional tools to combat CSAM.

We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

But we encountered a glaring problem.

Our system could be easily repurposed for surveillance and censorship. The design wasn't restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.
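
A minimal sketch of that point, with hypothetical names and values (this is not the Princeton prototype or Apple's protocol). Real designs use private-set-intersection-style cryptography so the client cannot read the database and the service learns nothing about non-matching content; that machinery is omitted here, which keeps the database-swapping concern easy to see.

    import hashlib
    import hmac

    SERVER_KEY = b"service-side secret"  # assumed to be held only by the online service

    def blind(digest: bytes) -> bytes:
        # Keyed transform of a content digest; clients never see these values,
        # so they cannot read or probe the database directly.
        return hmac.new(SERVER_KEY, digest, hashlib.sha256).digest()

    # Service-side database of blinded digests of known content. Nothing here is
    # specific to CSAM: swap in a different set of digests and the same code
    # flags a different category of content.
    known_content_db = {blind(hashlib.sha256(b"known-bad-example").digest())}

    def client_digest(content: bytes) -> bytes:
        # Client computes a digest of the content it is about to share.
        return hashlib.sha256(content).digest()

    def service_check(digest: bytes) -> bool:
        # Service is alerted only on a match. In this toy version the service
        # sees every digest; the real protocols exist precisely to prevent that.
        return blind(digest) in known_content_db

    print(service_check(client_digest(b"known-bad-example")))  # True: alert
    print(service_check(client_digest(b"vacation photo")))     # False
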
About the authors of this report: Jonathan Mayer is an assistant professor of computer science and public affairs at Princeton University. He previously served as technology counsel to then-Sen. Kamala D. Harris and as chief technologist of the Federal Communications Commission Enforcement Bureau. Anunay Kulshrestha is a graduate researcher at the Princeton University Center for Information Technology Policy and a PhD candidate in the department of computer science.
This discussion has been archived. No new comments can be posted.


Comments:
  • by cygnusvis ( 6168614 ) on Friday August 20, 2021 @11:51AM (#61711953)
    Surveillance and censorship are its only purpose, even if for now it's only censoring illegal content and surveilling criminals
    • Re: (Score:2, Insightful)

      by Trump One ( 8427569 )

      even if for now it's only censoring illegal content and surveilling criminals

      I don't think they've been limiting themselves at all. Seems to me big tech has been overtly censoring perfectly legal content and normal law abiding citizens, just because they have a different political view.

      I encourage everyone who has been subjected to censorship at the hands of big tech to join President Trump's lawsuit [takeonbigtech.com]. The tyranny must be stopped!

    • "illegal content and surveilling criminals" but there is no way to know unless you check and record everything, on everybody.
  • by sinij ( 911942 ) on Friday August 20, 2021 @11:52AM (#61711963)
    Surveillance and censorship was always the purpose; anything else is the excuse. Child exploitation, a heinous crime, is successfully detected and prosecuted without this technology. This is why you hear in the news about pedophile rings getting busted. Even politically-connected billionaires eventually get busted; that is how seriously society takes this.
    • And since even people who are otherwise impervious to law enforcement get caught, it seems there isn't really any reason for these suspicious tools for that purpose.

      So what could it be?

    • Surveillance and censorship was always the purpose; anything else is the excuse. Child exploitation, a heinous crime, is successfully detected and prosecuted without this technology.

      With a lot of damage done between beginning and end. It's not just about catching, but catching early.

      • by DamnOregonian ( 963763 ) on Friday August 20, 2021 @12:38PM (#61712105)

        With a lot of damage done between beginning and end. It's not just about catching, but catching early.

        Definitely. The next step is weekly warrantless home inspections. After all, what's probable cause for the curtailing of civil liberties if not the potential for children to be hurt? And finally, when the technology is available- catch it before they even commit it- then we'll have saved them all!

        • After all, what's probable cause for the curtailing of civil liberties if not the potential for children to be hurt?

          Who will you target first? Those with children (since parents and very close relatives are overwhelmingly the perpetrators of CSA), or those without children?

          Or, and this is the one that the average politician will protest against most strongly, those who have children who they haven't publicly acknowledged?

          • Who will you target first? Those with children (since parents and very close relatives are overwhelmingly the perpetrators of CSA), or those without children?

            Whoever pisses me off the most politically, of course.

        • by dfghjk ( 711126 )

          Just need to get the hashes for images you have just taken and enter them into the database just before checking the database for those hashes!

      • by dfghjk ( 711126 )

        Is it though? After all, how long does it take for a known image to be entered in the database? Is this a database of 40 year old images or of data intended to catch crimes "early"? Is this system even capable of "catching early"?

      • With a lot of damage done between beginning and end. It's not just about catching, but catching early.

        So, now that the entire world knows Apple is scanning for CSAM images, who exactly do you think this is going to catch? Probably a couple of idiots who haven't gotten the iMessage. For that, we intrusively scan a few hundred million phones of otherwise innocent people, and, as a bonus, create a system that can literally scan for any other content as well?

        Anyone who owns an iDevice now knows that there is a thin technological + human line between them and a call to the FBI/country-specific authority. I me

    • by Junta ( 36770 )

      To be fair, detecting and prosecuting exploitation isn't either "we can't catch them" or "we catch 100% of them", so there is a question of "what percentage of incidents are caught?" We caught and prosecuted murderers in the 1800s, and yet somehow forensic advances were still warranted. So there is presumably some area for improving our success on this front.

      Another question is to what extent evolving norms may impact detection and prosecution. If unencrypted photo storage is a frequent element, and that get

    • There absolutely are people who want to snoop, for various reasons. That's absolutely true.

      It would be an error to say:
      People want to snoop, therefore the sun doesn't really rise in the morning.

      Two unrelated facts can be true. The fact that people want to snoop doesn't make every other statement false.

      Another true statement is that sexual abuse of children is a huge problem. The numbers are staggering. The stories are heart wrenching. It's very much NOT a solved problem. In fact, 1 in 5 girls and 1 in 20 boys report

      • Diffie and Hellman proved the cryptography community wrong in 1976. A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.

        So instead of utilizing 2 other blatantly unconstitutional methods, a third is OK?

        Granted, it's not unconstitutional for Apple to do this - the 4th amendment doesn't bind them.
        But that's the fucking point, and that's why it's so insidious.
        It's a circumvention of constitutional protections via the private sector.
        I'm sympathetic to the goal, and I believe Apple's intentions to be bona fide.

        But good faith doesn't make it right.

        • Yes, of course! Of course if somebody says "this shouldn't happen", that means EVERYTHING ELSE *should* happen.

          Just like if "the sun rises in the morning" is true, that makes every other statement false.

          You're brighter than that.

          • Yes, of course! Of course if somebody says "this shouldn't happen", that means EVERYTHING ELSE *should* happen.

            We're not talking about EVERYTHING ELSE. We're talking about the status quo.

            Just like if "the sun rises in the morning" is true, that makes every other statement false.

            Nope. No one claimed that.

            You're brighter than that.

            Sure am. I'll re-quote.

            Diffie and Hellman proved the cryptography community wrong in 1976. A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.

            If I'm to interpret this without the context of the discussion at large (whether it's OK for Apple to play CP police on hardware that I own) then this entire post is a meaningless bit of you waxing philosophical.
            Ergo, it's reasonable to come to the conclusion that you feel that the smartest people today have risen to the challenge, and this was their solution, and that it's OK. Becaus

            • Oh, I see.

              When I said:
              we shouldn't .. a challenge for today's smartest people is to find

              You read:
              We should ... back in 2009, Hany Farid found

              When I said we shouldn't, I mean we shouldn't.
              When I said today's challenge is to figure out, I didn't mean it was all figured out 15 years ago. By "shouldn't" I meant "shouldn't" and by "today's challenge is to figure out", I meant that today we have the challenge of trying to figure something out. I guess for whatever reason you thought I meant the exact opposite of

              • Ignoring your pompous attitude,
                You said:

                A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.

                So, in the context of everything else you wrote, and the fact that Apple is not the police, I took what you said to be tacit approval of the private sector playing the police on hardware we own.
                I.e., the 2 conditions you gave are technically false. There's no police officer in every house, or every computer, even with what Apple is doing.
                You meant to imply that Apple == the Police. Which is fine, being the distinction is only borne out in legalese, not in rational

        • If Apple caved to government threats, then they'd be government actors for the purposes of civil rights. The government has applied substantial pressure in public, I'm willing to bet there's a good chance the pressure applied in private was even greater, possibly to the point of crossing the line of coercion.
      • by hawguy ( 1600213 )

        Another true statement is that sexual abuse of children is a huge problem. The numbers are staggering. The stories are heart wrenching. It's very much NOT a solved problem. In fact, 1 in 5 girls and 1 in 20 boys reported they were victims of childhood sexual abuse. Of course we don't know the exact numbers, but we know it's rampant. We also know it's rampant online.

        The problem with this system is that it's not going to stop new sexual abuse, it's only matching known pictures, so it will catch the traders, but not the original abusers (who aren't going to upload their pictures to iCloud). But it's a very small fall down a slippery slope to have Apple scan *all* photos to look for sexual abuse (and why stop at children, adults are abused and photographed too) and send them for verification when they reach some threshold.

        So if you really care about stopping CSAM, then yo

        • I think you've identified (though exaggerated) a couple problems; now I wonder what idea you have - even fantastical, ridiculous ideas - for what would solve the problems.

          Public key encryption was "impossible", Elon Musk likes doing "impossible" things, and I love doing "impossible" things, so "impossible" ideas just might end up being great ideas.

          • by hawguy ( 1600213 )

            I think you've identified (though exaggerated) a couple problems; now I wonder what idea you have - even fantastical, ridiculous ideas - for what would solve the problems.

            Public key encryption was "impossible", Elon Musk likes doing "impossible" things, and I love doing "impossible" things, so "impossible" ideas just might end up being great ideas.

            Not all problems have reasonable solutions, even horrific problems. After all, we let millions of children die of starvation each year even though there is plenty of food produced in the world each year. Even in the USA, we let kids suffer through hunger and poor nutrition, and those problems are arguably much easier to solve than ending CSAM.

            • > Not all problems have reasonable solutions, even horrific problems. After all, we let millions of children die of starvation each year

              Let me see if I understand the parallel you're drawing here. I think I'm missing something. Perhaps you can help me understand the difference between the comparison you're making and this wording:

              I don't see an easy solution to provide food for starving children, so we shouldn't try. Similarly, stopping the rape of kids isn't easy, so we probably shouldn't bother to do any

              • by sinij ( 911942 )
                I think you are not wanting to understand. A measure that would put everyone that much closer to surveillance state is not a worthwhile trade-off to only marginally decrease child abuse.

                Consider these questions:
                1. How effective would a well-publicized method of scanning for child pornography be at catching or preventing ongoing child abuse?
                2. How much net harm would this cause if every single child in the US grows up with fewer freedoms?
                • I believe I understand what you're saying there. I agree with you.

                  You think scanning people's phones, at least in the ways we can currently conceive of, isn't worth the loss of privacy. We agree on that.

                  What I'm not clear on is what parallel you're drawing to starving children.

      Surveillance and censorship was always the purpose; anything else is the excuse. Child exploitation, a heinous crime, is successfully detected and prosecuted without this technology. This is why you hear in the news about pedophile rings getting busted. Even politically-connected billionaires eventually get busted; that is how seriously society takes this.

      Looks like Prince Andrew might get away with allegedly raping children. Apparently, he's currently hiding in his mother's castle up in Scotland.

  • Self incrimination. (Score:5, Interesting)

    by Fly Swatter ( 30498 ) on Friday August 20, 2021 @11:55AM (#61711971) Homepage
    With the scanning software running on your own property, I argue that it is the same as testifying against yourself. So...

    It tests the fifth amendment.

    It tests warrantless search.

    So many problems with what Apple is doing it isn't funny - it's serious.
    • It tests the fifth amendment.

      It tests warrantless search.

      No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.

      Now, if a government actor were to go to Apple and say "We believe Fly Swatter has child porn on their phone and we want you to scan it" without a warrant, that *might* be a problem depending on jurisdiction, level of involvement etc. As of now there is no indication that Apple is acting at the behest of any other entity.

      • No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.

        Whether or not it's at the behest of the government is irrelevant.
        It's arguably fruit of the poisonous tree. Except it won't be treated that way in this case, because it's for the kids.
        Apple can only be acting as a Government agent in this regard. They have no law enforcement power of their own. They're circumventing the fact that the Government is barred from being Big Brother, and acting in that capacity on its behalf.

      • by tragedy ( 27079 )

        No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.

        I could have sworn I saw one of their statements somewhere saying that they built this system in response to requests from law enforcement.

      • nor are they acting at the behest of government actors.

        Do we know that for sure? Because the government has on numerous occasions publicly accused Apple of supporting terrorists and pedophiles. I'd be very interested to know how much more pressure they exerted behind the scenes to get a company that's spent a great deal of effort marketing itself as protective of user privacy to completely torch that reputation.

    • The device might be your property (or your employer's, or your mom's, etc.) but the OS and the software are not. Besides, China, Syria, Israel, the NSA, and Apple themselves aren't interested in US Constitutional law. They're interested in catching dissidents, whistleblowers, leakers, etc. Do we really think Apple isn't going to drop hashes of their own leaked, internal documents into the database to find and punish those involved?
    • With the scanning software running on your own property, I argue that it is the same as testifying against yourself.

      You can't be compelled to testify against yourself. You can however do so voluntarily, which you do according to the ToS you voluntarily agreed to. This will be tested on many levels, but it won't be a fifth amendment issue.

  • by Alascom ( 95042 ) on Friday August 20, 2021 @12:07PM (#61712015)

    >Our system could be easily repurposed for surveillance and censorship.

    This is not a flaw - this is the intended goal. Slippery slopes really do exist - often intentionally.

    • >Our system could be easily repurposed for surveillance and censorship.

      This is not a flaw - this is the intended goal. Slippery slopes really do exist - often intentionally.

      Over-the-top cynicism is a good way to get modded up on slashdot. That doesn't make it accurate. Apply Hanlon's Razor liberally.

  • Mostly images are stored in the clear on the cloud provider and you have nothing but their word that they aren't messing with the content. I suppose the biggest question is whether matches drive things to the cloud storage that didn't otherwise go there, but broadly speaking handsets pretty much give up all their data to some cloud provider or another. Saying that they could surreptitiously apply a different database doesn't mean much when they could surreptitiously do whatever snooping they want since they

    • Given the legal status of these images, I'm surprised that services that store data in the clear don't have to at least check that they aren't storing something illegal. I get the impression that if my friend gave me a CD with some photos on it and then the cops busted my friend and then found I had that CD, I would be in a load of trouble. Maybe not too bad if I didn't know, just no more than them confiscating every computer I own. But probably going to jail if I happened to have taken a look to see wha
      • by zekica ( 1953180 )
        Not true at all. There is a requirement for you to report it if you find it, but there is no requirement for you to look for it.
      • Many of the big services do. Facebook, Google, Microsoft... they scan your content on their servers and report CSAM. I'm surprised Microsoft isn't including such a scan as 'telemetry' in Windows... it's probably going to eventually; it already keeps spontaneously syncing stuff with OneDrive, where it's scanned.
    • Mostly images are stored in the clear on the cloud provider and you have nothing but their word that they aren't messing with the content.

      Actually, you have their word that they are messing with the content. OneDrive, for example, scans uploaded photos and can suspend or terminate your Microsoft account if you store photos containing, I quote: "nudity, bestiality, pornography, offensive language, graphic violence, or criminal activity". Google Photos has similar terms. And so with the others.

      • by HiThere ( 15173 )

        Actually, that doesn't say that they scan your images, merely that if you store such content, you violate the terms of service. And since that list includes "offensive language" they can be pretty sure that you *are* violating their terms of service...they just have to pick the right sensitive flower to be offended. (I know someone who finds technical language to be offensive. A friend's wife finds precise qualification of meaning to be offensive. Etc.)

  • I'm glad this is pointed out clearly. It should, however, be easy for everybody to see: no audit of the software can check the integrity of the system if the database of hashed content cannot be audited (at first it will be child abuse pictures which are hashed and planted on iPhones, but who knows what else). The crux is that the very nature of the database makes it impossible to audit. One not only has to trust Apple, one has to trust the folks who maintain the dirty databases. And who wants to look at that?
  • Our system could be easily repurposed for surveillance and censorship

    This possibility has been brought up repeatedly since day one, so I'm not sure how this is news. Of course all you have to do is swap database 'kiddieporn' for database 'government doesn't like X'. I have a sneaking suspicion that Apple built this for the Chinese to use, and just hid it behind the 'think of the children!' mantra.

  • If you think the NSA can't surveil you, think again. We gave that battle up when we wet our pants over 9/11. The problem is false positives. Any of these accusations will destroy someone's life because prosecutors are so zealous and so willing to take any conviction they can get that they will cheerfully go after anyone, and even if you're innocent the risk and cost of engaging with our legal system is so high most people will be forced into a plea bargain.

    It's the same reason why drugs should be lega
  • Given the following:

    * Apple is a control freak, but not stupid
    * Apple cares enough about user privacy to buck the FBI
    * Apple follows the law, here and in places like China

    I have to conclude that Apple rolled this stuff out to try and forestall something worse being required by law.

    Yes, the system can be co-opted and used by bad actors like China for arbitrary data suppression.
    At least with this system, it will be obvious when they do that - there will be extra hashes outside the ones Apple gets from the US CSAM authorities.

  • Isn't hash matching pretty easy to work around? I.e. if you change a single bit in the image, it changes the hash, and it no longer matches?
    • Isn't hash matching pretty easy to work around? I.e. if you change a single bit in the image, it changes the hash, and it no longer matches?

      Sure, that's why CSAM detection doesn't use hashes, at least not hashes of the sort we typically use in computer science and computer security. The "hashes" are instead simplified, (probably) non-reversible descriptions of image contents, making them scale-independent, often rotation-independent (at least to a degree), and able to survive small changes in the source image. Matching a pair of such "hashes" is done with a distance metric, measuring how close the two values are, rather than exact matching.

      Yes, this m
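
      A toy illustration of that kind of matching (not Apple's actual NeuralHash): an "average hash" over an 8x8 grayscale thumbnail, compared by Hamming distance rather than exact equality, so a small edit to the image leaves the match intact. The thumbnails below are made-up values for illustration.

        def average_hash(pixels):
            # pixels: 8x8 grid of grayscale values (0-255); returns a 64-bit integer.
            flat = [p for row in pixels for p in row]
            mean = sum(flat) / len(flat)
            bits = 0
            for p in flat:
                bits = (bits << 1) | (1 if p >= mean else 0)
            return bits

        def hamming_distance(a, b):
            # Number of differing bits between two hashes.
            return bin(a ^ b).count("1")

        def matches(a, b, threshold=5):
            # Near-duplicates if the hashes differ in at most `threshold` bits.
            return hamming_distance(a, b) <= threshold

        original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
        tweaked = [row[:] for row in original]
        tweaked[0][0] += 3  # a one-pixel change would flip a cryptographic hash completely
        print(matches(average_hash(original), average_hash(tweaked)))  # True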

  • Our system could be easily repurposed for surveillance and censorship.

    no shit, sherlock. it is, after all, NOTHING BUT CENSORSHIP. that it is "good censorship" doesn't make it any less censorship and any less obvious, and obviously prone to abuse like any other form of censorship. that this wasn't bloody obvious from the get-go to someone who apparently gets the point and implications of e2e encryption and privacy rights is just amazing. so much common sense and naivety at the same time ...

  • Apple isn't going to be spending the money to implement this feature without a good reason. They clearly aren't doing it because their customers want a child porn scanner. To the best of my knowledge they aren't being forced to implement a child porn scanner because of government regulations.

    So why are they doing this? Is there some PR benefit from this? Is there some anti-child-porn group pushing them to do it? Someone (e.g. spy or secret police agency in some country) pushing Apple to do this using the "t

  • It should be blindingly obvious that something like this is just begging to be abused.

      I would've said "to the average citizen as well", but "Think of the children" is the power off command for your average citizen's brain.

  • Our system could be easily repurposed for surveillance and censorship. The design wasn't restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

    The crux is in the word 'could'.
