'Apple's Device Surveillance Plan Is a Threat To User Privacy -- And Press Freedom' (freedom.press)

The Freedom of the Press Foundation is calling Apple's plan to scan photos on user devices to detect known child sexual abuse material (CSAM) a "dangerous precedent" that "could be misused when Apple and its partners come under outside pressure from governments or other powerful actors." They join the EFF, whistleblower Edward Snowden, and many other privacy and human rights advocates in condemning the move. Advocacy Director Parker Higgins writes: Very broadly speaking, the privacy invasions come from situations where "false positives" are generated -- that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present. These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple's algorithm into erroneously matching an existing image. (Apple, for its part, has said that an accidental false positive -- where an innocent image is flagged as child abuse material for no reason -- is extremely unlikely, which is probably true.)

The false positive problem most directly touches on press freedom issues when considering that first category, with adversaries that can change the contents of the database that Apple devices are checking files against. An organization that could add leaked copies of its internal records, for example, could find devices that held that data -- including, potentially, whistleblowers and journalists who worked on a given story. This could also reveal the extent of a leak if it is not yet known. A government that could include images critical of its policies or officials could find dissidents who are exchanging those files.
[...]
Journalists, in particular, have increasingly relied on the strong privacy protections that Apple has provided even when other large tech companies have not. Apple famously refused to redesign its software to open the phone of an alleged terrorist -- not because they wanted to shield the content on a criminal's phone, but because they worried about the precedent it would set for other people who rely on Apple's technology for protection. How is this situation any different? No backdoor for law enforcement will be safe enough to keep bad actors from continuing to push it open just a little bit further. The privacy risks from this system are too extreme to tolerate. Apple may have had noble intentions with this announced system, but good intentions are not enough to save a plan that is rotten at its core.
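The excerpt turns on how matching images against a hash database behaves. As a frame of reference, here is a deliberately toy sketch in Python. It uses a simple average-hash stand-in rather than Apple's NeuralHash, and the database entry and photo are invented; the only point it illustrates is that a device flags whatever lands near an entry in the database, so whoever controls the database controls what gets flagged.

```python
# Toy perceptual-hash matching. This is NOT Apple's NeuralHash; it is a minimal
# average-hash stand-in used only to illustrate database-driven flagging.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def is_flagged(image_hash, database, max_distance=4):
    """Flag the image if its hash is 'close enough' to any database entry."""
    return any(hamming(image_hash, entry) <= max_distance for entry in database)

# The device only compares opaque hashes, so whoever curates the database decides
# what gets flagged -- CSAM, a leaked document, or a protest photo all look the same.
database = {0xF0F0F0F0F0F0F0F0}                   # hypothetical blocklist entry
photo = [[200] * 4 + [30] * 4 for _ in range(8)]  # hypothetical user photo
print(is_flagged(average_hash(photo), database))  # -> True
```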

  • Too late (Score:5, Insightful)

    by hebertrich ( 472331 ) on Thursday August 19, 2021 @05:17AM (#61707097)

    Software exists, Apple can do it, the cat's out of the bag, so... forget it. It WILL be used. Whether it's Apple or a political regime, it will surface and get used if not in use already. Genie's out of the bottle and all those things...

    • Re: (Score:3, Interesting)

      Funny how almost every news story here about politics or privacy results in a first post that basically is pro-corporate or pro-government status quo, encouraging people to just accept the stupidity and move on.
  • Which Android phone should I get?
    • by e**(i pi)-1 ( 462311 ) on Thursday August 19, 2021 @05:54AM (#61707159) Homepage Journal
      I was a die-hard Apple fan until August 5th. Yesterday I transitioned from my iPhone 12 to a Samsung Galaxy S21 Ultra 5G (Google Pixels were also on the short list but not available). The Syncios manager was a good tool to get things away from Apple, like music, documents, etc. At the moment my iTunes library is unorganized in Linux. Strangely enough, a simple grep to search for a specific song is more effective than an iTunes search. Everything I've needed so far transitioned over nicely. Still, I'm upset. This great company has lost its way.
      • Re: (Score:3, Funny)

        by 605dave ( 722736 )

        Yes Google is a MUCH better way to protect your privacy. As the progenitor of surveillance capitalism I am sure you have nothing to worry about.

        I was a die-hard Apple fan until August 5th. Yesterday I transitioned from my iPhone 12 to a Samsung Galaxy S21 Ultra 5G,

        This is great! You have left Apple for a platform that tracks your every move, scans your email, and scans your Google Drive.

        Seriously though -- you jumped from the frying pan into the blowtorch of personal surveillance.

        • The one true way to leave Apple and not be subject to surveillance would be to walk around with a black Bell landline phone dangling from your belt. Now all you would need is a place to plug it in and the appropriate plethora of local landline plans.

    • by 605dave ( 722736 )

      If you think Android is a step up in privacy I have a bridge to sell you.

    • Do you like OS updates? Well, bad news on that front.

    • Which Android phone should I get?

      The one that doesn't make you run an antivirus every time you make a phone call. Oh, wait!...

  • by Bruce66423 ( 1678196 ) on Thursday August 19, 2021 @05:35AM (#61707117)

    China is very likely to enforce this or something similar on software on phones sold there. Apple has a choice: abandon its major market, or find an excuse to surrender elegantly.

    Don't be surprised.

    • They don't need access to a phone. They have everything they need to see your traffic and likely have or can get any keys stored on the device. When your ISP is the CCP, there is no incognito mode.

    • by gweihir ( 88907 ) on Thursday August 19, 2021 @08:55AM (#61707601)

      Well, it is clear they have surrendered. But there is nothing "elegant" about it. They just admitted that all their concerns about user privacy of the last few years were direct lies. Not that this is any real surprise.

    • by AmiMoJo ( 196126 )

      Google chose not to do business in China. There is always a choice.

    • China already has all the unencrypted backups of the phones; China's laws already make it possible for them to grab this info off of any phone or backup at any time.

      Think, people. Oppressive regimes have no need for this rigamarole. China is a country openly practising a cultural and possibly a literal genocide on the Uighur people, and you think they're going to care about tinkering with a phone to make it look like you had pictures on it that you didn't? They'll just toss you in jail and say you did a bad thing.

  • Burn (Score:3, Insightful)

    by Malifescent ( 7411208 ) on Thursday August 19, 2021 @05:48AM (#61707135)
    I hope Apple burns over this. I've predicted that they won't be able to turn this around since backtracking would imply that their profits come before the sexual safety of children. So they'll stick to it, even if it means the company loses users hand over fist.

    The company will go down over this.

    It's an absolutely monumental mistake that the company didn't see this blow-back coming. The world and its dog are against this plan, but they keep insisting they just need to "explain it better." That's usually a sign that they're stuck in a groove and resist crawling out of the hole they've dug for themselves.
    • Re:Burn (Score:4, Insightful)

      by 605dave ( 722736 ) on Thursday August 19, 2021 @06:33AM (#61707241) Homepage

      They won't burn. Most people understand that this is more of a gray zone than people around here want to admit.

      Apple is trying to solve a very real and very serious problem that most people understand is a huge issue in today's world. This is where the rubber hits the road. You would have Apple do nothing about serious abuses of their technology because of the (very real) potential abuses of their solution. Is your opinion that no digital tools can ever be used to combat some truly evil things because they might be used in some future bad way? If you think this is wrong, is there a technology you can suggest that can help combat the very real problem of sexual exploitation?

      • People will view this as a threat to their privacy. Mark my words: Apple is going to be hurting.

        Aside from that: it's not Apple's job to police everyone's smartphone. Whether CP is present on smartphones in large quantities is highly questionable anyway.

        Other tech companies will take a wait and see attitude to see how this works out for Apple and I don't see Microsoft announcing a similar scheme on Windows anytime soon.

        In addition, the threat of false positives is HUGE since they don't use regula
        • by 605dave ( 722736 )

          Microsoft and Google have been scanning emails for child porn for almost a decade now. Doesn't seem to have hurt them.

      • Re:Burn (Score:5, Insightful)

        by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday August 19, 2021 @07:47AM (#61707389) Homepage Journal

        They won't burn.

        agreed

        Most people understand

        If your sentence starts like this, you are already wrong.

        that this is more of a gray zone than people around here want to admit.

        Putting aside the fact that most people have their heads completely up their assholes, this is not a gray zone. This is a hard line which must not be crossed. Inspection of users' data only goes one way. Either store users' data for them without inspection, or don't store it at all. There's nothing good down the path of spying on your customers.

        You would have Apple do nothing about serious abuses of their technology

        That's right. I would.

        because of the (very real) potential abuses of their solution

        s/potential//

        Is your opinion no digital tools can ever be used to combat some truly evil things because they might be used in some future bad way?

        They developed these tools so that they could use them in a bad way in the present, specifically to quash dissent in China. They are now making excuses for it on the usual "won't someone think of the children" basis.

        If you think this is wrong, is there a technology you can suggest that can help combat the very real problem of sexual exploitation?

        Mental health care.

      • by Rotting ( 7243 )

        Most people will not even realize this "feature" exists or what the implications of it are. It's not like Apple is going to include it on the "new features in iOS 15" page.

    • by larwe ( 858929 )

      It's an absolutely monumental mistake that the company didn't see this blow-back coming. The world and its dog are against this plan, but they keep insisting they just need to "explain it better." That's usually a sign that they're stuck in a groove and resist crawling out of the hole they've dug for themselves.

      The thing is, Apple has (generally) succeeded with just doubling down on "you're doing it wrong, you need to think more courageously like we do." Based on that history, it's easy to see why they would keep sticking to this line. Look how long it took them to admit that the butterfly keyboard design was unfixably flaky (not that they ever admitted this in so many words; they just courageously and semi-silently omitted it from future MacBook designs). It's understandable that they might see the best strat

      • The butterfly keyboard really hit Apple's credibility and outraged many of its customers. When you pay top dollar for a product and the company makes bumper profits you'd at least expect them to be lenient when it comes to fixing and replacing faulty hardware and software. Instead they accuse you of being a whining moron who's using it wrong.

        If they double down on this (like I've predicted) they're in for a hard ride.

        BTW: I hate having to add my own HTML markup to my Slashdot comments. Why are the
    • by N1AK ( 864906 )
      Really wish this was the kind of place where people could pick this up with you in a year to see how smart you're feeling about your prediction that this will bring Apple down. I'm right on the fence on this decision, but outside tech crowds I've got literally zero anecdotal evidence that anyone disapproves of, let alone opposes, this.

      It's not that I am not concerned about the slippery slope argument and where this could go, but frankly if Apple was willing to secretly divulge documents outside the annou
    • Google, Microsoft, Dropbox, etc. already do this. Apple is just the first to make a big deal out of it and the first to do it locally on the device rather than in cloud storage (and it only does it locally on the device if you use iCloud, so if you care, turn off iCloud).

      This is a thing that will blow over very soon and everyone will forget about it. I'm not a fan of reducing my privacy, but short of using some 3rd-party Android OS or other half-baked phone OS, I can't think of any other devices that would i

        It's a huge deal doing this on the device. People expect their devices to be theirs, to do with as they see fit, including downloading and viewing illicit material. Apple is creating a hugely dangerous precedent with this scheme.

        You can be sure that copyright holders will start pressuring Apple and the government to start policing for copyright infringement on their devices in the near future. Apple claims it will resist such requests, but what if the government starts mandating it by law?
      • (and it only does it locally on the device if you use iCloud so if you care, turn off iCloud).

        As I learned this weekend, it's not that simple.
        A software update (or one of the 57 click-throughs that occurred afterwards in order to be able to use the device again after it booted) happily turned it back on for me.

        Well, to clarify, it turned on iCloud Photo for me again.
        I suppose if I didn't have an account associated with iCloud at all, then it probably would have had a harder time... but then again, the device is essentially useless without that.

    • It's an absolutely monumental mistake that the company didn't see this blow-back coming.

      What blowback? People squawk about literally everything on the Internet. If your prediction that Apple will "go down" over this is correct -- if it really hurts their quarterly results -- then that will be blowback.

      Here are my predictions though:

      1) The vast majority will have zero personal impact and won't care.
      2) Of those who care on the basis of what they read on the internet, the highest-impact stories will be

    • No, this is a thing that tech people like us are worried about. Nobody else gives two shits.

      But as tech people, we have to understand the nuance here. Apple isn't wholesale scanning your phone. They're scanning photos that are being moved to and from their iCloud Photos service. The processing happens to be on-device, rather than in their cloud. Facebook already scans all the pictures that are uploaded to its service, and they catch about 20 million CSAM images a year.

      Is there a worry that bad stuff will be

  • It's such a nothingburger. Oh no, if you choose to store your shit on iCloud there's a tiny fucking chance a stranger from Apple will check a couple of your images because they have "similarity" (from the POV of artificial stupidity) to kiddy porn, and if not, they just move on to the next stranger. The humanity!!!

  • You really don't need device level access to make this happen. Hackers have been scraping screens and using ML and AI to pull the more valuable information off our devices for some time now. Kinda makes using your personal devices for personal reasons a moot point. If it connects to a network, it will call home and reveal all.
  • It's all a marketing stunt. Apple wants everyone to believe that if you use an Android you're a pedophile, to get more people to switch to iPhone. Being able to zero in on people leaking their products is just a bonus to them.
  • "only a few of the calculations" would trigger the bug, but the user had no way of knowing which (the IRS even issued a "we don't care; you signed it, not the chip). For practical purposes it meant that ALL FP calculations were wrong.

    No matter how minuscule the chances of a false positive, and no matter how disastrous the consequences for the file holder (some places still have the death penalty for child porn), since you have no way of knowing if an image on your phone/tablet(/Mac?) will generate a matching hash, you

    • by Pinky's Brain ( 1158667 ) on Thursday August 19, 2021 @05:59AM (#61707177)

      What disastrous consequence? Even if they triggered on a single match, what would happen? Some stranger from Apple in a max-security setting will get to see one of your images (the chances of multiple accidental collisions are too small to be relevant). Then he sees it's only a dick pic and not CP and moves on.

      By requiring multiple matches there is no chance of accidental matching without actual CP or planted colliding images. Now the stranger at Apple sees some bullshit colliding images and moves on.

      Disastrous.
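For a rough sense of how much a match threshold changes the odds of an accidental report, here is a back-of-the-envelope sketch in Python. The per-image collision probability, library size, and threshold are invented for illustration and are not Apple's figures (Apple's own stated estimate was on the order of one false report per trillion accounts per year).

```python
# Illustrative only: made-up per-image collision rate, library size, and threshold.
from math import exp, lgamma, log

p = 1e-6          # assumed chance that one innocent photo collides with the database
n = 10_000        # photos in a hypothetical library
threshold = 30    # assumed number of matches required before anything is reviewed

def log_binom_pmf(k: int, n: int, p: float) -> float:
    """log P(exactly k collisions among n independent photos)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def prob_at_least(k: int, n: int, p: float) -> float:
    """P(at least k collisions), summed in log space to avoid overflow."""
    return sum(exp(log_binom_pmf(i, n, p)) for i in range(k, n + 1))

print(prob_at_least(1, n, p))          # a single accidental match: about 1%
print(prob_at_least(threshold, n, p))  # reaching the threshold by accident: astronomically small
```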

      • by 605dave ( 722736 )

        You have obviously read and understood what Apple is proposing and have a realistic approach to the issue. Why are you posting here?

        What disastrous consequence? Even if they triggered on a single match, what would happen? Some stranger from Apple in a max-security setting will get to see one of your images (the chances of multiple accidental collisions are too small to be relevant). Then he sees it's only a dick pic and not CP and moves on.

        The real disastrous consequence is that Apple is turning dissidents in to the Chinese government, and they are doing this child porn shit in order to get the usual cadre of useful idiots to cheerlead for them as a result. Whether you know it or not, you're literally celebrating oppression.

        That you don't seem to know it is typically chilling.

      • I guess the question your post raises in my mind is "Why do you want someone at Apple looking at your dick pic?" Are you that proud of your penis or that uninterested in your own privacy and that of others?
      • by thegarbz ( 1787294 ) on Thursday August 19, 2021 @01:31PM (#61708447)

        Then he sees it's only a dick pic and not cp and moves on.

        And what if it's a picture of your daughter naked? Not CP just naked. In a country like America where people have been prosecuted for watching legal porn (but the actress looked young, omg, arrest the pedo fucker!), or in a country where people have been dragged through the police system for having pictures of their own children playing (sick pedo fucks, what would Jesus do!), or lynched in a park for taking pictures of their own children (he's a lying pedo, get him, we have morals on our side),

        yes. YES it very much could have disastrous consequences.

        If we were talking about a black and white concept here I'd agree with you, but there's nothing black and white about child pornography.

        Fuck man some western countries have proposed banning all porn with actresses who have A-cups, because OMFG THE PEDOS ARE EVERYWHERE.

        Fall in fucking line citizen.

      • "Even if they triggered on a single match what would happen? Some stranger from Apple in a max security setting will get to see 1 of your images"

        If someone at Apple can see 1 image, then a govt can compel Apple to allow someone NOT at Apple to see more than 1 image. A slight modification to wildcard ALL matches for images/files on a phone could easily allow this type of interrogation.

  • Collisions are not some cyberswatting weapon unless you assume Apple doesn't verify matches, which they do. So they are only a DOS attack on the verification; let Apple worry about that.

    https://appleinsider.com/artic... [appleinsider.com]

    • How do they verify though?

      From: https://www.hackerfactor.com/b... [hackerfactor.com]

      "The laws related to CSAM are very explicit. 18 U.S. Code 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple -- not NCMEC.

      It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. 2258A is specific: the data can

  • by Qwertie ( 797303 ) on Thursday August 19, 2021 @06:19AM (#61707213) Homepage

    My concerns:

    First, no one is allowed to see the images that were used to create the database, since they are presumed illegal to possess or distribute (regardless of whether they actually are illegal to possess or distribute). While the term CSAM is being used, what assurance do we have that this only contains "abuse" imagery? For example, 16-year-old couples can engage in various perfectly legal sexual activities, but if those activities are recorded, the recordings are "child porn" under U.S. law. Does the CSAM database include such images? Many people would be happy if the answer is "yes", but I think that vigorous censorship of legal activities actually encourages child abuse. You see, because of the crackdown, the most practical way to acquire images of young people doing the nasty is to buy pictures from a black market, and the people most likely to take the risk of selling such imagery are illegal sex traffickers, not individual 16-year-old couples raising money to pay for college. (And because recordings of legal activities and illegal activities are both criminal to possess, a pedophile who might have been satisfied with a recording of legal activity might choose to buy abuse imagery instead if it is easier to find, which also incentivizes child abuse. Plus, a pedophile might look at the vigorous enforcement of laws against image possession, and conclude that downloading images is too dangerous and that he is actually less likely to get caught by actually abusing a child. Also, naughty pencil drawings are illegal under U.S. law [wikipedia.org], and some would argue that it would be safer for children if pedophiles were to look at fictional images rather than to purchase real ones.)

    And as many have pointed out, if there were completely non-sexual images in the DB, no one would know.

    The system also has an oldness bias: that is, old images are more likely to be in the database than new ones, and so a person is less likely to be caught by storing brand-new "0 day" images that they just purchased from a sex trafficker, and much more likely to be caught based on something found in an old CD from 15 years ago. Paradoxically, then, someone familiar with the system might be more inclined to pay for new images than old ones, which incentivizes child abuse (though only mildly, since there are safer ways to avoid the system, like storing nasty images on a PC/Android)

    • For example, 16-year-old couples can engage in various perfectly legal sexual activities, but if those activities are recorded, the recordings are "child porn" under U.S. law. Does the CSAM database include such images?

      How would Apple have the image hashes if they weren't uploaded to the god damned Internet and circulating as child pornography? So THAT happens, and you're asking what about the people in the video when they have the video in their phones? Seriously?

      "I'm the one that took the video"
      ", your honor."
      Hmm, does it pass the test?

    • what assurance do we have that this only contains "abuse" imagery?

      You could read their Threat Model [apple.com] paper, in which they discuss the answer to that question. In fact, they even state that Source image correctness (bolding in original) is a design goal, which they formalize as

      there must be extremely high confidence that only CSAM images -- and no other images -- were used to generate the encrypted perceptual CSAM hash database that is part of the Apple operating systems which support this feature

      • I have RTFTM.
        It's clear that their goal is legitimately to catch what they say they're trying to catch.
        The security mechanisms in place are obviously bona fide.

        But good faith does not make an action immune from bad consequences.
        There are many reasons to believe that a person's right to security in their persons, houses, papers, and effects, against unreasonable searches and seizures shouldn't be violated by a private corporation not-so-bound by said constitutional protection.
        Apple is free to do this, u
    • Essentially it's telling pedophiles (stupid enough to store their shit on their phone...really?) that they just have to be more creative about their framing, posing, and subjects.

      Setting aside the efficacy of AI, I'm still wondering what PRECISELY we're even CALLING child porn since we can't seem to decide on it either.
      In Japan, what I would ABSOLUTELY call uncomfortably-close quasi porn is dismissed because "well...it's a 400 year old demon in a prepubescent child's body, so not pedophilia".
      If I draw a big

  • I've already read that the algorithm was reverse engineered and people have created pictures that collide. People will protest this by mass altering everyday pictures on the web to match pictures already in the database and doing everything they can to get them onto everyone's devices.

    What they are doing is a much more complex version of what I did with a software distribution years ago. Instead of keeping a file of CRC32s to verify the distribution's files, I wrote an algorithm to calculate and add four bytes to every file that caused the CRC32 to be zero.

    • Even if you manage to get something with the same checksum (keeping in mind a 32-digit checksum has 300 undecillion combinations), would it have the same filesize and dimensions?

    • Instead of keeping a file of CRC32s to verify the distributions files, I wrote an algorithm to calculate and add four bytes to every file that caused the CRC32 to be zero.

      Appending the CRC32 of the original data to the end of the file will cause the CRC32 of the combination to be zero. This is a well-known property of CRCs. There are some caveats (e.g. some CRC code inverts the result and you'd need to compensate for that) but the principle is straightforward.

      A less-known property is that CRC(X ^ Y ^ Z) = CRC(X) ^ CRC(Y) ^ CRC(Z). It's relatively simple to leverage this property to determine four bytes you can add to any given file to produce an arbitrary user-selected CRC32
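Since the parent spells out both properties, here is a short, self-contained Python sketch of them. The raw_crc32 helper is a hand-rolled illustration (zero initial value, no final XOR), not code from any real distribution tool; zlib.crc32 is the standard-library routine, which inverts the register at the start and end.

```python
import os
import struct
import zlib

def raw_crc32(data: bytes, poly: int = 0xEDB88320) -> int:
    """Reflected CRC-32 with zero init and no final XOR (unlike zlib's variant)."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (poly if crc & 1 else 0)
    return crc

msg = b"any file contents at all"

# Property 1: append the (raw) CRC little-endian and the combined CRC is zero.
print(raw_crc32(msg + struct.pack("<I", raw_crc32(msg))))               # -> 0

# Caveat from the parent: zlib's CRC-32 inverts the register, so the same trick
# yields a fixed nonzero constant (the "residue") rather than zero.
print(hex(zlib.crc32(msg + struct.pack("<I", zlib.crc32(msg)))))
print(hex(zlib.crc32(b"xyz" + struct.pack("<I", zlib.crc32(b"xyz")))))  # same constant

# Property 2: for equal-length inputs, CRC(X ^ Y ^ Z) == CRC(X) ^ CRC(Y) ^ CRC(Z).
x, y, z = os.urandom(64), os.urandom(64), os.urandom(64)
xor3 = bytes(a ^ b ^ c for a, b, c in zip(x, y, z))
print(zlib.crc32(xor3) == (zlib.crc32(x) ^ zlib.crc32(y) ^ zlib.crc32(z)))  # -> True
```

The three-way XOR identity holds even for the inverted variant because the constants contributed by the initial and final inversions cancel when an odd number of equal-length inputs are combined.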

  • The only images scanned by the system are images that are milliseconds away from being uploaded and stored in iCloud, unencrypted. The attacks contemplated here are ridiculously more complex than necessary, since the attacker can just directly scan your unencrypted photos in the cloud.

  • Take naked selfies and you might get your own pictures flagged as kiddie porn.
    • I used to frequent a forum aimed at people with various chromosomal or intersex disorders, and one section was for people who suffered from issues relating to delayed or impossible puberty in general, and this issue came up quite often -- most notably when Australia decided to demand a minimum breast size in porn because pEdOpHiLiA. What you're saying is a very real fear for people like myself: even if we can prove that we're adults, what if we're asked to prove when the photo was taken? What do you want me to

  • Apple likes to give off the image that it is our champion for privacy. No trackers, no unlocking phones, it all sounds good.

    But Apple still has your device -- your contacts, email, notes, photos, and all manner of private data -- at their fingertips to scan and otherwise do what they please.

    You think you're buying Apple's product when you buy their iPhone. No, you're buying your own membership card to become their product.
  • Seriously. Stop living in a fantasy world. This is every authoritarian's wet dream.

    This also gives us the only credible reason Apple had this really bad idea and is now insisting on doing it: they were pressured into it and were unable to defend themselves effectively. Sets a nice precedent, and comes with assured intent to misuse it a little down the road.

  • by CyberSnyder ( 8122 ) on Thursday August 19, 2021 @09:06AM (#61707633)

    Obviously I don't sit in a C-level position at Apple, so this is opinion, but it looks like Apple is taking the least intrusive, but still effective, approach to preventing child pornography from residing in iCloud. Due to changes in ISP safe harbor laws, they could be held liable for child pornography stored on their servers. They aren't scanning your hard drives; it's just a scan before upload to iCloud. Flag and set aside. Any other images are still encrypted in your iCloud account and not accessible. Does it mean there will never be any child pornography in iCloud? Of course not, but they've made it enough of an inconvenience that it will not be a frequented repository for that stuff. Scanning on device before upload is the least intrusive, yet still effective, means of preventing child pornography on their servers, and they will be able to say that anything that got through was because the Feds didn't flag that image. If you really must have kiddie porn on your iDevice, turn off iCloud. Apple really isn't trying to be the police; they're trying to not be sued / fined. But, yes, it's putting the piping in place to scan for other files.

  • It will be a bad thing even when used for its purpose. Just wait until your picture of some flower is deemed by Apple's infallible AI to be child abuse. Everything Apple you own suddenly stops working. You get a visit at 5 AM from the vice squad who take you away in handcuffs. Right, good luck with that.

    Apple's idea is that it is going to subject millions and millions of people to potential life-altering errors because there is a 0.0000000001% chance that they will discover some crime. But this will pl
