
Edward Snowden and EFF Slam Apple's Plans To Scan Messages and iCloud Images (macrumors.com) 55

Apple's plans to scan users' iCloud Photos library against a database of child sexual abuse material (CSAM) to look for matches, and children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF). MacRumors reports: In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future. Snowden also noted that Apple has historically been an industry leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security." The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and "iCloud Photos" could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all "iCloud Photos" users, not an improvement," the EFF cautioned.

  • the more people who sign this the better:

    https://www.change.org/p/apple... [change.org]

    -dave

  • To illustrate the issue I have, consider the following.

    There are loaded assault rifles at every street corner. They are _intended_ only to be used in self defence, or to prevent criminal activity. There are clear signs above them informing people that they are not to be used for any other purpose. But there are no actual barriers preventing someone from picking one up and using it to rob the local Kwik-e-mart.

    Do you think the trust placed would ever be abused?

    That is the problem here: once privacy is invaded, no matter how good the initial intentions, eventually it will be abused by many of those in a position to abuse.

    • That is why this is important. Communicate it to Apple. http://chng.it/Jq9xLmvmsz [chng.it]
      • by Ol Olsoc ( 1175323 ) on Friday August 06, 2021 @07:41PM (#61665517)

        That is why this is important. Communicate it to Apple. http://chng.it/Jq9xLmvmsz [chng.it]

        No one seemed to complain when it was regularly used on Google Drive. Must be different.

        • by Rewind ( 138843 )

          I think the key difference is that this is you uploading data to Google's servers. I think the owner of a server checking what is being uploaded (via hash) is a lot more reasonable than a company scanning user devices. I think most people would be fine with this if it was only iCloud content. I am not taking a side here, I just think that is an important distinction.

          • Re: (Score:2, Informative)

            by Ol Olsoc ( 1175323 )

            I think the key difference is that this is you uploading data to Google's servers. I think the owner of a server checking what is being uploaded (via hash) is a lot more reasonable than a company scanning user devices. I think most people would be fine with this if it was only iCloud content. I am not taking a side here, I just think that is an important distinction.

            Except that the pictures that get scanned are pictures that a person who uses iCloud Photos is planning to upload to iCloud.

            https://apple.slashdot.org/sto... [slashdot.org]

            Seriously, they don't scan your iCloud images if you don't have iCloud imaging turned on. https://www.imore.com/psa-appl... [imore.com]

            I'm 100 percent certain that almost everyone here spreading the FUD simply hates Apple. That's how they justify and approve of Google Drive doing the same thing, yet they go Reee! if the dreaded criminals fro

            • by mattr ( 78516 )

              Funny the security analysis document I read from Apple focuses on images being sent by iMessage to minors in a family sharing plan. I take it iCloud is actually being used to store those images in the background. It seems to be aimed at protecting minors from being traumatized by predators sending them known CSAM pictures, but such predators could just take pictures of themselves. Unless the scanning is expanded to recognize genitals in general it appears to be of limited use but a simple configuration away

              • Funny the security analysis document I read from Apple focuses on images being sent by iMessage to minors in a family sharing plan. I take it iCloud is actually being used to store those images in the background. It seems to be aimed at protecting minors from being traumatized by predators sending them known CSAM pictures, but such predators could just take pictures of themselves. Unless the scanning is expanded to recognize genitals in general it appears to be of limited use but a simple configuration away from scanning whatever, like political party logos. I think focusing on iCloud is not the point. It can be expanded through pressure from governments in the future, that is what people are worried about.

                How big a leap is it between getting hash values and actually looking at the pictures?

    • by gweihir ( 88907 )

      Exactly. Nice illustrative example!

    • Re: (Score:2, Interesting)

      by quenda ( 644621 )

      To illustrate the issue I have, consider the following.

      There are loaded assault rifles at every street corner.

      I'm pro-privacy, and pro gun control. But that is possibly the worst analogy ever on slashdot that did not include a motor car.

      https://yourlogicalfallacyis.c... [yourlogicalfallacyis.com]

      So I guess you want a complete ban on search warrants or police power of arrest?
      Let's not make murder illegal, because sooner or later we will be executing people for jaywalking.

      • Search warrants and arrests have to have probable cause.

        You having a phone with a data processing device and a photo camera is not probable cause.

        The programme above is the equivalent of deciding to deal with the fact that cases like Natascha Kampusch's (which was horrible, don't get me wrong) happen by making sure cops can thoroughly search anyone's house and property whenever they want.
  • Apple's content detection doesn't work well as it is. If I search for a "candle" among the pictures of things that are, in fact, candles, there's also a picture of my turntable playing a record. Neural networks just aren't reliable enough that false positives aren't going to happen.

    I'd have no problem if this were implemented the same way other services do it, where they scan their own servers for potentially illegal material. But having a phone scan itself and narc on you? That's insanely draconian and

    • Apple's content detection doesn't work well as it is.

      No no no, you don't get it. This will be "AI" powered, and AI is amazing and can AI the AI AI so we can all AI.

      It's completely different from the rest of their systems. Trust us.
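The false-positive worry raised in this thread can be made concrete with a quick base-rate sketch. Both numbers below are purely illustrative assumptions, not figures Apple has published:

```python
# Base-rate sketch: even a very accurate matcher produces many false
# positives when run against billions of benign photos.
# Both numbers are illustrative assumptions, not Apple's figures.

photos_scanned = 10_000_000_000   # assume ~10 billion photos scanned
false_positive_rate = 1e-6        # assume a 1-in-a-million per-photo error

false_flags = photos_scanned * false_positive_rate
print(f"Expected false flags: {false_flags:,.0f}")
```

Even at one-in-a-million accuracy, that works out to on the order of ten thousand innocent photos flagged for human review.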

  • Putting Edward Snowden in the same sentence as an organization is probably not good for that org's reputation.

    But let's try it out, "Edward Snowden and Microsoft both announce new major leak." Nope. That did nothing to improve the reputation of either one and probably harmed the latter far more than the former.

    • The EFF started out defending hackers in court. Being associated with people like Snowden is who they are. If you don't like that, you probably don't like the EFF.

    • Ahhh I see you're one of the people who judges words based on who says them rather than their content, applicability and whether or not they make sense.

      • Who says them does indeed matter.
        When tRump made claims, his 30,000 + outright lies bore heavily on the reliability of the claims.
        Snowden has no such burden
        • It only matters when the only source of a claim is the person themselves.

          An appeal to authority is as much of a logical fallacy as an ad hominem. The two go hand in hand. No one here is suggesting people blindly trust, but rather examine the content and analyse the point being made.

          • Wrong.
            Argumentum ad magisterium is only a fallacy when the authority is in some way compromised.
            What you're thinking of is the famous Argumentum ad Verecundiam, argument from INCOMPETENT authority, but supposedly you know the difference.
            VRWC, incompetent.
            CNN, not so much, subject to failures but takes every reasonable precaution.
            Newsmax? Incompetent to tell day from night
            PBS? Not so much
    • Far more likely for the association to hurt Edward Snowden's rep than the other way around.
  • I mean really. If you can't convince people to do something to protect adults, you use the magic words "for the Children!" and suddenly perfectly sane people give up all their rights.

    There are a lot of women that never agreed to porn, but have had naked pictures of them spread throughout the internet. If you are not willing to use this software for them, then you should not be willing to use it for the children.

    The way to save children is not to invade the privacy of the billions of people that have phones. There are better ways to do it - and those same ways can save the adult women that never agreed to porn too, while you are at it.

    • Imagine if Apple deploys this. The doors are now ripped off the hinges, and all sorts of abuse will flood forth in an unstoppable torrent. Where Apple leads, others follow, and you won't be able to buy any new device where you are not invaded in this manner.

      Governments are salivating over this, more so than the rabid Apple fanboys did right before the first iPhone came out.

      The nightmare is about to begin and it's got the battering ram at the ready to break down the doors to allow the whole fucking a

      • Of course, Apple is standing at the gate making appeals to emotion, hoping somebody is dumb enough to open the door before it picks up that battering ram.

        To make it clear "For the children" is just a pretext, a bait so all sorts of corporate and government abuses will be allowed through. Abuses that are justified with "but they are already scanning through our phones, so what difference do the (other abuses) make?".

        As I said this is far less about "protecting children" and far more about introducing other,

        • Hmm, it seems someone working for those who are trying to deploy this is modding down posts explaining the true motives behind what Apple is trying to do.

          They don't have to worry as Apple (and every other corporation) will monster all of this through, and the public will just put up with the abuse as usual.

  • When this hit slashdot yesterday, the story was about how a single researcher at a lab in New York speculated that this was going to happen, without any substantive proof.

    Today's news is that people are slamming Apple for something that some guy in a basement conjectured could happen?

  • So... what's the goal (of Apple) here? They detect "bad stuff" on a phone. Now what? Is a police department just supposed to take the word of Apple & arrest someone? Or maybe the PD will try to get a warrant. That will be an interesting affidavit to read "...because Apple said so..."

    Assuming you could have this "evidence" admitted in court that'll be fun to watch. And, oh my, the appeals will almost write themselves.

  • That something like this would get a company run out on a rail. Sign a petition? How about giving Apple the big "go fuck yourself" sendoff, and holding events such as burning a pile of iShit à la the "bigger than Jesus" outrage?

    Seriously, as it stands now Apple is just going to laugh and at most just introduce this garbage a little bit at a time. And when Apple does something, everybody wants to do it. Apple is still seen as the cool guy and everyone wants to be just like Apple.

    We need to trounce Appl

  • The underlying issue is the question:

    What, of what I know, may I keep for myself?

    This, however, is impossible to answer.

    Because of this, issues like the one in this discussion will never be solved: at the very best, only a compromise may be reached. This means that, no matter what, there will always be people dissatisfied, people who feel their rights are violated.

  • It's easy to be on board with protecting children, but it's unlikely to stop there. "Think of the children", likely will later include chosen blacklisted images from those pesky "Domestic Terrorists" and bigots. Better round up those who saved pepe memes and 4chan satire.

    • Better round up those who saved pepe memes and 4chan satire.

      You assume the folks here have an issue with rounding them up.

  • Now if they frame him with some planted CP, will Russia turn him over?

  • by fafalone ( 633739 ) on Saturday August 07, 2021 @03:35AM (#61666309)
    Others have covered some of the other problems here, but I wanted to reiterate what's potentially the most problematic issue. In some places, Apple is claiming a PhotoDNA-like hash only of known CSAM material. Other places, they're claiming to use their "neuralMatch AI", a ML system trained on 200k CSAM images. That's a *very* different thing. If it's flagging photos that an AI thinks is naked kids, I guarantee there will be cases of short, flat 18yos triggering it. Which will fail human "verification" too, like the government expert pediatrician who testified 19yo commercial pornstar Little Lupe was beyond doubt a preteen. The guy they were prosecuting for CP for a DVD with her content was saved when she showed up with birth certificate and other ID proving her age, but what happens when the actor can't be identified?
    And if it's found the government threatened Apple to force them into this, there's constitutional issues. Courts have ruled if the government coerces a private entity to perform a search for criminal evidence to prosecute someone, they're acting as an extension of the government and the 4th Amendment comes into play.
    • Apple is claiming a PhotoDNA-like hash only of known CSAM material. Other places, they're claiming to use their "neuralMatch AI", a ML system trained on 200k CSAM images. That's a *very* different thing.

      Exactly. So if they know the hash of an offending image, what do they need "AI" for? Also, if somebody does have such an image on iCloud, couldn't they totally trash the hash by changing one pixel?

      The whole thing stinks of deception.
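The "change one pixel" question is exactly why PhotoDNA-style systems use perceptual hashes rather than cryptographic ones. The toy sketch below contrasts the two; the "average hash" here is a deliberately simplified stand-in for real perceptual hashes such as Apple's undisclosed NeuralHash:

```python
import hashlib

# Toy 8x8 grayscale "images": lists of 64 pixel values (0-255).
image = [(i * 37) % 256 for i in range(64)]
tweaked = image.copy()
tweaked[0] = (tweaked[0] + 1) % 256  # change a single "pixel"

def crypto_hash(pixels):
    # Cryptographic hash: any one-byte change flips the whole digest.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Toy perceptual hash: 1 bit per pixel, set if above the image mean.
    # Real systems (PhotoDNA, NeuralHash) are far more sophisticated, but
    # share the goal of surviving small edits.
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

print(crypto_hash(image) == crypto_hash(tweaked))    # False: digest "trashed"
print(average_hash(image) == average_hash(tweaked))  # True: the tweak survives
```

So a one-pixel edit does defeat an exact cryptographic hash, which is precisely why these systems match on perceptual fingerprints instead.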

  • "Snowden also noted that Apple has historically been an industry-leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook"

    Short memory, Ed: https://www.cbsnews.com/news/f... [cbsnews.com]

  • I have to wonder how hard it would be for another AI to produce images that cause Apple's AI to give false positives, and how many times Apple's AI "cries wolf" before law enforcement ignores it.
  • by timmyf2371 ( 586051 ) on Saturday August 07, 2021 @01:39PM (#61667243)

    A UK-based organisation called IWF (Internet Watch Foundation) has, for over 10 years, maintained a list of URLs which it alleges contain images of child sex abuse. The list is not public knowledge but we do know that a Wikipedia page containing an album cover as well as the entirety of the Wayback Machine have been blocked at various points.

    Most of the consumer & mass-market ISPs in the UK subscribe to this list, but it is not mandatory. It was positioned as something which will help in the fight against child sex abuse and will help to prevent internet users from accessing such material.

    In 2011, the MPAA said to BT (probably our biggest ISP): "nice URL blacklist you have there; please add these URLs to it". The matter went all the way to the High Court of Justice which ruled in favour of adding URLs which did not contain images of child sex abuse to the blacklist. There is now an established procedure in which other such sites can be added.

    It is very easy to imagine how this ends up several years down the line. Here are a few possibilities which could conceivably trigger an alert to a user's national Government.

    - Chinese users with photos of Tiananmen Square on their phone.
    - Middle-eastern users with photos from a pride-type event.
    - American users with images of drugs.
    - British users with images which suggest attendance at unlawful protests.
    - Users from a country with photos suggesting opposition to the ruling President or party.

  • Apple's plan to scan images uploaded to the cloud is revealed.
    Media loses its mind.

    Meanwhile google, FB, IG and others who have been doing it all along with data that sits on their servers continue scanning away and no one cares?
