
Apple Has Been CSAM Scanning Your iCloud Mail Since 2019 (9to5mac.com) 52

According to 9to5Mac, Apple has confirmed that it's already been scanning iCloud Mail for Child Sexual Abuse Material (CSAM), and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups, though its announced intent to begin doing so sent the internet into a frenzy. From the report: The clarification followed my querying a rather odd statement by the company's anti-fraud chief [Eric Friedman]: that Apple was "the greatest platform for distributing child porn." That immediately raised the question: If the company wasn't scanning iCloud photos, how could it know this? [...] Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task. Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the "other data" does not include iCloud backups.

Although Friedman's statement sounds definitive -- like it's based on hard data -- it's now looking likely that it wasn't. It's our understanding that the total number of CSAM reports Apple makes each year is measured in the hundreds, meaning that email scanning would not provide any kind of evidence of a large-scale problem on Apple servers. The explanation probably lies in the fact that other cloud services were scanning photos for CSAM, and Apple wasn't. If other services were disabling accounts for uploading CSAM, and iCloud Photos wasn't (because the company wasn't scanning there), then the logical inference would be that more CSAM exists on Apple's platform than anywhere else. Friedman was probably doing nothing more than reaching that conclusion.
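As the report says, server-side scanning of unencrypted mail really is mechanically trivial. Here is a minimal sketch in Python, assuming a plain SHA-256 digest stands in for the perceptual hash (PhotoDNA-style) that real systems use so near-duplicates still match; the digest table and function name are hypothetical:

    import email
    import hashlib

    # Hypothetical table of hex digests of known files.
    KNOWN_DIGESTS = {hashlib.sha256(b"example known file").hexdigest()}

    def scan_message(raw_rfc822: bytes) -> list[str]:
        # Return filenames of attachments whose digest matches the table.
        # This only works because the server sees the plaintext message;
        # end-to-end encrypted content would be opaque here.
        msg = email.message_from_bytes(raw_rfc822)
        hits = []
        for part in msg.walk():
            if part.get_content_disposition() == "attachment":
                payload = part.get_payload(decode=True) or b""
                if hashlib.sha256(payload).hexdigest() in KNOWN_DIGESTS:
                    hits.append(part.get_filename() or "<unnamed>")
        return hits

The iCloud Photos proposal was controversial precisely because it moved part of this matching onto the device rather than keeping it server-side, as the comments below debate.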

  • That's fine (Score:5, Insightful)

    by Agent Fletcher ( 624427 ) on Monday August 23, 2021 @07:56PM (#61723145)
    You can scan all the stuff that leaves my device and enters your storage. It's normal, it's expected. You can scan your iCloud all day, but don't scan my phone or put those databases on my phone.
    • Re:That's fine (Score:5, Insightful)

      by gillbates ( 106458 ) on Monday August 23, 2021 @08:14PM (#61723193) Homepage Journal

      No, it's not fine.

      In the United States, even the mere suspicion of wrongdoing subjects the accused to very costly financial penalties, even in cases where they are factually innocent. Even when the prosecution makes "innocent mistakes", the accused very often loses decades of their life, and pays lawyer's fees totaling the cost of a house, in the attempt to correct the mistake. And we, the taxpayers, often foot the bill for wrongful convictions to the tune of millions of dollars. Cook County (Illinois) pays out millions of dollars a year to victims of the legal system for cases it gets wrong. That's millions of dollars that could be providing free health care, better education, etc...

      Even an innocent "mistake" on Apple's part could cost third parties millions of dollars. That's not justice, it's not protecting children, and it's not acceptable.

      • Re: (Score:1, Interesting)

        by Anonymous Coward

        No, it's not fine.

        You didn't seem to care when Microsoft did it in 2009, or Facebook did it in 2011, or Google did it in 2012, or Twitter did it in 2014, or Flickr did it in 2019...

        No, you only seem to think it wasn't fine this year, when Apple said they will do it too.
        It seems less of a "wrong" thing to you and more of an excuse for yet one more rant against Apple, going by your posting history.

        • Did abandoning all Microsoft products and services and calling them a pile of shit not count?

          • You didn't seem to care when Microsoft did it in 2009, or Facebook did it in 2011, or Google did it in 2012, or Twitter did it in 2014, or Flickr did it in 2019...

            Did abandoning all Microsoft products and services and calling them a pile of shit not count?

            I can't stop laughing: first at the 2012 date for Google, when they read your Gmail from day one (not even going to get into Hotmail predating that), and then at the implication of dropping all Microsoft products in 2009 because of their *snicker* PRIVACY PRACTICES? Oooh boy.

            Kids these days

        • I don't know which is worse - that you think I hate Apple, or that you read my posting history. All the way back to 2009.

          Rediscover sunlight before it's too late!

      • Re: (Score:3, Insightful)

        by Aighearach ( 97333 )

        That's millions of dollars that could be providing free health care, better education, etc...

        What a load of horse shit. Universal health care saves money; it does not cost money. Education increases prosperity; it does not use it up. Money is the stuff of trade, and it is not zero-sum. Buy and sell twice as often and there is twice as much of it. It is fiat; the amount that exists is the amount we think we need.

        You fear that prosecution of child predators will reduce your prosperity, and I wonder at that.

        • If universal health care saved money, small cash-strapped towns would be implementing it, rather than speed traps, to make ends meet. Every government which offers universal health care has increased in size, and with it, taxes. (Take, for example, the UK's NHS, which employs around 2 million people.)

          But as for the more specific point: Cook County already taxes people as much as they can bear, so, because they can't raise taxes, these damages for false imprisonment are paid from taxpayer dollars, dollars

          • Whatabutt the Alamo, Whatabutt that?!

            If being kind to children produced happier parents, wouldn't parents in those same small towns stop abusing their children?

            If gambling causes you to lose money, not become prosperous, wouldn't people stop gambling?

            Wouldn't people stop smoking cigarettes?

            Wouldn't people this, wouldn't people that??

            Start with a small tribe of cavemen. They have one healer. You fall and break your leg. Does it cost money for them to take care of you? Or do they benefit from caring for you?

            A

            • The problem you're having is that you're looking at society as a whole versus the government bureaucrat looking at how they can write a check with money they don't have. The societal benefits of socialized medicine are not taxable - yes, it saves people money, but the cancer case or heart attacks which don't happen don't result in billable, taxable revenue for the government.

              It's not a question of "Does providing healthcare for all benefit society", but how that system is implemented. In Europe, where

              • The problem you're having is that

                I'm not having a problem, though.

                the government bureaucrat looking at how they can write a check with money they don't have

                The United States does not have that particular problem.

      • by AmiMoJo ( 196126 )

        I've always thought it was wrong that you could be falsely accused, found innocent, and still be bankrupted by it. There should be restitution for people affected by that, and for their families.

      • No, it's not fine.

        In the United States, even the mere suspicion of wrongdoing subjects the accused to very costly financial penalties, even in cases where they are factually innocent. Even when the prosecution makes "innocent mistakes", the accused very often loses decades of their life, and pays lawyer's fees totaling the cost of a house, in the attempt to correct the mistake. And we, the taxpayers, often foot the bill for wrongful convictions to the tune of millions of dollars. Cook County (Illinois) pays out millions of dollars a year to victims of the legal system for cases it gets wrong. That's millions of dollars that could be providing free health care, better education, etc...

        Even an innocent "mistake" on Apple's part could cost third parties millions of dollars. That's not justice, it's not protecting children, and it's not acceptable.

        It's suspicious that you consider the thing providing evidence to be the problem and not the justice system or prosecutorial misconduct... like cameras can lead to false arrests, so cameras are the problem?

        I'd rather have really good prosecutors, and grand juries with ALL the evidence, than a broken justice system with less evidence. Not sure who you're trying to save with that one.

        • As a programmer I learned that it is often easier to keep a system from becoming more broken than it is to fix all of the problems in a broken system.

          Because the problems facing the legal system are rather elusive from a policy level - there's no good, overall policy which says, "You know, this shouldn't even come before a court in the first place", we have to deal with a system in which even the suspicion of wrongdoing can subject innocent people to a loss. If the state had to reimburse innocent partie

    • Re: (Score:3, Informative)

      by exomondo ( 1725132 )

      You can scan all the stuff that leaves my device and enters your storage.

      If it's end-to-end encrypted then you shouldn't be able to.

      You can scan your iCloud all day but don't scan my phone or put those databases on my phone.

      AFAICT it's actually comparing hashes of known exploitation material? Surely those hashes could be computed locally on the phone and then uploaded to iCloud along with the encrypted photo, and they can do the hash comparison on their end? That way they don't need your unencrypted data and you don't need their CSAM database.
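      A minimal sketch of that split, in Python. Assumptions: a plain SHA-256 stands in for the perceptual hash a real system would need (so only byte-identical files match here), and the digest table, encryption callback, and function names are all hypothetical:

          import hashlib
          from typing import Callable, Tuple

          # Hypothetical server-side table of hex digests of known material.
          KNOWN_HASHES = {hashlib.sha256(b"example known file").hexdigest()}

          def client_prepare(photo: bytes, encrypt: Callable[[bytes], bytes]) -> Tuple[bytes, str]:
              # On-device: hash the plaintext, then encrypt it for upload.
              # The server never needs the plaintext, and the device never
              # needs the hash database.
              return encrypt(photo), hashlib.sha256(photo).hexdigest()

          def server_check(client_hash: str) -> bool:
              # Server side: compare the client-supplied digest to the table.
              return client_hash in KNOWN_HASHES

      The catch is that the server has to trust the client-supplied hash. Apple's actual proposal reportedly used a private set intersection protocol partly for that reason, but it also put a blinded form of the hash database on the device, which is exactly what the grandparent objects to.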

      • If it's end-to-end encrypted then you shouldn't be able to.

        You can still have end-to-end encryption with more than one party privy to the key. All Apple has to implement is some code to send your device's encryption key to their cloud servers, and they can scan whatever you've uploaded to their heart's content. Of course, you have to trust that Apple isn't going to do anything malicious with access to your encryption key, but honestly, we're already at the point where Google might have the password to your bank account [wired.com].
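        A sketch of that arrangement using PyNaCl; the key names and the "escrow" recipient are illustrative assumptions, not Apple's actual design:

            from nacl.public import PrivateKey, SealedBox
            from nacl.secret import SecretBox
            from nacl.utils import random

            # Recipients: the user's other device, plus a hypothetical
            # escrow key held by the cloud provider.
            device_b = PrivateKey.generate()
            escrow = PrivateKey.generate()

            content_key = random(SecretBox.KEY_SIZE)  # per-file symmetric key
            ciphertext = SecretBox(content_key).encrypt(b"photo bytes")

            # The same content key is wrapped to both public keys, so the
            # provider can unwrap it and scan the plaintext.
            for_device_b = SealedBox(device_b.public_key).encrypt(content_key)
            for_escrow = SealedBox(escrow.public_key).encrypt(content_key)

            # Provider side: recover the key and decrypt.
            key = SealedBox(escrow).decrypt(for_escrow)
            assert SecretBox(key).decrypt(ciphertext) == b"photo bytes"

        As the reply below points out, once a third key can unwrap the content key, the encryption is no longer end-to-end in any meaningful sense.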

        • If it's end-to-end encrypted then you shouldn't be able to.

          You can still have end-to-end encryption with more than one party privy to the key. All Apple has to implement is some code to send your device's encryption key to their cloud servers, and they can scan whatever you've uploaded to their heart's content.

          It's not end-to-end then, is it? Going from, say, my iPhone to my MacBook, there's a man in the middle with a key that can decrypt the data.

          Of course, you have to trust that Apple isn't going to do anything malicious with access to your encryption key

          Right, you only give it to people you trust, and you don't need to trust Apple to that degree: the device can upload the encrypted file and the hash. No need to upload the unencrypted file, no need to have the CSAM database on the device, and no need to give Apple your encryption key.

          but honestly, we're already at the point where Google might have the password to your bank account [wired.com].

          I'm not sure what your point is here. Forget encryption and privacy because you think the horse

    • It's both normal, and should be expected at this point, that you have 0 real control over your phone. Not only that, but they'll work tirelessly to find as many buyers for your data as possible.

      Hell, it doesn't even need to be about the money. If they think you're a pervert, or someone just wants to take a peek at what you've got in there, that's reason enough for them to start sucking more data.

      But hey, it's Apple... at least you can know, all that aside, that they respect your privacy. I mean, they're cha

    • Scanning isn't mandated by law, so I see no reason for cloud providers to do so preventively.

      If they really want to solve the problem they should simply end-to-end encrypt their cloud file storage (like Mega is doing) and be done with it. No one will know what's stored on their cloud servers, so they won't have to keep explaining to governments why there's CSAM on their servers and why they're not doing anything about it.
    • Cool, all of us will just set up SMTP on our dynamic-IP home connections with port 25 blocked, to take care of that "someone else's" storage problem.

      Just because we've become complicit in spam/malware filtering doesn't mean it can suddenly become a scan for illegal items or other things that they object to.
      Now it's child porn; then it'll be classified documents, if it's not already; then hashes of files that allegedly spread what they deem "misinformation", or "extremism", etc.

  • by kriston ( 7886 ) on Monday August 23, 2021 @08:30PM (#61723231) Homepage Journal

    Flickr has been scanning photos for years, too.

    I have the Flickr app auto-upload from my phone. Someone texted me a photo and it was flagged on my Flickr account as being an "adults only" photo by their AI scanner.

    Nothing about this is new.

    • by fermion ( 181285 )
      So this is why all the porn is moving to Gettr
    • The Photos app in Android scans local pictures and does some kind of analysis as well. You don't even need to upload them anywhere either, it'll scan them right off your SD card. It's already been deployed for at least a year or two, used for ads, but probably other stuff as well.

      The only new part of these revelations is the reveal itself. That typically comes years after the practice starts. My knowledge about the Photos app, for example, has yet to be "credibly" revealed publicly. (As far as I know, or

  • by Anonymous Coward
    Didn't governments just shut down a darknet child porn site with subscribers in the 100K+ range? It seems as if this guy was actually bragging in a sideways fashion. Sort of like, "hey, we're so secure that we're the preferred platform for trading child porn and thus we need to break our own security". If so, it's a total fail.
    • I can remember one that had 400K accounts linked to it. However, none of the users were ever identified, since they were using Tor, save maybe a few idiots who were using real email addresses. Only the system administrators could be tracked, probably because they had a physical server which could be linked to individuals (someone has to pay for the server).

      It seems to me that LEA can identify the physical location of a server even if it's on Tor, but not individual users.
  • Of course they were scanning for illegal content. They are obligated to report anything they can identify, and a failure to report, even with the "uh, we didn't know what was on there" excuse, would not make them any less liable...

    Apple's only mistake was announcing they were going to do it. Had somebody told the "genius" marketer who thought they'd get only good press by announcing such a thing to shut the f up, nothing would have exploded. They already have our permission per the TOS that we all agreed to

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      They are obligated to report anything they can identify,

      No. They are required to report anything that they DO identify. They are not required to scan. They are not required to take any proactive steps of any kind to identify anything. That's true in the USA, and I think, but am not completely sure, that it is also true in all the other jurisdictions that have major clout on the Internet. The EU was talking about adding a proactive scanning requirement for copyright and may also have such a proposal for CP,

      • I doubt that any hardcore "professional" pedos use those services for anything incriminating to begin with, and if they do they surely encrypt it using their own tools. Any long-term pedos who would be caught by that kind of scanning have already been caught.

        You're assuming a lot about the pedos. They have arrested cops and sheriffs and other law enforcement who were dumb enough to record themselves on January 6th, and they should have known that their electronic devices were going to leak their details all over the place...

  • by ksw_92 ( 5249207 ) on Monday August 23, 2021 @10:32PM (#61723449)

    It's been an option on Cloudflare accounts for some time. Their blog post about it is rather interesting: https://blog.cloudflare.com/th... [cloudflare.com]

  • Secret scanning (Score:4, Interesting)

    by Malifescent ( 7411208 ) on Monday August 23, 2021 @10:48PM (#61723497)
    I'm guessing they've also scanned the phone contents of a limited number of iPhone users, just to get an inkling of how much CSAM is present on users' devices.

    Obviously they're keeping this a well guarded secret, as this could blow the lid off their privacy image.
    • I'm guessing they've also scanned the phone contents of a limited number of iPhone users, just to get an inkling of how much CSAM is present on users' devices.

      Obviously they're keeping this a well guarded secret, as this could blow the lid off their privacy image.

      And what, they also hacked a bunch of Google or Samsung devices for comparison?

      Or the FBI straight up fucking told them! Sorry for the language, but why is everyone going off on weird tangents on this? Apple may have undisclosed methods of detecting CSAM, but that's not required to explain how they would know they are the biggest platform for it. It is as simple as the people catching the offenders telling Apple: you know, most of these perps are using your platform. Wink wink, nudge nudge, it would be

      • I'm making this claim because I don't believe simply analyzing the number of emails with CSAM attachments gives you a good indicator that Apple's iPhone is the "premier CSAM platform."

        You have to scan the files on the phones too to reach a meaningful conclusion.

        I'm assuming they only did this temporarily and that the system isn't currently operational, since if this leaked it could sink the company. My guess is they shared the results with the government.
  • by aerogems ( 339274 ) on Tuesday August 24, 2021 @02:08AM (#61723845)

    Just because they've been secretly doing this sort of thing on emails going through their systems for years doesn't in any way, shape, or form, make it acceptable to do the same thing on a person's phone. It's like when you were a kid saying, "Everyone else is doing X!" and your parents come back with, "If everyone was jumping off a cliff, would you do that too?"

    Their running this scan on their own computers I can accept as a compromise since I'm using their systems and they're potentially on the hook legally if they allow that kind of behavior to go on unchecked. However, what is the basic definition of spyware if not an unwanted program running on your computer (or phone, which is fundamentally a computer) that secretly monitors your actions and reports back to a third party?

  • How many *millions* of cases of breaking the GDPR is that now?

    Let's see. If Apple is unlucky, this can quickly turn into "You have to sell parts of your company."

    • by cmseagle ( 1195671 ) on Tuesday August 24, 2021 @08:06AM (#61724455)

      From EDRi [edri.org], an organization advocating for Europeans' digital rights:

      On 29 April 2021, the European Union co-legislators (EU Member States and the European Parliament (EP)) reached a provisional agreement (subject to a formal approval) on temporary legislation to allow providers of electronic communications services, such as web-based email and messaging services, to continue to detect, remove and report child sexual abuse material (CSAM) online.

      Seems to me that Apple is in the clear on this, for now.

  • by Maelwryth ( 982896 ) on Tuesday August 24, 2021 @01:05PM (#61725419) Homepage Journal
    What appears to be left out of the conversation is that it obviously doesn't work. Where are the thousands of arrests? Where are the tens of thousands? Facebook submitted 16.9 million reports in 2019 and 20 million in 2020. Apple has been scanning secretly, yet there have been no arrests. This looks to me like another excuse to surveil people. After all, once the system is in place, it isn't that far a stretch to submit a hash for other content.
