Encryption Privacy Apple

Apple's Child Protection Features Spark Concern Within Its Own Ranks (reuters.com) 99

According to an exclusive report from Reuters, Apple's move to scan U.S. customer phones and computers for child sex abuse images has resulted in employees speaking out internally, "a notable turn in a company famed for its secretive culture." From the report: Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread. Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

In the Slack thread devoted to the photo-scanning feature, some employees have pushed back against criticism, while others said Slack wasn't the proper forum for such discussions. Core security employees did not appear to be major complainants in the posts, and some of them said that they thought Apple's solution was a reasonable response to pressure to crack down on illegal material. Other employees said they hoped that the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple's direction on the issue a second time.
Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.
This discussion has been archived. No new comments can be posted.

  • Seems like Apple's gone down the child-protection road, but has no reason to search for hackers, scammers, liars, or despots. At least they don't have a social network yet.

    • I'm not sure people want to see a naked hacker, scammer, liar, or a despot. Maybe that's why there's not a problem.

      • I'm not sure people want to see a naked hacker, scammer, liar, or a despot. Maybe that's why there's not a problem.

        Speak for yourself! Just for that I'm not sharing my collection of risque daguerreotypes of Ada Lovelace with you.

    • by Joce640k ( 829181 ) on Thursday August 12, 2021 @07:58PM (#61686227) Homepage

      Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

      In other news: Apple has no idea how much pressure a government can apply if it wants to. What will they do if a government bans Apple phones in a country unless they comply?

      • I think there has been some confusion on the issues here.
        Two things are worth being clear about in order to have a discussion that's fruitf

        • by raymorris ( 2726007 ) on Thursday August 12, 2021 @08:31PM (#61686299) Journal

          Well darn, I accidentally clicked submit too soon.

          I think there has been some confusion on the issues here.
          Two things are worth being clear about in order to have a discussion that's fruitful, because it's based on actual facts.
          These are two things I notice as a career security professional which some commenters seem to have missed.

          First, the quick and easy point. The proposed system does NOT have any way of knowing what is on a user's phone. That is, it doesn't know what's in any of your photos.

          What it does is download a list of cryptographic HASHES of known child porn files. The hash doesn't let you reconstruct the photo. (If you could reverse a hash, Bitcoin and TLS wouldn't work.) It then locally computes the hash of any photo before it's uploaded to icloud. The check is whether the user is uploading an *already known* child porn file.

          The key point here is it's not doing any kind of advanced image recognition, or any image recognition. It's checking whether the bytes being uploaded are identical to the bytes of a previously known child-porn file. (There's a sketch of this exact-match model at the end of this comment.)

          Secondly, I keep hearing "I don't trust Apple not to use it for ...".
          Here's the thing. If you have an iphone, you DO trust Apple, absolutely. Did you know Apple has a function that reads all of your keystrokes? They also have one that knows everything that is displayed on your screen.

          The Apple software that sees all of your keystrokes is the on-screen keyboard. That's Apple software. If you assume that China can get Apple to change the software to bad things, they can just as easily use their keyboard software to do $bad_thing with all of your keystrokes.

          Apple's software ALREADY knows everything displayed on the screen. Because it's Apple's software that draws the screen! You ALREADY trust Apple absolutely. If Apple were going to do something bad with their software, at the behest of any government or for any other reason, they can ALREADY do that!

          So any objection based on "I don't trust Apple's software" is based on a misconception. If you use an iphone, you're already giving Apple absolute control of everything on your phone. Because the phone *is* Apple's software.

          Also, any objection based on the thought that it's doing any kind of image processing is based on a misconception. The system doesn't actually even care if it's an image, a zip file, a video, or any other kind of file. The system simply checks the raw bytes to see if they match the raw bytes of a known child porn file. It doesn't know or care what any of the pixels are.

          There may or may not be other valid arguments, but any argument based on the idea that Apple will change the software to find other kinds of files doesn't make sense. They could just as easily change the keyboard or the music player or anything else to do that.
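
          To make that concrete, here is a minimal sketch of the exact-match model described above. This is purely illustrative Python, not Apple's actual code, and the digest in the list is a made-up placeholder:

          import hashlib

          # Hypothetical list of digests shipped to the device (placeholder value).
          KNOWN_BAD_HASHES = {
              "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
          }

          def sha256_of(path):
              # Hash the file's raw bytes in chunks to keep memory use flat.
              h = hashlib.sha256()
              with open(path, "rb") as f:
                  for chunk in iter(lambda: f.read(65536), b""):
                      h.update(chunk)
              return h.hexdigest()

          def flag_before_upload(path):
              # True only if the bytes are identical to an already-known file.
              return sha256_of(path) in KNOWN_BAD_HASHES

          Note that flipping a single bit of the file produces a completely different digest, which is exactly the weakness the replies below dig into.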

          • by thesk8ingtoad ( 445723 ) on Thursday August 12, 2021 @08:39PM (#61686323) Homepage

            As of this moment, it's an unknowable black box. That said, apple has stated that the system will match on a photo even if it's been resized, shifted formats or been otherwise altered. It is NOT a simple file hash. In order to create a hash that survives re-encoding or resizing, content analysis must be happening. Whether that changes your calculus or not is another matter entirely.

            • by raymorris ( 2726007 ) on Thursday August 12, 2021 @10:57PM (#61686579) Journal

              It matches resized images by essentially compressing the hell out of the image as it generates the hash. It works by using LESS information about the images, not more, such that the system can't tell the difference between two very similar images. Conceptually, it's the same as resizing both images to 50x50 pixels and hashing that. (Conceptually, but the math is significantly different.)

              For any cryptographers reading that, the sentence may sound funny because that's actually what a hash function is - a compression algorithm. You compress the first n bits and second n bits, then combine the two compressed forms with xor or whatever. Then compress the next n bits and merge them in.

              Anyway, it's able to do that by LOSING information, by ignoring the details of the image.
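
              For the curious, the "shrink it until only the gist is left" trick looks roughly like this toy average hash in Python. Apple's NeuralHash is a neural-network construction, not this, but the lossy-compression intuition is the same:

              from PIL import Image  # pip install Pillow

              def average_hash(path, size=8):
                  # Throw away the details: grayscale, then shrink to size x size.
                  img = Image.open(path).convert("L").resize((size, size))
                  pixels = list(img.getdata())
                  avg = sum(pixels) / len(pixels)
                  # One bit per pixel: brighter than average or not.
                  bits = 0
                  for p in pixels:
                      bits = (bits << 1) | (p > avg)
                  return bits  # 64-bit fingerprint; near-identical images collide

              Two images that differ by a few pixels land on the same, or a nearby, fingerprint - which is what "can't tell the difference between two very similar images" means in practice.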

            • by gweihir ( 88907 )

              What it would actually take is some really independent oversight committee that vets all hashes and certifies under oath to the public, regularly, that these are indeed the hashes they claim to be. Because the process is largely political (children are just the pretext du jour here), that will be really hard to establish.

            • You're thinking of cryptographic hash functions which produce very different output for slightly different input by design. This is not a property of all hash functions, but only of the ones you're probably used to using.
          • by Asynchronously ( 7341348 ) on Thursday August 12, 2021 @08:56PM (#61686351)

            Almost correct. It uses hashes, but not in the traditional way you are describing:

            https://www.apple.com/child-sa... [apple.com]

            Scroll down to NeuralHash.

            • by gweihir ( 88907 )

              As long as these will also do an exact match, they can be used to find any known binary object.

          • Asynchronously already pointed out that you misunderstood the hash, but didn't follow up with the implications. What's also important is that the hashes are taken to the phone in a format that means it's impossible to actually see what the hashes are.

            The implication is that it's possible for Apple to search for any image in your icloud photo albums and you wouldn't know they are doing it. The implication of that is that a government could corrupt or control Apple employees (a warrant with a requirement to

            • > The implication is that it's possible for Apple to search for any image in your icloud photo albums and you wouldn't know they are doing it.

              Obviously I didn't make my point clear. You said "it's possible for Apple to ...". It's been possible for Apple to do that since the very first iphone. Apple makes the *operating system*. The only way you can save a file on the phone is for *Apple's software* to get the file and put it on flash blocks. Apple's software *already* processes absolutely everything o

          • They are doing image analysis for the other new feature, allowing an opt-in parental control on Messages to blur sexually explicit photos and alert parents that the kids are sexting.

            That means Apple will be monitoring for something that prosecutors have sent teenagers to prison for.

            If they were just comparing hashes, they wouldn't say it was "trained on" a corpus of images.

            • Important distinctions include

              1. the feature you are describing is "opt-in", and only an option for child accounts attached to an adult's family iCloud account. No child account, or no opt-in = no scanning.

              2. As has been pointed out already, you are already trusting apple with your data. After all, they have had object recognition in iPhoto for years now. You can go to your photo library right now and search for things you never labelled in photos. Wine bottles, selfies, boats, flags, whatever you wish, a
          • by gweihir ( 88907 )

            That is, it doesn't know what's in any of your photos.

            What it does is download a list of cryptographic HASHES of known child porn files.

            Factually incorrect. What it downloads is _some_ hashes and you have no way of finding out what they were taken of, _unless_ you have a matching file. As using illegal pixel files to identify legitimate hashes is right out (probably impossible to get them all, and for numerous reasons you really do not want to get them in the first place), you have _no_ way to find out what the hashes were taken of. And _all_ larger block/scan schemes implemented so far immediately developed feature creep. This one will too.

            • I actually CAN download the hashes from ICMEC and compare them to the hashes Apple or Microsoft is using, to the same extent I can look at ANYTHING an iphone is doing.

              So you now need to bring the International Centre for Missing & Exploited Children into the conspiracy.

          • by teg ( 97890 )

            What it does is download a list of cryptographic HASHES of known child porn files. The hash doesn't let you reconstruct the photo. (If you could reverse a hash, Bitcoin and TLS wouldn't work.) It then locally computes the hash of any photo before it's uploaded to icloud. The check is whether the user is uploading an *already known* child porn file.

            The key point here is it's not doing any kind of advanced image recognition, or any image recognition. It's checking whether the bytes being uploaded are identical to the bytes of a previously known child-porn file.

            It's important to note that these aren't normal cryptographic hashes. These are NeuralHash [apple.com], which can recognize content that has been slightly modified.

            Also, it's important to note that Apple has been trusted partly because they have gone out of their way not to have the ability to do certain things - keeping them on-device only, and thus not easily available even if Apple is put under pressure. If Apple didn't have the ability to scan for types of content, authorities couldn't pressure them on it. With the ability avai

          • While I agree about handing over complete trust to Apple, I believe you miss the point that spying mechanisms in the keyboard, output functions and whatever, could generate suspicious traffic and thus be detected. This would probably generate some amount of uproar.

            By openly introducing a mechanism that scans files and pinky-swearing that it is just for one kind of illegal content, they ensure that there is nothing suspicious happening. You can't tell a child porn hash from a tank man hash.

            The initial deploymen

          • There is a difference between trust and 'best choice we have'. I don't trust comcast, but I have no choice if I want internet. I don't trust apple, but I do find their ecosystem and privacy features to be better built than their competition.

            There has to be more to this than just this child porn thing. Based on their own descriptions it would be trivial to get around (Just don't enable iCloud) and it doesn't detect net new child porn. So why are they doing this? Because they have plans to make it more invasi

            • > There is a difference between trust and 'best choice we have'. I don't trust comcast, but I have no choice if I want internet.

              You trust them in that you give them all of your data.
              You trust them because you feel that you have no better option.

          • What it does is download a list of cryptographic HASHES of known child porn files. The hash doesn't let you reconstruct the photo. (If you could reverse a hash, Bitcoin and TLS wouldn't work.) It then locally computes the hash of any photo before it's uploaded to icloud. The check is whether the user is uploading an *already known* child porn file.

            So...all child porn distributors need to do is randomly change a couple of pixels in the file and they're good to go?

            • The system does very lossy compression before hashing, so minor changes are lost. Conceptually, you can think of it as similar to resizing all images to 40x40x256 before taking the hash.

              It's actually not a resize operation, but it's a similar concept of throwing away the details.

              Actually if you think about it, ALL hashes are lossy compression. From 10 MB of input (80 million bits) they produce 128 bits of output. That's a lot of lost bits!

              Hashes for many operations are chosen specifically to do a good job
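
              To see why "change a couple of pixels" defeats a cryptographic hash but not a perceptual one, compare the two behaviors. Illustrative Python; the file name is hypothetical:

              import hashlib

              data = bytearray(open("photo.jpg", "rb").read())
              tweaked = bytearray(data)
              tweaked[1000] ^= 0x01  # flip a single bit

              print(hashlib.sha256(bytes(data)).hexdigest())     # some digest...
              print(hashlib.sha256(bytes(tweaked)).hexdigest())  # ...completely different

              # A perceptual hash is compared by Hamming distance, not equality,
              # so the same tweak moves the fingerprint little or not at all:
              #   hamming(phash(data), phash(tweaked)) <= threshold  -> still a match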

      • > What will they do if a government bans Apple phones in a country unless they comply?

        That's easy - Apple is now storing CCP subjects' data in China where the CCP can scan it.

        Apple appeases totalitarians for profit.

        When they get an NSL they will also do it and you'll never know about it (unless the next Snowden does the right thing).

        • As others have pointed out, developing this system only really makes sense if full encryption of all icloud data is their end game. This 'feature' gives them a defense against accusations of being the device of child abusers, as they make their devices even that much harder for governments to get into.

          If all of the Chinese iphone users' data is completely encrypted, and Apple doesn't have access to those keys, then it won't really matter that the data is stored on servers in China, unless the Chinese auth
      • when the FBI wants a phone unlocked, Apple had better help them!

      • by gweihir ( 88907 )

        In other news: Apple has no idea how much pressure a government can apply if it wants to. What will they do if a government bans Apple phones in a country unless they comply?

        That one I find surprising. Or rather I do not believe it. Clearly somebody at Apple has no clue about this, but others must know that regardless of how large and rich Apple is, they do not stand a chance against a larger government. (Small one: They can just stop doing business there, unless it is in the EU.)

        My take is that they allowed some SJW idiot to get too much power.

      • by Kazymyr ( 190114 )

        The next day after this feature is implemented, Apple will receive a letter from the Chinese government:
        1. You will allow us to inject hashes into your database.
        2. You will not tell anyone about this
        3. Or else, no sales of any Apple products anywhere in China.

        What do you think Apple will do?

        • Yeah, something like this is the biggest concern for most people I would assume.

          Doesn't need to be China, but they have demonstrated their dedication to surveillance and their willingness to throw their economic weight around for political gain. Throw in the significance of China as a market to the company, and it sets up a pretty clear path to some sort of Sophie's choice for the company: your profits or your customers' privacy. You can't have both.

          Before there were legitimate technical reasons why they
      • What will they do if a government bans Apple phones in a country unless they comply?

        Isn't this already a question? If they announced today they were abandoning this how would it change that?

      • Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

        Sure, but Apple says a lot of things.

        Apple said that if you bought music or video content from their store, you could access it whenever you wanted for as long as you wanted. Recently, they argued in court that no reasonable person would believe what they said:

        https://www.americanthinker.co... [americanthinker.com]

        When they give in to whatever government they give in to, they wil

    • Well with Americans there is a degree of hypocrisy.
      1. We want all the bad people to go to jail, and get punished very badly
      2. We don't want to be monitored because we could be caught being a bad person

      It is a "be tough on everyone else but me" attitude. To function in a culture, there are norms and rules we need to follow to allow the whole to function. However there needs to be space to allow for individuality, mistakes to be made, new ideas and successes to happen.

      For the United States and its culture (w

  • by xalqor ( 6762950 ) on Thursday August 12, 2021 @07:56PM (#61686223)

    wasn't the proper forum for such discussions

    It seems that whenever something controversial is going on, whatever forums a company has already set up to discuss internal company issues become inappropriate somehow.

    • by Sebby ( 238625 )

      wasn't the proper forum for such discussions

      It seems that whenever something controversial is going on, whatever forums a company has already set up to discuss internal company issues become inappropriate somehow.

      I wonder if/how employees responded to that - "Well, what is an appropriate forum? Open, public discussion? Okay, I'll call the local news crew."

  • by Anonymous Coward

    Core security employees did not appear to be major complainants in the posts, and some of them said that they thought Apple's solution was a reasonable response to pressure to crack down on illegal material.

    If people would just change their iPhoto setting to "I don't want to upload dirty photos to iCloud" this problem would go away.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Yeah, no, I'm not convinced of that. The scanning feature will be deployed as an iOS update and so will be on everybody's phones regardless of whether or not they have iCloud enabled. Since the reported functionality scans photos "before upload to iCloud" there's no reason it even needs iCloud enabled to be able to rifle through the local photo collections.

      Given Apple is significantly moving the goal posts to install a scanner on everybody's iDevices who's to say if/when/how much further they'll move the go

      • by Sebby ( 238625 )

        Yeah, no, I'm not convinced of that. The scanning feature will be deployed as an iOS update and so will be on everybody's phones regardless of whether or not they have iCloud enabled.

        I've heard rumors (maybe it's been confirmed) that Apple will actually allow older iOS releases to be kept/installed on "current" iPhones (meaning you won't be forced to upgrade to iOS 15 if you want to stick to iOS 14, and still get security updates) - I wonder if this "feature" is part of the reason why.

    • by Pimpy ( 143938 )

      You're being overly simplistic. If I take a photo of my kids playing in the bath and send it to my wife while she's on a business trip, that's completely innocuous, while the same photo found on the device of someone with no connection to the children would be rather less so. This would even be normally encrypted if Apple hadn't conceded to unencrypted cloud backups. That being said, they certainly do not have an AI that is capable of making that kind of distinction with any kind of reasonable accuracy, whi

      • The only way your photo of your kids is going to raise a red flag is if:

        You send the photo of your nude kids from a child account (maybe your older child's phone), in which case an alert would be sent to you and your wife of possible sexting going on.

        OR

        That exact picture has already been determined by the courts to be CSAM, shared with the only corporation in the US legally allowed to possess CSAM, run through the Neural Hashing engine Apple is using for this, and the Neural Hash transferred to your de
  • by Anonymous Coward

    Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

    I'm confused. Did Apple get out of China? When it comes to China, you don't get any say in things like that. You do what they say, or lose access to their huge market.

    A gag order and National Security Letter in the People's Republic of America wouldn't be too far behind.

  • Employees, what could they possibly know? They don't get paid nearly as much as I do, and I can fire them. That means they're idiots!
  • by theshowmecanuck ( 703852 ) on Thursday August 12, 2021 @08:05PM (#61686243) Journal
    They are using this as an excuse to invade privacy and scan your stuff with impunity. Who knows, maybe they are even in it with some government acronym. They use the child porn angle so that people are afraid to object to the invasion of privacy for fear of being labelled a deviant.
    • by geekmux ( 1040042 ) on Thursday August 12, 2021 @08:40PM (#61686327)

      They are using this as an excuse to invade privacy and scan your stuff with impunity. Who knows, maybe they are even in it with some government acronym. They use the child porn angle so that people are afraid to object to the invasion of privacy for fear of being labelled a deviant.

      65% of gun deaths in America are due to suicide, and more people are killed with kitchen knives than AR-15s, but guns are now evil, and "assault" weapon ownership means you're a suspected mass murderer. Makes perfect sense right? After all, I'm sure the way to cure obesity is to start banning all-you-can-eat buffets and labeling those who want them fat-wielding terrorists.

      Weed is still a Schedule I drug with no proven medical use. Literally considered worse than fentanyl by the DEA. So of course you're some kind of criminal deviant for wanting to consume it. Even in the privacy of your own home. How dare you turn your back on the proud American tradition of mass addiction via Pill Mill, Inc. You must be some kind of Capitalistic traitor.

      This bullshit angle to destroy privacy via "Think of the Children" was quite literally to be expected.

      Next up, Help Prevent Domestic Terrorism with iSnitch. Best install it, or else you're assumed to be one of "them".

      • by larryjoe ( 135075 ) on Thursday August 12, 2021 @10:12PM (#61686515)

        65% of gun deaths in America are due to suicide, and more people are killed with kitchen knives than AR-15s, but guns are now evil, and "assault" weapon ownership means you're a suspected mass murderer. Makes perfect sense right?

        More than 73% [fbi.gov] of US homicides are by firearms (i.e., guns), while around 10% are by knives. Yes, if you break down the specific type of gun, then the number drops to however low a number one wishes to artificially pick. In the US, knives killed more people than rifles (4x), more than assault weapons, landmines, bazookas, hand grenades, nuclear weapons, and large fertilizer bombs combined. But those numbers merely distract; they don't cover up the fact that 73% of US homicides are carried out with guns.

        How about this for a sobering fact? In the US, as many people die from guns as from drunk drivers. And that's not counting suicides. Counting both suicides and homicides, guns kill more than twice as many people as drunk drivers.

        • Re: (Score:2, Insightful)

          by AmiMoJo ( 196126 )

          I find it suspicious that there are no statistics on whether gun ownership actually protects people from crime, or whether it protects them better than, say, a baseball bat.

          You would think that the gun lobby would be keen to gather evidence that guns reduce the likelihood of being the victim of crime, or of being injured. Yet for some reason that data isn't available, and it seems like the sources that could provide it are for some reason not making it available.

        • According to your statistics, for year 2019:
          13,927 homicides with a weapon (incidence 4.24 per 100,000 nationwide, a little more than half (~60%) of the world's average of 7.03)
          10,258 with firearms (73%)
          6,368 with handguns (45% of all murders, 62% of firearm murders)
          3,281 with firearm not specified (23.5% of all murders, 32% of firearm murders)

          And contrast to:
          364 with rifles (2.6% of all murders, 3.5% of all firearm murders)
          (keep in mind rifles is a very broad category and includes many many more types and m

        • 65% of gun deaths in America are due to suicide, and more people are killed with kitchen knives than AR-15s, but guns are now evil, and "assault" weapon ownership means you're a suspected mass murderer. Makes perfect sense right?

          More than 73% [fbi.gov] of US homicides are by firearms (i.e., guns), while around 10% are by knives. Yes, if you break down the specific type of gun, then the number drops to however low a number one wishes to artificially pick. In the US, knives killed more people than rifles (4x), more than assault weapons, landmines, bazookas, hand grenades, nuclear weapons, and large fertilizer bombs combined. But those numbers merely distract; they don't cover up the fact that 73% of US homicides are carried out with guns.

          Not sure if you are purposely manipulating your point by removing self-harm, but suicide still accounts for over 65% of ALL deaths by firearm. This is why it becomes difficult if not futile to discuss the firearm "problem" in America while being ignorant of the real problem, which is mental health.

          It's like trying to discuss the obesity epidemic while purposely avoiding any discussions around junk food and fast food. Or discussions around alcoholism while purposely avoiding the existence of bars, liquor

        • In the US, as many people die from guns as from drunk drivers. And that's not counting suicides. Counting both suicides and homicides, guns kill more than twice as many people as drunk drivers.

          The problem with being able to defend yourself is that if you so choose, you can also attack people. Are you suggesting everybody be utterly defenseless against any attacker? Are we to go back to the times that the most physically fit get to bully and take everything away from everyone weaker than them? Why would you want that? Why would you want to be defenseless?

    • by U0K ( 6195040 )
      This one is an old tactic and has been known as one of the "Four Horsemen of the Infocalypse" for a while.

      Terrorists (see 9/11), drug dealers (see War on Drugs), pedophiles, organized crime. Duckduckgo search [duckduckgo.com].

      It's a huge red flag when someone does this, which should earn it a lot of scrutiny in order to determine whether such a measure is justified or if you're actually throwing out the child with the bathwater.
      And if you live in a country where a part of the people holds some firm beliefs like any kind o
  • Anybody working for Apple should be considered the same as working for the North Korean government. NO TRUST.
    • I don't think it's in any way an exaggeration to say that the people who work for the North Korean government are literally slaves to a crazy autocrat. I also don't think it's an exaggeration to claim that they are aiding said crazy autocrat to commit atrocities including torture, murder, and famine (among other things).

      Apple employees, on the other hand, are helping a company that's probably worth more than North Korea make phones that may not respect user privacy as much as advertised.

      I'm a privacy nut. P

  • by RobinH ( 124750 ) on Thursday August 12, 2021 @08:19PM (#61686277) Homepage
    It's important for people to understand that the people who are expressing concerns about this aren't trying to protect child predators; that has nothing to do with it. The concern is that you're giving more power to a third party that you really have no reason to trust. The simple example would be if a politician with an iPhone supported tax reforms that Apple didn't like, a little tweak to the algorithm could flag this politician as having CP on their phone, and suddenly they get a visit from investigators. Apple can always claim it was just a false positive, but it's a good way to harass politicians who are unfriendly to your cause. It's just a can of worms we shouldn't open. Then you have other issues, like if it flags a picture that an 18 year old took of themselves: the algorithm can't know the precise age, and now it forwards that picture to a human for verification. That opens up all kinds of ethical problems. There are just too many issues here.
    • The simple example would be if a politician with an iPhone supported tax reforms that Apple didn't like, a little tweak to the algorithm could flag this politician as having CP on their phone, and suddenly they get a visit from investigators.

      Except for this [washingtonpost.com]:

      If [the tool] finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

      So there will be human review *before* the authorities are contacted.

      • by Kazymyr ( 190114 )

        Human review carries with it the possibility of human error. Or in a political context, human "error".

      • Assuming Apple, in my example, is trying to act nefariously, the human check doesn't matter unless it was done by a 3rd party. The problem is simply that giving Apple the job of searching for CP gives them credibility when they claim to have found something.
    • I have honestly yet to see that come up as a rebuttal to the criticism, which is actually quite refreshing TBH. Most of the rebuttals I have seen to the criticism have been technical in nature.

      1. Apple can already scan, or allow to be scanned, anything in their icloud at the moment with impunity. That is, after all, how Facebook and Google report so much more CSAM than Apple. Nothing in US law (or others as far as I am aware) currently requires you to look for this content, only to report it if you do find
  • Said it before, but (Score:3, Interesting)

    by 93 Escort Wagon ( 326346 ) on Thursday August 12, 2021 @08:19PM (#61686279)

    It bears repeating. Apple will refuse requests, I’m sure. But given a FISA order to include hashes of other types of documents, or a new law compelling them to do so - Apple has always said they’ll obey the law, so they will fall right into line.

    So basically Apple is adding the means to provide law enforcement with a ready-made back door, gratis. Not to mention this “feature” may turn out to be exploitable in other ways. Thank you, Tim Apple.

    My SE2 is going to stay on iOS 14 for as long as it’s maintained. After that, I’ll have to decide what my next phone will be.

    • by poptix ( 78287 ) on Thursday August 12, 2021 @08:38PM (#61686319) Homepage

      Worse, Apple doesn't even have to know.

      Dictator X wants to know who has Overthrow_Plans.doc
      FBI (or similar) submits hash of Overthrow_Plans.doc to NCMEC's database, claiming it's CP
      Apple reports all the people with that hash (ie: all the people with Overthrow_Plans.doc) to the FBI.

      That's the problem with using hash databases like this. It's one thing when you're using it to block people from uploading copyrighted content to YouTube, it's a whole other problem when you provide feedback to the authorities about who has what files.
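
      A toy version of that failure mode, with hypothetical hashes and names throughout - the point is just that the matching code is content-blind:

      csam_hashes = {"hash_of_known_csam_1", "hash_of_known_csam_2"}
      injected = {"hash_of_Overthrow_Plans.doc"}   # slipped into the database
      watch_list = csam_hashes | injected          # the device can't tell these apart

      def notify_authorities(user, file_hash):
          print(f"flagging {user} for {file_hash}")  # stand-in for the real pipeline

      def check_upload(user, file_hash):
          if file_hash in watch_list:   # same code path for both kinds of hash
              notify_authorities(user, file_hash)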

      • Apple reports all the people with that hash (ie: all the people with Overthrow_Plans.doc) to the FBI.

        It's actually not quite that simple with what they are releasing, but what Apple is doing with this is indefensible so I'm not even going to bother explaining the nuances of something that at the core is abhorrent.

        I'm fine with any given evil being ascribed to Apple's technology - because given enough time you are probably right.

      • by yabos ( 719499 )
        Except they don’t just automatically report a single match of anything to anyone, AND there have to be many, many hashes matched for one user before the phone actually notifies Apple about the matches. The POSSIBLE matches are reviewed by a human before anything is done with it.
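
        Conceptually, that gate looks like the check below - and only conceptually, since the real design reportedly uses threshold secret sharing so that Apple cryptographically cannot read anything below the threshold. The number here is illustrative, not a published Apple figure:

        MATCH_THRESHOLD = 30  # illustrative value only

        def should_escalate(match_count):
            # Below the threshold nothing is reviewable; at or above it,
            # the matched vouchers become readable for human review.
            return match_count >= MATCH_THRESHOLD
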
        • In their theoretical universe that's incorrect.

          In their theoretical universe Apple can be compelled to use this feature for any other parallel purpose at all, even if the implementation cannot support it, because governments can compel companies to do whatever they want.

          Interestingly their own argument is also that if Apple didn't implement this at all then governments couldn't compel Apple to do it at all. I find that odd, as the simplified, dissident hunting version of this, already exists across multiple

    • What requests? We have already seen court-ordered requests to Apple, and their excuse for refusing the requests has been solely based on the technical impossibility of meeting that request.

      I'm not so sure Apple will actually refuse a request they are capable of complying with.

  • by fahrbot-bot ( 874524 ) on Thursday August 12, 2021 @08:36PM (#61686311)

    Other employees said they hoped that the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple's direction on the issue a second time.

    Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

    Yup, it would be better/easier if everything was encrypted so Apple could simply say, "sorry, we can't because it's encrypted" rather than "sorry, we won't because [reasons]". Because, if Apple can, but won't, eventually someone is going to step on Apple hard enough so they will.

  • Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

    And I'm sure the press will dutifully dredge up this statement when Apple starts accepting "requests from governments to use the system to check phones for anything other than illegal child sexual abuse material."

  • by PinkyGigglebrain ( 730753 ) on Thursday August 12, 2021 @09:42PM (#61686461)

    Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

    If they didn't have it to begin with, they wouldn't have to worry about the requests in the first place.

    And, well, let's face it: when Apple implements this system, it will be used for more than its original stated purpose within a year.

    It took less than 6 months for the FBI to give guidelines to US law enforcement about how to use the original PATRIOT Act against drug dealers at the local level.

  • Is Apple's main selling point
  • [Grainy B&W tv ad of a runner zooming through a dystopia of marching worker drones sitting to watch Big Brother pontificate. She carries a hammer, but gets caught and tackled to the ground.]

    "And you will see why 2022 will be exactly like 1984."

    Congrats, Apple. You lived long enough to become the villain, as you defined it.

  • Since Apple now just assumes I'm a pedophile waiting to be caught, I'm moving.

    I've had iPhones since the 6, never a real Apple fan, but I did appreciate their stance on privacy so it was a simple choice to have a "phone that just worked". Prior to my iPhones I used a Samsung Galaxy that had been flashed to CyanogenMod and then tweaked to hell and back for privacy, but it was slow, buggy, and because of the tweaking, buggier and even slower. I loved my iPhone =(

    However, the invasion of privacy is t

  • If I was designing a feature to scan messages I would probably want to implement it in the keyboard/input layer, not in iMessage. That way, when the pedo viewers stop using iMessage (which I doubt they do anyway), it would apply to all apps.
  • by Malifescent ( 7411208 ) on Friday August 13, 2021 @12:11AM (#61686687)
    In the end it all boils down to Apple users caring more about their privacy than the sexual safety of children.

    Most iPhone users assume as the owners of the device they can do with it as they see fit, including viewing illicit material (not just child porn).

    I find it staggering that Apple blew billions of dollars of investment in its privacy-focused image with this snafu. People will distrust Apple from now on and will start to look for alternatives.
    • I believe Google will join in on the "think of the children, it's not just a terrific way for us to label you a deviant terrorist if you speak out against unreasonable stealth scanning of all your data" bandwagon. Then, that would leave only unofficial Android ROMs that don't have proper support of the "Play Store" if any, and usually have countless problems compared to official ROMs, and the vast majority of users are not gonna go manually download "APKs", not to mention install a "custom ROM" in the
    • by realxmp ( 518717 )

      No, they have just read history and not even ancient history. Practically all of these systems that started out for detecting CSAM are now being used to detect copyright material, then terrorist material and then of course dissent! You think it won't happen in the US, and then watch the politicians gerrymander the hell out of your voting map, but you don't mind cause it's your party. There's also excellent scope for a new Watergate style dirty tricks campaign.

      If you give people tools that will let them sta

      • If you give people tools that will let them stay in power

        PLEASE give up on the idea that technology is the enabling factor and where the blame lies, and accept that it's YOUR responsibility to not elect authoritarians, and to be involved in responsible governance.

        Tech, new hardware and software capabilities, don't create authoritarian governments and dictators for life, you do, you give them power, other people do. ... 1984 wasn't about technology causing or creating, or even enabling Big Brother, it was about a culture of sadistically stepping on other people, in the

    • In the end it all boils down to Apple users caring more about their privacy than the sexual safety of children.

      It has nothing to do with the safety of children. This system won't find new pictures, it will only find pictures (assuming it is only used for pictures) that are already known i.e. the danger has already come to fruition.

      I find it staggering that Apple blew billions of dollars of investment in its privacy-focused image with this snafu. People will distrust Apple from now on and will start to look for alternatives.

      I seriously doubt that Apple thought this was a good idea. I feel fairly certain the concept was "forced" on them. It is not that I think Apple has their customer interests at heart, after all, they are an immoral company out only for the 'dollar'. But for exactly the reasons you just desc

      • This is obviously nonsense. How long before Apple starts using AI to detect a flesh-like appendage entering a child's body in everyone's iPhone photos?

        The problem with bigots is that they're utterly convinced of the righteousness of their crusade. Apple will continue and double down on its CSAM program, even if it costs them enormous amounts of money in lost sales.
  • After having owned only Android phones I was actually considering getting an Apple phone because of its security reputation. Thanks but no thanks after reading this.
    • Exactly this. After many years of Android phones and the odd Apple work phone I have determined a phone is just a phone. I don't side load or anything like that so I really liked the privacy talk from Apple and started considering a switch. I am now pretty sure I will just stay put for a while, see how this all shakes out.
  • It would have cost Apple nothing to not implement this feature. But now they spend resources on developing and monitoring it.

    Why? I don't believe that many of their device users have requested this feature. "I won't buy an iphone because it doesn't scan my pictures for child porn".

    The reason is likely either that Apple estimates that they can use it to make money or gain economic or political advantages (which they wouldn't tell us), or that governments require it in order for Apple to operate in their cou

  • Policies and guidelines do not matter. Apple seems to think forcing someone to do something at gunpoint is beneath the governments of the world.

  • Let's assume there are things on the internet that I don't want to see, and that get me into trouble if I see them. Or if I was a minor, things that my parents don't want me to see.

    Step 1: Everything is opt-in or opt-out, and nobody can check whether anyone is opted in or out. How do you do this? An item in "Settings" which doesn't show whether you are opted in or out, and when you tap on it there are three options: "Opt-in, Opt-out, Cancel". So you control this. Nobody can check it. It never displays your s
  • Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

    The way this is supposed to work is that Apple gets a set of image hashes to search for from the government in each country. What is Apple supposed to do if they get a match? They cannot collect and examine the images themselves, since that would make them guilty of the exact same crime they want to fight. The law doesn't distinguish between good guys and bad guys possessing child sex abuse images. If you collect and/or view them, you simply are the bad guy. So the only option Apple has is to let the gov

  • The issue here is that Apple is proposing to automatically search all iphones for criminal content, not just iphones of suspected criminals. How they search (by comparing hashes) is a detail. What they search through on your phone (the set of images about to be uploaded to iCloud) is a detail. What they do when they find something (human review) is a detail. Details can change. The principle, that Apple should or shouldn't be allowed to search your phone for criminal content without any evidence that you ar
    • That's an interesting angle. It's a private company that's searching your private things. Technically that's their right under the law. But they are using it to check for hashes provided by a government database in order to turn people over to law enforcement. This reads more like the government using a private company as an end run around the 4th amendment. Which ought to be held unconstitutional regardless of which side of the issue you fall on.

  • Given that Apple is not law enforcement, it is just as illegal for an Apple employee to look at child porn as it is for you to look at child porn. So, if they collect it from millions of people and personally review each photo flagged as child porn, whoever is doing the reviewing will be guilty of multiple counts of willfully viewing child pornography.

    I mean, the "I am just viewing it to report it" excuse didn't work for Pete Townshend.

    How do we know who is verifying these photos anyway, and how do

  • Why do these tech companies think they need to be crusaders for Good? Everyone knows they are evil at heart.

  • I believe Apple could simply apply machine learning to detect nudity in the camera API and automatically blur images. It seems to me that this would create a better pathway against Apple devices being used for things Apple disagrees with, rather than viewing the data on devices after the fact. When I hear ____New Technology___ being used to stop ___Really bad thing___, all I think is: okay, so this is the obvious beginning where we all agree that the really bad thing must be stopped, without the followup thou
  • This is gonna make Google and Facebook look like amateurs.

    Just think, Apple doesn't have to wait for you to open a browser to track your actions, interests, etc.

    They can just scan your phone.

    Got a lot of pictures of boats? You get boat ads.

    Send a lot of dick pics? You get ads for sex toys and blow-up dolls ('cause you're probably gonna remain a virgin).

    Ads are just the beginning. There is no way they aren't gonna come up with ever more lucrative ways to monetize you.

    In 1999, nerds sided with
