Encryption Apple Technology

Apple Plans To Scan US iPhones for Child Abuse Imagery (ft.com) 314

Apple intends to install software on American iPhones to scan for child abuse imagery, the Financial Times reports, citing people briefed on the plans -- raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices. From the report: Apple detailed its proposed system -- known as "neuralMatch" -- to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said. The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.

The proposals are Apple's attempt to find a compromise between its own promise to protect customers' privacy and ongoing demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography. [...] "This will break the dam -- governments will demand it from everyone," said Matthew Green, a security professor at Johns Hopkins University, who is believed to be the first researcher to post a tweet about the issue. Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, said Apple's move was "tectonic" and a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.
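
To make the reported pipeline concrete, here is a minimal sketch of the three-step flow the excerpt describes: automated on-device match, human review, then law-enforcement referral. Every name and value below is an assumption for illustration; none of it is Apple's actual neuralMatch code.

```python
def matches_known_database(photo_hash, known_hashes):
    # Step 1: an on-device hash is compared against a database of known images.
    return photo_hash in known_hashes

def human_reviewer_verifies(photo_id):
    # Step 2: a human reviewer confirms or rejects the automated alert.
    return False  # placeholder for a manual decision

def report_to_law_enforcement(photo_id):
    # Step 3: verified material is referred to law enforcement.
    print(f"referring {photo_id}")

def scan(photo_id, photo_hash, known_hashes):
    if matches_known_database(photo_hash, known_hashes):
        if human_reviewer_verifies(photo_id):
            report_to_law_enforcement(photo_id)

scan("IMG_0001", "deadbeef", known_hashes={"deadbeef"})
```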

  • by rpnx ( 8338853 ) on Thursday August 05, 2021 @12:45PM (#61659453)
    I don't care what justification or how evil the thing you're scanning for is. It's my damn phone. Leave me f***ing alone. I'd rather criminals not get caught than be spied on 24/7. I'm sure hoping most people are with me on this. Get government out of my computers!

    Enforce the 4th and 5th! Down with the surveillance state!
    • by nightflameauto ( 6607976 ) on Thursday August 05, 2021 @12:53PM (#61659491)

      Sadly, this sort of shit is enabled by all the people who constantly blab about how we need to be protected from everything. There's always been a little of that in the world, but since 9/11 we've seen it ramp up into the stratosphere and those of us that prefer privacy and freedom are being blocked out by people begging to have the government and the businesses that sponsor the government put safety, security and control above everything else.

      I believe we are beginning to exit the security theater stage of the slow creep towards complete oppression, and entering the actual oppression stage. The problem is, so many people are so worried about safety and security they'll never accept that there are legitimate reasons to be opposed to this type of thing.

      Mark my words, it's child porn today, it'll be filtering through your personal notes for wrong-think tomorrow.

      • by geminidomino ( 614729 ) on Thursday August 05, 2021 @12:58PM (#61659531) Journal

        I believe we are beginning to exit the security theater stage of the slow creep towards complete oppression, and entering the actual oppression stage. The problem is, so many people are so worried about safety and security they'll never accept that there are legitimate reasons to be opposed to this type of thing.

        It's not surprising. It's been 20 years next month since 9/11, so we've got an entire generation now coming into adulthood that grew up entirely in the surveillance age.

        • by Bodie1 ( 1347679 )

          Yes, and another decade worth of people who were too young to notice any difference when it changed entering their thirties.

      • by Xylantiel ( 177496 ) on Thursday August 05, 2021 @02:10PM (#61659923)
        No it is enabled by all these services that have created (non-)privacy policies so they can scan stuff for marketing purposes. If a company is offering a free service on which users are storing illegal images, why is the company shielded from prosecution if they have already claimed the right to scan those images? Can't have it both ways.
      • by MrL0G1C ( 867445 ) on Thursday August 05, 2021 @04:28PM (#61660749) Journal

        Just child porn? Or also terrorism? According to the Russians, running a newspaper that does not say what the gov't wants it to say is tantamount to terrorism. According to the Israelis, not selling ice cream to people in illegal settlements is terrorism (I'm not kidding, they said that). According to the Chinese, simply being Uyghur is pretty much enough to be treated like a terrorist.

        So yes, I'm inclined to agree this is disturbingly surveillance-state. It's a little unusual that Apple went from shouting about being the defenders of privacy to saying they are now going to stop and search every single one of their customers' phones. Wow, complete opposites in such a short period of time.

        • The politicians have figured out that the notion of child porn turns most parents into brainless, raging lunatics who'll agree to anything as long as their young girl's virginity isn't taken by some middle-aged man.
    • I'm sure hoping most people are with me on this.

      They're not.

      Otherwise, we will see a massive drop in the number of iPhones in use in the US.

      We won't. Apple knows this.

      Apathy will welcome 1984 just as quickly as anything else.

    • Except for one small quibble: Apple is not the government. That distinction is important. You actually have a say in your government. You can influence public policy with your vote. You can influence public policy by discussing politics with your friends and neighbors. Unless you're a major shareholder you can't influence Apple in the slightest. Voting with your dollars doesn't really work because they'll just throw up a marketing campaign to counteract lost sales from the occasional privacy-focused customer.
      • You can vote with your feet.

        I hope Apple has researched what its buyers think of this policy because it's completely contrary to the "privacy conscious image" it's trying to create.

        I don't care about little children getting raped if this means my phone is being scanned all the time. I reject and revolt against a totalitarian information dictatorship being introduced by any company or government.
      • by ShanghaiBill ( 739463 ) on Thursday August 05, 2021 @01:18PM (#61659653)

        That distinction is important. You actually have a say in your government.

        This is completely backward. I have no say on government surveillance policy because that is not on the ballot, and no political party is clearly "pro-surveillance", so who do I vote against?

        But if the policy comes from Apple, I can opt-out by not buying their products.

      • by tragedy ( 27079 ) on Thursday August 05, 2021 @01:33PM (#61659731)

        When it gets to the point where there's an automatic funnel from the device direct to law enforcement, then they kind of do functionally become part of the government. Think of it this way. The police want to search your house, but they have no warrant and the 4th amendment doesn't allow them to just break in, should they be allowed to pay an "informant" who happens to be a burglar, to break into your house and search it for them? Then let the burglar off the hook for burglary and be protected by qualified immunity and still get to present the evidence obtained at trial? Honestly, I'm not really sure they couldn't get away with exactly that, but any sensible consideration of that arrangement should conclude that the burglar is acting as an agent of the police, so the 4th amendment should apply.

        Same here. If Apple is cooperating with police to perform searches of people's files, then they are acting as agents of the police.

        • by cusco ( 717999 )

          Anonymous tips are already used to get search warrants. Locally it turned out that one of the regular "tipsters" was a cop's girlfriend reading off a script he had prepared. They got caught because they were stupid enough to use her regular personal phone, not all of them are that dumb so I'm sure that it happens fairly frequently.

          • by tragedy ( 27079 ) on Thursday August 05, 2021 @03:37PM (#61660457)

            Oh sure, a lot of parallel construction and laundering of improperly obtained evidence, etc. surely goes on. I'm really thinking more of a situation where, for example, the Pinkerton private detective agency, under contract from the police, forcibly performs "inspections" of everyone's homes. Since they're private, the 4th amendment doesn't apply; since the police won't arrest them for breaking and entering or assault, there's no real recourse against them (none that won't get you shot by the police); the police themselves have qualified immunity, so they can't be sued; and there's no one to arrest the inspectors. That's an extreme situation, but I would say it's a pretty good parallel to the police having an arrangement with a computer company to rifle through your personal files for them.

        • by jeff4747 ( 256583 ) on Thursday August 05, 2021 @03:27PM (#61660401)

          should they be allowed to pay an "informant" who happens to be a burglar, to break into your house and search it for them?

          You ask this as if it's not already common.

          Though they usually just have the informant lie about you instead of bothering to break in.

    • I don't care what justification or how evil the thing you're scanning for is. It's my damn phone. Leave me f***ing alone. I'd rather criminals not get caught than be spied on 24/7. I'm sure hoping most people are with me on this. Get government out of my computers!

      Enforce the 4th and 5th! Down with the surveillance state!

      Apple isn't a/the government -- at least, not yet anyway. The 4th and 5th Amendments don't apply to them.
      If you don't like the policies of a *company*, don't buy/use their products.

    • by AmiMoJo ( 196126 ) on Thursday August 05, 2021 @01:49PM (#61659827) Homepage Journal

      When has an iPhone ever been yours? It has always been Apple's property; they are the ones with the keys and the ability to choose what software it runs.

      • by mjwx ( 966435 )

        When has an iPhone ever been yours? It has always been Apple's property; they are the ones with the keys and the ability to choose what software it runs.

        Exactly, it's why it's called "your Apple iPhone": to remind you that you never actually own the phone; you're just using it with master Steve's permission (Steve bless his soul).

        And that permission can be taken away, what the Steve giveth, the Steve taketh away.

        So for all the years Apple fanboys have crowed that Apple doesn't do this, Apple doesn't monitor them, Apple protects them: here's a giant mug of "I fucking told you so", because I said Apple were spying as much as Google, if not more.

    • This illustrates two important things: First is the leftist line "Just do this one little thing. It's no big deal." This can be applied to every erosion of our rights. Before you know it, you don't have any rights left. Second, the government knows damn well it can't do this or suppress speech or enact gun control or any other infringement of our rights, so it enlists loyal private industry to do it for them.

  • Bs (Score:3, Insightful)

    by blahbooboo ( 839709 ) on Thursday August 05, 2021 @12:47PM (#61659459)
    There is no way Apple or any company would make a move this stupid. This is just a losing proposition with tons of issues. I don't believe this rumor for a second.
    • Re:Bs (Score:5, Insightful)

      by geekmux ( 1040042 ) on Thursday August 05, 2021 @12:57PM (#61659521)

      There is no way Apple or any company would make a move this stupid. This is just a losing proposition with tons of issues. I don't believe this rumor for a second.

      Gosh, it's almost as if I've heard this before.

      Right before they removed the headphone jack...and standardized ports...and removable memory...

    • Re:Bs (Score:5, Insightful)

      by Murrdox ( 601048 ) on Thursday August 05, 2021 @01:02PM (#61659565)
      From what I can tell, the ONLY source for this story is ONE single person. I'm not sure I believe this story until more information is available from additional sources. "Apple has not confirmed this and so far the sole source is Matthew Green, a cryptographer and associate professor at Johns Hopkins Information Security Institute." https://appleinsider.com/artic... [appleinsider.com]
      • Re: Bs (Score:3, Interesting)

        by bradley13 ( 1118935 )
        It is also possible that this is a trial balloon. Apple may be getting lots of government pressure. Perhaps they want to find out how much backlash this would generate, in an entirely deniable fashion.

        I certainly believe the US government would want this. Agencies like the FBI are all about nailing the bad guys, collateral damage be damned. They're also not above creating bad guys, if they need to run up their numbers.

    • by lazarus ( 2879 )

      The problem with building devices that have a reputation for being the most secure in the industry is that every criminal uses them. And Apple has to pay for every legal fight where they are being asked to unlock a phone as part of an investigation. Maybe they've weighed those costs against the loss of customers who still give a shit about their privacy? I don't know. But I agree with you, this is likely full-on BS.

    • I'm unsure about the scanning phones angle, but scanning their cloud storage might be prudent. If a person stores child porn on a free service that is already doing scanning for things like those "remember this moment" notifications and facial recognition to connect to contacts, it sure seems like the company could be compelled to include a child porn filter in their scan, and they might just want to get ahead of that. I feel sorry for the humans that have to vet the false positives, because I bet there will be a lot of them.
      • How are you going to determine a false positive?

        If anyone saved pictures from dating apps, hookup apps, *-gone-wild subreddits, sexts, or anything else - and unlike Apple, we're not going to pretend that humans aren't sexual and don't save sexy pics, girls included; they are sexual, and they DO save sexy pics on their phones - then any of those pics could get flagged by some AI/ML algorithm.

        It seems safe to presume that your privacy will then be violated first by those Apple is having check this stuff.

        • by cusco ( 717999 )

          I still remember the family who posted photos of their two-year-old playing naked in the lawn sprinkler on their MySpace page and were charged with child pornography.

    • Especially for a company like Apple, who have built something of a reputation for protecting their customers' privacy. Nope, they will scan all your photos, have employees dig through the juiciest ones, and send anything that looks iffy to the police. How the hell is this a compromise between doing nothing and allowing governments a back door into your product?

      I highly doubt Apple is seriously entertaining this idea.
    • by suss ( 158993 )

      They really needed Steve Jobs to tell them "No, you stupid fuckers, you're not doing this, and that's it".

      So many bad choices since he died.

    • There is no way Apple or any company would make a move this stupid. This is just a losing proposition with tons of issues. I don't believe this rumor for a second.

      Oh. [apple.com]

  • Orwell (Score:5, Insightful)

    by Zak3056 ( 69287 ) on Thursday August 05, 2021 @12:47PM (#61659461) Journal

    This is, of course, only the ("noncontroversial") start. Once this is accepted to detect possible child abuse, the next phase will be something less horrific, and so on down the slope until a picture of you with the wrong person becomes a reportable offense (where you even get to incriminate yourself by taking said photo).

    Once again, 1984 was supposed to be a warning and not an instruction manual.

    There will be no curiosity, no enjoyment of the process of life. All competing pleasures will be destroyed. But always—do not forget this, Winston—always there will be the intoxication of power, constantly increasing and constantly growing subtler. Always, at every moment, there will be the thrill of victory, the sensation of trampling on an enemy who is helpless.
    If you want a picture of the future, imagine a boot stamping on a human face—forever.

    • Once again, 1984 was supposed to be a warning and not an instruction manual.

      The irony is palpable.

      https://www.youtube.com/watch?... [youtube.com]

    • If you want a picture of the future, imagine a boot stamping on a human face—forever.

      Pretty sure Apple will report you to the authorities for having that picture on your iPhone.

  • for scanning their users' devices using "neural" "AI" software that automatically sends them and unknown government overlords completely unknown types and amounts of information about all images and who knows what other files the users have. Also, the "neural matching" surely would never make mistakes, either.
    • Yeah, this. Google's AI alerts me that it sees people when a bug crosses my Nest camera's lens. How can Apple's AI be nuanced enough to distinguish pornography from ordinary nudity?

      • by cez ( 539085 )

        Yeah, this. Google's AI alerts me that it sees people when a bug crosses my Nest camera's lens. How can Apple's AI be nuanced enough to distinguish pornography from ordinary nudity?

        Ring Ring... "Excuse me ma'am, I'm looking at naked photos of your daughter on your husband's phone, I just wanted to check and verify what age she is... oh, you don't have a daughter, well does he have a girlfriend? How old is she?"

  • Seriously? (Score:5, Insightful)

    by Dan East ( 318230 ) on Thursday August 05, 2021 @12:52PM (#61659485) Journal

    How is this supposed to work? How does it identify, say, a nude child as opposed to a nude adult in an image? Breast size? Skin complexion? Body hair? By exactly what metrics can you tell a minor from an adult in a nude photo? Surely AI is not competent enough to be able to tell the difference in a photo without any other context.

    My mom likes to bring out photos (and slides - yeesh) of me and my brother when we were kids, and that of course includes the usual taking baths and stuff. "Oh, isn't his naked little butt cute standing beside the tub." Yeah, thanks for embarrassing me, mom. That isn't child porn. So are you telling me that a person at Apple may be looking at pictures of my 5-year-old child that I took while they were in the bath, to see if they are porn?

    This reeks of privacy issues, technology issues, moral issues, on and on. Please tell me this isn't actually going to make it out of theory or research.

    • by larwe ( 858929 )
      Presumably it would work like, say, Facebook's systems - the AI would be tuned to err on the false-positive side and minimize the number of false negatives. The positives are sent for human review. Of course this system is quite infallible and works very well :eyeroll:. I agree with one of the other posters here - this idea is instant death to Apple's "we keep your private stuff private" line, and I can't see it happening. If it did happen, it would undoubtedly have worse fallout than Alexagate (people's private recordings being reviewed by human contractors).
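
      As a toy illustration of that tuning (not Apple's or Facebook's actual code; the scores and threshold below are invented), lowering the alert threshold trades missed detections for false positives -- i.e. more innocent photos land in the human-review queue:

      ```python
      # Invented scores; the point is only the threshold trade-off.
      def review_queue(scores, threshold):
          """Return the photos whose model score meets the alert threshold."""
          return [pid for pid, s in scores.items() if s >= threshold]

      scores = {"a.jpg": 0.05, "b.jpg": 0.40, "c.jpg": 0.62, "d.jpg": 0.91}
      print(review_queue(scores, 0.90))  # strict: ['d.jpg'] -- fewer reviews, more misses
      print(review_queue(scores, 0.30))  # lax: three photos queued -- more false alarms
      ```
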
    • "I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description ["hard-core pornography"], and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that" - Supreme Court Justice Potter Stewart
    • It could be that it just checks the checksums of your photos against a list of known images. But yeah, if it finds something, it sounds like a human could look at it - which really sucks.
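
      A minimal sketch of that "checksum against a known list" idea (the hash set here is hypothetical and empty). The weakness: change a single pixel and the digest no longer matches, which is why deployed systems use perceptual hashes instead, as the reply below explains.

      ```python
      import hashlib

      KNOWN_BAD_DIGESTS = set()  # would be populated from a curated list

      def sha256_of(path):
          with open(path, "rb") as f:
              return hashlib.sha256(f.read()).hexdigest()

      def is_flagged(path):
          # Exact match only: any re-encoding, crop or resize defeats this check.
          return sha256_of(path) in KNOWN_BAD_DIGESTS
      ```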

    • Re:Seriously? (Score:5, Informative)

      by Mr_Silver ( 213637 ) on Thursday August 05, 2021 @02:50PM (#61660183)

      It'll use Microsoft's PhotoDNA algorithm, which creates a hash that isn't impacted by basic editing - such as scaling, cropping, flipping or minor colour alterations. The hash is then compared to a list provided by organisations such as the IWF, NCMEC and others.

      If you've ever uploaded an image to a service owned by Facebook, Google, Twitter or Microsoft then it'll have been run through this check.
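
      PhotoDNA itself is proprietary, so as an illustrative stand-in here is a simple perceptual "difference hash" (dHash). It shows why this family of hashes survives scaling and minor colour changes where a cryptographic checksum would not: the value depends only on coarse brightness gradients.

      ```python
      from PIL import Image  # pip install Pillow

      def dhash(path, size=8):
          """Hash from the relative brightness of horizontally adjacent pixels."""
          img = Image.open(path).convert("L").resize((size + 1, size))
          px = list(img.getdata())
          bits = 0
          for row in range(size):
              for col in range(size):
                  left = px[row * (size + 1) + col]
                  right = px[row * (size + 1) + col + 1]
                  bits = (bits << 1) | (left > right)
          return bits

      def hamming(a, b):
          return bin(a ^ b).count("1")

      # A rescaled or recompressed copy differs in only a few bits, so a small
      # Hamming-distance threshold still matches it against a known list:
      # hamming(dhash("original.jpg"), dhash("rescaled.jpg")) <= 5
      ```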

  • I thought Apple touted themselves as caring about your privacy and security? What happened iCultists? Apple is looking at your nudes now? Not only that, but if it's using an AI to do it, wtf did they feed it to be able to recognize child abuse images and where did they get it?

    What the actual f**k?

    • by larwe ( 858929 ) on Thursday August 05, 2021 @01:31PM (#61659727)

      wtf did they feed it to be able to recognize child abuse images

      There is a shared DB of known child abuse images that is maintained with Government support and used to train recognition engines (and also to recognize specific images when they crop up). The name of it escapes me for the moment, but for example it's what Facebook uses.

  • well that's outright insane.

  • ...that there won't be any false positives, right?

    Right?

  • "The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified."

    The problem here starts with the concept of "illegal imagery". There are illegal acts, and possibly imagery that documents those acts, but that does not mean that the imagery itself is illegal, nor can an AI or Apple staff make that determination.

    The problem then proceeds to "verified". No Apple employee can verify whether imagery is illegal.

    • Lots of 18+ year-old women look much younger. If they choose to take intimate pictures, that's entirely their business, and those photos are not there for the prurient pleasure of "a team of human reviewers".

    • Get a good lawyer; the chain of custody & discovery issues will kill any criminal court case.

      Also, in the criminal court case, use discovery to demand the source code and all logs,
      place the human reviewers on the stand,
      and place the human coders on the stand.

      • by cusco ( 717999 )

        Ever watched the process of a child porn case against someone who wasn't rich? Long before the court date the accused has lost their job, probably most of their family, and most likely their home since they're now unemployable, and any standing they might have had in their community. Getting off on a technicality like chain of custody is immaterial; by that point they've already lost everything. For that matter they'll probably have to rely on a public defender since they can no longer afford an experienced attorney.

  • Similes. (Score:4, Funny)

    by Ostracus ( 1354233 ) on Thursday August 05, 2021 @12:54PM (#61659507) Journal

    Apple are walking back privacy to enable 1984, he said.

    One of these days we'll find someone who's actually read the book.

  • The first mommy with baby pics will now be criminalized.

    • Re:Oh Great (Score:4, Interesting)

      by ebonum ( 830686 ) on Thursday August 05, 2021 @01:12PM (#61659617)

      Good point. In a lot of countries, it would never occur to anyone that a naked baby picture might be considered kiddie-porn. When I was a kid (in the US), the high school yearbook would always have a few pictures of graduating seniors as little babies. Some of them were nude. It wasn't a big deal.
      Now, having a picture of your own kid can for all practical purposes end your life.

  • by ebonum ( 830686 ) on Thursday August 05, 2021 @01:02PM (#61659561)

    Apple: "We follow the laws and regulations of the countries in which we operate."
    I assume this will soon include scanning all devices in China for "anti-Xi Jinping" (or "fill in the blank") material and reporting violations to the local police. I'm going to guess China will be able to figure out a way to make this a law/regulation.
    The good old days of getting people to rat out their neighbors might be gone. This seems far more efficient.

  • Cellphones are a massive invasion of personal privacy just by carrying them around, powered-on. You're effectively reporting your whereabouts at all times to the telco, who can collect that data and do what they wish with it.

    But clearly, most of us feel like the trade-offs are worth still using one. (It's not like you can't just leave it someplace so your "trail" is lost, should you actually become that concerned about it. And most likely, our daily travels just aren't that exciting or anything we're TOO worried about.)

  • by dlleigh ( 313922 ) on Thursday August 05, 2021 @01:04PM (#61659573)

    But then everyone on the opt-out list will be put on a child porn watch list.

    And no one will be able to remove the software in question from their iPhones because of the nature of Apple's cryptographically-enforced walled garden. The walled garden keeps you safe, remember?

  • by BishopBerkeley ( 734647 ) on Thursday August 05, 2021 @01:08PM (#61659597) Journal

    Gmail has been scanning user emails and reporting child porn to the authorities, per US law, for some time.
    https://money.cnn.com/2014/08/... [cnn.com]

    FT is a venerable institution, but this report is hard to believe. Scanning one's phone seems to go too far, and Apple has been the most resistant of all companies against such invasions. I hope they substantially deny this report soon.

  • False positives: everything from baby pictures, to pictures of consenting adults, to legal pornography, to pictures thought in good faith to be legal pornography that turned out to be illegal. Art, etc.

    Slippery slope: once we've accepted scanning to "protect the children", what about other crimes, like terrorism, calling for insurrection, and copyright violations of Apple-owned content?

    Then, when you get down to it, how is a phone different from a personal computer? The same argument should let the government scan your PC as well.
    • Comment removed based on user account deletion
      • Also imagine the value of a hack that, unknown to the user, loads an illegal photo onto their phone. If the scans include images from cached web pages, that would be extremely easy to do.
  • by zerosomething ( 1353609 ) on Thursday August 05, 2021 @01:12PM (#61659615) Homepage
    I would hope this would get a big "NOPE" from their own lawyers. If they intend to use human reviewers, they will have to take a cautionary stance, and anything that "might" be child abuse would be required to be reported. We already have parents being jailed because they let their kids go to the playground alone. This will not end well for anyone. "OOH sorry, but you know Asian women look like 12-year-olds." Holy fu*in racism, Batman.
  • OK, everyone here agrees that this is a bad approach to combatting child exploitation and porn. Is there any approach to combatting child exploitation through these digital devices that is acceptable?

    • Would you find any approach where someone comes into your house every day and looks around for illegal materials acceptable?
      • by 605dave ( 722736 )

        No I wouldn't. And I wasn't saying that Apple's approach is ok. I am asking if anyone has an approach that is acceptable.

  • I don't own Apple products. Fuck Apple
  • The automated system would proactively alert a team of human reviewers

    The automated system would proactively alert a team of photo leakers.

    Fixed that for you.

  • Okay, I hope that everyone who abuses children burns in hell and dies painfully or at the very least gets arrested. But....But....Scanning people's phones looking for content that they created and notifying the authorities? This means everyone's phone is getting the content scanned all the time. This would get a big fuck you Apple from me as a user. If Google follows suit, this may just be the shot in the arm that Linux based phones need.
  • Seriously, this is a terrible idea. First, phones contain a lot of private, personal information. The very idea that a faceless company and its underpaid employees are actively scanning through your phone should be abhorrent.

    Second, who defines "child porn"? A pic of the grandkids in the tub? What about little kids running around topless through a sprinkler in your back yard? How many people will be proving their innocence, and to whom?

    Finally, they justify this with kiddie porn.

    • Second, who defines "child porn"? A pic of the grandkids in the tub? What about little kids running around topless through a sprinkler in your back yard? How many people will be proving their innocence, and to whom?

      There are laws that define whether a picture of a child can be considered pornography or not. Your examples are not considered so unless there is emphasis on the chest or genital area (and the only exceptions to that are if the media is for educational, medical, or criminal investigation purposes).

  • Let me guess...

    - Apple will roll this out
    - They won't respond to any reporters' questions about this
    - You can't turn it off
    - They won't publish documentation or commitments about how it works (and what it doesn't do) other than a vague advertisement
    - This will have full access to the personal lives of more humans than were alive on Earth in 1900

  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Thursday August 05, 2021 @01:34PM (#61659737)
    Comment removed based on user account deletion
      • Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.

        Uploading hashes that will only match specific images is troubling, but at least the chance of a false positive should be very low.

        Apple are control freaks.
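
        A back-of-envelope check on that "chance of a false positive should be very low": at iPhone scale, even tiny error rates add up. All numbers below are invented for illustration.

        ```python
        per_image_fp_rate = 1e-9     # hypothetical false positives per photo scanned
        photos_per_user = 10_000
        users = 100_000_000

        expected = per_image_fp_rate * photos_per_user * users
        print(f"{expected:,.0f} innocent photos flagged")  # 1,000
        ```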

  • 1. Teenage girl (or boy) snaps naughty photo of him/herself.

    2. It gets flagged for a "human review."

    3. Coworker reports reviewer for viewing and having possession of child porn.

    4. Teenager sues Apple for unauthorized use of their image.

    5. ????

    6. PROFIT!

  • If this is actually true (paywalled article, so who knows what facts they are going on): yes, even though Apple is a private company where the Fourth Amendment doesn't apply, and "for the children!" is a root password, what keeps this from turning into something that checks for IP violations and turns someone in to the RIAA because they have an MP3 file that was from a torrent?

    What happens if they are not on the "right" side of politics, and a stored mail trips a flag which gets sent and causes the person to be investigated?

  • ... if the system *ONLY* phones home when it detects child abuse, and if the system has precisely *ZERO* false positives, then it would make no difference to me one way or the other.

    But A) software has bugs. False positives are inevitable. This reason alone is enough for me to be opposed to it. And B) while it is alleged to only be used to detect child abuse today, we do not know if that is all it is really detecting. Further, even if that were the case today, the technology to phone home when it finds "inappropriate" content would now exist, and the definition of inappropriate can change.

  • Wut? (Score:3, Insightful)

    by cygnusvis ( 6168614 ) on Thursday August 05, 2021 @01:47PM (#61659811)
    This is essentially agreeing to buy a home with the terms of service stating a company can come in regularly to make sure there is no illegal stuff. This is the exact opposite of civil liberty.
    • Re: (Score:3, Informative)

      Animal rescue groups love to put language like this in their adoption contracts. You have to agree to give them access to your home at any time they demand it, so they can ensure it is still safe for your adopted pet, for the entire life of the animal rather than just as a pre-adoption inspection. If you refuse, they can reclaim the animal from you. I remember seeing language like that from one rescue that specialized in a species of bird that routinely lives 30+ years in captivity.

  • You take just enough that people won't drop you, then wait a bit and do it again. At some point they have taken everything. Now it's abuse images; next it will be scanning your messages to see if you plan on committing a crime or whatever.
  • Apple hiring photo analyzers ahead of the big Halloween photo blitz. Can you tell real blood from fake? Can you tell if that Hot Nurse is 17 or 18? Do you want to look into other people's private lives? Do you want to send the cops, in case you see someone you knew in middle school? You can do all of these for APPLE, and have the white ear buds of purity.
  • by BardBollocks ( 1231500 ) on Thursday August 05, 2021 @05:17PM (#61660931)

    Say you're a whistleblower and an NSO Group customer doesn't like you.

    They use the tool to plant child porn somewhere that you'll never look at yourself.

    Apple happens to 'catch' you a short time later - getting around the whole "needing a warrant" thing.

    If they're up for chopping reporters into small enough pieces to fit in cake boxes, or kidnap them or poison them - you can bet they'll have no qualms about planting child porn on their phones either.
