Censorship iPhone Apple

Will FaceTime In iOS 26 Freeze Your Call If Someone Starts Undressing? (9to5mac.com)

Long-time Slashdot reader AmiMoJo shared this report from the Apple news blog 9to5Mac: iOS 26 is a packed update for iPhone users thanks to the new Liquid Glass design and major updates for Messages, Wallet, CarPlay, and more. But another new feature was just discovered in the iOS 26 beta: FaceTime will now freeze your call's video and audio if someone starts undressing.

When Apple unveiled iOS 26 last month, it mentioned a variety of new family tools... "Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos." However, at least in the iOS 26 beta, it seems that a similar feature may be in place for all users — adults included.

That's the claim of an X.com user named iDeviceHelp, who says FaceTime in iOS 26 swaps in a warning message that says "Audio and video are paused because you may be showing something sensitive," giving users a choice of ending the call or resuming it.

9to5Mac says "It's unclear whether this is an intended behavior, or just a bug in the beta that's applying the feature to adults... [E]verything happens on-device so Apple has no idea about the contents of your call."
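Apple hasn't said which machinery FaceTime uses here, but since iOS 17 apps can run the same kind of on-device check through Apple's SensitiveContentAnalysis framework (the classifier behind Sensitive Content Warnings). Below is a minimal Swift sketch of what a per-frame check could look like, assuming that framework; the function name and the fail-open fallback are illustrative, not Apple's actual implementation.

    import CoreGraphics
    import SensitiveContentAnalysis  // Apple's on-device nudity classifier (iOS 17+)

    // Illustrative only: checks one captured video frame entirely on-device.
    // Requires the com.apple.developer.sensitivecontentanalysis.client entitlement.
    func frameLooksSensitive(_ frame: CGImage) async -> Bool {
        let analyzer = SCSensitivityAnalyzer()

        // If the user has no intervention enabled in Settings, the policy is
        // .disabled and there is nothing to check.
        guard analyzer.analysisPolicy != .disabled else { return false }

        do {
            // The analysis runs locally; no frame data leaves the device.
            let analysis = try await analyzer.analyzeImage(frame)
            return analysis.isSensitive
        } catch {
            // On analysis failure, fail open rather than pausing the call.
            return false
        }
    }

A call UI that pauses outgoing audio and video whenever such a check returns true would match the "end or resume" prompt described above, with nothing sent off the device.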


Comments Filter:
  • by sinij ( 911942 ) on Saturday July 05, 2025 @12:44PM (#65499396)
    I have so many questions about this.

    Can I take a coat off? What about going shirtless? What about taking a hijab or turban off, which may be an intimate state of undress in some cultures? What about taking a glass eye out? What about animals: if my cat flashes his butthole in view of the camera while I am on a call, would that get me red-flagged? What about object mis-identification: can I eat a banana while on the call?
  • …if: it can be enabled and disabled, and all processing for detection stays local to the device, with no data (including any metadata) sent to any third party
    • Yes, no data is sent to a third party... except Apple and the FBI.

      • by torkus ( 1133985 )

        Playing it out, you do get to that point. It might be an opt-in function to start, but then the "let's do more" crowd gets the idea to background-record everything in case it flags nudity, and then auto-upload that to ... wherever, police FYI, something something.

        Never mind that they'll bury in the TOS that every call becomes 'anonymized training data' even if you don't use the service. There's a point where every interaction is scrutinized by some AI or algorithm and we're no longer adults but baby-sat.

  • by drnb ( 2434720 ) on Saturday July 05, 2025 @01:03PM (#65499466)
    It was a feature request from CNN

    "Jeffrey Toobin is back at CNN eight months after exposing himself on Zoom"
    https://www.cnn.com/2021/06/10... [cnn.com]
  • Could be Tim Cook wants all your video uploaded to Apple to ... uh, train the AI?

  • It's called FaceTime, not DickTime. ;-)

  • So, um, FaceTime is detecting when we are naked and acting on that - with code written by an intern supervising AI-written code. What could possibly go wrong? lol.
  • So I guess we will see more tentacles from now on...
  • Looks like Apple are doing their very best to ensure that users such as myself will never use one of their NKB 'Nanny Knows Best' devices.
    Wouldn't be surprised if the next bit was "Oh and by the way, we had some of your videos sent to our contractors 'for quality assurance and training' or whatever else"

    This whole idea of being tethered to their apps within the ecosystem is bad enough, but this just feels like a bridge too far.

    What's so special about FaceTime anyway? Just use Signal, FFS. It does video.
    • by PPH ( 736903 )

      especially great for Pentagon employees

      No thanks. I've already seen La Cage aux Folles.

  • iOS is bitch lasagna!
  • Just slow it down and sync it with the music [youtube.com].

  • Imagine having that job -- to curate enough amateur porn to train an AI to detect it.
  • ... applying the feature to adults.

    How does one distinguish school-girl nipples from slutty-woman nipples? This will probably apply to everyone, including shirtless men. As amusing as automatically protecting men is, 'protecting' women is a slippery slope.

    On the one hand, women need to be protected from dick pics and themselves. On the other, this is normalizing censorship: That has been increasing for 20 years.

  • Slashdot title: "freeze your call if someone starts undressing"
    iOS Warning: "paused because you may be showing something sensitive"

    Those are two very different statements. Meaning this system doesn't stop incoming content; it just gives you a speedbump when sending out nudes.

    And where's the test data set? What skin tones does it work on? What body shapes? I assume this news would be bigger if Apple finally got all that right rather than all the other nudity detectors which get so many things wrong. If I

  • Until it's my personal, private "AI" that's doing the 'thinking' about things... which doesn't share anything back to anyone... I have really limited interest in every cloud provider directly snooping on all my data. E2EE is frankly the name of the game, and if you design your apps to access the decrypted in-flight data for 'monitoring' purposes, then you've effectively built in the back door the EU mandated recently.

    In a perfect world E2EE should be available for all services - especially with things like cloud storage -
