Apple

Apple Removes Nonconsensual AI Nude Apps From App Store (404media.co)

404 Media: Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps.

Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the policy-violating apps on its own.

Apple's action comes after we reported on Monday that Instagram advertises nonconsensual AI nude apps. By browsing Meta's Ad Library, which archives ads that run on its platforms, when they ran, and who paid for them, we were able to find ads for five different apps, each with dozens of ads. Two of the apps were web-based services, and three were on the Apple App Store. Meta deleted the ads when we flagged them. Apple did not initially respond to a request for comment on that story, but reached out to us after it was published asking for more information. On Tuesday, Apple told us it removed the three apps from its App Store.

  • wat (Score:5, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday April 26, 2024 @09:16AM (#64427100) Homepage Journal

    Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images

    You literally cannot prevent that in an app which can make consensual nude images. Therefore the word nonconsensual is being used in order to trigger people into having a specific opinion. A better description is "an app which can be used to create fake nude images" since it can't literally show you what someone would look like unclothed.

    • Yeah, this is definitely an "I'll know it when I see it" type of judgement, but to be fair, if you asked most of their customer base "how would you feel about someone using an AI app to perfectly map your face onto a nude body and distribute the results, with most people unable to tell the difference or see them labelled as fake in any way?", they would have a negative reaction to that.

      If I were Tim Apple, I would probably make the same call.

      • How will I generate my dating app profile picture now, though?
      • I'm not against them removing the apps, I'm against the headline trying to make me feel a certain way. Present the facts, I'll decide how I feel about them.

        • I'm not against them removing the apps, I'm against the headline trying to make me feel a certain way. Present the facts, I'll decide how I feel about them.

          LOLOL!!!

          If you removed all the Yellow Journalism from the Web, what remained would fit on a Floppy. . .

      • Yeah, this is definitely an "I'll know it when I see it" type of judgement, but to be fair, if you asked most of their customer base "how would you feel about someone using an AI app to perfectly map your face onto a nude body and distribute the results, with most people unable to tell the difference or see them labelled as fake in any way?", they would have a negative reaction to that.

        Well....if the resultant images made me much less fat, and a bit more ripped....I dunno...maybe?

        ;)

        jk

    • That was an advertised feature of the apps. Obvious removal if you advertise it as a feature.
    • Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images

      You literally cannot prevent that in an app which can make consensual nude images. Therefore the word nonconsensual is being used in order to trigger people into having a specific opinion. A better description is "an app which can be used to create fake nude images" since it can't literally show you what someone would look like unclothed.

      There's already an app you can use to make consensual nude images: it's called a camera.

      If you need generative AI to create the nude then it's non-consensual or a very weird edge case of people making their own fake nudes.

      • If you need generative AI to create the nude then it's non-consensual or a very weird edge case of people making their own fake nudes.

        Millions of people sell nude pics. It might be very weird, but it's not an edge case to generate them rather than spend hours staging and editing, or pay for a professional photo shoot.

    • Your argument is literally beside the point. The images are nonconsensual because the subjects of those images did not agree to the images being created in the first place; nude or non-nude makes no difference. The fact that they involve nudity just makes the nonconsent 99.99999999% likely.
    • by AmiMoJo ( 196126 )

      It was the advertising that made it clear what these apps were being pitched as.

      I saw some on Twitter. They showed a guy sending a fake nude to a woman, who then begged him to delete it, saying "please, I'll do anything!" So pretty clearly a rape app, sold as being useful for blackmailing women for sexual favours.

      Apple doesn't allow such apps. I'm sure the developer of "Drunk Slut Location Pro" would argue that it is merely designed to help render medical assistance to vulnerable women who made poor life ch

  • by bugs2squash ( 1132591 ) on Friday April 26, 2024 @09:23AM (#64427124)
    Tattoo parlors will become the new bastions of bodily security
  • Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the policy-violating apps on its own.

    I'm glad you're here doing all this good, preventing "nonsensical nudity".
    • Nonconsensual. Meaning, someone is taking one person's body, slapping on a different person's face, and passing it off as that person.

      Coincidentally, it's almost always men doing it to women.

      • Nonconsensual.

        Yes, this was changed very quickly.
      • by jbengt ( 874751 )

        Meaning, someone is taking one person's body, slapping on a different person's face, and passing it off as that person.

        The example in the original article was of an app that took a regular picture of a clothed person as input and output the same picture, but nude. It was not as obviously fake as slapping someone's face onto a nude picture.

    • Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the policy-violating apps on its own.

      I'm glad you're here doing all this good, preventing "nonsensical nudity".

      Do you know how many Apps Apple has to churn through each day?

      Think about it.

  • by Archangel Michael ( 180766 ) on Friday April 26, 2024 @09:54AM (#64427212) Journal

    "Pictures. Or it didn't happen!"

    AI has ruined the joke.

  • Pics or it didn't happen!

    Oh, wait. No, I don't want to see that, because that shit's nasty and harmful. The existence of such images demonstrates the widespread existence of a tool that can be used to nasty, spiteful, disturbing ends against me, my family, people I know and care about, and the vast overwhelming majority of people that do not want that technology used on them.

    I understand the technology cannot be unmade; Pandora's box is already wide open. But that also doesn't mean Apple (or whoever else) has to make it easy to find and use.
    • Pics or it didn't happen!

      Oh, wait. No, I don't want to see that, because that shit's nasty and harmful. The existence of such images demonstrates the widespread existence of a tool that can be used to nasty, spiteful, disturbing ends against me, my family, people I know and care about, and the vast overwhelming majority of people that do not want that technology used on them.

      I understand the technology cannot be unmade; Pandora's box is already wide open. But that also doesn't mean Apple (or whoever else) has to make it easy to find and use.

      Related coverage: NYTimes [nytimes.com].

      Kind of why, when somebody drew their attention to its Existence, Apple Removed the Offending (and Offensive) and Developer-Terms-Violating App from their App Store!

  • by Petersko ( 564140 ) on Friday April 26, 2024 @11:38AM (#64427576)

    Don't say the quiet part out loud. For decades pot-related products like bongs got around the letter of the law by not using the words "pot" or "marijuana".

    Their sin doesn't seem to be the ability, but rather the advertised intent.

    • by HiThere ( 15173 )

      But that's a valid point, as it openly demonstrated intent. It also helps draw in those without much prior intent.

      OTOH, I suspect that the "prior intent" is there for practically all teen-aged boys.

  • by Stolovaya ( 1019922 ) <skingiiiNO@SPAMgmail.com> on Friday April 26, 2024 @12:03PM (#64427674)

    It's just hysteria around this kind of stuff. I guess it's difficult to care since this isn't really anything new; Photoshop has been around for quite some time. Even before that, you could cut and paste pictures together. I remember seeing "nude" photos of lots of different celebrities back in the wild west days of the internet. It's been a long time since the US was founded by Puritans, but those puritanical roots still run deep, with all the hysteria we have around sex and nudity.

    • Photoshop has been around for quite some time.

      Yes, but even with a tool like Photoshop, it still took a nontrivial amount of skill and effort to make faked photos look real: making the paste seamless, getting the skin tone to match, getting the lighting and shadow angles to match, etc. Modern AI apps allow anyone to make real-looking faked photos with no skill, at the click of a button.

    • The anti-sex feminists don't like men fapping to women. The anti-sex conservatives don't like men fapping at all. So they get together to ban anything fappable. Replace fap with having sex with, staring at, trying to pick up, taking pictures of, etc etc. It's been happening for decades. It will keep happening until the conservatives lose their puritan roots.
  • What they're doing is banning nudes, consensual or not and for everyone, because some will abuse the capability.

    • by PPH ( 736903 )

      Probably any app that takes some image as an input _could_ be used to produce non-consensual content. If you just ask the AI for a generic blonde with big cans, you are most likely safe.

  • What is the point of this? You'd better ban all pens, pencils, crayons, paints, etc., because they can all be used to create a nude of anyone you want.

    • Read the article. Please read it.
    • What is the point of this? You'd better ban all pens, pencils, crayons, paints, etc., because they can all be used to create a nude of anyone you want.

      Do you really think the people in charge of this at Apple don't understand how content creation works?

      Here's the thing: Like Apple's Sr. V.P. of Software Engineering, Craig Federighi, said (paraphrasing from memory), "We have to make iOS and iPadOS (and their App Stores) safe enough for kids, because we actually have kids, sometimes even infants, that use iPhones and iPads every single day."

      So, since there isn't an "Adult" Apple App Store (nor could there be; kids are too smart!), there's little choice but to nerf the App Store's content a bit.

      • by tlhIngan ( 30335 )

        So, since there isn't an "Adult" Apple App Store (nor could there be; kids are too smart!), there's little choice but to nerf the App Store's content a bit.

        You can run one in Europe now, it's perfectly legal.

        And you can even verify it by requiring payment be done via age-verified means. Credit cards used to be 18+ only, but I'm sure in Europe there are other ways to verify.

        • So, since there isn't an "Adult" Apple App Store (nor could there be; kids are too smart!), there's little choice but to nerf the App Store's content a bit.

          You can run one in Europe now, it's perfectly legal.

          And you can even verify it by requiring payment be done via age-verified means. Credit cards used to be 18+ only, but I'm sure in Europe there are other ways to verify.

          We're not talking about Europe now, are we?

  • I, for one, support the ban on nude apps.

    Naked apps running around flaunting their bits are just gross!

    Put an interface on, for crying out loud. Think of the children!

    hawk

  • Last I checked, you can use Photoshop and other tools like this to create non-consensual images all day long. How about any CGI or special effects software? Or is this just a case of someone using just the right keywords to trigger some snowflake workers at Apple, but not having enough clout to ban Adobe software from all Apple products?
  • Do these companies just invent fake rules by themselves or do they have a room of trained A.I. monkeys to do it?
