Apple Removes Nonconsensual AI Nude Apps From App Store (404media.co) 40
404 Media: Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps.
Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.
Apple's action comes after we reported on Monday that Instagram advertises nonconsensual AI nude apps. By browsing Meta's Ad Library, which archives ads on its platform, when they ran, on what platforms, and who paid for them, we were able to find ads for five different apps, each with dozens of ads. Two of the ads were for web-based services, and three were for apps on the Apple App Store. Meta deleted the ads when we flagged them. Apple did not initially respond to a request for comment on that story, but reached out to me after it was published asking for more information. On Tuesday, Apple told us it removed the three apps on its App Store.
wat (Score:5, Insightful)
Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images
You literally cannot prevent that in an app which can make consensual nude images. Therefore the word nonconsensual is being used in order to trigger people into having a specific opinion. A better description is "an app which can be used to create fake nude images" since it can't literally show you what someone would look like unclothed.
Re: (Score:3)
Yeah, this is definitely an "I'll know it when I see it" type of judgement. But to be fair, if you asked most of their customer base, "How would you feel about someone using an AI app to perfectly map your face onto a nude body and distribute the result, where most people couldn't tell the difference or see it labelled as fake in any way?", they would have a negative reaction to that.
If I was Tim Apple I would probably make the same call.
Re: (Score:3)
Re: (Score:2)
Go old school, hire a real airbrush artist.
Re: (Score:2)
Re: (Score:3)
I'm not against them removing the apps, I'm against the headline trying to make me feel a certain way. Present the facts, I'll decide how I feel about them.
Re: (Score:2)
I'm not against them removing the apps, I'm against the headline trying to make me feel a certain way. Present the facts, I'll decide how I feel about them.
LOLOL!!!
If you removed all the Yellow Journalism from the Web, what remained would fit on a Floppy. . .
Re: (Score:2)
Well....if the resultant images made me much less fat, and a bit more ripped....I dunno...maybe?
jk
Re: (Score:2)
Hey, consent means you reserve the right to make yourself as much of a gigachad as you want. That's your right as an American, damnit.
Re: wat (Score:2)
Re: (Score:2)
Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images
You literally cannot prevent that in an app which can make consensual nude images. Therefore the word nonconsensual is being used in order to trigger people into having a specific opinion. A better description is "an app which can be used to create fake nude images" since it can't literally show you what someone would look like unclothed.
There's already an app you can use to make consensual nude images, it's called a camera.
If you need generative AI to create the nude then it's non-consensual or a very weird edge case of people making their own fake nudes.
Re: (Score:1)
If you need generative AI to create the nude then it's non-consensual or a very weird edge case of people making their own fake nudes.
Millions of people sell nude pics, it might be very weird, but not an edge case to generate them rather than spending hours staging and editing or paying for a professional photo shoot.
Re: (Score:2)
Re: (Score:2)
It was the advertising that made it clear what these apps were being pitched as.
I saw some on Twitter. They showed a guy sending a fake nude to a woman, who then begged him to delete it, saying "please, I'll do anything!" So pretty clearly a rape app, sold as being useful for blackmailing women for sexual favours.
Apple doesn't allow such apps. I'm sure the developer of "Drunk Slut Location Pro" would argue that it is merely designed to help render medical assistance to vulnerable women who made poor life choices.
Inked (Score:3)
And who are you? (Score:2)
I'm glad you're here doing all this good, preventing "nonsensical nudity".
Re: (Score:2)
Nonconsensual. Meaning, someone is taking one person's body, slapping on a different person's face, and passing it off as that person.
Coincidentally, it's almost always men doing it to women.
Re: (Score:2)
Yes, this was changed very quickly.
Re: (Score:2)
The example in the original article was of an app that input a regular picture of a clothed person and output the same picture, but nude. It was not as apparently fake as slapping someone's face onto a nude picture.
Re: (Score:2)
Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.
I'm glad you're here doing all this good, preventing "nonsensical nudity".
Do you know how many Apps Apple has to churn through each day?
Think about it.
Pictures (Score:3)
"Pictures. Or it didn't happen!"
AI has ruined the joke.
Images (Score:1)
Oh, wait. No, I don't want to see that, because that shit's nasty and harmful. The existence of such images demonstrates the widespread existence of a tool that can be used to nasty, spiteful, disturbing ends against me, my family, people I know and care about, and the vast overwhelming majority of people that do not want that technology used on them.
I understand the technology cannot be unmade; Pandora's box is already wide open. But that also doesn't mean Apple (or whoever else) has to make it easy to find and use.
Re: (Score:2)
Pics or it didn't happen!
Oh, wait. No, I don't want to see that, because that shit's nasty and harmful. The existence of such images demonstrates the widespread existence of a tool that can be used to nasty, spiteful, disturbing ends against me, my family, people I know and care about, and the vast overwhelming majority of people that do not want that technology used on them.
I understand the technology cannot be unmade; Pandora's box is already wide open. But that also doesn't mean Apple (or whoever else) has to make it easy to find and use.
Related coverage: NYTimes [nytimes.com].
Kind of why, when somebody drew their attention to its Existence, Apple Removed the Offending (and Offensive) and Developer-Terms-Violating App from their App Store!
Good lord, man... learn the lessons of the past. (Score:3)
Don't say the quiet part out loud. For decades pot-related products like bongs got around the letter of the law by not using the words "pot" or "marijuana".
Their sin doesn't seem to be the ability, but rather the advertised intent.
Re: (Score:2)
But that's a valid point, as it openly demonstrated intent. It also facilitates induction of those without much prior intent.
OTOH, I suspect that the "prior intent" is there for practically all teen-aged boys.
Really Don't Care (Score:3)
It's just hysteria around this kind of stuff. I guess it's difficult to care since this isn't really anything new, Photoshop has been around for quite some time. Even before that, you could cut and paste pictures together. I remember seeing "nude" photos of lots of different celebrities back in the wild west days of the internet. It's been a long time since the US was founded by Puritans, but those puritanical roots still dig deep, with all the hysteria we have around sex and nudity.
Re: (Score:2)
Yes, but even with a tool like Photoshop, it still took a nontrivial amount of skill and effort to make faked photos look real: making the paste seamless, getting the skin tone to match, getting the lighting and shadow angles to match, etc. Modern AI apps allow anyone with no skill to make real-looking faked photos at the click of a button.
Re: (Score:1)
How does the app confirm consent? (Score:2)
What they're doing is banning nudes, consensual or not, for everyone, because some will abuse the capability.
Re: (Score:2)
Probably any app that takes some image as an input _could_ be used to produce non-consensual content. If you just ask the AI for a generic blonde with big cans, you are most likely safe.
You can't block it (Score:2)
What is the point to this? You better ban all pens, pencils, crayons, paints, etc because they can all be used to create a nude of anyone you want.
Re: You can't block it (Score:2)
Re: (Score:2)
What is the point to this? You better ban all pens, pencils, crayons, paints, etc because they can all be used to create a nude of anyone you want.
Do you really think the people in charge of this at Apple don't understand how content creation works?
Here's the thing: Like Apple's Sr. VP of Software Engineering, Craig Federighi, said (paraphrasing from memory), "We have to make iOS and iPadOS (and their App Stores) safe enough for kids; because we actually have kids, sometimes even infants, that use iPhones and iPads, every single day."
So, since there isn't an "Adult" Apple App Store (nor could there be; kids are too smart!), there's little choice but
Re: (Score:2)
You can run one in Europe now, it's perfectly legal.
And you can even verify it by requiring payment be done via age-verified means. Credit cards used to be 18+ only, but I'm sure in Europe there's probably other ways to verify.
Re: (Score:2)
You can run one in Europe now, it's perfectly legal.
And you can even verify it by requiring payment be done via age-verified means. Credit cards used to be 18+ only, but I'm sure in Europe there's probably other ways to verify.
We're not talking about Europe now, are we?
Eww. Gross! (Score:2)
I, for one, support the ban of nude apps.
Naked apps running around flaunting their bits are just gross!
Put an interface on, for crying out loud. Think of the children!
hawk
Is PhotoShop banned too? (Score:2)
Nonconsensual Pixels (Score:1)