There's almost nothing that AI can't do these days. It can even instantly undress a person in a photo, no extreme editing required.

Because this capability is concerning to everyone, particularly teenagers, Apple has recently purged several AI image generation applications from the App Store.

These apps were specifically designed to create nonconsensual nude images, sparking controversy. The crackdown follows investigative reporting by 404 Media that highlighted the problematic nature of these applications.

Undress Apps Are Everywhere on Social Media

(Photo : James Yarema from Unsplash)
With nudity now a hot subject in AI image generation, Apple is removing apps that promote the creation of "nonconsensual nude photos" of people.

404 Media's initial investigations uncovered a disturbing trend where AI apps were being advertised on Instagram with claims such as "undress any girl for free." 

According to 9to5Mac, these promotions often directed users to Apple's App Store, where the apps were misleadingly described as "art generators." The investigation revealed not only the existence of these apps but also highlighted their accessibility through mainstream advertising platforms.


Apple Took Quick Action to Delete Nude-Generating Apps

Although Apple did not immediately respond to 404 Media's initial inquiries, the company acted swiftly once it received detailed information, including direct links to the apps and their advertisements. That tip-off led to the prompt removal of the offending apps from the App Store, preventing further misuse of the technology to create harmful content.

The Nature of the Advertisements

A Futurism report gives some examples of how the AI image-generation apps work. One particular advertisement displayed a photo of Kim Kardashian next to provocative taglines like "undress any girl for free" and "try it." 

Another showed AI-generated images of a young-looking girl both clothed and topless, with explicit prompts such as "any clothing delete." These ads not only violated ethical norms but also raised legal concerns, especially regarding the protection of minors from digital exploitation.

Furthermore, these apps gained notoriety in recent months after being used to generate fake nude images of teenagers in American schools and across Europe. 

In an effort to curb these harmful apps online, platforms like Meta have taken steps to remove such advertisements, although consistently policing this content remains a challenge.

While the Cupertino firm's actions are a step in the right direction, they also highlight the ongoing challenges in digital content moderation. Companies like Google have also faced scrutiny for directing users to websites like "MrDeepFakes," known for deepfake pornography involving celebrities and public figures without consent.

Apple knows this won't be an easy fight, as the deleted apps will likely be replaced by new nude-generating ones. Still, it is a trend that needs to be stopped to safeguard the privacy and well-being of users on social media.

Apps are not the only concern: there's also an AI camera called NUCA that can generate a nude image of its subject in the blink of an eye.


Joseph Henry

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.