Apple has reportedly removed at least three apps from the App Store that claimed they could use artificial intelligence (AI) to generate non-consensual nude images, according to 404 Media. The apps were discovered through ads on Instagram.
According to 404 Media, Apple took action against these apps only after the publication shared links to them and their advertisements, suggesting that the tech giant could not find apps violating App Store policies without outside help.
The report said it found five such ads after browsing Meta's ad library, where all ads on the platform are archived. Two of the ads were for web-based services, while three directed users to apps on the Apple App Store. Some of these apps let users swap faces into adult images, while others were marketed as "undressing" apps that use AI to remove clothing from ordinary photos of people.
While Meta quickly removed the ads, Apple initially declined to comment and instead requested more information about them; the company removed the apps only after 404 Media's report was published last week.
This isn't the first time Apple has been warned about AI-powered deepfake apps on the App Store. In 2022, several such apps were found on both the Google Play Store and the Apple App Store, but neither tech giant removed them. Instead, they asked the developers to stop advertising such features on popular porn websites.
In recent months, undressing apps have spread through schools and universities around the world. Some of these tools are distributed as apps, while others are offered as subscription services.
© IE Online Media Services Pvt Ltd
First published on April 28, 2024, 12:18 IST