Apple Removes AI Apps for Generating Nonconsensual Nude Images
The tech giant took action after a media investigation revealed the apps' capabilities, highlighting ongoing concerns about AI ethics.
- Apple has removed three AI apps from the App Store that were capable of creating nonconsensual nude images of individuals.
- The removal followed an investigation by 404 Media, which found the apps through ads listed in Meta's Ad Library.
- Meta has since removed the offending ads, part of a broader industry response to the issue.
- The incident raises questions about the oversight of AI tools and the ethical responsibilities of the mainstream platforms that distribute and advertise them.
- Apple's action comes ahead of its planned AI feature updates, underscoring the company's focus on user trust and safety.