Apple has removed three apps from the App Store that promoted the creation of nonconsensual nude imagery using generative AI, signaling a greater willingness to address the risks these apps pose. While generative AI has proven beneficial in fields like photography and design, it has also been misused to create deepfakes and nonconsensual pornography. Despite those dangers, Apple had previously been passive on the issue until this removal.
The apps in question were marketed as able to generate nonconsensual nude images, offering features such as face-swaps onto adult images and virtual stripping of clothing from subjects in photos. Apple acted only after a report alerted it to the apps' existence. While their removal is a positive development, it raises lingering concerns about Apple's oversight: the company relied on a third-party alert rather than catching the apps through its own App Store Review process. Still, the action marks a step forward from earlier instances in which Apple and Google were notified about similar apps but did not act promptly.