
Rise of AI-powered 'undressing' apps sparks concern over image manipulation; women worst affected

Disturbing trend is part of broader issue of non-consensual pornography facilitated by advancement of artificial intelligence

Reported by: PTC News Desk | Edited by: Jasleen Kaur | December 9, 2023, 01:35 PM

PTC Web Desk: Researchers have flagged a disconcerting surge in the use of artificial intelligence (AI) to create non-consensual, digitally manipulated images of women. According to Graphika, a social network analysis company, websites and applications that offer to "undress" individuals, especially women, using AI have rapidly grown in popularity.

In a distressing revelation, the analysis found that an estimated 24 million people visited such 'undressing' websites in September alone. Most of these services, categorised as 'nudify' apps, predominantly target women and use AI algorithms to alter images so that the subject appears nude, without their consent.


The disturbing trend is part of the broader issue of non-consensual pornography facilitated by advances in artificial intelligence, commonly referred to as 'deepfake pornography'. These manipulated images, often sourced from social media without the subject's knowledge or approval, raise profound legal and ethical concerns.

Graphika attributed the upsurge in popularity to the availability of open-source AI models that significantly enhance the quality of digitally manipulated images. Santiago Lakatos, an analyst at Graphika, emphasised the realistic nature of these AI-altered images compared to earlier versions, which were typically blurry.

Additionally, reports indicated that these apps engaged in questionable marketing practices, including suggestive language implying harassment and sponsored content on platforms such as Google's YouTube. Both Google and Reddit have taken action against these practices: Google has removed ads that violated its policies, while Reddit has banned several domains involved in sharing non-consensual content.

These 'undressing' services, some priced at $9.99 per month, claim high user engagement, with one app boasting over a thousand daily users, further raising concerns about the exploitation of AI technology for unethical purposes.

Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, expressed concern over the proliferation of deepfake software among ordinary individuals, pointing to cases in which high school and college students are both perpetrators and victims.

However, the absence of federal legislation in the United States directly addressing the creation of deepfake pornography presents a challenge. While laws prohibit such content involving minors, there are no comprehensive legal measures to counter the proliferation of non-consensual deepfake imagery.

TikTok and Meta Platforms Inc. have taken measures to restrict keywords related to these 'undressing' apps, highlighting their commitment to safeguarding against harmful content. Nevertheless, the ethical and legal implications of the misuse of AI technology for non-consensual purposes remain a significant concern, warranting a broader societal and legislative response.

- With inputs from agencies
