AI That Undresses People: Is There a Need For Urgent Regulation?

Artificial intelligence (AI) has taken a troubling turn with Undress AI, an app that digitally removes the clothing of women in photos. Its soaring popularity has many asking whether urgent regulation is needed, as the app appears to violate personal privacy.

AI That Undresses People

Artificial intelligence-powered apps and websites that digitally undress women in photos are becoming increasingly popular. According to Graphika, a social network monitoring company, 24 million people visited undressing websites in September alone.

Many of these "nudify," or undressing, services rely on popular social networks to market themselves, according to Graphika. The researchers found, for example, that the number of links advertising undressing apps on social media, particularly on X and Reddit, has risen more than 2,400% since the start of the year. The services use AI to manipulate an image so that the person appears undressed, and many of them work only on women.

These apps are part of a worrying trend in which advances in artificial intelligence are used to create and distribute non-consensual pornography, or "deepfake pornography," fabricated sexual imagery of real people. Because the photos are frequently taken from social media and shared without the subject's knowledge, consent, or control, their spread raises serious ethical and legal issues.

According to Graphika, the surge in popularity tracks the release of several open-source diffusion models, artificial intelligence programs that can produce far more convincing images than those made just a few years ago. Because these models are open source, app developers can use them free of charge.

"You can create something that actually looks realistic," said Santiago Lakatos, an analyst at Graphika, noting that previous deepfakes were often blurry.

One image uploaded to X to promote an undressing app used language suggesting that customers could create a nude image of someone and send it to the person who had been digitally undressed, which may amount to harassment. Meanwhile, one of the apps appears as the top result when searching "nudify" on Google's YouTube because it has paid for sponsored content there.


Nudity AI Apps Spark Need For New Regulation

Jaspreet Bindra, founder of the technology consulting company Tech Whisperer, believes the first step in regulation should be "classifier" technology that can distinguish real images from AI-generated ones.

"Technology and regulation together have to be the two-pronged solution," he declared. We require classifiers to distinguish between what is real and what is not. Similarly, something created by AI has to be properly marked as such by law.

A legislative framework for AI is still under debate, but generative AI tools like Undress underscore the need to move that process along quickly.

Deepfakes have already shown how much damage they can do by spreading disinformation and fake news in the political arena. As tools like Undress demonstrate, the threat now reaches ordinary people as well, particularly women whose innocently posted social media photos can be turned against them.

