AI ‘Undress’ Apps: The New Weapon of Blackmail and Extortion

Imagine riding public transport, unaware that someone nearby could take a photo of you and, within minutes, manipulate it into a hyper-realistic nude image. This unsettling reality is made possible by AI-powered “undressing” apps, tools that are now being weaponized for extortion.

While the concept of fake nudes isn’t new, artificial intelligence has significantly lowered the barrier for creating them. Apps that “undress” images are part of this disturbing trend, making it possible for anyone with malicious intent to generate convincing fake nudes, often without the victim’s knowledge. Such apps can quickly become tools for extortion, blackmail, and psychological abuse, affecting the lives of countless individuals.

A Case of AI Misuse:

Take the case of Rohan*, who turned to loan apps during a financial crisis. What he didn’t expect was that defaulting on his payments would lead to his family becoming targets of AI-generated abuse. When Rohan couldn’t meet his repayment deadlines, he received threatening messages not just about his debt but about the wellbeing of his loved ones. His wife soon found herself the victim of AI-generated explicit images circulated to their social circle — images she never took but that looked alarmingly real.

Cybersecurity expert Vikram Sharma explains, “These AI tools can easily access someone’s photo gallery once permission is granted by an app. They then generate hyper-realistic nudes, often used for blackmail.”

The Growing Threat of AI-Generated Nudes:

The misuse of AI for generating fake nudes is spreading across different social platforms, but Telegram has proven especially resistant to regulatory action. Despite efforts from companies like Meta and Google to suppress the use of “nudify” services by removing related keywords or blocking advertisements, the underground market continues to thrive. Telegram, in particular, harbors countless channels offering these “undressing” services. Some even allow users to alter images further by adding suggestive clothing or placing the victim’s face on inappropriate videos.

Disturbingly, these tools don’t only target adults. A study conducted by a child protection organization revealed that even minors are familiar with how to create fake nudes using these apps. In one alarming incident in India, a schoolboy distributed AI-generated nudes of a classmate, which were then shared across social media platforms.

Big Tech’s Response: Not Enough

While some big tech platforms have taken steps to address the problem, they are still falling short. Google’s recent move to simplify the process of removing non-consensual nude images from search results is a step forward. However, for many victims, the damage has already been done by the time action is taken.

Unfortunately, completely eradicating these images from the internet remains a difficult, if not impossible, task. Even if major platforms remove the content, it can still circulate on private messaging apps like Telegram, where enforcement is weak. As Sam Gregory, a generative AI expert, notes, “By the time these images are removed, the harm is often irreversible.”

How Do AI Nudify Apps Work?

These apps use AI models trained on massive datasets of images, including nude and clothed bodies. When you upload a photo, the AI identifies areas covered by clothes and digitally “removes” them. Then, it fills in the blanks with what it “predicts” the body would look like underneath, creating a highly realistic fake image.

Most of these apps target women, although they can be used on anyone. They even advise users not to upload pictures of men or animals, since the AI algorithms are designed specifically to create female nudes.

The Rise of Nudify Apps on Telegram:

Despite efforts by major tech companies to crack down on these apps, platforms like Telegram have become a hub for such services. Hundreds of channels offer “undressing” services where users can upload a woman’s photo and receive a manipulated nude image in minutes. Many of these services are offered for free or via a freemium model, making them easily accessible to a wide range of users.

Worse, some of these channels actively encourage abuse: their advertisements suggest taking photos of unsuspecting women in public spaces and running them through undress AI apps. This dangerous trend has led to an increase in incidents of harassment, blackmail, and exploitation.

How to Protect Yourself: Prevention and Caution

Given the rise of AI-powered image manipulation, it’s critical to be cautious about sharing personal images and protect yourself from potential misuse. Here’s how you can safeguard yourself from becoming a victim of AI-generated nudes:

  1. Be Mindful of Your Online Presence: Limit the number of personal images you share on public platforms. Once an image is online, it can be easily downloaded, manipulated, and misused.
  2. Use Privacy Settings: Ensure that your social media accounts are private, and control who can see your posts. Only allow trusted individuals to view your photos.
  3. Be Selective with App Permissions: Many apps request access to your phone’s gallery, contacts, and camera. Only grant these permissions to apps from trusted developers and always read the privacy policies.
  4. Stay Cautious in Public Spaces: In an age where anyone can discreetly take a photo of you in a public space, be aware of your surroundings. If you suspect someone is photographing you without permission, report it.
  5. Monitor Your Digital Footprint: Regularly check for images or content about you online, especially if you’ve shared personal pictures in the past. Google offers tools to help remove non-consensual explicit images from search results.
  6. Take Immediate Action If You’re a Victim:
    • Document Everything: If you become a victim of AI-generated image manipulation, take screenshots of the images and any related messages or threats.
    • File Complaints: Report the abuse to the platform where the images are being shared and contact local authorities.
    • Use Takedown Tools: Platforms like Google allow you to submit a request to remove explicit images or videos of you that were shared without your consent.
  7. Educate Yourself and Your Community: Talk to your family and friends about the dangers of these apps. Spread awareness in schools, workplaces, and online communities about the risks associated with AI nudify tools.

Challenges in Stopping AI-Generated Nudes:

Despite efforts by major platforms like TikTok, Meta, and Google to block or limit their spread, these apps continue to thrive, especially on private messaging services like Telegram. The anonymous nature of these platforms makes it hard to track down perpetrators or remove all instances of manipulated images.

Experts warn that while tech companies can take steps to reduce the visibility of such content, the harm is often done by the time action is taken. For the victims, the emotional and psychological impact of such violations can be profound and long-lasting.

Protecting Yourself: What You Can Do

For those targeted by these AI-powered extortionists, the key to combating this form of abuse lies in documentation. Victims are encouraged to take screenshots and save any images or videos that are used against them, as these can be vital for pursuing legal action or issuing takedown requests.

Ultimately, though, regulation and technological intervention need to evolve rapidly to prevent AI from continuing to be a tool for harm. While these apps exploit AI’s ability to create convincing fabrications, it’s crucial for tech companies and governments to stay one step ahead in the fight against non-consensual image manipulation.

Conclusion:

As AI technology continues to evolve, so too does the potential for its misuse. AI-powered nudify apps represent a significant threat to privacy, safety, and dignity. By understanding how these apps work and taking proactive steps to protect yourself, you can reduce the risk of falling victim to this growing form of cyber extortion.

The key takeaway is clear: Be mindful of what you share online, take control of your digital privacy, and spread awareness to ensure others can do the same. The misuse of AI to create fake nude images is a violation of personal privacy, but with vigilance and caution, we can protect ourselves and our loved ones from falling prey to these harmful tactics.

*Name changed for privacy reasons.
