Generative AI refers to artificial intelligence systems that can create or modify images, video, audio, and text. Tools like Stable Diffusion, DALL-E, Midjourney, and various "undressing" apps use generative AI to produce content that may look realistic but is partially or entirely fabricated.
These tools have advanced rapidly in recent years, making it increasingly difficult to tell real photos from AI-altered ones with the naked eye.
AI undressing (sometimes called "deepnude" or "nudify") refers to using AI tools to digitally remove clothing from a photo of a real person. The result is a fabricated nude image that never actually existed — the person was never photographed that way.
Creating these images is done without the subject's consent and is considered a serious form of image-based abuse in many jurisdictions. These images are:
When you see the "⚠️ Possible AI Modification Detected" banner on a photo, it means our moderation team has identified signs that the image may have been undressed or otherwise modified using AI tools.
Common signs include:
This flag is applied manually by our moderation team. It does not mean the image is confirmed to be AI-altered; it means there are enough indicators to warrant a warning.
RateMyBody.net takes the following stance on AI-generated and AI-modified imagery:
If you believe your photo was incorrectly flagged as "Possible AI," you can contact us to request a review. Please include:
We will review the flag and remove it if we determine that the image is authentic.
If you see an image on this site that you believe has been AI-undressed or AI-generated, please report it using the ⚠️ Report button on the photo page, or contact us directly.
If the image depicts you and was created or shared without your consent, we will prioritise your report. You may also wish to:
This policy was last updated on April 12, 2026.
If you have questions, contact us.