NSFW Images Classification

Our adult content detection technology recognizes not-safe-for-work (NSFW) content in images. The model distinguishes explicit from non-explicit content in real time, returns a label and bounding-box coordinates for each detection, and restricts access to any improper content.

Our AI-powered content detection system checks images that users generate on your platform and automatically flags suspicious content with high accuracy and speed. The model can also blur or block offensive content to keep your website and content safe for your audience and to help you comply with regulations.


Main Function

The NSFW Images Classification API finds offensive, nude, and otherwise inappropriate content in images and videos. Our NSFW recognition technology monitors all user-generated content on a website and automatically detects, and can block, any NSFW images that would be considered inappropriate in a workplace.

Powered by AI, the model is a fast and effective way to keep offensive content off websites and to serve safe, appropriate content to users.

Input

The input is an image. 
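As a minimal sketch of how an image might be submitted, the Python snippet below uses the requests library. The endpoint URL, RapidAPI host, and "image" form-field name are placeholder assumptions rather than documented values, so substitute the actual details from the API's RapidAPI listing.

import requests

# Placeholder endpoint and headers; replace with the values from the
# API's RapidAPI listing.
API_URL = "https://example-nsfw-api.p.rapidapi.com/v1/detect"
HEADERS = {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "example-nsfw-api.p.rapidapi.com",
}

with open("photo.jpg", "rb") as image_file:
    # The image is sent as multipart form data; the field name "image"
    # is an assumption.
    response = requests.post(API_URL, headers=HEADERS, files={"image": image_file})

result = response.json()
print(result)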

Output

The API returns the following: 

  • Objects array; each detected object contains:
    • "box": a list of four integers giving the bounding-box coordinates of the detected object
    • "score": the confidence score of the detection
    • "label": the label of the detected object
  • "unsafe": true or false, depending on whether nudity is detected (true if nudity is present)

Examples of labels:

  • EXPOSED_ANUS
  • EXPOSED_BUTTOCKS
  • EXPOSED_BREAST_F
  • EXPOSED_GENITALIA_F
  • EXPOSED_GENITALIA_M

For example:

{
  "unsafe":true,
  "objects":[
    {
      "box":[339,30,363,55],
      "score":0.66,
      "label":"EXPOSED_BREAST_F"
    },
    {
      "box":[352,80,366,92],
      "score":0.60,
      "label":"EXPOSED_GENITALIA_F"
    }
  ]
}
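
For illustration, the response above can drive a post-processing step such as blurring the flagged regions before an image is shown. The sketch below is not part of the API itself; it uses the Pillow library and treats each box as [left, top, right, bottom] pixel coordinates, which matches the example values but is an assumption about the coordinate convention.

from PIL import Image, ImageFilter

# The example response shown above, as a Python dict.
result = {
    "unsafe": True,
    "objects": [
        {"box": [339, 30, 363, 55], "score": 0.66, "label": "EXPOSED_BREAST_F"},
        {"box": [352, 80, 366, 92], "score": 0.60, "label": "EXPOSED_GENITALIA_F"},
    ],
}

def blur_unsafe_regions(image_path, api_result, output_path):
    # Blur every detected region when the image is flagged as unsafe.
    img = Image.open(image_path)
    if api_result.get("unsafe"):
        for obj in api_result.get("objects", []):
            # Assumed order: [left, top, right, bottom] in pixels.
            left, top, right, bottom = obj["box"]
            region = img.crop((left, top, right, bottom))
            region = region.filter(ImageFilter.GaussianBlur(radius=12))
            img.paste(region, (left, top))
    img.save(output_path)

blur_unsafe_regions("photo.jpg", result, "photo_safe.jpg")

Alternatively, the same result can be used to reject or block the upload outright whenever "unsafe" is true, depending on your moderation policy.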