A prominent AI image generator startup left a database exposed on the internet, revealing more than 1 million images and videos. The majority of the images contained adult content, and some appeared to be non-consensual depictions of real people or of children.
Security researcher Jeremiah Fowler discovered the exposure in October and found that the same unsecured database was serving multiple websites, including MagicEdit and DreamPal. According to Fowler, around 10,000 new images were being added to the database every day.
The exposed database contained a mix of legacy assets, primarily from MagicEdit and DreamPal, and was also linked to SocialBook. SocialBook, however, denies any involvement in operating or managing the database.
AI image-generation tools are increasingly being used to create explicit imagery of real people, often women, without their consent. Reporting has documented an enormous ecosystem of "nudify" services that use AI to strip clothes off images, with some of these operations generating millions of dollars a year.
Reports of child sexual abuse material (CSAM) have doubled in the past year, and criminals are increasingly using AI to create it. Fowler found that the exposed database held numerous files that appeared to be explicit, AI-generated depictions of underage individuals.
Fowler reported the data to the US National Center for Missing & Exploited Children, a nonprofit that works on child-protection issues. A DreamX spokesperson expressed concern over non-consensual explicit imagery and said the company would go beyond generic pop-up warnings to prevent misuse.
The MagicEdit website listed multiple features that could be used to create explicit images of adults. Fowler found evidence that these tools were widely used, with AI-generated content often marketed to users as a service.
Fowler believes that companies need to implement stronger moderation mechanisms, beyond AI-based filtering alone, to protect users and prevent misuse of their platforms.