Ending Abusive AI: The Midjourney to a Free and Fair Future

The tool had been used to create fake images of Donald Trump and Pope Francis, among others

Midjourney is ending free use of its AI image generator after people used it to create high-profile deepfakes. CEO David Holz announced on Discord that the company is discontinuing free trials due to "extraordinary demand and trial abuse." New safeguards have not been "sufficient" to prevent misuse during trial periods, Holz said. For the time being, you will have to pay at least $10 per month to use the technology.

According to The Washington Post, Midjourney has recently found itself at the centre of unwanted attention. People used the company's AI to create deepfakes of Donald Trump being arrested and of Pope Francis wearing a trendy coat. While the images were quickly identified as fake, there is concern that bad actors may use Midjourney, OpenAI's DALL-E, and other similar generators to spread misinformation.

Midjourney has admitted to having difficulty establishing content policies. In 2022, Holz justified a ban on images of Chinese President Xi Jinping by telling Discord users that his team only wanted to "minimise drama," and that keeping any access in China was more important than allowing satirical content. In a Wednesday chat with users, Holz said that setting content policies was becoming harder as AI enabled increasingly realistic imagery, and that he hopes to improve the AI moderation that screens for abuse.

To avoid incidents, some developers have implemented strict rules. OpenAI, for example, prohibits images of current political events, conspiracy theories, and politicians, as well as hateful, sexual, and violent content. Others have more lax rules: Stability AI bars Stable Diffusion users from copying styles or creating not-safe-for-work images, but it does not otherwise limit what people can create.

Misleading content isn't the only issue with AI image generation. Because these systems frequently use existing images as reference points, there have long been accusations that their output amounts to theft. While some businesses are incorporating AI art into their products, others are wary of attracting unwanted attention.
