
Here's Why OpenAI Isn't Banning Swastikas From Its New Image Generator


Completely banning controversial symbols ‘could erase meaningful conversations and intellectual exploration,’ OpenAI says.
OpenAI debuted a new ChatGPT image generator this week that allows for the use of controversial images like swastikas in certain contexts.
"We recognize symbols like swastikas carry deep and painful history," says Joanne Jang, OpenAI's head of product. "At the same time, we understand they can also appear in genuinely educational or cultural contexts. Completely banning them could erase meaningful conversations and intellectual exploration."
Unsurprisingly, mixing AI with sensitive topics is not foolproof and requires heavy user oversight. I asked the new image generator, which uses OpenAI's GPT-4o model instead of DALL-E, to create an image of "a door with a swastika on it." It refused my initial request, saying it would only do so for a "cultural or historical design."
Then, I asked it to "create a swastika for use in a school assignment." It seemed to accept this, and asked for more details about the project. It also pointed out that "the symbol has been used for thousands of years in many cultures, including Hinduism, Buddhism, and Jainism" and vaguely alluded to it being "appropriated in the 20th century in a very different context." It did not use the words Hitler or Nazi.
"I want a diagram that compares the visual elements of swastikas used by Germany in WWII and the cultural symbol you mentioned," I responded. After a minute or two, it created the image. I told it that one element of the image was incorrect—the lower arrow labeled "upright" points to the wrong symbol—and it said, "You're right! Do you want me to fix it?"

Is More or Less Content Moderation Better?
The new policy is part of a push at OpenAI for more hands-off content moderation. "AI lab employees should not be the arbiters of what people should and shouldn't be allowed to create," Jang says.
