OpenAI is once again allowing users of its AI art generator DALL-E to edit images containing human faces. The feature was previously blocked over fears of misuse, but in a letter sent to millions of DALL-E users, OpenAI says it is opening up access after improving its filters to remove images that contain sexual, political, or violent content.
The feature lets users edit images in several ways. They can upload a photo of someone and generate variations of it, for example, or edit specific features, like changing someone's clothes or hairstyle. It will undoubtedly be useful to many users in the creative industries, from photographers to filmmakers.
“With improvements to our safety system, DALL-E is now ready to support these delightful and important use cases, while minimizing the potential for harm from deepfakes,” OpenAI said in the customer letter announcing the news.
King Dall-e allows faces again. Here is me as a wwe wrestler taking a bath pic.twitter.com/bwoCHIDylF
—NymN (@nymnion) September 19, 2022
The decision is part of an ongoing negotiation between the makers of AI art generators and their users as they try to navigate the technology's potential harms. As a well-funded company with ties to tech giants like Microsoft, OpenAI has taken a relatively cautious approach. But it has been outpaced by rivals like Stable Diffusion, which places fewer restrictions on users. That leads to faster development of the technology but also makes malicious applications much easier: Stable Diffusion, for example, is already being used to generate pornographic deepfakes of celebrities.
Such explicit material should be easy for OpenAI to block in DALL-E. The company's terms of service also prohibit users from uploading images of people without their consent (though this is essentially impossible to enforce proactively under its current access model). Still, no content filter is perfect, and there may be harmful use cases more subtle than non-consensual pornography.