Meta’s Oversight Board is urging the company to update its rules around sexually explicit deepfakes. The board made the recommendations as part of its decision in two cases involving AI-generated images of public figures.
The cases stem from user appeals over two such images, though the board declined to name the individuals depicted. One post, which originated on Instagram, showed a nude Indian woman. The post was reported to Meta, but the report was automatically closed after 48 hours, as was a subsequent user appeal. The company eventually removed the post after the Oversight Board took up the case, and the board went on to overturn Meta’s original decision to leave the image up.
The second post, shared to a Facebook group dedicated to AI art, showed “an AI-generated image of a nude woman with a man groping her breast.” Meta automatically removed the post because it had already been added to an internal system that identifies images that have previously been reported.