KUALA LUMPUR – It is time to update the “photoshop” rules on pornographic deepfakes and bring them into the era of artificial intelligence, the Oversight Board has told tech giant Meta.
The board, which is the “top court” for Meta content moderation decisions, was reviewing two cases on non-consensual deepfake images of women in India and the US.
“These two cases involve AI-generated images of nude women, one resembling an Indian public figure, the other an American public figure.
“In the first case, an Instagram account that shared only AI-generated or manipulated images of Indian women posted a picture of the back of a nude woman with her face visible, as part of a set of images.
“In this image, posted to a Facebook group for AI creations, the nude woman is being groped. The famous figure she resembles is named in the caption,” the board said in a statement.
“In the second case (American public figure), the explicit image had already been added to a Media Matching Service (MMS) bank for violating Meta’s bullying and harassment policy.”
Following this, the board recommended that Meta move the prohibition on “derogatory sexualised photoshop” into the Adult Sexual Exploitation Community Standard.
Meta is also advised to replace the word “photoshop” in the prohibition on “derogatory sexualised photoshop” with a more generalised term for manipulated media.
“(Meta) should harmonise its policies on non-consensual content by adding a new signal for lack of consent in the Adult Sexual Exploitation policy: context that content is AI-generated or manipulated.”
Photoshop, Adobe’s image-editing software, is widely used as a generic reference for image manipulation. – July 25, 2024