Meta, the parent company of Facebook and Instagram, has launched a sweeping crackdown to remove thousands of advertisements promoting artificial intelligence-generated “girlfriends.” The move follows an investigative report by Wired that uncovered more than 29,000 such ads across Meta platforms, including Facebook, Instagram, and Messenger.
The ads in question feature AI-generated images of scantily clad women in suggestive poses and invite users into sexually explicit messaging. Wired’s investigation found that more than half of the advertisements carried the label “NSFW” (Not Safe For Work), signaling content inappropriate for workplace viewing.
According to the report, approximately 2,700 of the ads were still active when Meta was notified last week.
Meta’s policies strictly prohibit adult content, which encompasses nudity, explicit or suggestive positions, and sexually provocative activities. In response to the findings, Meta spokesperson Ryan Daniels stated, “When we identify violating ads, we work quickly to remove them, as we’re doing here. We continue to improve our systems, including how we detect ads and behavior that go against our policies.”
Daniels reiterated the company’s commitment to robust content moderation in a statement to FOX Business, adding that Meta had begun removing the ads for violating its adult content guidelines. He also noted that groups and individuals often adopt new tactics to evade detection, requiring constant updates to Meta’s enforcement strategies.
The crackdown reflects Meta’s ongoing efforts to maintain a safe, policy-compliant environment across its social media platforms, and it underscores the challenges tech giants face in moderating content designed to skirt established community guidelines.