'A Tool for Scammers': AI Deepfakes Promote Fake Sex Drugs

WASHINGTON: Holding a giant carrot, a shirtless, muscular man advertises a supplement he says can enlarge male genitalia, one of countless AI-generated TikTok videos promoting unproven sexual performance treatments.

The development of generative artificial intelligence has made it easy and financially lucrative to mass-produce such videos with minimal human oversight. They often feature fake celebrity endorsements of counterfeit and potentially harmful products.

Some TikTok videos use carrots as a euphemism for male genitalia, apparently to get around content moderation that polices sexually explicit language.

“You'll notice your carrots have grown,” a muscular man says in a robotic voice in one video, directing viewers to a link to buy the product online.

“This product will change your life,” the man adds, claiming without any evidence that the herbs used as ingredients boost testosterone levels and send energy levels “through the roof.”

The video appears to have been created using artificial intelligence, according to a deepfake detection service recently launched by San Francisco Bay Area-based firm Resemble AI, which shared its findings with AFP.

“As this example shows, misleading content generated by AI is being used to promote supplements with exaggerated or unverified claims, potentially putting consumers’ health at risk,” Zohaib Ahmed, CEO and co-founder of Resemble AI, told AFP.

“We are seeing AI-generated content being weaponized to spread false information.”

The trend highlights how rapid advances in artificial intelligence have led to the emergence of what researchers call an AI dystopia – an online universe full of deception designed to manipulate unsuspecting users into buying dubious products.

These range from unproven, and in some cases potentially harmful, dietary supplements to weight-loss products and sexual performance boosters.

“AI is a useful tool for fraudsters looking to create large volumes of low-quality content at low cost,” disinformation researcher Abby Richards told AFP.

“It's a cheap way to create advertising,” she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has noticed a surge in AI avatars and audio tracks on TikTok that promote dubious sexual products.

Some of these videos, many of which have received millions of views, promote testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

Even more worryingly, rapidly advancing artificial intelligence tools have made it possible to create deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

“Your husband can’t get it up?” asks Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, in a TikTok video promoting a prostate-health supplement.

The clip, however, is a fake that uses Fauci's likeness.

Many of the manipulated videos start from existing footage, which is overlaid with an AI-generated voice and lip-synced to match the new audio.

“These doctored videos are particularly pernicious because they further impair our ability to recognize genuine accounts online,” Mantzarlis said.

Last year, Mantzarlis found hundreds of YouTube ads featuring celebrity impersonators, including Arnold Schwarzenegger, Sylvester Stallone and Mike Tyson, promoting supplements marketed as cures for erectile dysfunction.

The rapid pace of AI-powered short-form video production means that even when tech platforms remove questionable content, nearly identical versions quickly reappear, turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring new solutions and more sophisticated detection tools.

AFP fact-checkers have repeatedly exposed fraudulent Facebook ads promoting treatments, including erectile dysfunction drugs, that use fake endorsements from Ben Carson, a neurosurgeon and former US cabinet secretary.

However, many users still believe these endorsements are legitimate, underscoring the persuasive power of deepfakes.

“Affiliate marketing scams and dubious sex supplements have been around since the Internet began and even before,” Mantzarlis said.

“As with every other evil on the internet, generative AI has made this abuse vector cheaper and faster to deploy at scale.”


