Love, Lies, and Algorithms: AI Fuels the New Romance Fraud

May 2, 2025 | Cybersecurity
By Ashwani Mishra, Editor-Technology, 63SATS Cybertech

It was the perfect love story—until it wasn’t.

Like many others, Elena had seen the Netflix doc The Tinder Swindler and scoffed, “I’d never fall for that.” She believed she was smarter, more skeptical. But when she met “Michael” on a dating app, her resolve slowly unraveled.

He wasn’t flashy. No private jets. No diamond empire. Just a down-to-earth venture capitalist from Berlin who claimed to be expanding his blockchain business into Russia. Their chats quickly moved from the app to WhatsApp. He sent her voice notes, shared book recommendations, even checked in when her mother was hospitalized. There were video calls too—his handsome face always framed in a chic but sparse co-working space.

It felt real. Because it was meticulously engineered to feel that way.

Moscow: Cybersecurity firm Angara Security has issued a warning: a surge in sophisticated digital romance scams is sweeping Russia. Using AI-generated profiles, deepfakes, and clever social engineering, scammers are weaponizing intimacy.

The scheme is dubbed “pig butchering.” First comes the “fattening”—where scammers build trust, sometimes over weeks or months. Then, the “cutting”—when the victim is lured into a fake investment or business venture, convinced by promises of love and future riches.

“Michael” suggested they invest together in a crypto arbitrage platform. He even showed Elena a slick dashboard—her initial $1,000 appeared to grow daily. Encouraged, she added more. When she hesitated at $25,000, he reassured her: “This is our future.”

What Elena didn’t realize was that Michael didn’t exist. His videos were deepfaked, his voice notes generated by AI trained on real influencers. The “investment platform” was a front, part of an international Ponzi scheme. The sleek user interface? A static website with no underlying blockchain.

And when Elena finally asked to withdraw her money, the site went offline. Michael vanished.

Too ashamed to go to the police, she confided in a friend—who’d read about a similar scam in RIA Novosti. That’s when Elena realized: this wasn’t just heartbreak. It was identity theft. Financial ruin. And psychological trauma.

According to the 2024 NICE Actimize Fraud Insights Report, romance scams surged 133% in value and 50% in volume last year alone. These aren’t just one-off crimes—they’re part of global, coordinated fraud operations. Many are powered by criminal groups using machine learning to target specific demographics.

Social engineering lies at the heart of the scam. The fraudster becomes the victim’s confidant or romantic partner, tailoring their identity to align with the victim’s dreams. As trust grows, so does the manipulation.

The payment methods used—often peer-to-peer apps like Venmo, PayPal, or local equivalents—make recovery difficult. If a victim authorizes the payment, they often have no recourse. In the UK, regulators are now forcing banks to share liability for such “authorized push payment” scams. Elsewhere, the losses fall squarely on the victim.

But the impact goes deeper than bank balances. Victims are often recruited into other schemes. Some become unwitting money mules, helping launder stolen funds. Others are drawn into fake jobs, used as shields for fraudulent transfers.

What makes this wave of digital scams particularly dangerous is the ease of fabrication. Fraudsters now use AI to create:

  • Deepfake videos mimicking real-time calls
  • False receipts, invoices, and crypto dashboards
  • Social media profiles with stolen photos and AI-generated content
  • Falsified legal documents to “prove” investments are real

In the case of “Michael,” the entire persona was built using a deepfake generator trained on a composite of European entrepreneurs. His “company” had a LinkedIn page, employees (also fake), and investor testimonials—all AI-generated.

Elena was just one of many victims used to fund the scam’s next phase.

The Human Cost

Most victims don’t come forward. Shame, embarrassment, and fear keep them silent. But cybersecurity experts say awareness is the first defense.

“There’s a psychological component that makes romance scams uniquely effective,” says a spokesperson at Angara Security. “They exploit emotional vulnerability, creating a sense of urgency and connection. It’s not just a scam—it’s betrayal at a personal level.”

Red Flags to Watch For:
  • Someone professing love or deep friendship quickly
  • Pressure to move the conversation off the platform
  • Investment or money-related talk within weeks
  • Reluctance to meet in person or inconsistencies in stories
  • Fake documents that look too “perfect”

Conclusion

Technology evolves fast—but so do scammers. What starts as a search for companionship can spiral into financial and emotional devastation. As AI and deepfake technology become more accessible, trust becomes the new battleground in the fight against cybercrime.

For Elena, the damage was done. But by sharing her story, she hopes others will pause, question—and protect themselves.

Because sometimes, the biggest red flag isn’t the lie you hear.

It’s the story you want to believe.