By Ashwani Mishra, Editor-Technology, 63SATS Cybertech
In the race to embrace artificial intelligence, thousands of users have unknowingly walked into a cyber trap.
Cybercriminals are now weaponizing the public’s fascination with AI tools—using slick, fake websites and viral social media campaigns to spread a dangerous new malware strain: Noodlophile Stealer.
This isn’t just another run-of-the-mill phishing scam.
According to a recent report by cybersecurity firm Morphisec, this campaign marks a disturbing evolution in cybercrime.
Instead of peddling cracked software or fake game mods, attackers have pivoted to AI-themed platforms that promise users powerful video and image editing capabilities—all for free.
But there’s a catch.
The Bait: Free AI Tools That Never Deliver
It starts innocently enough. A Facebook post in a tech group—garnished with buzzwords like “AI-powered,” “instant results,” and “free video generation”—draws the attention of creators, influencers, and small business owners.
One such post reached over 62,000 views, showing just how viral this bait can go. The website looks legit. The social media presence feels authentic. The offering is irresistible.
Users are asked to upload their own images or video clips to generate custom AI content. Expecting a futuristic output, they instead receive a file to download—often named something like VideoDreamAI.zip. Buried within is an executable disguised as a video file: Video Dream MachineAI.mp4.exe. One click, and the damage begins.
The Switch: From Creative Tool to Credential Theft
That simple download triggers a stealthy, multi-stage attack chain. At the heart of this operation is Noodlophile Stealer, a newly discovered infostealer designed to raid victims’ browser credentials, cryptocurrency wallets, and sensitive data. In many cases, the infection also includes XWorm, a remote access trojan that gives hackers deeper, ongoing control of the device.
Morphisec researchers point out that this attack is notably more sophisticated than typical drive-by downloads. It’s not just technical trickery—it’s psychological manipulation. It preys on trust, enthusiasm, and the desire to create.
Who’s Behind Noodlophile?
The malware, now circulating in cybercrime forums as part of Malware-as-a-Service (MaaS) offerings, is marketed under catchy names like “Get Cookie + Pass” packs—bundles designed for account takeovers. Linguistic traces and social media activity suggest the malware developer may be based in Vietnam. Interestingly, researchers also found these developers actively promoting the malware on Facebook, directly engaging with user comments.
Noodlophile Stealer communicates via Telegram bots, allowing the stolen data to be siphoned discreetly back to the attackers. This setup gives even low-level cybercriminals a turnkey way to harvest data—no technical expertise needed.
AI: The New Weapon of Choice
AI tools themselves are groundbreaking, but their very appeal and novelty make them perfect camouflage for malicious intent.
Gone are the days of poorly spelled phishing emails.
Today’s cyber lures are beautifully designed, cleverly marketed, and deeply targeted. They blend into online communities with ease and exploit the very platforms we use to discover new tools.
Creators, freelancers, and digital marketers—those most likely to explore new AI utilities—are now squarely in the crosshairs. They’re being tricked not through greed, but through curiosity and ambition.
What Can You Do?
The lesson is clear: if it looks too good to be true, it probably is—especially in the AI space. Here are a few steps to protect yourself:
- Avoid downloading executables from unverified sources, even if they come from popular social media groups.
- Check file extensions carefully—“.mp4.exe” is not a video; it’s a trap.
- Be skeptical of viral AI tools that ask you to upload personal media before showing results.
- Stay updated on malware trends—especially as attackers adapt to emerging technologies like AI.
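The extension check above can even be automated. As a rough sketch (the extension lists here are illustrative, not exhaustive), a few lines of Python can flag the double-extension trick used by files like Video Dream MachineAI.mp4.exe—an executable dressed up as a video:

```python
import os

# Extensions that run code on Windows; a file ending in one of these is a
# program, regardless of what the rest of the name suggests.
EXECUTABLE_EXTS = {".exe", ".scr", ".bat", ".cmd", ".com", ".pif", ".msi", ".js", ".vbs"}

# Media-style extensions attackers embed in the name to make a program look harmless.
MEDIA_EXTS = {".mp4", ".avi", ".mov", ".mkv", ".jpg", ".jpeg", ".png", ".gif", ".pdf"}

def is_disguised_executable(filename: str) -> bool:
    """Return True if the file's real extension is executable but a fake
    media extension is embedded just before it (e.g. 'clip.mp4.exe')."""
    name = filename.lower().strip()
    root, real_ext = os.path.splitext(name)
    if real_ext not in EXECUTABLE_EXTS:
        return False
    # Does the remaining name itself end in a media-style extension?
    _, fake_ext = os.path.splitext(root)
    return fake_ext in MEDIA_EXTS

print(is_disguised_executable("Video Dream MachineAI.mp4.exe"))  # True: disguised
print(is_disguised_executable("holiday.mp4"))                    # False: real video
```

A check like this matters because Windows hides known extensions by default, so "Video Dream MachineAI.mp4.exe" displays as "Video Dream MachineAI.mp4" in Explorer—exactly the confusion this campaign exploits.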
The Final Word
Cybersecurity isn’t just about firewalls and antivirus software anymore. It’s about critical thinking in a digital world that blurs the line between real and fake. As AI continues to shape our online experiences, the need for digital vigilance has never been greater.
What’s being stolen isn’t just data—it’s trust.
And once that’s gone, the road back is a lot harder than any video you were promised.