AI voice cloning supercharges “Hi mom” text hoaxes that convince parents to send money to scammers
- Cybercriminals are using AI voice cloning to exploit trust, mimicking loved ones in emergencies to steal money via scams like the "Hi mom" fraud.
- Scammers can clone voices from just seconds of social media audio, making fake pleas for urgent financial help seem terrifyingly real.
- The "Hi mom" scam starts with a message from an unknown number, often using AI voice notes to impersonate a child in distress needing money fast.
- Experts warn these scams rely on psychological manipulation, pressuring victims to act quickly without verifying the caller’s identity.
- To protect yourself, pause before sending money, verify the sender through a known number, and report suspicious messages to authorities.
Experts are warning that cybercriminals have been weaponizing artificial intelligence to exploit the bonds of trust between family and friends and rake in significant profits.
A disturbing new wave of scams, known as the "Hi mom" fraud, is sweeping across messaging platforms like WhatsApp, using AI-generated voice clones and emotional manipulation to steal hundreds of thousands of dollars from unsuspecting victims. With fraudsters now able to mimic a loved one’s voice using just seconds of social media audio, experts warn that these deceptions are evolving at "breakneck speed," leaving even the savviest individuals at risk.
The anatomy of the scam
The scheme begins with a simple message: "Hi mom" or "Hi dad," often sent from an unfamiliar number. The sender claims to have lost their phone and been locked out of their bank account, urging the recipient to send money urgently, whether it is for rent, a new device to replace the lost one, or another fabricated emergency. What makes these scams particularly insidious is that they often also use AI voice cloning, which allows criminals to replicate a loved one’s voice with chilling accuracy.
Jake Moore, global cybersecurity advisor at ESET, explains: "With such software, fraudsters can copy any voice found online and then they target their family members with voice notes that are convincing enough to make them fall for the scam." Santander’s data reveals that impersonating a son yields the highest success rate, followed by daughters and mothers.
AI deception and social engineering
The scam’s effectiveness lies in its psychological manipulation. Fraudsters mine social media for personal details, devising believable narratives to pressure victims into acting quickly. Chris Ainsley, Santander’s head of fraud risk management, notes, "We’re hearing of instances where AI voice impersonation technology is being used to create WhatsApp and SMS voice notes, making the scam seem ever more realistic."
Once trust is established, the scammer then pivots to demanding money, always directing the funds to an unfamiliar account. Moore warns, "Scammers are increasingly getting better at manipulating people into doing as they ask as the story can often sound convincing and legitimate." For example, the victim's "son" or "daughter" may claim to be locked out of their bank account because their bank sent a confirmation code to the phone that they allegedly lost.
How to protect yourself
Experts emphasize vigilance and verification. Key steps include:
- Stop and think: Pause before responding to unexpected requests for money.
- Verify the sender: Call the person directly using a known number, or request a voice note with a prearranged "family password" (avoiding obvious phrases).
- Report suspicious activity: Forward scam texts to the police or report them via WhatsApp.
Moore advises, "Never send money to any new account without doing your due diligence, even if the narrative sounds plausible."
As AI tools become more accessible, these scams will only grow more sophisticated. The best defense is skepticism: question unusual requests, verify identities, and remember that no legitimate plea for help will punish you for taking a moment to confirm its authenticity. In the digital age, trust must be earned, not assumed.
Sources for this article include:
TheGuardian.com
DailyMail.co.uk
Independent.co.uk