'You will panic': Georgia mother targeted by ransom attack describes near-heart attack moment
Debbie Shelton Moore picked up the phone and heard the sound of her frightened 22-year-old daughter's voice calling for help. Then a man took the phone and demanded ransom.
"The man had said, 'Your daughter's been kidnapped and we want $50,000,'" the Georgia mother told 11Alive, an Atlanta NBC affiliate. "Then they had her crying, like, 'Mom, mom' in the background."
"It was her voice and that's why I was totally freaking out," Shelton Moore continued.
But it wasn't her daughter on the phone. It was an AI-generated clone of her voice made to scam the worried mother.
As AI has grown in popularity, criminals around the country have been using the technology to create fake voices for blackmail scams. One in four people has experienced a scam involving an AI-cloned voice or knows someone who has, according to a May McAfee survey of 7,000 people.
In March, scammers in two incidents in Arizona demanded ransom after using AI-generated voices of family members.
In Shelton Moore's case, the number that appeared on her phone was from the same area code where her daughter, Lauren, lived. The panicked Georgia mother believed her daughter had just gotten in a car accident.
"My heart is beating and I'm shaking," Shelton Moore said of the moment when she received the ransom call. The voice of her daughter "was 100% believable," she continued, "enough to almost give me a heart attack from sheer panic."
Artificial intelligence tools can recreate a human voice from short clips of audio recordings. (Getty Images)
"It was all just kind of a blur because all I was thinking was, 'How am I going to get my daughter? How in the world are we supposed to get him money?'" she told 11Alive.
One of the male voices on the phone told Shelton Moore they had her daughter in the back of a truck.
However, her husband, who works in cybersecurity, was able to FaceTime their daughter and bust up the scam.
Local police dispatched after the ransom call confirmed that Lauren was safe, according to 11Alive.
Still, Shelton Moore recommended families create plans to beat scammers.
"Of course, when you hear their voice, you're not going to think clearly and you will panic," she said. "The whole family needs to have a safe word or safe phrase that they’re not going to forget under duress."
The Cobb County Sheriff's Office also recommended earlier this month that families adopt a safe word or phrase as AI scams circulate in the county.
Jon Michael Raasch is an associate producer/writer with Fox News Digital Originals.