Scottsdale Mom to Testify in Congress Over AI Deepfake Voice Scams

In April, Jennifer DeStefano, a Scottsdale mother, shared her story of becoming the victim of an artificial intelligence (AI) scam that used her daughter’s cloned voice to simulate a kidnapping and demand ransom.

DeStefano’s intention was to raise public awareness about this new, terrifying breed of deepfake scams, but the response she received exceeded her expectations.

The story quickly went viral, prompting widespread discussions, social media exchanges, and interview requests.

The chilling incident still haunts DeStefano. During the scam, she received a call featuring what sounded like her 15-year-old daughter crying for help, followed by a man demanding a ransom for her safety.

DeStefano noted that the voice was not just a single statement; it carried her daughter's inflection and her sobbing, and it held a conversation.

This detail underscored the sophistication of the AI technology used in the scam. Despite the realism, the scenario was completely fabricated. There was no kidnapping; the voice on the phone was a clone, artificially created.

The story captured the attention of Georgia Senator Jon Ossoff, who invited DeStefano to testify at a Senate subcommittee hearing.

To highlight the issue and announce the hearing, Senator Ossoff used a voice clone in a video posted on social media, noting, “Congress must properly understand this revolutionary technology and its relationship to human rights.”

When DeStefano shared her experience, she did not anticipate how many responses she would receive from others who had undergone similar ordeals. It was an alarming revelation about the reach and impact of AI scams.

The case also highlights how far AI technology has advanced. Subbarao Kambhampati, a computer science professor specializing in AI at Arizona State University, says a voice clone can be created from as little as a three-second audio sample.

The situation also raises concerns about the future of cybersecurity. Tim Roemer, a cybersecurity expert and former director of the Arizona Department of Homeland Security, expressed his worry about the misuse of AI, stating, “AI is going to be utilized faster by bad actors than it is by the good actors protecting us.”

Roemer also emphasized that the transition to a world where AI is prevalent will require time and adjustment.

DeStefano’s experience has led her to ask critical questions about the control and misuse of AI technology.

She questions what can be done to ensure the technology is used only ethically, and she calls for stricter criminal penalties against those who use AI and deepfakes to terrorize people.

In her eyes, these acts aren’t just pranks but forms of terror that should be treated accordingly.

This is not the first time we have seen incidents involving deepfakes.

For more information on this story, we suggest referring to the original report from Arizona's Family News.

Gary Huestis

Gary Huestis is the Owner and Director of Powerhouse Forensics. Gary is a licensed private investigator, a Certified Data Recovery Professional (CDRP), and a member of InfraGard. Gary has performed hundreds of forensic investigations across a wide array of cases, including intellectual property theft, non-compete enforcement, disputes in mergers and acquisitions, identification of data-centric assets, criminal charges, and network damage assessment. Gary has been the lead investigator in more than 200 cases that have gone before the courts. Gary's work has been featured in the New York Post and on Fox News.