
Like many types of crime, fraud has evolved with technology. While fraud is by no means a new concept, online fraud now reaches across a range of sectors, from insurance claims and law enforcement to recruitment. Instead of submitting physically altered documents, fraudsters can use generative AI to edit images so that a car appears damaged, or to create synthetic people for scams.

One of the latest and most advanced generative AI tactics for committing fraud is the digital injection attack. The attacker injects replayed footage or deepfakes into a call to deceive identity checks or the other participants on the call. This can involve copying another person’s likeness or creating an entirely false persona using generative AI.

In this article we examine how digital injection attacks are used in fraud, why they are so dangerous for investigators, and what actions can counter the threat.

 

What is a Digital Injection Attack?

Digital injection attacks are a powerful cyber threat because there are multiple ways criminals can inject an audio or video feed to spoof biometric verification or deceive other participants on a call. This can be accomplished by presenting a fake (virtual) camera feed, or by injecting media into the data stream between the user and an organisation.

While digital injection attacks can involve deepfake material, especially to imitate ‘liveness’, not all attacks rely on deepfakes. Criminals can also record genuine footage or audio and replay it, particularly where a user’s voice is used to access an account. Read more about how digital injection attacks work.

This issue was recently highlighted by OpenAI CEO Sam Altman, who told CNN:

“A thing that terrifies me is apparently there are still some financial institutions that will accept a voice print as authentication for you to move a lot of money or do something else — you say a challenge phrase, and they just do it,” Altman said. “That is a crazy thing to still be doing… AI has fully defeated most of the ways that people authenticate currently, other than passwords.”

 

Four Real-World Scenarios of Digital Injection Attacks

Now that we have a clear definition of this cyber threat, here are four real-world scenarios in which digital injection attacks are used.

01) Insurance Claims Fraud

With the recent boom in hyper-realistic AI-generated imagery and video, it is no surprise that this could shake up the world of claims fraud. Criminals are using digital injection attacks to fake injuries, or to suggest damage to houses and vehicles, when the imagery is in fact generated by AI.

02) Finance

As Sam Altman notes in the quote above, criminals can copy and synthetically generate users’ biometric information, including their voice, to gain access to financial accounts. Know Your Customer (KYC) forms part of key regulations and guidelines in the financial services industry, requiring organisations to know and verify who they are communicating with. Whilst biometrics once seemed like the height of technology, something unique to each individual, they are now under threat from digital injection attacks.

03) Law Enforcement and Justice

Ever since the pandemic, operating remotely has become a common occurrence in both law enforcement and justice. This could involve conducting interviews, gathering testimony or hosting remote trial hearings. However, now that digital injection attacks are becoming more sophisticated, how can we trust that the person on the other end of the call is who they say they are?

04) Recruitment

A growing concern in 2025, specifically in recruitment, is the number of fake candidates being generated using deepfakes. While some criminals create entirely fictitious people, others impersonate real people to gain employment, committing identity theft in the process.

 

Why Digital Injection Attacks Are So Dangerous

Digital injection attacks have the potential to become a global cyber threat, completely altering the way we communicate and the way we verify who we are virtually. As previously hinted at, biometric verification was once seen as one of the most secure methods of proving your identity. Now anyone with a web browser can use online AI tools to generate video and audio that replicate a real person. If a criminal has access to a person’s social media accounts, or to personal data exposed in a breach, they can use that material to generate deepfakes.

In relation to investigations, this completely undermines trust in evidence. This is especially true of remote communication taking place over video or audio call, as a criminal can use spoofing techniques to change their voice or appearance.

The technology behind digital injection attacks is becoming more sophisticated with each passing day, making these attacks increasingly difficult to detect without the right tools.

 

What Investigators and Organisations Should Do Now

First and foremost, conduct a risk assessment of the points at which digital injection attacks could be used against you. Whether it is a remote video call or an audio call, you need to know and highlight where you are at risk.

Next, update your organisation’s protocols and guidelines to include identity checks and general awareness of the threat of digital injection attacks. This can be a challenge, but there are ways to confirm that you are speaking with the right person, such as Knowledge-Based Authentication (KBA) or Multi-Factor Authentication (MFA); a minimal sketch of the multi-factor principle follows below.
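As an illustration of that multi-factor principle, the sketch below implements a minimal time-based one-time password (TOTP) check in Python using only the standard library. It is a generic, hedged example of RFC 6238-style verification, not a description of any particular vendor’s product; the demo secret, 30-second time step and six-digit code length are assumptions chosen purely for illustration.

```python
import base64
import hmac
import struct
import time


def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238-style time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)


def verify_code(secret_b32, submitted, window=1, step=30):
    """Accept the current code and +/- `window` steps to allow for clock drift."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now + i * step, step), submitted)
               for i in range(-window, window + 1))


if __name__ == "__main__":
    SECRET = "JBSWY3DPEHPK3PXP"   # hypothetical demo secret; never hard-code in production
    code = totp(SECRET)
    print("One-time code:", code)
    print("Verifies:", verify_code(SECRET, code))
```

In practice the shared secret would be provisioned through an authenticator app and stored server-side, and the one-time code would be requested alongside, not instead of, the usual credentials, so that a replayed voice or video feed alone is not enough to pass the check.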

Finally, there are emerging tools and technologies designed to prove the provenance of video and audio calls, as well as of evidence capture and upload. MeaConnexus is a secure, tamper-evident interview platform which can show whether a recorded interview has been tampered with.
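To illustrate the general principle behind tamper evidence (a hedged sketch only, not MeaConnexus’s actual implementation), the Python example below fingerprints a recording with SHA-256 at the point of capture and re-checks that digest later; any change to the file changes the digest and is flagged. The file and manifest names are hypothetical.

```python
import hashlib
import json
from pathlib import Path


def fingerprint(path):
    """Return the SHA-256 digest of a file, read in chunks so large recordings fit in memory."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest()


def seal(recording, manifest="manifest.json"):
    """Store the digest taken at the point of capture in a manifest kept apart from the recording."""
    entry = {"file": recording, "sha256": fingerprint(recording)}
    Path(manifest).write_text(json.dumps(entry, indent=2))


def check(manifest="manifest.json"):
    """Re-hash the recording and compare with the sealed digest; False signals possible tampering."""
    entry = json.loads(Path(manifest).read_text())
    return fingerprint(entry["file"]) == entry["sha256"]


if __name__ == "__main__":
    seal("interview_recording.mp4")       # hypothetical file name
    print("Integrity intact:", check())   # False if the file was altered after sealing
```

In a real deployment the sealed digest would be stored somewhere neither party to the interview can alter, for example in an append-only log or a signed manifest, so the check itself cannot be quietly rewritten.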

 

Fighting Fraud in the Age of Synthetic Media

Although this article highlights an evolving threat, digital injection attacks will not replace traditional fraud tactics. However, in the coming years, and especially into the 2030s, they have the potential to completely disrupt how we communicate and authenticate ourselves remotely.

By taking the time now to act on the steps highlighted in this article, you can avoid the reputational damage, fines and payouts that inaction could bring.

Ultimately, the question remains: can you truly trust who you are on a call with?

About Mea Digital Evidence Integrity 

The Mea Digital Evidence Integrity suite of products has been developed by UK-based consultancy Issured Ltd. Benefitting from years of experience working in defence and security, Issured recognised the growing threat from digital disinformation and developed the Mea Digital Evidence Integrity suite of products to ensure digital media can be trusted.
MeaConnexus is a secure investigative interview platform designed to protect the evidential integrity of the interview content. With features designed to support and improve effective investigations, MeaConnexus can be used anytime, anywhere and on any device, with no need to download any software.
MeaFuse has been designed to protect the authenticity and integrity of any digital media from the point of capture or creation anywhere in the world. Available on iOS, Android, Windows and macOS, MeaFuse digitally transforms the traditional chain of custody to ensure information is evidential.

Disclaimer and Copyright 

The information in this article has been created using multiple sources of information. This includes our own knowledge and expertise, external reports, news articles and websites.
We have not independently verified the sources in this article, and Issured Limited assumes no responsibility for their accuracy.
This article is created for information and insight; it is not intended to be used or cited as advice.
All material produced in the article is copyrighted by Issured Limited.

Interested in Hearing More? 

To receive regular updates and insights from us, follow our social media accounts on LinkedIn for Mea Digital Evidence Integrity and Issured Limited.
Additionally, sign-up to our Mea Newsletter to receive product updates, industry insights and event information directly to your mailbox. Sign up here.
View our other articles and insights here.