iProov, a specialist in online facial biometric verification and authentication technology, said a 149% increase in cyberattacks using emulators posing as mobile devices illustrates how quickly new attack vectors emerge and scale.
What Are Emulators?
Emulators are programs or devices that enable one computer system to mimic the behavior of another device. A “proliferation” of low-cost, easy-to-use tools has enabled threat actors to launch “advanced, scalable attacks with limited technical skill,” company officials said.
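To make the concept concrete, the sketch below shows a toy heuristic of the kind defenders might use to flag emulator-like device fingerprints. The marker strings and field names are illustrative examples (borrowed loosely from well-known Android emulator identifiers), not iProov's detection method or any real API:

```python
# Illustrative only: a toy heuristic for flagging emulator-like device
# fingerprints. Field names and marker values are hypothetical examples,
# not a production detection system.

# Hardware identifiers commonly reported by popular Android emulators
# (illustrative sample values).
EMULATOR_MARKERS = {
    "goldfish",   # QEMU-based Android emulator kernel name
    "ranchu",     # newer Android emulator board name
    "generic",    # generic build fingerprints
}

def looks_like_emulator(device_profile: dict) -> bool:
    """Return True if any reported hardware field matches a known marker."""
    fields = (
        device_profile.get("hardware", ""),
        device_profile.get("board", ""),
        device_profile.get("brand", ""),
    )
    return any(
        marker in value.lower()
        for value in fields
        for marker in EMULATOR_MARKERS
    )

# A profile an emulator might report vs. one a real handset might report.
print(looks_like_emulator({"hardware": "goldfish", "board": "unknown"}))            # True
print(looks_like_emulator({"hardware": "qcom", "board": "sdm845", "brand": "Google"}))  # False
```

Real attackers can spoof these fields, which is why static fingerprint checks alone are considered weak compared with behavioral and biometric liveness signals.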
Biometric authentication is more secure than traditional methods such as passwords, and it is convenient enough to work in a range of public settings, including airports, banks and hospitals. One drawback is privacy: biometric data can be stolen or forged, and the systems themselves can be hacked.
iProov's Report Examined
Some key findings from iProov’s Biometric Threat Landscape 2023 Report include:
- Digital injection attacks, in which an attacker bypasses the camera feed to trick a system with synthetic imagery or recorded video, occurred five times more frequently online in 2022 than persistent presentation attacks (i.e., showing a photo or mask to a camera).
- Attacks on mobile platforms using emulators, software that mimics the behavior of mobile devices, are increasing: threat actors targeting mobile platforms rose 149% in the second half of 2022 compared with the first half.
- Attacks using deepfake technology, which hackers use to create synthetic 3D videos that trick systems into believing the genuine customer is trying to authenticate, were more prevalent than in the previous year.
- Novel face swaps, which combine existing video or live streams and superimpose another identity over the original feed in real time, appeared for the first time in the first half of 2022. Use of the tactic grew by 295% in the second half of the year, partly because face swaps are challenging to detect, iProov said.
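The distinction between the two attack classes above is worth spelling out. The toy model below contrasts them; every function and string here is a hypothetical illustration of the concepts, not real attack or detection code:

```python
# Illustrative toy model of the two attack paths described in the findings.
# All names and return values are hypothetical, for explanation only.

def capture_from_camera(scene: str) -> str:
    """A genuine capture: whatever is physically in front of the lens."""
    return f"camera_frame({scene})"

def presentation_attack() -> str:
    # The attacker shows a photo or mask TO the camera. The camera still
    # runs, so physical artifacts of the replica (print texture, screen
    # moiré) reach the verification system and can betray the attack.
    return capture_from_camera("printed_photo_of_victim")

def digital_injection_attack() -> str:
    # The attacker bypasses the camera feed entirely, injecting synthetic
    # imagery (e.g., a deepfake or face-swap stream) directly into the
    # pipeline. No physical capture occurs, so camera-based liveness cues
    # may be absent -- one reason these attacks are harder to detect.
    return "synthetic_frame(deepfake_of_victim)"

print(presentation_attack())       # camera_frame(printed_photo_of_victim)
print(digital_injection_attack())  # synthetic_frame(deepfake_of_victim)
```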
Commenting on the research, Andrew Bud, founder and chief executive of iProov, said:
“In 2020, we warned of the emerging threat of deepfakes being digitally injected into camera feeds to impersonate an individual’s biometric verification process. This report proves that deepfake attacks are now a reality. Even with advanced machine-learning computer vision, systems are struggling to keep up in detecting and triaging these evolving attacks. Any organization that isn’t protecting its system against these threats needs to do so urgently, especially in high-risk identity verification scenarios.”