
Swindler’s List: Online Fraudsters Easily Can Steal Your Biometrics

No human is identical to another. Surprisingly, that very uniqueness makes us vulnerable to biometric attacks. Let's see how they are orchestrated.

Real Body Snatchers

E4’s Misfits once farsightedly predicted that body hijacking could be possible. To orchestrate it, a hijacker would just need you to use some technological doohickey that you never let go of. The sci-fi plot has become reality, it seems.

Our biometrics are exclusively tied to our bodies. Paradoxically, that is what makes us unsafe in the face of biometric fraud. If somebody sets their mind to mimicking your fingerprints or the collagen-based patterns of your iris, they have a chance of succeeding.

Virtually all of your key biometrics are stored in your phone's memory or shared across countless social media platforms. In the context of antispoofing technology, we have four critical weaknesses: face, voice, fingerprints, and iris. Each one can turn into an Achilles heel at any moment.

Let’s check the key attack scenarios that perps mostly stick to.

Picture: Trojan veteran Achilles had one weakness in his body. We have at least four 

1. Face theft

Perhaps, it’s the easiest part to fabricate. Even if you don’t swamp your Instagram with hundreds of selfies, one or two videos featuring your visage or an ID card with your photo can provide ample source data for further machinations.

If the swindlers are lazy, untrained, or stingy, they will print your photo, or replay it from a mobile phone, hoping the verification system will believe it's you. Luckily, this attack method is old-fashioned: today's AI-powered security solutions will give them a disdainful smirk instead of authentication.
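To see why a printed photo fails modern checks, here is a toy sketch (not any real product's algorithm) of motion-based liveness detection: a flat printout barely changes between frames, while a live face shows micro-motion. The threshold, frame sizes, and synthetic data below are illustrative assumptions.

```python
import numpy as np

def liveness_score(frames):
    """Mean absolute pixel change across consecutive frames."""
    diffs = [np.abs(frames[i + 1] - frames[i]).mean()
             for i in range(len(frames) - 1)]
    return float(np.mean(diffs))

def is_live(frames, threshold=1.0):
    # A printed photo barely changes frame to frame; a live face shows
    # micro-motion (blinks, head sway). Threshold is illustrative.
    return liveness_score(frames) > threshold

# Synthetic demo: a "printed photo" replay is the same frame repeated;
# a "live face" has small random motion added to each frame.
rng = np.random.default_rng(0)
base = np.full((64, 64), 128.0)
photo_replay = [base] * 10
live_face = [base + rng.normal(0, 5, (64, 64)) for _ in range(10)]

print(is_live(photo_replay))  # False: static replay
print(is_live(live_face))     # True: frame-to-frame variation
```

Production systems rely on far richer cues (texture, depth, screen moiré, challenge-response), but the principle of looking for signs of life is the same.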

Creative con artists may craft a realistic silicone mask (like Kamenya Omote studio does) with your likeness via 3D printing. In some cases they apply extra makeup, glue a fake beard or mustache, put on a toupee and even attach a hidden heat emitter to give the mask a natural face temperature!   

Things get worse when deep learning makes a cameo. With an Artificial Neural Network (ANN) that includes components such as a motion estimator, fraudsters can sculpt a living mask of your face and wear it in real time during a live call or an injection attack.

An injection attack implies that perps have remotely hijacked your phone's video stream. Once they have, they can inject whatever visual data they want, including a puppeteered copy of your face. So never install mobile apps from dodgy places.

Also check: Common social media privacy and security concerns

2. Reprinted fingerprints

It’s fairly easy to steal a face. But how is it possible with fingers? People don’t usually photograph their finger pads for all to see. Well, that’s when digital crooks get creative.

First, they have social engineering in their arsenal. If your fingerprints are a key to some treasury of valuable data and whatnot, they won’t hesitate to approach you in real life. Hold a scotch glass for a moment or enter your number on a pretty stranger’s phone and voilà: your fingerprints are ready to be lifted.

But this theft can be orchestrated remotely too. The textbook tactic is to disguise a fake app as a trustworthy one. Once it's on your phone, you will be politely requested to tap the touchscreen here and there, especially in the region where the scanner is located.

When you do so, the app will capture your finger ridge geometry and send it over to the hacker's HQ. After that, a number of attack scenarios are possible: from making a gelatin-based copy of your finger pad to hosting an injection attack.
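For a sense of what that stolen ridge geometry is worth, here is a bare-bones sketch of minutiae matching, the classic way fingerprint templates are compared. Real matchers also align the prints and compare ridge angles; the point coordinates and pixel tolerance below are made up for illustration.

```python
import math

def minutiae_match_score(probe, gallery, dist_tol=10.0):
    """Greedy one-to-one matching of minutiae points (x, y).

    Returns the fraction of probe points that land within dist_tol
    pixels of a still-unmatched gallery point.
    """
    unmatched = list(gallery)
    hits = 0
    for p in probe:
        best, best_d = None, dist_tol
        for g in unmatched:
            d = math.dist(p, g)
            if d <= best_d:
                best, best_d = g, d
        if best is not None:
            unmatched.remove(best)  # each gallery point matches once
            hits += 1
    return hits / max(len(probe), 1)

stolen = [(10, 10), (40, 25), (70, 60)]      # captured via the fake app
same_finger = [(12, 9), (41, 27), (69, 58)]  # slight sensor jitter
other_finger = [(100, 100), (5, 90), (90, 5)]

print(minutiae_match_score(stolen, same_finger))   # 1.0: accepted
print(minutiae_match_score(stolen, other_finger))  # 0.0: rejected
```

The takeaway: a handful of (x, y) ridge points is enough to impersonate you against a matcher this naive, which is exactly why the captured coordinates are valuable loot.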

By the way, sometimes fraudsters don't even need your physical fingerprints. A high-class scheme involves recreating a person's fingerprints from a handful of photos of their hands. Sounds preposterous? Ursula von der Leyen doesn't think so.

3. Iris mimicking      

An iris sample can be obtained in a snap. If you have at least one clear photo of your face uploaded somewhere, your iris can be plagiarized. And even if you prefer to retain online anonymity, fraudsters can employ their IRL skills and snap a pic of you at point-blank range.

The rest is a matter of technology. The most canonical method is to craft a contact lens that meticulously copies the color and patterns formed by the melanin granules in your iris. If a verification system is primitive and detects only these patterns, ignoring the other physical properties of a living eye, the scoundrel will get a green light.

If a lens cannot be applied, a glass eye can enter the stage. Once used for cosmetic purposes by the likes of Black Bart, today this prosthesis can fool an iris scanner thanks to masterly painting and coloring. Ocular prosthetics have made a giant leap: it takes just a few hours to finalize a realistic fake eye from scratch.
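The pattern-only matching those spoofs exploit can be sketched with Daugman-style iris codes: binary templates compared by normalized Hamming distance, with a pair accepted when the distance falls below a threshold (around 0.32 in the published literature). The code length and noise level below are assumptions for the demo.

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of disagreeing bits between two binary iris codes.

    Two samples of the same iris disagree on only a few percent of
    bits (sensor noise); two unrelated irises disagree on ~50%.
    """
    a = np.asarray(code_a, dtype=bool)
    b = np.asarray(code_b, dtype=bool)
    return float(np.mean(a ^ b))

rng = np.random.default_rng(42)
enrolled = rng.integers(0, 2, 2048)            # stored template
same_eye = enrolled.copy()
flip = rng.choice(2048, size=100, replace=False)
same_eye[flip] ^= 1                            # ~5% sensor noise
stranger = rng.integers(0, 2, 2048)            # unrelated iris

print(hamming_distance(enrolled, same_eye) < 0.32)  # True: accepted
print(hamming_distance(enrolled, stranger) < 0.32)  # False: rejected
```

A painted lens or prosthetic that reproduces the enrolled bit pattern sails straight through a check like this, which is why serious scanners also probe pupil response, infrared reflectance, and other signs of a living eye.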

Picture: Peter Falk (Columbo) had a glass eye since he was three and no one ever seemed to notice

4. Voice hijacking

Your voice isn't safe either. What would have been a baffling concept just 10 years ago is a scary reality today: fraudsters can replicate your voice like a rose-ringed parakeet. Again, this is achievable through deep learning.

What's worse, voice-cloning tools are freely available online: Resemble, Descript, CyberVoice, and others. That means a potential criminal doesn't need to be a Python whiz kid or lock an AI engineer in their basement to steal your voice.

Anyone can do this, provided they have enough training material: voice messages, audio extracted from videos, or recorded phone calls. Currently, it's the number-one deepfake-related ploy used by scammers.

You can't see who you're talking to, but their voice is so bloody familiar! This is because AI has learned to pick up intonations, accents, timbre, and other quirks native to our speech. The end result is so realistic that AI even hosted Joe Rogan's podcast in his voice. (He wasn't fired, probably because it was still cheaper to pay him.) So watch, or rather listen, out: a voice that you know perfectly well can be a frightfully real replica.
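On the verification side, automatic speaker recognition typically reduces each utterance to an embedding vector and compares embeddings by cosine similarity. A good clone of your voice produces an embedding close to your enrolled one, which is exactly why naive speaker checks are fooled. The 192-dimensional vectors and the 0.7 threshold below are illustrative assumptions, not any specific model's values.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two speaker-embedding vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_speaker(emb_a, emb_b, threshold=0.7):
    # Threshold is illustrative; real systems tune it on labelled data.
    return cosine_similarity(emb_a, emb_b) >= threshold

# Stand-in embeddings: a clone trained on your recordings lands near
# your enrolled vector; an unrelated speaker does not.
rng = np.random.default_rng(1)
enrolled = rng.normal(0, 1, 192)
cloned_voice = enrolled + rng.normal(0, 0.1, 192)
stranger = rng.normal(0, 1, 192)

print(same_speaker(enrolled, cloned_voice))  # True: clone passes
print(same_speaker(enrolled, stranger))      # False
```

This is why anti-deepfake defenses can't stop at "does it sound like you" and instead hunt for synthesis artifacts in the audio itself.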

Don’t Believe What You See

The future has arrived, and AI excels at the human-imitating pantomime. But there's a way to prevent it: antispoofing can sabotage even the smartest scam tactics. Hurry to learn more!