Can a Deepfake be the Achilles Heel of iPhone Security?




Having a Face ID locking system on your phone is the safest way to keep it secure. Or so you think. Security researchers attending the Black Hat conference in Las Vegas in 2019 bypassed the iPhone's Face ID in just 120 seconds. All they needed was a pair of spectacles, some tape, and, of course, a sleeping or unconscious iPhone user. They pulled this off by exploiting a flaw in the liveness detection function of the biometric authentication system Apple uses to unlock the iPhone. It was a jaw-dropping moment at the conference. That said, breaking into phone security is nothing new.

There are many other instances of phone security being defeated with 3D-printed faces. In a series of 'pretend attacks', Sensity AI, a startup focused on tackling identity fraud, used scanned images of a person's face to break into his phone. And it worked. Now, with deepfakes turning the belief that 'seeing is believing' upside down, phones are not safe from unauthorized access.

Hold on. If you are an iPhone user, you can breathe a sigh of relief. Deepfakes do not fare well against the iPhone, thanks to the depth-sensing technology the iPhone uses, along with its camera, to read the face. Here is how it works: the iPhone's facial recognition system pairs an infrared dot projector, which covers the face with thousands of invisible points of light, with an infrared camera that reconstructs the 3D structure of the face. A deepfake played on a flat screen has no depth, so the 3D map gives it away. But remember, Face ID is still vulnerable to 3D masks, so do not leave your iPhone unattended.
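To see why a flat image fails, here is a toy sketch of a depth-based liveness check. Everything in it is invented for illustration (the function name, the threshold, the simulated depth values); it is not Apple's actual algorithm, only the general idea that a printed photo or screen yields an almost uniform depth map while a real face shows real relief.

```python
import statistics

# Hypothetical flatness threshold: a photo or screen held up to the
# sensor shows only millimetres of depth variation; a real face shows
# tens of millimetres (nose tip vs. cheeks vs. ears).
FLATNESS_THRESHOLD_MM = 5.0

def is_live_face(depth_map_mm):
    """Return True if the per-dot depth readings show real 3D relief.

    depth_map_mm: list of distances (in mm) from the camera to each
    projected infrared dot on the subject's face.
    """
    spread = statistics.pstdev(depth_map_mm)  # population std. deviation
    return spread > FLATNESS_THRESHOLD_MM

# Simulated readings for a flat photo: every dot ~300 mm away.
photo = [300.0 + 0.5 * (i % 3) for i in range(100)]

# Simulated readings for a real face: depths sweep between ~280 mm
# (nose) and ~310 mm (edges of the face).
face = [280.0 + 30.0 * abs((i % 20) - 10) / 10 for i in range(100)]

print(is_live_face(photo))  # False: the photo is too flat
print(is_live_face(face))   # True: genuine 3D structure
```

In this sketch the deepfake-on-a-screen case behaves exactly like the photo: convincing texture, no geometry, so the liveness check rejects it before face matching even runs.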
