By Chris Baraniuk
Technology of Business reporter

Published 2 days ago

Image caption: Facial recognition systems are being adapted to work when people are wearing masks (image copyright: B Hayes & NIST)

Like many people who began to wear a face mask during the pandemic, Hassan Ugail quickly noticed one or two technical niggles.

His iPhone started having trouble recognising his face, which is how he preferred to unlock the device when out and about.

"I kind of have to take my mask off," says Prof Ugail, an expert in facial recognition at the University of Bradford. "I would rather it let me in by just looking at my eyes."

Coincidentally, research he had conducted with one of his PhD students, published last year, showed that half a face was enough for a specially trained facial recognition algorithm to work.
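To give a rough sense of the idea (this is only an illustrative sketch, not the researchers' actual system), a half-face matcher might crop each image to the region a mask leaves visible and compare feature vectors from those crops. The embed() function below is a trivial placeholder standing in for a trained face-embedding model, and the image sizes are made up for the example.

```python
# Illustrative sketch: matching faces using only the upper half of the image,
# i.e. the eye/forehead region still visible when someone wears a mask.
import numpy as np

def upper_half(face: np.ndarray) -> np.ndarray:
    """Keep only the top half of an already-cropped face image."""
    return face[: face.shape[0] // 2]

def embed(image: np.ndarray) -> np.ndarray:
    """Placeholder feature extractor: flatten and L2-normalise the pixels.
    A real system would use a model trained on half-face crops instead."""
    v = image.astype(np.float32).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def match_score(face_a: np.ndarray, face_b: np.ndarray) -> float:
    """Cosine similarity between embeddings of the two upper-half crops."""
    a, b = embed(upper_half(face_a)), embed(upper_half(face_b))
    return float(np.dot(a, b))

if __name__ == "__main__":
    # Example with two random arrays standing in for aligned face crops.
    rng = np.random.default_rng(0)
    face1, face2 = rng.random((112, 112)), rng.random((112, 112))
    print(round(match_score(face1, face2), 3))
```

A deployed system would then accept or reject the match by comparing the score against a tuned threshold.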

But out in the wild, some commercial systems that authenticate people via their faces were now stuttering thanks to the rise of masks.

The problem was highlighted in a July report from the National Institute of Standards and Technology (NIST) in the US. Researchers found the error rates of 89 different facial recognition systems they tested increased when the mouth and nose or bottom half of an individual's face was obscured – in some cases from less than 1% to as much as 50%.