Facial Recognition Software is Adapting to Face Masks

Face-mask recognition is here. Let’s break down what it’s being used for, and what it could mean for personal privacy.

What Happened:

The advent of COVID-19 has dramatically increased the number of people wearing face masks. Initially, this threw off the traditional algorithms used in facial recognition software.

New detection methods, such as Apple’s iOS 13.5 update, can detect when someone is wearing a mask and prompt them for their passcode instead. In theory, the algorithm only needs to detect that it is viewing a face and recognize a mask, rather than employ facial recognition that identifies an individual.
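The distinction above — detecting a mask versus identifying a person — can be illustrated with a short sketch. This is a hypothetical `unlock_decision` helper written for illustration only; Apple has not published its actual implementation, and the function name and return values are assumptions:

```python
def unlock_decision(face_present: bool, mask_detected: bool, face_match: bool) -> str:
    """Sketch of a mask-aware unlock flow.

    If a mask is detected, the device can fall back to the passcode
    without ever running identification -- it only needs to know
    "this is a face" and "this face is masked."
    """
    if not face_present:
        # No face in view: do nothing.
        return "no_action"
    if mask_detected:
        # Mask detected: skip identification entirely, ask for passcode.
        return "prompt_passcode"
    # Unmasked face: only now does identification matter.
    return "unlock" if face_match else "prompt_passcode"
```

For example, a masked user is routed straight to the passcode prompt (`unlock_decision(True, True, False)` returns `"prompt_passcode"`), so no identity comparison is needed in that path.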

Why it Matters:

Pandemic Response

Many states have issued mask mandates in public spaces, especially when indoors or when social distancing is not possible, to reduce the spread of COVID-19. 

Companies creating face-mask recognition software, such as Tryolabs, claim that they hope their technology can help inform policy decisions and spread awareness about mask-wearing and other public health strategies. 

Limited Data Privacy Regulation

Combining facial recognition with face-mask detection technologies would further expand their reach, making it easier to compile databases of faces and biometric information.

The U.S. lacks comprehensive regulation of data privacy and the use of facial recognition software. Most laws focus on consumer protection, such as the Federal Trade Commission Act’s stipulations for privacy policies and data security, and the Fair Credit Reporting Act.

Only a handful of cities have banned or restricted facial recognition in their communities, and of those, only Portland, Oregon, has restricted its use by private businesses; the others have only barred public agencies from using it.

Racial Bias

Facial recognition software has already demonstrated gender and racial bias, and these risks apply to face-mask recognition too. Studies have shown that facial recognition can be up to 34 percent less accurate for black women than for white men.

Mistakes from facial recognition have real-world consequences, including two separate wrongful arrests of black men in Detroit. 

The first, Robert Williams, was charged and held overnight for shoplifting based solely on a facial recognition match run on fuzzy security footage. The second, Michael Oliver, was charged with stealing a phone from a teacher’s vehicle, a crime attributed to a high school student, despite being 25 years old.

Combining this bias with the technology’s use against protestors presents an even greater risk of police harassment and wrongful arrests.
