After reviewing 189 facial recognition algorithms from 99 developers, which NIST said represent a majority of the industry, the researchers found that in one-to-one matching, the kind typically used to verify identity, Asian and African American people were up to 100 times more likely to be misidentified than white men.
In one-to-many matching, which law enforcement uses to identify people of interest, images of African American women produced higher rates of false positives than those of any other group.
Government study finds racial, gender bias in facial recognition software | TheHill
hmmmm