The facial recognition software used by police isn't always accurate, particularly if you happen to be non-white, and especially if you are African-American. But the bigger surprise is that no one really knows how pervasive such problems are, because few of these systems are ever tested for bias.