Facial recognition software tends to be less accurate for—or biased against—Black and African American faces, according to research funded by the Society for Industrial/Organizational Psychology Foundation. Such bias could distort automatic interview scores, creating systematic disadvantages for racial minorities if organizations adopt these algorithms.
Source: News feed 2