Microsoft still has a contract with ICE to use its facial recognition technology
Microsoft says its facial recognition tools are getting better at identifying people with darker skin tones than before, according to a company blog post today. The error rates have been reduced by as much as 20 times for men and women with darker skin and by nine times for all women.
The company says it’s been training its AI tools with larger and more diverse datasets, which has led to the progress. “If we are training machine learning systems to mimic decisions made in a biased society, using data generated by that society, then those systems will necessarily reproduce its biases,” said Hanna Wallach, a Microsoft senior researcher, in the blog post.
In February, a report from MIT’s Media Lab tested the facial recognition systems from Microsoft, IBM, and China’s Megvii and found that up to 35 percent of darker-skinned women had their gender misidentified by the systems. The report only confirmed what many had suspected for years — facial recognition systems can be biased by limited datasets and other factors like systemic racism. Back in 2015, Google identified a software engineer’s black friends in a photo as “gorillas,” and had to apologize for the error.
Still, while Microsoft’s announcement today indicates a reduction in racial bias within its facial recognition system, if law enforcement gets its hands on the improved tool, it’s unclear how that might play out for people of color, or whether it could exacerbate already problematic practices. Microsoft partners with US Immigration and Customs Enforcement (ICE), and its facial recognition tool is offered to government agents as a resource. Microsoft CEO Satya Nadella clarified in a memo last week that “Microsoft is not working with the U.S. government on any projects related to separating children from their families at the border,” but he didn’t address how facial recognition might play a role in benefiting ICE’s work.