The US Army is developing AI that can identify faces in the dark and through walls
The US Army is developing a machine learning method for identifying faces from thermal imagery. Soon the American government will be able to film people from outside of buildings, using cameras that can see through walls in near-total darkness, and an AI will recognize the people in the images.
Army Research Laboratory (ARL) scientists Benjamin S. Riggan, Nathaniel J. Short, and Shuowen Hu recently released a white paper detailing military efforts to develop a method for applying facial recognition technology to images taken using thermal imaging devices.
According to Riggan:
When using thermal cameras to capture facial imagery, the main challenge is that the captured thermal image must be matched against a watch list or gallery that only contains conventional visible imagery from known persons of interest.
Such devices are common, especially in military use. Aircraft such as the Apache helicopter and ground vehicles like Armored Personnel Carriers are equipped with thermal imaging cameras to detect people in low-visibility situations.
It’s relatively inexpensive to deploy surveillance cameras or handheld devices capable of thermal imaging, and the AI being developed by Army researchers doesn’t have to be baked into the camera; it can review post-mission footage.
With this new technology, the Army can feed thermal imagery to a neural network that can determine not just whether there are people in a given area, but who those people are.
The project is still relatively new, but the early indications are positive. Right now the scientists can match faces in thermal images against a relatively small database. With further development, though, the system could be expanded to conduct real-time facial recognition in the field, through barriers that would otherwise block vision. And it could match the unique facial features of anyone a thermal camera can see to images in any database the US government has access to.
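The matching step the researchers describe is a cross-spectrum gallery search: a probe embedding derived from a thermal image is compared against embeddings of conventional visible photos of known persons of interest. The paper doesn't publish its code, so the sketch below is purely illustrative; the `match_against_gallery` function, the toy embeddings, and the similarity threshold are all hypothetical stand-ins, assuming only that some model maps thermal and visible faces into a shared embedding space.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(probe_embedding, gallery, threshold=0.7):
    """Return (identity, score) for the gallery entry most similar to the
    probe, or (None, score) if no score clears the threshold.

    gallery maps identity name -> embedding of a *visible-light* photo;
    the probe is assumed to be an embedding of a *thermal* image, produced
    by a (hypothetical) model trained to place both modalities of the same
    face near each other.
    """
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine_similarity(probe_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score

# Toy example with made-up 3-D "embeddings":
gallery = {
    "person_a": np.array([1.0, 0.1, 0.0]),
    "person_b": np.array([0.0, 1.0, 0.2]),
}
probe = np.array([0.9, 0.2, 0.05])  # stand-in for a thermal-image embedding
print(match_against_gallery(probe, gallery))
```

The threshold is what separates "identifying a known person" from merely finding the least-bad match, which matters when the gallery grows from a small watch list to any database the government holds.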
The Army public affairs office indicates this technology is being developed to aid in battlefield detection and to assist soldiers in identifying enemy VIPs or individuals on government watch lists. That all sounds quite important, but there’s plenty of reason for concern.
In the US, the state of science and technology regulation is largely void of leadership. The White House is still without a science adviser at the time of this writing.
Technology, however, waits for no administration.
Thanks to Apple, as we predicted last year, facial recognition tech has exploded into use around the globe. The Cupertino company cleared the path to social acceptance with its purportedly harmless Face ID feature last year. And once all the cool Apple fans started using it, the rest of us had no choice but to live with facial recognition software in public.
And for what it’s worth, they accept it in China. People seem content to eschew personal privacy in exchange for the peace of mind that comes with knowing petty criminals who’ve committed financial crimes are off the streets.
A continuing theme in the field of artificial intelligence research seems to be the dichotomy between personal privacy and living in a surveillance state. Are we willing to trust that the US military won’t use this new facial recognition technology against its own citizens?