It’s all too easy to pick up false positives
This week, the South China Morning Post reported that Chinese companies are “mining data directly from workers’ brains” using wireless sensors in hats. The article is full of exciting quotes about employers using technology to monitor their workers’ emotions, but the reality is that these hats probably don’t work very well.
The report is low on details, so the claims should be taken with a grain of salt. Allegedly, workers wear safety hats or uniform hats that have wireless sensors inside. These sensors pick up brain activity to send to an algorithm. The algorithm then interprets the data into various emotional states — for example, depression, anxiety, and rage — so managers can use the information to better plan break times and help everyone be efficient. Hangzhou Zhongheng Electric has been using it on its 40,000 workers since 2014 and has boosted profits by $315 million since then, according to one official who then refused to say more.
If true, Hangzhou Zhongheng Electric’s program raises tricky ethical questions about how much privacy employees should have. But even if the company were mandating these hats, they’re probably not doing much, because the technology for advanced “emotional surveillance” isn’t here yet. It’s not for lack of trying; researchers, startups, and the US military have long worked on similar projects. But big obstacles remain, and these sensors and gadgets can’t accurately “read minds.”
The Chinese program reportedly uses the sensors to record the brain’s electrical signals, a technique called electroencephalography (EEG). In EEG, electrodes on the scalp pick up the summed electrical activity of the neurons underneath. That activity can reveal patterns and flag when something is abnormal, but there are plenty of limitations.
First and foremost, we still don’t know how to perfectly record brain signals, says Barry Giesbrecht, a professor of psychology at the University of California at Santa Barbara and director of its Attention Lab. “EEG sensors are not only sensitive to brain activity, but any kind of electrical activity,” says Giesbrecht. So blinking or clenching your jaw could lead to a false positive, as could movement and sweat.
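The article doesn’t describe how the hats process their signals, but a toy sketch illustrates why blinks cause false positives: a blink produces a brief electrical deflection far larger than ordinary brain activity, so any system reading raw amplitude will mistake it for a dramatic “brain event.” Everything below is synthetic and the numbers are illustrative, not from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250  # sampling rate in Hz (a common EEG rate)

# One second of background "brain" signal: low-amplitude noise (microvolts).
eeg = rng.normal(0, 10, fs)

# Superimpose a blink-like artifact: a brief, large deflection around 0.5 s.
t = np.arange(fs) / fs
eeg += 150 * np.exp(-((t - 0.5) ** 2) / (2 * 0.02 ** 2))

# Flag samples whose amplitude exceeds a threshold well above normal EEG.
threshold = 75  # microvolts; illustrative cutoff
artifact_mask = np.abs(eeg) > threshold

print("samples flagged as artifact:", artifact_mask.sum())
```

The point is that the blink dwarfs the actual neural signal; without a cleanup step, it would dominate whatever the downstream algorithm sees.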
In experiments, researchers have their subjects blink and make small movements so they can teach the device not to count those signals as brain signals. But in a real-world setting, and with thousands of workers, this type of calibration would be much harder to do. Plus, while medical EEG uses “wet” sensors applied with a gel, a device like the Chinese hat uses dry sensors, and dry sensors are more likely to pick up noise.
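The lab calibration described above can be sketched in a few lines: record epochs while the subject sits still and while they blink on cue, then learn an amplitude threshold that separates the two. This is a deliberately simplified stand-in for real calibration pipelines, again on synthetic data with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_epochs = 250, 50
t = np.arange(fs) / fs
blink_shape = 150 * np.exp(-((t - 0.5) ** 2) / (2 * 0.02 ** 2))

# Calibration recordings: the subject sits still, then blinks on cue.
clean_epochs = rng.normal(0, 10, (n_epochs, fs))
blink_epochs = rng.normal(0, 10, (n_epochs, fs)) + blink_shape

# Summarize each one-second epoch by its peak absolute amplitude (uV).
clean_peaks = np.abs(clean_epochs).max(axis=1)
blink_peaks = np.abs(blink_epochs).max(axis=1)

# Put the rejection threshold halfway between the two distributions.
threshold = (clean_peaks.max() + blink_peaks.min()) / 2
print(f"learned threshold: {threshold:.1f} uV")
```

This works in a controlled session with one cooperative subject; doing it for thousands of workers, each with different blink shapes, sweat levels, and hat fits, is the scaling problem Giesbrecht is pointing at.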
Second, the algorithm that interprets the data might not be very good. (It’s hard to know because, again, the article is low on details.) And finally, while EEG can tell us whether someone is awake or asleep, complex emotional states like depression and anxiety are another story. We don’t yet have a sophisticated enough understanding of which patterns of brain activity match which emotional states, adds Giesbrecht. It’s not far-fetched to believe we’ll figure this out one day; after all, changes in emotion do show up as changes in the EEG. But until then, those changes are hard to pick up and interpret.
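For a sense of why awake-versus-asleep is the easy case: deep sleep shows up as slow delta waves (roughly 0.5–4 Hz), while relaxed wakefulness shows faster alpha rhythms (8–12 Hz), so telling them apart is a simple band-power comparison. No comparably clean frequency signature exists for “depression” or “rage.” A toy sketch on synthetic signals (all parameters illustrative):

```python
import numpy as np

fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)

# Two synthetic signals: "awake" dominated by a 10 Hz alpha rhythm,
# "asleep" dominated by 2 Hz delta waves (a hallmark of deep sleep).
awake = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
asleep = 2 * np.sin(2 * np.pi * 2 * t) + 0.3 * rng.normal(size=t.size)

def band_power(x, lo, hi):
    """Total spectral power in the [lo, hi] Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[(freqs >= lo) & (freqs <= hi)].sum()

for name, sig in [("awake", awake), ("asleep", asleep)]:
    delta = band_power(sig, 0.5, 4)
    alpha = band_power(sig, 8, 12)
    print(name, "->", "asleep" if delta > alpha else "awake")
```

A two-band comparison like this is decades old and robust; mapping spectra onto “anxiety” or “depression” is an open research problem, which is the gap between what the hats can plausibly measure and what the article claims.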
So though the article is full of words like “emotional surveillance program” and “mental keyboard,” for now it makes more sense to be concerned about the ethical question of employers using it en masse than to worry about employers somehow reading minds.