This article is from wired.com. Original URL: https://www.wired.com/story/this-startups-test-shows-how-harassment-targets-women-online/
Julia Enthoven didn’t think much of using her real name and photo in a chat feature on Kapwing, the website she cofounded last year. The site launched its online video-editing tools in October and has garnered 64,000 visits since. From the beginning, Enthoven’s team wanted feedback from users about bugs and feature requests, so they deployed a messaging widget from a company called Drift. Anyone visiting Kapwing’s website saw a chat box on the bottom corner of the page. If they clicked it, a message from Julia, alongside a picture of her face, popped up, encouraging them to ask questions and give feedback.
Almost immediately, the chat function became a vehicle for abuse. Enthoven, who spent two years as a product manager at Google before starting Kapwing, says that around twice a day, someone would respond with rude comments (aggressive threats or name calling), heckling and harassment (sexual jokes, asking her out, suggestive photos and emojis, or comments on her looks), or trolling (offensive and sarcastic internet speak). For weeks, she tried to ignore the hate, alternating, she wrote, “between my social media feeds full of #MeToo statuses and the Drift app full of heckling.” That worked until she hit a breaking point one morning, when seven rude messages arrived before she got to work.
After that, Enthoven launched an experiment. She periodically changed the name and avatar for the messaging widget. For three months, she tracked the rate of harassment on 2,100 customer-service messages and saw firsthand what many larger, less personal studies have shown: There’s a pattern of who gets harassed the most online, with women receiving by far the most abuse. Enthoven found that the surest way to avoid harassment online is to be a man. If that’s not possible, be an androgynous cat.
The experiment also highlighted a tricky issue for young startups, where every user counts. Blocking the small group of bad actors could discourage essential feedback and bug reporting. Further, it’s hard to create or enforce community norms favoring civil discourse in private, anonymous customer-service messages.
Enthoven began the experiment by swapping out her photo for an image of her cofounder, Eric Lu. She was surprised to see harassment drop to nearly zero. “Maybe I knew this in the back of my head, but it was still shocking to me that the effect was so dramatic,” she says. Over the course of a week, only one user sent a rude comment.
After that, Enthoven changed the chat box’s identity to a blonde model named “Rachel Gray.” In less than an hour, the harassment resumed and continued for three weeks at a rate 50 percent higher than the level Enthoven’s own photo had attracted. “People asked her to go on dates, demanded that she share nude photos, and pleaded for all kinds of sexual favors,” Enthoven wrote on her blog. “People called her names, cursed her out, asked her where she lived, and threatened her and the website. Mean, lewd messages came in from all over the world.” Enthoven isn’t sure why Rachel garnered such heightened harassment, but speculates that it could be because her photo looks more casual and less professional.
After Rachel, Enthoven changed the image to the company’s cartoon cat logo with a generic name, Team Kapwing. The harassing messages went away, and the logo has remained in place ever since. It’s not an ideal solution, but as a startup founder struggling to grow her fledgling company, she’d rather take the easiest route. Kapwing benefits too much from the helpful feedback—which makes up the majority of messages—to remove the chat feature because of a hateful minority. Banning users who send hateful messages doesn’t seem viable at this point, she notes. “In some ways, it makes me sad that it’s harder if I represented myself online, but I also think [using the cat image] is just one easy way to get around it,” she says.
Enthoven says Drift, the company behind the customer-service-chat technology, spoke to her about potential changes to its product that could help companies block harassment and abuse. A Drift representative did not respond to a request for comment.
Enthoven notes that because Kapwing started as a site to create memes, a sizable portion of its users are teenage boys who spend their days posting under pseudonyms on Reddit and Twitter. Where mischievous kids may once have stuffed a fast-food restaurant’s comment box with inappropriate comment cards, they can now send hateful comments directly to a worker’s cell phone at an unprecedented scale. “Teenage boys are not that mature and they probably have a lot of this kind of talk in their lives, not that it excuses this behavior,” Enthoven says. She says the experiment confirmed what she already knew about the harassment of women online, but felt shocking all the same.
The problem, while not insurmountable, is just one of the “1,000 cuts” many women working in tech describe. “The path to success is slightly steeper and it’s a little discouraging to me for sure,” Enthoven says. But blogging about the situation encouraged her. Afterward, a number of women used Kapwing’s chat box to share their own experiences of customer-service harassment, telling her they felt relieved to know they weren’t alone.
Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace, says Enthoven’s experience lines up with research on both abuse aimed at call-center workers and on online misogyny and the harassment of women. The combination of user anonymity—which creates a lack of accountability for one’s actions—and the distance created by screens heightens the conditions for abuse, Citron says. “You can’t see their face or their expressions and you can’t internalize how they’re feeling, so empathy is somewhat out of the calculus,” she says.
Further, since so few online harassers have been prosecuted, Citron says it’s been difficult to study their motivations. In the case of Kapwing’s attractive model, “Rachel Gray,” Citron says the increased abuse likely stems from resentment. “The woman who would never sleep with them, they think, ‘I’m going to reduce you to nothing and treat you like an object,’” she says. “That’s really what online abuse does.”