
Google still has no answer for YouTube's biggest problem


At a House Judiciary Committee hearing today, Sundar Pichai dodged questions about YouTube’s ongoing moderation problem

This article is from theverge.com. The original can be found at: https://www.theverge.com/2018/12/11/18136525/google-ceo-sundar-pichai-youtube-moderation-hearing-house-judiciary-committee


From Pizzagate to QAnon, YouTube has a serious problem with conspiracy theories. The basic moderation problem has splintered into a number of different scandals over the past two years, including disturbing children’s content, terrorism videos, white supremacy dog whistling, and radicalization via YouTube’s algorithm. But when confronted on those issues at a House Judiciary hearing today, Pichai offered the same response that YouTube CEO Susan Wojcicki has offered in the past: there is no immediate cure.

The most vigorous questions came from Rep. Jamie Raskin (D-MD), who confronted Pichai over a Washington Post report on conspiracy videos that plague YouTube. These videos, which he summarized as “videos claiming politicians, celebrities, and other elite figures were sexually abusing or consuming the remains of children,” are part of a conspiracy theory that suggests Hillary Clinton is killing young girls in satanic rituals.

“Is your basic position that this is something you want to try to do something about,” Raskin asked, “but there’s just an avalanche of such material, and there’s nothing really to be done, so it should just be buyer beware when you go on YouTube?”

Pichai didn’t endorse that position exactly, but he didn’t give much reason to expect improvement either. “This is an area we acknowledge there’s more work to be done,” the Google CEO told Raskin. “We have to look at it on a video-by-video basis, and we have clearly stated policies, so we’d have to look at whether a specific video violates those policies.”

Pichai added that YouTube and Google have “clear policies against hate speech,” which cover “things that could inspire harm or violence,” but he acknowledged that it isn’t enough.

“With our growth comes more responsibility, and we’re committed to doing better as we invest more in that area,” Pichai said.

YouTube has made some changes to address misinformation, although results have been mixed. The platform started adding “authoritative” context to its search results for breaking news stories earlier this year. This allows news organizations like CNN and The New York Times to populate first when people search for information on a major news event, rather than conspiracy theories. A YouTube representative told The Verge in late October that fighting misinformation was key to the team’s work, but finding the right balance between different voices and credible news sources was also important. The platform launched information panels in July, for example, which the team hopes will help people make their own judgments about the information they see in videos they watch.

But those efforts don’t stop hateful content from appearing on the platform. Flagged videos, which account for a large percentage of removed content according to Pichai, are then reviewed by an algorithm or a human moderator. If the videos violate the company’s community guidelines and terms of service, they’re removed. If creators feel that their video was removed erroneously, they can appeal to YouTube’s trust and safety team.

Like Facebook, YouTube relies heavily on machine learning algorithms to sift through the billions of videos on the platform. “Our goal is to be able to give you the most relevant information,” Wojcicki said in October. “We want those top results to be right.”

But many videos still slip past the watchful eyes of YouTube’s trust and safety team. One video, uploaded by creator E;R two years ago, has come under scrutiny recently after the creator’s channel was given a shoutout by PewDiePie. The video, which PewDiePie said he didn’t watch, includes four uninterrupted minutes of a speech by Adolf Hitler along with other anti-Semitic content. It was up for two years before YouTube removed it, and that only happened thanks to reporting from major outlets.

Pichai’s comments acknowledge that YouTube has an issue on its platform that can’t be ignored. His testimony also concedes that, for now, the company can only keep investing in the area and hope to figure out a solution. There isn’t one just yet.

“We do strive to do it for the volume of content we get,” Pichai said. “We get around 400 hours of video every minute, but it’s our responsibility to make sure YouTube is a platform for free expression, but it’s also responsible and contributes positively to society.”

As it stands, YouTube’s community guidelines regarding hate speech and hateful content are somewhat vague. Videos that promote “violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes” violate the company’s terms of service, but that doesn’t include videos that pose a question. A video called “What is Frazzled.rip?” or “What is QAnon?” would be allowed on the platform, even though the recommended videos just off to the side create a dangerous rabbit hole for people to fall into.

Understanding how to fix this problem means understanding how people are using — and abusing — the system. It’s something that neither Pichai nor Wojcicki has an answer to right now.
