YouTube Enlists ‘Trusted Flaggers’ to Police Videos: Paid Flaggers Who Can Take Away Your Free Speech If They Don’t Like You
With 100 hours of video uploaded to YouTube every minute, it’s impossible for the site’s employees to keep tabs on the mass of content continuously pouring in. While most of it is innocuous enough, some prohibited material slips through the net, including pornography, gratuitous violence, and abuse of various forms.

In a bid to catch such material more quickly, Google-owned YouTube has enlisted around 200 individuals and organizations to flag any material they deem to be in contravention of the video-sharing site’s guidelines, the Wall Street Journal reported on Monday.

A person with knowledge of the matter told the Journal that most of those in the “flagger program” are individuals, though some are said to be “government agencies or non-governmental organizations such as anti-hate and child-safety groups.”
While the site already allows users to report videos containing possibly suspect content, it’s likely the material highlighted by those in the flagger program is fast-tracked to the YouTube team for evaluation. In addition, the Web giant has reportedly set up the system so that the flaggers can highlight content “at scale,” instead of selecting one video at a time.

UK government flaggers

The Journal’s report comes a few days after the Financial Times said Google had already given a number of UK security officials “super flagger” powers in an effort to contain the proliferation of jihadist material prompted by the war in Syria, a move likely to stir concern among civil liberties campaigners.

Google confirmed to the FT that a UK government agency is indeed working to search for particular types of material, with a government spokesperson adding that it was looking for content that might violate the country’s Terrorism Act.

Commenting on the system, a spokesperson for YouTube said the site has a “zero-tolerance policy … towards content that incites violence,” adding, “Our community guidelines prohibit such content and our review teams respond to flagged videos around the clock, routinely removing videos that contain hate speech or incitement to commit violent acts.”


Google was keen to point out that the final decision on whether a video is removed rests with the company alone.

“Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong,” a spokesperson for the Mountain View company told the Journal.

=================================

YouTube Enlists ‘Trusted Flaggers’ to Police Videos


Google has given roughly 200 people and organizations, including a British police unit, the ability to “flag” up to 20 YouTube videos at once to be reviewed for violating the site’s guidelines.

The Financial Times last week reported that the U.K. Metropolitan Police’s Counter Terrorism Internet Referral Unit has been using its “super flagger” authority to seek reviews – and removal – of videos it considers extremist.

The news sparked concern that Google lets the U.K. government censor videos that it doesn’t like, and prompted Google to disclose more details about the program. Any user can ask for a video to be reviewed. Participants in the super flagger program, begun as a pilot in 2012, can seek reviews of 20 videos at once.

A person familiar with the program said the vast majority of the 200 participants in the super flagger program are individuals who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups, the person added.

In either case, Google said it decides which videos are removed from YouTube. “Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong,” a Google spokesman said.

Google’s guidelines prohibit videos that incite people to commit violence, or that show animal abuse, drug abuse, under-age drinking or bomb making, among other topics. Google maintains a separate system to monitor for copyright infringement.

The news about the super flagger program comes as some governments pressure social-media sites that they blame for civil unrest. In Turkey, Prime Minister Tayyip Erdogan threatened this month to ban Facebook and YouTube because they “encourage every kind of immorality and espionage for their own ends.”


British officials say they use the program to refer videos to YouTube that they believe have violated the U.K.’s Terrorism Act. These are then prioritized by YouTube, according to Sarah Buxton, a spokeswoman at the U.K. Home Office.

“YouTube may choose to remove legal extremist content if it breaches their terms and conditions,” she added.

Google was not pressured to let the U.K.’s counter-terrorism unit into the program, the person familiar with the program explained. Instead, the government agency showed an interest in YouTube’s guidelines and spotted videos that violated the rules, the person added.

More than 90% of the videos identified by super flaggers are either removed for violating guidelines or restricted as not appropriate for younger users, the person familiar with the program said. That’s a far higher success rate than for regular users who occasionally flag dubious content, the person said.
