
YouTube removed 8M videos in 3 months, with machines doing most of the work


Anyone can upload anything to YouTube, but the company’s computers are getting better at weeding out objectionable content. On Monday, YouTube revealed that it removed 8.3 million videos in the last three months of 2017, with its machine-based systems doing most of the work.
YouTube has been having a torrid time of late, with a number of high-profile brands pulling their ads from the streaming service after discovering that some of them were being run alongside extremist content.
To reassure advertisers and deter the interest of regulators, YouTube recently decided to begin posting a quarterly Community Guidelines Enforcement Report highlighting its efforts to purge the site of content that breaches its terms of service.
The first of these reports, posted on Monday, reveals that the Google-owned company wiped 8.3 million videos from its servers between October and December 2017. YouTube said the majority of the videos were spam or contained sexual content. Others featured abusive, violent, or terrorist-related material.
The data, which doesn’t include content deleted for copyright or legal reasons, shows that YouTube’s automated tools are now doing most of the work, accounting for the majority of removals of unsuitable videos. Interestingly, YouTube noted that of the 6.7 million videos first flagged by its machine-based technology, 76 percent were removed before they received a single view.
The report also highlighted how its technology is helping to speed up identification and removal of unsuitable content: “At the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. We introduced machine-learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views.”
Humans still play a role in keeping the service free of objectionable content. Just over a million of the deleted videos were flagged by individual trusted flaggers, while regular YouTube users flagged another 400,000. A small number of videos were flagged by non-governmental organizations and government agencies. Flagged videos are not automatically deleted, however: some will be deemed OK by YouTube’s review system, while others will be slapped with an age-restriction notice.
YouTube also employs its own human reviewers who look at suspect content passed on by its machine-based system. The company is working to create a team of 10,000 reviewers by the end of 2018, and is also hiring full-time specialists with expertise in violent extremism, counterterrorism, and human rights. Regional expert teams are also being expanded, the company said.
The number of videos removed by YouTube in just three months may surprise some, though it’s worth remembering that more than 400 hours of content are uploaded to the site every minute.
YouTube clearly still faces many challenges in cleaning up its service, but it insists it’s committed to ensuring it “remains a vibrant community with strong systems to remove violative content,” adding that future reports should demonstrate ongoing improvements regarding its procedures and technology for getting rid of unsuitable material.
