The Jakarta Post



YouTube removed over 8 million videos in just 3 months

News Desk (Agence France-Presse)
Thu, April 26, 2018


More than half of all “violent extremism” videos have fewer than 10 views, whereas at the beginning of 2017 that number was eight percent. (Shutterstock/Alexey Boldin)

YouTube has released a transparency report showing a high volume of inappropriate content being uploaded; however, automated flagging is speeding up the removal process.

It’s easy for the internet to get cluttered with spam and inappropriate content, which means major clean-up work for big internet companies that receive massive amounts of uploads and traffic. YouTube is one of them: the company removed eight million videos in three months.

Seeking more transparency and less spam, Google, which purchased YouTube in 2006, has published an update regarding the ongoing removal of content that violates its policy. The company has released astonishing figures, along with a quarterly report on how Community Guidelines are being enforced.

The eight million videos removed from the popular video-sharing platform were “mostly spam or people attempting to upload adult content,” according to Google, “and represent a fraction of a percent of YouTube’s total views during this time period.”

Read also: Stimulate your mind through educational YouTube videos

Machines were the first to flag 6.7 million of the videos, and of those, 76 percent were removed before they received a single view. Automated flagging allows the company to act at scale, and it says the technology is paying off in faster removals both in high-risk, low-volume areas (such as violent extremism) and in high-volume areas (such as spam).

More than half of all “violent extremism” videos have fewer than 10 views when removed, whereas at the beginning of 2017 that figure was eight percent.

Although the deployment of machines may suggest a lesser need for humans, that has not been the case at YouTube. Its systems still rely on human review, and the company has been busy hiring.

“At YouTube, we’ve staffed the majority of additional roles needed to reach our contribution to meeting that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams,” stated the company’s official blog.

As for this year’s goals, the company is committed to bringing the total number of people working on addressing violent content across Google to 10,000. It also plans to refine its reporting systems and add further data, including data on comments, speed of removal and reasons for policy removals.

For anyone interested in reviewing the numbers, here is the transparency report.
