CNET -------- YouTube took down 8.3 million videos in the last three months of 2017, responding to criticism that it is slow to address inappropriate content on its site. The majority of those videos were spam or attempts to upload adult content, the video-sharing site revealed Monday in its first quarterly moderation report. More than 80 percent of the removed videos were identified by machines rather than humans, highlighting the company's growing dependence on machine learning to cut down on content that violates its policies. Click on the picture below to read the full article.