YouTube has for the first time published statistics about the total number of videos removed from its website, highlighting its continuing struggle to remove inappropriate content quickly enough to satisfy advertisers and regulators.
The video-sharing platform has faced a backlash over the past year following controversies over the publication of terrorist content, hate speech and sexually explicit videos of children — frequently alongside advertisements from well-known brands.
Figures published on Tuesday showed that the Google-owned group took down 8.3m such videos in the three months to December 2017. More than 80 per cent were flagged by machines rather than humans, underscoring the group's growing reliance on machine-learning algorithms to identify inappropriate content.