YouTube's first quarterly "enforcement report" reveals the platform deleted 8.3 million videos between October and December 2017 for breaching its community guidelines.

The majority of those videos were spam or attempts to upload adult content, and they represent a fraction of a percent of YouTube's total views during that period.

Of the removed videos, 6.7 million were first flagged for review by machines rather than humans, YouTube said. Of those, 76 percent were removed before receiving a single view.

Most human flags came from India, the US, and Brazil.

YouTube said that at the beginning of 2017, only 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. After the company introduced machine-learning flagging in June 2017, that figure rose: more than half of the videos YouTube now removes for violent extremism have fewer than 10 views.

Last year YouTube committed to bringing the total number of people working to address violative content across Google to 10,000 by the end of 2018. The company has also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and has expanded its regional expert teams.