Google promises 10,000 staff to detect and eliminate extremist content
Google has announced that it will assign more than 10,000 staff to detecting and deleting extremist content on YouTube in 2018.
YouTube CEO Susan Wojcicki acknowledged that the site has a darker side and can be used to spread harassment, hate and harm: "There can be another, more troubling, side of YouTube's openness. I've seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm."
In September 2017, UK prime minister Theresa May called on tech firms to take down all terrorist material and end "safe spaces" for terrorists online. Wojcicki added that, since June, more than 150,000 videos with extremist content had been removed from the site and nearly two million videos reviewed.
The site uses human moderators, but it is also developing machine learning technology to automatically detect violent or inappropriate content. Wojcicki said: "Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content."
She added: "We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018."
Back in June, the site released a four-step strategy to detect and eliminate this type of content from YouTube:
- Expanding the use of machine learning to detect these videos automatically and more efficiently.
- Working with expert groups such as the Anti-Defamation League.
- Taking a tougher stance on videos that could be violent or that contain hate speech.
- Redirecting people who search for extremist videos to ready-made playlists that send the opposite message and discredit the extremists' opinions.
Google has also been coming down hard on hateful online comments, working to shut down offending profiles and protect children from predators and bullies.