Alphabet Inc.'s Google will implement more measures to identify and remove terrorist or violent extremist content on its video-sharing platform YouTube, Kent Walker, Google's general counsel, said in a blog post on Sunday.
"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," Walker wrote.
Google plans to increase its use of technology that identifies extremist and terrorism-related videos, and is increasing the number of independent experts participating in YouTube's Trusted Flagger program.
The company also announced it is taking a "tougher stance" on videos that don't clearly violate its policies, and will make it more difficult for users to find videos containing "inflammatory religious or supremacist content."
As a final step, YouTube is working with Jigsaw to implement the "Redirect Method" across Europe, which redirects potential Islamic State recruits toward anti-terrorist videos in an effort to dissuade them from joining the terrorist group.
Google already has thousands of people worldwide who "review and counter abuse" on its platforms, and engineers have developed technology to prevent re-uploads of known terrorist content. The platform has also invested in systems that flag new videos that should be removed, and developed partnerships with various stakeholders, including expert groups and counter-extremism agencies, to "inform and strengthen our efforts."