Last month, Google announced the steps it would take to help stamp out extremism and terrorism-related content online.
Today, the company is announcing a new initiative on YouTube to help guide people away from terrorism propaganda videos and steer them towards content that debunks extremist messaging and mythology.
It’s appropriately called the Redirect Method, because it essentially redirects users searching for specific keywords on YouTube to playlists featuring videos that counter extremist content.
Google’s Jigsaw team has been working with Moonshot CVE, a company that works with clients to counter “violent extremism,” to develop the Redirect Method.
Moonshot CVE conducted extensive research into how extremist groups use the internet, and technology more broadly, to spread their messaging before determining which tools would make sense on YouTube.
Google called this an “early” release of the Redirect Method and said it would continue to add more features going forward. Specifically, it wants YouTube to understand more search terms in languages besides English and use machine learning to automatically and dynamically update the keywords in its list.
The company also plans to work with "expert" non-governmental organizations (NGOs) to develop more videos countering extremist messaging, aimed at people at different stages of the radicalization process.
It sounds like people who’ve gone deeper into extremist research would get different content than those who’ve just started exploring that world. Lastly, Google will continue working with Moonshot CVE to expand the Redirect Method in Europe (the company hasn’t said where exactly it works currently).
As for how Google plans to measure success, it simply says it’ll look at how much engagement the content gets from those being redirected to it. And this isn’t the only way Google is fighting extremist messaging on YouTube.
As part of last month's announcements, it said it was expanding the technology it uses to identify extremist and terrorism-related videos, as well as increasing the number of human experts in YouTube's "trusted flagger" program.
This may not be enough to stop terrorists from using YouTube as a recruitment platform, but it’s clear Google is taking the problem seriously.