After pressure from the government, YouTube has backed down from its stance that it isn’t responsible for user content. The Google-owned site has removed a trove of videos featuring talks from Anwar al-Awlaki, an American-born Muslim cleric who became radicalized while living in London.
How many videos were removed?
Awlaki was put on the terrorist kill list under President Barack Obama and was killed in a drone strike along with three other Al Qaeda members in Yemen in 2011. After his death, his followers celebrated him as a martyr and posted more and more of his videos on YouTube, making him the internet’s top English-language jihadist recruiter.
This fall, the site held more than 70,000 videos with Awlaki’s work; after YouTube’s purge, searching his name only brings up around 18,000 videos, most of which deal with news reports about his death and other tangential material.
Can people simply re-upload them?
In theory, no. YouTube says that human reviewers must flag a video to start the process, but once it’s banned, digital fingerprinting technology automatically blocks re-uploads of the same clip.
It’s great that Google and YouTube are finally taking steps to stop terrorist rhetoric from spreading on the video platform, but they must continue to be held accountable.
“We need to keep an eye on this,” Glenn said.
EDITOR'S NOTE: This article provided courtesy of TheBlaze.
This is a rush transcript and may contain errors.
GLENN: Okay. I’m concerned about Google and YouTube because Google is becoming people’s news source. They’ve just hired 1,000 reporters and journalists. And — and this isn’t necessarily good. Because they’re starting to control everything.
YouTube is owned by Google. And it really needs to get its priorities straight. And we need to keep an eye on this.
The company is finally taking steps to block the late Anwar al-Awlaki’s videos that preach jihad against Americans. It has received complaints for over a decade. Now YouTube is taking this major recruitment tool for terrorists and getting rid of it, which is a good thing.
But why are they now doing it? Why now?
Facebook, Twitter, and Google, which is YouTube’s parent company, got called into the principal’s office last week on Capitol Hill, where they were reprimanded for allowing Russia to exploit their services to mess with the US election.
But the problem of jihadist videos on YouTube has been around for much longer. This jihad propaganda influenced the terrorists who attacked Fort Hood, the Boston Marathon, San Bernardino, Orlando, and many of the attacks in Europe.
Six years after al-Awlaki was killed in a US drone strike in Yemen, his videos remained the top English-language jihadist recruitment tool. A few weeks ago, a search turned up 70,000 results on YouTube. Seventy thousand videos.
Now, it’s only 18,600 videos. And most of them supposedly are just news reports and commentary about him. No longer his rants about killing Americans.
YouTube is using its video fingerprinting technology to find and block al-Awlaki’s videos. This could be a major turning point in policy, because YouTube and other social media companies have usually argued that they’re just neutral platforms, bearing no responsibility at all for what anybody posts.
Well, okay. But are you consistent? For a technology company, YouTube was seriously slow to get to this point.
They made their first public commitment to block videos like al-Awlaki’s in 2010. And yet, even his famous 12-minute call to jihad sermon was available on YouTube until 2016.
Now, here’s why I say they have to get their priorities right: YouTube restricted several Prager University videos that informed viewers about Islamic extremism, as if they contained controversial views.
Ironically, YouTube is now blocking al-Awlaki’s jihadist messages, the very kind of extremism that Prager University’s videos exposed. It is just another example of how the left trips on its own shoelaces every single day.