YouTube’s latest push to ban terrorist propaganda across its ubiquitous video platform is getting off to a rough start. Earlier this week, noted investigative reporter and researcher Alexa O’Brien woke to find that not only had she been permanently banned from YouTube, but that her Gmail and Google Drive accounts had been suspended as well. She would later learn that a reviewer who works for Google had mistakenly identified her channel, in the words of a YouTube representative, as “being dedicated to terrorist propaganda.”
This drastic enforcement action followed months of notifications from YouTube in which O’Brien was told that three of her videos had been flagged for inappropriate content, accused of containing “gratuitous violence.” Only one of the videos depicts any actual violence: footage of American helicopter pilots gunning down civilians in Iraq, which has been widely viewed on YouTube for half a decade.
While appealing YouTube’s decision, O’Brien learned that the mechanism for correcting these mistakes can be vexing, and that a fair outcome is far from guaranteed. By Wednesday morning, her channel was slated for deletion. The Google Drive account she was locked out of contained hundreds of hours of research—years’ worth of her work—and was abruptly taken offline. She was then told that she was “prohibited from accessing, possessing or creating any other YouTube accounts.” The ban was for life, and with little explanation and zero human interaction, O’Brien’s research, much of it not accessible elsewhere, was bound for Google’s trashcan.
YouTube has faced increased pressure from the US and European governments to crack down on the spread of terrorist propaganda—pressure that has already led investigators of US drone strikes, such as those at the citizen investigative group Bellingcat, to find their own material disappeared from Google services. With that in mind, it wasn’t difficult to deduce what had happened to O’Brien’s account.
The problem was eventually addressed and representatives of both Google and YouTube later called O’Brien to apologize and explain the error. When she was told that her channel had been misidentified as an outlet for terrorist propaganda, she could hardly contain her laughter. “It was a series of unfortunate events,” a YouTube rep told her. The mistake, they explained, was the fault of a human reviewer employed by Google.
A spokesperson for Google told Gizmodo on Friday: “With the massive volume of videos on our site, sometimes we make the wrong call. When it’s brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it.”
This year, YouTube has come to rely increasingly on machine learning to find and scrub extremist content from its pages—a decision prompted by the successful online recruiting efforts of extremist groups such as ISIS. With over 400 hours of content uploaded to YouTube every minute, Google has pledged the development and implementation of systems to target and remove what it calls “terror content.”
Last month, a YouTube spokesperson admitted, however, that its programs “aren’t perfect,” nor are they “right for every setting.” But in many cases, the spokesperson said, its AI has proven “more accurate than humans at flagging videos that need to be removed.” In a call Wednesday, a YouTube representative told O’Brien: “Humans will continue to make mistakes, just like any machine system would obviously be flawed.” The machine, which prioritizes the content reviewed by human eyes, wasn’t “quite ready,” she said, to recognize the context under which controversial content is uploaded.
The O’Brien incident demonstrates that Google has many miles to go before its AI and human reviewers are skilled enough to distinguish between extremist propaganda and the investigative work that even Google agrees is necessary to broaden the public’s knowledge of the intricate military, diplomatic, and law enforcement policies at play throughout the global war on terror.
Al-Qaeda and The As-Saḥāb Tape
What prompted a Google reviewer to designate O’Brien as a purveyor of terrorist content? Well, for one, her channel contains actual al-Qaeda propaganda. But that propaganda is also an important piece of US history: A few years ago, it nearly cost former US Army Private Chelsea Manning a life sentence.
O’Brien’s channel contains portions of a June 2011 video presented by al-Qaeda outlet As-Saḥāb Media featuring Adam Yahiye Gadahn, a US-born al-Qaeda operative who—in earlier jihadi propaganda tapes rebroadcast by US network news—referred to himself as “Azzam the American.” In 2006, Gadahn appeared in an al-Qaeda documentary that features an introduction by Ayman al-Zawahiri, the al-Qaeda co-founder who succeeded bin Laden as the organization’s leader in 2011.
In January 2015, Gadahn was killed in Pakistan in a series of US drone strikes, which also claimed the lives of foreign aid workers Giovanni Lo Porto and Warren Weinstein.
O’Brien’s interest in Gadahn has nothing to do with spreading his views on the “Great Satan” or his prophecies of American streets running with blood. The footage she preserved using YouTube’s service, which was also embedded in an off-site analysis, was used by military prosecutors to support criminal charges at the court-martial of Chelsea Manning. The public had no contemporaneous access to the court record during Manning’s proceedings; what is available comes from the work of reporters, like O’Brien, who personally attended the trial.
The As-Saḥāb video featuring Gadahn came into play after the US government accused Manning of “aiding the enemy,” a charge that, unlike most derived from…