In 2008, the Senate Homeland Security and Governmental Affairs Committee, chaired by then-Senator Joe Lieberman, produced a report entitled “Violent Islamist Extremism, The Internet, and the Homegrown Terrorist Threat.” The report mentions a rap video called “Dirty Kuffar” (“kuffar” meaning “nonbelievers”) that, according to the report, praises Osama bin Laden and the attacks of 9/11.
A few days after the report came out, Lieberman sent a letter to then-Google CEO Eric Schmidt pointing to the report and asking the company to remove terrorist content, including the named video, from YouTube.
As some quickly pointed out, the one video named in the report was hardly espousing terrorism or hate speech. It may have been mildly offensive, but it was clearly protected political speech.
Lieberman’s letter was accompanied by a list of other videos that his staff claimed promoted terrorist content. He asked YouTube not only to remove the specific videos that violated its policies, but also to shut down the accounts of the users who had posted them.
Decisions for YouTube:
- How do you determine which content is actually promoting terrorism, as opposed to content that is merely discussing it, reporting on it, or documenting terrorist attacks?
- How do you distinguish political speech from terrorist content?
- How do you respond to a sitting US Senator demanding the removal of content?
- If an account has violated policies against terrorist content once, should the entire account be shut down?
Questions and policy implications to consider:
- Since an elected US official can potentially change the laws, requests from such officials are likely to be taken more seriously. If a request demands the takedown of First Amendment-protected speech, does the request itself raise First Amendment concerns?
- Is it possible to readily distinguish content that is promoting terrorism from that which is documenting terrorism and war crimes for the historical record?
YouTube chose to push back on Senator Lieberman’s request, putting up a blog post saying that it wished to have a dialogue with the Senator. The company said it had removed those of the highlighted videos that were found to violate its policies, but it would not remove them all, nor would it shut down all of the accounts mentioned. As the company wrote in the post:
Senator Lieberman’s staff identified numerous videos that they believed violated YouTube’s Community Guidelines. In response to his concerns, we examined and ended up removing a number of videos from the site, primarily because they depicted gratuitous violence, advocated violence, or used hate speech. Most of the videos, which did not contain violent or hate speech content, were not removed because they do not violate our Community Guidelines.
Senator Lieberman stated his belief, in a letter sent today, that all videos mentioning or featuring these groups should be removed from YouTube — even legal nonviolent or non-hate speech videos. While we respect and understand his views, YouTube encourages free speech and defends everyone’s right to express unpopular points of view. We believe that YouTube is a richer and more relevant platform for users precisely because it hosts a diverse range of views, and rather than stifle debate we allow our users to view all acceptable content and make up their own minds. Of course, users are always free to express their disagreement with a particular video on the site, by leaving comments or their own response video. That debate is healthy.
Senator Lieberman continued to pressure Google over these policies over the years, including writing a similar letter to complain about “terrorist content” on the company’s Blogger platform a few years later. Since then, YouTube has ramped up its efforts to block “terrorist” content on the platform, but has also been accused many, many times of going too far and actually deleting content from human rights groups that were trying to document war crimes and other atrocities.
Written by The Copia Institute, December 2020