Hate speech on the Internet is increasing globally due to the advent of Web 2.0 technology like video file-sharing and social networking sites, experts say. “Everyone can become a publisher very, very easily, and so it’s becoming a more serious problem,” says Christopher Wolf, chairperson of the International Network Against Cyber-Hate (INACH), an umbrella group of nongovernmental organizations (NGOs) around the world that work to fight Internet hate speech.
A two-day Global Summit on Internet Hate held at the French embassy in Washington, D.C., recently gathered experts from around the world to discuss the challenges and possible solutions to online hate. The event, hosted by INACH and its U.S. constituent, the Anti-Defamation League (ADL), touched on legal issues, public-private partnerships, and the global nature of the Internet.
“The purpose of INACH and its annual convention is to have this international cooperation that allows for sharing of knowledge, exchanging best practices, and trying to coordinate measures against hate speech,” says Deborah Lauter, director of civil rights for the ADL. “So just bringing all these people from different countries together who are addressing this topic, in and of itself, is one of the goals of this conference,” she says.
Some European countries have made certain forms of hate speech, like Nazi propaganda and Holocaust denial, a crime. Free speech protections guaranteed in the First Amendment of the U.S. Constitution make it impossible to outlaw hate speech in the United States, however. This impediment presents one of the biggest challenges for those seeking international solutions to the problem of hate speech.
“The conclusion of those at the conference is even if a jurisdiction were to criminalize [hate speech], because of the First Amendment in the United States and the borderless nature of the Internet, content could be hosted from the U.S. and seen worldwide,” Wolf says. “And so…regulating content by legislation is not a terribly effective way of dealing with a lot of the content.”
That’s not to say that the legal framework can’t work at all. It was effective, for example, in one high-profile French case against Yahoo! in 2000. The case concerned the company’s online auction of Nazi memorabilia. A French judge ordered Yahoo! Inc. to make it impossible for French Web surfers to view Nazi-related auctions or face a fine. Yahoo! France, the French subsidiary, already had taken steps to block the auctions. Yahoo! then agreed to remove the items from all of its servers.
Despite the success in that case, experts note how easily hateful content prosecuted in one country can be relocated, simply by reposting it to Web sites hosted on servers in another country.
Another problem with using legislation to regulate the Internet is enforcement, says Harlan Loeb, managing director of FTI Consulting and an adjunct professor at Northwestern University Law School. “You would have to begin with a kind of a coalition of the willing among nations on the enforcement front and then the resources to back up the effort,” he says.
Experts agree that part of the solution lies in working with businesses that provide access to the Internet or online applications. While the government cannot outlaw hate speech, a company has the right to establish a policy that requires users to abide by stated limits on what can be posted online. Accordingly, more attention was paid this year to including Internet service providers (ISPs) and major online portals in the conversation, Lauter says.
“For us, a strategy to work with the ISPs themselves and to help them craft terms of service that are effective and enforceable right now is a priority,” she says.
ISPs and popular Web sites like YouTube are willing to help but are frequently overwhelmed by the volume of activity, Lauter notes. YouTube, in fact, recently partnered with the ADL to launch an Abuse and Safety Center, which includes links and resources from the ADL and allows users to report content that violates YouTube’s community guidelines on hate speech.
James Cicconi, AT&T Inc.’s senior executive vice president for external and legislative affairs, highlighted challenges for the private sector in his keynote address. “[A]ll this power our company has to move and manage information does not include the power to censor it. Not even in the case of hate speech,” he said.
Cicconi and others contend, however, that the answer is not less information but more. “If we can educate kids who are growing up with the Internet about the evils of hate speech and the ways to filter it intellectually,” Wolf says, “then the impact of it will be lessened and perhaps we can persuade kids not to engage in that kind of speech.”
The Media Awareness Network, a Canadian non-profit whose mission is to promote critical thinking about the media among young people, is one example of such an educational effort. In 2004 the group began developing programs that address online hate speech, aiming both to inoculate kids against hate on the Internet and to educate them so that they do not contribute to it themselves.
The Media Awareness Network’s ongoing programs include a licensed workshop for teachers, a series of free lesson plans available on its Web site, and two online games for students. The popularity of the games, which the group says are among the most accessed resources on its site, illustrates the need for such programs, according to Co-Executive Director Cathy Wing.
The group has also made presentations at several conferences sponsored by the Organization for Security and Co-operation in Europe (OSCE). Wolf cites the Media Awareness Network as a program that U.S. experts could learn from.
Hate speech matters because words have consequences and can lead to violence, Lauter says. “The Holocaust didn’t start with gas chambers; 9/11 didn’t start with planes crashing into buildings,” she says. “Those incidents—terrorism and hate and genocide—start with words and stereotypes.”