U.S. laws aren’t keeping up with spread of hate online, says civil rights advocate

Violent hate crimes are on the rise in the U.S. and across the globe. As a result, the ways in which hate groups use social media to threaten, galvanize and radicalize are drawing new scrutiny, including from Congress on Tuesday. Amna Nawaz reports on the Capitol Hill discussion and talks to Kristen Clarke of the Lawyers’ Committee for Civil Rights Under Law, a national civil rights organization.

Read the Full Transcript

Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.

  • Judy Woodruff:

    The frequency and the scale of violent hate crimes have increased in recent years, not only in this country, but around the world.

    And that has brought increasing focus on how hate groups use social media platforms to threaten, to spread messages of ethnic and religious hate, to radicalize opinion in impressionable people, and even share live videos of an attack over the Internet.

    Amna Nawaz has this report.

  • Rep. Jerrold Nadler, D-N.Y.:

    Hate incidents are increasing in the United States.

  • Rep. Doug Collins, R-Ga.:

    Thank you, Mr. Chairman, for the opportunity for us to again condemn white nationalism. It's an opportunity that's unfortunate, but it is not unimportant.

  • Amna Nawaz:

    On the House Judiciary Committee's agenda today, hate speech and white nationalism, and how social media can amplify both.

    Eileen Hershenov from the Anti-Defamation League said violence by white supremacists is on the rise.

  • Eileen Hershenov:

    White supremacists have been responsible for more than half, 54 percent, of all domestic extremist-related murders in the past 10 years. And in the last year, that figure has risen to 78 percent of all extremist-related murders.

  • Amna Nawaz:

    Seated next to Hershenov, Dr. Mohammad Abu-Salha. In 2015, his two daughters and son-in-law were murdered in an alleged hate crime in North Carolina.

  • Dr. Mohammad Abu-Salha:

    There's no question in our minds that this tragedy was born of bigotry and hate. And I ask you, I truly plead to you not to let another American family go through this because our government wouldn't act to protect all Americans.

  • Amna Nawaz:

    Four months after those murders, a white supremacist killed nine black worshipers at Mother Emanuel AME Church in Charleston, South Carolina.

    In August 2017, a white supremacist killed Heather Heyer, who was protesting a white supremacist rally in Charlottesville. In October 2018, a white supremacist killed 11 worshipers at a Pittsburgh synagogue. And just last month, a white supremacist killed 50 people at mosques in New Zealand.

    Committee Chairman Jerry Nadler said today's hearing was prompted by that New Zealand attack and the role social media played in enabling it to be live-streamed.

  • Neil Potts:

    I would like to be clear. There is no place for terrorism or hate on Facebook.

  • Amna Nawaz:

    Facebook's public policy director, Neil Potts, acknowledged his company has been criticized for allowing hate speech on its platform, but he pushed back, noting it quickly removed the New Zealand shooter's live-stream.

  • Neil Potts:

    On Facebook, on Instagram, we took immediate action towards that video. Once we were made aware, we were able to remove the video within 10 minutes.

  • Amna Nawaz:

    Some expressed concern that tech companies like Facebook and Google were stifling free speech by banning certain topics.

    Republican Congressman Tom McClintock of California:

  • Rep. Tom McClintock, R-Calif.:

    Suppressing speech, even the most hate-filled speech, doesn't diminish its influence. It strengthens it. I don't think you can be both. You can't be a neutral platform and at the same time exercise editorial control over content.

  • Amna Nawaz:

    And activist Candace Owens testified she regularly faces hate speech and attacks as a black conservative.

  • Candace Owens:

    And we're not talking enough about political hatred in this country. We're not talking enough about conservative activists like myself being attacked.

  • Amna Nawaz:

    The European Union and Australia have recently taken concrete legislative steps, passing personal responsibility laws: if hate speech is allowed to remain on social media platforms, the companies' executives can be held accountable.

    We turn now to someone who testified today, Kristen Clarke. She is the president and executive director of the Lawyers' Committee for Civil Rights Under Law, a national civil rights organization. And she played an instrumental role in Facebook's decision last month to ban content promoting white nationalism.

    For the record, we asked Google and Facebook to join us. They both declined.

    Kristen Clarke, welcome to the "NewsHour."

  • Kristen Clarke:

    Thank you for having me.

  • Amna Nawaz:

    So, you heard us tick off a list of some of those attacks by white supremacists, but that's not the only threat to the United States, right? Was this topic worthy of a congressional-level hearing today?

  • Kristen Clarke:

    Absolutely. This was timely. This was necessary. And it's critical.

    These are issues that are life-and-death for people in our country and, frankly, across the globe. We need to understand better what is fueling hate today, and we know that it's rhetoric at the highest levels.

    We know it's policies that dehumanize and marginalize communities, but we also know that tech platforms play a big part in facilitating hate today. I was very pleased to see Congress take some time today to hear from Facebook and Google about what they're doing to stamp out hate, but no doubt part of what is fueling this crisis today is the fact that so much hate is generated online.

    This is where people are coming together, where people are going to recruit members, organize rallies, target victims, broadcast their killings. So, it's really important that we get to the root of what's driving online hate today.

  • Amna Nawaz:

    And those are all things we have already seen happen, right?

  • Kristen Clarke:

    Indeed.

  • Amna Nawaz:

    So, let me ask you about this.

    As the hearing was unfolding and being live-streamed, comments on the stream had to be disabled because of the anti-Semitic and racist remarks directed at people in the hearing. You mentioned there were representatives from Facebook and Google there.

    Talk to me about the responsibility of those companies, those tech platforms, to address that kind of hate.

  • Kristen Clarke:

    Yes, these online communities have really grown tremendously over time. And our laws and policies have not quite kept pace.

    We used to have brick-and-mortar operations where people interacted, and we have public accommodations laws that prohibit discrimination in those spaces. But now so much commercial activity happens online. And, sadly, tech companies are simply not doing enough to make sure that these spaces don't leave communities feeling vulnerable.

  • Amna Nawaz:

    Why do you think they're not doing enough? Why haven't they done more so far?

  • Kristen Clarke:

    I hope that Congress will do some work after this hearing, take what they learned today and think about the new laws that they can put on the books to keep up with the 21st century world that we're living in.

    I think that we basically rely on tech companies to police themselves, with some pressure from advocacy groups like ours. But, at the end of the day, what we need are strong laws on the books that can hold tech and social media platforms accountable for the ways in which they contribute to and allow hate to fester.

  • Amna Nawaz:

    You don't think it will happen without more oversight?

  • Kristen Clarke:

    Not fast enough.

  • Amna Nawaz:

    So you were instrumental, as we mentioned, in getting Facebook to ban those white nationalist groups. How hard was it? You were there in the room with some of those executives too. How willing are they to try to tackle this problem?

  • Kristen Clarke:

    Yes.

    I mean, it took a lot of blood, sweat, and tears. We approached Facebook with this issue last summer, and it took many, many months of advocacy and pushing. And other folks in the civil rights community provided an echo chamber for our concern.

    And it was very frustrating to have a $50 billion company take the view that white supremacist activity is bad, but white nationalist and white separatist activity is OK.

  • Amna Nawaz:

    They would distinguish between the two?

  • Kristen Clarke:

    Yes.

    And we know these are all dangerous ideologies that are one and the same, are completely interchangeable, and all of them are ones that promote and incite violence today. These are groups who, you know, really embrace these ideologies of hate and are out organizing and targeting vulnerable communities based on those hateful ideologies.

  • Amna Nawaz:

    We should point out, in the conversation about screening, and possibly censoring, language, there is a real free speech question, right?

    And we heard that from Candace Owens on the panel today. She said: I am targeted. We don't talk about the political hatred that I face.

    What do you say to those arguments?

  • Kristen Clarke:

    That's a totally different issue.

    I believe deeply in the First Amendment. It's a bedrock principle in our democracy. But, at the end of the day, we're talking about conduct and activity that is not First Amendment protected speech.

    When you're out issuing threats to communities of color, when you're inciting violence, when you're using the Web to organize hateful rallies, at some point, we are far outside the First Amendment zone. We're now in an area that really requires that tech companies police that space to make sure that these are safe online communities for the people who use them. But, more importantly, we need to make sure that we're not allowing online platforms to be breeding grounds for the hateful violence that we're witnessing in our communities today.

  • Amna Nawaz:

    Less than a minute left now.

    I want to ask you, what could some of these tech companies do today? What other steps could they be taking right now that would help address this problem?

  • Kristen Clarke:

    They need to put more resources into this issue. They need more diversity around the table to ensure that broken or misguided policies like the one that Facebook maintained never rear their heads again.

    And they need to partner and collaborate with civil rights organizations. We're in the trenches kind of dealing with the fallout from the hate crime crisis. We're working in the communities. We're helping victims.

    And they need to be sensitized to kind of the ugly realities of hate today and understand the role that they play in contributing to that hateful environment.

  • Amna Nawaz:

    Kristen Clarke of the Lawyers' Committee for Civil Rights Under Law, thanks for being here.

  • Kristen Clarke:

    Thanks for having me.
