A murder video posted online raises debate about Facebook’s responsibility

A video of a man being shot to death was posted on Facebook Sunday and stayed online for nearly three hours before it was taken down. A man identified as Steve Stephens is said to have recorded himself confronting and killing Robert Godwin Sr. in Cleveland, raising questions about the role of social media sites. John Yang talks to Emily Dreyfuss of Wired magazine.


  • JUDY WOODRUFF:

    A cold-blooded murder posted on Facebook on Sunday set off shock and strong reactions across social media and beyond today, as the manhunt for the killer intensified.

    John Yang has the story.

  • JOHN YANG:

    The 57-second video shows a man identified as Steve Stephens driving the streets of Cleveland while talking on the phone. He then steps from the car and confronts Robert Godwin Sr., a 74-year-old retired foundry worker, a father of nine and a grandfather of 14, and shoots Godwin dead.

    The video of the killing reportedly remained on the site for nearly three hours before Facebook removed it.

    In a statement late today, the company said that the video has "no place on Facebook and goes against our policies and everything we stand for." The company said it's reviewing how it operates "to be sure people can report videos that violate our standards as quickly as possible."

    This episode raises fresh questions about the role and responsibility of social media sites like Facebook.

    Before the latest statement, I spoke with Emily Dreyfuss, a senior staff writer at Wired magazine.

    I began by asking her what Facebook could do.

  • EMILY DREYFUSS, Wired:

    First of all, you know, it is true that Facebook is working very hard to keep videos like this off of its site.

    It's easy in a moment like this to say, you know, it's absurd that this was on Facebook for even three hours. But the fact is that a large apparatus of content moderation went into getting the video down after three hours. In order for it to be removed, people on Facebook had to flag it as inappropriate, and then that flag had to be sent to the people Facebook employs all over the world to get rid of content like this.

    And they took it down. And, sometimes, this can take up to 48 hours. So, three hours here is not even long in the scheme of things. Now, Facebook could do more. And they are working hard to figure out what they can do.

    One of those things would be to use artificial intelligence to help the humans who are having to flag this sort of terrible, gruesome material, so that we don't see it.

    One of the problems is that A.I. is not necessarily ready for that task yet, so Facebook is still trying to figure out how to make this work.
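
    As a rough sketch of what that A.I. assist could look like, in Python: a classifier scores each upload for graphic violence, and anything above a threshold is routed straight to human review without waiting for a user flag. The scorer, the threshold, and all names below are invented for illustration and are not Facebook's actual system.

    ```python
    # Hypothetical AI-assisted triage: score each upload and route likely
    # violations straight to human review, instead of waiting for user flags.

    def graphic_violence_score(video_bytes: bytes) -> float:
        """Stand-in for a real classifier; returns a probability in [0, 1].
        A production system would run a trained vision model here."""
        return 0.0  # dummy value so the sketch runs

    REVIEW_THRESHOLD = 0.8  # invented for illustration

    def triage(video_bytes: bytes) -> str:
        """Decide whether an upload publishes immediately or gets reviewed."""
        if graphic_violence_score(video_bytes) >= REVIEW_THRESHOLD:
            return "send_to_human_review"  # the A.I. flags it; a person decides
        return "publish"  # the default today: the video goes live right away

    print(triage(b"..."))  # prints "publish" with the dummy scorer
    ```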

  • JOHN YANG:

    I want to make sure I understand. They rely on Facebook users to flag these things, or do they have people on their staff watching videos being uploaded?

  • EMILY DREYFUSS:

    So, they don't have people on staff who are watching videos as they're uploaded.

    What they have is this: Facebook employs thousands of people it calls content moderators, whose job is just to watch videos like this and, if they see something inappropriate, to take them down.

    However, these people are not watching them before the video hits the Web site. They're watching them after a video has been uploaded, and then after someone like you or me or anyone else on Facebook has flagged that video as potentially being inappropriate.

    At that moment, that video will then get sent to this team of content moderators who will look at it. Right now, Facebook doesn't have a system in place to look at the video and to moderate it and to decide what's on it and whether it's appropriate before it hits the Web site. Now, that is something they could decide to do, but that would radically change what Facebook is as we know it.
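
    To make the workflow she describes concrete, here is a minimal sketch in Python of that post-hoc pipeline: a video is visible the moment it is uploaded, a viewer's flag places it in a review queue, and it comes down only after a human moderator agrees it violates the rules. The class and method names are hypothetical illustrations, not Facebook's actual code; the point is that nothing runs before publication.

    ```python
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Video:
        video_id: str
        uploader: str
        flags: int = 0
        visible: bool = True  # published immediately; no review beforehand

    class ReviewQueue:
        """Flagged videos wait here for a human content moderator."""

        def __init__(self):
            self.queue = deque()
            self.seen = set()

        def flag(self, video):
            """A viewer reports the video; the first flag enqueues it."""
            video.flags += 1
            if video.video_id not in self.seen:
                self.seen.add(video.video_id)
                self.queue.append(video)

        def moderate_next(self, violates_standards):
            """A moderator reviews the oldest flagged video, maybe removes it."""
            if not self.queue:
                return None
            video = self.queue.popleft()
            if violates_standards(video):
                video.visible = False  # removal happens only after human review
            return video

    # The video stays visible for the whole gap between upload and review.
    queue = ReviewQueue()
    clip = Video("v1", "uploader123")
    queue.flag(clip)                            # a user flags it
    queue.moderate_next(lambda v: v.flags > 0)  # a moderator confirms
    print(clip.visible)                         # False: now taken down
    ```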

  • JOHN YANG:

    Talk about that. Radically change what Facebook is as we know it. They want to crowdsource content. They want to have raw, emotional videos like this.

  • EMILY DREYFUSS:

    Yes.

  • JOHN YANG:

    But, at the same time, they don't want to cross the line, it seems to me. So how do they define what they do and what they are?

  • EMILY DREYFUSS:

    Yes, so this is an incredibly difficult challenge, not just for Facebook, but for society at large, for us to decide what is and is not OK to be shared on social media platforms.

    Now, Facebook has resisted calls to say that it is a media company, like PBS or Wired. We, as journalists, have guidelines for what kind of content is appropriate to put on TV or in our magazine.

    Facebook, by refusing to call itself a media company, tries to take responsibility away from itself and say, look, rather than being a media company, we are actually a mirror on society, and we're going to reflect society at its best and at its worst.

    Now, what we saw yesterday with Steve Stephens' video was clearly society at its worst. But this is a huge, huge challenge. And what Facebook wants to do is encourage people to upload video. And it says it wants to encourage people to upload emotional, raw, intense video, and it is succeeding.

    That's part of its business model. But if it goes too far and allows videos like this one to show up on people's timelines, it will also destroy itself, because you and I are not going to keep logging into Facebook to see videos and pictures of our nieces and nephews if we accidentally come upon a video of a gruesome, horrific homicide.

    So, Facebook has an incentive not to discourage us from uploading things, which it might do if it were to censor us. You can imagine the outcry from people on Facebook if you went to upload a video and it didn't show up right away because Facebook was waiting to approve it.

    People wouldn't like that either, but neither do we like having videos like this in our timelines. So, it's a really tough challenge for them to walk that line.

  • JOHN YANG:

    And also — we have about a minute left.

    They're also trying to bolster their Facebook Live feature.

  • EMILY DREYFUSS:

    Yes.

  • JOHN YANG:

    Now, this was a video that was posted, although this man did use Facebook Live to talk about what he did.

    What's this going to do to their efforts to boost Facebook Live?

  • EMILY DREYFUSS:

    Yes.

    With Facebook Live, what was, as we were describing, a very difficult challenge becomes almost impossible. Unless they were to put Facebook Live on a delay, and it would cease to be live, there is no way they can keep things like this from hitting the site.

    And it's honestly a miracle that homicides and horrible, gruesome violence have not already been used in that way. That's something Facebook has to know, and it's deciding that it can depend on its users to flag content soon enough to take it down. But it is definitely taking that risk in order to push this live product.
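
    The delay she mentions is a real broadcast technique: hold every frame in a buffer for some fixed window so an operator can cut the feed before offending footage reaches viewers, at the cost of the stream no longer being truly live. A minimal sketch, with invented names:

    ```python
    import time
    from collections import deque

    class DelayedStream:
        """Hold frames for delay_s seconds before airing them, so the
        feed can be killed before bad footage reaches viewers."""

        def __init__(self, delay_s: float):
            self.delay_s = delay_s
            self.buffer = deque()  # (arrival_time, frame) pairs
            self.killed = False

        def ingest(self, frame):
            if not self.killed:
                self.buffer.append((time.monotonic(), frame))

        def kill(self):
            """Operator cuts the feed; buffered frames never air."""
            self.killed = True
            self.buffer.clear()

        def release_ready(self):
            """Air only frames older than the delay window."""
            aired = []
            now = time.monotonic()
            while self.buffer and now - self.buffer[0][0] >= self.delay_s:
                aired.append(self.buffer.popleft()[1])
            return aired

    stream = DelayedStream(delay_s=7.0)  # the classic "seven-second delay"
    stream.ingest("frame1")
    print(stream.release_ready())  # []: nothing airs for seven seconds
    ```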

  • JOHN YANG:

    Emily Dreyfuss on the tough questions that new technology is posing for us.

    Emily Dreyfuss, thanks for joining us.

  • EMILY DREYFUSS:

    Thanks, John.
