On Easter Sunday, horrific footage of a 74-year-old man being gunned down on a Cleveland sidewalk was posted on Facebook by his killer, reigniting an ongoing debate over how social-media content should be policed.
But strategies for blocking every piece of offensive and illegal content have proved elusive and may never be fully effective, according to some experts. Others — including Facebook itself — say more can and should be done to root out objectionable material, including hate speech, horrific and illegal snuff videos and fake news items that mold the opinions of gullible users.
Facebook says it receives millions of complaints objecting to content every week from its nearly 2 billion active users. When the company receives a complaint, an algorithm automatically flags the content, which is then reviewed by moderators to quickly determine if it violates the law or the company’s terms and conditions.
Footage of the murder of Robert Godwin Sr. was publicly viewable for about two hours on Sunday on the Facebook profile of his killer, Steve Stephens, 37, who committed suicide on Tuesday following a police chase in Pennsylvania. Facebook said it disabled Stephens' account 23 minutes after it received reports of the murder video, but the clip was up long enough for users to capture the footage, prompting a plea on Twitter from one of Godwin's grandchildren for people to stop sharing it.
Desmond Patton, an assistant professor of social work at Columbia University, said that while the Godwin murder video should clearly have been taken down, it's one extreme example of a larger issue. Companies like Facebook, Twitter and Google (which owns YouTube), he said, need to recruit specialists and elicit feedback from community leaders to improve how content is moderated, including material that might not seem offensive to every user.
“I study violence on social media and all of the [problematic] content that I see almost never gets taken down,” Patton told Salon. “If you’re just using tech people from Silicon Valley [as content monitors,] you’re going to miss a lot of things. You need to diversify who makes these decisions.”
Facebook declined to comment to Salon about the strategies it's considering to fortify its efforts to block objectionable and illegal content uploaded by its users. But a more aggressive content-filtering system could have unintended consequences. For example, would a stricter policy lead to the censorship of footage like the July 2016 shooting of Philando Castile by a Minnesota police officer? It could be argued that this video serves the public's interest because it viscerally highlights the ongoing problem of excessive force inflicted on African-Americans by members of law enforcement.
Sarah Esther Lageson, a sociologist at Rutgers University's School of Criminal Justice, said that Facebook is under intense pressure to define its position on monitoring user-uploaded content, which could lead to more surveillance — something not all Facebook users will welcome. But she said the benefits of having an open and easy way to produce and share online videos, which can highlight injustices and expose crimes, outweigh the negative effects of giving people so much freedom.
“Facebook will likely provide an array of creative solutions and will likely do their best to streamline oversight of user-uploaded content using [artificial intelligence] or machine learning, but I won’t make an argument that those efforts would catch every instance of an extremely rare event like this,” Lageson told Salon in an email.
Besides, she said in a follow-up phone conversation, horrific crimes take place in public no matter what we do to prevent them; it’s the new medium by which criminals can advertise their crimes that concerns people.
“This is clearly an innovative way of doing something that has always been done: People have always killed people in public, mass shootings happen,” she said. “That being said, the internet is a way to get into people’s homes, which I think is what scares people, that you can’t even feel protected from witnessing a crime on your cell phone or your laptop. It’s one thing to see a crime happen on the street and another thing to see it when you’re on your couch.”
As Facebook and other social-networking service providers struggle to moderate the immense content stream coming at them from their users, the solution to the many problems that can arise is complicated. It requires, as Patton suggested, more feedback from experts and community members about how to establish policies for all types of harmful, violent and offensive content. And as Lageson pointed out, the fact that people can produce and share content so easily has helped fight crime and injustice.
Preventing offensive, hateful, violent and murderous content from spreading on social networks is a problem as complicated as people are themselves, and there may never be a solution that satisfies everyone's concerns.