This Easter Sunday, Steve Stephens uploaded a video on Facebook of himself shooting and killing Robert Godwin Sr. The video sat on the social media platform for several hours before it was removed; millions viewed the violence as the graphic scene went viral.
“We have a responsibility to continue to get better at making sure we are not a tool for spreading [footage of violence],” Facebook CEO Mark Zuckerberg told USA TODAY in an interview the week before the murder. “Those are all against our community standards. They don’t belong there.”
The murder Stephens broadcast would not have been possible on live television because of broadcast regulations, and the ability to stream his violent acts may even have been a factor that motivated him to kill.
Clearly, we are becoming a “streaming” society. Instant gratification and the ability to speak to hundreds, thousands, or even millions of people exist in our hands. With this great power comes great responsibility, and social media sites need to take action to combat the epidemic of violent streaming videos and images.
Stephens said in a second video, “I’m at the point where I snapped,” blaming the murder on his ex-girlfriend, Joy Lane, and his mother.
A nationwide search for Stephens ensued, and Cleveland officials offered a $50,000 reward for information leading to his arrest. In Erie, Pennsylvania, a McDonald’s employee named Henry Sayers served Stephens a 20-piece order of Chicken McNuggets and french fries, then immediately contacted the authorities.
Police pursued Stephens and performed a Precision Immobilization Technique (PIT) maneuver on his vehicle.
Stephens then shot and killed himself. “As the vehicle was spinning out of control from the PIT maneuver, Stephens pulled a pistol and shot himself in the head,” Pennsylvania State Police told CNN on Tuesday.
This murder case is not the first time that violence has been broadcast on social media for millions to watch.
Features such as Facebook Live are described by the company as “the best way to interact with viewers in real time,” allowing moments to be shared with family and friends who cannot attend an event. The feature also lets users share their thoughts with a vast number of followers in real time. However, with little regulation or monitoring in place, violence, murder, suicide, and other graphic scenes can be viewed by more than 1.8 billion monthly Facebook users of all ages. When a video goes viral, users can’t simply “change the channel”; they must scroll past graphic footage to avoid it.
Television comes with ratings, broadcasting guidelines, and an expectation from viewers that a warning will be given before graphic scenes are shown. The Federal Communications Commission (FCC) regulates broadcast communications in the United States, protecting viewers and their families from an endless stream of inappropriate content. Viewer discretion, however, is not advised on Facebook or any other social media site. If an obscene video is posted, there is no guarantee it will be removed quickly, allowing millions upon millions to tune in.
In the summer of 2016, the aftermath of Philando Castile’s fatal shooting by police during a traffic stop in Minnesota was streamed on Facebook by his girlfriend. In January 2017, a 14-year-old from Florida recorded her suicide on Facebook. Three men in Sweden broadcast themselves assaulting and raping a woman in a private group on the site. Two journalists in the Dominican Republic were shot and killed while streaming live to Facebook followers. These are just a few of the many vicious and heartbreaking events that have been shared on Facebook.
While users can report content, social media sites currently rely on artificial intelligence and algorithms to moderate it, and these methods cannot reliably keep pace with billions of users. A Facebook vice president, Justin Osofsky, said in a public post late Monday: “It was a horrific crime — one that has no place on Facebook, and goes against our policies and everything we stand for,” adding that the company was working to filter content more quickly and efficiently.
I sincerely hope that Osofsky means what he says, and that other social media giants will also take action to prevent unregulated and violent content.
This article was originally posted on chelsiearnoldblog.wordpress.com — a blog I created for “Applied Writing for the Media” (Communication 410) at Eastern New Mexico University.