By now, you’ve probably heard about the murder of Robert Godwin Sr. at the hands of Steve Stephens, which unfolded on Facebook Live video for the world to see.
Events like these raise so many questions about the ethical responsibilities of these platforms, while simultaneously revealing the complexity of balancing content moderation and free speech in an open and connected world.
There are so many things we could analyze about this story, but since we’re in the business of social media, let’s go with the most obvious question and then move on to dispelling the myth that there is a simple solution.
What is Facebook supposed to do here?
Since its release, Facebook Live has had a multitude of incidents, a fact that has been widely reported.
After this most recent incident, Facebook released a statement saying the following:
“This is a horrific crime and we do not allow this kind of content on Facebook. We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety.”
It’s good to know that Facebook does “not allow this kind of content.” Assuming that the leadership of Facebook are not, in fact, secretly lizard people pursuing a nefarious plot to destroy all humans, it seems likely that they don’t want murders, rapes, and other violent crimes on Facebook.
Even in a world where it’s said that “all press is good press,” I’m pretty sure this falls outside of the lines.
We know that Facebook doesn’t want it on there, and we know that, for the most part, its users don’t want murder, rape, and other violent crime on Facebook.
So, what’s next?
Every potential remedy to this situation comes with its own set of consequences.
The Three Options
The first two options that Facebook has are simple, while the third is more complex. And while I’m about to present three options, you’ll see that only one is advisable.
OPTION 1: IGNORE THE PROBLEM
The first option is to let all content flow freely on Facebook and let the community moderate itself. Simple, right?
The benefit:
Similar to Facebook’s policy around pornography and female nipples, Facebook users and some algorithms would be responsible for flagging offensive content. This is cost-effective for Facebook and similar to how it dealt with the fake-news epidemic: by diffusing the blame and claiming relative freedom from culpability.
The Unintended Consequences:
The most obvious problem with this approach is that the lack of oversight would open the door for more violent crimes to be posted on Facebook. The attention these criminals would receive may satisfy the same sick desire for validation and fame that news coverage has fed mass shooters for two decades now.
Beyond that, Facebook would continue to be dragged publicly for willingly ignoring the problem, and it would eventually be held liable, since free speech has limits and in this scenario a crime is being committed. We could therefore argue that Facebook, by not intervening, is negligent or worse. Facebook would likely have no choice but to cooperate with law enforcement. That opens the door to a gradual erosion of privacy rights on Facebook as the threshold for which crimes warrant cooperation with law enforcement becomes less clearly defined.
OPTION 2: LOCK IT DOWN
The second option is to lock it down or shut it down. In this scenario, all content is either subject to approval before being published to a wider audience, or the live streaming feature is abandoned altogether.
There are people honestly questioning whether or not Facebook should (or will) shutter its live video platform.
The chances of Facebook just letting it go, locking it down, or shutting it down are slim to none. You know that, right?
Yet here we are, discussing the various options as if there really were three. But let’s discuss this option anyway.
The benefit:
In both the lock-down and shut-down scenarios, there should be no more violence on Facebook Live. If the feature is removed altogether, there is no longer any need to moderate live video, either manually or automatically. Hooray! We stopped violent crime.
The Unintended Consequences:
In the case of removing it altogether, we lose one more method of communication due to the actions of an astoundingly small segment of the users. If we applied this same philosophy to guns and cars, no one would be driving and no one would have guns.
OPTION 3: INTELLIGENT MODERATION
So, here we are, with the only option that actually makes any sense at all.
This is an undoubtedly complex issue: while the First Amendment constrains the government rather than private platforms, what we post online still raises real free-speech concerns. Because this also involves criminal activity, the issue becomes more complex still. What’s more, any policy that imposes a review of all content punishes the vast majority of users who have done nothing wrong.
For these reasons, Facebook must invest heavily in AI and machine learning to augment manual review, while also putting user-driven moderation, flagging, and review in place. It will also be important to review the flagging mechanism and augment it with automation to build an early-detection system that can help prevent these incidents in the future. This is not an overnight process, nor is it simple.
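To make the “early detection” idea concrete, here is a minimal sketch of one way user flags could feed an automated escalation signal: if flags on a live stream cluster within a short window, the stream is routed to human review. Everything here is hypothetical and illustrative (the class name, thresholds, and window size are invented for this example), not a description of how Facebook’s actual systems work:

```python
import time
from collections import deque

class FlagEscalator:
    """Escalate a live stream to human review when user flags spike.

    Hypothetical sketch: the 60-second window and 5-flag threshold
    are arbitrary illustrative values, not real platform settings.
    """

    def __init__(self, window_seconds=60, flag_threshold=5):
        self.window = window_seconds
        self.threshold = flag_threshold
        self.flags = deque()  # timestamps of recent user flags

    def record_flag(self, now=None):
        """Record one user flag; return True if the stream should escalate."""
        now = time.time() if now is None else now
        self.flags.append(now)
        # Drop flags that have aged out of the sliding window.
        while self.flags and now - self.flags[0] > self.window:
            self.flags.popleft()
        return len(self.flags) >= self.threshold

escalator = FlagEscalator(window_seconds=60, flag_threshold=5)
# Four flags in quick succession: still below the threshold.
results = [escalator.record_flag(now=t) for t in (0, 5, 10, 15)]
# A fifth flag inside the window trips the escalation.
escalated = escalator.record_flag(now=20)
```

The point of a sketch like this is that the cheap, automated part (counting flags over a window) only decides *when* to spend expensive human attention, which is the augmentation the paragraph above describes.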
As Facebook implements new rules and policies, it should create a plain-English guide that explains how it aids law enforcement, and for which types of crime.
Obviously, leaving the floodgates open and claiming no responsibility is negligent and unethical. This is especially true once Facebook introduces ads in live video. Imagine your company’s ad running inside a video of a violent crime.
It’s also a bad idea to shut down Facebook Live just because some people are psychos. This is especially true since the presence or absence of Facebook Live does little to solve the actual problem: violent crime.
THE REALITY OF TECHNOLOGY
With every new technological advancement there are positives and negatives. If there were no Facebook Live, these same offenders would be posting Snapchat stories, recorded videos, and picture slideshow presentations.
The actions of a few cannot silence us as a larger group. Live video is not a weapon. And while Facebook does have a responsibility to do everything it can to prevent further incidents like this, I think we can all agree that it’s much more complex than shutting down the service or hiring thousands of people to manually review live videos all day and night.
What do you think?