
Facebook forced to explain censorship policy for Live videos

CIOL Writers

The live stream by Diamond Reynolds, the fiancée of Philando Castile, an African-American man who was shot by a police officer in Falcon Heights, Minnesota, after being pulled over for a broken taillight, has captured the attention of millions around the world. Reynolds streamed the video via Facebook’s Live feature, but it remained unavailable for a long time after the initial post.


While some accuse Facebook of deliberately taking down the video for its violent content, the social media company insists it was a technical glitch. According to a spokesperson, Facebook removes such content only if it celebrates or glorifies violence, not if it is merely graphic or disturbing. However, the company declined to detail exactly what caused the glitch.


This leaves a nagging gap in understanding what Facebook’s graphic-content censorship policy really entails. Facebook’s Community Standards outline what is and isn’t allowed on the social network, from pornography to violence to hate speech, and they apply to Live video just as they do to recorded photos and videos.


The policy states that Facebook will take down content depicting violence if it celebrates or glorifies the violence or mocks the victim. However, violent content that is graphic or disturbing is not a violation if it is posted to bring attention to the violence or to condemn it.

When users report any content, including Live videos in progress, as offensive for one of a variety of reasons, including that it depicts violence, the content is reviewed by Facebook’s Community Standards team, which operates 24/7 worldwide. Team members can review content whether it is shared publicly or privately. The volume of flags has no bearing on whether content is reviewed, and a higher number of flags will not trigger an automatic takedown.

Reviewers are trained to sort content into one of three outcomes: content that does not violate Facebook’s standards and is not considered graphic is left up as is; content that violates the standards is taken down; and content deemed graphic or disturbing but not a violation is left up with a disclaimer.

Castile’s video falls into the third category, which is why it sits behind a black disclaimer screen that hides the preview and reads, “Warning – Graphic Video. Videos that contain graphic content can shock, offend, or upset. Are you sure you want to see this?” Such videos do not auto-play in the News Feed and are typically barred from being seen by users under 18.

Overall, these policies do not appear to be overly restrictive. Facebook’s censorship rules focus on the glorification of violence, such as videos posted to promote or celebrate terrorism.

The policy does not make distinctions about the cause of death, the relationship between the video’s creator and its subjects or the involvement of law enforcement. As with all content posted on Facebook, the creator retains ownership.
