It’s true that social media platforms have no active gatekeepers like traditional media, which vet and censor the kind of content that is allowed. However, Facebook has over time tried to explain what is and is not allowed on its platforms through its Community Standards. Today, the company published its internal guidelines and how its rules are enforced.
The most prominent challenge the company has faced since its inception is drawing the line between harassment, free speech, and a host of other issues, as when the company faced criticism over how it deals with photos of breastfeeding mothers, nudity, and hate speech; the list is endless.
The new guidelines cover violence, criminal behaviour, respect for intellectual property, and integrity and authenticity. So don’t be surprised when content you share magically disappears; it likely means it is copyrighted material that does not belong to you. Ultimately, what Facebook wants is to be seen as a transparent company by its many critics.
“We decided to publish these internal guidelines for two reasons,” noted Facebook’s VP of global policy management, Monika Bickert, in a blog post.
“First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”

Related to all of this, the company is also introducing a new appeals process for content that has been removed. Facebook has faced backlash in the past over how difficult it is to get in touch with the company and explain that a takedown was perhaps a little harsh. You can’t argue with an algorithm, and an algorithm doesn’t have the nuanced understanding to differentiate between porn and a work of art.
Under the newly published guidelines, Facebook promises that a human reviewer will look at an appealed post within 24 hours to assess whether its algorithms have missed the mark.
“We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up,” added Bickert. “We believe giving people a voice in the process is another essential component of building a fair system.”
While it is important for people to have the freedom to express themselves, it is also important that those expressions be kept in check. Given the high rate of hate speech, nudity, and a host of other social vices, social media companies in general should take a cue from Facebook by enforcing a measure of content moderation.
Facebook is an online social networking service headquartered in Menlo Park, California, in the United States. Its website was launched on February 4, 2004, by Mark Zuckerberg with his Harvard College roommates and fellow students Eduardo Saverin, Andrew McCollum, Dustin Moskovitz and Chris Hughes. The founders had initially limited the website’s membership to Harvard students, but later expanded it to colleges in the Boston area, the Ivy League, and Stanford University. It gradually added support for students at various other universities and later to high-school students. Its name comes from the face book directories often given to American university students.