It is commendable that Facebook has made the world feel smaller; it has not only helped like-minded people from around the world connect online but also broadened their horizons, giving them a better understanding of other cultures and perspectives. However, there is a flipside to a universal platform that lets people broadcast their thoughts at their convenience. For the majority of people, Facebook is a tool to connect, share and meet new people; for others, it is an avenue to vent their angst, spread hate, propagate ideologies of violence and manipulate others into following them.
The advent of Facebook Live has only made things worse. In recent times, there have been several disturbing reports of people committing suicide on a live stream.
In one of the most bizarre and horrific incidents, a person live streamed himself killing his family members; the brutal scenes were captured on camera and broadcast for people to see. There is no denying that such atrocious events have always existed, and we quite often read about them on the internet; however, Facebook, and Facebook Live in particular, has not only brought such incidents to the fore but has arguably spurred them further.
Mark Zuckerberg, the CEO of Facebook, has been horrified by such events, and in a media interview even stated that it is quite difficult to run a company while also finding measures to prevent violent content from being shared on the platform.
Whilst Zuckerberg is desperate to salvage the situation, it must be understood that it is practically impossible for Facebook to police everything around the world; it cannot stop people from posting real-time content. However, Zuckerberg has initiated a few measures to curb the posting of criminal and gory content. Earlier this week, the CEO announced that the company would expand its content moderation team from 4,500 to 7,000 people over the next year.
The expansion of the team is in addition to the various artificial intelligence measures that were previously announced. Facebook is designing automated bots meant specifically to detect changes in users' behaviour and posting patterns and to take preventive action.
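Facebook has not publicly detailed how these detection systems work, so any specifics are speculative. As a rough, hypothetical illustration only, the sketch below shows one very simple way a system could flag a sudden change in a user's posting frequency (a basic z-score anomaly check) and route it to human reviewers; the function and threshold are invented for the example and are not Facebook's actual approach.

```python
from statistics import mean, stdev

def is_posting_anomaly(daily_post_counts, today_count, threshold=3.0):
    """Hypothetical check: flag a sudden spike or drop in posting frequency.

    daily_post_counts: historical per-day post counts for one user
    today_count: today's post count
    threshold: number of standard deviations treated as anomalous (assumed value)
    """
    if len(daily_post_counts) < 2:
        return False  # not enough history to judge a change in pattern
    mu = mean(daily_post_counts)
    sigma = stdev(daily_post_counts)
    if sigma == 0:
        return today_count != mu  # any deviation from a perfectly flat history
    z_score = (today_count - mu) / sigma
    return abs(z_score) >= threshold

# Example: a user who normally posts 2-4 times a day suddenly posts 30 times.
history = [3, 2, 4, 3, 2, 3, 4]
print(is_posting_anomaly(history, 30))  # True -> escalate to a human moderator
```

In practice, a real system would combine many more signals than post counts, but the same principle applies: automation surfaces unusual behaviour, and human moderators make the final call.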
The announcement of the content moderation team's expansion is indeed a significant step taken by Facebook towards making the platform a safer place. Let us hope that the addition of content moderators will eliminate the misuse, or should we say abuse, of Facebook Live.