Human Resources




Facebook will hire 3,000 new workers to combat violent video streams


Over the next year, Facebook is looking to hire 3,000 people to speed up the removal of videos showing acts of violence such as murder and suicide.

Such videos are considered to be the biggest threat to Facebook's public image. The hiring spree was announced by founder and CEO Mark Zuckerberg on Wednesday (3 May 2017), after users of the social media network were shocked by two video posts in April showing killings in Thailand and the United States.

One of the videos showed a father in Thailand broadcasting himself killing his daughter on Facebook Live. The video was viewed 370,000 times and stayed up for more than a day before Facebook removed it. Another video, of a man shooting and killing another in Cleveland last month, also shocked viewers.

The workers will fill new positions and will monitor all Facebook content, not just live videos. The company did not say where the jobs would be located, although Zuckerberg said the team operates around the world.

Zuckerberg said in a Facebook post that these 3,000 new workers will be in addition to the 4,500 people who are already reviewing posts that may violate its terms of service. Facebook has 17,000 employees overall, not including contractors.

“We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down,” said Zuckerberg.


The problem has become more pressing since the introduction of Facebook Live last year, a service that allows any of the social media network's 1.9 billion monthly users to broadcast video, some of which has been marred by violent scenes.

The company has been utilising artificial intelligence to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week and, like other large Silicon Valley companies, depends on thousands of human monitors to review them.

Following the release of its quarterly earnings yesterday, Facebook shares fell slightly and edged lower still after the bell.


