Facebook chief executive Mark Zuckerberg has announced 3,000 new positions to monitor video content after the recent spate of shocking video uploads to the site.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Zuckerberg said in a Facebook post.

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
Joining the social media platform’s community operations team, the new hires will review the huge number of reports Facebook receives regularly about posts that might violate its terms of service.
These jobs are in addition to the 4,500 people who already carry out the task, and the expanded team will also tackle other heinous activity.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else,” Zuckerberg continued in his post.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.

“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”