For some time now, Facebook has neglected the urgent need to remove highly inappropriate content viewed by its users, who range in age from 13 to 70 years old. The social media giant has received criticism for not doing more to remove content containing violence, abuse, and sexual harassment, including some extraordinary cases involving murder. Facebook is under pressure to act urgently and remove such content faster. It promised to hire an additional 3,000 people to monitor video violence, but a Facebook spokesperson declined to say whether these would be full-time employees or contractors.
CEO Mark Zuckerberg announced the hiring plan in a Facebook post Wednesday morning. "Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg wrote. "If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down," he wrote. Zuckerberg also noted that the company's community operations team already consists of 4,500 people, who review millions of reports about potentially offensive content each week.