Facebook recently issued an apology in the International Business Times for the way it handled a death threat on comedian Hari Kondabolu's page last week.
Kondabolu, who—like many comedians—uses his social media pages to post quick jokes, created a post that read, "Fox News host Brian Kilmeade wants to know why we aren’t ‘clearing the waters’ of sharks. WHITE PEOPLE WANT TO GENTRIFY THE OCEAN NOW?"
A comment from a user named Matt Cannon that said, “Keep making comments like you do and the only thing going to be fed to you will be a bullet,” led Kondabolu to follow Facebook's protocol for reporting violations. Unfortunately, the company got back to him a few hours later only to say that the comment didn’t qualify as a violation of the site’s "Community Standards," which clearly state that Facebook will “remove credible threats of physical harm to individuals.”
Contacted @facebook about a death threat on a thread & they said it does NOT violate their standards. pic.twitter.com/uQ8MlAymU4
— Hari Kondabolu (@harikondabolu) July 22, 2015
“Sometimes you can say it’s a tricky line, but a death threat is a threat. He doesn’t actually want to feed me a bullet. He’s talking about killing me,” said Kondabolu in a statement to the International Business Times, adding, “We’re not talking about risky speech. We’re talking about safety.”
A Facebook spokesperson responded to the IB Times, admitting that the company erred in Kondabolu’s case: “As our team processes millions of reports each week, we occasionally make a mistake. In this case, we should have removed this content, and we apologize for that."
This isn’t the first time Kondabolu has reported inappropriate posts to Facebook. The last time, when someone left a comment with the n-word on one of his posts, he said the site not only removed the content but contacted the user as well. That’s what he wishes had happened here. “I can remove a message. What I need is a warning. I need something for them to discourage this behavior,” he said.