
The line between hate and debate online is difficult to draw

Facebook is said to delete approximately 66,000 posts per week that are identified as hate speech. Photo: Reuters

Facebook announced a small but meaningful change to its community guidelines in October.

“We’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest, even if they might otherwise violate our standards,” read the statement.

This came after reports that Facebook employees had argued that Mr Donald Trump’s posts on Muslim immigration violated the company’s hate speech guidelines. These are challenging times in the world of online debate. I should know. I am responsible for the Financial Times’ (FT) community on FT.com.

I oversee a small team of comment moderators to ensure civil discussion online. The job might mean occasionally discussing with an editor whether a story should be closed to comments, or alerting a journalist to an interesting conversation below their story.

Sometimes I solicit readers for ideas on topics they would like to read more about, from Brexit to workplace diversity. The work can be extremely difficult.

How do we keep discussions polite and productive among people who strongly disagree? And how do we make all readers feel welcome, particularly those in the commenter minority, such as women and young people?

It is not just news companies that face this challenge. Thousands of organisations employ community managers, moderators and “integrity teams” whose job it is to decide when a commenter has overstepped the line.

These relatively new roles are now key to all kinds of businesses, from review services such as Amazon and TripAdvisor, to platforms such as Kickstarter, eBay and Twitch. Even Fitbit has a moderated online community.

On all of these platforms, there is scope for hate speech, incivility and abuse.

Then there are the giants: Social platforms such as Facebook, Twitter and YouTube.

Facebook has a global community of more than two billion people and has struggled to apply hate-speech guidelines consistently across such a wide range of nations and cultures. It has made some very public mistakes.

Last year, the activist Shaun King posted a screenshot of a slur-filled email he received in order to draw attention to the bigotry he regularly faced. Misreading the context, Facebook moderators deleted the post and temporarily banned Mr King’s account.

Social platforms are under pressure to curb hate and extremism. Facebook recently hired 3,000 content moderators, taking its total to 7,500, almost certainly the biggest moderation system in the world. It is said to delete approximately 66,000 posts per week that are identified as hate speech.

Twitter, which has also been severely criticised for not protecting its users from abuse, says it has stepped up its crackdown on abusive accounts.

Social networks’ algorithms serve users content that reinforces their world view, whereas news organisations should expose readers to people with whom they disagree. For the FT’s readers, that exposure can be a benefit.

We recently surveyed FT subscribers and found that the top two reasons our readers like the comments are that they add insight and they expose them to different points of view. As one reader put it: “They challenge my understanding of the debate and force me to look at other opinions.”

Readers disagree on a range of topics. After the general election in the United Kingdom, thousands of readers considered the future of Britain’s political parties.

Last week, a group debated whether a story on the pressure to breastfeed was relevant to the FT. Mr John Authers frequently participates in lively debate under his markets and investment columns. We face dilemmas with no perfect answer.

Delete comments that are critical of refugees and we lose the opportunity for thoughtful conversation that could build empathy and change minds. If we do not hit delete and the thread turns purely negative, it could alienate and upset other readers.

In some cases, the best answer may be to turn the comments off completely — though this can frustrate readers. The FT’s guidelines are clear.

Readers are welcome to argue or reasonably criticise a piece, but not disrespect others or act so disruptively that they stifle productive debate. That line is hardest to navigate on the touchiest and most personal issues: Gender, race, religion, and the politics that stem from each.

Today it is more complicated, as issues that were once fringe — racism, anti-Semitism, white supremacy — have re-entered mainstream political debate.

But the line holds: If a comment is discriminatory — for example, slurs on the grounds of race, religion, sex, gender, sexual orientation, disability or age — it is deleted.

If commenters regularly violate our guidelines, they are banned.

Moderators will not delete a comment that says extremism is a problem in Muslim communities. They will if it says that all Muslims are extremists.

In the future, much of the work moderators do will be automated. Google is working on software that helps news organisations better identify toxic comments, and initiatives such as the Knight Foundation-funded Coral Project are working on open-source tools to improve news sites’ commenting communities.

But none of these tools will fully replace human moderation, and there will never be objective right answers all of the time. Ultimately, each community manager must decide where their line lies.

That line will always be a bit fuzzy around the edges, and upholding it depends on good technology and human judgment. The line we draw may differ from the one chosen by Facebook, Fitbit or Breitbart — it may even be different from your own.

For the sake of civil discourse, readers may have to adjust accordingly. FINANCIAL TIMES

ABOUT THE AUTHOR:

Lilah Raptopoulos is The Financial Times’ Community Manager.
