Big media on Facebook: moderation in everything

Matthew Eltringham is editor of the BBC College of Journalism website. Twitter: @mattsays

A couple of weeks ago, a BBC journalist removed a comment from a discussion of the Notting Hill Carnival on BBC London's Facebook page because they felt it was inappropriate.

On the face of it, removing a comment from a Facebook discussion isn't a particularly big deal.

BBC London engages well with Twitter and Facebook and, as my colleague Claire Wardle has blogged, it has started to innovate too, with the use of Ushahidi's Crowdmap to report the London Underground strike.

And BBC London's intervention in the comment stream on the Notting Hill Carnival was textbook: it removed the comment and explained why - it broke the house rules linked to from its page, in this case because of bad language.

And the response from other members of the community was supportive - with Apple Leiper, for instance, writing: "Yay a page that is actually moderated! Nice one."

OK, fine - but, as the BBC (and everybody else) piles into social media, is a consistent idea of good practice emerging on how to deal with user comments? And does it matter?

To answer the last question first, well, yes, of course it matters. Big brands from the BBC to Coca-Cola are trying to build relationships with their audiences or customers through social media - and if those relationships are badly managed, those people will at best think less of the organisation, at worst go elsewhere.

Next question: what is good practice? There are important legal and user-protection issues that the community management agency eModeration discusses in a blog post which examines where the ultimate responsibility lies: leaving it up to Facebook isn't really an option.

The basic principles of good community management are pretty straightforward: be clear about what is acceptable behaviour, explain how you will ensure that behaviour is maintained, engage with your community, give them a reason for being there and do what you say you are going to do.

However, translating those principles into practice on social media spaces is a little more complicated for mainstream media.

First of all, what is acceptable behaviour in social media spaces?

Most mainstream media - including the BBC - have very clear house rules about comment on their own platforms that are pretty rigorously implemented. But the tone of conversation throughout social media is generally more robust.

So is it appropriate to enforce the same codes of behaviour? Are there, for instance, words that are acceptable on a BBC page on Facebook but not on the BBC website? BBC London - which references the BBC house rules on its Facebook page - thought not, and removed a comment because of an offending word.

CNN is less squeamish. A cursory read of the comments on a CNN Facebook page about the Tea Party's primary victory in Delaware found both particular words and a general coarseness of expression that wouldn't all have been allowed on a BBC website (and perhaps wouldn't have on CNN's either). But did they seem out of place or inappropriate on a Facebook comment stream? Actually, no, they didn't.

Most Facebook pages set up by mainstream media organisations - including many BBC programmes - offer little or no guidance to people wanting to comment on stories that have been posted.

NPR - with more than 1.2 million friends - gives no obvious advice to commenters on its wall, although a reply to a question on its discussion board offers: "It's our policy to remove ad-hominem attacks, spam and hate speech, but we draw the line there. When users submit reports about a particular comment, they go to Facebook and not NPR, so we're not in a position to process those complaints. Apart from that we generally give our FB fans some latitude, as it's ultimately their community."

The Economist just states that "commercial or offensive posts will be removed".

But the Guardian, the Independent, GMTV, Al Jazeera, Newsweek, the New York Times, New Statesman and the FT all have no explicit guidance on how comments will be moderated.

BBC Facebook pages, confusingly, have a range of different approaches - from Breakfast's explicit link to the BBC's own house rules to Radio Orkney's statement that comments "do not represent the views of BBC Scotland", to no guidance at all.

NPR's approach seems well balanced, although perhaps it would be helpful to make it more prominent - to avoid any confusion, misunderstanding or accusations of censorship.

Transparency is another key principle of good community management. Explaining what you're doing and why you're doing it not only prevents crises from emerging but also, in itself, brings people onside.

But is the absence of guidance across mainstream media Facebook pages a deliberate policy or just an omission? And what does it say about how seriously mainstream media take their social media activity?

It takes minutes to create a Facebook presence that satisfies superficial criteria: build the page, promote your content on it, attract a handful of friends and, hey presto, you have a social media policy.

Moderating or even hosting the comments requires resources, but brings important rewards. BBC London found that out: by moderating, it showed its audience that it is genuinely interested in their contributions and engagement, and that comments aren't simply posted and then ignored. As a result, BBC London's audience is likely to at least think better of it, which will encourage further contributions that could give it an editorial advantage over its rivals.

The resourcing doesn't have to be huge to start with - it doesn't take long to check through a stream of comments. The point is not so much the ongoing effort required as the recognition that it is required at all.

Like pets, Facebook pages are for life, not just for Christmas; and, like pets, Facebook pages are often very revealing about their owners.
