This article was published on June 18, 2011

Should user comments dictate online content?

You might not think about the comments section on a website very often. Heck, more often than not I skim past them as if they don't exist, with a few notable exceptions. Between the error-filled ignorance of YouTube comments and the sheer genius that can surface on other sites, there's a wide range of interaction that has, in recent times, begun to shape how and what online publishers put out for the world to see.

In startling contrast to the cursory glance that comments often get from readers, publishers toil over them on many different levels. Even here at TNW, we've had some of our most fiery discussions about which comments system to use, the pros and cons of each, and how to make certain we interact and engage with our readers inside of them. To put it bluntly, a comments system and how it is used can make or break a site, and it should never be taken lightly.

But then there's the other side of the story. What happens when publishers take their efforts a bit too far? The LA Times ran into this situation just the other day, prompting me to wonder whether there were even bigger issues at hand when it comes to online comments. So with an idea and some contacts, I went searching for answers.

Is Hate Speech Free?

The LA Times' problem, specifically, was the hefty amount of hate speech being injected into the comments on a particularly hotly debated post. Without getting into details or opinions here, the post was a story about Israeli security forces firing on pro-Palestinian protestors.

Obviously, this was going to stir up some bad blood, and that's exactly what happened:

On Thursday afternoon, comments on the article were restricted, which means they’ll only be posted with a moderator’s approval. A note in the comments section from Reader Engagement Editor Martin Beck says, “It’s our opinion that this discussion has run its course. And moderators will be setting a *very* high bar for approval of any more comments.”

While what the Times did in this situation isn’t inherently dangerous, it does open the door to particular problems. By stating publicly that comments would be moderated for quality, the blog puts itself into the position of judge and jury, rather than allowing its comments to be an open forum for discussion.

The larger issue at hand is that people oftentimes feel they are granted a right to publish whatever they want in a comments section, regardless of malice, without recourse. Unfortunately for them, most fail to realize that no inalienable right to free speech comes into play when leaving comments on a site owned by another party. Call it censorship if you'd like, but the editing of rude and inappropriate comments has another, friendlier name: moderation.

In this particular case, the comments section was closed and The LA Times addressed the issue in another post. Without any doubt, however, the site will likely think twice before leaving a comments section open the next time that it feels compelled to publish such a work.

Wagging the Dog

The distaste left in a writer's or publisher's mouth after a battle like this one is hard to match. There comes a point when it's almost a natural progression to think about the reaction to a story before publishing the story itself, and then to write the story accordingly. Two publications can tell the exact same story in completely different ways, and this happens on the Internet on a daily basis.

What’s left is the rather dangerous proposition of wagging the dog — that is to say that content would be dictated by the predicted reaction, rather than the importance of the subject.

Livefyre CEO Jordan Kretchmer feels the pain of this, but offers a scenario that takes things even a step further. Kretchmer tells me that Livefyre, a third-party comments system (and coincidentally the one that we run here on TNW), is able to provide analytics that could tell a publisher what subjects are hotly-discussed across the Internet at any given time.

The obvious danger in releasing the feature, says Kretchmer, is that it could easily encourage site owners to write only the stories that are most discussed, forgoing ones that may be equally or more important but that would drive less interaction.

While analytics and word trends are nothing new, having a system in front of you that reports hot topics in real time is something that few (if any) publishers have ever been able to access. The goal, says Kretchmer, is more along the lines of allowing publishers to contrast more and less effective work against their own, in order to ascertain what causes discussion, rather than simply pointing them in a specific editorial direction.

Finding the Balance

At TNW, we try to walk the fine line between allowing users to have a voice and allowing them to spew venom at others. Along the way, we try to learn what our readers care about most and find the right paths along which to direct discussion.

There are tools for trolls that are widely available on the Internet, and blog comments are just another one of them. While it’s worth noting that people will always gripe far before they will praise, that fact doesn’t change the idea that we need to offer an open forum without letting that forum dictate our coverage entirely.

By looking at our statistics and analytics, we can tell which stories catch your attention, and we can drive focus toward them in order to provide more of what you want. But what about the things that draw attention in a negative light or seem to encourage bad behavior? Should caution against enraging the masses steer us away from writing stories that would be deemed important if reaction weren't in the equation?

The quaint notion of a letter to the editor is still alive and well, only now it arrives instantly and, under an unmoderated system, without any choice about whether or not to publish it. Interestingly, in a rather limited, unofficial poll that I ran over Twitter, 88% of respondents said they'd prefer that comments be moderated, though of those, 100% said moderation should be done only to police off-topic or hate-related speech.

In the grand scheme of publishing, we (as a whole, those of us publishing work on the Internet) have only begun to crawl, not yet even starting to walk. There's much to be said for allowing the comments of your users to guide your editorial direction, but there's likely far more to be said for bringing people the news that is important for them to hear, whether or not they realize its importance.

It's a battle of balance that is far from over, but it proves to be an interesting challenge each and every day. So now it's your turn. Sound off in the comments and let us know your thoughts. To what extent should the comments left on our stories dictate what we cover? Should it happen at all? Do you trust the publications you read to find the best news and information, regardless of reaction? Let's hear it.
