During numerous conversations I’ve had with great journalists, publishers, and content professionals, I’ve found that, on the whole, they don’t like spending too much time on numbers, let alone hours digging through content performance data.
Having access to dashboards and reports they can easily understand in just one glance, though? Well, that’s a whole different story.
If you, too, work with content, you already know how challenging it is to make sense of your readers’ behavior and measure results. The market is filled with content and web analytics solutions that promise to deliver relevant data you can act on to make better editorial and business decisions — well, at least in theory.
Content analytics solutions often boast about quantifying human behavior and identifying different types of readers, as well as the level of their engagement and the degree of their loyalty. But is this really possible? Are the reports in Google Analytics and other content analytics tools reliable? How much of the data truly paints the picture, and how much of it is just publishers playing the guessing game?
Opening Pandora’s box
Anyone operating in the online media business is well aware of the shift towards the reader revenue model. Display ads are becoming increasingly intrusive, especially if they are not contextual, and that’s before we even start to consider banner blindness, ad blockers, and the Google-Facebook duopoly.
In such a climate, reader loyalty and engagement are of huge interest to online publishers, especially if they’re operating on subscriptions.
Before transitioning to my current company, I was curious to find out, in the greatest detail possible, how Google Analytics works and how it calculates things. I’ve learned that many publishers use Google’s software to estimate the success of their content, even though this analytics tool was created primarily for e-commerce businesses.
The main reasons why? Well, it’s free and it’s pretty straightforward. As a content marketing professional, I’ve used Google Analytics before and reported on content performance to my clients. Like many, I took the reports at face value and didn’t invest much time in understanding how each metric is calculated and what it actually tells me.
But then I started learning. I learned about single and simple metrics, and about behavioral and complex metrics. I learned that the metrics system of Google Analytics was later adopted and adapted by many analytics solutions on the market. I realized there are many gaps and flaws in the way things are calculated, and that publishers are not aware of them at all.
Pandora’s box had been well and truly opened and there was no going back.
“Don’t make me think”
Some of the key metrics publishers look at are Unique Pageviews and Pageviews, Average Time on Page, Time on Page, New and Returning Visitors…
But what publishers believe they’re looking at in content performance reports and dashboards is not what they’re actually looking at.

People don’t really want to think much when they deal with data and analytics. They want everything to be as clear as possible. So, when they, for instance, see a Behavior Report in Google Analytics, they naturally expect to see a report that’s about the behavior of their audience.
This is where it gets tricky. But first, let’s get our terms straight.
- Single metrics are one-dimensional and usually describe a single action that’s not necessarily tied to real human behavior. They often only measure browser events.
- Complex metrics combine different types of metrics, properly weighted, in order to quantitatively measure behavior that matters. They take into consideration the actual human actions happening behind the screen, not just what happens within the browser.
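To make the distinction concrete, here’s a minimal sketch in Python. It’s purely illustrative: the event shape, the signal names (active seconds, scroll depth, read completion), and the weights are my own assumptions, not any vendor’s formula.

```python
# Illustrative only: hypothetical fields and weights, not any vendor's actual metric.

def single_metric_pageviews(events):
    """A single metric: count 'page_load' browser events, nothing more."""
    return sum(1 for e in events if e["type"] == "page_load")

def complex_engagement_score(session):
    """A complex metric: several behavioral signals, normalized and weighted into one score."""
    weights = {"active_seconds": 0.4, "scroll_depth": 0.3, "read_completion": 0.3}
    signals = {
        "active_seconds": min(session["active_seconds"] / 180, 1.0),  # cap at 3 minutes of active reading
        "scroll_depth": session["scroll_depth"],          # already a 0-1 ratio
        "read_completion": session["read_completion"],    # already a 0-1 ratio
    }
    return sum(weights[k] * signals[k] for k in weights)

events = [{"type": "page_load"}, {"type": "page_load"}]
session = {"active_seconds": 95, "scroll_depth": 0.8, "read_completion": 0.6}

print(single_metric_pageviews(events))              # 2, whether or not anyone read a word
print(round(complex_engagement_score(session), 2))  # 0.63, one weighted behavioral score
```

The single metric only knows that a browser event fired; the complex one at least tries to describe what a person did on the page.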
Single metrics used by the majority of analytics solutions are not suitable for measuring content performance, and I’ll explain why that is.
Expectations vs. reality
The core of so many reports across different tools is pageviews. Here’s how this metric is defined within Google Analytics:
A pageview (or pageview hit, page tracking hit) is an instance of a page being loaded (or reloaded) in a browser. Pageviews is a metric defined as the total number of pages viewed. […] If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.
Yep, you read that right. Pageviews may as well be called “Page-Loads” since they measure the number of times a page was loaded in the browser – not necessarily the number of times a piece of content was viewed or read by a real person behind the screen. So, if a person accidentally clicks a link and closes the page immediately, one pageview is recorded. If they open it in a new tab and never go back to it, a pageview is still recorded.
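Here’s a rough sketch of what that definition implies (a toy model I wrote for illustration, not Google’s actual code): every load adds one, whether it’s a reload, a second tab, or an accidental click closed a moment later.

```python
# A toy pageview counter that follows the quoted definition: every page load
# (or reload) adds one, with no notion of whether anything was read.

pageviews = {}

def record_page_load(url):
    pageviews[url] = pageviews.get(url, 0) + 1

record_page_load("/big-investigative-story")  # reader opens the article
record_page_load("/big-investigative-story")  # hits reload
record_page_load("/big-investigative-story")  # opens it in a new tab and never looks at it

print(pageviews["/big-investigative-story"])  # 3 "views", possibly zero actual reading
```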
It’s a similar story with Time on Page: this metric measures how long a page was open in the browser, not how long anyone actually spent reading it. And what about New and Returning Visitors? Google Analytics, like most analytics tools, uses cookies to track users. However, if a user switches browsers or devices, they’ll be counted as a new user. This messes up your traffic picture quite a bit, doesn’t it?
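Here’s a simplified sketch of why that happens (again, my own toy model, not Google’s implementation): the identifier lives in a browser cookie, so the same person on a second browser or device starts with no cookie and gets counted as new.

```python
# Toy model of cookie-based visitor identification: the ID lives in one browser's
# cookie jar, so the same person on another browser or device has no cookie yet
# and is counted as a brand-new visitor.
import uuid

def classify_visit(browser_cookies):
    if "client_id" in browser_cookies:
        return "returning"
    browser_cookies["client_id"] = str(uuid.uuid4())
    return "new"

laptop_browser, phone_browser = {}, {}  # same reader, two separate cookie jars

print(classify_visit(laptop_browser))   # "new"
print(classify_visit(laptop_browser))   # "returning"
print(classify_visit(phone_browser))    # "new" again, even though it's the same reader
```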
Another mistake publishers make is thinking Returning Visitors are actually Loyal Visitors. However, just because someone has returned to your website doesn’t mean they’re loyal to your publication. The concept of reader loyalty is much more layered and complex than that.
No, it’s not splitting hairs: a meow is not a roar
Glance over the analytics market and you’ll notice a lot of brands with rich narratives promising enormous benefits and metrics that supposedly indicate engagement, but it’s mostly bullshit. Analytics reports that rely on single metrics cannot properly measure human behavior and its complexity, no matter what you call them. Only solutions that have developed behavioral metrics can do so.
You can call a cat a tiger and think it’s ok to do so. They belong to the same family tree of felines, so technically it’s not lying, right? However, a meow is not a roar.
Some publishers understand the fallacy of relying on single metrics to measure content performance. Others are not even aware of how things get measured, so they believe the reports without ever questioning their accuracy.
Many publishers might say I’m just splitting hairs here and the reports are “good enough.” They’ve used Google Analytics and managed to make ends meet. If it ain’t broke, don’t fix it, right? However, things are broken and they do need fixing.
The shift from single metrics to complex metrics is happening. Complex metrics can indeed provide a more precise picture of reader behavior and help publishers understand their audience, and then serve them better. With the right type of data and insights, they can identify and then replicate success patterns.
We need to stop bending the truth because it’s easier. Only then can we truly offer content analytics solutions that support and help the publishing industry. Alternatives to shallow views do exist: do you care about them?