The Varying Virtues of Site Performance Metrics
31 July, 2007, by Stuart Brown (10 comments)

Pageviews, bounce rates and goals

Posted in Analysis, Interactivity and Usability

Measuring site performance is something of a black art - from the dubious numbers lauded by some sites to the notoriously skewed and untrustworthy figures from Alexa, Compete, and other metrics and analytics outfits - there simply isn't any single great method of gauging a site's performance.

Even with access to your own statistics, plumbing the depths of server logs and turning them into something meaningful is a perilous task - the relative worth of pageviews, unique visitors, and so on is not clear-cut. With the advent of AJAX, the traditional means of gauging activity - the pageview - is being seen as increasingly irrelevant.
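To make the log-plumbing concrete, here's a minimal sketch of turning raw access-log lines into the two most basic figures - pageviews and unique visitors. The sample lines and the equating of one IP address to one visitor are simplifying assumptions for illustration, not a robust analytics pipeline:

```python
import re

# Invented sample entries in Apache's Common Log Format.
LOG_LINES = [
    '1.2.3.4 - - [31/Jul/2007:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1234',
    '1.2.3.4 - - [31/Jul/2007:10:01:00 +0000] "GET /about.html HTTP/1.1" 200 987',
    '5.6.7.8 - - [31/Jul/2007:10:02:00 +0000] "GET /index.html HTTP/1.1" 200 1234',
]

# Capture the client address and the requested path.
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

def summarise(lines):
    """Return (pageviews, unique visitors) from raw log lines,
    naively treating each distinct IP as one visitor."""
    ips = set()
    pageviews = 0
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            ips.add(match.group(1))
            pageviews += 1
    return pageviews, len(ips)

print(summarise(LOG_LINES))  # (3, 2): 3 pageviews, 2 unique visitors
```

Even this toy version hints at the perils: proxies, bots, and shared IPs all blur what "unique visitor" really means.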

Figure: A Sankey diagram modelling the flow and ultimate destination of traffic in an ideal web system.

Need for change?

A recent report by Nielsen/NetRatings acknowledged this shortfall in metric reliability, and proposed the adoption of a more content-neutral approach - the time spent on a particular site.
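Time-on-site is simple to approximate from hit timestamps: take the span between a visitor's first and last recorded hit. A minimal sketch, with invented visitor IDs and timestamps (and ignoring the real-world wrinkle that the final page's reading time is invisible to the server):

```python
from datetime import datetime

# Hypothetical (visitor, timestamp) hit records for illustration.
hits = [
    ("visitor-a", "2007-07-31 10:00:00"),
    ("visitor-a", "2007-07-31 10:08:30"),
    ("visitor-b", "2007-07-31 10:02:00"),
    ("visitor-b", "2007-07-31 10:03:00"),
]

def time_on_site(hits):
    """Seconds between each visitor's first and last hit."""
    first, last = {}, {}
    for visitor, stamp in hits:
        t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        first.setdefault(visitor, t)
        last[visitor] = t
    return {v: (last[v] - first[v]).total_seconds() for v in first}

print(time_on_site(hits))  # {'visitor-a': 510.0, 'visitor-b': 60.0}
```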

Of course, no technique is perfect - a particular method of measuring site performance will favour certain sites at the expense of others. For instance, a video site like YouTube would probably score poorly in terms of pageviews, but fare better in terms of the time spent viewing content.

The preferred performance metric for seasoned marketers is the conversion ratio - e-commerce, ad campaigns and affiliate schemes typically have a strong goal-oriented structure, which makes measuring performance a lot easier. For content-oriented sites (such as blogs), the goals are less clear, but minimising bounce rates or increasing subscriber figures are both attainable goals - they can be measured, analysed, and improved upon.
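Both of the goal-oriented metrics above reduce to simple ratios over sessions. A hedged sketch, using hypothetical session records (each noting pages viewed and whether the site's goal - a sale, a subscription - was completed):

```python
# Invented session data for illustration.
sessions = [
    {"pages": 1, "converted": False},
    {"pages": 5, "converted": True},
    {"pages": 3, "converted": False},
    {"pages": 1, "converted": False},
]

def bounce_rate(sessions):
    """Share of sessions that viewed only a single page, then left."""
    return sum(1 for s in sessions if s["pages"] == 1) / len(sessions)

def conversion_ratio(sessions):
    """Share of sessions that completed the site's goal."""
    return sum(1 for s in sessions if s["converted"]) / len(sessions)

print(f"bounce rate: {bounce_rate(sessions):.0%}")      # bounce rate: 50%
print(f"conversion:  {conversion_ratio(sessions):.0%}") # conversion:  25%
```

The point of framing them this way is that both have an obvious direction of improvement - drive the first down, the second up - which is what makes them workable goals.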

Engagement over attendance

Sheer volume of traffic does not always mean a similarly large volume of engaged visitors - indeed, it's entirely possible for a site to attract millions of visitors and not attain any of the site's intended goals - whilst a site that receives the merest trickle of visits could very well have a consistently high conversion ratio. Higher traffic is undoubtedly better in most circumstances, but it's only an indicator of performance, not a direct measure.

It is this shift from simply amassing raw visitor numbers to actually engaging (and subsequently interacting) with visitors that has characterised the last few years of the web. This change towards a two-way relationship with sites comes with new technologies, and an increased willingness of web users to add their input on such sites - 'user generated content', or 'Web 2.0' if you wish to put a label on it.

In short, it's not always about the millions of pageviews, nor how high the Alexa ranking - in the long run, it's the quality of your visitors, your community - and the ease with which you can achieve your site's goals.
