No matter the size and scope of your website, you will invariably want to understand four key things about it in order to evaluate its performance. These are:
- How different groups of people get to your site
- Where they land on it
- How they move around its pages, and
- Whether or not they ‘convert’ (that conversion may be a sale, engagement with a piece of your content, or a sign-up to something).
Whilst these all sound like straightforward insights, the reality is that accurate information for each is often patchy.
We’re often called in by businesses to ‘get what data and insights you can from our current Google Analytics set-up!’ They seek our expertise because they’ve struggled to get accurate analytics information in a format they can work with. Without it, they struggle to review their website’s performance and make the right decisions for its development.
Why taking your analytics at face value can be risky
By way of an example, we were called in to review a financial services company’s analytics. At first glance, it appeared that:
- 25% of customers came straight to a product page (and that page incurred a 70% bounce rate)
- 50% went to the ‘shop front’ page with a listing of products (that page had a 90% bounce rate)
- 25% went to the company’s home page
- The organisation’s SEO campaign was driving people searching for their products to the product pages
- People were going directly to the shop front
- Media, email and social (LinkedIn in particular) were driving people to the home page
But wait! When we started to investigate further, some of the data didn’t make sense. In particular, that:
- 92% of users were new
- The shop front, despite its 90% bounce rate, was attracting visitors from direct sources
- There were very low dwell times (1min 30secs) and page views per user in some areas. In contrast, there were others with 100+ page views per user
We did some further digging into the sources of traffic and found that 30% came from Linux operating systems on the company’s Amazon Web Services (AWS) domain. In this case the organisation was using AWS to monitor the uptime of a page. The company also had a mixture of hostnames for staging sites, which their teams were working on.
So, unbeknown to the organisation, its analytics set-up was not filtering out these monitoring and staging sources, and the data wasn’t a true reflection of real customer behaviour.
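The clean-up step can be sketched in a few lines. This is a minimal illustration of filtering exported analytics rows, not the client’s actual set-up: the field names (`hostname`, `os`, `source`), the hostnames and the exclusion rules are all assumptions for the example.

```python
# Illustrative sketch: drop monitoring and staging traffic from exported rows.
# Field names, hostnames and rules below are assumptions, not a real config.

VALID_HOSTNAMES = {"www.example.com"}   # the production hostname(s)
MONITORING_OS = {"Linux"}               # uptime checks often run on Linux servers

def is_real_customer(row):
    """Keep only rows from the production hostname, excluding server traffic."""
    if row["hostname"] not in VALID_HOSTNAMES:
        return False                    # staging sites, AWS domains, etc.
    if row["os"] in MONITORING_OS and row["source"] == "direct":
        return False                    # likely an uptime monitor, not a person
    return True

sessions = [
    {"hostname": "www.example.com", "os": "Windows", "source": "organic"},
    {"hostname": "staging.example.com", "os": "macOS", "source": "direct"},
    {"hostname": "www.example.com", "os": "Linux", "source": "direct"},
]
real = [s for s in sessions if is_real_customer(s)]
```

Of the three example sessions, only the first survives the filter; the staging hit and the Linux uptime check are excluded before any KPIs are calculated.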
Aligning base tracking to improve your measurement approach
Our client wanted a true view of their customer journey, initially as a one-off report. Historically they’d struggled with this, as their Google Analytics set-up made it difficult to isolate the different audiences arriving at their product pages, home page and shop front.
They subsequently found it hard to clean the data, identify the customer journey, and set and then report on progress against Key Performance Indicators (KPIs).
Whilst Google Analytics is designed for linear reporting requirements, it’s not ideal for all the insights an organisation wants to capture about its website(s). Often the information organisations seek would need to blend a number of variables in order to get a more complete picture of customer behaviour and buying journeys.
Companies are also nervous about how best to change their approach and improve their current analytics set-up. At the heart of that nervousness are these three common concerns:
- If we add filters to improve data quality in the reporting, KPIs (such as dwell time or page views) may no longer be comparable year on year – historic information may not align.
- If we don’t add in the filters, we’ll keep having the same problem.
- If we create a new analytics view there will be more data to manage.
It is true that as you add more areas to a site, more KPIs, more audiences or even more sites, the data-quality problems in a historic analytics set-up’s reporting can compound. See this article on tracking customer data across multiple domains for more detail.
Wising up to good and bad event reporting
Another common measure in a website’s analytics is events. An event typically captures an action triggered by or on a particular class or element of a web page. For example, we may look for a click event on a newsletter sign-up button.
Some common issues that can occur here include:
- The class is used for other purposes in other areas of the site, so the event trigger is ‘over firing’ and numbers reported are not representative of the actual situation
- The configuration of the tag is incorrect so it doesn’t capture all the activity and ‘under fires’
- Events are double firing, causing duplicate cases
- Events do not have a standard naming convention, making it confusing as to which event categories/actions/labels relate to which actions
Sometimes an event may capture a piece of text, like ‘home’, suggesting it’s related to the home page. But in different parts of the site that could be written with different character treatments – eg: home, Home, and HOME.
This then creates three separate ‘home’ event categories for analytics to try and evaluate. Also, when it comes to the reporting, events can be added together to get the total number of ‘home’ events, but metrics such as the number of users or sessions that include a home event are not robust. This is because a user could have both a home and HOME event and therefore be double counted.
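The fix, and the double-counting trap, can be illustrated with a few lines of Python. The event structure here is an assumption for the example, not a real analytics export:

```python
# Illustrative sketch: normalise event categories so 'home', 'Home' and
# 'HOME' collapse into one, then de-duplicate per user before counting.
from collections import Counter

events = [
    {"user": "u1", "category": "home"},
    {"user": "u1", "category": "HOME"},
    {"user": "u2", "category": "Home"},
]

# Lower-casing merges the three variants into a single 'home' category.
counts = Counter(e["category"].lower() for e in events)

# User-level metrics must de-duplicate: u1 fired both 'home' and 'HOME',
# but should count as one user with a home event, not two.
users_with_home = {e["user"] for e in events if e["category"].lower() == "home"}
```

Here the event total is three, but only two distinct users had a home event – exactly the gap that makes raw event sums and user counts disagree.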
In more advanced cases we have seen, events are triggered independently from the page view and register a new session. This means that KPIs cannot be tied to the user’s journey or source.
And as you start to get more pages or add more sites to your digital assets, the problems become more challenging to manage. Changes to the site and tracking tags therefore need to be rolled out ‘in sync’.
What to check to improve your reporting accuracy
Here are some simple pointers of things to check in order to see how accurate (or not) your analytics reporting is…
- Check the hostname to ensure all traffic is from the right URL
- Check the domain to see if there are any common or odd-looking sources of data
- See what traffic is coming from Linux operating systems, as these are typically servers
- Look at the interaction between sources, pages and events to see the individual actions
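The first three checks amount to tallying traffic by hostname and operating system and eyeballing anything unexpected. A minimal sketch, again assuming exported rows with illustrative field names:

```python
# Illustrative sketch: break traffic down by hostname and OS to spot
# odd sources such as server or staging traffic. Data is made up.
from collections import Counter

rows = [
    {"hostname": "www.example.com", "os": "Windows"},
    {"hostname": "www.example.com", "os": "Linux"},
    {"hostname": "ec2-example.compute.amazonaws.com", "os": "Linux"},
]

by_hostname = Counter(r["hostname"] for r in rows)
by_os = Counter(r["os"] for r in rows)
```

An AWS hostname in `by_hostname`, or a heavy Linux share in `by_os`, is the kind of anomaly that flagged the monitoring traffic in the case above.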
Once you have sorted any tracking issues and have improved your data quality, it’s also important to consider how to group common landing pages.
For example, you may have a number of visitors going straight to blog articles. These could be a range of pages – such as article1, article2 etc. Individually they may gain you a small number of visits, but collectively these may add up and form a key component of a customer journey.
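Grouping like this is usually a simple pattern-matching exercise. A hedged sketch, where the URL patterns and group names are assumptions for illustration:

```python
# Illustrative sketch: map individual landing-page paths onto reporting
# groups, so many low-traffic blog URLs roll up into one 'blog' group.
import re
from collections import Counter

GROUPS = [
    (re.compile(r"^/blog/"), "blog article"),
    (re.compile(r"^/products/"), "product page"),
    (re.compile(r"^/$"), "home page"),
]

def landing_group(path):
    """Return the reporting group for a landing-page path."""
    for pattern, name in GROUPS:
        if pattern.search(path):
            return name
    return "other"

landings = ["/blog/article1", "/blog/article2", "/products/isa", "/"]
grouped = Counter(landing_group(p) for p in landings)
```

The two blog articles, each minor on their own, now report as a single landing group that can be tracked as one step in the customer journey.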
Another element to look at is configuring your conversion KPIs. Irrespective of whether you use events or goals for your measures, it is important from a reporting perspective to know which metric you are using for which conversion, as the reporting interface profiles these attributes accordingly.
The pursuit of robust digital tracking
With so much to consider it is no wonder that robust digital tracking is still an ambition, rather than a reality for most organisations.
Good implementation and processes (see our article on 3 common mistakes to avoid with web governance), can help you to gain greater insights from your Google Analytics reports. In most cases our product WebFusion offers even greater flexibility, as well as faster and easier customisation. For example:
- With access to the raw data we can customise our reporting looking at data retrospectively – this means that in cases where conversion tags are missing we can find surrogate indicators
- We can remove bots, internal traffic, and extreme users from the analysis so you can focus on real customers
- Specific journeys can be mapped out and reported on over time to give comparative and trend forming insights
- URLs and events can be mapped into clean values
Can we help?
If you have been struggling to get insights from your Google Analytics with a format, frequency and accuracy you can rely on, do get in touch.