Content = Better User Experiences

by Jens Sorensen on October 6, 2011

We’ve officially moved into our new flat and have slowly managed to furnish it. We had been living in a rather nice townhouse for the last two years, but our landlady decided she wanted to move her family back in (thanks to some very strange neighbours at her current place), which meant a nice eviction for us with four weeks’ notice. Initially, that seemed like a reasonable amount of time, but August and September are probably the worst time to move: students are scrambling for last-minute accommodation, and we’re definitely in a competitive renters’ market. Couple this with the fact that we both work full-time, meaning viewings would have to happen in the evenings and at weekends, and we needed to get our skates on.

So, like any normal person (I guess), we hurriedly wrote down a list of requirements for our new property search. These included:

    • 2 bed apartment / 3 bed house (extra room for guests)
    • Walkable to the city (even though I rarely go)
    • Sea views (not the lean-out-of-the-bedroom-window view I had before)
    • Furnished (though we can’t be too fussy); white goods are a must
    • We work Monday-Friday, so evening and weekend viewings only

Then we hit the property portals: RightMove, Findaproperty, Zoopla and the like, and there’s plenty of choice. We can use search facets to drill down by price, area, number of beds, furnished/unfurnished and so on. The large choice suggested we’d find somewhere easily, but how wrong I was. Firstly, the majority of properties on these sites have one-line descriptions with one or two photos (and some with no interior photos at all). I can only think of three reasons why an agent would do this:

    1. The agent didn’t have access to the property to take photos
    2. The agent can’t be bothered to take photos
    3. The agent deliberately gives you limited information so that you book a viewing

The first two are pretty lame excuses and therefore not really justified reasons. The third is fine, but then maybe said agents should have some availability to actually show the properties. On more than 15 occasions, when I called separate agents to arrange a viewing, the response was one of:

    • We’re fully booked for viewings for at least a week and a half
    • We don’t do viewings after 5pm
    • It’s already been let

Like I said, this is a busy time for moving in the area, and renting is pretty competitive down here. But surely, by offering up the information on the website, agents would let me make an informed decision about whether I even want to view a property, rather than forcing me to book a viewing just to get that information. We had no interest in 90% of the properties we viewed with limited prior information. Making this information available upfront saves both my time and the agent’s.

This got me thinking that the property portal business model has some fundamental flaws: scrolling through hundreds of properties with no content suggests estate agents pay a fixed fee for property uploads rather than paying on a per-property basis. From the agent’s perspective: if I have a property coming up on my books, I am far more likely to get it uploaded as fast as possible, even without any information, because I can do so at no added cost and it gives my estate agency a greater chance of being noticed.

But in actual fact the following happens. For the customer, scrolling through hundreds of properties with no content makes the search experience frustrating, and frustration isn’t really what makes for a good user experience. For the agent, valuable time is wasted showing properties which the customer could not shortlist properly due to inadequate information.

Surely the quality and user experience of these portals would improve if agents were required to upload a minimum amount of information beyond a photo and a one-line description?

On a more positive note, we did eventually find somewhere to live and it ticked all the requirement boxes.

The flat and sea views


I recently had a web analytics query from one of our clients, specifically around the analysis they were doing for their library pages using the In-Page Analytics/Site Overlay report. For a council site receiving monthly visits in the tens of thousands, it was pretty hard to believe that people were interested in some of the library pages but weren’t clicking through to the library catalogue pages.

Not all clicks are tracked: the links on the right of the page show no clicks

So what’s going on?

The issue lies with the actual content. The three links we see to the right of the image (Search the Library Catalogue, Renew Library Items, and Archives Catalogue) all receive no clicks. These links, however, go to a subdomain of the site. Why is that an issue? Well, the In-Page Analytics report is quite misleading: it works from the normal JavaScript snippet on your site, which transmits data to Google each time a page is loaded. Google Analytics then infers that a click was made because a subsequent pageview was recorded; it isn’t tracking the “clicks” themselves. Because of this, a click on an outbound link will not be recorded unless the destination site carries the same tracking code. In this council example, tracking the clicks through to the library pages would be entirely possible using cross-domain tracking from the main site to the library subdomain (explained below).

However, if your site, like a lot of council websites (or any website, for that matter), has outbound links to other sources or even banner ads, In-Page Analytics will not register those clicks. So although the report can give you a useful visual representation, the insights available from it are minimal, for me at least, making it a function of GA that I do not use.

Furthermore, the In-Page Analytics report is not able to work with the following types of content:

  • JavaScript links
  • Virtual pageviews created with urchinTracker
  • URL redirects
  • URL rewrite filters
  • Links to subdomain pages
  • Frames

Probably the most annoying limitation concerns the very thing this report would be most useful for: identifying interaction and usability issues without the need for A/B testing, e.g. whether one link on a page is more successful than another. However, because we’re not tracking ‘real’ clicks, if the same link appears more than once on a page (even with different anchor text), both instances will show the same click percentage. Again, you can get around this by adding a unique identifier to each link’s URL, but that’s a hassle and not very good from an SEO perspective.
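As a rough sketch of that workaround (the /contact page and loc parameter are made up for illustration), tagging each instance of a duplicate link with a distinct query parameter gives the report separate URLs to attribute clicks to, at the cost of the duplicate-URL problem just mentioned:

    <!-- Untagged: In-Page Analytics reports the same click percentage
         for both links, because they share a destination URL. -->
    <a href="/contact">Contact us</a>
    <a href="/contact">Get in touch</a>

    <!-- Tagged: each link now gets its own statistics, but you have
         duplicate URLs to manage (canonicalisation, profile filters). -->
    <a href="/contact?loc=header">Contact us</a>
    <a href="/contact?loc=footer">Get in touch</a>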

The solution

If a visual representation of your data is what you are after, tools like ClickTale or Crazy Egg are much more effective at click tracking. However, as I like to keep all my data in one package and don’t worry so much about visual representation, I’d rather get the information from other reports within Google Analytics.

So, to get insight into what is happening on the subdomain through Google Analytics, there are a number of things we could do. Firstly, the library subdomain should have GA tracking added (you should be tracking all your sites, after all); at the very highest level this would tell you how much traffic the main council site is referring (Traffic Sources -> Referring Sites), but not for a specific page. For that you would need subdomain tracking, which basically tells GA that the data from the subdomain is part of the main site (make sure you set up filters to distinguish between the two, though).
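As a minimal sketch of what that looks like with the standard asynchronous ga.js snippet (the UA-XXXXX-1 account ID and council.gov.uk domain are placeholders, not the client’s real values), the key line is _setDomainName, which shares the tracking cookie across the top-level domain so visits to the library subdomain are recorded against the main site’s account:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-1']);         // placeholder account ID
      // Share the tracking cookie across all subdomains of the
      // top-level domain, so www. and library. report as one site.
      _gaq.push(['_setDomainName', '.council.gov.uk']); // placeholder domain
      _gaq.push(['_trackPageview']);

      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>

The same snippet goes on both the main site and the library subdomain; the filters mentioned above (for example, one that appends the hostname to the request URI) then let you tell the two sets of pages apart in your reports.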

Alternatively, you may want to track outbound links that are truly external to your site (i.e. you don’t own the other site). Recording an event for each outbound click lets you distinguish these exits on any page, which is very useful when looking at page exits (something you should be doing). We know there are good and bad exits depending on the stage of the conversion journey; however, without distinguishing whether exits are due to outbound clicks (usually good), we might think our site has an overly high exit rate. There are a number of ways to track outbound links, but in order not to artificially inflate pageviews, I think it’s best to use an event rather than a virtual pageview.
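As a sketch (the trackOutbound helper and the catalogue URL are invented for illustration), the usual ga.js pattern fires a _trackEvent and briefly delays navigation so the tracking request has time to reach Google before the browser leaves the page:

    <script type="text/javascript">
      // Record the outbound click as an event, then follow the link after
      // a short delay so the request isn't cancelled by the page unload.
      function trackOutbound(link) {
        _gaq.push(['_trackEvent', 'Outbound Links', 'Click', link.href]);
        setTimeout(function() { document.location.href = link.href; }, 150);
        return false; // cancel default navigation; the timeout handles it
      }
    </script>

    <a href="http://catalogue.example.org/" onclick="return trackOutbound(this);">Search the Library Catalogue</a>

Because this records an event rather than a virtual pageview, the exit shows up under the Event reports without inflating your pageview counts.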


The gap between cutting costs and improving public services

February 18, 2011

I recently read an interesting post on Public Sector Customer Forums about a mother’s dreadful experience when trying to apply for the school admissions service on her local council’s website. The post highlighted several flaws in a system named ‘A faster more efficient way to apply’. Firstly, I wholeheartedly agree that making the service [...]

Read the full article →

How to identify Goals and KPIs for your website in Google Analytics

January 9, 2011

This post focusses on why, what and how to set up goals and KPIs (Key Performance Indicators) in Google Analytics. When delving into the world of web analytics it can become overwhelming, very quickly. You have a lot of questions, you’re expecting answers to those questions, and you’re expecting it to solve all your problems. [...]

Read the full article →

Deriving insights from the Hostnames Report in Google Analytics

December 7, 2010
Hostname report for GOSS

A hostname is the name of the domain on which your traffic is received. Every time a visitor arrives at a page where Google Analytics is implemented, the information is sent to GA and the domain/hostname is shown in the Hostnames report (from the dashboard in GA, click Visitors > Network Properties > Hostnames). For example [...]

Read the full article →

An Overview of ‘A Brave New World: Digital Marketing in the 21st Century’ B2B Marketing’s Annual Conference

November 12, 2010

A stormy start to yesterday’s B2B Marketing Annual Conference, ‘A Brave New World: Digital Marketing in the 21st Century’, situated in an area of London I am not familiar with, just across from Tower Bridge and Tower Gate Tube station. The map provided by B2B Marketing was surprisingly detailed, although I still managed to walk the complete [...]

Read the full article →

Should Google worry about its low switching cost business model?

July 29, 2010

In my last post I discussed how Google benefits from the long tail of search with its AdWords platform, concluding that all seems well in Google land: more and more searches = more and more ads = more profit. But what if users switch search engines? Unlikely, I agree, but it is conceivable. We all [...]

Read the full article →

How Google benefits from the long tail of search advertising (PPC)

July 16, 2010

It is well documented that 97% of Google’s profit comes from ads. Although they do have a lot of product offerings (Chrome, Gmail, Wave, Google Docs, etc.), these do not directly generate income. Therefore, if Google is an advertising company rather than a search company, it makes sense for them to ensure that you use [...]

Read the full article →

Tips and tricks on passing the Google Analytics Individual Qualification (GAIQ) test

July 2, 2010

Yesterday, I passed the Google Analytics Individual Qualification (GAIQ) test with a score of 91%. It’s a 70-question online exam which you have an hour and a half to complete, and you have to get 80% to pass (Google recently increased the pass mark from 75% and reduced the time allowed by 30 [...]

Read the full article →