In the not-too-distant past, you could see the search terms people used to reach your site through Google. So, to give an entirely random example, I know that three people visited my site yesterday having typed in ‘Myra Hindley’ (on account of me having a page on the subject).
I know those people stayed on the site for an average of 3 minutes and looked at 3 or 4 pages. Looking back over the last couple of months, that seems to be fairly standard for keywords relating to Myra Hindley. From this, I can gauge that the content must actually be pretty decent and that I’ve done a good job.
That tallies neatly with what Google have been telling me I should be doing for many years now: building great quality pages. I have a perfect set of metrics by which to measure this.
- I know that the site ranks in the first 3 pages for “Myra Hindley”
- I know that people who visit through that keyword rarely bounce
- I know that people who visit through that keyword stay on the site for a good length of time and explore further.
This is what Google has always said it wants: to serve great quality pages to its users. By giving me the necessary data to work with through Analytics, Google helps me to do that.
By contrast, the page for the recently discovered “London piss dungeon” news story fares worse. While traffic is higher due to the better position the page holds (what a claim! To be on the first page of Google for ‘piss dungeon‘!), the metrics are worse. People tend to bounce pretty quickly and not explore further. There are a bunch of reasons why that might be, and if I were so minded, I could work harder to do more with that specific tranche of traffic.
But Google have started on the road to anonymising chunks of that search data. The problem is only going to get worse. Already, the site I run as my day job is seeing this anonymised data rapidly becoming the biggest ‘search phrase’ we have after branded traffic (and will, on current trends, soon surpass that).
Herein lies the dilemma. It’s clear that the traffic lumped into “not provided” is fairly decent in terms of search interaction, but how can I tell what lies within, or how to improve things further? Perhaps 900 of that number is some amazing first page term I don’t know anything about, but people bounce after a couple of seconds. Or maybe it’s a tonne of great long tail phrases with good dwell times.
Shorn of that valuable bit of data, there is no way to tell, and experimenting with layouts and content and site structures becomes increasingly a shot in the dark – and a risky one at that.
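To make that creep concrete: given a keyword report exported from Analytics, you can work out what share of your organic traffic is already hidden behind “(not provided)”. A minimal sketch, assuming a simple keyword-to-visits mapping; the keywords and figures here are made up for illustration, not my actual data.

```python
# Hypothetical keyword report: search phrase -> organic visits.
keyword_visits = {
    "(not provided)": 1450,
    "myra hindley": 3,
    "piss dungeon": 220,
    "brand name": 1600,
}

# Total organic visits across all reported phrases.
total = sum(keyword_visits.values())

# Visits with the keyword withheld by Google.
hidden = keyword_visits.get("(not provided)", 0)

# Fraction of organic traffic you can no longer analyse by keyword.
share = hidden / total
print(f"{share:.0%} of organic visits have no keyword data")
```

Run that monthly and you can chart how quickly the blind spot is growing — in this invented example, the hidden chunk is already closing in on the branded term.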
Google is a company with internal factions. And at the minute, the privacy faction is beating the user experience faction. There’s more of a conflict between those aims than first meets the eye. On current trends, we’re all going to be working in the dark. Not only will that affect the work that site owners do, but also the quality of Google itself.
Update: further fascinating discussion on Google and privacy has arisen on Gizmodo. Go read!