Google’s Declining CPCs

Google’s CPCs have declined yet again. 9% year on year, 15% quarter on quarter. They still trousered enormous profits, and once again the market responded by catapulting their share price even higher.

But that decline in CPCs is quite an important little niggle. I’ve remarked before that the profit margin for fungible goods trends to zero, meaning that at some point advertising using the pay per click model becomes unsustainable. Ultimately only a handful of big retailers have enough play in their margins to afford to fund advertising with a sub 5% conversion rate on desktop, and an even lower conversion rate on mobile.

This is why Google are scrambling to sell bigger ads and looking for ways to crank up mobile CPCs, with the same ultimately self-defeating logic. If Google take more money out of the market, then fewer retailers can compete. And if fewer retailers can compete, CPCs have to come down. Economics 101.

In a normally functioning market, falling CPCs should entice other retailers back in, but a lot of retailers have been so burnt that they’ve either gone out of business, or would rather just suck up transactional fees on eBay or Amazon – which they only accrue on an actual sale, rather than spending lots of money getting people to a website that might not even convert.

By shifting to Amazon or eBay, all the usability work is done and they effectively get brand protection, because people will think “I bought this from Amazon” and not “I bought this from Company X through Amazon.” Another reason, then, for businesses to wonder why they should subsidise a brand and a website when there’s a whole ecosystem to sell through, with a captive audience, and where agencies can’t intervene with fees and advice that might turn out to be wrong.

Average CPC itself is a fairly useless metric to look at, but it does point to the bind Google now finds itself in. To increase revenue in a time of decreasing CPCs, they can only increase the number of ads or the eyeballs they serve – hence their ever-increasing cycle of acquisitions and search for revenue streams outside paid search. When you look at Google Play, Android Store and the various automotive initiatives, you can see what they’re gunning for: alternative revenue streams and a way to stay inside the customer journey in a way that appeals to advertisers with sufficiently deep pockets.

Case in point: the rolling out of the “store visits” metric. Google want to show that people who searched on Google then turned up in store – presumably with the intention of buying. Android users or people logged into the Google app can physically be traced. If I search for toasters on Google today, then turn up in John Lewis or Argos within a couple of weeks, that is a demonstration of Google’s value to a retailer. It’s a high tech version of a coupon in the local paper.

There’s a hint of desperation to that, in my mind. Almost any purchase with a research cycle is going to involve Google today, so the unique attribution (and therefore value) is pretty thin. Consequently, there’s not much value to the retailer to warrant spending more on Google.

Google is now so much a piece of infrastructure that it’s almost like the Highways Agency saying you should advertise on a billboard because people drove to your shop on a road they built. It’s true, but it doesn’t quite add up.

And, irony within irony, Google’s conscious strategy of favouring big brands means that SEO and PPC alike make it almost inevitable that anyone searching on Google will encounter a big brand, even if the big brand spends a relatively minimal amount on either of these approaches.

In effect, Google are increasingly selling AdWords to a smaller subset of customers who don’t actually rely on Google in the first place. If you are a small business selling physical goods, you cannot compete on price in an auction-based system. And the sort of proof-of-attribution model Google are working towards is useless if you are a single site selling nationally over the internet, as your footfall is effectively zero. You’re also locked out of the SEO sphere by the big brands who can afford massive marketing pushes, and who escape penalties that would literally finish you because Google has to show them for the sake of its own credibility.

What’s that all mean? Well I’ll be damned if I know. If the last year has shown me anything it’s that I’m a terrible prognosticator. But, if I were a betting man, I’d say that Google is going to continue its trend towards being a playground for big brands and that small businesses selling physical goods will continue to migrate to the Amazon/eBay model. The service industries will hang on to their local/personal diversity for longer, but in the end the same logic will apply: for a small local business, is there any point at all in spending money on a website when its primary function is actually to be a line at the bottom of your business card?

Or in short: will the “free internet” be reduced to a wasteland of useless, expensive advertising hoardings?


Sidebar: search Google for “Google declining CPCs” and the first page is horrible. Apparently that’s a query that deserves neither quality nor freshness. That no fewer than 3 of the top 10 results for that query don’t mention “CPC” or “decline” at all is further evidence (if you’re so minded) that Google’s focus on search quality is less than stellar.

No: Google doesn’t index your meta description

I was asked recently whether Google actually indexed your meta description and I was about to say “of course!” when I had one of those rare flashes of caution and decided to check.

Our company’s home page has the natty meta description of:

“Trusted Dealers is THE SAFEST place to find and buy a second hand car online, with 10 points of difference to ensure you are happy with your deal.”

I know, I know – it could probably do with a refresh. Anyway, when you search Google for ‘trusted dealers’ it is this that displays in the SERPs.


Perfect, eh? Yet when you use the site: command to check whether it has been indexed, Google wags its finger and says no.


So that’s that cleared up and if anyone asks you, you can tell them ‘no’ and that I told you so.

But! The phrase does appear all over the internet. Remove the site: command from the query and no fewer than 129,000 matches are returned.

This happened because our blog was hacked for a brief while and a funnel page to a network of gambling affiliates was placed on the site. Someone then built a few hundred thousand links to this page – presumably using XRumer – and these links remain floating round the internet (incidentally: our massively expensive SEO agency didn’t notice this – I did, through a desultory check via …).

I don’t suppose there are many lessons from this except:

  • Don’t rely on an SEO agency to do everything for you
  • Update your WordPress often
  • Despite Google’s claims that it expunges bad content and links from its index, that’s clearly nonsense: three months after the hack, all those XRumer-built links and hacked blog posts, connected to an empty affiliate scheme in the gambling sector, remain in Google’s index.

I don’t imagine there’s much margin any more in doing this sort of thing, but if people are still making something of a living by auto-creating hundreds of thousands of pages at once and hacking WordPress, then I guess my musing the other week about the status of black hat SEO might be out of date in itself.

If you have the energy, you can probably do something with this collection of bits and bats. Sadly, I don’t.

Spoofed Referral Traffic in Google Analytics

The continued spoofing of referral traffic in Analytics highlights a couple of things:

  • Shortcomings in one of Google’s flagship products
  • The shift away from old-skool SEO for spammers to more subtle ways of gaining traffic

My hobby site (go visit it now, please) – even with its paltry visitor numbers (just shy of a couple of hundred per day) – gets a small but noticeable trickle of traffic from fake sources such as:


These are covered in good detail over at Refugeek and by Dave Buesing (both sources have some good tips for removing these sites from appearing in Analytics if you want clean, realistic visitor numbers).

The basic method relies on the fact that Analytics can be spoofed – tricking the unwary site owner into thinking they are getting actual human visitors from these sources. In fact, these are just faked visits by bots posing as browsers and passing false headers.
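For the curious, the mechanics are simple enough to sketch. Here’s a rough Python illustration of how such a bot might construct a fake pageview hit against the Universal Analytics Measurement Protocol – the tracking ID, page and referrer are invented placeholders, and nothing here actually sends a request:

```python
# Sketch of how referral-spam bots forge Analytics hits via the
# Measurement Protocol. Tracking ID and referrer are made-up examples.
from urllib.parse import urlencode
import random

def fake_pageview_hit(tracking_id, page, fake_referrer):
    """Build the query string for a single spoofed pageview."""
    params = {
        "v": 1,                               # protocol version
        "tid": tracking_id,                   # the victim's UA property ID
        "cid": str(random.randint(1, 10**9)), # a random "visitor" ID
        "t": "pageview",
        "dp": page,                           # page supposedly viewed
        "dr": fake_referrer,                  # the spammer's site, shown as a referral
        "ua": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",  # forged user agent
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)

hit = fake_pageview_hit("UA-12345-1", "/", "http://spam-example.com")
# A bot would now simply GET this URL - no visit to the site ever happens.
```

No browser, no page load: the “visit” goes straight to Google’s collection endpoint, which is why these referrals show 100% bounce rates and zero time on page.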

The motivation seems to be (as far as I can tell) to get site owners to visit these sites to see where their link is. Personal example: I started getting traffic from one of these domains and visited the site to see where/how/why they were linking to me. I couldn’t find anything, but noticed that they had some on-the-face-of-it useful SEO tools. I signed up for a ‘free account’ and then promptly forgot all about them, but they still send me emails asking if I want to upgrade to their pro package.

It’s a cunning sleight of hand when you look at it this way. In an easily scalable way, they can drive reasonable levels of traffic to their site by bringing themselves to the attention of anyone with Google Analytics installed. Once those people are on the site, the bait and switch takes place, and a certain number of them will sign up to the product. I imagine it’s probably profitable.

That’s obviously deceitful practice, but highlights how the nature of scamming has changed. As Google has made it harder and harder to spam the SERPs, so innovators/black hats (delete as per your prejudice) are looking for new routes.


A current fake referrer to my site disguises itself as Huffington Post. At first, I was briefly excited – perhaps I’d got a link from HuffPo! In fact, the referral itself was spoofed: the Huffington Post link – when clicked in Analytics – actually redirected to some Chinese shopping site, presumably dropping some affiliate cookies along the way to capture revenue from me should I ever do any shopping on (which is where the link actually redirected).

Update: on closer inspection, I’ve noticed that the URL is actually “”, which also explains how the redirect works.

It’s cunning stuff, to be sure, but I find it hard to believe that it’s a sustainable or large enough niche for anyone to make more than a few quid from. As I mentioned a couple of posts ago, it adds to my belief that black hat/affiliate sites are finally being shuttered by Google and the glory days of such operations are now behind us.

As such, we should actually tip a hat to Google in thanks. For many years, spammers and scammers tried – and succeeded – in keeping the SERPs cluttered with affiliate links dressed as content. Google announced their intention to do away with this years ago and now – if you want to go down that route – you have to go big on site quality and content. Of course, the high price of doing that makes most affiliate programs unsustainable, because building the necessary traffic levels can’t simply be left to content spinning and XRumer any more.


Is Blackhat SEO Dead?

I’m not plugged into the SEO grid any more these days. At my end of the market, there is very little point in engaging with anything remotely dodgy and much of the work is curatorial or carefully technical. From time to time, though, I descend from my ivory tower and pop onto the blackhat forums to seek out interesting snippets that might inform decisions we take inside the business.

And I can’t remember the last useful lesson I took away from these forays.

Example: at one time, Bluehatseo was a must-read site, packed with interesting ways to leverage content and build at industrial scale. It wasn’t something I ever did myself, but it gave me ideas and also meant I could talk sensibly with the more aggressive side of the SEO community (god, I hate that phrase). It also seemed to work. Whether or not Eli was kidding us all, I knew anecdotally of several people making a good living on the margins of Google – moving from market to market, building one from the other till their second incomes became their first incomes and even living off the results of their affiliate schemes.

I don’t get that vibe any more. It seems that times have changed – perhaps even that Google have won their long war of attrition against the “spammers” (as they defined them). You still get the odd lonely ranter complaining in the comments under every Searchengineland blog post about how crazy it is that some of their pages have lost traffic while others have gained, but somehow you know that their “25% drop” in traffic means a dip from 18 people to 14 people or whatever.

I had the honour of working alongside Dave Naylor at Bronco – a one time King of the Black Hats whose ability to spot and exploit a hole in Google’s algorithm was peerless. Today I don’t think he’d touch black hat with a bargepole – not merely because he now occupies a different space, but because the margins just aren’t there any more. Even while I was at Bronco, at least half the work coming in was from people trying to escape from under penalties they’d brought down on themselves.

(As an aside: get me that job at Searchengineland that consists purely of rewriting each Google announcement and transcribing their Webmaster videos – that’s some serious value-add right there, my friends)

And you know what? I welcome that change. I rarely visited an affiliate site and felt enriched by the experience. It annoyed the hell out of me to sit next to my wife while she was shopping and see her going to click on what would clearly be an affiliate site before trying to find what she actually wanted.

And as an SEO, what could be worse than negotiating link prices from a faceless Estonian blogfarm owner?

Of course, the legacy of the spam wars is still with us. There are still bots mindlessly plugging Ugg boots on comment threads everywhere (I conceded defeat on my own blog recently and installed Disqus) and people buying and selling links by the thousand, but the more I look the more it feels like these are the last shots in a war that has concluded – a sort of digital version of the Continuity IRA.

I know I have (by approximation) zero readers, but if you are a blackhat making good dollar from it as we turn the corner into 2015, I’d be interested to hear about it.

Revisiting Schema

I have always been pretty dismissive of Schema. To me, it was and is an exercise in futility. The number of webmasters with the time, knowledge and inclination to enact Schema tags is a tiny fraction of the publishing audience, and thus its impact was never going to change the face of the world.

In addition, Schema itself is pitifully incomplete and actually retrograde, in that it attempts to force responsibility for telling search engines what a page is about onto site owners, rather than forcing the search engines to get better at what they’re supposed to be getting better at.

There’s also the notable potential side effect of Google gradually taking “ownership” of information away from sites. Google’s “Knowledge box” has, for some time now, slowly been taking traffic away from Wikipedia. Once Google “knows” something (the classic example being the height of the Eiffel Tower) it increasingly displays that information itself, front and centre, rather than merely giving you a bunch of links to parse for yourself.

And actually, if you are sat in Mountain View that makes sense. The height of the Eiffel Tower is known, and is exactly the kind of thing that people could try to “spam” to get some AdSense revenue. Once you’re confident you have the right answer in your “knowledge box” database, why run the risk of polluting your own reputation by sending people to potentially disreputable sources through the SERPs?

Everyone has experienced that moment when you’ve Googled something, clicked the first link and taken the answer as true, only to find later on that it was actually just rubbish. Not for nothing is Yahoo! Answers a poisoned chalice.

All of which brings us down to the subject of Schema. Schema is sold as a way for you to add a structure to your site to allow Google to get that knowledge directly, thus contributing to their knowledge graph.

As an SEO by trade, it would be remiss of me not to be dabbling with it to see what benefits and potential pitfalls it could have.

Enacting Schema

Firstly, it is worth pointing out that Schema isn’t brilliantly documented. For something backed by some of the biggest names in tech, it is (ironically) presented as if the internet hasn’t changed since 2004. It is text-heavy. There are no walk-through videos explaining the potential benefits. In this sense, it reminds me a lot of the W3C website – probably appealing to geeks, but lacking the sense of ‘fizz’ necessary to draw in the casual users who will absolutely define the success or failure of Schema or web standards (I once wrote at length about why there is no such thing as “web standards” but my ex-employer has deleted the post – I will revisit the subject in the future).

As such, the technically minded can pick their way through to discover what it is you have to do to enact Schema. It’s basically adding a load of additional attributes to HTML elements and (disappointingly) adding additional <span> tags around things to fulfil Schema’s structure.

To see some examples, check out the <a href=””>source code of this page</a>. Apologies for my poor coding standards overall – it’s been many years since I considered myself to be a developer. As you can see, there are various additions to the code like:

<span itemscope itemtype="" itemprop="suggestedAnswer"> and <div itemprop="aggregateRating" itemscope itemtype="" id="rating">

That sort of thing.

Obviously, it’s fairly trivial to add these additional bits of code, but it’s another layer of work to add to your to-do list, and it does add to nesting and tag redundancy, which runs counter to everything we’ve been told to strive for over the last decade.

And as such, it must compete with other less trivial matters like writing content, maintaining a database, running plug-ins, refreshing the design, promotion etc. So I imagine that unless there is a strong imperative, deploying Schema is going to be well down the list.

Secondly, while deploying the bits of code necessary to enact Schema is fairly trivial, understanding the way that a Schema ‘object’ is constructed is often very frustrating. In my original critique of Schema, I harangued the void about the limited range of things available. The Schema for ‘person’, for example, barely touches on what a person can be and is heavily skewed towards the professions.

And just try understanding why “diet” is a property of “person”.

The best way to test your Schema code is trial and error, with constant checking through Google Webmaster Tools’ structured data tester. Some of the “errors” are baffling – for example, being told that an “event” object must be in the future (a proviso I got over by simply ignoring it).
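As a small aid between rounds with Google’s tester, it’s easy to knock together a local sanity check that catches the obvious slips (a mistyped itemprop, a span that never got its itemscope). This is only a rough sketch using Python’s standard-library HTML parser – the sample HTML is made up for illustration, and it’s no substitute for the structured data tester itself:

```python
# Rough local check: list itemscope/itemtype/itemprop attributes in a page.
# Not a substitute for Google's structured data tester - just a quick scan.
from html.parser import HTMLParser

class MicrodataScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.items = []  # tuples of (tag, itemtype or None, itemprop or None)

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemscope" in attrs or "itemprop" in attrs:
            self.items.append((tag, attrs.get("itemtype"), attrs.get("itemprop")))

# Invented sample markup, loosely modelled on an aggregate rating block.
html = '''
<div itemscope itemtype="http://schema.org/AggregateRating" id="rating">
  <span itemprop="ratingValue">4</span> out of
  <span itemprop="bestRating">5</span>
</div>
'''

scanner = MicrodataScanner()
scanner.feed(html)
for tag, itemtype, itemprop in scanner.items:
    print(tag, itemtype, itemprop)
```

Run against your own pages, a quick scan like this at least confirms the attributes you think you added are actually in the output HTML before you argue with the tester about them.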


So. Having “done” Schema for my site, what are my findings? Honestly, it’s hard to tell. Positive aggregate star ratings always look pretty attractive in the results, so I have no doubt that my primitive voting system allied to Schema is helping to improve my clickthrough rates. Aside from that, though? I can’t say there’s been any notable benefit. I am deliberately not promoting the site as part of my experimentation, beyond automatically tweeting each new post and contributing a couple of comments to threads on other sites. As such, there is little wonder that my traffic remains below 150 visits a day (itself a riposte to those who would have you believe that active SEO promotion is a waste of time). Since enacting Schema there has been no noticeable leap in the gentle upward slope of traffic, so claims that “doing Schema” will in itself transform your website’s performance are probably misplaced.

Nonetheless, enacting Schema has made me think more deeply about the way that data is structured and how I build content. I couldn’t recommend it in all good conscience, but as part of a broad effort to give Google what it claims to want, it is probably a tick worth having if your data in any way fits into any of the available schemas.

And, at the back of it all, lurks the suspicion about what happens if all your markup and the trust it helps to build leads to your site being hijacked by Google itself. For argument’s sake, let’s pretend that my article on the Yorkshire Ripper becomes the definitive overview. With so much Schema data in there – geo-coordinates and dates of his attacks, properly attributed images, factually correct dates, etc. – Google could unilaterally decide to take that info as gospel and simply pull it through into their knowledge box. And what then for my traffic…?

In conclusion: as an exercise, Schema is worth thinking about and experimenting with, but as a long-term venture it comes with risks that probably at least equal the potential benefits.

Fare well, Google Authorship…

So it’s a fond farewell to the occasional little portrait that accompanied things you submitted via or wrote on Google+. It’s officially dead in the water.

I’ll just pause here for a moment for you to dry your eyes.

The whole thing was always a little bit shonky. The take-up was low, the benefits seemingly minimal, and it became yet another thing used solely by SEOs to try to improve their rankings in organic listings.

In some ways, this highlights once again the shortcomings of Google’s mission to ‘organise the world’s information.’ It was easy enough to set up a profile if you could be bothered, but doing so didn’t somehow magically confer authority on either you or your content. In effect, some no-mark from Leeds like me could get their fizzog into the rankings alongside Polly Toynbee or Robert Scoble or whoever.

But that was just an attribution and a tiny sprinkle of glitz in the SERPs. It didn’t suddenly make you an expert in whatever you were talking about. It wasn’t a signal of quality or… anything, really. Just occasionally an extremely mild tingle of delight at seeing your face (or that of a friend) in the rankings.

And in a way, this only further serves to highlight the problem that Google has in the social space. I predicted (in a spirit of larkfulness) that Google+ would be dead in the water by 2013. I was obviously wrong, but only by a matter of time. Google cannot attract users to seriously engage with its space. The game has been lost, and really all that is left is a disorganised retreat. The abandonment of Google Authorship is merely a waymarker on that long and dismal road.


Google unnatural links warning


When Trusted Dealers was founded, an agency was chosen to ‘do SEO’ on the basis of work they had previously done in the industry – effectively for a competitor. When I took over the running of the site, it took me all of 4 weeks to decide that the work was of highly questionable worth.

When I finally extracted from them a list of links they had “built”, it was clear that they had been fishing in the very bottom of the pool and we parted ways. In fact, apart from honouring a few pre-existing deals, we effectively stopped ‘linkbuilding’ as a discrete activity.

Despite that, I’ve known for a long time that we have a batch of bad historical links and have been waiting for notification from Google that they were “onto” us. Today we finally got the dread warning through Google Webmaster Tools.

Interestingly, it took a form I personally haven’t seen before – including this passage:

“We do realize that some links may be outside of your control. As a result, for this specific incident we are taking very targeted action to reduce trust in the unnatural links.”

I’m pleased about that. I’m 95% certain which links they’re referring to (there is no coincidence in timing) and also believe I’ll be able to get them removed.

What does it spell for an industry where I know for a FACT that several ‘top ranking sites’ are engaged in massive linkbuilding programs right this minute? Big change, I suspect. Having been courted by several companies offering their ‘expertise’ in this particular vertical, I expect to see some big smack-downs being delivered to people. All of which reminds me that I have a draft post about a typical industry experience, which I will finish soon so you can see under the hood of what actually happens.

Interesting times.