The Brand’s the Thing

It sometimes feels as though the writing has been on the wall for aggregators for as long as there’s been a wall. Philosophically, this is because they are perceived to be (or painted as) taking value from both ends of the chain:

  • Retailers must pay them for sales/leads/traffic (think: affiliate fees)
  • Advertising channels lose revenue to them as they offer a competing model (think: every use of the Booking.com app costs Google lost advertising opportunity)

As such, neither end of the chain likes them. Retailers want to squeeze aggregators to ensure they’re getting maximum value – hence the increasing importance of the role of analyst for retailers, desperate to know what they’re getting for their money and/or to chip away at their fees. In turn, the mechanisms designed to reward aggregators (affiliate/referral fees) are constantly downgraded.

And meanwhile, advertising channels want to crush them altogether. Google finds ways to de-rank aggregators all the time on spurious grounds like “thin content” and “no added value”, all of which is intended to push retailers back towards the deathly embrace of AdWords account managers selling the use of broad match keyphrases and high bids based on smoke-n-mirrors ‘quality scores.’

The counterpart to all this is that aggregators do have genuine utility – and thus value – to the ironically least-valued part of the chain: consumers.

Confused and GoCompare etc are mere aggregators in the technical sense, but where else, realistically, would you head to get an insurance quote? From a consumer perspective, the experience is near perfect: put in what you want, get transparent market prices and utility, and act accordingly.

But even here, you can see how their role as honest brokers is under attack at both ends of the chain.

It might have come to nothing, but Google bought comparison engines and promoted them ahead of the organic listings for these sites. Meanwhile, the quotes you see on the site are increasingly just starting prices onto which the retailers ladle extra options – forcing you to phone them and navigate your way through an extended sales pitch to a higher price than the one you were shown.

So, although aggregators are often depicted as agents of bad faith (Google: “they offer nothing to the customer experience!” Retailers: “why should I pay extra for a sale I would have made anyway?”), more often than not they are at the mercy of attacks from these two directions. In most cases, it’s not a battle they can sustain. In the end, most fold and the retailers’ advertising money goes straight to Google (where, lest we need reminding, profit has to be found in a system designed to drive margins to zero).

Now, it is true that I work at an aggregator, so you can see where my personal slant on this is coming from: we’re effectively locked out of the SERPs by Google whitelisting three or four major aggregators and splitting the remaining long-tail traffic between retailers’ own sites and non-competitors such as review sites.

It’s also true that I’ve sat through a lot of pitches over the last couple of years where SEO salesmen have said that an aggregator site like ours can link-build its way into this space with relatively trivial budgets and some technical tweaks.

Well, a few years ago I was on that side of the desk making that pitch myself, but sitting at this side of the desk you see how pressure from the people who advertise with you turns your notional budget into a zero-sum game: to develop a strong SEO position means taking cash from a budget that is constantly hammered for Results! Today! and thus finds its way into AdWords instead. It is, simply put, a circle that cannot be squared. Growth takes time, and time is money – and this is why aggregators are trying to bypass the online game by going through TV.

Trivago advertise all the time to human eyeballs in human houses in the hope that I’ll remember them next time I’m booking a hotel (incidentally, this is why things like rebranding Yell.com to… whatever it is now – and I genuinely can’t remember – are the height of stupidity). And why? Because they are building a brand.

People have been talking about ‘brand’ forever now (shit, I was writing about this 8 years ago!) but it remains the single most pertinent thing: in a cacophony of 24/7 advertising, viral spoofs, newsjacking, PR shills, social media puff and manufactured conspiracy, who retains market share? Brands.

So when you look out of the window into your competitive space in 2018 that should be your starting point: brand. Do you have one, and if not – how will you get one?


Google’s Declining CPCs

Google’s CPCs have declined yet again. 9% year on year, 15% quarter on quarter. They still trousered enormous profits, and once again the market responded by catapulting their share price even higher.

But that decline in CPCs is quite an important little niggle. I’ve remarked before that the profit margin for fungible goods trends to zero, meaning that at some point advertising on the pay-per-click model becomes unsustainable. Ultimately, only a handful of big retailers have enough play in their margins to fund advertising with a sub-5% conversion rate on desktop, and an even lower conversion rate on mobile.

This is why Google are scrambling to sell bigger ads and looking for ways to crank up mobile CPCs, with the same ultimately self-defeating logic. If Google take more money out of the market, then fewer retailers can compete. And if fewer retailers can compete, CPCs have to come down. Economics 101.
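To make that concrete: AdWords is, at bottom, a generalised second-price auction – each winner pays roughly the bid of the advertiser ranked below them (quality scores and the other wrinkles ignored here). A toy sketch with invented bids shows what happens to clearing prices when the field thins:

```python
# Toy generalised second-price auction: each winner pays the bid of the
# advertiser immediately below them (quality scores ignored for simplicity).
def clearing_prices(bids, slots):
    ranked = sorted(bids, reverse=True)
    return [ranked[i + 1] for i in range(min(slots, len(ranked) - 1))]

crowded_market = [2.00, 1.80, 1.50, 1.20, 0.90, 0.60]
thinner_market = [2.00, 1.20, 0.60]  # the burnt retailers have left

print(clearing_prices(crowded_market, 3))  # [1.8, 1.5, 1.2]
print(clearing_prices(thinner_market, 3))  # [1.2, 0.6] – same slots, lower CPCs
```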

In a normally functioning market, falling CPCs should entice other retailers back in, but a lot of retailers have been so burnt that they’ve either gone out of business, or would rather just suck up transactional fees on eBay or Amazon – which they only accrue on an actual sale, rather than spending lots of money getting people to a website that might not even convert.
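The arithmetic behind that preference is stark enough. A sketch with invented figures (nobody’s real rates or fees):

```python
# Cost per sale: paying per click vs paying a marketplace referral fee.
# All numbers invented for illustration.
cpc = 0.50               # £ per click
conversion_rate = 0.02   # 2% of clicks become sales
sale_price = 50.00       # £
referral_fee = 0.15      # assumed marketplace cut, e.g. ~15%

ppc_cost_per_sale = cpc / conversion_rate              # £25 – paid whether they buy or not
marketplace_cost_per_sale = sale_price * referral_fee  # £7.50 – only on an actual sale

print(f"PPC: £{ppc_cost_per_sale:.2f} per sale")
print(f"Marketplace: £{marketplace_cost_per_sale:.2f} per sale")
```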

By shifting to Amazon or eBay, all the usability work is done and they effectively get brand protection, because people will think “I bought this from Amazon” and not “I bought this from Company X through Amazon.” Another reason, then, for businesses to wonder why they should subsidise a brand and a website when there’s a whole ecosystem to sell through, with a captive audience, and where agencies can’t intervene with fees and advice that might turn out to be wrong.

Average CPC itself is a fairly useless metric to look at, but it does point to the bind Google now finds itself in. To increase revenue in a time of decreasing CPCs, they can only increase the number of ads or the eyeballs they serve – hence their ever-increasing cycle of acquisitions and search for revenue streams outside paid search. When you look at Google Play, Android Store and the various automotive initiatives, you can see what they’re gunning for: alternative revenue streams and a way to stay inside the customer journey in a way that appeals to advertisers with sufficiently deep pockets.

Case in point: the rolling out of the “store visits” metric. Google want to show that people who searched on Google then turned up in store – presumably with the intention of buying. Android users, or people logged into the Google app, can be tracked physically. If I search for toasters on Google today, then turn up in John Lewis or Argos within a couple of weeks, that is a demonstration of Google’s value to a retailer. It’s a high-tech version of a coupon in the local paper.

There’s a hint of desperation to that, in my mind. Almost any purchase with a research cycle is going to involve Google today, so the unique attribution (and therefore value) is pretty thin. Consequently, there’s not much value to the retailer to warrant spending more on Google.

Google is now so much of an infrastructure that it’s almost like the Highways Agency saying you should advertise on a billboard because people drove to your shop on a road they built. It’s true, but doesn’t quite add up.

And, irony within irony, Google’s conscious strategy of favouring big brands means that SEO and PPC alike make it almost inevitable that anyone searching on Google will encounter a big brand, even if the big brand spends a relatively minimal amount on either of these approaches.

In effect, Google are increasingly selling AdWords to a smaller subset of customers who actually don’t rely on Google in the first place. If you are a small business selling physical goods, you cannot compete on price in an auction-based system. And the sort of proof-of-attribution model Google are working towards is useless if you are a single site selling nationally through the internet, as your footfall is effectively zero. You’re also locked out of the SEO sphere by the big brands, who can afford massive marketing pushes and who escape penalties that would literally finish you, because Google has to show them for the sake of its own credibility.

What’s that all mean? Well, I’ll be damned if I know. If the last year has shown me anything, it’s that I’m a terrible prognosticator. But if I were a betting man, I’d say that Google is going to continue its trend towards being a playground for big brands and that small businesses in physical goods will continue to migrate to the Amazon/eBay model. The service industries will hang on to their local/personal diversity for longer, but in the end the same logic will apply: for a small local business, is there any point at all in spending money on a website when its primary function is actually to be a line at the bottom of your business card?

Or in short: will the “free internet” be reduced to a wasteland of useless, expensive advertising hoardings?

—————————————————

Sidebar: search Google for “Google declining CPCs” and the first page is horrible. Apparently that’s a query that deserves neither quality nor freshness. That no fewer than 3 of the top 10 results for that query don’t mention “CPC” or “decline” at all is further evidence (if you’re so minded) that Google’s focus on search quality is less than stellar.

I resent AMP

[screenshot: AMP results at the top of Google’s mobile SERPs]

Firstly, the bog-standard explanation of AMP:

  • What’s it stand for?
    It stands for Accelerated Mobile Pages.
  • And what’s that?
    It is basically another web ‘standard’.
  • …consisting of?
    A limited version of HTML that results in really lean pages that load faster (basically for mobile) and – perhaps more importantly – don’t contain as much crap as normal web pages.
  • And I should care because?
    Google are pushing the technology as a way for the “open web” of normal websites to easily produce content along the lines of the publisher platforms that Facebook and Apple have built to sit exclusively within their ecosystems.
  • So what does that mean to a regular schmo?
    You’ve probably seen results like the accompanying screenshot at the top of this post if you’ve used Google on mobile in the last year or so.
  • I sense you are ambivalent at best about this?
    Yup. And now I’d like to tell you why.

It’s Yet Another Web “Standard” That is Anything But

I’ve said it before and I’ll say it again: there are no such things as web standards. Not by the properly-accepted definition of the term anyway.

For historic and practical reasons, browsers are extremely forgiving about what they will render. Properly done, HTML would always have been a subset of XML and thus would have broken when written incorrectly. Missing a closing tag? “This page cannot be displayed.” Of course, this doesn’t happen, because in the nascent days of the web every other page was hand-coded or written in something like the Geocities editor, so if you wanted people to use your browser it had to be capable of interpreting the resulting code-soup as best it could – otherwise no one using it would be able to read pretty much anything on the internet at all.
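You can watch the difference in a few lines of Python’s standard library: the strict XML parser chokes on a missing closing tag, while the HTML parser shrugs and carries on:

```python
# Strict XML parsing vs forgiving HTML parsing of the same broken markup.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<html><body><p>Missing a closing tag</body></html>"

try:
    ET.fromstring(broken)
except ET.ParseError as e:
    print("XML parser:", e)  # mismatched tag – "this page cannot be displayed"

HTMLParser().feed(broken)    # no complaint: browsers repair code-soup silently
print("HTML parser: carried on regardless")
```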

Secondly, most alleged web standards are weak. You only have to look at the five-year box model wars (which are probably still being fought somewhere) to realise that the “standards” are open to different interpretations.

All this holds true for other alleged standards such as Schema, which I have also pontificated boringly about before.

So partly, my response is simply a yawn of boredom. I’ve been building web pages for getting on for two decades now and every year someone wants you to adopt some new standard for some tedious reason that ultimately is about making them money ahead of you.

In this case the justification is to create fast, lean mobile pages. But you know what: we already are. And if you’re not? Google penalises you in the mobile SERPs anyway.

Practical Maintenance

We’re happily in a place where – finally – any halfway decent designer can make a nice responsive page design that renders well on mobile, tablet and desktop, thanks to the various CSS techniques and the general level of agreement that now happily exists between browser vendors.

“Write once: display anywhere” is a nice little mantra. But with the advent of AMP, developers are being asked to create two versions of the same page, as seen in the guidelines. Once more, Jo Muggins is given yet another opportunity to unwittingly send her site to SEO oblivion.

[screenshot: Google’s AMP guidelines showing the two required page versions]

Probably by the time you’re reading this (disclaimer: I haven’t been exactly quick off the mark to talk about this) there’ll be two billion duplicate pages floating around the internet, and someone will be forgetting to update the AMP version and breaking whatever traffic they’re getting from it.
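If you do end up running parallel versions, the pairing is at least machine-checkable: the documented convention is that the canonical page declares its AMP twin via rel="amphtml" and the AMP page points back via rel="canonical". A minimal audit sketch – the URL is hypothetical, it needs the third-party requests package, and the naive regex assumes conveniently ordered link attributes:

```python
# Minimal sketch: check a page's rel="amphtml"/rel="canonical" pairing
# still points both ways. Hypothetical URL; requires `pip install requests`.
import re
import requests

def link_href(html, rel):
    # Naive: assumes rel comes before href within the <link> tag.
    m = re.search(rf'<link[^>]+rel=["\']{rel}["\'][^>]*href=["\']([^"\']+)', html)
    return m.group(1) if m else None

canonical_url = "https://example.com/article"  # hypothetical
amp_url = link_href(requests.get(canonical_url).text, "amphtml")
if amp_url is None:
    print("No AMP version declared")
else:
    points_back = link_href(requests.get(amp_url).text, "canonical")
    print("Pair intact" if points_back == canonical_url else "AMP pair broken")
```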

A Further Drift Away from the Point of the Internet

This is more of a moral-philosophical point about the evolving nature of the internet. Facebook are increasingly acting as a publishing platform rather than a social media site. It’s obvious why: you allow any Tom, Dick or Harry to go linking around to any old website on the internet, and you’re leaking visitors. You want them to stay on Facebook where you can monetise them. So naturally, FB have introduced a bunch of stuff that is effectively native publishing. Write your sordid article for your bottom-feeding website, but then republish it on Facebook according to their standards – and for no other purpose at heart than to make Facebook money (of which they generously allow you to keep a cut). This is the “walled garden” of the internet – and the backdrop to the almighty clash of the internet titans as to who controls your eyeballs and so, ultimately, advertising spend.

AMP is similarly pitched by Google – but mainly to keep people on Google and thus away from Facebook. If a story appears only on Facebook, then Google might not even be able to access the damn thing, which spells trouble for them.

Hence the urgent desire to get people to publish a version for Google as well. Once again, your run-of-the-mill web manager is having their job role expanded to fulfil the whims of the colossal tech behemoths that run the show. Google push AMP pages to the top of the SERPs, so as per usual the people with the resources necessary to implement it get the benefit and the bit-players sink even further down the food chain. Google will certainly seek to dangle AMP as a mobile ranking factor so SEO guys can hawk it to their clients. But good luck to you trying to get your newly-minted AMP content anywhere past the big publisher sites.

And will it even last? Well, the track record on these things isn’t great. Witness the recently announced death of Google Authorship – which was created for similar reasons and created a similar amount of ultimately pointless work. A thousand and one articles are still out there telling webmasters how to leverage it to their advantage, even though Google killed it last week. Even the mooted problem that AMP is there to fix – fast, responsive sites for mobile – is probably going to be swept away with the arrival of 5G inside a couple of years.

Despite all of these forebodings and misgivings, I will, of course, be joining the serried ranks to implement AMP, because that is our lives now: doing as Google commands.

Aaaarrrrggghhhhlgorithms

It’s already a well-worn trope that algorithms are good for some things (processing huge amounts of data) and bad at others (anything to do with human interaction) and yet Instagram has now joined Facebook, Twitter and Uncle Tom Cobleigh in rolling out an algorithm that purports to display the ‘most important’ things in your feed.

In short: stop it.


In long: It began – as many things do – with Google. Google were the first people to do a good job of automating the process of crawling and ranking websites in response to queries.

Prior to that, things like Yahoo, DMOZ, Best of the Web etc used human eyeballs to judge the quality of sites and pop them into categories. And guess what? Humans are both imperfect and corruptible and trying to put entire websites into one category is often impossible (hence my long-running belief that Schema is a backwards step). I can still just about recall the days when getting a site onto DMOZ was phase 1 of an SEO campaign, and meant trying to find someone who either accepted anything that was put in front of them, or someone who would accept anything that was put in front of them alongside a brown envelope with some cash in it.

So Google’s programmers wrote an algorithm. It followed every link to see where it led, and added that place to its index. Then it calculated the importance of pages based on the number of links, and the rest you know (i.e. their swift rise to total dominance made the internet accessible to all, and also hopelessly corrupted its very nature, turning everything into a commercial shitstorm and an entire economy based on the whims of this algorithm).
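For the record, the bare bones of that first algorithm – PageRank, minus nearly two decades of refinements – fit in a dozen lines:

```python
# Bare-bones PageRank: a page matters if pages that matter link to it.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# A four-page toy web where everyone links to A, and A links to B.
toy_web = {"A": ["B"], "B": ["A"], "C": ["A"], "D": ["A"]}
print(pagerank(toy_web))  # A hoovers up most of the rank
```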

So. Algorithms have a place. It would be an act of absolute folly to try to replicate what Google does with humans. Google still pay humans to do a bunch of testing, but RankBrain is the first tolling of the bell for those guys.

Google search engineers, who spend their days crafting the algorithms that underpin the search software, were asked to eyeball some pages and guess which they thought Google’s search engine technology would rank on top. While the humans guessed correctly 70 percent of the time, RankBrain had an 80 percent success rate.

As most of the commercial web hands over usage data to Google through Analytics, and drives traffic to their web pages via AdWords or other Google properties, so mass data tools can be used to supplant human imperfections. If a site has a high bounce rate for a particular query, Google might fairly surmise that that site is actually not suited to that query and start to drop it down the rankings. Finding a replacement for ‘linkjuice’ has probably been Google’s top priority for years now, and each turn of the ratchet brings the end game closer.
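Nobody outside Google knows what that machinery actually looks like, but the shape of the idea is simple enough. A purely speculative toy – emphatically not Google’s real algorithm:

```python
# Speculative toy: demote results whose bounce rate for a query is well
# above the average for that query. Invented data throughout.
results = [
    {"url": "a.com", "link_score": 0.9, "bounce_rate": 0.85},  # strong links, bad fit
    {"url": "b.com", "link_score": 0.7, "bounce_rate": 0.30},
    {"url": "c.com", "link_score": 0.6, "bounce_rate": 0.25},
]

avg_bounce = sum(r["bounce_rate"] for r in results) / len(results)
for r in results:
    penalty = max(0.0, r["bounce_rate"] - avg_bounce)  # only punish the outliers
    r["adjusted"] = r["link_score"] * (1 - penalty)

for r in sorted(results, key=lambda r: r["adjusted"], reverse=True):
    print(r["url"], round(r["adjusted"], 3))  # a.com's linkjuice no longer saves it
```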

Naturally, Google having set this tone means that every company wants to have an algorithm, BECAUSE ALGORITHMS. But it’s not always clear who these algorithms are meant to serve, or to what end outside the very specific needs of Google.

Twitter and Instagram are two big brands that have recently rolled out algorithms in their products that actually serve to defeat their very nature.

An example: I follow a bunch of people on Twitter – from friends with a handful of followers to big accounts who tweet (seemingly 24/7 – get a life, guys!) about the search industry. Trad Twitter just showed everything in chronological order, which is perfect for the medium. How any algorithm is going to determine relevancy, no one outside Twitter’s engineers knows. While I’m not a programmer any more, I do know that all they’ll really be doing is chugging data. The algorithm will pick tweets for me from the people I follow based on one or two things (a guess I’ve sketched in code below this list):

  1. Tweets which have had lots of engagement (I wouldn’t know about that: my stats are lousy)
  2. Tweets from people I most commonly interact with (which is about three people).
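The sketch, for what it’s worth – invented weights, invented numbers, entirely my guess:

```python
# My guess at the shape of the thing: score tweets by raw engagement plus
# how often I interact with the author. Weights and data are invented.
def feed_score(tweet, my_interactions, w_engagement=1.0, w_affinity=5.0):
    engagement = tweet["likes"] + tweet["retweets"]
    affinity = my_interactions.get(tweet["author"], 0)
    return w_engagement * engagement + w_affinity * affinity

my_interactions = {"close_friend": 40}  # replies/likes I've given them, roughly

tweets = [
    {"author": "TheWarNerd", "likes": 12, "retweets": 3},
    {"author": "celebrity", "likes": 9000, "retweets": 4000},
    {"author": "close_friend", "likes": 2, "retweets": 0},
]

for t in sorted(tweets, key=lambda t: feed_score(t, my_interactions), reverse=True):
    print(t["author"], feed_score(t, my_interactions))
```

Run it and the celebrity tops the feed, while the quiet account I never interact with sinks to the bottom. Hold that thought.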

And why are they doing this?

Ostensibly to ‘improve’ Twitter so it behaves more like Facebook and thus can attract the idiots who populate that horrid corner of the internet. In reality, we all know that they’re ramping it up so they have a further means to shove advertising in. At first it will be indirect (“this tweet from Celebrity X was amazingly popular”) but then as Twitter’s finances continue to get worse they’ll just use it to sell another slot to advertisers until eventually they give up and sell to Yahoo! for them to hammer the final nails into its coffin.

And so it will come to pass with Instagram, Snapchat and whatever-the-fuck the “next big social media site” is. (Hint: not Google+.)

And why is this a bad idea?

There are some people I follow who I never engage with and who don’t have big follower counts, but whose content sparks trains of thought that otherwise might not cross my mind. Someone like @TheWarNerd tweets infrequently and has a relatively low follower count (in the scheme of things) of under 10,000 – but I wouldn’t want to miss a single tweet.

You know how this plays out without me having to type it. Shorn of metrics to analyse the way I follow him, Twitter will probably conclude that he is ‘less important’ and shove his tweets down the line. Instagram will do exactly the same thing and it will suck for exactly the same reasons.

Why does this keep happening?

I have a bit of a theory about why big tech companies start to “improve” their products until such a point that they become an unusable mess and die. It’s because of the typical life cycle of a platform and the peril of having a team of immensely talented idiots around the place.

  1. Great idea
  2. Great PR begets exposure
  3. Exposure begets big investment from someone
  4. Big investment means recruiting a whole bunch of Really Smart Guys to help get things scaled up
  5. A whole bunch of Really Smart Guys begets boredom when the scaling has been done and the name of the game is administration
  6. Boredom begets dwindling PR exposure
  7. Dwindling PR exposure begets the need to announce things
  8. The need to announce things begets asking the Really Smart Guys to think of ways to ‘improve’ things.
  9. Really Smart Guys’ improvements beget disenchantment because Really Smart Guys don’t know shit about how humans work
  10. Disenchantment begets falling user numbers
  11. Falling user numbers mean Microsoft or Yahoo! buys them – partly to get the Really Smart Guys and partly because they can’t think of their own ideas
  12. Someone realises it’s all been a huge waste of money
  13. It shuts

I don’t even know why I’m writing this, or where it’s going – other than it’s been almost a year since I wrote anything on my blog and one simply must keep up with these things so no one sees behind your carefully constructed facade to find the jaded 40-something web manager within.

Anyway, the point still stands: take yourself and your stupid algorithms and get in the bin.


Google in the mobile ecosphere

Mobile is a problem for Google. It’s a paradigm shift that few saw coming just 5 or 6 years ago, but the launch of the pocket web has begun to completely reshape our experience of the internet, which sites we interact with, and how we organise our spending. There are two reasons this creates problems for Google.

Less space for ads + a different user experience

Here is a current screen cap of a typical Google search for “used ford kuga”

[screenshot: desktop SERP for “used ford kuga” with ads highlighted in yellow]

I’ve highlighted the ads in yellow. Of the 10 available positions on the page, no fewer than 8 are paid ads. Given this experience, there is plenty of choice for the consumer even without scrolling down. In the desktop-only world in which Google’s ad model was conceived, this is wonderful – as you have many advertisers, all trying to appear in those top 8 slots, plus another 3 or 4 willing to appear further down the page.

On mobile the story is different. Here is the same result – again with the ads highlighted in yellow.

[screenshot: the same search on mobile, ads highlighted in yellow]

As you can see, the amount of screen space given over to ads is similar – with only one organic result. But that screen space includes just 2 ads.

That means greater competition for those two spaces, which in theory means higher CPCs for those wishing to compete in that space.

But, from a user perspective – and assuming we value choice – there is very little utility there. If I want to see diversity, I scroll. And here’s the rub: we DO scroll with our thumbs.

The old rubric about organic listings in particular was that most of the traffic went to the top 3 sites and that anywhere further down the list was almost invisible (I exaggerate a little). But that was driven in part by the mouse-point-click metaphor that belonged to the desktop. With a scrolling interface, we are accustomed to the easy flick of a thumb to see more: hell – the interface demands it.

Thus, on mobile, it is likely that fewer people will actually click the ads as they will assess what they see and move down the page.

In short, whereas equity on desktop is split across potentially 12 different ads, the opportunity for Google on mobile is less. Even if you include the bottom of the page that’s still just 3 additional slots.

[screenshot: the bottom of the mobile page, with 3 more ad slots]

5 slots instead of 10-12 means fewer opportunities for clicks. All things being equal, Google would have to see double the CPC or CTR from these ads to generate the same revenue from a mobile search as from a desktop search. And here we hit the wall of reality: in most markets, vendors are selling the same product with the same costs and the same margins. Investors might have been impressed by the stunning growth of the internet and Google’s revenues, but very real limits exist driven by real-world costs.
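The back-of-an-envelope version, with every number invented for illustration:

```python
# Revenue per search, desktop vs mobile (all numbers illustrative).
def revenue_per_search(slots, ctr_per_ad, cpc):
    return slots * ctr_per_ad * cpc

desktop = revenue_per_search(slots=10, ctr_per_ad=0.03, cpc=1.00)
mobile = revenue_per_search(slots=5, ctr_per_ad=0.03, cpc=1.00)

# Half the slots: mobile CPC (or CTR) must double just to stand still.
print(round(desktop, 2), round(mobile, 2), round(desktop / mobile, 1))  # 0.3 0.15 2.0
```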

If I can buy Blue Widgets at £5 then so can my competitors. Having first-mover advantage on the internet might mean a window where I can buy clicks for 10p, sell the widget for £10 and thus make a profit according to my conversion rate. That doesn’t last, however. As more people come in, the market matures, margins narrow and thus the money available to spend on clicks declines. According to economic theory, the marginal profits on fungible goods are effectively zero. No wonder then that Google’s CPCs have been in decline for some time.
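Run the Blue Widgets numbers and you can see why. The conversion rate below is my assumption (I didn’t give one above); the rest are the figures from the paragraph:

```python
# Break-even CPC for Blue Widgets: the most a click can cost before
# advertising eats the whole margin. Conversion rate is an assumption.
cost, price = 5.00, 10.00   # £ – buy at £5, sell at £10
conversion_rate = 0.02      # assumed: 2% of clicks become sales

margin = price - cost                      # £5 per widget
breakeven_cpc = margin * conversion_rate   # £0.10 – that 10p click, exactly
print(f"Break-even CPC: £{breakeven_cpc:.2f}")

# As competition narrows the margin, the affordable click shrinks with it:
for m in (5.00, 2.50, 1.00, 0.10):
    print(f"margin £{m:.2f} -> max CPC £{m * conversion_rate:.3f}")
```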

This is probably a peek behind the curtain at the resurgence of brand building and display – none of which favours Google.

Apps are… better

Compounding Google’s problem on mobile is the very core of the mobile experience: the app.

I’ve opined before how a horizontal search engine such as Google is actually pretty clumsy when it comes to vertical searches like holidays, clothes shopping etc (other opinions are available). I’ve booked a couple of dozen hotels over the last year or two and the number of times I’ve used Google as part of that process? Zero*.

There’s always a danger of reading too much into your own personal experience (after all: I’m quite experienced and savvy these days) but specific apps just seem to make so much more sense than meta engines.

If you were actually looking to buy a Ford Kuga, as per my example, downloading the AutoTrader app would make for a whole lot better experience than clicking through 5 or 6 different websites while trying to learn their varying internal logics and navigational methods.

In conclusion…

Google are still a money-printing machine – even on mobile. That isn’t going to change any time soon, and any advertiser who can afford to has to be in the game. The day that the mobile web is the web is already here, and Google’s recent ‘mobilegeddon’ update is tacit acknowledgement of that fact.

In display, Google rule the roost – with the world’s most popular video channel, largest display network, and other native advertising tools for marketers to take advantage of: all of which yield good results if handled properly.



*Actually, that’s a small lie. A small example: when I stayed in Glasgow recently, I used Google to find out where the hotel was in relation to various things I wanted to see and visit, but the important bit – the transaction – was carried out through the booking.com app. What Google couldn’t do was monetise me.

No: Google doesn’t index your meta description

I was asked recently whether Google actually indexed your meta description and I was about to say “of course!” when I had one of those rare flashes of caution and decided to check.

Our company’s home page has the natty meta description of:

“Trusted Dealers is THE SAFEST place to find and buy a second hand car online, with 10 points of difference to ensure you are happy with your deal.”

I know, I know – it could probably do with a refresh. Anyway, when you search Google for ‘trusted dealers’ it is this that displays in the SERPs.

[screenshot: the SERP snippet showing our meta description]

Perfect, eh? Yet when you use the site: command to check whether it has been indexed, Google wags its finger and says no.

[screenshot: the site: query returning no results]

So that’s that cleared up and if anyone asks you, you can tell them ‘no’ and that I told you so.

But! The phrase does appear all over the internet. Remove the site: command from the query and no fewer than 129,000 matches are returned. Such as…

[screenshot: a spam page quoting our meta description]
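If you want to repeat the check yourself, the pattern is simply this (our domain swapped for a placeholder):

  site:example.com "THE SAFEST place to find and buy a second hand car"  → no results: not indexed
  "THE SAFEST place to find and buy a second hand car"                   → thousands of matches from the spam network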

This happened because our blog was hacked for a brief while and a funnel page to a network of gambling affiliates was placed on the site. Someone then built a few hundred thousand links to this page – presumably using XRumer – and these links remain floating round the internet (incidentally: our massively expensive SEO agency didn’t notice this – I did, through a desultory check via ahrefs.com).

I don’t suppose there are many lessons from this except:

  • Don’t rely on an SEO agency to do everything for you
  • Update your WordPress often
  • Despite Google’s claims that it expunges bad content and links from its index, that’s clearly nonsense: three months after the hack, all those XRumer-built links and hacked blog posts, connected to an empty affiliate scheme in the gambling sector, remain in Google’s index.

I don’t imagine there’s much margin any more in doing this sort of thing, but if people are still making something of a living by auto-creating hundreds of thousands of pages at once and hacking WordPress, then I guess my musing the other week about the status of black hat SEO might be out of date in itself.

If you have the energy, you can probably do something with this collection of bits and bats. Sadly, I don’t.

Google’s Eccentric Choice of Review Partners

Google have proselytised about the value of reviews for many a long year (in internet terms, at least). Their own reviews system – like many of the company’s second-tier offerings – has never really garnered traction. Doubtless, this is partly because it is yoked to Google Plus / Google Profiles, but also because lots of other players in the space actually have better systems. As an example of this, one only has to check out the Google ‘reviews’ for Guantanamo Bay or Broadmoor Hospital.

[screenshot: Google reviews for Broadmoor Hospital]

While these might be hilarious, they don’t really indicate that Google is promoting or policing its product properly.

But what should be of more concern to Google’s product managers in this space is the drawing in of review scores from other providers (hint: go look for a new job!).

Typically, the big one box/card for a company will highlight Google Reviews, but also draw from other sources around the web. This mirrors Google’s recent, Hummingbird-led forays into injecting information directly into the results rather than encouraging clicks to source data. This is very much a topic for another day.

In the meantime, however, just look how seriously Google are taking things. A search for “Chiswick Honda” has these results:

[screenshot: the one box for “Chiswick Honda”, with review sources]

Reviews from… webcompanyinfo.com? websitepic.com? Both of these sites are actually just the kind of thin-content SEO shills that offer some crappy “seo data” for webmasters. Clicking the links from that one box doesn’t even take you to reviews! Just this kind of crap.

[screenshot: webcompanyinfo.com – no reviews, just scraped “data”]

It isn’t hard for Google to determine who offers real reviews from real people – be it reevo, bazaarvoice, feefo etc – so why are they giving airtime to operators like this?

The answer lies, as ever, in monetisation and a power-play that Google is engaged in against review sites. As the day draws short, I will leave that for another post, but if you take anything away from this, let it be this: Google’s concern for ‘quality’ is often only skin deep where that ‘quality’ poses even a minor threat to its own model.

More anon.