Get the Google cop outta my shopping cart!

So now Google ranks my shopping SERPs by its opinion of customer service quality?


Do not want!


I’m perfectly satisfied with shopping search results ordered by relevance and (link) popularity. I do not want Google to decide where I have to buy my stuff just because an assclown who treats his customers like shit got coverage in the NYT.


If I’m old enough to have free access to the Internet and a credit card, then I’m capable of checking out a Web shop before I buy. I don’t need to be extremely Web savvy to fire up a search for [XXX sucks] before I click on “add to cart”. Hey, even my 13yo son applies way more sophisticated methods. Google cannot, and never will be able to, create anything more reliable than my built-in bullshit detector.


Of course, it’s Google’s search engine. Matt’s right when he states “two different court cases have held that our search results are our opinion and protected under 1st amendment”. The problem is, sometimes I disagree with Google’s opinions.


Expressing an opinion about a site’s customer service by not showing it on the SERPs that more than 60% of this planet’s population use to find stuff is a slippery slope. A very slippery slope. It means that, for example, I cannot buy a pair of shoes for $40 (delivery in 10 days, free shipping), because Google only points me to shops that sell the same pair for $100 (plus FedEx overnight fees). Since when did Google’s mission statement change to “organize the world’s shopping expeditions”? Maybe I didn’t get an important memo.


Not only that. Google is well known for producing heavy collateral damage when applying changes to commercial rankings. A simple software glitch could bury the best deals on the Web, or ruin totally legit businesses suffering from fraudulent review spam spread by their competitors.


And finally, cross your heart, do you trust a search engine that far? Do you really expect Google to sort out the Web for you, not even asking how much of Google’s opinion you want to get applied when it comes to judging what appears on your personal search results? Not that Google will ever implement a slider where you can tell how much of your common sense you’re willing to invest vs. Google’s choice of goog, er, good customer service …


Well, I could live with a warning phrased as anchor text like “show what boatloads of ripped-off customers told Googlebot about XXX” or so, but I do want to get the whole picture, uncensored.


End of rant.


Let’s look at the algo change from a technical point of view:


Credit where credit is due: developing and deploying a filter that catches a fraudulent Web shop “gaming Google” out of billions of indexed pages within a few days is not trivial (which translates to ‘awesome job’ coming from a geek).


It’s not so astonishing that this filter also picked 100 clones of the jerk mentioned by the New York Times for Google’s newish shitlist. Of course it didn’t catch just another fishy site (same SOP, owned by the same guy). That makes it kind of a hand job, just executed by an algorithm. As explained in my Twitter stream: “@DaveWiner I read that Google post as ‘We realize there is a problem that we can’t solve yet. We have a short term fix for this jerk.’”, or “so yeah, I stand by my statement: it’s a hand job to manipulate the press and keep the stock from moving.”


And that’s good news, at least for today’s shape of Google’s Web search. It means that Google does not yet rank the results of each and every search with commercial intent by Google’s rough estimate of the shop’s customer service quality.


Google’s ranking is still based on link popularity, so negative links are still a vote of confidence.
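A toy sketch of what that means, with invented page names and anchor texts: a popularity metric that just counts inbound links scores the angry link exactly like the friendly ones.

```python
# Toy sketch: link popularity counts votes, not sentiment.
# Page names and anchor texts are invented for illustration.
incoming = {
    "shop.example": [
        ("fanblog.example", "best shop ever"),
        ("dealsite.example", "great prices, fast shipping"),
        ("ripoffreport.example", "total scam, avoid at all costs"),
    ],
}

def link_popularity(page):
    # Anchor text -- and whatever sentiment it carries -- is ignored;
    # every inbound link counts as one vote.
    return len(incoming.get(page, []))

print(link_popularity("shop.example"))  # 3 -- the angry link counts too
```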


There are only so many not-totally-weak signals out there, and Google’s not to blame for heavily relying on one of the better ones: links. I don’t believe they’ll lower the importance of links anytime soon, at least not significantly. And why should they? I surely don’t want that, I doubt it makes much sense, and I doubt that Google even could.


As for the meaning of links, well, I just hope that Google doesn’t try to guess intentions out of plain A elements and their context. That’s a must-fail project. I’ve developed some faith in the sanity and smartness of Google’s engineers over the years. I hope they won’t disappoint me now.


Of course one can express a link’s intention in a machine-readable way. For example with a microformat like VoteLinks. Unfortunately, nobody cares enough to actually make use of it.
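For illustration, here’s a minimal sketch (URLs invented) of how a crawler could read VoteLinks: the microformat puts vote-for, vote-against or vote-abstain in a plain A element’s rev attribute, so the intent survives in machine-readable form, while an untagged link tells the parser nothing.

```python
from html.parser import HTMLParser

class VoteLinkParser(HTMLParser):
    """Collects (href, vote) pairs; vote is None for untagged links."""
    def __init__(self):
        super().__init__()
        self.votes = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        if not a.get("href"):
            return
        # rev may hold several space-separated tokens; pick the vote-* one
        rev_tokens = (a.get("rev") or "").split()
        vote = next((t for t in rev_tokens if t.startswith("vote-")), None)
        self.votes.append((a["href"], vote))

# Invented sample markup:
html = (
    '<p>I recommend <a rev="vote-for" href="http://good.example/">this shop</a> '
    'but warn you about <a rev="vote-against" href="http://bad.example/">that one</a>. '
    'An untagged <a href="http://meh.example/">link</a> expresses no opinion.</p>'
)

parser = VoteLinkParser()
parser.feed(html)
print(parser.votes)
```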


Google’s very own misconception, er, microformat rel-nofollow is even less reliable. Imagine a dead tired and overworked algo in the cellar of building 43 trying to figure out whether a particular link’s rel="nofollow" was set



  • to mark a paid link

  • because the SEO next door said PageRank® hoarding is cool

  • because at the webmaster’s preferred hangout nofollow’ing links was the topic of week 53/2005

  • because the webmaster bought Google’s FUD and castrates all links except those leading to google.com just in case Google could penalize him for a badass one

  • to express that the link’s destination is a 404 page, so that the “PageRank™ leak”, er, link isn’t worth any link juice

  • because the author thankfully links back to a leading Web resource in his industry that linked to him as an honest recommendation, but is afraid of a reciprocal link penalty

  • because the author agrees with the linked page’s message, but doesn’t like the foul language used over there

  • because the author disagrees with the discussed, and therefore linked, destination page

  • just because some crappy CMS condomizes every 3rd link automatically for reasons not known to man
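The trouble, sketched in a few lines of Python (links and intent labels invented): all those different human motives collapse into one and the same machine-readable token, so there’s nothing left for an algorithm to disambiguate.

```python
import re

# Invented examples: the "intent" field is what the human meant,
# metadata no crawler ever gets to see.
links = [
    ('<a rel="nofollow" href="/ad">buy now</a>',         "paid link"),
    ('<a rel="nofollow" href="/rival">this is crap</a>', "disagreement"),
    ('<a rel="nofollow" href="/friend">great read</a>',  "honest thanks, penalty paranoia"),
    ('<a rel="nofollow" href="/gone">old post</a>',      "404 destination"),
    ('<a rel="nofollow" href="/misc">whatever</a>',      "CMS condomized it automatically"),
]

def machine_readable_signal(anchor_html):
    # All an algorithm can extract is the rel attribute itself.
    match = re.search(r'rel="([^"]+)"', anchor_html)
    return match.group(1) if match else None

signals = {machine_readable_signal(html) for html, intent in links}
print(signals)  # five distinct intents, one indistinguishable signal
```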



Well, not even all Googlers like it. In fact, some teams decided to ignore it because of its weakness and widespread abuse.


All of the above applies only to links embedded in markup that allows machine-readable tagging of links. Even if such tags were reliable, they don’t cover all references, aka hyperlinks, on the Web. Think of PDF, Flash, some client-side scripting, … and what about the gazillions of un-tagged links out there, put by folks who never heard of microformats?


Also, nobody links out anymore. We paste URIs into tiny textareas limited to 140 characters that don’t have room for meta data like microformats at all. And since Bing as well as Google use links in tweets for ranking purposes (Web search and news), how the fuck could even a smartass algo decide whether a tweet’s link points to crap or gold? Go figure.


And please don’t get me started on a possible use of sentiment analysis in rankings. To summarize, “FAIL” is printed in big bold letters all over Google’s (or any search engine’s, for that matter) approach to rank search results by the quality of customer service, based on signals scraped from unstructured data crawled on the Interwebs. So please, for the sake of my thin wallet, DEAR GOOGLE, DON’T EVEN TRY IT! Thanks in advance.
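To show what I mean, here’s a toy bag-of-words scorer with invented word lists and weights: the straight complaint scores negative as expected, but plain negation flips the meaning without flipping the score.

```python
# Toy sketch of why bag-of-words sentiment scoring misreads reviews.
# Word lists and scoring are invented for illustration.
NEGATIVE = {"scam", "rude", "broken", "sucks", "terrible"}
POSITIVE = {"great", "fast", "cheap", "recommend"}

def naive_sentiment(text):
    words = text.lower().replace(",", " ").replace(".", " ").split()
    # +1 per positive word, -1 per negative word, context be damned
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

# A straight complaint scores negative, as expected...
print(naive_sentiment("Terrible shop, rude staff, broken goods"))  # -3
# ...but negation flips the meaning, not the score:
print(naive_sentiment("Not terrible at all, nothing broken"))      # -2
```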




Copyright © 2014 Sebastian’s Pamphlets.