Hummingbird – The Opposite of Long-Tail Search

by Ammon Johns  |  Published 6:16 PM, Wed October 9, 2013

Those who know me, or know of me, know that I sometimes take a very different view from the majority. When other SEOs were happily tracking ‘hits’ and ranking positions as the sole measure of success, way back in the ’90s, I was never afraid to disagree and look at visits, conversions, sales, etc.

In hindsight, that doesn’t seem crazy or unusual at all. But trust me when I tell you that at the time mine was a pretty lonely voice in that regard. It’s why Ralph Tegtmeier (aka Fantomaster) said I was the first person he ever heard talk about Search Marketing, i.e. marketing via search, rather than just positioning.

Well, I’m just about to disagree with almost every post I’ve seen recently about Hummingbird, the new Google search buzzword. Even though I’ve stuck out my neck and disagreed with the herd many times before, and very successfully, I still feel a little nervous about doing so.

The common advice I’m seeing out there, in post after post, by people who generally know their stuff and talk intelligently, is that Hummingbird means you need to use more semantics in your pages and generally use the same techniques we’ve advised for years in regard to the ‘Long Tail of Search’.

According to all I have seen and understood of Google’s ‘Hummingbird’ update, this is not just a small misunderstanding, but a big misconception that sends readers heading in entirely the wrong direction.

Considering that I’m arguing with other experts in their fields, several of whom, like Eric Ward (who wrote “How Will Google Hummingbird Impact Links? Here Are 6 Ways“), have as many years in this whole industry as myself, I need to make a clear, cogent argument for my case here. Please excuse me if it’s a little long.

Google Hummingbird Algorithm

Hummingbird – The Facts

Danny Sullivan wrote a great Google Hummingbird FAQ that I think serves as an excellent piece on the basic facts of Hummingbird. You can also read Reuters coverage of the press release which has avoided adding too much speculation of its own.

Several of the news sites reporting the story noted specifically that Google was “short on specifics”, and TechCrunch wrote: “Despite a good amount of questioning from the audience on just how Hummingbird worked, Google avoided getting too technical. While they did say that this was the biggest overhaul to their engine since the 2009 “Caffeine” overhaul (which focused on speed and integrating social network results into search) and that it affects “around 90% of searches”, there wasn’t much offered in terms of technical details.”

As the BBC noted in its coverage, quoting Google: “Hummingbird is focused more on ranking information based on a more intelligent understanding of search requests, unlike its predecessor, Caffeine, which was targeted at better indexing of websites.”

Bear that last point in mind, and especially that the quote comes directly from Google at the time and directly in relation to what Hummingbird is about: “a more intelligent understanding of search requests”.

Also bear in mind that Hummingbird has been in effect for several months now, and according to Google affects 90% of searches. If it required any major shifts in your content you would have seen this long before now.

There is no major uproar about a massive update, no massive change to the visible search engine results that most people might try to track. I believe this is because Hummingbird affects 90% of searches, not search results. Hummingbird is all about better processing of the query, the search phrasing.

The Cause of Confusion

One of the main reasons many have been deeply confused about Hummingbird is that the most-cited piece of Google’s published news on the update is a blog post by Amit Singhal that barely mentions it. The post gives more coverage to the Knowledge Graph, to Google Now, and to other applications that will make use of that better understanding of complex, or vaguely worded, queries.

It has led to people equating Hummingbird with the Knowledge Graph, with Google+, and with all sorts of aspects of those other things mentioned. One such case being the coverage by The Telegraph –

Hummingbird does impact significantly on things like the Knowledge Graph and Google Now. It impacts everything Google does in response to any sort of query, because it changes how Google understands every query. Yes, it is applying the use of synonyms and semantics, but it is applying them directly to the query.

Hummingbird is the Exact Opposite of Long Tail Search

SEOs used to have to worry about optimizing for ‘long-tail’ search terms. Long-tail searches are the queries that are unusually worded, even completely unique. You used to need to write content that included all of the synonyms a long-tail search might use, to have any chance of being in those results.

Hummingbird isn’t about long-tail search. It’s entirely the opposite. Hummingbird is about taking long-tail, highly unusual and verbose searches, and serving them results as if they were clear short-phrase searches. It is applying semantics to the actual search query, and processing that, before actually retrieving the results.

So, when I’m on a cellphone in Denver and ask Google “Where’s a good place where I can get a pizza?”, Google can take my location from the cellphone, understand that when I say ‘place’ in the context of the words ‘where’, ‘get’ and ‘pizza’, that ‘place’ is a synonym for ‘restaurant’ (which can also include ‘diner’, ‘cafe’, and a dozen other words for places to get food), and effectively process the search as clearly as if I had searched for “good pizza restaurants in Denver”.

It would be lovely to assume that eventually it will understand the word ‘good’ is important and it should probably only include results with positive reviews, and it may even include all Italian restaurants (though that adds a risk of including ones that don’t offer pizza).

So, while some are advocating writing all your new content to exact-match more long-tail search, Google is doing the opposite. It is making the very concept of many long-tail searches go the same way as referral data. Google is trying to get away from exact wording to understanding the concepts. So no matter how verbose or roundabout your search for pizza restaurant in Denver may be, the search it runs is exactly the same as “Denver Pizza Restaurant”, “Pizza Restaurant Denver”, etc.

Google is applying the semantics and conceptualization to the search itself, the actual query, not to the pages.
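To make the pizza example concrete, here is a toy sketch of that kind of query rewriting. To be clear, this is my own illustration, not Google’s actual mechanism – the stopword list and synonym map are invented for the example, and a real system would infer synonyms statistically rather than from hand-built tables.

```python
# Toy sketch of Hummingbird-style query rewriting (invented lists,
# NOT Google's real data): a verbose, conversational query is
# reduced to the same canonical short-phrase search a terse user
# would have typed, BEFORE any ranking happens.

STOPWORDS = {"where", "s", "a", "i", "can", "get", "is", "in", "the"}

# Hypothetical synonym map: conversational words -> canonical concept.
SYNONYMS = {
    "place": "restaurant",
    "spot": "restaurant",
    "diner": "restaurant",
}

def rewrite_query(query: str, location: str) -> str:
    """Reduce a verbose query to a canonical short-phrase search."""
    words = query.lower().replace("?", "").replace("'", " ").split()
    terms = []
    for w in words:
        if w in STOPWORDS:
            continue                      # drop filler words
        terms.append(SYNONYMS.get(w, w))  # map to a canonical concept
    terms.append(location.lower())        # fold in the device's location
    seen, canonical = set(), []
    for t in terms:                       # de-duplicate, keep order
        if t not in seen:
            seen.add(t)
            canonical.append(t)
    return " ".join(canonical)

print(rewrite_query("Where's a good place where I can get a pizza?", "Denver"))
# -> good restaurant pizza denver
```

The point of the sketch is that the verbose question and the terse phrasing collapse to near-identical canonical searches before any pages are ranked – which is exactly why the pages themselves don’t need re-optimizing.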

That’s why despite this already being live for some time, and despite Google saying it impacts 90% of searches, there’s not been any huge shouting about changes to the SERPs, or massive loss of 90% of traffic, etc. In fact, with Google’s [not provided] shift away from even showing you the exact phrases, you can’t really detect the changes at all from the receiving end.



Ammon Johns has been a renowned Internet Marketing Consultant since 1997. Originally posting in webmaster forums under the pseudonym 'Black Knight', Ammon is cited as an important mentor for many now well-known figures in SEO such as Rand Fishkin, Bill Slawski, and Will Critchlow.

Ammon Johns is a frequent speaker at several of the main seminars and conferences such as Search Engine Strategies and SMX in the UK, and was one of the people at the very first ever PubCon in London 2001. His role in mentoring, and providing professional consulting to, many other SEOs has led to him being sometimes called the SEO's SEO.

  • Warren Whitlock

    As I read all the chatter on this, it seems that many are trying to take a side in a battle that never happened, in a war that Google was always going to win.

    The trend has been, and will continue to be, towards search engines finding the best possible result for the user. For a while, that included algorithms that relied on easy-to-count quantities of keywords or links.

    As the algorithms get closer to finding the “best pizza in my current context” there will still be data factors, but to look solely at that misses the point. The searcher isn’t looking for the best data; he wants a pizza.

    • Ammon Johns

      I worry at times that it goes further, Warren. Google have said since the start that they envision the ultimate Google as being “like the mind of God” – ask it anything and it will demonstrate omniscience. But, do you think “the mind of God” would reply with “Oh, Saint Peter wrote a wonderful set of content on that, here’s a link”? Or would God simply give the answer, never mind poor old St Pete’s traffic and well considered content marketing strategy.

      Ultimately, Hummingbird will reduce the very concept of niche keywords. Oh, there will still be niche markets, and niche concepts, but the niche *wording* won’t matter. This will push increasing competition for the ‘Head’ concepts, leading to more sites having to pay for AdWords to get a look in against the big brands.

      The core direction of Google is to become more of an AI (Artificial Intelligence) and a huge amount of its direction over the past years has been in machine learning. Hummingbird will prove to be a large step in this regard, enabling ALL of Google’s products to determine more query intent and, by processing better input, to give better output.

      Where previously to make Google Knowledge Graph work you may have to put in a hundred ways to ask the same question, Hummingbird starts to allow Google to work out all variations. Thus the Knowledge Graph can expand far faster, and to more niche terms. Soon when you search for that pizza, it won’t send you to the pizza site at all, it will direct you to one of several nearby pizza restaurants itself, from a Google page.

      Those sites with TV Listings? Knowledge Graph will cut those out. Flights? Sports results? Dictionaries? All on the hit-list.

      All that schema markup you’re being told to add? This is directly applicable to machine learning – helping the machine by ensuring a good input data-set has consistency for better processing and learning. Eventually, even a search for your own name will turn up a Google Knowledge Graph result and only link to your own site bio as an afterthought, if at all.

      • Paul Gailey

        The Schema dilemma is a particularly galling one, as you infer, and as the cinema example cited earlier shows.

        The premise is along the lines of: mark up your site so you can be ultimately subsumed by the organisation that your prospects were temporarily depending on to find you. Thomas Høgenhaven has articulated this well with a post about The Prisoners’ Dilemma.

        I’ve noticed of late that the auto-hashtag Google Plus assigns to a block of text in a G+ post may not have featured in the text whatsoever. It’s interpreted the paragraph of text and determined its principal vocabulary and label. Does that sound hummingly similar?

        • Ammon Johns

          The Prisoner’s Dilemma is a great thing to know. I think I saw a piece earlier comparing this to schema, so allow me to add it here:

          Schema has little direct relation to Hummingbird of course, except as they both apply to machine learning. Schema is about improving the data set with markup, helping the machine to learn how to recognize similar elements even without markup.

          Hummingbird is also about improving data, but in this case, the query data, so that other algorithms can work with it more easily.

          The real “kick in the pants” with schema is that all using it are simply teaching Google how to give the exact same reward to those who didn’t bother. :)

      • Warren Whitlock

        When you are locked into a paradigm of “what does it take to get search traffic?” the only thing we can do is wonder what they are up to and worry if they are going to screw with us.

        Google is out to index all knowledge. They learned to put info on the SERP instead of a free link to Wikipedia. They will continue to consume the net like the Borg.

        I hold that a business depending on search traffic is at risk no matter how well they optimize or beat the rules. Again, difficult to do from an SEO perspective, but I think we ought to understand that Google doesn’t need us and we’d better not “need” them… especially as an online marketing consultant.

        It’s like Google is having bacon and eggs for breakfast. SEO customers are chickens and are invested. To the pig, it’s much more important.

    • Andrew Turner

      Warren, you must be dreaming if you think that any search engine finds what you are looking for. In your current example there will be many more sites selling electricity or chocolate-covered raisins than there will be finding pizza.

  • Gael Breton

    Agreed with the concept of short tail. Google wants its advertisers to bid against each other, and for that they work on having fewer result pages and “understanding” queries. This way, you’ll end up paying $5 per click instead of $0.50, because all the search volume will be on fewer queries :).

  • Grant Simmons

    Not agreeing 100% as I believe the outcome is better matching not just a simplification of query, a key difference to your statement above.

    I believe that a better understanding of the query means less ambiguity and more relevant results which does affect rankability and search visibility.

    Where I do agree is that this isn’t a long tail search opportunity, it’s a better content and better marketing opportunity, based on understanding of search query intent, context and conversational cues that will lead to an overall better result experience.

    Doesn’t negate all the other signals that underscore relevance, consider authority and measure engagement, but it will lead – IMHO – to longer-tail queries returning better results, driving better CTR and post-click engagement.


    I talk about some of this and my thoughts on what this means to publishers:


    • Ammon Johns

      Hi Grant. Thanks for the comment and your thoughts.

      As I mention in my reply to Warren Whitlock’s comment, the main drivers for this are better machine learning, to produce a smarter Google.

      If you follow that thought through, you then see it almost goes against what you surmise. Think of it this way: where a smarter marketer used to understand query intent and do great content marketing, Google is aiming to make that irrelevant – to apply that ‘intention analysis’ for ALL results before the search results are given.

      Where you could benefit from being one of very few companies offering the insight and matching searcher intent before, Google will now do that for all results, taking away your advantage of smarter marketing and better content.

      Niche keywords? Irrelevant. Now only the concept, regardless of the wording, is where Google want to play. Naturally, this makes it easier for Google to apply its own answers, cutting out third-party sites. It increases competition, by putting a far broader array of sites into the results, meaning more brands and authority sites for even niche search phrases, which in turn means more smaller sites *having* to use AdWords to get above the brands.

      • David Amerland

        Ammon, this is not really true; Grant’s take is pretty much on the money, and the article he’s written is one of the best-argued I’ve seen for some time. Hummingbird is not just Google “…applying the semantics and conceptualization to the search itself, the actual query, not to the pages.” You cannot have one without the other, and the only reason Google could activate Hummingbird (now about a month old) is exactly because it was able to increase its Entities index and, by association, greatly enrich the Knowledge Graph. Googlers I spoke to at SMX East in NYC verified as much.

        You may want to take a look at this:

        • Ammon Johns

          It looks as if we’ll simply agree to disagree for a while, David, and see in 12 months which of us was closest to the truth. I really can’t do other than restate points already made about machine learning, how confusion was caused by co-citation of OTHER updates/features alongside Hummingbird, and point out how/why the Turing Test is still the benchmark for machine learning and artificial intelligence. Truly understanding spoken phrases is immense, and the key to being able to reply.

          I would suggest reading the second half of the Forbes piece that gave live coverage of the announcement, as that does possibly the best job of separating out Hummingbird references from the rest.

          Beyond that, let’s schedule a follow-up for six months or a year and see what has come to pass in that time. My bet, despite it disagreeing with the common view, is as stated.

          • David Amerland

            I saw the Forbes piece the day it came out. I do not see how it changes anything I’ve stated. But as you say, time always shows what has happened.

        • Grant Simmons

          Thx David. Great thing about SEO is there’s many opinions, and we’re nice enough to settle disputes over a beer at [SMX/SIS/SES/Pubcon/SearchLove/eMarCom] (strike those that don’t apply) :-)

      • Grant Simmons

        Welcome lively debate!

        Imagine all the world’s experts lined up in a row. Each has a specific micro-niche.

        A question is asked, however obscure, and the leading authority steps forward to answer.

        The *reason* the leading authority takes the microphone is based on a Google engineer prodding the right guy with a stick, because they understand what the question *means* to the nth degree, and present the most relevant, authoritative topic expert to address it.

        The niche is not in the query, the niche is in the ability to find the most relevant niche answerer.

        Where we differ is you’re thinking that many queries can be satisfied by one answer, whereby my thought is that better defined queries open the doors for many more (better) answers.

        The Knowledge Graph as an answer machine is a different animal, and I see more easily defined queries being pushed in that direction – these are functional queries needing less interpretation – “I need a flight from Seattle to LA on the 22nd at around 8am” – not complex queries requiring expertise, opinion and a “finesse” to understand intent, context, conversation. “What’s the best airline and time to leave for a quick trip from Seattle to a pro sports game in LA”

        Hopefully adds some color to my thoughts – never at my best at 4pm PST :-)


        • Ammon Johns

          >> Where we differ is you’re thinking that many queries can be satisfied by
          one answer, whereby my thought is that better defined queries open the
          doors for many more (better) answers.

          Au contraire, mon ami! Not at all.

          But I deal in a lot of competitive sectors where just getting into the top 10 is tough as it is, without Google widening the number of high-authority sites that can rank for the search term even without using the term.

          Across the board, this update will help high-authority domains get more exposure for what was previously long tail search phrasing they hadn’t targeted. But for one site to get more exposure, another site has to drop *out* of that top ten to make room. And we have seen enough times how Google reward the huge domains and big brands to know what this means not just for the little guys, but also the medium-sized ones.

          • Grant Simmons

            Au contraire, contraire :-)

            I’m of the opinion this will open up to more expert, *less* big-brand results around key niches, where engines will see bigger, diversified “jack of all trades” sites as “masters of (n)one”.

            I believe the better understanding of the query will fragment search to deliver a better match to the query, not (IMHO) a less diversified result set.

            Time (and data) will tell, mon ami :-)

  • Alan

    Personally I don’t think it really matters what any of us SEOs believe Hummingbird is about, because Hummingbird is just another move by Google towards becoming the destination, and not just the journey to the destination.

    Eventually, when “Google IS only Knowledge Graph and Carousels”, it won’t matter what our sites do or look like, because Google will be scraping Wikipedia and putting up a page 1 with no links, just Carousels and AdWords ads. Dr Pete tweeted a great example of the future of Google search for “diabetes symptoms”, and you will see the future of Google – but extrapolate that to the first page being all like that, not just the above-the-fold experience.

    I wonder if finally guys like Danny Sullivan will think Google is evil when you do a search like

    “when and where is the next SMX event”

    and Google knowledge graph comes back with..

    “Don’t worry about SMX, SEO is dead”

    You have to ask yourself: 10 years down the track, will SEO be about making your site as scrape-able by Google as possible? That way your content is the content being displayed by Google, and not Wikipedia’s – and would that be a benefit anyway?

  • marknunney

    That almost nobody has seen results change significantly since HB was introduced is significant. What can we conclude from this (and the other known HB facts)? I suggest:

    The long tail either:

    • remains as much of a marketing opportunity as before, or
    • HB allows G to learn, and so G will ‘take away’ long tail results as Ammon suggests

    Either way I’m confident that one of HB’s main goals (or results) is to serve Google properties and repurposed scraped data.

    And I’m reasonably confident that long-tail opportunities will remain, but they won’t be old-school ‘keyword matches’ but more semantic ones. I say ‘more’ semantic because it will be a long time before G’s algo can pass the Turing Test.

    Connecting HB to (not provided) is intriguing. If not connected it is quite a coincidence. Hiding the evidence/clues? If so, for most sites you can still get enough keywords pre-HB and post-HB to see if anything has changed.

  • netmeg

    Two words. User intent.

    • Alan

      Two words: Google intent.

      What do I mean? Yes, user intent is very important to Google, and in the short term, yes, your site will “probably” do better if you think about the user. However, in the long run Google will use “user intent” to stop people from leaving Google, and that is Google’s intention.

      It is nice to know that, as Google provides us with fewer and fewer metrics about our own sites and the interaction of users, it uses those same metrics to keep people at Google via the Knowledge Graph.

      An example of this: one of my local cinema sites has contacted me because people are not arriving on the site anymore. When I looked, yes, if you search for the cinema name it has a Knowledge Graph carousel of all the movies and screening times. No need for users to go to the site. We might think this is great for the user, and it might be. However, it is bad for the cinema, because the cinema’s site is much better at selling the movies to users than Google’s Knowledge Graph is. How do we know? Because since the carousel has been in play, online bookings have dropped by half. (Remember, these users are searching for the cinema’s site.) So obviously user intent is not gelling with Google’s intent.

      Is this what we really want from Google? To scrape our content and then halve our sales? Anyway enough ranting.

      • netmeg

        I don’t think what *we* want from Google is very high on their priority list.

  • Ammon Johns

    Thanks, Sean. Delighted you found it so.

  • Giuseppe Pastore

    I appreciate the post and I think it’s spot on too, but I tend to think ‘long tail’ is not the right word to use: I consider a long-tail search as a better-defined entity, not the same one expressed with more words… Maybe it’s just me not understanding English 100%, but in my opinion, while the concept applies to ‘verbose’ queries, it does not apply to ‘long-tail’ queries. I wonder if I’ve expressed my ideas well…
    Thanks anyway for the post, it’s an interesting point of view.

  • Joe Preston

    I agree with this analysis. This will mitigate the vast increase in (not provided) keyword data because the exact long-tail variations of head terms are going to matter less. If this is as fundamental as you propose it’s going to destroy automatically generated long tail content, which is a great thing for search quality. Gael mentions the very obvious effect of increasing the bidding competition on head terms.

    I am not sure I want Google to try to understand the concept of “good pizza”. As much as you might disagree with the conventional opinions of SEOs, perhaps I disagree with the center channel of pizza connoisseurdom. Also, do you think anybody actually searches for a “place nearby where I can get bad pizza”?

  • Adeel Sami

    Hummingbird is a complete change of the game from what we are used to, and I will definitely not agree with the scenario at this time, but time will tell us the truth of this change.

    I want an intelligent engine that can bring me the best, most precise results for my query (or queries). At the moment, lengthening the search query brings up more related, more precise results; with Hummingbird, if the search query gets shortened, I don’t care, as long as I am provided with the results I expected, which Google intelligently judged for me.

  • Ammon Johns

    Thanks Gordon. Your observation is absolutely spot-on for what we’d expect of Hummingbird – that it would process queries with appropriate synonyms as if shorter or simpler queries, while taking queries with specifics as not having reliable synonyms, and so processing those exactly as written.

    I need to add a big shout for Bill Slawski’s excellent patent-finding skills, as it was he who found a patent so similar to the way Hummingbird interprets queries that I simply cannot imagine it not being the patent for the concept behind the Hummingbird name. If anyone has not read that one yet, please do so, as it adds well to this discussion.

  • Mikkel deMib

    It is now more than 10 years ago that I started working with natural language search as VP of Technology at Ankio – technology-wise one of the leading companies in this field. Back then, over 10 years ago, I predicted that all search engines would eventually have to go this way – understanding the meaning of what people search for and not just matching the words they use.

    But if you look at the math behind natural language processing, an exact match will still come out with a higher score. Even though a search for ‘notebook’ will also find matches for ‘laptops’ and ‘portable computers’, texts about notebooks will have a higher score – just as a very simple example.

    Natural language search can give better results for some searches but, as far as I see, and as I have preached SEO for many years, it won’t actually change that much for us (the SEOs). But only time will tell for sure …

    Thanks for a good write-up, Ammon. You know how much we often agree – and how often we disagree with the larger crowd of SEOs, hehe :-)

    • Ammon Johns

      Hey Mikkel, great that you enjoyed the write-up, and thanks a ton for the comment.

      I agree with you there. Most of the time the exact wording used should be more important than synonyms, and certainly if you are dealing with nouns. At the moment I can’t really see Google getting much further in this than knowing that “where can I get a notebook?” is the same query as “where can I buy a notebook?” and “what is a good place to obtain a notebook?”.

      Google will still have to deal with the vagaries of whether the notebook in question is the paper kind, or electronic. Hummingbird will help it to match back to any previous queries, I believe, so if you’d been searching for laptops recently Google would be able to process that into your query too and favour more of the electronic kind.

      But they’ll still need to ideally provide some of both in the results in case you recently lost your bag and, having replaced your laptop in your first searching session, are now replacing the stationery, and next will come the pens. :)

      The problem for me really is that niche wording used to be a great way to subtly sift the traffic. There’s a subtle difference in the kind of audience that will use the word ‘purchase’ or ‘obtain’ rather than ‘buy’ or ‘get’. Sometimes, as a marketer, even such subtle differences could be important. Always they provided niche opportunities.

      But now I’m pretty sure that where I have a page on a niche speciality site that used to be ranked #3 for “The Gentleman’s Guide to purchasing a suit”, I’ll soon be jostling for position with “Men’s guide to buying a suit” on some big high-street men’s clothing site that doesn’t deal with bespoke tailoring at all. Because the bespoke tailoring aspect was in the nuances, not the wording or the context.

      • Donna Duncan

        This last example resonates best with me – because it highlights the need for Google to continue to grow its revenue AND maintain market dominance. If it dilutes people’s ability to find highly relevant results, like the example of a “gentleman’s suit”, it’ll be shooting itself in the foot.

        I suspect we probably haven’t seen a huge impact as a result of Hummingbird yet because this first release is merely a framework for more to come.

        I have a systems background. When you’re making a major change to a huge decision engine like Google, best practice is to swap out the engine but keep all the rules and data the same for a while so you can better isolate problems as they occur.

        Once things stabilize and Google gains confidence that things are operating as expected, we will likely begin to see subtle changes in search results, driven by changes that Google slowly begins to implement “on the fly” involving new rules, new data, and new relationships between rules and data. I believe the big benefit of Hummingbird for Google is in the speed, ease and lower cost of making changes. Search impacts will be felt the same as always. We’ll just see more, faster.

        So I agree that in a year we’ll be able to better understand.

  • Seo Hop

    This is so simply explained. I don’t think anyone will miss the point here. Great thoughts and explanations. Something I don’t understand, though, is the messiness from this update. I found non-English, unrelated websites in Google Local and wrote about it here. So if they can pull off semantics, why can’t Google find stuff like that?

  • Dan Shure

    I couldn’t agree more with you, Ammon! I’ve had the intuitive sense that (not provided), if you step back, is part of a bigger move away from the idea of exact queries, and a piece of the puzzle in the move towards taking a query and understanding the intention behind it.


  • Tom George

    I only have three words for a super smart guy like yourself: curation, curation, curation!

  • Stoney deGeyter

    Ammon, regarding people adding in related words in order to rank for “long tail” searches, I look at it differently. We integrate related words not to get ranked for those words but to provide more textual relevance to the short-tail keywords that we are targeting. This seems to be exactly what Hummingbird is doing: taking a query and rewording it for intent. But the same concepts apply at the page level. If a piece of content doesn’t use “related” words, Google can make the assumption that it is an extremely light piece of content. The related words give it a depth of quality that other documents might not have, especially those “optimized” for specific keyword phrases.

    • Ammon Johns

      Stoney, hi, and thanks for the comment. Where I disagree is with the line “what Hummingbird is doing, taking a query and rewording it for intent”. It isn’t. Not intent, just clarity of the question, not the reasoning behind it. Literally, it’s about taking oddly or poorly worded queries and seeing if semantics can provide a clearer question with more answers.

      Bill Slawski found, and gave a great in-depth analysis of, the patent that seems to cover the same ground, and that shows precisely how a mechanism may look for synonyms, predict their likelihood (much as Bayesian filtering rates probabilities), and then run the search as though simultaneously running those synonym searches too. See his article at
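      As a rough illustration of that mechanism (my own sketch, not taken from the patent – the synonym table, probabilities and threshold here are entirely invented):

```python
# Hypothetical sketch of synonym-based query rewriting: score candidate
# rewrites by the estimated probability that a synonym preserves meaning,
# then return the original query plus the likeliest rewrites, as if they
# were all run together. All data below is invented for illustration.

# Invented synonym table: word -> [(synonym, estimated probability)]
SYNONYMS = {
    "place": [("restaurant", 0.8), ("spot", 0.5)],
    "eat": [("dine", 0.6)],
}

def expand_query(query, threshold=0.5):
    """Return (query, probability) pairs: the original query plus
    single-word synonym rewrites meeting the probability threshold."""
    words = query.split()
    candidates = [(query, 1.0)]  # the original is always kept
    for i, word in enumerate(words):
        for syn, prob in SYNONYMS.get(word, []):
            if prob >= threshold:
                rewritten = " ".join(words[:i] + [syn] + words[i + 1:])
                candidates.append((rewritten, prob))
    # Most probable rewrites first; the original always sorts on top
    return sorted(candidates, key=lambda c: -c[1])

queries = expand_query("good place to eat in denver")
```

      The point is simply that the engine can run the rewrites alongside the original, weighted by how likely each substitution is to preserve the searcher’s meaning.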

      • Stoney deGeyter

        Ammon, I’m not sure I see a whole lot of daylight between getting “clarity of the question” and understanding the “reasoning behind it.” Reasoning is essential to having true clarity. That’s not to say Google has mastered that, but they do have billions of searches AND the sites that were clicked to help them better understand the intent of any given query. My best guess here is that Hummingbird is probably better at learning from the clicks than the algo was previously. But even if not, I still say related words are a stronger factor for ranking for the core keywords.

  • CarlosVelez

    So what are the implications from a marketer’s perspective, aside from diversifying traffic away from Organic?

    • Ammon Johns

      So many that I probably need to do a full “State of eMarketing” post to really do that question justice. However, in a nutshell:

      1. Branding is NOT just a technique for the big guys. In fact, it is harder for a large company to build and maintain a consistent brand that goes from the CEO at the top to the lowest front-line staff. Become a brand – focus on who your customers see you as, because THAT is your brand.

      2. Keyword Research is going to be harder and less meaningful overall in the double-hit of Hummingbird and NotProvided. You’ll need to find other avenues to perform market research on what your customers and market are asking for.

      3. Concepts will slowly become more important than the words we use to communicate them, in SEO, but more meaningful than ever in terms of converting customers. We’ll not be sure of ‘mirroring’ the terms they used when searching, so we need to clearly communicate what we offer, and what it offers to them.

      4. Above all, maximise your conversion rates. The higher your conversion rate, the lower your cost per acquisition through all marketing activity. It means that you can afford other means of driving traffic.

      5. Last, but not least, never complain that Google is killing your business. If your business is totally reliant on free Google traffic, that is a weakness that is YOUR fault, and it needed fixing. If your business depends on free traffic, find additional means to generate it, such as viral marketing, and seriously … think about whether a business so unprofitable it can’t even fund its own marketing is really worth saving. It may be a very good time to revisit your entire strategy.
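      To put a number on point 4 (a back-of-the-envelope sketch; all figures here are invented):

```python
# Cost per acquisition falls in direct proportion as conversion rate
# rises: CPA = total traffic cost / number of conversions.

def cost_per_acquisition(traffic_cost, visits, conversion_rate):
    """CPA for a given spend, traffic volume and conversion rate."""
    conversions = visits * conversion_rate
    return traffic_cost / conversions

# Same spend, same traffic: doubling conversion from 1% to 2% halves CPA.
low = cost_per_acquisition(traffic_cost=500.0, visits=1000, conversion_rate=0.01)   # 50.0
high = cost_per_acquisition(traffic_cost=500.0, visits=1000, conversion_rate=0.02)  # 25.0
```

      Halving your cost per acquisition is exactly what makes other, paid, means of driving traffic affordable.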

  • Doc Sheldon

    Ammon, your term “integrated marketing” may have more meaning under the skin than you intended. Taking Alan’s cinema site, as an example…. perhaps the website will see less traffic, but the cinema may sell more tickets, via the carousel visibility. So for a local brick and mortar, it may be a wash, possibly even a gain. Of course, in that example, the result is more informational than transactional.

    And I think that may be a part of Google’s direction that’s not getting talked about. They can’t suddenly stop sending traffic on transactional queries… they’d be cutting their own throat. And they’re certainly capable of distinguishing the queries, one from another. At first, my thought was that this would simply force site owners to buy ads to get traffic. Now, I’m not so certain that’s their intent (although I doubt they’d mind if it went that way).

  • Alan

    Comment of the year

    “The value of good content isn’t to get into search results, it’s to keep people from EVER going to Google in the first place.”

    • Writingprincess

      OMG… love this!!!! So many marketers don’t understand this, and thus don’t give their customers a direct link to them through content. Get them there through the SERP; keep them there through your own marketing efforts, i.e., e-mail newsletters, etc.

  • Chris Dugdale

    At last, a voice of logic and sanity in the cacophony of hummingbird noise.

    The commentary to date has been so pervasive and wide-of-the-mark (well, apart from one article by Bill Slawski that delved into some patent details) that I was beginning to ponder if I had firmly grasped the wrong end of the stick, or finally lost my marbles.

    I read one piece suggesting that you need to write all of your content framed as questions to match vocalised user queries. After a dozen or so articles like this I was ready to give up and find another industry to work in (farming, maybe).

    Thanks for restoring a little faith.

    • Ammon Johns

      Thanks, Chris.

      Yes, Bill and I talked a little before about how bad some of the posts and analysis had been on the topic.

      I think there is a slight tendency for many writers to attempt to turn every bit of news into something that promotes or adds value to their own main product. So lots of copywriters wrote about how Hummingbird rewarded longer/better copy. Link-builders wrote about how Hummingbird made changes to links. Web designers and technical SEOs wrote about how Hummingbird made schema markup even more important.

      I just really wish that more writers realised that being wrong matters. It lasts. It’s why I made sure to ‘show my working out’, show how I reached my conclusions, and make it as self-evident as possible. But ultimately, each reader must always, no matter how good the source, treat all posted information as simply theories, and seek to prove or disprove those theories for themselves.

      As I said in an interview by Rand Fishkin many years ago: “There are lies, damned lies, and (other people’s) statistics”.

      • Tony B

        hahahaha, a blast from the past – I actually remember that quotation! I was a mere junior back then, going by the name of karmakiller.

  • Borislav Lojpur

    If you try the exact search query you chose in the article, you will see that when you “ask” Google “what is good place to eat in denver” you get plain search results, and when you enter, for example, “best denver restaurants” you get a Google map, microformats with reviews…
    So I suppose queries should stay exact, for now.

    • Ammon Johns

      I’m guessing that you are not *in* Denver, or near it, Borislav. Thus on the longer search Google decided that you first needed a flight to get there. Still not a great result, but that is NOT running the search I gave, considering that the search I mentioned was made on a cellphone located in Denver.

      Context matters. In fact, as Google Now grows, I would even go so far as to say there will increasingly be times when context may be more important than content.

  • Michael Schlotfeldt

    You summarized the topic extremely well. I loved this quote: “Hummingbird isn’t about long tail search. It’s entirely the opposite. Hummingbird is about taking long-tail, highly unusual and verbose searches, and serving them results as if they were clear short-phrase searches.”

    • Ammon Johns

      Thanks, Michael.

  • Nick Ker

    Excellent explanation. I wrote about Hummingbird and long tail a few weeks ago, but not necessarily advising on optimizing for long tail searches. Never really did see much point in trying to do that, when writing effectively and using a variety of words & phrases (rather than just keywords) would get the message across in a better way to a wider variety of people. I’ve had people ask “what long tail searches should I optimize for?” My answer was usually “all of them”, meaning write well about your topic and Google will figure it out. Now, it looks like Google really can figure it out better than before.

  • DaveKeys

    I have to agree with the author. In essence, Google is, as a result of Hummingbird, sometimes deciding it knows what you mean better than you do. I have seen posts and even live events where people are suggesting you stop using keywords, etc., altogether and create content without them. I get it, but that’s for marketing and conversion, not search engines. If anything, Hummingbird narrows down searches to more predictable conventions and waters down the effect of long tail – sometimes appropriately, and sometimes it misses the mark, just like you’d expect from any computer algorithm. Just ask Siri some long tail questions and see how unpredictable things can get. Google does a better job, but I’m sure Hummingbird is a lot like those cheese commercials. It’s got a lot of growing up to do.

    • Chase Anderson

      I’m trying to understand what you base your assumptions on:

      “Hummingbird narrows down searches to more predictable conventions and waters down the effect of long tail- sometimes appropriately and sometimes it misses the mark just like you’d expect from any computer algorithm.”

      Do you have long-tail query results to compare to from before Aug 26? It seems a little abrupt to declare that all of this is happening as fact, without some repository of evidence to support it. Did the results change on Aug 27th? What do you have to show that they changed significantly, and what statistical confidence is there in your findings?

      Look, I know that this level of analysis is exactly the type of thing no one doing SEO for a living has time for, but it’s exactly what is needed before we go about making wayward claims that spread like wildfire. Isn’t it?

  • Ammon Johns

    In several of the comments I have mentioned, or at least alluded to, the fact that Google widening the possible matches, naturally means that it is likewise widening the potential competition for ranking and traffic.

    In the comments on a post about Hummingbird from Heather Lloyd-Martin is what I believe is just one of several examples I have seen.

    The comment writer ‘Barry’ explains:

    “Our business pays people in Ireland money for their used mobile phones. We are an Irish company that is doing ‘what it says on the tin,’ no black hat or anything like that, and still in the top 5 on Yahoo, Bing etc. (and was on Google until last week), and are 100% relevant to ‘sell mobiles online ireland’, but we’ve gone from page 1 to page 10 in some cases!

    Bizarrely, ahead of us in the search are cash-for-gold sites in Australia, irrelevant computer recyclers in Chicago, sites that stopped operating a number of years ago, sites based in the UK that Irish customers can’t use, and a bunch of ‘content’ sites that give people information in general!”

    Now, of course, there could be many other reasons for the loss of positions. However, this sort of thing can also be the result of Hummingbird widening the potential query matches to include other ‘recycling’-related terms and results. It is something to consider.

    Your site’s authority and brand will become much more of a deciding factor in the future than it was, as your ‘keyword strategy’ and exact content become less so.

  • Ammon Johns

    One of the good things that Google, and all search engines for that matter, try to do is to direct searchers to great content, without the writer or webmaster needing to be versed in SEO. In fact, to be utterly frank, most search engineers would rather there was no SEO, so they could get on with improving algorithms and results without having to worry about hacks, tricks, spam, and abuse of how it works (their view, based on the worst of what is done under the name SEO, not the majority).

    In general, pretty much any site that hasn’t made itself utterly inaccessible to spiders will rank well for several things. For example, even if you knew nothing about SEO at all, Google would hope your blog ranked well, if not top, for a search for “Randy Hilarski’s Blog”, right?

    SEO is *only* needed in one of two distinct situations: either (a) to fix a problem that is preventing you from getting your ‘natural share’ of search traffic for clearly relevant niche terms, or (b) to get a bit more than your fair share – i.e. to improve your market share of search traffic.

  • Eric Ward

    Ammon – I don’t feel you are arguing with me at all. Nowhere did I state that Hummingbird was itself about long tail. My comment about Q/A content was in regards to those with content that already answers questions and performs well. As I wrote, it’s a mistake to knee-jerk content into Q/A format just for Hummingbird.

    We are much more alike than different here. I’ve been preaching against conventional SEO wisdom since 1994. People laughed in my face, thought I was nuts, and they got rich gaming Google when I said it was foolish/dangerous. And sometimes I wonder if they were right, since here I am, still working. But then Google appeared a few years later and it turned out my public relations and marketing approach to content awareness via relevant links was exactly what Google wanted to see in a link profile. Sadly, link building was quickly appropriated by SEO practitioners and they never looked back. They ruined it. I watched them and shook my head. My business card has always shown my title as “Content Publicist – Linking Realist”. I’ll be the first to admit I have been wrong many times, but when it comes to understanding the inbound-linking aspect of Google’s algo, I’m pretty sure I’ve been proven right far more times than wrong.

    I have the utmost respect for you and have learned much from you over the years. No argument there :)

    • Ammon Johns

      Hi Eric. I’m truly delighted that you stopped by to leave a comment.

      I fully agree that you and I have a lot more in common than differences in our attitudes to many subjects, and especially links. But one thing that always amuses me about human nature is that it is always when people very close and similar have small differences that those differences stand out the most. :) People radically different to us we kind of ignore, or dismiss.

      You mention one of life’s great oxymorons there: Conventional wisdom. I always find it is like content marketing – all emphasis on the first word and barely a shred of evidence of the second one. :) There is no wisdom at all in not at least challenging and testing conventions.

      I think the reason I saw such difference between us, subtle though it may be, is partly an accident of wording. Where you wrote “While you shouldn’t knee-jerk your content (or link anchors) into a 100% Question/Answer format, you may …” it tends to leave the impression that okay, not a hundred percent, but we should maybe tweak more Q&A format by 60%, or 40%, or at least 20%. When you then show Q&A format titles and link texts, it strongly reinforces that assumption.

      Certainly, the article does have a feeling that it is telling us we should change a few things, even while doing ‘more of the same’ in regards to long tail, etc. While actually, to some extent, a little less long-tail work (at least in actually chasing phrases or particular words, even Q&A words) is probably the real outcome.

      Lots of SEOs until now have achieved a lot with specific wording, especially in the ‘long tail’ format, and that should be something that has changed now. My own advice is certainly to chase long-tail concepts rather than wording and phrasing. Plus, build up brand and authority signals, because there will be some extra competition on most phrases as a wider array of possible matches is presented thanks to semantics.

  • Writingprincess

    I don’t think what you’re saying is particularly revolutionary or contrarian. It makes sense. As an editor for more than 18 years, the one phrase I’ve always repeated to new writers is, “What are you really trying to say?” The fact is, Google and SEs trusted us too much to clearly communicate what we mean. However, I have to disagree that this is all about distilling search queries and machine learning while ignoring the conceptualization of content on the web pages. I’m not sure you can have one without the other. I see Hummingbird as a baby step in interpreting our sometimes cryptic communication style, returning the results we want, not what we asked for. I see this as a good leap forward for search and content makers, and a power grab for Google.

    Yes, if Google controls the concept of search queries, it ultimately controls what is meant by relevance, i.e., making it less likely that exact phrases will be returned and more likely that better marketers could get a leg up. But who is to say that after Google understands my search habits it won’t return search results outside of the big brands, to be more localized, even if those smaller brands don’t pay for AdWords? I think the cynical view is to see this move toward the semantic web as a money-grabbing venture and stacking the deck on Google’s part, and I’m not saying that’s not present, but if it were all about that, the system would implode, because users would get a little miffed if every time they put in a query, even one like “What’s the history of shoes?”, the results came back with a commercial entity such as Nike. How does that help Google’s credibility? Maybe I’m naive, but I think Google understands that search users are its meal ticket, not the people who buy the ads. (No one would buy ads if people weren’t searching…)
    To me, Hummingbird is a big ole’ customer-service ploy to users, alongside some other advancements such as machine learning etc.
    I think there’s a fine line between commerce and information, and Google has stretched, prodded and wobbled on that line for a while, and its new direction seems to fall out on commerce. But without the information-search part, the commerce wouldn’t last long, would it? I think you’re right that the semantic web, and schema, is just the construct by which the SEs are learning to read and understand what we’re communicating. I think you called it machine learning. Basically, I liken schema to teaching a computer how to read like a nine-year-old. We’ve already taught it the words and the letters; now we’re teaching SEs to understand the real meaning behind those words and, better yet, to anticipate the next adventure. These are exciting times in search, and a period that I believe Joseph Licklider wrote about in his “Man-Computer Symbiosis” essay. I am hopeful this is a strong step toward eliminating the schemers and dreamers and the tricksters, but I fear that as long as SEs have to depend on a construct such as schema, those will always exist.

  • Chase Anderson

    Why would Google need to launch an infrastructure update that made no impact on search just to adjust understanding of queries based on your location? They already do that, amazingly well.

    Hummingbird is about Google integrating a connection to knowledge graph into their infrastructure (the massed data that relates people to places and things, not the little box in the right side of the SERP occasionally).

    Nothing changed on Aug 27th, other than some retooled elements of the existing algorithm working with the new infrastructure. What changed then? Google added the ability to apply algorithm updates to the knowledge graph data. Semantic understanding of queries is about using the knowledge graph and its understanding of words and their relationships to people, places and things. I explain in a lot more detail on

    • Ammon Johns

      Did you see the section of the post with the heading “The Cause of Confusion”? Re-read it, as you’ve just made the exact same mistake. Google did NOT have a launch just for Hummingbird. They had a 15th-anniversary event at which they unveiled many changes, only one of which was Hummingbird. The only thing they all had in common is that they are answers to “what has Google done lately”.

      • Chase Anderson

        Fair enough, I could see how you could come to that understanding but ultimately I don’t think I am confusing the messages.

        In your post you say:

        “It impacts on everything Google do in response to any sort of query, because it changes how Google understands every query. Yes it is applying the use of synonyms and semantics, but it is applying it directly to the query.”

        How does Google develop better semantic understanding of queries? Knowledge Graph treats people, places and things in the same relational way our brains do. If you and I agree that Hummingbird is improved semantic understanding of queries, then my position is that the way Google has done that is entirely by connecting search queries to Knowledge Graph.

        From Danny’s coverage of the event:
        “Gave us an opportunity, Hummingbird did, to take synonyms and knowledge graph and other things Google has been doing to understand meaning, to rethink how we can use the power of all these things combined, and predict how to match your query to the document in terms of what the query is really wanting and the connections available in the documents, and not just random coincidence, as could be the case in early search engines.”

        Given the extreme technical challenge semantic understanding has, it’s actually a massive leap forward to incorporate the support KG could provide in really understanding the language of queries. I honestly don’t see how Hummingbird could be anything but involving a deeper connection infrastructurally between the search algorithm and knowledge graph.

        If it’s not the KEY component of hummingbird, then what in the world is google doing to apply such difficult language filters onto search without causing massive slowing of their result speeds?

  • Spook SEO

    Hi Ammon!

    I totally agree with you here. Hummingbird is focused more on ranking information based on a more intelligent understanding of search requests. You’ve got a very interesting post here!