How to Fix the Most Common Advanced SEO Issues – Part 1

by Alan Bleiweiss  |  Published 9:00 AM, Mon June 16, 2014

AUTHOR NOTE – This is part one of a mega-post on what the most common advanced SEO issues are, why they need more attention than most SEOs give them, and how to fix them.


Many of the most important SEO considerations go unchecked while people fixate on other things.

It’s a true shame, really. Shiny objects, magic bullets, FUD—all contribute to this problem, of course. Yet if site owners, site managers, and SEOs would just go one step further in understanding some factors, most sites on the Web would be vastly improved. And many penalties would be avoided entirely.

What I’m talking about is not just for SEO. It applies to SEO, yes. However, it’s truly about user experience.

I’ve been saying this for years:

Google is only attempting to emulate users through formulaic methods.

Some people truly understand this concept. Liz Lockard tweeted this very concept recently:

Google - The Robot Costumed Customer

While Liz applies this concept to the content people read in that tweet, it applies to the entire life cycle—what do they want to find in search results? What do they want to have happen when they get to your site?

If you’re fixating on a few things that only partly answer those questions, or if you’re fixating on trying to trick the robot-costume-wearing search algorithms, you’re going to fail at other things. And that’s going to cost you more than you might realize until it’s too late and you need to hire someone like me to perform an audit after it’s all gone ugly.

SEO Patterns in Big Data

When it comes to understanding where most people fall short in SEO, I’m not claiming here that I’ve gotten anywhere near scratching the surface of the data volume that Google has, or that any of several big-name, reputable resources have.

However, after having performed more than 175 deep forensic site audits over the past few years, I’ve accumulated a lot of data on the most common “advanced” problems sites face when they’re brought to me.

When I say a lot of data, I mean a LOT.

Over One Billion Indexed Pages

And this is just the data on the audits I kept record of when I migrated to my Mac this year from my old PC!

Not Just Any Data

The numbers don’t reflect just any data either. They’re directly tied to sites that needed major help with a wide range of issues. Most had been penalized algorithmically, manually, or a combination of both. All needed to improve rankings and visits originating from organic search.

So the one key common factor in this data pool that matters most to this article is that every one of those sites had problems causing it to fail to gain enough authority or trust signals to get maximum value from Google’s system.

I am no magician

Of course, I would also be remiss if I failed to state, for the record, that having done that many audits does not mean every one of those 178 sites ended up resolving all of its problems afterward. No, that would be an outright lie.

Instead, what I can say for the record is that most site owners, after getting one of my audits, have found it quite difficult to implement every recommendation I’ve made. Whether it was due to limited resources, financial constraints, corporate cross-division politics, or some other reason, when you’ve been given hundreds or thousands of hours of new tasking, it isn’t always easy to get it all done.

Even though that’s the case, more than 95% of those sites that were able to execute every recommendation have seen improvements. Often vast improvements. Sometimes, when the stars are aligned, seemingly miraculous improvements.

The Harsh New SEO World (And I’m Just One Guy With One Opinion)

We live in a harsh SEO world. And it gets more and more complex every year, both because of all the changes Google implements and because their penalties get more and more severe.

In the new, harsher SEO world, some sites are just going to suffer longer.

Gratefully, many sites I’ve audited have also seen at least some improvements even when only a portion of my recommendations have been implemented.

Once in a while, even sites that have taken every action I’ve provided them continue to flounder.

7 months of SEO with no improvements

That’s not a matter of missing the mark on what’s needed. It’s just that sometimes we can’t know up front how much heavy lifting may be needed. Or in some cases, it’s literally dependent upon Google refining something in their algorithms and taking the sting out of a previous update.

Eventual SEO Recovery

Given my experience, what I present in this series ranks pretty high up there when it comes to helping sites with advanced problems. Just be sure to take what I say here and test it for yourself if you’re doing or overseeing SEO. Don’t just rely on my claims.

While Every Site Is Unique, Most Sites Share Common Advanced Problems

SEO Priorities

Through all my audit work, I’ve learned that every site is unique in its specific SEO problems. The exact issues each site has vary greatly. Many sites suffer from varying degrees of those issues as well. So while two sites might both have poor content issues, one might have more problems as a result of duplicate content and the other might have more problems due to a lack of content depth.

During my audit work, one of the most important challenges is to properly classify issues in regard to level of importance.

Even if a site suffers from duplicate content, there might be other issues that need to be dealt with first.

Even though there are unique differences to every site, I’ve also found that most sites share at least some big, complex advanced problems—common flaws or issues. And more often than not, those common issues turn out to be high-priority tasking during the implementation phase after I deliver my audit.

Basic SEO Fails

Most people who “do” SEO can barely even grasp the fundamentals.

They read something or hear someone say something about basic SEO requirements, then they go off running into the wind, (falsely) confident they have all the information they need.

Or they rely on tricks and fake signals that used to work. Or maybe some that still work because Google hasn’t caught them yet. That’s crazy, foolish, and dangerous to a business’s future.

People who don’t grasp the importance of advanced SEO can destroy any chance of sustainability.

Just a few examples of this include:

  • 50 pages on the same topic, with each page having a variation of that single topic’s keywords. #AsshatSEO
  • Stuffing one or two keyword phrases over and over into every paragraph on a page. #AsshatSEO
  • Stuffing fifty keyword variations into a single page. #AsshatSEO
  • Getting as many links as possible pointing to the site, from anywhere they can. #AsshatSEO

Basic Problems Require Basic Training

Of course, these and many more problems stem from not grasping fundamental SEO and not recognizing the ramifications of those actions. Yet for the most part, these aren’t even the problems I’m going to be discussing here. If you can’t grasp why getting even the most basic SEO right is vital, you’re already in way over your head.

If you fail at basic SEO, step away from the SEO keyboard

Want to get better? Don’t just breeze through some article on the topic. Really learn the full spectrum of basic SEO. I recommend taking the time to read The Beginner’s Guide to SEO over at Moz. It’s a great resource.

Oh, and if you take anything less than several hours over a period of several days to first read that guide, and if you then fail to pause long enough to comprehend how and why that knowledge needs to be seen from a big picture perspective, and if you then fail to actually put it to use in the real world, I can assure you, you’re still going to be doomed to fail.

Only AFTER all of that, when you are confident you truly understand THAT level, come back here. Because I’m going to talk about higher level concepts here.

Example “Basic” SEO Problem

Google re-writes titles

Greg Boser is one of the best people in our industry when it comes to grasping the bigger picture. He’s right about that—Google thinks your page titles suck. And instead of relying on site owners to fix them, Google’s own system has gotten better at writing them than most site owners, so they just replace yours with theirs in the search results.

As a result, more sites are getting clicks they would otherwise have missed out on because of their own page titles.

Don’t ignore sucky page titles. If you FIXATE on perfecting them though, #You’reDoingSEOWrong.

I’m not saying this means you can ignore your sucky page titles. Just that if you FIXATE on perfecting them, when much more critical problems exist, you’re making very poor use of your time in most cases.

That doesn’t mean I wouldn’t include page titles in one of my audit action plans. If your page titles are TRULY SUCKY, to the point where they don’t even come close to communicating what each page is about, you are harming your site’s chances of being properly understood.

At the same time, however, maybe all you need to do is make a couple of quick changes to your site templates to auto-generate most or all of your page titles in a more refined way. Or you can just take a syntax/method I recommend and have a clerical worker replace your page titles. So the cost to fix those can be relatively low, while the value of improving them from an advanced SEO perspective can be very important to seeing gains.
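To make that “quick template change” idea concrete, here’s a minimal sketch, in Python, of what auto-generated titles could look like. The “Page | Category | Brand” pattern, the field names, and the length cutoff are just illustrative assumptions for this example, not a prescription from any audit:

    # Minimal sketch of template-driven page titles. The pattern and the
    # 65-character cutoff are illustrative assumptions only.
    def build_title(page_name, category, brand, max_length=65):
        """Join the pieces, dropping the least important ones if the result
        would run too long to display cleanly in search results."""
        for parts in ([page_name, category, brand], [page_name, brand], [page_name]):
            title = " | ".join(p.strip() for p in parts if p and p.strip())
            if len(title) <= max_length:
                return title
        return title[:max_length].rstrip()

    print(build_title("Blue Widget 3000", "Widgets", "Acme Tools"))
    # Blue Widget 3000 | Widgets | Acme Tools

The point isn’t the exact pattern. The point is that a single template change can lift every title on the site to “clearly communicates what this page is about” without anyone hand-editing thousands of pages.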

If they’re “good enough” from my perspective, I won’t even bother including page title fixes in an audit, because at that point in your site’s life cycle, I can trust Google to do a better job with the rewrite.

Basic SEO Still Needs To Be Refined At Some Point

Many of the sites I audit could use fixes to even basic SEO. So I’m not saying to ignore those. If your site has gotten out of the starting blocks though, and if it previously had strong rankings, or even still has somewhat strong rankings, those basic issues are almost always less of a priority than the type of advanced problems I find and will be discussing here.

When a site is as challenged on a complex level as those that come to me, some of those basic problems need to be set aside for the time being. Then, after the audit, and after the implementation of resolutions I put forth to address those big issues, it would make sense to work on refining the basics. This can only boost quality, relevance and trust signals even more.

So if your site is hurting as badly as those that come to me for an audit, then AFTER you get through the advanced problems, you would be wise to refine other things, like those page titles. Because even though Google does a better job than most site owners, you do not want to rely on them for that long-term.

Advanced SEO Concept – Signal Relationships

At a basic level, any individual signal by itself matters. Properly seeded page titles communicate to search engines, “this is what this page is about.” They’re the first place search engines look to begin that understanding.

Primary Topic Signal Examples

At an advanced level, page title seeding is positively or negatively impacted by many other signals. If the primary focus of a page is too diluted, the most refined page title on earth will fail to pass the “this is the primary topical focus of this page” test. If a page about topic X is buried in the middle of a section on the site where the overall message of that section is “this section is about topic Y,” that will be one more conflicting signal.

Indexation Signals

If you have a canonical tag in a page header that says “this page URL is the one we want indexed,” and you also have a meta robots tag on that page to “noindex, nofollow” that very same URL, those are conflicting signals. If you don’t have a canonical tag on the page, and you do have a noindex,nofollow meta robots tag saying “don’t index this page,” that’s going to be ignored if enough other signals (inbound links from other sites, for example) point to that very page.
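As a rough sketch of how you could spot that first kind of conflict across a crawl, here’s a minimal Python example that flags a page whose canonical tag and meta robots tag contradict each other. It uses BeautifulSoup for parsing; the function name and the sample HTML are illustrative assumptions only, and a real check would run against live pages rather than a hard-coded string:

    # Minimal sketch: flag a page whose canonical tag says "index me at this URL"
    # while its meta robots tag says "don't index me at all". Illustrative only.
    from bs4 import BeautifulSoup

    def conflicting_index_signals(html):
        """Return a list of human-readable conflicts found in the page head."""
        soup = BeautifulSoup(html, "html.parser")
        canonical = soup.find("link", rel="canonical")
        robots = soup.find("meta", attrs={"name": "robots"})
        conflicts = []
        if canonical and robots and "noindex" in robots.get("content", "").lower():
            conflicts.append(
                "canonical asks to index %s, but meta robots says %s"
                % (canonical.get("href"), robots.get("content"))
            )
        return conflicts

    sample = """
    <html><head>
      <link rel="canonical" href="http://example.com/widgets/" />
      <meta name="robots" content="noindex,nofollow" />
    </head><body></body></html>
    """
    print(conflicting_index_signals(sample))

Run against a full crawl, that same kind of check surfaces the pages that are quietly sending Google two opposite instructions at once.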

Those primary topic and indexation examples above are just scratching the surface of advanced SEO. And even within each of those, there are many other factors to consider.

Beginning The Multi-Signal Understanding of SEO

So far, I’ve provided an introduction to why advanced SEO is so important, how to see that basic SEO sometimes needs to be set aside if enough advanced problems exist, and have just begun offering examples of the complexity involved in dealing with advanced issues. In Part 2 of this series, I’ll start diving into real advanced SEO, and how to both identify when advanced problems exist as well as how to address those issues.

Check out How to Fix the Most Common Advanced SEO Issues – Part 2: URL Parameters!

About 

January 2014 marks Alan's 20th year as an Internet Marketing professional. Providing SEO solutions to clients since 2001, Alan specializes in forensic SEO audits and related consulting services to select clients around the world. Visit his site for more information on how he might be able to help you.

Comments
  • Antoine

    Hello Alan. Great article, but there’s something I’m curious about: you say that if a page has a robots tag set to noindex,nofollow, it could still be indexed if signals such as backlinks are beyond a certain threshold. Never seen that before… can you confirm?

    • http://searchmarketingwisdom.com alanbleiweiss

      Antoine,
      Just saw this – would have responded sooner if I had seen it before. Yes, there are situations where Google DOES ignore the robots.txt file. Buried in their “Crawling” FAQ, they specifically state that. And I have seen cases where that’s happened.

      From the FAQ:
      “Note that using a robots.txt file to prevent Googlebot from crawling parts of your site does not guarantee that those URLs won’t be indexed. If we find other pages linking to yours, we may include your URLs in our index without actually crawling them. If those URLs are already indexed, we’ll most likely keep them in our index for awhile even if we are not able to recrawl them.”

      • Antoine

        Hello Alan. Thank you for your reply. The FAQ tells us about “robots.txt”, not about the meta robots tag. Crawling versus indexing…

        • http://alanbleiweiss.com/ alanbleiweiss

          ah wow, not sure how my mind read it wrong. Google has officially stated that if you use the noindex robots tag, they will not index the page, or, if it was previously indexed, they will remove the URL the next time they crawl it and see the noindex tag.

          While I have not seen a robots noindex tag ignored due to links pointing to a page, I have seen it ignored due to a conflicting canonical tag set up on the site where other pages on the site point to it.

          So I do not believe any signal is 100% safe, even if Google officially states otherwise.

          Heck, even the official statement about respecting the meta robots tag states

          “Note that because we have to crawl your page in order to see the noindex meta tag, there’s a small chance that Googlebot won’t see and respect the noindex meta tag. If your page is still appearing in results, it’s probably because we haven’t crawled your site since you added the tag. (Also, if you’ve used your robots.txt file to block this page, we won’t be able to see the tag either.)”

          See there where it says “probably”? That’s the open door to “there may be other reasons, but we’re not going into them here”. And I believe that’s because their system is far from perfect.

  • http://WebEminence.com/ Ryan Bowman

    Nice to read an advanced SEO article. I always seem to be reading the same basic posts that regurgitate the same simple concepts.

    I understand Liz’s tweet but not sure I agree. I’m always advocating building sites for humans and not worrying about Google too much. You have to make sure you’re compliant with the signals Google looks for, but their focus is on user experience and that should be the publisher’s focus too.

    • http://searchmarketingwisdom.com alanbleiweiss

      Ryan,

      While in an “ideal” world, you would only build for humans, the web is a messy place. And Google uses multiple layers of algorithms to attempt to emulate user experience via formulaic methods.

      They require consistency of signals, standardized methods of communicating. While content itself is pretty straightforward, the technical layers involved in distributing that content are extensive, and that’s where most of the “messy” comes from. So sure – cater to humans first.

      Always cater to humans first. Just make sure it’s delivered in a way search engines can process without becoming confused.

      • http://WebEminence.com/ Ryan Bowman

        Makes sense – it’s a balancing act. But a lot of people write for search engines first and neglect human readers totally – to the point where even if a site ranks, it’s unusable to humans. Luckily, Google seems to have filtered out a lot of that stuff. But small businesses still write for search engines and even though they have no meaningful rankings, their direct & referral traffic has to suffer through the content created for search engine bots. That’s where thinking Google is your “ideal customer” can become problematic.

        • http://alanbleiweiss.com/ alanbleiweiss

          What I think it comes down to is the age-old “critical thinking” factor, and how that’s a major failure for too many people. Write for users and ignore the algorithm considerations, or write for search engines and ignore the human factor. Either way, they’re both bad for business.