December 5, 2013

Greg Boser has been optimizing sites since the late 1990s. He’s spoken and written widely about the key tactical and strategic issues in organic optimization over the years, and is a regular speaker at the important search trade shows, including SES and SMX. Greg was also a participant in the historic Dave Pasternack/Rocket Science SEO contest of 2007.

Greg is a very busy man and we were very glad that he set aside some time to answer our questions about the current state of SEO. Note: all of Greg’s opinions are his own.

Didit: What is your position on Google’s encryption of search results and the loss of organic keyword data? How would you advise SEOs to deal with the “not provided” issue?

Greg Boser: From a purely SEO perspective, the reality is that we’ve lost one of the most important tools in our toolbox, and the loss of that tool means that making sound decisions relating to the proper course of action will be more difficult. All that said, marketers with a true SEO mindset always have been and always will be the cockroaches of the Internet. We adapt, overcome, and become stronger and better every time we’re presented with “the sky is falling” and “SEO is dead” scenarios. As far as suggestions for dealing with the data loss, I think Rand (Fishkin) covered most of the obvious options in his Whiteboard Friday on the subject. Beyond what he covered, I’d say the items that would provide the most value would be:

  • Historical data mining – It can be quite valuable to look at how keyword data from other sources matched up with the data you used to get from Google. If you do that and find that queries, bounce rates, landing pages, etc. are all fairly similar, then you can have more confidence when it comes to making future decisions off of a much smaller data set.
  • Internal search data – Mining internal search data has unfortunately become almost a lost art form, but it’s something worth spending time on.
  • Organic PPC spends – I think it’s important not to blow off PPC as a data tool simply because you don’t want to let Google win, which is an attitude I’ve run into quite a bit lately. The key to using PPC for organic testing is to make sure you are comparing apples to apples. Doing that requires more than just asking the PPC team for their conversion data. That’s not to say that their default data doesn’t have value. It just means that conversion data tied to unique landing pages specifically created with a direct-conversion mindset doesn’t usually do a good job of telling you how a true organic page that you would like to rank for certain phrases might perform. So what you really need to do is develop some specific campaigns that utilize only your true organic content as landing pages. Doing that can provide some really quick insights into possible courses of action.
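The first bullet’s idea of checking whether a smaller surviving data set tracks your historical Google data can be sketched as a toy comparison. Everything below (the keywords, the bounce rates, and the helper function) is invented for illustration only:

```python
# Toy sketch of the "historical data mining" idea: compare keyword-level
# metrics from two sources (e.g. pre-encryption Google referrer data vs.
# another source such as Bing or internal site search) to gauge whether
# decisions made off the smaller data set are likely to hold up.

def metric_similarity(old, new):
    """Return (keyword overlap as a fraction of the combined set,
    mean absolute bounce-rate difference across shared keywords)."""
    shared = set(old) & set(new)
    if not shared:
        return 0.0, None
    overlap = len(shared) / len(set(old) | set(new))
    mean_diff = sum(abs(old[k] - new[k]) for k in shared) / len(shared)
    return overlap, mean_diff

# Hypothetical bounce rates by query from the two sources
google_history = {"red widgets": 0.42, "buy widgets": 0.31, "widget repair": 0.55}
other_source   = {"red widgets": 0.40, "buy widgets": 0.35, "widget parts": 0.60}

overlap, diff = metric_similarity(google_history, other_source)
print(round(overlap, 2), round(diff, 3))  # high overlap + small diff = more confidence
```

If the shared queries, bounce rates, and landing pages line up closely across sources, that supports Boser’s point: you can trust forward-looking decisions made from the smaller remaining data set.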

Didit: Bruce Clay recently said that “the only people who should be worried about ‘Not Provided’ are wannabe in-house SEOs.” Do you agree?

Greg Boser: I’m not really clear on the context of Bruce’s comment, but since he and I come from the same era in this business, I’ll step out on a limb and say that his comment was a dig at the fact that a solid chunk of 2nd- and 3rd-generation SEOs never really took the time to learn and utilize all of the tools and datasets that some of us old-timers use.

If my interpretation is correct, then yes, I absolutely agree.

That “lack of proper learning” phenomenon tends to be more prevalent in in-house environments simply because you only have one client. Things also tend to move slowly, which means the standards used to determine success or failure are often pretty low.

In the consulting world where you are constantly required to identify problems and develop strategies across multiple channels, for multiple different clients, you can’t afford to be one-dimensional when it comes to the data you look at.

All that means that the true impact the data loss has on you is directly related to your overall experience. Which makes things a lot tougher for young, green in-house types 🙂

Didit: Do you think that Yandex or Baidu can provide the keyword information that Google is now encrypting?

Greg Boser: They certainly can’t provide the same data Google has removed, because the specifics of that data are directly related to a) the total volume of their user base, and b) the specific demographic makeup of that user base.

That said, I’m a big fan of Yandex. And I wish there was a way to get a substantial chunk of web users to take it for a serious test drive.

Didit: What can you tell us about Hummingbird, and how it’s different from Panda and Penguin? Some folks say that Hummingbird is a bigger and longer term issue than ‘Not Provided.’ Can you elaborate?

Greg Boser: Comparing Hummingbird with Panda and Penguin is about as apples and oranges as you can get. Panda and Penguin are iterative filters designed to plug unforeseen gaping algorithmic holes that showed up as a direct result of the rollout of Caffeine, which was a major infrastructure update.

Although the phrase “algorithm update” has been used a great deal when discussing Hummingbird, it is really an infrastructure update, and that fact is why I think all of Google’s PR associated with the rollout constantly mentioned Caffeine.

Infrastructure updates are fundamentally more impactful than what I would consider a true algorithm update (tweaks in the weighting of a fixed set of factors) because they allow the introduction of large amounts of new data, which ultimately increases the number of data points that can potentially be used in future algo tweaks.

That fact means that the initial impact of Hummingbird is fairly low, but the stage is now set for constant change to roll out over time. And I think the changes we’ll see will fall into 3 main areas:

  • SERP Consolidation – The first noticeable thing I think we’ll see is the type of reduction in SERP diversity that Ammon Johns is talking about in his post. Google’s ability to dynamically write titles combined with constant improvements in both the quantity and quality of knowledge graph data will ultimately result in the same basic set of results being returned for a much broader range of semantically complex queries. And that translates into fewer opportunities for companies to try and carve out less competitive SEO paths.
  • SERP CTR – The second impact we’ll see is a dramatic reduction of SERP CTR for a continuously expanding set of queries. Google is clearly going to push hard to provide more and more of the answers to queries themselves. And that means that moving forward, having a very strong understanding of which types of phrases within your space trigger KG (Knowledge Graph) results, and which ones don’t, will be critically important when developing organic strategies.
  • Author Rank – I think the biggest long-term impact that few (other than Bill Slawski) are actually discussing will be Google’s ability to finally connect all the dots regarding Author Rank.

Google’s historical inability to really understand topical authority has IMO been their Achilles’ heel for quite some time. But Hummingbird potentially gives them the ability to fix all of that. Once you have good historical data on the topic/sentiment focus of all three levels of content analysis (site level, page level, and author level), being able to get extremely granular with how both old-fashioned links and social shares are weighted gets a lot easier. And that fact makes a world powered by true Human Rank far more plausible.

Didit: SEMPO recently reported that salaries for SEOs are shrinking. Is SEO still a good career path for young people to pursue?

Greg Boser: I think it is pretty clear that SEO as a stand-alone discipline probably isn’t the best career choice long-term. However, that in no way means that the core skill set and overall personality traits you need to have in order to be a great SEO are no longer necessary in this new “digital marketing” world. In fact, I’d argue that they are more important than ever. In my experience, the new digital marketers that come from an SEO background have a much better ability to properly connect the dots between data and creativity.

Didit: What is your position on content marketing? Is it a bubble or is content truly king? (to use a lame marketing metaphor)

Greg Boser: Content marketing isn’t anything new, and it also isn’t a bubble. The only thing that has changed over the 17 years I’ve been in this game is the methodology involved.

In the beginning of SEO time, when things were driven almost entirely by on-page factors, content marketing simply involved writing and publishing great content (which is probably responsible for the metaphor coming to life in the first place). If you did that, things like audience and links developed naturally, as people looking for content resources in a search engine found you.

Now that process is completely backwards. In order to show up in a search engine today, you need to already have a substantial amount of links and audience. But there is no truly efficient new content discovery mechanism that works anywhere close to as well as a search engine, so now everyone has to run around begging those who have an audience to give their content some exposure in order to get the audience-building process started.

Didit: Google Plus – Yay or nay?

Greg Boser: As a place where society as a whole voluntarily hangs out in order to engage in 2D socializing? A big Nay.

As a place where the small subset of the web that is entrusted with defining what content Google should consider important hangs out? A big Yay.

Didit: Dan Shure in our recent interview said that “no one uses Bing.” Can marketers really afford to ignore these people?

Greg Boser: Many years ago, a friend who was involved with Ask Jeeves sat me down and asked me how one would go about spamming the butler. My answer was that I had no idea, but as soon as he started showing up in my log files, I’d let him know… 🙂

That type of resistance to the early-adoption bandwagon has always worked well for me, because to me, SEO and online marketing in general is all about efficiency. That means investing enough time to have a solid idea of what a new company/platform is trying to do, and where they want to go, but not wasting time diving in too deep before they have proven themselves worthy of your time.

For me personally, I’m still not ready to take the Bing plunge. But that doesn’t mean I’m not quietly standing on the sidelines rooting for them. 🙂

Didit Editorial