Oct 11

Introducing the Controversial Theory of “Peak SEO”

oh frack!


Over the last few years in the oil industry, a popular theory has arisen that the world’s easily obtainable oil is running out.  The problem of increasingly difficult-to-extract oil is compounded by the fact that oil-producing countries are in many cases third-world nations that are industrializing and now consuming more oil themselves, making them a less likely source of exports in the future.  The theory (called “Peak Oil”), although controversial, makes sense, as the supply of oil underground is relatively fixed.  Although natural processes doubtless create more oil, that apparently happens slowly, over geologic time, while we’re extracting it relatively quickly; it’s not really a renewable resource from the perspective of beings that live for only 70-100 years.

This theory is now being applied by others to the gold mining market (the theory of “Peak Gold”), and when you look at graphs of production over time, the consistent rise in how much capital it takes to extract an ounce of gold from the ground, and so on, it’s a very compelling theory as well.

I have been thinking about similar problems in the search world for some time now and would like to introduce the concept of “Peak SEO” – in other words, the concept that the supply of available keywords and SERP positions is essentially fixed, that the low-hanging fruit is steadily being eliminated, and that the ROI of SEO is therefore shrinking every year.

The Supply of Available Keywords is Fixed

Well, in one sense, this is not exactly true.  Google’s VP of Engineering famously disclosed in 2007 that:

“20 to 25% of the queries we see today, we have never seen before.”

And currently Google’s website states:

“Since 2003 Google has answered 450 Billion new unique queries – searches we have never seen before.”

Certainly, that’s great in the sense that new queries like [Justin Bieber] and [Occupy Wall Street], which few were searching on before, do become popular and offer new opportunity.  But many – in fact, I would argue *most* – of the new searches Google refers to are simply what I would call “garbage” searches: so long-tail that no one in the entire history of mankind is likely ever to type them again.  Think of terms you’ve typed yourself, like [squeaky floor remedy if shim and nail approaches did not work].

My point here is that the supply of available keywords that *matter* from an SEO standpoint is quite fixed.  You’re not going to bother creating a page optimized for the term above just so one or zero people can click on it in the next ten years; there’s no ROI in it.

Much of what SEO practitioners do is research keywords (usually using the Google AdWords Keyword Research tool), select keywords that are high or medium volume with less competition, and then work on creating content, with links to it, that can rank well.

But think about this.  Clearly, at some point in the old days, [debt consolidation] must have been a highly lucrative and achievable term to try to rank for.  Well, try ranking for that term now; good luck with that.  See my point?  If that keyword has run dry of opportunity from an ROI standpoint, OK, no problem: go find another one.  But what happens when that one runs dry?

Each keyword is like a tiny oil well that will always give up some oil, but only with increasing effort over time.  Eventually the keyword becomes too expensive to bother with, so you must move on.  But what happens when there are no more keywords to move on to?

Empirical Evidence

I wish I had been tracking keyword ranking difficulty in SERPs for the last ten years, but I have not.  However, I do have one representative piece of anecdotal evidence; hopefully others will offer their own in the comments below.

I recently did some keyword research for a major consumer products producer, and organized all the keywords I could come up with into a list of 75 categories (this was an overall list of keywords across a pretty diverse range of topics, and the keywords I found numbered in the many thousands).

Then I went through the list and filtered it down to keywords that had over 1,000 searches a month and a competition level such that it shouldn’t be too difficult for this client to rank at least #15 with some reasonable effort.

For 21 of the 54 categories, I couldn’t find a *single* keyword worth optimizing for; they were all far too competitive, and the ROI just wasn’t there.  Even within the remaining categories, where I did find some good target keywords, many were too competitive as well.

Altogether, of those keywords with 1,000 or more searches a month (3,353 of them), I identified 583 that were low enough competition to bother with.  That’s a ratio of only 17%.

Had I looked at the same keywords five years ago, that number would almost certainly have been higher than 17%. Yes, there was less traffic five years ago, so perhaps fewer terms would have cleared the 1,000+ searches bar, but competition is the key item to focus on here.
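For concreteness, here’s a minimal sketch of the culling step described above.  The terms, volumes, and competition scores are all invented for illustration, and I’m assuming a simple 0–1 competition score as a proxy for ranking difficulty:

```python
# Hypothetical keyword data -- not real research output.
keywords = [
    {"term": "stain remover",          "monthly_searches": 4400, "competition": 0.91},
    {"term": "remove grass stains",    "monthly_searches": 1900, "competition": 0.42},
    {"term": "diy upholstery cleaner", "monthly_searches": 1300, "competition": 0.35},
    {"term": "squeaky floor remedy",   "monthly_searches":   90, "competition": 0.10},
]

MIN_SEARCHES = 1000      # the volume floor used in the post
MAX_COMPETITION = 0.50   # assumed cutoff for "rankable with reasonable effort"

# First cull: keep only high-volume terms.
eligible = [k for k in keywords if k["monthly_searches"] >= MIN_SEARCHES]

# Second cull: of those, keep only the low-competition ones.
worthwhile = [k for k in eligible if k["competition"] <= MAX_COMPETITION]

ratio = len(worthwhile) / len(eligible)
print(f"{len(worthwhile)} of {len(eligible)} high-volume keywords "
      f"are low-competition ({ratio:.0%})")
```

With my real data the same two-step cull gave 583 worthwhile keywords out of 3,353 high-volume ones – the 17% figure above.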

Let’s Extrapolate and Forecast

Here’s an easy way to extrapolate those numbers: let’s say the era of SEO began when the Rand Fishkin article appeared in Newsweek magazine in 2005.  It’s been six years since.

Let’s also assume that all the keywords I examined started out at that point with no competition.  That would mean 83% of the worthwhile keyword/SERP positions have been taken in six years, which would leave something like *one to two* years for the remainder to be taken up.

You could well argue that the era of SEO began earlier than that.

Fine, let’s assume then that the era of SEO began in 1998, when Google was founded (it’s hard to make any assumptions earlier than that, right?).  That means 83% of the worthwhile keyword/SERP positions have been taken up in 13 years, which would leave roughly *three* years for the remaining 17% to be covered, assuming everyone going after these terms and SERP positions continues to occupy additional ones at the same rate.
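This is just back-of-the-envelope arithmetic – assume positions get occupied at a constant rate and extrapolate linearly.  Written out as a sketch:

```python
def years_remaining(share_taken: float, years_elapsed: float) -> float:
    """If `share_taken` of the worthwhile positions were occupied over
    `years_elapsed`, how long until the rest is gone at the same rate?"""
    rate = share_taken / years_elapsed       # share occupied per year
    return (1.0 - share_taken) / rate

# 83% taken over the six years since 2005:
print(round(years_remaining(0.83, 6), 1))    # prints 1.2

# Plug in other elapsed-year figures for earlier starts of the SEO era.
```

Obviously linear depletion is the crudest possible model – competition probably accelerates as more practitioners pile in – but it gives a rough lower bound on how fast the pool empties.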

Certainly, this particular keyword set may be more competitive than the average set of relevant keywords, but you get the general idea.

It Gets Worse – The Supply of *all* SERP Positions is Actually Decreasing

This is highly analogous to the part of the “Peak Oil” problem where oil-exporting countries exacerbate the shortage by industrializing and using more oil themselves, leaving less and less available to export.  I am of course talking about Google taking over its own SERPs, with things like:

– Local Search Results
– YouTube Videos
– Google News Items

I don’t need to detail all of this – if you want the best survey of the shrinking available real estate on Search Engine Results pages, just read one of Aaron Wall’s great diatribes on this.  He has written on this many times, but here are a few:


Suffice it to say, Google is busily chipping away at the SERPs and presenting its own results rather than others’, leaving that much less keyword “oil” available.

It Gets Worse – The Number of People Practicing SEO is Increasing

This certainly doesn’t help the individual practitioner!

I don’t have any statistics to offer, but it seems intuitive: online marketing is a hot area, and more and more people are employed in it every day.  The proliferation of SEO tools, books on SEO, and so on is probably increasing the number of people doing SEO, whether full-time, part-time, or one-time.  Significantly, products like HubSpot also make SEO more accessible to the average small or medium business, getting people who are less SEO-savvy to start targeting keywords and optimizing content.

Does the Oil Industry Point to a Way Out?

Well, we can look to the Natural Gas and Oil markets and see that they have an out.

The Natural Gas drillers introduced two techniques to improve production.

One is called “fracking.”  No, this has nothing to do with old-school Battlestar Galactica; it’s short for “fracturing.”  It involves pumping water, sand, and chemicals into shale at high pressure, creating cracks in the rock; natural gas then flows out of the shale through those cracks and up to the surface.

The other technique natural gas drillers introduced was horizontal drilling.  Rather than drilling holes straight down and hoping they hit something, drillers now determine the depth at which natural gas sits, based on other successful wells, drill down to that depth, and then drill *sideways*.  Instead of facing uncertainty in X, Y, and Z, they cheat: they start with a known good Z, fix Y, and just vary X, which yields a huge jump in the probability of finding something.

The result of these two techniques was that the supply of natural gas skyrocketed, prices crumbled, and there is now so much natural gas available that drilling for it has been a barely profitable business for a few years.

In the case of oil, a company named EOG Resources decided to apply the same two techniques not just to natural gas but to oil itself.  As a result, North Dakota is undergoing a *huge* boom in oil-industry employment, and the techniques, now adapted for oil, have the potential to revolutionize the entire industry worldwide.

So, are there any out-of-the-box possibilities for our industry?

Here are all the ones I can think of that could help delay the inevitable; other ideas are appreciated, of course.

1.) Mine other people’s algorithms besides Google.
Turning to channels such as Facebook, Twitter, Apple’s App store, videogame platforms, etc. is analogous to looking for oil in shale in North Dakota (or even on Mars) rather than in pockets in Saudi Arabia.  The rise of social media optimization offers a great opportunity for SEO practitioners, and anyone reliant on internet traffic, to branch out.  Fortunately for all of us, there are many algorithms out there to reverse engineer!

2.) Reduce the “I” in ROI through automation and Tools.
This can help tremendously.  Notice who made tons of money in the California Gold Rush: the guys who sold shovels, eggs, and Levis to the miners.  SEOMoz, anyone?

3.) Go after ever lower-volume keywords more efficiently.
If the threshold for worthwhile keywords can be lowered (to 500 searches a month, 250 searches a month, and so on), then certainly a much wider set of keywords becomes available (sort of like drilling for shale oil).  Automation and tools can help, as noted above, but so can user-generated content.  Perhaps hybrid models, where users create content and someone applies minimal effort to increase keyword density, optimize meta tags, etc., will become more prevalent.
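As a toy illustration of how lowering the bar widens the pool, here’s a sketch using a simulated long-tail distribution of search volumes.  The distribution and thresholds are invented, not real search data:

```python
import random

random.seed(0)
# Simulate a long-tail market: 10,000 keywords with monthly volumes
# distributed log-uniformly between 10 and 10,000.
volumes = [int(10 ** random.uniform(1, 4)) for _ in range(10_000)]

# Count how many keywords clear each successively lower volume threshold.
for threshold in (1000, 500, 250, 100):
    pool = sum(1 for v in volumes if v >= threshold)
    print(f"threshold {threshold:>4}/mo -> {pool} candidate keywords")
```

Real keyword data is of course lumpier than a clean log-uniform curve, but the direction is the same: each cut to the threshold opens up a meaningfully larger set of wells to drill.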

4.) Shift focus to “Query-Deserves-Freshness” Keywords
Google has disclosed in the past that certain keywords are determined to deserve “fresher” (newer) results, like [justin bieber].  Perhaps SEO practitioners will learn to focus on these keywords, and will increasingly ignore queries that don’t particularly deserve freshness, like [herman melville].

What do you think?

The data I’ve quoted for the one case above is based on a few thousand keywords in one market.  But it’s consistent with what I’ve seen with a few other clients.

To me, the question is not “are we going to run out of desirable keyword SERP positions?”  It is instead: “is it going to take 3 years, 5 years, or 10 years?”

What does your data show?  What monthly local search volume do you use when determining which keywords are worth generating content for?  Does anyone out there have any historical spreadsheets they can dig into from 5 years ago on SERP competitiveness they can compare to the same SERPs now?

I believe we have already entered the era of “Peak SEO,” and the next few years will see keyword difficulty increasing, opportunities growing scarcer, and businesses and SEO practitioners turning to other, more lucrative channels for traffic.  This is not a radical forecast; it’s really just a description of what I believe we’re already seeing.

Do you have any numbers?  The industry badly needs numbers on this!


  1. Andres says:

    It is really true. Not only is the landscape much more difficult than it was even a year ago; now many open Web 2.0 platforms are imposing tough rules for publishing content.

    It is not easy anymore.

  2. Treadmill says:

    I believe it was Aaron Wall who asked the question (paraphrasing), “How many success stories have you heard of businesses started since 2009 that succeeded primarily through SEO?” I haven’t heard any answers, but it would be interesting to hear some.

  3. Tom Pick says:

    I think the answer is a variation of your idea #1: think beyond Google. The largest search engines used to be Google, Yahoo and Bing. Now, by volume of searches, they are Google, YouTube and Facebook.

    Internal searches on other networks like LinkedIn and Twitter are also growing rapidly and not to be ignored. Optimizing for all of those other places where you may be found can dramatically increase your non-search-engine search traffic. And the links created can help your rankings in “real” (Google) search as well over time.

  4. Richard Wong says:

    Hey Ted. I’ve been reading your blog for the past few days and you come up with some really thought-provoking stuff. I hope you’ve been doing well. Cheers.


  6. Very fascinating read. I do agree that there is only a fixed number of industry-specific keywords and that yes, the SEO competition is getting very aggressive for these keywords.
    However, I don’t believe that makes it nearly impossible to rank well for high-competition keywords. The secret is to understand what your competitors are doing and to constantly find ways to expand and improve on their SEO efforts.

    The top ten positions do not permanently belong to whoever currently has them. You can take the top position for the top keywords, it just takes A LOT of work.

    I own PEAK SEO in Vancouver, BC, and I have been working hard on developing the closest thing possible to what could be called an SEO standard… the only problem is that the so-called “standard” is really determined by your competition. Just because they are on the front page of the search results doesn’t mean they know what they are doing; it just means that nobody has come along to outperform them.

