December 11, 2008

The Rarity of the Shared SEO Experiment


I’ve heard from a number of superstar SEOs that experimentation is the cornerstone of successful search engine marketing. Considering the competitive nature of this field, the results of such testing are often held close to the vest. It’s not every day that you get to see a search marketer’s detailed testing process and findings laid out for the masses. But that’s just what Matt Ridout did earlier this week.

On the SEOUnique Blog, Matt shared a little experiment with us. Using a test site that has apparently been around for at least three months, he says he:

  • Optimized the meta data.
  • Included keywords and phrases within the page content.
  • Optimized the images (presumably addressing the ALT text, file name and surrounding text).
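
The three on-page items above can be approximated with a quick self-audit script. This is a hypothetical checker built on Python’s standard library, not Matt’s actual process; the class name and sample page are invented for illustration:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects basic on-page SEO signals: meta description and image ALT text."""
    def __init__(self):
        super().__init__()
        self.has_meta_description = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_meta_description = bool(attrs.get("content"))
        if tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

# An invented page: one good meta description, one image missing its ALT text.
page = ('<html><head><meta name="description" content="Handmade oak tables"></head>'
        '<body><img src="table.jpg" alt="oak dining table"><img src="logo.png"></body></html>')
checker = OnPageChecker()
checker.feed(page)
print(checker.has_meta_description, checker.images_missing_alt)  # True 1
```

Running a pass like this before and after an optimization round gives you a concrete record of what actually changed on the page.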

After improving those three basic SEO issues, Matt sat back and watched his rankings dance. He recorded the SERP positions of his keywords before making the changes and then once a week for six weeks. He then plotted his findings on nifty graphs and proceeded to share them with everyone interested — like me and (at the moment) 47 other impressed Sphinn members.

(As an aside, if people are wondering what kind of content is really worth our Sphinns, in my opinion, this is it.)

Let me quickly outline his findings, though the experienced search marketers reading this probably already know how this goes.

  • During the first two weeks, there was a gradual increase in rankings for most keywords.
  • The third week of the experiment brought a dramatic rankings drop for 85 percent of the keywords.
  • By the fourth week, 70 percent of the keywords did a 180, surpassing their baseline rankings.
  • There was little movement in rankings during the final two weeks of the experiment.
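
Tallies like those above can be computed straight from raw position data. A minimal sketch in Python, using made-up rankings rather than Matt’s actual numbers (the keyword names and positions are purely illustrative):

```python
# Hypothetical SERP positions per keyword: baseline, then weeks 1-6 (lower = better).
rankings = {
    "kw1": [20, 18, 15, 30, 12, 12, 11],
    "kw2": [35, 33, 30, 50, 28, 27, 27],
    "kw3": [14, 14, 13, 25, 16, 15, 15],
}

def pct_dropped(data, week):
    """Share of keywords ranking worse (a higher number) in `week` than at baseline."""
    worse = sum(1 for positions in data.values() if positions[week] > positions[0])
    return 100 * worse // len(data)

def pct_improved(data, week):
    """Share of keywords ranking better (a lower number) in `week` than at baseline."""
    better = sum(1 for positions in data.values() if positions[week] < positions[0])
    return 100 * better // len(data)

print(pct_dropped(rankings, 3), pct_improved(rankings, 4))  # 100 66
```

In this invented data set, every keyword dips in week three and most recover past baseline by week four, mirroring the shape of Matt’s reported curve.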

(Another aside. The graphs were a bit confusing to me until I realized, thanks to commenter John Spickler, that a drop in rankings is visualized as an upswing on the graph and a rise in rankings appears as a dip. That makes sense considering high rankings are represented by low-value numbers, like the number one.) [He should have plotted them as negative numbers so he didn't confuse people like me. --Susan]
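
Susan’s negative-number suggestion is easy to sketch: negating each SERP position before plotting makes an improvement point upward on the chart. The positions below are invented for illustration:

```python
# One keyword's weekly SERP positions, lower = better (1 = top result).
positions = [20, 18, 15, 30, 12, 12, 11]

# Negating before plotting flips the visual: a climb in rankings now
# reads as an upward line instead of a downward one.
plot_values = [-p for p in positions]
print(plot_values)  # [-20, -18, -15, -30, -12, -12, -11]
```

With a plotting library like matplotlib, calling `ax.invert_yaxis()` on the axes achieves the same reading without transforming the data.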

I’ve never conducted an SEO experiment myself, so I was delighted to come across a study like Matt’s. In the time I’ve spent as a student of the industry, I’ve seen few case studies. Conferences are always good for a couple, but other than that, I continue to collect my industry knowledge from the news, vague secondhand accounts and the occasional example of an outlier that has dropped off the face of the SERPs. I can understand why the Colonel keeps his secret recipe secret, but I often wish that wasn’t the case in the search world.

Imagine if this cryptic attitude was the norm in the realm of science. Think of all the findings that no one would know about. Even scarier, think of all the developments that would have never happened (shoulders of giants and all)! How far could the industry have advanced by now if openness didn’t put someone at a disadvantage?

Am I naïve? Is my science comparison completely off base? Is secrecy the real reason I don’t see more case studies? Where do you go to get your information? If you know where to find case studies or where an Internet marketer is giving it all away, I’m all ears.





9 responses to “The Rarity of the Shared SEO Experiment”

  1. xie mo writes:

    We know that the Internet has become a necessary means for enterprises to promote their products and increase their visibility; after getting online, many businesses log their information into B2B websites. At the same time, relatively well-known sites, in order to enrich their own data resources, keep seeking out this kind of enterprise customer base to increase their own visibility.

  2. Mark Anderson writes:

    SEO experiments are not like science experiments for several reasons. By publishing all the results of science experiments, every other scientist benefits. SEO experiments on the other hand only benefit you if they are kept secret. If you publish your results, then you’ve leveled the playing field. We can’t all rank #1.

  3. Nick Stamoulis writes:

    That is great; very rarely do consultants and marketing professionals share this much of their testing information.

  4. Jim Gaudet writes:

    For on-page SEO you can just view the source of any page you want, so you can teach yourself.
    On-page SEO will help with your terms, but links, links, links and traffic, traffic, traffic.
    That’s the potion.
    As a practice, though, I make sure all my code is validated and clean. This helps when optimizing for SEO.

  5. Matt Ridout writes:

    Hi,

    Thanks so much for taking the time to read my post and for posting your feedback about it. I hope to do more of these experiments and share the insights soon.

    Matt

  6. Dr. Pete writes:

    I hate to say it, but as a former research scientist, I have to admit that that kind of secrecy is also common in the scientific community. Fortunately, all of the incentives in academic research are around publishing and most people want to extend knowledge, so the important things eventually get shared.
    I’m sure some of it in SEO is wanting to keep a competitive advantage, but it’s also just very hard to do research. Most of what people report (like Matt’s post) is really observational – you carefully track what happened and report it. It’s in the field, though, by necessity – you can’t move your website into a laboratory and control everything that’s going on. So, until dozens of SEOs do a similar observational study and compare notes, we only have a piece of the puzzle. I’m not insulting this approach in any way – I wish more people reported numbers like this. It’s just that experimentation in the traditional sense is very difficult for organic SEO, and most people don’t know where to start.

  7. Virginia Nussey writes:

    Mark, you’re right. I realize that competition is a real concern for SEOs, but whenever someone does share their findings, I find the results fascinating.

    Thanks for your insights, Dr. Pete. I didn’t realize the scientific community was so secretive. Maybe there’s hope for the SEO community to put the puzzle together after all…

    And thanks, Matt. I look forward to your future experiments!

  8. paisley writes:

    lol… interesting. I started an SEO experiment in public with 12 other people last week; 5 of the 12 have #1 rankings for the terms we targeted. 1 strayed from the original experiment, 4 did not.

  9. seo philippines writes:

    It’s very interesting, and I think it’s very useful for us. Thanks for the info! Really appreciate it.


