Forum Member Offers New Google Sandbox Theory
A new thread on the WebmasterWorld forums has quickly made its way to the front page, garnering a lot of attention. The thread, named Flattening Effect of Page Rank Iterations – explains the ‘sandbox’?, offers up a new theory to explain the much-hyped Google sandbox.
The proposed theory speculates that true Page Rank can only be calculated after multiple PR iterations. Thread author Grant hypothesizes that the duration of the sandbox is the ‘same amount of time it takes Google to iterate through the number of calculations needed to equilibrate the number of links for a site’. Yowsa, did you make it through all that?
Grant provides a detailed breakdown for this theory. I fear paraphrasing his methodology will weaken it, so I’m opting to leave it intact:
“If you analyze a site with 5 pages that all link to each other, the first iteration calculates that the homepage is PR 3.5, and all other pages are PR .365 – the largest PR gap that will ever exist through multiple iterations in this example.
This homepage PR represents a surge in PR because Google has not yet calculated PR distribution, therefore the homepage has an artificial and temporary inflation of PR (which explains the sudden and transient PR surge and hence SERPs).
In the second iteration, the homepage goes down to PR 1.4 (a drop of over 50%!), and the secondary pages get lifted to .9, explaining the disappearing effect of "new" sites. Dramatic fluctuations continue until about the 12th iteration when the homepage equilibrates at about a lowly 2.2, with other pages at about .7.”
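Grant’s exact figures can’t be verified, but the general mechanism he describes — an early overshoot for the homepage that settles down over repeated iterations — is easy to sketch. Below is a minimal power-iteration PageRank in Python; the five-page link structure (homepage and four subpages linking back to it), the 0.85 damping factor, and the variable names are all illustrative assumptions, not Grant’s actual model or Google’s implementation:

```python
# Illustrative power-iteration PageRank for a tiny link graph.
# This is a sketch of the general technique, not Grant's model;
# the link structure and damping factor (0.85) are assumptions.

def pagerank(links, damping=0.85, iterations=12):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # uniform starting guess
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for p, outlinks in links.items():
            share = rank[p] / len(outlinks)     # each page splits its rank
            for q in outlinks:                  # among the pages it links to
                new_rank[q] += damping * share
        rank = new_rank
    return rank

# Hypothetical 5-page site: the homepage links to four subpages,
# and each subpage links back to the homepage.
site = {
    "home": ["a", "b", "c", "d"],
    "a": ["home"], "b": ["home"], "c": ["home"], "d": ["home"],
}

after_one = pagerank(site, iterations=1)    # homepage surges early
after_twelve = pagerank(site, iterations=12)  # then settles lower
```

Running this shows the shape of the effect Grant describes: after one iteration the homepage holds most of the rank, and by the twelfth it has settled to a noticeably lower equilibrium value.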
What do you think? It would explain why new sites often appear in the SERPs and then immediately drop out of sight. But I suppose lots of factors could account for that. Forum members appreciate the fresh perspective, calling Grant’s theory both ‘intelligent’ and ‘refreshing’. The best thing about this thread, besides the new Page Rank theory of course, is that not everyone agrees with it, and intelligent debate is the end result.
Tedster, a WebmasterWorld forum administrator, praises Grant’s theory but says while he thinks Page Rank calculations are part of the sandbox effect, he doesn’t think they are the sole factor.
“There must be at least one other component to the sandbox effect, because we see it on some keyword searches and not others — and PR is not related to content.”
Tedster thinks deep links (not just links from the homepage) could help infuse sites and ‘short circuit the flattening effect that PR iterations produce’.
Other members, like Junior Member Oliver Henniges, have labeled Grant’s theory doubtful. Oliver says that for Grant’s theory to be accurate, Google would have to run an iteration loop every 48 hours, contradicting the widely held belief that the iterations are run together in a single pass.
And of course, then you have the members who still think the idea of a Google sandbox is just another urban legend circulating the Web, refusing to acknowledge that it even exists. Fun to theorize, but administrator Trillianjedi (bring on the ‘jedi’ rankings!) seems to knock that down pretty quickly.
And isn’t that precisely what makes these threads so interesting — it’s all rumor and every aspect is up for debate. Is there a Google sandbox? I don’t know. If there is, does Page Rank work the way Grant thinks? Maybe. No one can say for sure, but it sure is fun to talk about!
Users who found this WebmasterWorld thread interesting are encouraged to check out the strong new comments on the SER site. More people are chiming in as Grant’s theory regarding the Google sandbox becomes one very hot topic of discussion.