March 30, 2006

Search and Destroy


In an attempt to get back at the DOJ for dragging them into court, Google went ahead and deleted all of the pages of a government website. Just kidding!

Well, about it being Google’s fault, anyway. Barry of Search Engine Roundtable has started a thread on the SER forums commenting on a post by Alex Papadimoulis entitled The Spider of Doom. The post explains how webmaster Josh Breckman was contracted to develop a content management system for a ‘fairly large government’ site. Breckman was asked to design a system that would allow employees to log in and make changes to the content on an ongoing basis. Because there was already an active website, the client also wanted to be able to ‘reorganize and upload’ the old content onto the new site before it went live. According to Papadimoulis’ post, things were going quite smoothly until one day all of the content vanished.

Was it Google out to settle the score? An international hijacker hoping to learn government secrets? A touch of corporate espionage? Jeeves out for world domination? No, none of that. After some searching and scrambling, Breckman was able to locate the cause of the wipeout: the dastardly wicked Googlebot! How did it happen?

“A user copied and pasted some content from one page to another, including an ‘edit’ hyperlink to edit the content on the page. Normally, this wouldn’t be an issue, since an outside user would need to enter a name and password. But, the CMS authentication subsystem didn’t take into account the sophisticated hacking techniques of Google’s spider.”

Yes, that’s right. The Googlebot went in and hit delete on every single page of the government website. But you can’t blame the Googlebot; it was just doing its job. That’s what you get for leaving open edit links on the front end of your site.

According to Papadimoulis, Breckman was able to restore an older version of the site from backups, and he pointed out the root cause to the client: the site’s “security” could be bypassed simply by disabling cookies and JavaScript, which is exactly how a search spider browses. However, the government site owners didn’t quite understand what had happened and instead ordered Breckman to never copy and paste content ever again. Yes, that’ll do it.
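The failure mode is simple enough to sketch. Here’s a hypothetical toy version in Python (none of these names come from Breckman’s actual CMS): the “authentication” lives entirely in client-side cookies and JavaScript, so the server happily deletes a page for any client that requests the link, including a spider that sends no cookies and runs no scripts.

```python
# Toy model of the flaw: delete actions exposed as plain GET links,
# with no server-side authentication check. All names are hypothetical.

pages = {"home": "Welcome", "about": "About us", "contact": "Reach us"}

def handle_request(path, has_auth_cookie=False, server_side_check=False):
    """Dispatch a GET request. `server_side_check=True` models the fix."""
    if path.startswith("/delete/"):
        name = path[len("/delete/"):]
        if server_side_check and not has_auth_cookie:
            return "403 Forbidden"      # the fix: verify auth on the server
        pages.pop(name, None)           # the bug: trusts the client blindly
        return "200 Deleted"
    return pages.get(path.lstrip("/"), "404 Not Found")

def crawler(links):
    """A spider like Googlebot: no cookies, no JavaScript, follows every link."""
    return [handle_request(link) for link in links]
```

Run the crawler over the exposed delete links and the content is gone; flip on the server-side check and the same request is refused:

```python
crawler(["/delete/home", "/delete/about"])   # pages silently wiped
handle_request("/delete/contact", server_side_check=True)  # "403 Forbidden"
```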

One thing is for sure: Beware of Googlebot…and client-side security.







