Deteriorating search engine results
The other day at the Bruce Clay Australia offices we had a discussion about how the search results seem to have deteriorated over the past 3–4 years. Whether it was searching for cheap hotel rates in Melbourne or trying to find a review for a laptop, it seems more time has to be spent and more searches have to be carried out just to find the relevant information. There are a few possible reasons for this, and I am going to explore some of them.
With all the advances in technology and search engine algorithms, are we expecting too much from the search results? In the past few years, everything has been geared around instant gratification (I think this was a driving force in the development of real-time search) and we expect results that are not only accurate but immediate. Now I struggle to recall my searching habits of 4 years ago, but I have a sneaking suspicion that I was willing to cut all the big search engines some slack when it came to searching. I don't think they have that luxury any more.
I think a big part of these heightened expectations comes from knowledge. I know Google has 5,000 PhDs just sitting around thinking of ways to improve the algorithm, but when I want to get the cheapest accommodation in Melbourne I don't want to have to navigate between 10 different, yet almost identical, aggregator sites in the SERPs before giving up in frustration and ending up in Wagga Wagga, just because it's easier to find a cheap hotel there.
Another interesting observation is that more and more users are using longer (long-tail) search queries to find what they are looking for. The question I pose is: are users searching with more specific search phrases because they are more savvy about the way the web works, or is it because the search results are so poor for broad-based queries that in order to find something relevant you are forced to use detailed and lengthy search queries?
Too much information! Normally a teenager's response to a friend giving one personal detail too many, it can also describe the explosion of web sites, blogs, aggregators and web properties in general. Since March 2005, the number of people using the Internet has doubled to almost 2 billion. The figures around the number of actual sites on the Internet are a little hazy, but according to Google the number of pages it had explored was 26 million in 1998, one billion in 2000 and ONE TRILLION in 2008, and the number of individual web pages out there is growing by several billion pages per day.
Now I know Google has some of the best infrastructure, technology and talent at its disposal, but if we take the number of pages as products that Google has to provide quality control for, then that's a lot of control – even for the world's biggest brand. Not only is there so much more information for Google to sift through, but the information is becoming more complex to evaluate. SEO is becoming more prominent, with a lot of people who own web properties having been exposed to some kind of search engine optimisation education and using that knowledge to make their sites more 'appealing' to search engines. Popular blogging platforms like WordPress also have SEO functionality built in, straight out of the box, so again, more SEO-friendly pages are added to the web index. Aggregators, with their boilerplate templates, are also becoming more popular in the SERPs, further adding to the congestion.
Social media also comes into this. With Google and Bing starting to index Facebook and Twitter status updates, the amount of information is expected to increase exponentially. However, I think the search engines have the capacity to handle this, or they wouldn't try.
The Internet is expanding exponentially, and pointless pages are taking advantage of Google's ranking system, rendering it unwieldy. Google and the other search engines need to counter this and find a way to once again become dynamic and streamlined, and eliminate all the guff and filler sites that take up a large portion of their index (and my time, incidentally).
Diminishing core focus on search
When Google first started, all they cared about was search. They put all their efforts and resources into making search better. Now it seems they have moved into fields pretty far flung from search, like renewable energy, cloud computing and mobile phones. Now I realise a simple answer is that they have invested in massive infrastructure and staff numbers to compensate for this diversification, but somewhere along the way, with so many different things going on, their focus on search has been diluted. Whether it's right at the top with Eric, Larry and Sergey or down the line through the many thousands of Google employees, it's happening.
I believe this dilution of absolute focus on search has resulted in some deterioration (or at least not as much improvement) of the search results. It's time to put the focus back on search, or the current batch of search engines may go the way of HotBot and AltaVista.
Maybe search results have actually gotten better and we are just expecting too much; however, I don't think this is the case. Many people within the search industry have noticed this and questioned the bigwigs about it. I think there are a lot of outside factors that might be negatively affecting the SERPs, but it's time the search engines took cognisance of these factors and tried to counter them instead of piling feature upon feature (real-time search, more blended results, the Caffeine update etc.) onto pretty average search results that are already crying out for better quality control and more relevant sites within those SERPs.