FEATURE: Announcing New SEO Data Built Directly into the CMS 

by Jessica Lee, October 14, 2011

In last month’s SEO Newsletter, we discussed SEO-friendly Web development and why the content management system (CMS) is so crucial to an efficient search engine optimization campaign. This month, we’re continuing the CMS discussion by announcing new functionality in the works with Bruce Clay, Inc.’s (BCI) SEO-friendly CMS partner, Pixelsilk, that will give site owners more opportunity to control their SEO strategy right within their CMS.

“Search Advice,” a feature launched in 2010, provides real-time SEO guidelines from BCI inside Pixelsilk’s CMS as users optimize Web pages. The next iteration of Search Advice, set to roll out in Q1 of 2012, adds features that help protect a site’s rankings from duplicate content issues and aim to improve the quality and value of content for the user.

Search Advice within the CMS Makes Optimization Efficient

For several years, Pixelsilk and Bruce Clay, Inc. have worked together to provide a strong foundation for sites to implement their Web marketing plans. The initial launch of Search Advice integrated components of BCI’s SEOToolSet® right into Pixelsilk’s CMS, so clients could track and employ SEO tactics within the CMS.

Search Advice serves up tactical plans of attack for on-page optimization to users of the Pixelsilk CMS. The original version passes a Web page’s assigned keywords through BCI’s SEOToolSet for analysis.

The analysis returns recommendations for how the Meta information and the content should be structured, based on a competitive analysis of the top-ranking sites for a keyword combined with proven methodologies.

This includes things like how many words of content should be on the page, keyword usage (where keywords should appear) and keyword density (how often keywords should appear relative to the total word count) in both the Meta information and the page’s content.
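
For readers who want to see what a density calculation looks like, here is a minimal sketch in Python. It assumes a simple word-based definition of density (keyword occurrences as a share of total words); the actual calculation inside BCI’s SEOToolSet may differ.

```python
import re

def keyword_density(text, keyword):
    """Keyword occurrences as a percentage of total words.
    Simple illustration only; the SEOToolSet's own calculation may differ."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    total = len(words)
    if total == 0:
        return 0.0
    # Slide a window over the words and count exact matches of the keyword phrase.
    hits = sum(1 for i in range(total - len(phrase) + 1)
               if words[i:i + len(phrase)] == phrase)
    return 100.0 * hits * len(phrase) / total

page_copy = "Search Advice brings SEO data into the CMS, so SEO work happens in the CMS."
print(round(keyword_density(page_copy, "CMS"), 1))  # 2 of 15 words = about 13.3
```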

[Screenshot: Bruce Clay, Inc. / Pixelsilk SEO CMS tool]

This feature is built on what BCI refers to as the “least imperfect” approach. Google’s algorithm takes more than 200 factors into account for ranking; no one site can incorporate all of them at once, hence the concept of imperfection. However, the top-ranked sites usually share similar patterns in on-page factors such as content length and Meta data, to name a few. Mimicking those sites and then essentially “beating” them by one or more factors allows a site to be the least imperfect.
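
As a rough illustration of that comparison, the sketch below checks a page’s on-page factors against the averages of the top-ranked pages and reports where the page falls short. The factor names and values here are hypothetical; the SEOToolSet’s real factor set and scoring are far more extensive.

```python
def least_imperfect_gaps(page_factors, top_ranked_factors):
    """Compare a page's on-page factors against the averages of the top-ranked
    pages and list the factors where the page falls short. All factor names and
    values are hypothetical, for illustration only."""
    averages = {
        name: sum(p[name] for p in top_ranked_factors) / len(top_ranked_factors)
        for name in page_factors
    }
    return {name: round(averages[name] - page_factors[name], 1)
            for name in page_factors
            if page_factors[name] < averages[name]}

my_page = {"word_count": 320, "keyword_in_title": 1, "keyword_density_pct": 1.0}
top_pages = [
    {"word_count": 450, "keyword_in_title": 1, "keyword_density_pct": 2.0},
    {"word_count": 500, "keyword_in_title": 1, "keyword_density_pct": 1.8},
]
print(least_imperfect_gaps(my_page, top_pages))
# {'word_count': 155.0, 'keyword_density_pct': 0.9} -- the gaps to close
```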

The Search Advice functionality surfaces this data to the user in the Pixelsilk CMS in real time, as he or she optimizes the page, making optimization efficient. The user can easily see whether the Meta data and content are on track with the SEO recommendations, and how far off or how close they are to hitting the mark.

Updated Search Advice Works to Protect Rankings and Improve Content

The quality of a website is something that’s on the minds of Web marketers everywhere. Google has spent the better part of 2011 cracking down on low-quality sites in its SERPs, most notably with the Panda algorithm update, as part of its ongoing mission to make results more relevant to users.

Whether your site was hit by Panda, boosted by it or not affected at all, it’s a perfect time to reassess the factors that are allowing the competition to rank in a post-Panda era. This includes some of those factors we talked about in the previous section of this article, plus additional factors that the new iteration of Search Advice offers.

Flesch-Kincaid Readability Score

When newspapers were the medium of choice, they were commonly written at about a fifth-grade level to appeal to readers of varying abilities and to keep things simply stated. Today, Web users arguably have even less of an attention span when reading content on a monitor or mobile device than in the days when books and print materials were the norm. That’s why writing for the Web has to be kept straightforward, simple and to the point.

The new Flesch-Kincaid feature set to launch in Search Advice helps users understand, based on a long-established formula, just how easy a page of Web content is to read. Much of this formula has to do with the length of sentences and the number of syllables in the words used.
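
The grade-level version of the formula is public: 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. The Python sketch below applies it with a rough vowel-group syllable heuristic; it is an illustration only, not necessarily how Search Advice will compute the score.

```python
import re

def fk_grade_level(text):
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59.
    Syllables are estimated by counting vowel groups, which is a rough heuristic."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

print(round(fk_grade_level("Write for the Web. Keep sentences short. Prefer plain words."), 1))
# Roughly a second-grade level with this heuristic; longer, denser copy scores higher.
```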

This new functionality will be built straight into the page, so writers and site editors have the ability to “grade” the content right then and there. Another function of this feature is to see how a page’s content fares in comparison with the top-ranking sites for that keyword – is the readability about the same? How can your site be more like the top-ranked sites?

However, at BCI we often talk about tools and wisdom coming together to make the right decisions in a Web marketing plan. The readability score is a perfect example of that: if all the top-ranked competitors are writing Web content at a 12th-grade-plus level, it does not mean the site owner should rewrite all the content to be more verbose.

What this feature does offer is one more window into the ranking factors, and it can even be used to apply deductive reasoning when trying to figure out what Google is looking for when it ranks a site.

Avoiding Duplicate Content Issues

Duplicate content has always been a no-no in SEO because a site can be filtered out of the SERPs if a search engine spider detects matching content on another site. All else equal, the site that looks the most relevant for that query will show up for that content, whether that site is the original author or not.

Until certain Google algorithm updates came into play this year, duplicate content wasn’t treated as a penalization issue by the search engines; rather, one site would simply win rankings over another based on whatever factors Google deemed relevant.

Now, Google’s algorithm specifically aims to banish scraper sites (sites that steal unique content to feature as their own) from the SERPs, and it is the webmaster’s responsibility to monitor how his or her site is holding up in the era of this new algorithm.

In a spider’s eyes, a page can be considered duplicate if it merely shares the same Meta information with two or more other pages, and of course if it shares the same body content. Search Advice plans to roll out two features that address this issue.

The first is a feature that checks that the Meta information is different enough site-wide, so pages will not be filtered out of the SERPs for duplicate content. The other is the integration of Copyscape right into Pixelsilk’s CMS.
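
For a sense of what a site-wide duplicate-Meta check might look like, here is a short sketch. The page structure (url, title, description) is hypothetical and is not the actual Pixelsilk or Search Advice implementation.

```python
from collections import defaultdict

def find_duplicate_meta(pages):
    """Group pages that share identical Meta titles and descriptions.
    `pages` is a list of dicts with 'url', 'title' and 'description' keys
    (a hypothetical structure, used here only for illustration)."""
    groups = defaultdict(list)
    for page in pages:
        key = (page["title"].strip().lower(), page["description"].strip().lower())
        groups[key].append(page["url"])
    # Any group with more than one URL is a duplicate-Meta candidate to rewrite.
    return [urls for urls in groups.values() if len(urls) > 1]

pages = [
    {"url": "/widgets", "title": "Widgets | Acme", "description": "Shop our widgets."},
    {"url": "/gadgets", "title": "Widgets | Acme", "description": "Shop our widgets."},
]
print(find_duplicate_meta(pages))  # [['/widgets', '/gadgets']]
```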

Copyscape is a Web content plagiarism tool that helps identify which sites have the same content as any given Web page. Site owners can use this feature to monitor already-published pages for content scrapers, as well as to check whether internally created content is unique enough across multiple Web properties or pages, so any given page still has the ability to rank.
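
Copyscape’s matching method is proprietary, but for a sense of how content overlap can be measured at all, here is one common technique: comparing the n-word “shingles” two pages share. This is an illustration only, not how Copyscape or Search Advice actually works.

```python
def shingle_overlap(text_a, text_b, n=5):
    """Jaccard overlap of n-word shingles: the share of word sequences two texts
    have in common. Illustrative only; Copyscape's matching method may differ."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(max(0, len(words) - n + 1))}
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "Search Advice gives real-time SEO guidance right inside the Pixelsilk CMS."
scraped = "Search Advice gives real-time SEO guidance right inside the CMS, says a copycat."
print(round(shingle_overlap(original, scraped), 2))  # 0.45 -- substantial copied passages
```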

Stay tuned for these new features from Bruce Clay, Inc. and Pixelsilk, set to launch early next year. Got questions on Search Advice or SEO-friendly content management systems? Call a sales executive at BCI at 805.517.1900 or toll-free at 866.517.1900. You can also contact Pixelsilk at 541.317.3583 at any time.