Bulk Submit 2.0
Reminiscing a bit, Danny Sullivan remembers that back in the good old days of search engine optimization, bulk submit was all the rage. Webmasters could email InfoSeek a list of all their URLs and they’d be put in the index almost immediately. Unsurprisingly, these bulk submits disappeared over time, but now, years later, we’re starting to see new automated submit and spidering tools emerge. And that’s what this panel is set to focus on.
First up to tackle the issue, Google engineer Amanda Camp:
To start things off, Amanda mentioned all the ways you can add content that she WOULDN’T be talking about, including Google Base, Google Book Search, the Local Business Center (which gets your business into Google Local), Google Video, Blogger, Google Page Creator and Picasa Web Albums. I’m glad she did that, because these are all often-overlooked ways to get your content to appear in Google. Sometimes going in through the side door is more effective than ringing the doorbell or breaking in through a window.
Not surprisingly, Amanda is here to talk about Google Sitemaps. With Sitemaps, webmasters create a list of the Web pages they want Google to know about. This improves comprehensiveness, notifies Google of new or changed pages to help freshness, and identifies unchanged pages to prevent unnecessary crawling and improve efficiency.
Google Sitemaps accepts four different file submission types: text files, RSS/Atom feeds, the Sitemap protocol and OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting) – which you’ll probably never use, so pretend we didn’t even mention that one.
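For reference, a Sitemap-protocol file is just a small XML document. Here’s a minimal sketch (the domain, dates and priority values are made up – only the `urlset` namespace and the `loc` element are required):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-12-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```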
Amanda recommends that your site map be placed in the highest directory of the URLs you are submitting. Also, the domain of the URLs needs to match the location of the site map exactly (i.e. http:// vs. https://, www. vs. non-www, blog.google.com vs. google.com).
Tips for using Google Sitemaps:
- Always include the full path to the URL
- Remove any unnecessary parameters (session IDs)
- You may name the file anything you want (we recommend using the correct extension)
- URLs must use UTF-8 encoding
- All URLs must be entity-escaped (e.g. write & as &amp;)
- Sitemaps should be a max of 50,000 URLs or 10MB
- Sitemap index files may list a max of 1,000 Sitemaps
- Use Gzip to compress your Sitemaps
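Pulling those tips together, here’s a minimal sketch of generating a compliant, gzipped Sitemap (the domain and URLs are made up for illustration):

```python
import gzip
from xml.sax.saxutils import escape

# Hypothetical URLs -- always the full path, no session IDs.
urls = [
    "http://www.example.com/",
    "http://www.example.com/products?id=1&color=red",
]

# Entity-escape each URL (& becomes &amp;, etc.) per the protocol.
entries = "".join(
    "  <url><loc>%s</loc></url>\n" % escape(url)
    for url in urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    "%s</urlset>\n" % entries
)

# Write it out UTF-8 encoded and gzip-compressed, as recommended
# for keeping large files under the 10MB ceiling.
with gzip.open("sitemap.xml.gz", "wb") as f:
    f.write(sitemap.encode("utf-8"))
```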
Once you’ve created your Sitemap, go to Google’s Webmaster Tools and tell them the URL where the file is located; Google will then go fetch it.
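If you’d rather not click through the interface every time the file changes, Google has also publicized a ping URL for resubmission. Here’s a sketch of building it – the sitemap location is hypothetical, and you should verify the endpoint against Google’s current documentation before relying on it:

```python
from urllib.parse import quote

# Hypothetical sitemap location; it must be URL-encoded when
# passed as a query parameter.
sitemap_url = "http://www.example.com/sitemap.xml"

# Ping endpoint Google publicized for Sitemaps (an assumption here --
# fetching this URL tells Google to re-fetch your file).
ping = "http://www.google.com/webmasters/sitemaps/ping?sitemap=" + quote(
    sitemap_url, safe=""
)
```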
Amanda also went over the new Sitemaps.org site, which we blogged about at launch, so I won’t repeat that here.
Next up, Yahoo Engineering Manager Amit Kumar:
Amit is here to give attendees all the goods regarding Yahoo Site Explorer, which he calls a window to webmasters. Aw. Through the interface, webmasters can authenticate their site, browse their pages and in-links, and perform bulk submission (sitemaps, RSS, OPML).
He stresses the importance of authenticating your site. To do this, the webmaster must download an authentication key and upload it to their server. Once the file is uploaded, notify Yahoo! and your site should be authenticated within 24 hours.
Once complete, it tells Yahoo! that you own your site and they will be more willing to share information with you, like your subdomain information, your language (or at least, what language Yahoo! thinks your site is in), and your last crawl date. Additionally, you’ll also be able to export your first 1000 results to TSV.
Amit says the best is yet to come! Yahoo! is working on a lot of new features and product services so keep your eyes on Yahoo!’s Search Blog for updates.
Performics’ Eric Papczun is up next and says submitting a site map is important in order to get a complete and accurate list of your URLs. This is especially useful for small sites. Be careful not to add noise to the crawl by supplying multiple URLs for the same page.
Once submitted, Eric notes that site maps are usually picked up within 1-2 days, with the entire site map being crawled in 3-14 days (7 days is the average).
Eric offers up several site map management tips:
• Have an optimized native sitemap: link to it in your global footer
• Focus the crawler on the right content by excluding redundant content, disembodied content or spammy stuff
• Use the preferred domain tool to tell Google whether you want www.domain.com or domain.com to appear on the SERP
• Include separate sitemaps for news and mobile content
Google Sitemaps is just a tool; use it to help you accomplish your objectives.
Last up is my new SEO crush (don’t worry, Rand, you’re still my number one), Range Online Media’s Todd Friesen.
Todd’s presentation focused on using feed listings and comparison shopping engines (Google Base, Yahoo! Shopping, Microsoft Shopping) as another way to get your listings to show up on the SERP. He notes that RSS and XML aren’t just for blogs anymore – they are the preferred format of CSEs and can help sites get indexed via Paid Inclusion. If you submit it, they will come.
The benefits of using shopping feeds are that you can get product listings to appear in the "natural results" within 48 hours, provide relevant and targeted copy, quickly update listings to reflect sales or promotions, and provide detailed tracking and reporting for testing.
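To make that concrete, a Google Base product feed is essentially an RSS 2.0 file with Google’s extension namespace bolted on. A sketch with made-up product data might look like this (check the exact required `g:` attributes against Google Base’s own feed documentation):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
  <channel>
    <title>Example Store</title>
    <link>http://www.example.com/</link>
    <description>Sample product feed</description>
    <item>
      <title>Blue Widget</title>
      <link>http://www.example.com/widgets/blue</link>
      <description>A sturdy blue widget, on sale this week.</description>
      <g:price>19.99 USD</g:price>
      <g:condition>new</g:condition>
    </item>
  </channel>
</rss>
```

Because the feed is just a file you regenerate, updating a price or running a promotion is a re-submit away – which is exactly the quick-turnaround benefit Todd describes.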
Todd mentions that though paid inclusion is often seen as a competitor to SEO, it can be used to achieve immediate results or to bulk up areas where site content is limited. Webmasters will find they can get results in days, not months; it doesn’t tie up clients’ IT resources; and it makes A/B testing in natural results possible.