I didn’t think submitting sitemaps to search engines could get any easier, but it appears the search engines have agreed on a sitemaps standard: all you have to do is put one line in your robots.txt file, and every search engine that supports it — Yahoo, Ask, Google, MSN, etc. — will know exactly where your sitemap is.
Second, it’s now easier for you to tell us where your Sitemaps live. We wondered if we could make it so easy that you wouldn’t even have to tell us and every other search engine that supports Sitemaps. But how? Well, every website can have a robots.txt file in a standard location, so we decided to let you tell us about your Sitemap in the robots.txt file. All you have to do is add a Sitemap line to your robots.txt file. Just make sure you include the full URL, including the http://. That’s it. Of course, we still think it’s useful to submit your Sitemap through Webmaster Tools so you can make sure that the Sitemap was processed without any issues, and so you can get additional statistics about your site.
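For reference, the sitemaps.org protocol defines this as a standalone `Sitemap:` field followed by the absolute URL of the Sitemap file; the host and filename below are placeholders, not a real site:

```
Sitemap: http://www.example.com/sitemap.xml
```

The directive is independent of any `User-agent:` section, so it can appear anywhere in the file.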
Last but not least, Ask.com is now also supporting the Sitemap protocol. And with the ability to discover your Sitemaps from your robots.txt file, Ask.com and any other search engine that supports this change to robots.txt will be able to find your Sitemap file. Source: What’s new with Sitemaps.org?
Good stuff. Bruce Clay covered the robots.txt summit:
Interestingly, Keith notes that fewer than 35 percent of servers have a robots.txt file. See, this explains why so much off-limits content is getting spidered and appearing in the index. It’s not the engines’ fault; it’s the site owners who didn’t read their robots.txt manual.
Some more fun facts from Keith: the majority of robots.txt files are copied from others found online or are provided by a hosting site, a clear sign that site owners don’t know how to use them. The files vary in size from 1 character to well over 256,000 characters, though the average robots.txt file is just 23 characters. Source: Robots.txt Summit
LOTS more info on his site. As usual, Danny Sullivan has the whole thing covered in one blog post.
Last November, Google, Microsoft and Yahoo united to support Sitemaps, a standardized method of submitting web pages to the search engines through feeds. Today the three are joined by Ask.com in supporting the system and an extension of it called autodiscovery, whereby the major search engines automatically locate your Sitemap file if its location is listed in your robots.txt file. Announcements are up from Google and Ask, and now from Yahoo and Microsoft. Source: Search Engines Unite On Sitemaps Autodiscovery
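Autodiscovery boils down to one simple step on the crawler's side: scan the robots.txt body for `Sitemap:` lines. A minimal sketch, assuming the robots.txt has already been fetched; the function name and sample content here are illustrative, not any engine's actual implementation:

```python
# Sketch of Sitemap autodiscovery: scan a robots.txt body for
# "Sitemap:" directives. Field names in robots.txt are matched
# case-insensitively.

def find_sitemaps(robots_txt: str) -> list[str]:
    """Return every URL declared on a 'Sitemap:' line."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # Each directive is a field name, a colon, then a value;
        # partition() splits at the first colon only, so the URL's
        # own "http://" colon is left intact.
        field, _, value = line.partition(":")
        if field.strip().lower() == "sitemap" and value.strip():
            sitemaps.append(value.strip())
    return sitemaps

robots = """User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
"""
print(find_sitemaps(robots))  # ['http://www.example.com/sitemap.xml']
```

In modern Python the standard library exposes the same information directly: `urllib.robotparser.RobotFileParser.site_maps()` (added in Python 3.8) returns the `Sitemap:` URLs from a parsed robots.txt.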