If you use Google Sitemaps for your Blogger blog and are seeing a sudden increase in the number of links blocked by robots.txt, don't panic; it isn't anything you did wrong. Blogger has recently started adding a robots.txt file to every blog by default. As you can see from the code below, all pages under the /search directory are disallowed, which means they will no longer appear in the search results of major search engines (e.g. Google, Yahoo and Live.com).
User-agent: *
Disallow: /search
Sitemap: http://gspy.blogspot.com/feeds/posts/default?orderby=updated
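If you want to verify the effect yourself, here is a minimal sketch using Python's standard urllib.robotparser module (the blog address and sample paths below are placeholders; substitute your own):

import urllib.robotparser

# Load the robots.txt that Blogger now serves for the blog (placeholder address).
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://yourblog.blogspot.com/robots.txt")
rp.read()

# Search and label pages live under /search, so they should be reported as blocked,
# while ordinary post pages remain crawlable.
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/news"))       # expected: False
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2007/01/some-post.html"))  # expected: True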
This is actually good news for Blogger users, because Google was treating most of these now-blocked pages as duplicate content and listing them as 'Supplemental Results'. Furthermore, the more duplicate content your site has, the less weight Google gives your site's content.
A further improvement to this feature would be to let Blogger users customize their own robots.txt, so they could keep other unwanted content out of search results as well.
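For instance, if such customization were allowed, a robots.txt that also keeps Blogger's monthly archive pages out of the index might look something like this (the archive rule is purely hypothetical, and the wildcard pattern is a Google-specific extension):

User-agent: *
Disallow: /search
Disallow: /*_archive.html
Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=updated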
Note: If you are using Blogger and have recently redirected your feed to FeedBurner, make sure you change your sitemap URL in Google Sitemaps to
http://yourblog.blogspot.com/rss.xml?orderby=updated instead of just
http://yourblog.blogspot.com/rss.xml, otherwise Google Sitemaps will report an error. The extra ?orderby=updated parameter makes Blogger serve the feed directly rather than redirecting the request to FeedBurner, so Google can still read it.
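If you are not sure whether your feed is being redirected, a quick sketch like the one below (again using only Python's standard library, with a placeholder blog address) can check where the sitemap URL actually ends up:

import urllib.request

# Placeholder sitemap URL; replace with your own blog's feed address.
SITEMAP = "http://yourblog.blogspot.com/rss.xml?orderby=updated"

with urllib.request.urlopen(SITEMAP) as resp:
    final_url = resp.geturl()  # urlopen follows redirects, so this is the final address
    if "feedburner" in final_url.lower():
        print("The feed redirects to FeedBurner - Google Sitemaps will likely report an error.")
    else:
        print("The feed is served directly by Blogger and can be submitted as a sitemap.")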