Update: Microsoft sent me a response to this post, which I felt would be great to add.

For webmasters, it is problematic to use the “site:” operator to determine how many of a site’s pages are included in the Live Search index. The “site:” operator generates only an estimate of the pages in the index, and these numbers can vary wildly depending on when you execute the query.

You posed the question of whether users should block MSNbot because traffic from the bot is not worth the stress on your servers. Obviously, we would prefer that customers not block MSNbot. Instead, customers who are concerned about the load from Live Search crawls should add the crawl-delay parameter to their robots.txt file. This can reduce the load on your servers while keeping your site in the Live Search results.
Webmasters can refer to the MSNBot support page for more information on crawl-delay.
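For reference, the crawl-delay directive mentioned above is set in a site’s robots.txt file. A minimal sketch might look like this (the 10-second delay is just an example value; check the MSNBot support page for recommended settings):

```
# Ask MSNbot to wait between successive requests
# instead of blocking it outright
User-agent: msnbot
Crawl-delay: 10
```

Other crawlers that honor the directive can be given their own User-agent sections with different delay values.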