
Monday, June 27th, 2005, 5:00 pm

Googlebot versus MSNBot

Bill Gates arrested in his younger days (photo in public domain)

An item I have just come across (hat tip: Justin Moore) discusses excluding MSNBot on the grounds of poor return: MSNBot (and Yahoo!'s crawler likewise) consumes a great deal of bandwidth, yet its search engine is barely used by the Internet audience and therefore sends back very few visitors.

From Justin:

  • Exhibit A – MSNBot
    Crawl hits = 9561
    Bandwidth used = 124.43 MB
    Visits to wantmoore.com resulting from searches @ Google = 683 [sic*]
  • Exhibit B – Googlebot
    Crawl hits = 3415
    Bandwidth used = 51.74 MB
    Visits to wantmoore.com resulting from searches @ MSN = 41 [sic*]

So, effective immediately:

  User-agent: MSNBot
  Disallow: /

* The two "visits" lines are swapped and should be reversed: the 41 visits from MSN searches belong with Exhibit A (MSNBot) and the 683 visits from Google searches with Exhibit B (Googlebot).
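
For reference, the effect of that two-line rule can be verified programmatically. Here is a minimal sketch using Python's standard urllib.robotparser module; the URL below is only illustrative:

  # Minimal sketch: what the quoted robots.txt rule actually does.
  from urllib.robotparser import RobotFileParser

  rules = [
      "User-agent: MSNBot",
      "Disallow: /",
  ]

  rp = RobotFileParser()
  rp.parse(rules)

  # MSNBot is told to stay away from the entire site ...
  print(rp.can_fetch("MSNBot", "http://example.com/archives/2005/06/"))     # False
  # ... while crawlers not named in the file remain unrestricted.
  print(rp.can_fetch("Googlebot", "http://example.com/archives/2005/06/"))  # True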

I spotted similar statistics on my domain, with Google as the top referrer at almost 20,000 visits/month. MSN remains at a miserable 200, yet its crawler consumes almost as much bandwidth as Googlebot. I have plenty of bandwidth to spare, so I might as well let Microsoft spend its crawling time in vain.
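
Figures like these come straight from the web server's access logs. A minimal sketch of the tallying involved (assuming Apache's combined log format; the log path and user-agent substrings below are illustrative only):

  # Minimal sketch: per-crawler hit and bandwidth totals from an Apache
  # "combined" format access log.  The path and crawler substrings are
  # illustrative, not taken from a real setup.
  import re
  from collections import defaultdict

  # ... "GET /path HTTP/1.1" status bytes "referer" "user-agent"
  LINE = re.compile(r'"[A-Z]+ \S+ \S+" \d{3} (\d+|-) "[^"]*" "([^"]*)"')
  CRAWLERS = {"Googlebot": "googlebot", "MSNBot": "msnbot"}

  hits = defaultdict(int)
  bytes_sent = defaultdict(int)

  with open("/var/log/apache2/access.log") as log:
      for entry in log:
          m = LINE.search(entry)
          if not m:
              continue
          size, agent = m.groups()
          agent = agent.lower()
          for name, needle in CRAWLERS.items():
              if needle in agent:
                  hits[name] += 1
                  if size != "-":
                      bytes_sent[name] += int(size)

  for name in CRAWLERS:
      print(f"{name}: {hits[name]} crawl hits, {bytes_sent[name] / 2**20:.2f} MB")

Run over a month of logs, the two totals map directly onto the crawl-hit and bandwidth figures quoted above.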

3 Responses to “Googlebot versus MSNBot”

  1. Webmaster Says:

    Wait, wait, wait, you’re blocking MSNBot from using a whole 100 MB of bandwidth because it only sends you 41 hits/month? Who cares? It’s only 100 MB of bandwidth; just let it send you that wee bit of traffic. Or, of course, you could SEO for MSN.

  2. SEO Files Says:

    Googlebot versus MSNBot
    [Source: schestowitz.com] quoted: Wait, wait, wait, you’re blocking MSN bot from using a whole 100mb of bandwidth, because it only sends you 41 hits / month? Who cares, I mean it’s 100mb of bandwidth, just let it send you that wee bit of tr…

  3. Roy Schestowitz Says:


    Wait, wait, wait, you’re blocking MSNBot from using a whole 100 MB of bandwidth…

    Webmaster (Floobo),

    No, no, no, it was Justin Moore who chose to do that (it was quoted in a blockquote, in fact). I very much disagree with putting robots at a disadvantage. I don’t discriminate against crawlers even if their potential is low.

    Denying the competition access is the same evil that gave us proprietary protocols, from which we all suffer.
