Borek wrote:
> On Fri, 08 Jul 2005 05:14:41 +0200, Roy Schestowitz
> <newsgroups@schestowitz.com> wrote:
>
>> On a more general note, I know how valuable these sitemaps are to Google.
>> They give them more available resources. Yet, many people ask "how does
>
> So far Google just fetches my sitemaps 4 times a day. One site is PR3
> and 5 months old, the second is PR2, several years old and redesigned in
> June. No signs of a crawl on either (and there are pages not yet spidered
> on both sites).
>
>> it help /me/"? Unless Google rewards people for doing it (e.g. ranks,
>> crawling frequency), what is the point?
>
> Good question :(
>
> But it has been only four days since I submitted the sitemaps,
> so perhaps the crawl will start in a few days.
Thanks for keeping us aware of how it affects your logs. I am curious but
sceptical about Google Sitemaps. I know how laborious it can be to find
and centralise these log entries in order to isolate Googlebot. I wish
there were some reference table from Google, one which perhaps predicts
the amount of crawling based on PR, links and other factors.
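For what it's worth, here is a minimal sketch of what I mean by isolating
Googlebot. It assumes an Apache-style access log; the file name
"access_log" and the idea of simply matching the User-Agent string are my
assumptions, nothing official:

  # Count the requests Googlebot made, assuming each log line
  # includes the User-Agent string.
  googlebot_hits = 0
  with open("access_log") as log:
      for line in log:
          if "Googlebot" in line:  # Googlebot names itself in the UA field
              googlebot_hits += 1
  print(googlebot_hits, "requests from Googlebot")

Nothing clever, but centralising this across several sites and months is
where the labour goes.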
As a side note, by over-informing Google about what happens on your site/s
(which sitemaps do), wouldn't you expect _less_ crawling? Less crawling
means less traffic and hence potentially a lower bill, but is it
necessarily a good thing? Will as many links on your site be picked up
when Google does only a single pass over each page, for example? Robots
fetched 60,000 pages from my site last month, yet I only have about 20,000
pages altogether (a lot of documentation).
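To make the "over-informing" point concrete, a sitemap hands Google, for
every page, a URL plus hints about when it last changed and how often it
changes. A minimal sketch in Python follows; the URLs, dates and the 0.84
schema namespace are my assumptions from memory of Google's documentation,
not copied from it:

  # Emit a tiny sitemap: each <url> entry tells Google where a page
  # lives, when it last changed and how often it tends to change.
  pages = [
      ("http://example.com/", "2005-07-01", "weekly"),
      ("http://example.com/docs/intro.html", "2005-06-15", "monthly"),
  ]
  print('<?xml version="1.0" encoding="UTF-8"?>')
  print('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">')
  for loc, lastmod, changefreq in pages:
      print('  <url>')
      print('    <loc>%s</loc>' % loc)
      print('    <lastmod>%s</lastmod>' % lastmod)
      print('    <changefreq>%s</changefreq>' % changefreq)
      print('  </url>')
  print('</urlset>')

With all of that declared up front, Google arguably has less reason to
re-crawl pages speculatively, which is exactly why I wonder about the
effect on crawl volume.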
Roy
--
Roy S. Schestowitz
http://Schestowitz.com