SEO Dave wrote:
> On Wed, 23 Feb 2005 12:13:30 GMT, Big Bill <firstname.lastname@example.org> wrote:
>>>> The usual way to get round this is to have a series of site maps on
>>>> the second level.
>>>Interesting... why not just have a single gigantic site map?
>>'cause Google Says So! along with saying it doesn't generally go more
>>than 100 links deep into a page.
> That's rubbish, Google doesn't say that at all.
> "Offer a site map to your users with links that point to the important
> parts of your site. If the site map is larger than 100 or so links,
> you may want to break the site map into separate pages."
> "Keep the links on a given page to a reasonable number (fewer than
> 100)."
> These are guidelines, not hard rules. Unless you want to show where
> Google says "it doesn't generally go more than 100 links deep into a
> page"?
> The really important question is: do you understand why Google
> recommends webmasters keep links per page (all pages, not just site
> maps) below 100?
> If you understood this you'd then understand why it's fine to have a
> page with 500+ links from it, as long as you get other things in
> place.
> You'd then know why Google says 100 links, and not "100 links unless
> you have X, Y and Z in place, in which case you can have many more
> links" - which would be more accurate, but would confuse a lot of
> webmasters.
> This page http://www.classic-literature.co.uk/classic-literature.asp
> has 900+ links from it. Google follows links from this page at random
> (like it does with all pages), including links well below the first
> 100 on the page; it's probably followed them all by now.
Thanks for that. I imagined this must have been the case.
I have spotted pages with over 100 links that still get crawled in full.
If that were not the case, I'd be worried: I have not reached the point
where I'd sacrifice visitors' usability to satisfy crawlers. And if it
ever came to that (assuming crawlers are meant to reflect visitors'
needs), the crawlers themselves would have to be called faulty.
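For what it's worth, the guideline quoted above - break a site map larger than 100 or so links into separate pages - is easy to mechanize. Here's a minimal sketch; the URLs, the `paginate_sitemap` name, and the 100-per-page limit are just illustrative, not anything Google prescribes:

```python
# Minimal sketch: split a flat list of site-map links into pages of at
# most `per_page` links each, following the "fewer than 100 links per
# page" guideline quoted in the thread. All names here are made up.

def paginate_sitemap(links, per_page=100):
    """Return a list of pages, each holding at most per_page links."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

# Example: 950 hypothetical links, roughly the scale of the
# classic-literature page mentioned above.
links = [f"http://example.com/page{n}.html" for n in range(1, 951)]
pages = paginate_sitemap(links)

print(len(pages))      # 10 site-map pages
print(len(pages[0]))   # 100 links on each full page
print(len(pages[-1]))  # 50 links on the final page
```

Each sublist would then be rendered as its own second-level site-map page, which is the "series of site maps on the second level" arrangement mentioned at the top of the thread.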