__/ [ John Bokma ] on Friday 05 May 2006 05:29 \__
> Big Bill <kruse@xxxxxxxxxxxxxxx> wrote:
>
>> http://www.sitepoint.com/article/indexing-limits-where-bots-stop
>
> Thanks. Did this question pop up here recently? I knew it was well over
> 300K (Google), which is more than sufficient IMO.
That's interesting indeed. Thanks, Bill. I had always imagined it would be
somewhere around 100 KB since, beyond that point, the user can be
dissatisfied with the referrals. There remain many CMSes that fail to
apply paging to comments (it's being worked on in WordPress, as a plug-in,
I believe). Have you ever opened a popular Digg or Slashdot page? Even
with comment folding, these pages become monsters that are hard to open
within a reasonable amount of time. They also devour memory. Thresholds,
if comment moderation exists, do not help much either; not in their
default setup, anyway.
Other things to ponder: how many links on a page will be honoured and
therefore followed? How are long pages valued? How does that affect
keyword density? E.g. is it normalised by the total number of words?...
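If density really is normalised by total word count, the naive per-page
measure would look something like the sketch below. This is only an
illustration of that assumption; the tokeniser and the formula are my own
guesses, not anything a search engine has documented:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the page's words that match `keyword`, case-insensitively.

    Assumes density = occurrences / total word count, which is just one
    plausible normalisation; real engines likely weight position, markup, etc.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# On a long page, the same number of keyword occurrences yields a much
# lower density, which is why page length would matter under this model.
print(keyword_density("linux news and more linux tips", "linux"))
```

Under this model a 300 KB comment page would dilute any target keyword
almost to nothing, which is one reason paging comments might help.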
Roy
--
Roy S. Schestowitz | "World ends in five minutes - please log out"
http://Schestowitz.com | GNU/Linux ¦ PGP-Key: 0x74572E8E
7:10am up 7 days 14:07, 13 users, load average: 0.59, 0.67, 0.64
http://iuron.com - next generation of search paradigms