
Re: Sitemaps Work

Roy Schestowitz wrote:

<snip>dynamic/static sites, google sitemap, effects on SEO</snip>

It actually raises an interesting point. Given certain URL structures,
search engines can predict not only whether pages are generated
on-the-fly, but also how they are generated. For example, it should not
be hard to spot a blog and tell it apart from a commercial CMS, a free
(Open Source) CMS and a DIY CMS. That can give the search engine some
indication of the reliability of the information within.


You could write blog software that has exactly the same URL structure as a
commercial CMS. As for reliability, would Enron's commercial CMS contain
information that is more or less reliable than the open-source Wikipedia?
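
To make the point concrete: a crude sketch of such a URL heuristic, assuming
Python and a handful of invented patterns, might look like this (and it also
shows how trivially the signal could be imitated by other software):

import re

# Invented, oversimplified patterns; a real search engine would use far
# more signals than the shape of the URL alone.
URL_HINTS = [
    (re.compile(r"/\d{4}/\d{2}/[\w-]+/?$"),  "blog (date-based permalink)"),
    (re.compile(r"[?&]p=\d+$"),              "blog (WordPress-style query string)"),
    (re.compile(r"index\.php\?option=com_"), "free CMS (Joomla-style component URL)"),
    (re.compile(r"\.html?$"),                "static or pre-generated page"),
]

def guess_generator(url):
    """Guess how a page is generated purely from the shape of its URL."""
    for pattern, label in URL_HINTS:
        if pattern.search(url):
            return label
    return "unknown (could be a DIY or commercial CMS)"

print(guess_generator("http://example.com/2005/09/sitemaps-work/"))
print(guess_generator("http://example.com/index.php?option=com_content&id=42"))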


[Davémon] added on Wednesday 07 September 2005 09:13

Maybe not to Google, but if it's not dynamic, linking 30,000 pages is
enough Ctrl-C/Ctrl-V to give anyone an RSI.

You should not forget about tools that link pages off-line rather than
on-line (a.k.a. on-the-fly, or in 'real time').


You're right. I forget those all the time, mostly because I don't use them!
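
For anyone curious, a minimal sketch of the sort of off-line tool meant here,
assuming Python, a local directory of static .html pages and an invented base
URL, could be as simple as this:

import os

BASE_URL = "http://www.example.com"   # invented base URL for illustration
SITE_ROOT = "public_html"             # invented local directory of static pages

def build_sitemap(root, base_url):
    """Walk a local tree of static pages and write out a sitemap in XML."""
    entries = ['<?xml version="1.0" encoding="UTF-8"?>',
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            if not name.endswith((".html", ".htm")):
                continue
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            entries.append("  <url><loc>%s/%s</loc></url>"
                           % (base_url, rel.replace(os.sep, "/")))
    entries.append("</urlset>")
    return "\n".join(entries)

if __name__ == "__main__":
    with open("sitemap.xml", "w") as out:
        out.write(build_sitemap(SITE_ROOT, BASE_URL))

Run once whenever the static pages change, and there is no Ctrl-C/Ctrl-V
involved at all.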


--

Davémon
http://www.nightsoil.co.uk
