Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> wrote:
> __/ [ John Bokma ] on Thursday 08 June 2006 05:23 \__
[..]
>> If they check for that, yup. Some sites check for the crawlers, based
>> on IP or name.
>
> In worse scenarios, if you have no browser extensions, wget can be
> used to fetch the page in question. There's the "--user-agent" option.
In worse scenarios that doesn't work either, unless you work at Google:
if the site verifies the crawler's IP address (or its reverse DNS name),
a faked user agent alone won't get you past the check.
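For reference, Roy's trick looks roughly like the sketch below. The
user-agent string and URL are placeholders, not anything a site is known
to accept:

```shell
# Pretend to be Googlebot when fetching a page with wget.
# NOTE: the UA string and URL are illustrative only; a site that checks
# the requesting IP (not just the header) will still reject this.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
wget --user-agent="$UA" -O page.html "http://example.com/" \
    || echo "fetch failed (no network, or the site rejected us)"
```

The `-O page.html` just names the output file; drop it and wget picks a
name from the URL.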
[ website structures ]
> *smile* I can remember the time when I ceased to maintain the sitemap
> and lost that visual, conceptual idea of how my site was constructed.
> It is now somewhat of a messy Web, which I sometimes try to
> restructure. Same situation with E-mail accounts, Web hosts, and
> domain names.
I think a messy web structure is the best fit: websites rarely have a
perfect tree structure.
--
John, freelance Perl programmer: http://castleamber.com/
Creating a customized Command Prompt shortcut:
http://johnbokma.com/windows/command-prompt-shortcut.html