
Re: Spidering Sites (was: Konqueror off-line)

__/ [ Ron Gibson ] on Sunday 12 March 2006 08:20 \__

> On Sat, 11 Mar 2006 10:25:41 +0000, Roy Schestowitz wrote:
> 
>> So, for example, consider:
>  
>> $ wget -r -l2 -t1 -N -np -erobots=off
>> http://username:password@xxxxxxxxxxxxxxx:80
>> 
>> The above will limit the level of /depth/ explored in the site (-l2). It
>> also bypasses robots.txt rules for spiders (-erobots=off) and it
>> authenticates with the site, in case that is necessary.
> 
> I don't use wget a lot. Thanx for the example - Saved to tips.

It was extracted from my own pool of 'tips' -- the one I keep for rapid
copy-and-paste. It's a common habit.
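For quick reference, the flags in that command break down roughly as follows
(a sketch only; the host and credentials below are placeholders, since the
original post masked them):

```shell
# Annotated version of the wget invocation quoted above:
#   -r              recurse into links found on each page
#   -l2             limit recursion to two levels of depth
#   -t1             retry each failed download at most once
#   -N              timestamping: only re-fetch files newer than local copies
#   -np             no-parent: never ascend above the starting directory
#   -e robots=off   ignore robots.txt exclusion rules
wget -r -l2 -t1 -N -np -e robots=off \
    http://username:password@example.com:80/
```

Note that `-e robots=off` is the switch that makes wget ignore a site's
robots.txt; leave it out if you want the crawl to respect those rules.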
