Roy Schestowitz wrote:
> __/ [ BearItAll ] on Wednesday 16 August 2006 14:47 \__
>
> > I wouldn't normally use a browser to download large files, much easier with
> > wget. So I haven't hit the 2G limit before.
> >
> > But I was on a site this morning and the only way they gave was via the
> > browser, though they did tell you that most browsers would hit a size
> > limit, including Firefox (of course I missed that bit of text until I
> > discovered the file wasn't fully downloaded, as you do). I thought that
> > Opera would have no trouble, but it turns out it is the same.
> >
> > Gzilla is the one to use for these large downloads, it's working fine so
> > far.
> >
> > But I just wondered if anyone had seen a reason for the download size
> > limit? I can understand the limit of number of concurrent downloads,
> > browsers are too big-n-bulky for that sort of job, but the file size limit
> > doesn't seem to have a reason other than a number the browser writers
> > picked out of the air.
>
> I can only think of the file size limit (4GB for NTFS) as a factor.
There is no such limit on NTFS; that is FAT32. There are file
formats with that inherent limit (WAV, for example) and software that
will fail on larger files because it was written for 9x systems and
doesn't take NTFS into consideration...
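For what it's worth, the 2G figure BearItAll hit is exactly the ceiling of
a signed 32-bit byte counter, which is the usual suspect when older download
code tops out, independent of the filesystem. A quick Python sketch (just an
illustration of the arithmetic, not the browsers' actual code):

```python
import ctypes

# The largest value a signed 32-bit integer can hold is 2^31 - 1 bytes,
# i.e. one byte short of 2 GiB. Software that tracks a file size in such
# an integer wraps to a negative number at the 2 GiB mark.
max_i32 = 2 ** 31 - 1          # 2,147,483,647 bytes, just under 2 GiB
two_gib = 2 * 1024 ** 3        # 2,147,483,648 bytes

print(max_i32)                         # largest representable size
print(ctypes.c_int32(two_gib).value)   # wraps around to -2147483648
```

FAT32's 4G-per-file ceiling comes from the same idea with an *unsigned*
32-bit counter (2^32 - 1 bytes).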
--
Tom Shelton