Roy Schestowitz wrote:
> __/ [ BearItAll ] on Wednesday 16 August 2006 14:47 \__
>
> > I wouldn't normally use a browser to download large files, much easier with
> > wget. So I haven't hit the 2G limit before.
> >
> > But I was on a site this morning and the only way they gave was via the
> > browser, though they did tell you that most browsers would hit a size limit
> > including Firefox (of course I missed that bit of text until I discovered
> > the file wasn't fully downloaded, as you do). I thought that Opera would
> > have no trouble, but it turns out it is the same.
> >
> > Gzilla is the one to use for these large downloads, it's working fine so
> > far.
> >
> > But I just wondered if anyone had seen a reason for the download size
> > limit? I can understand the limit of number of concurrent downloads,
> > browsers are too big-n-bulky for that sort of job, but the file size limit
> > doesn't seem to have a reason other than a number the browser writers
> > picked out of the air.
>
> I can only think of the file size limit (4GB for NTFS) as a factor.
There is no such limitation in NTFS; that's the limit in FAT32. There
are file formats that cap you at 4GB, such as .wav - but that's the
format, not the file system. Also, some software may fail at this limit
because it doesn't take NTFS into consideration - but the fact is there
is no 4GB limit on NTFS.
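
As an aside on where these "magic" numbers come from (my own
illustration, not anything from the thread): 2GB and 4GB are exactly the
largest byte counts that fit in a signed and an unsigned 32-bit integer.
A downloader or format that tracks file size in a signed 32-bit variable
wraps negative at 2GiB; FAT32 stores sizes in an unsigned 32-bit field,
hence its 4GiB-minus-one-byte cap. A quick Python sketch:

```python
# The 2 GB / 4 GB limits line up with 32-bit integer ranges - they are
# not numbers the browser writers picked out of the air.

SIGNED_32_MAX = 2**31 - 1     # largest signed 32-bit value (~2 GiB)
UNSIGNED_32_MAX = 2**32 - 1   # largest unsigned 32-bit value (~4 GiB)

def wrap_signed_32(n):
    """Simulate what a signed 32-bit byte counter would hold for count n."""
    n &= 0xFFFFFFFF                      # keep only the low 32 bits
    return n - 2**32 if n >= 2**31 else n  # reinterpret high bit as sign

print(f"signed 32-bit max:   {SIGNED_32_MAX:,} bytes")
print(f"unsigned 32-bit max: {UNSIGNED_32_MAX:,} bytes")
# One byte past 2 GiB, a signed 32-bit counter goes negative:
print(wrap_signed_32(2 * 1024**3))  # prints -2147483648
```

Software that does its size arithmetic in 64 bits (as NTFS itself does)
never hits either wall.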
--
Tom Shelton [MVP]