
curl-library

Re: FTP large file support patch

From: Dave Meyer <meyer_at_paracel.com>
Date: Mon, 15 Dec 2003 10:07:20 -0800 (PST)

> I suggest leaving off the % from OFF_T_FMT so that it's possible to add
> flags if desired (e.g. you can't currently do ("%-12" OFF_T_FMT))
> It looks to me like _FILE_OFFSET_BITS is probably the right way to do this
> check. It's important that it works even when being cross-compiled, but I
> haven't looked to see how configure determines this value.
>
> ...
>
> That might be the case, but the kind of systems that will be downloading
> a total of > 2 GiB are going to have a 64-bit off_t. For the others, it's
> just another form of the 2 GiB size limit.

I'll look into fixing the printf bits and the progress function over the
holidays -- since I now have something that meets my needs at work, I can't
really justify spending more work time on these bits. My further patches
will come from djm_at_cs.hmc.edu, and probably won't show up for a week or
two.

I will, however, be spending time at work on the stuff to recognize broken
server handling of the REST command for offsets above 2 GB. I'm going to
add that as an additional part of this current patch, since it doesn't
really apply to anyone who isn't also concerned about large files (well,
at least as far as I've seen -- all of the servers I've talked to handle
REST offsets up to 2 GB just fine...).

When I get back to the progress metering callback, though, I do have a
question: Does the progress get cleared after each download for a
multi-file download, or is it shown as the total progress across all
downloads (and same question for uploads)? If it's the former, then I
have no qualms about trying to store download/upload state in off_t's,
since (as you mentioned) nobody without large file support will be
downloading any single file over 2 GB. However, if it's the latter, then I am
a little leery about changing things from doubles to off_t's, since the
*total* download may easily exceed 2 GB even on systems that don't support
large files -- download all of NCBI's Genbank EST's
(ftp://ftp.ncbi.nih.gov/genbank/gbest*.seq.gz) as one curl command, for
instance, and you're looking at something on the order of 17 GB of data...

Thanks,

Dave

Received on 2003-12-15