
curl-library

Need help with large files

From: Joe Nardone <jnardone_at_gmail.com>
Date: Tue, 10 Apr 2007 17:42:04 -0400

I've been over the mailing lists for most of the day and am still stuck, so
now I'm turning to you.

I've got a daemon on RHEL4, using the libcurl 7.12.1 that ships (though I've
also tried using 7.16.1). In all cases I'm using the shared lib.

My program gets and sends files over FTP. However, 2+ GB files aren't
working.
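
For context, the download side of my code is essentially the stock recipe:
default write callback, plain FILE *. A stripped-down sketch (names
shortened and simplified for this mail):

    #include <stdio.h>
    #include <curl/curl.h>

    /* fetch one file over FTP into a local file */
    static CURLcode fetch(const char *url, const char *path)
    {
        FILE *fp = fopen(path, "wb");     /* 32-bit offsets unless built with LFS */
        CURL *curl = curl_easy_init();
        CURLcode rc = CURLE_FAILED_INIT;

        if(fp && curl) {
            curl_easy_setopt(curl, CURLOPT_URL, url);
            /* no CURLOPT_WRITEFUNCTION set: the default callback fwrite()s here */
            curl_easy_setopt(curl, CURLOPT_WRITEDATA, fp);
            rc = curl_easy_perform(curl);
        }

        if(curl)
            curl_easy_cleanup(curl);
        if(fp)
            fclose(fp);
        return rc;
    }

The upload side is the mirror image of this, with CURLOPT_UPLOAD and the
default read callback.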

Here's what I've tried so far, all using the RHEL-packaged lib:

1) If I don't compile my own program with large file support, I get a core
dump when I hit 2 GB. (Not a big surprise, since the file offsets behind the
FILE * are only 32-bit.)

2) If I compile with either implicit or explicit large file support (
http://people.redhat.com/berrange/notes/largefile.html -- exact defines
below), then I do *not* core dump when I hit 2 GB. The entire file is
transferred, BUT curl_easy_perform never returns: it hangs/loops forever
until I kill the program, never detecting that the transfer is complete.

This was also true when I built 7.16.1 by hand and rebuilt my program
against it. (In all scenarios, files under 2 GB work fine.) In both cases
curl --version shows "Largefile" as a feature.
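
(To pin down my terminology: by "implicit" large file support I mean the
usual glibc feature macros, and by "explicit" I mean switching to the 64-bit
interfaces directly, i.e. roughly:

    /* implicit LFS: define these before any system header, or pass
       -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE on the compile line */
    #define _FILE_OFFSET_BITS 64
    #define _LARGEFILE_SOURCE
    #include <stdio.h>

    /* explicit LFS: use the 64-bit calls/types directly */
    #define _LARGEFILE64_SOURCE
    #include <stdio.h>
    /* FILE *fp = fopen64(path, "wb");  with off64_t offsets */

both as described on the notes page linked above.)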

What I have not been able to figure out is:
1) What, if anything, do I need to do to enable large file support within
libcurl at compile time? Or should it just "work"?
2) What am I doing wrong?
     a) Do I need to use a custom CURLOPT_WRITEFUNCTION? I am using the
default. (If so, will I need a custom function for reads as well? See the
sketch below for what I have in mind.)
     b) Am I doing something stupid with regard to large files in my own
code? I am floundering trying to find a concise answer to this.

Any help would be appreciated.

Joe