curl-library

Curl Issue..........

From: Sumukh Anantha Manohar <am.sumukh_at_gmail.com>
Date: Fri, 22 Jul 2011 14:32:08 +0530

On Thu, Jul 21, 2011 at 3:30 PM, <curl-library-request_at_cool.haxx.se> wrote:

> Send curl-library mailing list submissions to
> curl-library_at_cool.haxx.se
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://cool.haxx.se/cgi-bin/mailman/listinfo/curl-library
> or, via email, send a message with subject or body 'help' to
> curl-library-request_at_cool.haxx.se
>
> You can reach the person managing the list at
> curl-library-owner_at_cool.haxx.se
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of curl-library digest..."
>
>
> Today's Topics:
>
> 1. Re: Curl Issue.......... (Alan Wolfe)
> 2. Re: Curl Issue.......... (Dan Fandrich)
> 3. file upload using libcurl fails for filenames containing '#'
> (Thomas Falkenberg)
> 4. Re: file upload using libcurl fails for filenames containing
> '#' (Jeff Pohlmeyer)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 20 Jul 2011 23:29:32 -0700
> From: Alan Wolfe <alan.wolfe_at_gmail.com>
> To: libcurl development <curl-library_at_cool.haxx.se>
> Subject: Re: Curl Issue..........
> Message-ID:
> <CA+dZFW9yCKJji7X2jaMWTzsnWmx-25f9-GA5c1kS5zYShoOsBw_at_mail.gmail.com
> >
> Content-Type: text/plain; charset="iso-8859-1"
>
> On Jul 20, 2011 11:28 PM, "Sumukh Anantha Manohar" <am.sumukh_at_gmail.com>
> wrote:
>
> Hi,
> I am trying to download files larger than 2 GB using curl APIs such as
> curl_easy_perform() and curl_easy_setopt() on Linux and UNIX platforms,
> in C++.
>
> The problem is that these functions let me download files only up to
> 2 GB and no more. Here is what I am doing: I am calling the statement:
>
> curl_easy_setopt(curlHandle_m, CURLOPT_WRITEFUNCTION,
> FileTransfer::CallBack);
>
> libcurl then calls the CallBack function, which writes the downloaded data
> to the destination file chunk by chunk. But once the file reaches 2 GB,
> it is unable to write any further data to the file.
>
> Following is the Callback function:
>
> size_t FileTransfer::CallBack(void *buffer, size_t size, size_t nmemb, void *stream)
> {
>     struct DownloadFileName *out = (struct DownloadFileName *) stream;
>     if (out && !out->stream) {
>         LOG_DEBUG(ACE_TEXT("While downloading file name as: %s\n"), out->filename);
>
>         // open file for writing
>         out->stream = ACE_OS::fopen(out->filename, "wb");
>         if (!out->stream) {
>             // Failure, can't open file to write
>             return CURLE_WRITE_ERROR;
>         }
>     }
>     return ACE_OS::fwrite(buffer, size, nmemb, out->stream);
> }
>
> Could anyone please help me with this issue?
>
> Thanks,
> -------------------------------------------------------------------
> List admin: http://cool.haxx.se/list/listinfo/curl-library
> Etiquette: http://curl.haxx.se/mail/etiquette.html
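[Editor's note: for reference, here is a minimal stdio-only sketch of a large-file-capable write callback. The DownloadTarget struct and all names are illustrative stand-ins, not the poster's ACE-based code. Two details matter: on 32-bit Linux the program must be built with -D_FILE_OFFSET_BITS=64 so fopen()/fwrite() use 64-bit file offsets, and the failure path returns 0 rather than a CURLcode, since libcurl treats any return value other than size * nmemb as a write error.]

```cpp
#include <cstddef>
#include <cstdio>

// Hypothetical stand-in for the poster's DownloadFileName struct, using
// plain stdio instead of the ACE wrappers (illustration only).
struct DownloadTarget {
    const char *filename;
    std::FILE  *stream;
};

// Callback in the shape CURLOPT_WRITEFUNCTION expects. On 32-bit Linux,
// build with -D_FILE_OFFSET_BITS=64 so fopen()/fwrite() use 64-bit file
// offsets; without it, writes start failing once the file reaches 2 GiB.
static size_t write_cb(void *buffer, size_t size, size_t nmemb, void *userp)
{
    DownloadTarget *out = static_cast<DownloadTarget *>(userp);
    if (out && !out->stream) {
        out->stream = std::fopen(out->filename, "wb");
        if (!out->stream)
            return 0;  // any value != size * nmemb makes libcurl abort
    }
    // fwrite() returns the number of items written, so scale by size to
    // report bytes handled; libcurl compares this against size * nmemb.
    return std::fwrite(buffer, size, nmemb, out->stream) * size;
}
```

With libcurl this would be wired up via curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, write_cb) and CURLOPT_WRITEDATA; the sketch deliberately leaves libcurl out so it stands alone.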
>
> ------------------------------
>
> Message: 2
> Date: Wed, 20 Jul 2011 23:37:54 -0700
> From: Dan Fandrich <dan_at_coneharvesters.com>
> To: curl-library_at_cool.haxx.se
> Subject: Re: Curl Issue..........
> Message-ID: <20110721063752.GA530_at_coneharvesters.com>
> Content-Type: text/plain; charset=us-ascii
>
> On Thu, Jul 21, 2011 at 10:53:59AM +0530, Sumukh Anantha Manohar wrote:
> > I am trying to download files larger than 2 GB using curl APIs such as
> > curl_easy_perform() and curl_easy_setopt() on Linux and UNIX platforms,
> > in C++.
> >
> > The problem is that these functions let me download files only up to
> > 2 GB and no more.
>
> libcurl is capable of transferring files larger than 2 GiB, but the rest
> of the system must also be capable for it to work. Recent Linux and libc
> versions are of course capable, but not all filesystems are (e.g. FAT).
> What filesystem are you writing to?
>
> Since you mention problems on other UNIX platforms as well, the problem
> may actually be in the server. You don't say what protocol you're using,
> but HTTP and FTP (and others) are capable of transferring large files
> only if the server software and server OS are also capable. The fault
> is likely to lie on the remote side. Try enabling libcurl trace debugging
> and you're likely to see a 31-bit overflow of some sort show up.
>
> >>> Dan
>
>
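[Editor's note: Dan's "31-bit overflow" can be made concrete with a small self-contained simulation, no libcurl involved. If any component on the path keeps the running byte total in a signed 32-bit integer, the count goes negative just past 2 GiB, while a 64-bit counter, which is what curl_off_t provides, stays correct.]

```cpp
#include <cstdint>
#include <utility>

// Simulate a 3 GiB transfer delivered in 1 MiB chunks, keeping the running
// byte total in both a 32-bit and a 64-bit counter.
std::pair<int32_t, int64_t> simulate_transfer()
{
    const int64_t chunk = 1 << 20;        // 1 MiB per write-callback call
    uint32_t acc32 = 0;                   // unsigned, so wraparound is well-defined
    int64_t  acc64 = 0;                   // a curl_off_t-sized counter

    for (int i = 0; i < 3 * 1024; ++i) {  // 3072 chunks = 3 GiB total
        acc32 += static_cast<uint32_t>(chunk);
        acc64 += chunk;
    }
    // Viewing the 32-bit count as signed exposes the overflow at 2^31.
    return { static_cast<int32_t>(acc32), acc64 };
}
```

Calling simulate_transfer() yields a negative 32-bit total (-1073741824) alongside the true 64-bit total of 3221225472 bytes, which is the kind of sign flip to look for in a trace.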
Hi,
The protocols I am using are FTP, HTTP and SFTP, and I am writing to an
ext2 filesystem. I checked with the remote server (florida) and was able
to transfer files larger than 2 GB to and from it using an ftp client.
I am currently looking into libcurl trace debugging. If you have any
further suggestions based on this information, please let me know.

Thanks,
Sumukh

Received on 2011-07-22