Date: Mon, 28 Jul 2003 20:31:58 -0600
First off, let me say that libcurl is a very nice project and I hope
everyone keeps up the great work being done.
I am the author of Download Wizard, a Cocoa-based download manager for
Mac OS X. I had already written the program's core network library
(HTTP, FTP, Hotline) from scratch. I was in the process of researching
HTTPS and FTPS support when I came across libcurl. Since its license
allows use in commercial products and it already supported those
protocols (as well as some others), I figured I'd put together a
prototype of Download Wizard utilizing libcurl for both HTTP and FTP.
This has been for the most part successful, but I had to make some
changes to libcurl to get it behaving the way I need it to. I figured
I'd share my experience and explain both the problems and how I chose
to solve them. If anyone knows a better way to do these things, please
let me know. Also, if my changes are something that might be desirable
in the libcurl codebase, I'd be happy to share :)
1. There was no way to determine whether a file on an FTP server was
resumable. I determine whether an HTTP server supports resuming by
checking for the "Accept-Ranges: bytes" header (is there a better
way?). My solution for FTP was, when the user selects the NOBODY and
HEADER options, to send a "REST 0" after the "SIZE" command. If I get
a 350 code back, I assume the server supports the REST command and
return an "Accept-Ranges: bytes" header in the same manner as is
already done with "Content-Length:".
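For the HTTP side, here is a minimal sketch of the check I do. The
parsing helper is my own illustration; in the real client it gets fed
one header line at a time from a CURLOPT_HEADERFUNCTION callback:

```c
#include <string.h>
#include <strings.h>  /* strncasecmp (POSIX) */

/* Illustrative helper: returns 1 if a single HTTP header line
 * indicates the server accepts byte-range requests
 * ("Accept-Ranges: bytes"), else 0. In practice this would be called
 * once per header line from a CURLOPT_HEADERFUNCTION callback. */
static int header_allows_resume(const char *line)
{
    const char *name = "Accept-Ranges:";
    size_t n = strlen(name);

    if (strncasecmp(line, name, n) != 0)
        return 0;

    /* skip whitespace after the colon */
    line += n;
    while (*line == ' ' || *line == '\t')
        line++;

    return strncasecmp(line, "bytes", 5) == 0;
}
```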
2. You can set a cookie string on an HTTP URL, but if libcurl is
redirected, even to the same server, the second request won't include
the cookies. My program supports cookies from multiple browsers, so
pointing it at a single cookie file wasn't feasible. My solution was to
implement a "cookie callback" function in the same manner as the
existing password callback. This callback is invoked when the
CURLOPT_COOKIE option isn't set, and is passed the current working URL
as well as a buffer to copy the cookie string into.
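To illustrate, the callback I added looks roughly like this. The name,
signature, and toy lookup below are my own sketch of the patch, not an
existing libcurl API:

```c
#include <string.h>

/* Hypothetical cookie callback, modeled on the existing password
 * callback: libcurl would invoke it (when CURLOPT_COOKIE is unset)
 * with the URL it is about to request, and the application copies the
 * matching cookie string into the supplied buffer. Returns 0 on
 * success, non-zero if no cookies apply to this URL. */
static int my_cookie_callback(const char *url, char *buffer,
                              size_t buflen, void *clientp)
{
    (void)clientp; /* application state would normally live here */

    /* Toy lookup: in Download Wizard this would consult the cookie
     * stores of the supported browsers instead. */
    if (strstr(url, "example.com") != NULL) {
        strncpy(buffer, "sessionid=abc123", buflen - 1);
        buffer[buflen - 1] = '\0';
        return 0;
    }
    return 1; /* no cookie for this URL */
}
```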
3. I couldn't find any function to tell libcurl externally to stop
downloading a file. This obviously isn't necessary for the command-line
utility, but it is needed for my GUI. I got this working by setting up
a progress callback and returning 1 from it if the user clicked the
stop button. This seems to do the trick, but I imagine there is some
performance penalty for installing the progress function. Is there
perhaps a better way to do this?
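For reference, the shape of that arrangement. The stop flag is my own
illustration; the callback signature matches libcurl's
curl_progress_callback, and it would be installed with
CURLOPT_PROGRESSFUNCTION after clearing CURLOPT_NOPROGRESS:

```c
/* Flag flipped by the GUI when the user clicks Stop. */
static volatile int stop_requested = 0;

/* Matches libcurl's curl_progress_callback signature. Returning a
 * non-zero value makes libcurl abort the transfer, which then comes
 * back with CURLE_ABORTED_BY_CALLBACK. */
static int progress_cb(void *clientp, double dltotal, double dlnow,
                       double ultotal, double ulnow)
{
    (void)clientp; (void)dltotal; (void)dlnow;
    (void)ultotal; (void)ulnow;
    return stop_requested ? 1 : 0;
}
```

Wiring it up looks like curl_easy_setopt(handle, CURLOPT_NOPROGRESS, 0)
followed by curl_easy_setopt(handle, CURLOPT_PROGRESSFUNCTION,
progress_cb). The callback is invoked frequently during a transfer, but
the work done per call here is trivial.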
4. I see an option to set a lower speed limit for transfers, but what
about an upper one? Does this exist or has it been discussed?
Anyhow, I'd welcome any feedback and be happy to share my code changes
if anyone wants them.
Received on 2003-07-29