curl-library Mailing List Archives

Re: Misc. enhancements

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Tue, 29 Jul 2003 10:38:51 +0200 (CEST)

On Mon, 28 Jul 2003, Daniel Noguerol wrote:

> First off, let me say that libcurl is a very nice project and I hope
> everyone keeps up the great work being done.

I'm glad you like it and welcome to the project!

> I am the author of Download Wizard, a Cocoa-based download manager for Mac
> OS X. I had already written the program's core network library (HTTP, FTP,
> Hotline) from scratch. I was in the process of researching HTTPS and FTPS
> support when I came across libcurl. Since its license allows use in
> commercial products and it already supported those protocols (as well as
> some others), I figured I'd put together a prototype of Download Wizard
> utilizing libcurl for both HTTP and FTP. This has been for the most part
> successful, but I had to make some changes to libcurl to get it behaving the
> way I need it to. I figured I'd share my experience and explain both the
> problems and how I chose to solve them. If anyone knows a better way to do
> these things, please let me know. Also, if my changes are something that
> might be desirable in the libcurl codebase, I'd be happy to share :)

I trust you already are familiar with curlhandle, the Cocoa binding for
libcurl: http://curlhandle.sourceforge.net/. I thought I should mention it
since it might suit a Cocoa application better, I don't know...

> 1. There was no way to determine if a file on an FTP server was resumable. I
> determine if an HTTP server supports resumes by checking for the
> "Accept-Ranges: bytes" header (is there a better way?).

Hm, I don't think there is. Even so, that header is only a "MAY" in the RFC: a
server might support ranges without sending that header. (That's my
interpretation of RFC 2616 section 14.5.)
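
(For illustration, the header check could be done with a libcurl header
callback along these lines. This is only a sketch; the flag name and the
hook-up calls in the comment are assumptions, not Download Wizard's code.)

```c
#include <stddef.h>
#include <string.h>
#include <strings.h>  /* strncasecmp() */

/* Header callback matching libcurl's CURLOPT_HEADERFUNCTION signature.
   Sets *(int *)userdata to 1 when an "Accept-Ranges: bytes" header is
   seen.  One would install it with something like:
     curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, header_cb);
     curl_easy_setopt(curl, CURLOPT_WRITEHEADER, &resumable); */
static size_t header_cb(void *ptr, size_t size, size_t nmemb, void *userdata)
{
    size_t len = size * nmemb;
    const char *h = (const char *)ptr;

    /* Header names are case-insensitive per RFC 2616. */
    if (len > 14 && strncasecmp(h, "Accept-Ranges:", 14) == 0) {
        const char *v = h + 14;
        while (v < h + len && (*v == ' ' || *v == '\t'))
            v++;                                  /* skip leading blanks */
        if ((size_t)(h + len - v) >= 5 && strncasecmp(v, "bytes", 5) == 0)
            *(int *)userdata = 1;
    }
    return len;  /* tell libcurl the whole header was consumed */
}
```

Remember that, as noted above, a server may support ranges without ever
sending this header, so a 0 here only means "unknown", not "not resumable".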

> My solution for FTP was, in the case of the user selecting the NOBODY and
> HEADER options, I send a "REST 0" after the "SIZE" command is sent. If I get
> a 350 code back, then I assume the server supports the REST command and
> return an "Accept-Ranges: bytes" header in the same manner as is already
> done with "Content-Length:".

I think we should reconsider that whole approach of faking HTTP headers in the
FTP code, but since we (uh, ok *I*) already have started that venture I think
your approach fits in well.

I'm interested in the code for this.
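
(The probe described above boils down to checking the status code of the
server's reply to "REST 0". A standalone sketch of that check, not actual
libcurl internals:)

```c
#include <string.h>

/* Return 1 if an FTP server's reply to "REST 0" indicates resume
   support.  A 350 reply ("Requested file action pending further
   information") means the server accepted the restart point; anything
   else (e.g. "502 REST not implemented") means it did not.  FTP
   replies start with a three-digit status code, so a prefix match on
   the reply line is enough here. */
static int rest_reply_ok(const char *reply)
{
    return strncmp(reply, "350", 3) == 0;
}
```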

> 2. You can set a cookie string on an HTTP URL, but if CURL is redirected,
> even if it's on the same server, the second request won't include the
> cookies. My program supports cookies from multiple browsers, so pointing it
> at a cookie file wasn't feasible. My solution was to implement a "cookie
> callback" function in the same manner as the existing password callback.
> This callback is called if the CURLOPT_COOKIE option isn't set and is passed
> the current working URL as well as a buffer to copy the cookie string into.

This is a subset of the somewhat wider idea of a new cookie interface I'm
slowly working on (working => writing down the idea, no code yet). Please let
us know what you think about an API similar to this:

        http://curl.haxx.se/dev/COOKIES

I'd rather have a go at all the cookie related ideas and issues at once, than
patching up small fixes one by one.
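
(In the meantime, the per-URL lookup such a cookie callback would perform can
live entirely on the application side. A minimal sketch, where the table
contents and the helper name are hypothetical; the result would be handed to
libcurl via CURLOPT_COOKIE before each transfer, and again after a redirect
if redirects are handled manually:)

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical app-side cookie table: maps a host to the cookie string
   that should accompany requests to that host.  More specific hosts
   come first, since we match with a simple substring search. */
struct cookie_entry {
    const char *host;
    const char *cookies;
};

static const struct cookie_entry cookie_table[] = {
    { "download.example.com", "sessionid=abc123; mirror=eu" },
    { "example.com",          "sessionid=abc123" },
};

/* Return the cookie string to use for a URL, or NULL if none applies.
   The caller would then do:
     curl_easy_setopt(curl, CURLOPT_COOKIE, cookies_for_url(url)); */
static const char *cookies_for_url(const char *url)
{
    size_t i;
    for (i = 0; i < sizeof(cookie_table) / sizeof(cookie_table[0]); i++) {
        if (strstr(url, cookie_table[i].host))
            return cookie_table[i].cookies;
    }
    return NULL;
}
```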

> 3. I couldn't find any sort of function to externally tell CURL to stop
> downloading a file.

Right, there is none. I have not come up with a suitable and portable enough
solution. I mean, I could easily make a curl_easy_stop() function that sets a
variable checked for in the same manner progress function callbacks work
today, but then anyone could do that for themselves.

> This obviously isn't necessary for the command-line utility, but is needed
> for my GUI. I got this working by setting up a progress function callback
> and returning a 1 from that if the user clicked the stop button. This seems
> to do the trick but I imagine there would be some sort of performance
> penalty for installing the progress function. Is there perhaps a better way
> to do this?

This is the recommended approach for doing it. If you can come up with a
better approach to implement a "stop" function, feel free to tell us. :-)
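
(For readers of the archive: the shape of that solution is roughly this.
Returning non-zero from the progress callback makes libcurl abort the
transfer with CURLE_ABORTED_BY_CALLBACK; the flag name here is made up.)

```c
#include <stddef.h>

/* A stop flag the GUI sets when the user clicks the stop button. */
static volatile int stop_requested = 0;

/* Progress callback matching libcurl's CURLOPT_PROGRESSFUNCTION
   signature.  Install it with:
     curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0);
     curl_easy_setopt(curl, CURLOPT_PROGRESSFUNCTION, progress_cb); */
static int progress_cb(void *clientp, double dltotal, double dlnow,
                       double ultotal, double ulnow)
{
    (void)clientp; (void)dltotal; (void)dlnow; (void)ultotal; (void)ulnow;
    /* Non-zero return value => libcurl aborts the transfer. */
    return stop_requested ? 1 : 0;
}
```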

> 4. I see an option to set a lower speed limit for transfers, but what about
> an upper one? Does this exist or has it been discussed?

It doesn't exist and it has never been discussed, AFAICR. But you can easily
add such a check yourself, with the progress callback...
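
(One way to build that check: from the byte and time counters the progress
callback already sees, compute how long the transfer *should* have taken at
the cap and pause for the difference, e.g. with usleep(). The helper below is
a sketch of that arithmetic, not anything libcurl provides.)

```c
/* Given bytes transferred so far, elapsed wall-clock seconds, and a
   cap in bytes/second, return how many seconds the caller should
   pause to stay at or below the cap (0.0 if already under it). */
static double throttle_delay(double bytes, double elapsed, double cap)
{
    double min_elapsed;
    if (cap <= 0.0)
        return 0.0;              /* no cap configured */
    min_elapsed = bytes / cap;   /* time the bytes should have taken */
    return (min_elapsed > elapsed) ? (min_elapsed - elapsed) : 0.0;
}
```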

-- 
 Daniel Stenberg -- curl: been grokking URLs since 1998
Received on 2003-07-29
