
curl-library

Re: designing multiple file support in libcurl

From: Johan Nilsson <johan_at_nnl.se>
Date: Tue, 13 Feb 2001 10:55:28 +0100

Hello!

I don't know much about connection management so my ideas may be way off.

This concerns libcurl.

Why not have some kind of connection keep-alive option? It would let one set a
timer interval to keep the connection open even after a cleanup. If no new
request to that server arrives within the interval, the connection is closed.
This timer interval and/or the keep-alive flag could be set with
curl_easy_setopt. That would avoid introducing any new commands and let us use
curl as before. One new command could be useful though: a curl_easy_shutdown,
or something like that, which closes all connections not currently used for
transferring (for example when we want to shut down the program using libcurl).

It would be nice if curl kept a "list" of the open connections so that we
can do parallel downloads of several files from different servers (using only
one connection to each server).

Is it possible to use the same connection for parallel transfers?

Regards,

Johan

Daniel Stenberg wrote:

> How would the interface work?
>
> (This is cross-posted to both the curl mailing lists, the discussion of this
> should probably be held in the libcurl list.)
>
> I've been toying with the idea (and some source code) to make libcurl capable
> of transferring any amount of files using the same socket connection. I
> imagine I'd enable this at least for http and ftp, probably with ftp coming
> first.
>
> How would a libcurl-using dude want to add more than one URL? How should curl
> signal the application when there's another file coming?
>
> When adding more than one URL there's this chance that the following URLs
> aren't using the same server and then curl won't be able to use the same
> connection for them. I thought I'd let the library find that out itself, so
> that we can pass a long list of URLs and libcurl will download those in the
> list that are on the same server, just skipping the rest.
>
> Would it be enough to have a setopt() option that takes a linked list with
> URL strings? Or should I just allow CURLOPT_URL being used multiple times?
>
> Would it be enough if I introduced another callback hook that gets called
> when a new file is being downloaded? It could have a few interesting
> parameters as well as the one passed to the write callbacks.
>
> We should also consider the effects on the reversed transfer, when uploading
> multiple files to the remote server...
>
> The mail is a bit chaotic, but so is my brain at the moment! ;-)
>
> --
> Daniel Stenberg -- curl project maintainer -- http://curl.haxx.se/
>

--
| Johan Nilsson, M.Sc. in C.S.E., johan_at_nnl.se
| NNL Technology AB, http://www.nnl.se
| Phone: +46 (0)13 211400
| Mobile: +46 (0)70 9634472
| Address: Teknikringen 1B, S-58330 Linköping, SWEDEN
_______________________________________________
Curl-library mailing list
Curl-library_at_lists.sourceforge.net
http://lists.sourceforge.net/lists/listinfo/curl-library
Received on 2001-02-13