curl-library

Re: Large transfers

From: Duncan Wilcox <duncan_at_mclink.it>
Date: Thu, 31 Jul 2003 16:13:03 +0200

> Yes, unless you fool the server to send the file using chunked
> transfer-encoding.

Unless the server seeks for every chunk it sends... Anyway, I didn't
notice any curl command-line option to force chunked transfers...

> (BTW, this defect is mentioned in docs/KNOWN_BUGS)

Ah, sorry for not noticing.

> I beg to differ. Treating this "correctly" is bound to be impossible.
> That is a blatant server error, and even though we sometimes do
> work-arounds for obvious and common bugs in servers, I'm still to be
> convinced this is one of those.

All versions of Apache up to 1.3.27 have the problem, and they won't
disappear overnight. For someone who routinely handles large files
(media? MySQL dumps? huge logfiles?) it definitely is an issue. Your
paragraph below, though, reveals that --http1.0 is a workaround, and
it works.

Maybe curl (library or command-line tool) could attempt a protocol
downgrade automatically? (It shuts down the connection anyway.)

> libcurl however uses HTTP 1.1, and then ignoring the content-length
> will cause the transfer to just "hang" when the server considers the
> transfer to be complete, as it won't close the connection (until it's
> been idle for N seconds).

I hadn't realized that my proposal was incompatible with HTTP/1.1;
luckily I didn't jump in and code it, so I can still retract it :)

Duncan

Received on 2003-07-31