I've used libcurl with good success for years (though I would not call
myself in any way an expert with it or HTTP in general). Right now I'm
at an impasse though.
My existing code knows how to upload an arbitrary set of arbitrary-sized
files using a multipart/form-data POST. What I'm trying to do now, in
order to solve a performance problem, is to upload them in compressed form.
Of course the easy thing would be to compress (gzip) each file into a
memory buffer and pass it with CURLFORM_BUFFER. But since there's no
limit to the number or size of the files, that doesn't seem like a good approach.
However there's a nice new CURLFORM_STREAM option which seems made for
the purpose: set up a callback to read blocks from the file and compress
them into the libcurl-provided buffer. Something like this:
curl_easy_setopt(curl, CURLOPT_READFUNCTION, callback);
res = curl_formadd(&form, &last,
                   CURLFORM_COPYNAME, "file",
                   CURLFORM_STREAM, stream_data, /* handed to callback as userdata */
                   CURLFORM_END);
The intractable problem here is that according to the docs, use of
CURLFORM_CONTENTSLENGTH is mandatory when CURLFORM_STREAM is also in
use. But there's no way to predict the length of compressed data (other
than by doing a trial compression) so this way forward seems to be
blocked. I've looked at the associated test (lib554.c) but that uses a
data set of known length.
So - is there a way to upload file data via a multipart form,
compressing on the fly?
PS The requirement to use CURLFORM_CONTENTSLENGTH is not mentioned with
CURLFORM_STREAM, though it is elsewhere. I'm attaching a patch to fix
this. Also, though the doc says CONTENTSLENGTH must be *used*, it does
not say explicitly what it must *contain*. I'm assuming, and the patch
states, that it's the total expected length of the part, which in my
case would be the compressed size of the file.
Received on 2009-01-11