Re: uploading compressed data using a multipart form (and patch)
Date: Mon, 12 Jan 2009 22:27:13 +0100 (CET)
On Sun, 11 Jan 2009, Mohun Biswas wrote:
> My existing code knows how to upload an arbitrary set of arbitrary-sized
> files using a multipart/form-data POST. What I'm trying to do now, in order
> to solve a performance problem, is to upload them in compressed format.
> Of course the easy thing would be to compress (gzip) each file into a memory
> buffer and pass it with CURLFORM_BUFFER. But since there's no limit to the
> number or size of the files, that doesn't seem like a good idea.
> However there's a nice new CURLFORM_STREAM option which seems made for the
> purpose: set up a callback to read blocks from the file and compress them
> into the libcurl-provided buffer. Something like this:
> So - is there a way to upload file data via a multipart form, compressing on
> the fly?
The only really functional way to do this (that I can think of), HTTP-wise, is
to do the entire POST using chunked transfer encoding. libcurl doesn't have any
particular support for doing multipart formposts with chunked encoding, so
you'd either have to introduce that support, or you do the full multipart
formpost syntax layer yourself and send it as a "normal" POST using, for
example, a read callback. libcurl _does_ support chunked encoding for plain
POSTs.
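To illustrate the "do the multipart syntax layer yourself" route, here is a
minimal sketch of the framing a read callback would have to emit around each
compressed file. The boundary string, field name, and the per-part
Content-Encoding header are made-up examples, not anything libcurl generates;
whether the receiving application wants a Content-Encoding header or just a
plain .gz upload is its own decision:

```c
#include <stdio.h>
#include <string.h>

/* Write the header that precedes one file part into buf.
 * Returns the number of bytes written (snprintf semantics). */
int part_header(char *buf, size_t size,
                const char *boundary, const char *field,
                const char *filename)
{
    return snprintf(buf, size,
                    "--%s\r\n"
                    "Content-Disposition: form-data; name=\"%s\"; "
                    "filename=\"%s\"\r\n"
                    "Content-Type: application/octet-stream\r\n"
                    "Content-Encoding: gzip\r\n"   /* assumption, see above */
                    "\r\n",
                    boundary, field, filename);
}

/* Write the terminator that closes the whole multipart body. */
int body_trailer(char *buf, size_t size, const char *boundary)
{
    return snprintf(buf, size, "\r\n--%s--\r\n", boundary);
}
```

A read callback registered with CURLOPT_READFUNCTION would then hand libcurl
the part header, the gzip-compressed file bytes as they are produced, and
finally the trailer; since the total body size isn't known up front, the
request has to go out with chunked transfer encoding rather than a
Content-Length.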
> I'm assuming, and the patch states, that it's the total expected length of
> the part, which in my case would be the compressed size of the file.
Thanks, I committed your patch just now!
--
 / daniel.haxx.se

Received on 2009-01-12