Memory leak during gzip decompression
Date: Thu, 07 May 2009 19:54:27 +0200
When using libcurl to access some sites with CURLOPT_ENCODING set to "" and the site returning data with
Content-Encoding: gzip, libcurl (or zlib) leaks memory.
I believe the problem stems from the fact that search.live.com returns data that is gzip compressed, but where the ISIZE
field is wrong.
The relevant RFC (http://tools.ietf.org/html/rfc1952), in section "2.3.1.2. Compliance", doesn't require examination
of the trailing CRC32 or ISIZE fields (although no sane decoder does that), and no browser throws an error when the
ISIZE field is incorrect (I guess because Transfer-Encoding: chunked already ensures that the data is received in full),
so my guess is that:
a. The memory leak shouldn't be present, even for slightly malformed data.
b. The current behaviour that silently forgives this deviation from the RFC should be left as-is.
I don't know enough about content_encoding.c and zlib to track this down further, unfortunately...
libcurl - Perl binding maintainer
Received on 2009-05-07