multi curl consuming more heap memory

From: Prasanna Venkateswaran via curl-library <curl-library_at_cool.haxx.se>
Date: Tue, 23 May 2017 13:00:52 +0530

Hi,

I have written an application that POSTs a JSON payload to a given REST API
endpoint. The application uses libevent2 and the curl multi interface to
achieve asynchronous behavior. While testing I noticed that the heap memory
usage of the process keeps increasing while multiple parallel POST requests
are in progress, and the usage never comes down even after the requests
complete.
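
For context, the per-request setup is roughly along the lines of the sketch
below (the endpoint URL, payload and handle bookkeeping are placeholders,
not my exact code); the multi handle itself is driven from libevent through
CURLMOPT_SOCKETFUNCTION and CURLMOPT_TIMERFUNCTION exactly as hiperfifo
does:

#include <curl/curl.h>

/* Sketch: queue one asynchronous JSON POST on an existing multi handle.
   The URL and payload passed in are placeholders. */
static void add_json_post(CURLM *multi, const char *url, const char *json)
{
  CURL *easy = curl_easy_init();
  struct curl_slist *hdrs = NULL;

  hdrs = curl_slist_append(hdrs, "Content-Type: application/json");

  curl_easy_setopt(easy, CURLOPT_URL, url);
  curl_easy_setopt(easy, CURLOPT_HTTPHEADER, hdrs);
  curl_easy_setopt(easy, CURLOPT_COPYPOSTFIELDS, json); /* libcurl keeps its own copy */
  curl_easy_setopt(easy, CURLOPT_PRIVATE, hdrs);        /* kept so the list can be freed on completion */

  curl_multi_add_handle(multi, easy); /* transfer now runs asynchronously */
}

When a transfer finishes (CURLMSG_DONE), the handle is removed with
curl_multi_remove_handle(), cleaned up with curl_easy_cleanup() and the
header list freed, much like the check_multi_info() logic in hiperfifo.c.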

The application is modeled exactly on the official sample code
"hiperfifo", apart from a few business-logic changes. So, to narrow down
the root cause of the memory growth, I compiled and ran the hiperfifo
example itself and noticed that the heap memory usage increases even with
the plain sample code. I have pasted my observations below.

Environment:
OS: Ubuntu 15.10
#curl --version
curl 7.43.0 (x86_64-pc-linux-gnu) libcurl/7.43.0 GnuTLS/3.3.15 zlib/1.2.8
libidn/1.28 librtmp/2.3
Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3
pop3s rtmp rtsp smb smbs smtp smtps telnet tftp
Features: AsynchDNS IDN IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB
SSL libz TLS-SRP UnixSockets

Test:
Run the sample application (hiperfifo.c)
#./curl_async
Creating named pipe "hiper.fifo"
Now, pipe some URL's into > hiper.fifo

Now pipe a URL into "hiper.fifo"
#echo "www.google.com" >hiper.fifo

Take a note of the memory usage
#cat /proc/$(pgrep curl_)/smaps

I see the following for heap memory usage.

01329000-0138c000 rw-p 00000000 00:00 0
 [heap]
Size: 396 kB
Rss: 360 kB
Pss: 360 kB
Shared_Clean: 0 kB
Shared_Dirty: 0 kB
Private_Clean: 0 kB
Private_Dirty: 360 kB
Referenced: 360 kB
Anonymous: 360 kB
AnonHugePages: 0 kB
Swap: 0 kB
KernelPageSize: 4 kB
MMUPageSize: 4 kB
Locked: 0 kB

If I send one URL at a time to the pipe, the memory usage remains fairly
constant.

I have a sample file (test.urls) with 200+ URLs picked from the top 500
domains. I wrote a small loop to read the URLs from the file and pipe them
to the named pipe one at a time:
#while read url; do echo ${url} > hiper.fifo; done < test.urls

The heap memory usage has jumped to ~9 MB, as can be seen below.

01329000-01c14000 rw-p 00000000 00:00 0
 [heap]
Size: 9132 kB
Rss: 8644 kB
Pss: 8644 kB
Shared_Clean: 0 kB
Shared_Dirty: 0 kB
Private_Clean: 0 kB
Private_Dirty: 8644 kB
Referenced: 8600 kB
Anonymous: 8644 kB
AnonHugePages: 0 kB
Swap: 0 kB
KernelPageSize: 4 kB
MMUPageSize: 4 kB
Locked: 0 kB

This memory usage does not come down after all the requests have
completed, even after waiting for more than 30 minutes.

When I run valgrind on hiperfifo, it does not report any leak either, even
while the heap memory usage stays this high.
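
If it is useful, one check I could add to the hiperfifo build (just a
sketch, glibc-specific and nothing to do with libcurl itself) is a dump of
the allocator's own heap statistics after the transfers are done, to see
how much of the grown heap is still allocated versus sitting free inside
the arena:

#include <malloc.h>
#include <stdio.h>

/* Sketch: print glibc's view of the heap (glibc-specific, not libcurl). */
static void dump_heap_stats(void)
{
  struct mallinfo mi = mallinfo();

  printf("arena (sbrk'd heap):   %d bytes\n", mi.arena);
  printf("allocated (in use):    %d bytes\n", mi.uordblks);
  printf("free inside the heap:  %d bytes\n", mi.fordblks);
  printf("mmap'd allocations:    %d bytes\n", mi.hblkhd);

  malloc_trim(0); /* ask glibc to hand free pages back to the kernel */
}

Calling this once all transfers have completed would show whether the
extra megabytes are still allocated by the program or only being held by
the allocator.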

Can you please help me understand the reason for the heap memory increase?
I believe that if I can get to the bottom of this issue, it will also help
me fix the high heap usage in my own application.

Regards,
Prasanna
