

Please advise on how to improve libcurl HTTP GET performance

From: Alex <>
Date: Mon, 10 Oct 2011 16:11:02 +0400

I'm trying to download web pages. Everything works fine except for the low speed: I have a 3 Mbps connection (and I have disabled all other network-related programs), but I only get 30-40 KB/s in a debug build, and even less (18-30 KB/s) in a release build, which is not the kind of speed I was hoping for. I've performed tests with several HTML pages ranging in size from 12 KB to 4.6 MB. The DNS lookup time is included in the time I measured to determine speed, but it is very small in relation to the time needed to download the 4.6 MB page.
I can only think of two ways of improving performance:
- multithreading - multiple curl_easy_perform calls, each in its own thread. Not a great solution: I would need 10-12 threads to saturate my relatively slow channel, and for 100 Mbps I'd need about 30 threads - too many, in my opinion; I'd spend more time synchronizing them than executing them;
- downloading multiple fragments simultaneously - if the server sends an Accept-Ranges header, the download could be split into several independent sections. Is there support for this feature in libcurl?
Please excuse my (most likely) silly question - I'm new to libcurl and I'm sure I'm missing something. If you can think of a way to squeeze more performance out of libcurl, please advise.

Thanks in advance!

Received on 2011-10-10