
curl-users

Re: Re: Curl command line and libcurl are much slower than wget

From: curl <curl_at_wangzw.org>
Date: Fri, 13 Jan 2012 18:14:15 +0800 (GMT+08:00)

Hi

Thanks for your suggestion.

I also tried getting the socket with curl_easy_getinfo(handle, CURLINFO_LASTSOCKET, &sock);
and then reading from it directly with the read() system call, but the speed
stayed the same as before, about 12 MByte/s.
So I think the problem may be in the connect stage.
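
For reference, a minimal sketch of what I tried (hostname, port and the
request line are placeholders; error checking omitted):

  #include <curl/curl.h>
  #include <string.h>
  #include <unistd.h>

  int main(void)
  {
    const char *req = "GET /file HTTP/1.1\r\nHost: hostname:port\r\n\r\n";
    char buf[65536];
    long sock;
    size_t sent;
    ssize_t n;

    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "http://hostname:port/file");
    curl_easy_setopt(curl, CURLOPT_CONNECT_ONLY, 1L);
    curl_easy_perform(curl);                /* connect only, then return */

    /* get the underlying socket from libcurl */
    curl_easy_getinfo(curl, CURLINFO_LASTSOCKET, &sock);

    /* send the HTTP request header myself */
    curl_easy_send(curl, req, strlen(req), &sent);

    /* then read the reply with the plain read() system call */
    while((n = read((int)sock, buf, sizeof(buf))) > 0)
      ;                                     /* discard, like > /dev/null */

    curl_easy_cleanup(curl);
    return 0;
  }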
 
 
----- Original Message -----
From: Dan Fandrich <dan_at_coneharvesters.com>
To: curl-users_at_cool.haxx.se

On Fri, Jan 13, 2012 at 05:33:14PM +0800, curl wrote:
> I use curl 7.23.1. The network is local ethernet network, 4 servers connect to
> the same switch via 1Gb network interface card.
> I use curl command line as following:
> curl -L "http://hosename:port/file" > /dev/null
> The file is very large, 10GB at least.

There shouldn't be any reason why this can't saturate the link. I haven't
done any benchmarking myself lately, so maybe some slowdown has crept into
the code recently.

> My requirement is just to read data from an HTTP server, reading only as
> much data as I want each time, possibly reading many times.
> So I use libcurl to connect to the HTTP server with the flag CURLOPT_CONNECT_ONLY,
> send the HTTP header myself, and then read data using curl_easy_recv whenever I
> want some data. That means I can treat a file on an HTTP server as a
> local sequential file and read from it whenever I need to.
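
(For concreteness, the read side of that approach looks roughly like the
sketch below, assuming the handle was connected with CURLOPT_CONNECT_ONLY
and the request header already sent:)

  #include <curl/curl.h>
  #include <sys/select.h>
  #include <sys/types.h>

  /* read up to 'len' bytes via curl_easy_recv, waiting on the socket
     whenever libcurl reports CURLE_AGAIN */
  static ssize_t read_some(CURL *curl, long sock, char *buf, size_t len)
  {
    size_t n = 0;
    CURLcode res;
    for(;;) {
      res = curl_easy_recv(curl, buf, len, &n);
      if(res != CURLE_AGAIN)
        break;
      /* no data yet: wait for the socket to become readable */
      fd_set rd;
      FD_ZERO(&rd);
      FD_SET((int)sock, &rd);
      select((int)sock + 1, &rd, NULL, NULL, NULL);
    }
    return (res == CURLE_OK) ? (ssize_t)n : -1;
  }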

If you do it that way, there's really no need to use libcurl at all. Just use
raw sockets and do away with the dependency. But it's not going to work right
in the general case with any HTTP/1.1 features in play, so you might as well
use libcurl the way it was designed: install a write callback function
and abort the transfer once it has received all the data it wants. You can also
use CURLOPT_RANGE to make more efficient use of the network if the server
supports it.
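
A minimal sketch of that approach (the URL and byte counts are only
examples; error checking omitted):

  #include <curl/curl.h>

  /* Returning fewer bytes than were passed in makes libcurl abort the
     transfer with CURLE_WRITE_ERROR, which is the intended way to stop
     once you have all the data you want. */
  static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userp)
  {
    size_t *remaining = (size_t *)userp;
    size_t bytes = size * nmemb;
    if(bytes >= *remaining) {
      /* a real program would consume the last *remaining bytes of ptr
         here before aborting */
      *remaining = 0;
      return 0;                       /* short return => abort transfer */
    }
    *remaining -= bytes;
    /* a real program would consume all of ptr here */
    return bytes;
  }

  int main(void)
  {
    size_t want = 1000000;            /* e.g. read only the first 1 MB */
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "http://hostname:port/file");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &want);
    /* or, when the server supports ranges, ask only for what's needed */
    curl_easy_setopt(curl, CURLOPT_RANGE, "0-999999");
    curl_easy_perform(curl);          /* CURLE_WRITE_ERROR if we aborted */
    curl_easy_cleanup(curl);
    return 0;
  }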

> The problem is curl command line and libcurl are both much slower than what I
> expected and wget.

I wouldn't be too worried if the curl_easy_recv approach doesn't provide full
speed as there hasn't been much call to optimize it, but certainly the normal
approach ought to be faster.

>>> Dan

-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-users
FAQ: http://curl.haxx.se/docs/faq.html
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2012-01-13