curl-library

Re: Using libcurl sending much more HTTP request in parallel

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Thu, 10 Aug 2017 00:33:00 +0200 (CEST)

On Wed, 9 Aug 2017, fan fan via curl-library wrote:

> In my application, I use libcurl to send 40 HTTP requests every
> second. In my tests this consumes a lot of CPU. My application looks
> like this:
>
> The main thread sends requests and uses epoll to listen for fds
> becoming readable or writable.
>
> while(running) {
> int ret = epoll_wait(epollFd, events, maxEventNum, 10); //10ms

Unconditionally returning every 10 milliseconds like this will make you spend
a lot of CPU in vain. Don't do that. libcurl tells you how long it wants you
to wait at most. There's no need to spin another lap when there's nothing for
libcurl to do.

Do you also call libcurl to tell it a timeout occurred when the 10
milliseconds lapse, or when do you do that?

> for (i = 0; i < ret; i++) {
> if (events[i].events & EPOLLIN) {
> curl_multi_socket_action // set CURL_CSELECT_IN
> curl_multi_info_read // for reading

I think you should move the curl_multi_info_read() call outside of the loop
that checks for activity, since it might not have anything to return and then
it's a bit of a waste to call it up to 40 times.

But sure, curl_multi_socket_action() may of course consume CPU since that's
the function that will handle the traffic on the socket you give it (and
possibly on a few others if necessary).

> In my tests, sending 100, 200, 300, 400 or 500 requests in parallel
> can consume 100% CPU.

On modern hardware that shouldn't happen.

> I found there are too many EPOLLIN events; this consumes a lot of CPU.

What do you mean by too many EPOLLIN events? If libcurl tells you to wait for
them, shouldn't you be glad you get them so that you can download the data?

> 1. Am I using libcurl wrong?

Inefficiently, at least. But you also didn't show the whole program, so it's
not possible to tell the whole story.

> 2. Are there any performance measurements for libcurl processing
> requests in parallel?

It should easily handle thousands of parallel connections, CPU-wise. Very
likely tens of thousands too. Of course they may have to compete with each
other for other resources (like bandwidth).

-- 
  / daniel.haxx.se
-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette:   https://curl.haxx.se/mail/etiquette.html
Received on 2017-08-10