RE: Multithreading Problems (CentOS 5.1)

From: Daniel Stenberg <>
Date: Fri, 20 Jun 2008 13:34:21 +0200 (CEST)

On Fri, 20 Jun 2008, Nick George wrote:

>> Using c-ares? What libcurl version? Which Linux kernel?
> Kernel Version: 2.6.18-53.1.21.el5 SMP x86_64 (CentOS 5.1)
> cURL version 7.15.5 Vendor: CentOS Release 2.el5 Arch: x86_64
> I'm not using c-ares at all.

More modern kernels are faster. More modern libcurl releases are better.
Possibly you're using an older glibc too...

> I've looked a bit further into the issue. If I comment out the
> curl_easy_perform() call, the code runs VERY quick (probably because there's
> nothing to do), so it seems that the cURL code must be calling some
> library functions that end up calling futex().

Most probably, yes. It would be interesting to know which function that is. Do
you see the exact same futex behavior with more recent libcurls?

> I've tried the same code on Ubuntu 8.04 (kernel 2.6.24-16-generic i686) with
> the latest stable version of libcurl compiled from source. I'm running into
> the same issue. I can't run more than 380 threads at the same time, but it's
> enough to see that it pauses for a number of seconds on calls to futex().

I've seen similar things happen when the wrong driver was loaded for the
(SATA) drive so it got dead slow and everything ended up waiting for disk i/o.

I'm not saying that's what you see, only that "waiting for a futex" is such a
generic error that it is impossible to tell what it is. I also don't think it
is a libcurl flaw.

> I ended up giving up on the multithreaded approach. I was having other
> problems with my i386 box not being able to create more than 300 threads
> anyway. So, instead I tried extending the multi-app.c example to perform the
> same tests. It works fine up to about 150 threads, where performance starts
> to degrade massively.

Those aren't "threads", but just individual transfers.

> This time, the call to select() is pausing for many seconds at a time. I
> can't win! Right now, I'm at a bit of a loss to explain what's going on.

I wouldn't advise anyone to use select() with more than perhaps 50 file
descriptors. If you want to use more you should really consider the
curl_multi_socket() API instead as that is designed for (and proved to
provide) high performance with many simultaneous transfers.

BTW, what protocol(s) are you using? And what's the nature of the URLs you
use? Only local server(s)?
