cURL / Mailing Lists / curl-library / Single Mail


Re: PHP/cURL: intensive calls send exception without caching connection resource

From: Stéphane HULARD <>
Date: Mon, 13 Jan 2014 09:49:55 +0100


Thanks for your answer.

Reading your questions, I realize my problem description was unclear about the environment and about how we use curl.

First, I need to clarify one point: I did not know that your bug tracker is on SourceForge. I searched for my problem in the Guzzle GitHub issues and in general Stack Overflow curl questions (surely not the right places…). In those repositories I did not find any similar problem, so I thought it would be better to ask the low-level tool's authors...
I also checked the curl issues today, but I did not find similar reports either...

The tool makes between 500 and 10000 requests per second during our crawl, and the request count tends to grow quickly.
About 90% of the HTTP calls go to a local server, and 10% go to various URLs on 1-10 remote servers (most are HTTP, but some can be FTP or HTTPS).

We do not use simultaneous requests; we work synchronously.
I am not sure what I should understand by "file descriptors". Do you mean the curl handles initialized inside the process? If so, we use a different curl handle for each request (that is what the Guzzle library does), but we rewrote an HTTP client that caches handles to reuse them as much as possible.

I searched the web for my error and found that it can occur when the server cannot allow a new PHP curl_init call because there are not enough available "processes" for a network call… I do not know exactly what kind of process that is, but this is what "saturated" means to me.

Finally, I understand that you recommend reusing curl handles, but with PHP this is not really possible because (before PHP 5.5) we cannot reset a curl handle's configuration. We wrote a piece of code that prevents the error from appearing, but it does not scale to many different curl handle configurations.

Kind regards


Stéphane HULARD
Directeur technique
CH Studio - Collectif digital
2 rue d'Italie - 38490 Les Abrets
Phone: +33 (0)6 18 18 65 04

On Sun, 29 Dec 2013, Daniel Stenberg wrote:

> Last week, I faced a problem which has not really been covered by
> the community (Stack Overflow, GitHub issues or other resources).

Let me first just mention that our bug tracker is on SourceForge; Stack
Overflow and GitHub issues are not places with any official affiliation
with the curl project. Just be aware of that when reading advice or answers you
find out there in the wild.

> In our application we make an insane number of HTTP requests to manipulate
> data (about 500 to 10000 requests/second). Until last week, we used a
> server which was not really powerful, so our requests were not too fast, but now
> our application is deployed on a powerful server and we are facing connection
> problems.

So if the "500 to 10000 requests/second" speed is the slow one, how fast is
the fast speed? Are you talking only HTTP, or HTTPS as well? Are you
communicating with one, a few or millions of servers?

> I see the error (our elastic search server is Failed to
> connect to Cannot assign requested address

How many simultaneous transfers do you have? How many file descriptors do
you allow for each process?

> After some research, I've found that we see this error when we have saturated
> the connection possibilities of our server.

What does "saturate" mean more specifically? I assume "our server" is actually
your term for the client that uses libcurl?

> I've also found that to prevent this error we should reuse our cURL resource,
> and then it disappears.

When using libcurl, you should always re-use handles as much as possible for
many reasons. Performance is one.

List admin: 
Received on 2014-01-13