Re: Many CLOSE_WAIT when handling lots of URLs
Date: Mon, 6 Jan 2014 06:13:01 +0000
From: curl-library [mailto:curl-library-bounces_at_cool.haxx.se] On Behalf Of Daniel Stenberg
Sent: Friday, November 29, 2013 05:47
To: libcurl development
Subject: RE: Many CLOSE_WAIT when handling lots of URLs
>Yes, you need to debug it. Instrument the code to figure out what makes it happen, and then dig into what you can do about it. Or work on writing a recipe/source code that repeats the problem that you can share with us and then we can help out with the debugging!
>Figure out which connections that aren't properly closed and then track back why they end up like that and what the proper handling of them should've been!
I tried modifying some suspicious code (around commit d021f2e8a0067fc769652f27afec9024c0d02b3d) to fix this issue, but failed.
After that, I worked on writing a program to reproduce it. The attached program is based on the official hiperfifo.c example. After compiling it and running it for about 10-20 minutes, you will see many CLOSE_WAIT sockets on the machine.
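One way to watch the leak build up while the reproducer runs is to count CLOSE_WAIT sockets periodically; a minimal sketch (assuming a Linux/BSD-style netstat is available):

```shell
# Spot-check while the reproducer runs: count sockets stuck in
# CLOSE_WAIT. On Linux with iproute2, "ss -tan state close-wait"
# gives the same information.
netstat -tan | grep -c CLOSE_WAIT
```

If the count grows steadily instead of settling, the connections are not being closed by the application.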
Each round, the program GETs 200 URLs taken from a list of 10000 URLs.
NOTE: In my test environment, all 10000 URLs resolve to my internal web server rather than to real-world web servers. So to reproduce the issue you may need to set up an internal web server, make all the URLs resolve to it, place some files on it, and change the URL paths in my program.
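One hypothetical way to point many test hostnames at a single internal server without touching DNS is to generate /etc/hosts entries; a minimal sketch (the 10.0.0.5 address and host*.test names are assumptions, adjust to your setup):

```shell
# Generate /etc/hosts lines mapping 10000 test hostnames to an
# internal web server at 10.0.0.5. Review the output, then append
# it to /etc/hosts on the test machine.
for i in $(seq -f '%05g' 1 10000); do
  echo "10.0.0.5 host$i.test"
done > hosts.extra
```

Alternatively, libcurl itself can override resolution per handle with CURLOPT_RESOLVE, which avoids editing system files.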
- application/x-gzip attachment: hiperfifo.tar.gz