curl-library

Re: Multi Thread problem

From: Chirag <chirag_at_cellcloud.com>
Date: Fri, 20 Sep 2002 22:48:09 +0530

Hi,

    The crash always happens on the same URL. But it is strange that for a
couple of hundred times it fetches the content from that URL without any
problem, and then all of a sudden it doesn't get any further than the lines
below (i.e. it shows connected, connection left alive, and then the last
thing I see is it closing the live connection). After curl_easy_perform() I
check the return value.

* Connected to 66.13.8.101 (66.13.8.101)
> GET /servlet/ICC?msisdn=919845156838&message=BUZZ%20C HTTP/1.1
Host: 66.13.8.101:8080
Pragma: no-cache
Accept: image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, */*

* Connection (#0) left alive
---------------> Result of the CURL PERFORM is 0
------>After CURL Perform
* Closing live connection (#0)

*Crashes* after this statement.
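
For context, this is roughly the per-thread call pattern being described; a
minimal sketch assuming one easy handle per thread and curl_global_init()
called once from the main thread before any workers start (the names are
illustrative, not the original code):

#include <stdio.h>
#include <curl/curl.h>

/* curl_global_init(CURL_GLOBAL_ALL) must be called once, before any worker
 * threads are created; it is not thread-safe. */

/* Each worker thread uses its own easy handle; handles are never shared. */
static void fetch_url(const char *url)
{
    CURL *curl = curl_easy_init();
    CURLcode res;

    if(!curl)
        return;

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);
    curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, 60L);

    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
        fprintf(stderr, "curl_easy_perform() returned %d\n", (int)res);

    curl_easy_cleanup(curl);
}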

I have not tried it without the connection timeout; I will do that now. The
problem is that the program runs for at least 1-2 hours before it comes down,
and since my program is live I can't experiment much. I have the same setup
on a Linux machine and there it keeps running for hours on end. Does this
behaviour have any relation to the memory leak my program has? I have not
been able to trace the leaks, so there is quite a lot of memory leaking. Is
there an upper limit to how much a process can leak before the Solaris OS
brings it down? I don't suspect the leak much, though, because whenever the
program comes down it is always right after curl_easy_perform().
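
If the body is collected with the usual grow-a-buffer write callback (as in
libcurl's getinmemory.c example), the per-request buffer has to be freed
after every transfer, or each request leaks its whole body. A minimal
sketch; the struct layout is only an assumption based on the chunk.memory
reference quoted further down:

#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

struct MemoryChunk {
    char *memory;
    size_t size;
};

/* Grow-a-buffer write callback; returning less than realsize aborts the
 * transfer, which is the right reaction to a failed realloc(). */
static size_t write_cb(void *ptr, size_t size, size_t nmemb, void *data)
{
    size_t realsize = size * nmemb;
    struct MemoryChunk *mem = (struct MemoryChunk *)data;
    char *grown = realloc(mem->memory, mem->size + realsize + 1);

    if(!grown)
        return 0;
    mem->memory = grown;
    memcpy(mem->memory + mem->size, ptr, realsize);
    mem->size += realsize;
    mem->memory[mem->size] = '\0';
    return realsize;
}

static void do_request(CURL *curl, const char *url)
{
    struct MemoryChunk chunk = { NULL, 0 };

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)&chunk);

    curl_easy_perform(curl);

    free(chunk.memory);   /* without this, every request leaks its body */
}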

Please help. The other problem is that whenever my cron job tries to bring
my program back up as a background process (after the crash), the program
never runs. I am using ./webber > /dev/null 2>&1 &. Is there any other way
to start a process in the background?
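
On the background-process question: besides launching with nohup so the
process survives the terminal going away, a long-running program can detach
itself with the classic fork/setsid daemonizing sequence. A minimal sketch,
not taken from the original program:

#include <fcntl.h>
#include <unistd.h>
#include <sys/stat.h>
#include <sys/types.h>

/* Classic daemonizing sequence: fork, start a new session, fork again,
 * then point stdin/stdout/stderr at /dev/null so the process no longer
 * depends on a controlling terminal. */
static int daemonize(void)
{
    int fd;
    pid_t pid = fork();

    if(pid < 0)
        return -1;
    if(pid > 0)
        _exit(0);              /* parent returns to cron/shell right away */

    if(setsid() < 0)           /* become session leader, drop the tty */
        return -1;

    pid = fork();              /* second fork: can never regain a tty */
    if(pid < 0)
        return -1;
    if(pid > 0)
        _exit(0);

    umask(0);
    if(chdir("/") != 0)
        return -1;

    fd = open("/dev/null", O_RDWR);
    if(fd >= 0) {
        dup2(fd, STDIN_FILENO);
        dup2(fd, STDOUT_FILENO);
        dup2(fd, STDERR_FILENO);
        if(fd > STDERR_FILENO)
            close(fd);
    }
    return 0;
}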

Hope to get some help soon.
Thanx in advance,
Chirag

----- Original Message -----
From: "Daniel Stenberg" <daniel_at_haxx.se>
To: "libcurl Mailing list" <curl-library_at_lists.sourceforge.net>
Sent: Friday, September 20, 2002 9:45 PM
Subject: Re: Multi Thread problem

> On Fri, 20 Sep 2002, Chirag wrote:
>
> You didn't mention version here, but I trust this is curl 7.9.8 we're
> talking about?
>
> > The problem is that this function is called by a number of threads (15
> > threads to be precise). The whole code runs fine for some time, but then
> > when there are, say, 5 requests to the same URL at the same time and it
> > tries to retrieve the content simultaneously, somewhere in between I get
> > a seg_fault.
>
> Does the crash always happen when several threads get the same URL?
>
> > I have put VERBOSE on, and here is where it always gives me a seg_fault...
>
> And if you run the program with a debugger, can you get it to stop on the
> offending line? If so, which one is it?
>
> > By default I am giving a timeout of 60 secs. I don't know what could be
> > the problem.
>
> That's only bad if the name lookup takes a long time, since aborting it is
> done with a signal and I don't know what a Solaris threaded program does
> with it.
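
(For reference: the signal-based DNS timeout described above is what
CURLOPT_NOSIGNAL, added in libcurl 7.10, is meant to address. A minimal
sketch of how a threaded program would use it:)

#include <curl/curl.h>

/* Available in libcurl 7.10 and later: tell libcurl not to install any
 * signal handlers (the alarm()-based DNS timeout is the usual culprit in
 * threaded programs). The trade-off is that a slow name lookup may then
 * run past the configured timeout. */
static void disable_curl_signals(CURL *curl)
{
    curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);
}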
>
> > I have tried all combinations, increasing the timeout from 5 secs to
> > 60 secs.
>
> What does it do without timeout?
>
> > Also I have tried it with 5 to 13 threads, but the crash is erratic. As
> > you see from the VERBOSE below... when two or more threads have
> > simultaneously got the content... my program goes down.
>
> No, I didn't really see that. When and where exactly does it crash?
>
> > http_body = (char *)strstr(chunk.memory, "\r\n\r\n");
>
> This approach really isn't necessary. You can pass in a different "chunk"
> pointer (or even a callback function) for the headers when they are passed
> to the callback.
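
(A sketch of the suggestion above: give the headers their own sink with
CURLOPT_HEADERFUNCTION / CURLOPT_WRITEHEADER so the body buffer never
contains headers and no strstr() for "\r\n\r\n" is needed; write_cb and
struct MemoryChunk are the illustrative ones from the earlier sketch:)

#include <curl/curl.h>

/* Route headers and body to separate buffers so the body callback only
 * ever sees body bytes. */
static void split_headers_and_body(CURL *curl,
                                   struct MemoryChunk *headers,
                                   struct MemoryChunk *body)
{
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)body);

    curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEHEADER, (void *)headers);
}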
>
> I found no obvious flaws in the code.
>
> --
> Daniel Stenberg -- curl related mails on curl related mailing lists please

Received on 2002-09-20