RE: Memory problem with Windows

From: Robert Iakobashvili <roberti_at_Go-WLAN.com>
Date: Thu, 2 Dec 2004 09:56:21 +0200

Dear Gonzalo,

Nice hearing from you once more.

My guess is that at some stage you are re-initializing the multi handle for a new
cycle (with other easy handles, or with re-initialized easy handles) and calling
curl_multi_cleanup() more than once in the program. There are some bugs in the curl
list and hash code (libcurl is a great tool, thanks to its developers) that I have
had no time to debug, and you can end up with weird memory corruption.
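
Just to illustrate what I mean, a minimal sketch of the suspected pattern (the names
here are invented for illustration, not taken from your program):

    // (Sketch with invented names.) Suspected problem: the multi handle is
    // torn down and re-created, so curl_multi_cleanup() runs on every cycle.
    for (cycle = 0; cycle < cycles_number ; cycle++)
    {
        CURLM* multi_handle = curl_multi_init () ;
        // ... add the easy handles, perform the transfers ...
        curl_multi_cleanup (multi_handle) ;  // cleanup inside the loop
    }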

To bypass the problem, you may instead wish to do something like this:

    for (cycle = 0; cycle < bd->cycles_number_ ; cycle++)
    {
        for (url = 0; url < bd->urls_number_ ; url++)
        {
            fprintf (stderr, "\n\"%s\" - user_activity_simulation () - cycle %ld of getting url %ld .\n\n",
                     bd->batch_name_, cycle, url) ;

            // Remove all easy handles from the multi handle, so that each one
            // can be re-initialized with the new http url.
            //
            for (k = 0 ; k < bd->cl_number_ ; k++)
            {
                curl_multi_remove_handle (bd->multi_handle_, bd->cl_handles_array_[k]) ;
            }

            // Reset each easy handle, re-initialize it and add it back,
            // one by one, to the now empty multi handle.
            //
            for (k = 0 ; k < bd->cl_number_ ; k++)
            {
                curl_easy_reset (bd->cl_handles_array_[k]) ; // the reset may actually be omitted in most cases
                single_handle_setup (bd->cl_handles_array_[k],
                                     bd->ip_addresses_[k],
                                     bd->url_array_[url],
                                     &cdata[k],
                                     cycle,
                                     err_buff) ;

                curl_multi_add_handle (bd->multi_handle_, bd->cl_handles_array_[k]) ;
            }
            
            if ( mget_url (bd->multi_handle_,
                           bd->url_completion_timeout_[url],
                           bd->batch_name_) == -1)
            {
                fprintf (stderr, "user_activity_simulation () - mget_url() failed for the cycle %ld of getting url %ld .\n", cycle, url) ;
                return -1;
            }
            fprintf (stderr, "\nuser_activity_simulation () - sleeping after cycle %ld of getting url %ld.\n\n", cycle, url) ;
            sleep (bd->url_inter_sleeping_time_[url]);
        }
    }
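
mget_url() above is a helper of mine, not a libcurl function. Roughly, it drives
the transfers with the classic curl_multi_perform() / curl_multi_fdset() / select()
loop, something like the simplified sketch below (the signature is only guessed
from the call above, and the error handling is stripped down):

    // Simplified sketch of a driving loop like mget_url(); needs
    // <curl/curl.h>, <sys/select.h>, <time.h> and <stdio.h>.
    static int mget_url (CURLM* multi_handle, long timeout, const char* batch_name)
    {
        int still_running = 0 ;
        time_t deadline = time (0) + timeout ;

        // Kick the transfers off until libcurl no longer asks to be re-called.
        while (curl_multi_perform (multi_handle, &still_running) ==
               CURLM_CALL_MULTI_PERFORM)
            ;

        while (still_running)
        {
            fd_set rd, wr, ex ;
            int max_fd = -1 ;
            struct timeval tv = { 1, 0 } ;  // wake up at least once a second

            if (time (0) >= deadline)
            {
                fprintf (stderr, "\"%s\" - mget_url () - timed out.\n", batch_name) ;
                return -1 ;
            }

            FD_ZERO (&rd) ; FD_ZERO (&wr) ; FD_ZERO (&ex) ;
            curl_multi_fdset (multi_handle, &rd, &wr, &ex, &max_fd) ;

            if (select (max_fd + 1, &rd, &wr, &ex, &tv) == -1)
                return -1 ;

            while (curl_multi_perform (multi_handle, &still_running) ==
                   CURLM_CALL_MULTI_PERFORM)
                ;
        }
        return 0 ;
    }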

Hope that it helps. In general, try to delete/cleanup as little as you can in your program and instead re-use the handles.
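
The whole lifecycle then looks roughly like this (a sketch with invented names, not
taken from your program):

    curl_global_init (CURL_GLOBAL_ALL) ;        // once, at program start

    CURLM* multi_handle = curl_multi_init () ;  // once
    CURL* easy_handle = curl_easy_init () ;     // once per simulated client

    // Per cycle: remove, reset and re-add the very same handles.
    curl_easy_reset (easy_handle) ;
    curl_easy_setopt (easy_handle, CURLOPT_URL, "http://example.com/") ;
    curl_multi_add_handle (multi_handle, easy_handle) ;
    // ... perform the transfers, then curl_multi_remove_handle () again ...

    curl_easy_cleanup (easy_handle) ;           // once, at program end
    curl_multi_cleanup (multi_handle) ;         // once
    curl_global_cleanup () ;                    // once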

Sincerely,
Robert Iakobashvili, Esq
roberti_at_go-wlan.com

Date: Wed, 1 Dec 2004 15:04:45 -0300
From: "Gonzalo Diethelm" <gonzalo.diethelm_at_aditiva.com>
Subject: RE: Memory problem with Windows
To: "'libcurl development'" <curl-library_at_cool.haxx.se>
Message-ID: <006c01c4d7d0$417850f0$7a0aa8c0_at_laptopgonzo>
Content-Type: text/plain; charset="us-ascii"

> > The symptom: the program is killed when calling curl_easy_cleanup(). When
> > run inside MSVC's debugger, I get the following stack trace:
>
> I can't say that helped me much...

Hey, thanks for looking anyway. I have compiled libcurl with debugging
information; here is the (relevant) stack trace:

ConnectionKillOne(SessionHandle * 0x00ab4250) line 1677 + 24 bytes
Curl_close(SessionHandle * 0x00ab4250) line 206 + 9 bytes
curl_easy_cleanup(void * 0x00ab4250) line 394 + 9 bytes

Inside ConnectionKillOne() it dies when executing this line of code:

  for(i=0; i< data->state.numconnects; i++) {

The data variable is corrupted; it doesn't show any good values at all.

> valgrind is actually a better tool for linux and
> memory-related problems.

I did try valgrind; it only shows one problem, which I saw had already been reported
on the mailing list here:

http://marc.theaimsgroup.com/?l=sqlite-users&m=108818352518443&w=2

That, it seems to me, is not a real memory problem: the call to write() always
writes 1024 bytes (the page size), but the piece of memory the bytes are taken from
is smaller than that, so garbage bytes also get written to the file; those garbage
bytes are probably ignored anyway.

> > Anybody has any suggestions? Are there any known memory-related bugs in
> > libcurl under Windows?
>
> in libcurl 7.12.2: none.

How about any DLL-related problems? I just noticed the recommendation to use
sqlite_freemem() instead of free() for any error strings returned by SQLite; this is
(I guess) related to allocating memory in one DLL (SQLite) and freeing it in another
module. Unfortunately, that change did not fix my symptoms... Could this kind of DLL
memory issue surface anywhere else in SQLite?
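
What I mean is the classic Windows failure mode: every module linked against its own
(especially static) C runtime gets its own heap, so memory malloc()ed inside one
module must be freed by that same module. A contrived sketch of the hazard
(dll_make_string() is an invented export, not a SQLite or libcurl function):

    /* inside some.dll, linked against its own C runtime */
    #include <stdlib.h>
    #include <string.h>

    __declspec(dllexport) char* dll_make_string (void)
    {
        char* s = malloc (6) ;   /* allocated on the DLL's heap */
        strcpy (s, "hello") ;
        return s ;
    }

    /* inside the .exe, linked against a different C runtime */
    extern char* dll_make_string (void) ;

    int main (void)
    {
        char* s = dll_make_string () ;
        free (s) ;  /* frees on the exe's heap: corruption or crash */
        return 0 ;
    }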

Thanks for any information. Regards,


--
Gonzalo Diethelm
gonzalo.diethelm_at_aditiva.com

Received on 2004-12-02