
curl-library

RE: Cookies..

From: Lorenzo Pastrana <pastrana_at_ultraflat.net>
Date: Mon, 16 Sep 2002 11:33:13 +0200

Well, it seems pretty easy (at first sight).
Would something like this suffice?

void curl_easy_flushcookies(CURL *curl)
{
  struct SessionHandle *data = (struct SessionHandle *)curl;

  /* pasted from Curl_close */
  if(data->set.cookiejar)
    /* we have a "destination" for all the cookies to get dumped to */
    Curl_cookie_output(data->cookies, data->set.cookiejar);

  Curl_cookie_cleanup(data->cookies);
  data->cookies = NULL; /* don't leave a dangling pointer in the live handle */
}

That would imply that a new explicit cookie initialisation (cookiejar and/or
cookiefile) is needed before further use of the handle. But that is precisely
the point: decoupling cookie management from handle lifetime.

Some impressions?

Thanks.
Lo.

> -----Original Message-----
> From: curl-library-admin_at_lists.sourceforge.net
> [mailto:curl-library-admin_at_lists.sourceforge.net]On Behalf Of Lorenzo
> Pastrana
> Sent: lundi 16 septembre 2002 11:10
> To: curl-library_at_lists.sourceforge.net
> Subject: RE: Cookies..
>
>
>
> > The file is only written when you *cleanup() the handle. Until then, the
> > cookies are kept in memory only.
> >
> > > Actually I'm reusing the handles for multiple URLs so I won't call
> > > curl_easy_cleanup between them, is that the reason? If it is, how can
> > > I use cookies and persistent connections at the same time?
> >
> > The cookies are used when CURLOPT_COOKIEJAR is set, it just doesn't
> > create the file until you cleanup.
> >
> > > Is there a way to force the cookie dump other than curl_easy_cleanup ?
> >
> > No. I don't see the point in doing that.
>
> Well, my test scenario is a web spider.
> I'm actually in a multi interface setup, where I always keep a max of
> MAX_CONNECTS handles living in a FIFO list.
> When a URL comes in for processing I pop a handle from the top of the
> FIFO (if empty, get a new handle), set up a cookie file according to the
> URL, and process it.
> When done, I push the handle back on top of the FIFO; if the FIFO gets
> larger than MAX_CONNECTS then I pop the bottom handles and clean them up.
> This algo is what I thought was the simplest way to take advantage of
> persistent connections.
> But regarding cookies, my problem here is that I can't have a single BIG
> cookiejar for all URLs, since what I would always get is the last
> cleaned-up handle's cookies overwriting all previously set data when
> popping the bottom handles... So I figured out that one possible solution
> would be to have one cookie file per URL, but then I'd need a way to
> flush cookie data on completion of each request, which seems to be
> impossible at the moment.
> What is currently going on is that I get various unpredictable cookie
> collections in the cookiejars, depending on what & how many URLs have
> been managed by the handle.
> I'll look in the curl_easy_cleanup code to see if I can't export a
> curl_easy_flush_cookie from there... But please tell me if I'm completely
> off my track here..
>
> Cheers.
> Lo.
>

-------------------------------------------------------
This sf.net email is sponsored by:ThinkGeek
Welcome to geek heaven.
http://thinkgeek.com/sf
Received on 2002-09-16