curl-library Mailing List Archives

Re: curl library and charset

From: Carlos Alvarez <citrouille_at_wanadoo.fr>
Date: Thu, 7 Apr 2005 11:02:39 +0200

(PS: Sorry, the subject has nothing to do with the thread's real issue; I was mistaken about the real crash reason and forgot to change the subject.)
  ----- Original Message -----
  From: Carlos Alvarez
  To: curl-library_at_cool.haxx.se
  Sent: Thursday, April 07, 2005 10:52 AM
  Subject: curl library and charset

  I am using a program that sends HTTP requests on multiple threads (the simplest way we can do it). The program seems to run fine, but it randomly segfaults on curl calls.

  Take a look at this gdb backtrace:

  Program received signal SIGSEGV, Segmentation fault.
  [Switching to Thread 13326 (LWP 2770)]
  0x400bb9a6 in addbyter (output=110, data=0xbd1feeec) at mprintf.c:989
  989 mprintf.c: No such file or directory.
          in mprintf.c
  (gdb) where
  #0 0x400bb9a6 in addbyter (output=110, data=0xbd1feeec) at mprintf.c:989
  #1 0x400bab16 in dprintf_formatf (data=0xbd1feeec,
      stream=0x400bb990 <addbyter>, format=0x400cd248 "name lookup timed out",
      ap_save=0xbd1fef44) at mprintf.c:645
  #2 0x400bba0f in curl_mvsnprintf (
      buffer=0x8188450 <Address 0x8188450 out of bounds>, maxlength=16384,
      format=0x400cd248 "name lookup timed out", ap_save=0xbd1fef44)
      at mprintf.c:1007
  #3 0x400af241 in Curl_failf (data=0x8188180,
      fmt=0x400cd248 "name lookup timed out") at sendf.c:160
  #4 0x400a870f in Curl_resolv (conn=0x402437af, hostname=0xbd1ff50c "",
      port=0, entry=0x0) at hostip.c:385

  What is wrong there?

  Here is my curl code for the request:

  //******* WEBPAGE Struct
  typedef struct{
      size_t pagesize;
      char *htmlcode;
  }WEBPAGE;

  //********************************************************************
  // gets downloaded data
  size_t _refGetPageFunc(void *ptr, size_t size, size_t nmemb, void *buffer)
  {
       size_t realsize = size * nmemb;
       WEBPAGE *mem = (WEBPAGE *)buffer;
       // note: the incoming data is NOT null-terminated, so no strlen(ptr) here
       char *grown = realloc(mem->htmlcode, mem->pagesize + realsize + 1);

       if(!grown)
       {
            printf("[Curl]error: out of memory\n");
            return 0; /* returning less than realsize makes curl abort the transfer */
       }

       mem->htmlcode = grown;
       memcpy(&(mem->htmlcode[mem->pagesize]), ptr, realsize);
       mem->pagesize += realsize;
       mem->htmlcode[mem->pagesize] = '\0';

       return realsize;
  }

  //********************************************************************
  // Frees a WEBPAGE structure
  void refFreeWebpage(WEBPAGE *data)
  {
       free(data->htmlcode); /* allocated with malloc/realloc, so plain free(), not curl_free() */
       free(data);
  }

  //********************************************************************
  // Gets a web page's html code with parameters
  // return value : the page text
  char *refGetHTTPPage(char *pagename, long timeout)
  {
       char errbuf[CURL_ERROR_SIZE+128];
       char *rcode;
       WEBPAGE *data;
       CURLcode res;
       CURL *curl = curl_easy_init();
       char ua[] = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)";

       if(pagename == NULL)
       {
            printf("[HTTP] missing page name. Certainly a bug\n");
            return NULL;
       }

       data = malloc(sizeof(*data));
       data->htmlcode = malloc(1);
       data->htmlcode[0] = '\0'; /* so strdup() below is safe even if nothing is received */
       data->pagesize = 0;

       if(curl)
       {
            curl_easy_setopt(curl, CURLOPT_URL, pagename);
            curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, _refGetPageFunc);
            curl_easy_setopt(curl, CURLOPT_WRITEDATA, data);
            curl_easy_setopt(curl, CURLOPT_TIMEOUT, timeout);
            curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, errbuf);
            curl_easy_setopt(curl, CURLOPT_USERAGENT, ua);
            curl_easy_setopt(curl, CURLOPT_DNS_CACHE_TIMEOUT, 60);
     
            res = curl_easy_perform(curl);
            curl_easy_cleanup(curl);

            if(res != CURLE_OK)
            {
                 printf("[HTTP]: URL [%s] failed\n[HTTP]: %s (code %i)\n", pagename, errbuf, res);
                 refFreeWebpage(data);
                 return NULL;
            }
       }
       else
       {
            printf("[HTTP] cURL was not initialized\n");
            refFreeWebpage(data);
            return NULL;
       }

       rcode = strdup(data->htmlcode);
       refFreeWebpage(data);

       return rcode;
  }
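
  One more thing I still have to try: since this runs in multiple threads and the crash happens in the name-lookup timeout path, maybe libcurl's signal-based timeout handling is involved. The usual advice for threaded programs is to turn that off. Untested sketch, to go with the other setopt calls above:

```c
/* untested: in threaded programs, keep libcurl from using signals
   for its DNS/timeout handling (CURLOPT_NOSIGNAL exists since libcurl 7.10) */
curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);
```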

  cya
Received on 2005-04-07
