cURL / Mailing Lists / curl-library / Single Mail

curl-library

[ curl-Bugs-903131 ] CURLINFO_EFFECTIVE_URL breaks on some URLs

From: SourceForge.net <noreply_at_sourceforge.net>
Date: Mon, 23 Feb 2004 17:13:57 -0800

Bugs item #903131, was opened at 2004-02-23 18:13
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=100976&aid=903131&group_id=976

Category: libcurl
Group: bad behaviour
Status: Open
Resolution: None
Priority: 5
Submitted By: Jeff Rodriguez (bigtangringo)
Assigned to: Daniel Stenberg (bagder)
Summary: CURLINFO_EFFECTIVE_URL breaks on some URLs

Initial Comment:
[jrod_at_linmon1 cutils]$ curl -V
curl 7.11.0 (i686-pc-linux-gnu) libcurl/7.11.0
OpenSSL/0.9.7c zlib/1.1.3
Protocols: ftp gopher telnet dict ldap http file https ftps
Features: SSL libz NTLM
[jrod_at_linmon1 cutils]$ uname -a
Linux linmon1 2.4.18-18.7.x #1 Wed Nov 13 20:29:30
EST 2002 i686 unknown

Basically I'm using:
/* curling here */
curl_easy_getinfo(structCurl->structConnection,
                  CURLINFO_EFFECTIVE_URL, &ptrCharUrl);
curl_easy_cleanup(structCurl->structConnection);

After this call, I malloc some space and copy the URL to
that space; the resulting copy, as well as the original,
is full of garbage characters.

This only occurs on some URLs, however. I'm relatively
sure it's not the function I'm using to copy the URL, since
I've tested it quite a bit and it works great on
everything else (even resulting cURL content).

Relevant CURL headers are:
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>

An example URL (use it verbatim):
http://www.godaddy.com

That broke, but:
https://www.godaddy.com

Did not break, let me know if I can provide any more
information.

----------------------------------------------------------------------

Received on 2004-02-24