curl-library

Re: curling password protected website with hidden dynamic variable on the login page

From: Ralph Mitchell <ralphmitchell_at_gmail.com>
Date: Wed, 3 Feb 2010 22:18:58 -0500

On Wed, Feb 3, 2010 at 9:57 PM, Ralph Mitchell <ralphmitchell_at_gmail.com> wrote:

> On Wed, Feb 3, 2010 at 2:10 PM, Maximilian Rausch <maxrausch_at_gmail.com> wrote:
>
>> I need to curl data that is on a password-protected site, and I am
>> first trying to get past the login page so that I can store the cookies
>> that will let me reach the password-protected data with a second call
>> to curl. The login webpage is
>>
>> https://secure.ngx.com/sso/login
>>
>> However, when the submit button is clicked on the login page it posts a
>> hidden variable "lt" which changes every time the page reloads (so you
>> can't curl the login page once, parse it to find the value, and then
>> post it the next time). You can see this by taking a look at the source
>> code of the page. I have tried a live HTTP replay (a feature of the
>> Live HTTP Headers plugin for Firefox), but the replay does not
>> successfully log in, I think because the new page has a different value
>> for "lt" while the replay is posting the old value.
>>
>> I cannot figure out how to get the value of 'lt' the first time I curl
>> the page so that I can post it along with the username, password, and
>> other necessary variables.
>>
>
> If you save that first page, you can grep out all the input fields,
> including the hidden ones:
>
> grep -i '<input' home.html
>
> You'll want to leave out the commented-out "warn" field and the reset
> button.
>
> You can also extract the form using Daniel's formfind.pl program, which
> you can find here:
>
>
> http://curl.haxx.se/cvs.cgi/curl/perl/contrib/formfind?view=markup&content-type=text/vnd.viewcvs-markup&revision=HEAD
>
> That'll give you the correct form elements and also the action URL to
> post to.
>
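
For the extraction step described in the quoted reply above, a rough C
sketch could dig the hidden "lt" value out of the saved page, something
like the following. It assumes the tag looks roughly like
<input type="hidden" name="lt" value="..."> with the value attribute after
the name attribute, and the helper name is just made up for illustration;
formfind or a real HTML parser is more robust than this strstr approach.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Very rough extraction of the "lt" hidden field value.  Assumes the page
   contains something like: <input type="hidden" name="lt" value="LT-..." />
   with value="..." appearing after name="lt". */
static char *find_lt_value(const char *html)
{
    const char *p = strstr(html, "name=\"lt\"");
    const char *end;
    char *val;
    size_t len;

    if (!p)
        return NULL;
    p = strstr(p, "value=\"");
    if (!p)
        return NULL;
    p += strlen("value=\"");
    end = strchr(p, '"');
    if (!end)
        return NULL;
    len = (size_t)(end - p);
    val = malloc(len + 1);
    if (val) {
        memcpy(val, p, len);
        val[len] = '\0';
    }
    return val;
}

int main(void)
{
    /* stand-in for the downloaded login page */
    const char *page =
        "<form action=\"/sso/login\">"
        "<input type=\"hidden\" name=\"lt\" value=\"LT-123-example\" />"
        "</form>";
    char *lt = find_lt_value(page);

    printf("lt = %s\n", lt ? lt : "(not found)");
    free(lt);
    return 0;
}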

Heh - I should have looked harder at the email headers. You don't want a
command-line solution. What are you doing with the downloaded page? Are
you saving it somewhere? If not, take a look at the getinmemory.c program:

     http://curl.haxx.se/libcurl/c/getinmemory.html

That shows you how to save the downloaded page in a buffer for further
processing.
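
As a rough illustration only (not the exact getinmemory.c code), the same
write-callback idea wired up to that login URL might look like the sketch
below. The cookie engine is enabled so the session cookies from the first
request get reused for the POST; the field names in the POST string
(username, password, lt) are guesses and have to be taken from the real
form, as does the action URL for the second request.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

struct MemoryStruct {
    char *memory;
    size_t size;
};

/* write callback: append received data to a growing buffer, the same idea
   as in the getinmemory.c example */
static size_t WriteMemoryCallback(void *contents, size_t size,
                                  size_t nmemb, void *userp)
{
    size_t realsize = size * nmemb;
    struct MemoryStruct *mem = (struct MemoryStruct *)userp;
    char *ptr = realloc(mem->memory, mem->size + realsize + 1);

    if (!ptr)
        return 0;  /* out of memory aborts the transfer */
    mem->memory = ptr;
    memcpy(&(mem->memory[mem->size]), contents, realsize);
    mem->size += realsize;
    mem->memory[mem->size] = 0;
    return realsize;
}

int main(void)
{
    struct MemoryStruct chunk;
    CURL *curl;

    chunk.memory = malloc(1);
    chunk.size = 0;

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if (curl) {
        /* step 1: fetch the login page into memory, cookie engine on */
        curl_easy_setopt(curl, CURLOPT_URL,
                         "https://secure.ngx.com/sso/login");
        curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)&chunk);

        if (curl_easy_perform(curl) == CURLE_OK) {
            /* chunk.memory now holds the page; search it for the hidden
               "lt" value (e.g. with strstr, as in the earlier sketch) */

            /* step 2: POST the login form on the same handle so the
               cookies are reused.  Set CURLOPT_URL to the form's action
               URL if it differs from the page URL; the field names and
               the lt value below are placeholders. */
            curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
                 "username=me&password=secret&lt=VALUE-FROM-STEP-1");
            curl_easy_perform(curl);
        }
        curl_easy_cleanup(curl);
    }
    free(chunk.memory);
    curl_global_cleanup();
    return 0;
}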

Ralph Mitchell
