
curl-users Mailing List Archives

Re: Building recursive list of URLs, no page downloading

From: Bob Basques <>
Date: Wed, 29 Jan 2003 17:17:32 -0600

If you have access to the server(s) and they are running *nix, you could
use the command-line utilities to sift through all the text on each page.
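A minimal sketch of that command-line approach, assuming the pages are saved locally (the filename and HTML below are made up for illustration): grep pulls out the href attributes and sed strips them down to bare URLs.

```shell
# Create a stand-in for a page saved from the server (hypothetical content).
cat > page.html <<'EOF'
<a href="/docs/index.html">Docs</a>
<a href="http://example.com/about.html">About</a>
EOF

# Extract the href values, strip the attribute syntax, de-duplicate.
grep -o 'href="[^"]*"' page.html | sed 's/^href="//;s/"$//' | sort -u
```

Running this over every saved page and collecting the output would give the list of linked URLs without re-downloading anything.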

bobb wrote:

>I need to spider thousands of URLs on our company's websites to see how
>many URLs there are ahead of the move to our new CMS.
>Access to some sections of the websites is user/pass restricted, and
>authentication is performed through cookies, not standard HTTP auth, so it
>is essential that I can load cookies into this program.
>Also, I do not need to actually download each URL, just note it and
>move on to the next URL linked from the first page.
>Not sure whether there is a way in curl, or perhaps wget?
>Thanks for any suggestions,
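For the cookie and no-download parts of the question, one possible sketch (the cookie file contents, hostnames, and paths here are invented placeholders): curl's `-b` reads a Netscape-format cookie file and `-I` issues a HEAD request so only headers come back, while wget's `--spider` with `-r` and `--load-cookies` covers the recursive traversal without saving pages.

```shell
# Write a hypothetical session cookie in Netscape cookie-file format
# (domain, include-subdomains, path, secure, expiry, name, value).
printf 'www.example.com\tFALSE\t/\tFALSE\t0\tsession\tabc123\n' > cookies.txt

# Against the real site, the commands would look like:
#   curl -s -I -b cookies.txt http://www.example.com/secure/page.html
#   wget -r --spider --load-cookies cookies.txt http://www.example.com/
```

This is only a sketch; whether the site accepts the replayed cookie depends on how its login sets it.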

Received on 2003-01-30

These mail archives are generated by hypermail.

