Re: Help for an Uber noob. Very basic

From: G. T. Stresen-Reuter <tedmasterweb_at_gmail.com>
Date: Mon, 12 Jan 2009 18:41:49 +0000

On Jan 12, 2009, at 5:45 PM, Victor wrote:

> Hi there.

Hi. Good start on the list. Don't be put off by the feedback you
receive. Email communication loses a lot of context and non-verbal
communication (facial expressions, tone, pauses, etc.). That said,
here is some feedback you might find useful.

> There are a list of images I would like to download. But I do not
> know the scripting languages necessary nor can I access the
> directory storing the images. I can however access each image at a
> time.
>
> For example I can display http://picayune.uclick.com/comics/crrub/2009/crrub090112.gif
> but not http://picayune.uclick.com/comics/crrub/2009/ .

It sounds like you are looking for a tool that collects all the
images on a page into a folder on your hard drive. Although cURL can
be used as part of such a workflow, that functionality is not built
into cURL. cURL limits its activity to /transferring/ files. It
cannot parse a web page to determine which image files it references
and THEN transfer them. There are other tools on the web, though,
that can do that...

The other point here is that cURL probably is retrieving the second
URL above, but only the HTML (as one would expect) and not any of the
images or other page components referenced in the HTML (such as
style sheets and JavaScript libraries).
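If you do want to pull the image URLs out of a page's HTML yourself, a rough shell sketch is below. This is only an illustration, not something built into cURL: the grep/sed pattern assumes the page uses absolute URLs in plain src="..." attributes, and real HTML is better handled by a proper parser.

```shell
# Sketch: fetch a page, extract absolute .gif URLs from src="..."
# attributes, then hand each one to curl -O so the remote file
# name is kept. http://example.com/page.html is a placeholder.
curl -s http://example.com/page.html \
  | grep -o 'src="http[^"]*\.gif"' \
  | sed 's/^src="//; s/"$//' \
  | while read -r url; do
      curl -O "$url"
    done
```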

> I have read what I thought was relevant in the MAN pages but it did
> not seem to work. Obviously I am doing something wrong.
>
> I have managed to download the above image (the .gif) but it names
> itself after the ubuntu directory even if I use the --O option.
> Here is my sample code:
>
> victor@victor-desktop:~$ curl --O "victor@victor-desktop" http://picayune.uclick.com/comics/crrub/2009/crrub090112.gif
>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
> 100 50002  100 50002    0     0  34977      0  0:00:01  0:00:01 --:--:-- 39983
>
> This downloads it but the file is called victor_at_victor-desktop .
> Obviously this wont work if I want to download 100 of them.

When using options consisting of one letter, such as "O", you only
need one hyphen, not two.

I don't think you need the "victor@victor-desktop" in the command, so
try leaving it out:

curl -O http://picayune.uclick.com/comics/crrub/2009/crrub090112.gif
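As for downloading 100 of them: curl can expand a numeric range inside a URL on its own (its "URL globbing"), and a plain shell loop works too. A sketch for the January 2009 strips follows; the [01-12] range and the loop bounds are just examples to adjust:

```shell
# Option 1: let curl expand the range itself. With -O, each
# expanded URL is saved under its own remote file name.
#   curl -O "http://picayune.uclick.com/comics/crrub/2009/crrub0901[01-12].gif"

# Option 2: build the URLs in a loop (seq -w zero-pads the day)
# and fetch each one. Here we just print them for illustration.
for day in $(seq -w 1 12); do
  echo "http://picayune.uclick.com/comics/crrub/2009/crrub0901${day}.gif"
  # curl -O "http://picayune.uclick.com/comics/crrub/2009/crrub0901${day}.gif"
done
```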

> So then I tried this:
>
> victor_at_victor-desktop:~$ curl --O "victor_at_victor-desktop"/--remote-name http://picayune.uclick.com/comics/crrub/2009/crrub090112.gif
> Warning: Failed to create the file victor_at_victor-desktop/--remote-name
>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
>   2 50002    2  1062    0     0   5277      0  0:00:09 --:--:--  0:00:09  5277
> curl: (23) Failed writing body (0 != 1062)

In this case, there is a slash (/) preceding the two hyphens of
--remote-name. That definitely won't work, so remove it and make sure
there is a space preceding the two hyphens instead. Also, note that
--remote-name is simply the long form of -O, so you don't need both;
use one or the other.

Try the command above and see if it does what you need. Good luck.

Ted Stresen-Reuter
http://tedmasterweb.com

-------------------------------------------------------------------
List admin: http://cool.haxx.se/cgi-bin/mailman/listinfo/curl-users
FAQ: http://curl.haxx.se/docs/faq.html
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2009-01-12