
curl-library Mailing List Archives

Re: Question: How to guarantee server data intact before resume download from it?

From: Dan Fandrich <dan_at_coneharvesters.com>
Date: Tue, 29 Nov 2005 09:47:25 -0800

On Wed, Nov 30, 2005 at 12:21:03AM +0800, yu kai wrote:
> I have some questions regarding libcurl resuming downloads over HTTP/FTP.
> 1. If a previous download was interrupted and the client calls libcurl to
> resume it, how can the client guarantee that the server file (on either an
> HTTP or FTP server) has not changed (the worst case being the same file
> size but different content) before resuming from the last successfully
> saved byte of the local copy? What's the recommended way? Is it different
> for HTTP and FTP downloads?

With HTTP, comparing the ETag, Last-Modified and Content-Length headers
should give you a fairly reliable indication of whether the content has
changed. The Content-MD5 header would give you an (almost) perfect
indication of whether the content has changed, but it's an optional header
and I've never actually seen it used in practice.
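To make that check concrete, here is a minimal sketch of the comparison
step (the function name and "skip missing headers" policy are my own
choices, not libcurl API): assume you saved the three validators from the
interrupted transfer, and fetched fresh ones with a header-only probe of
the same URL (CURLOPT_NOBODY plus a CURLOPT_HEADERFUNCTION callback).
Resume only if every validator that is available on both sides matches:

```c
#include <string.h>

/* Decide whether a resume looks safe, given the validators saved from the
 * interrupted transfer and fresh ones from a header-only probe of the same
 * URL. A NULL string or non-positive length means "header not available"
 * and is skipped rather than treated as a mismatch. Returns 1 if every
 * available validator matches, 0 otherwise. */
int safe_to_resume(const char *saved_etag, const char *fresh_etag,
                   const char *saved_lastmod, const char *fresh_lastmod,
                   long saved_length, long fresh_length)
{
    if (saved_etag && fresh_etag && strcmp(saved_etag, fresh_etag) != 0)
        return 0;
    if (saved_lastmod && fresh_lastmod &&
        strcmp(saved_lastmod, fresh_lastmod) != 0)
        return 0;
    if (saved_length > 0 && fresh_length > 0 && saved_length != fresh_length)
        return 0;
    return 1;
}
```

With HTTP you can also push the check onto the server: send the saved ETag
in an If-Range header on the ranged request, and a server that supports it
will return the whole new body instead of a bogus partial one if the file
has changed in the meantime.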

With FTP, you might be able to use one of the commands 'SITE EXEC md5sum
file', 'SITE CHECKSUM' or 'SITE CHECKMETHOD' to get an MD5 digest of a
file. These are non-standard, though.
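Since these commands are non-standard, both the exact command text and the
reply format depend entirely on the server. A sketch (helper names are
mine): build the command string you would put on a libcurl CURLOPT_QUOTE
list, then fish a 32-hex-digit digest out of whatever reply line the
server gives back — capturing that reply may itself require something like
CURLOPT_DEBUGFUNCTION, so treat this as a starting point only:

```c
#include <stdio.h>
#include <string.h>
#include <ctype.h>

/* Build the raw FTP command asking the server for an MD5 digest.
 * Which spelling works (if any) depends on the server. */
void build_checksum_cmd(char *buf, size_t buflen, const char *path)
{
    snprintf(buf, buflen, "SITE CHECKSUM %s", path);
}

/* Pull the first 32-character hex token out of a server reply line,
 * e.g. "213 0f343b0931126a20f133d67c2b018a3b file.bin" -> the digest.
 * The reply format here is hypothetical. digest must hold 33 bytes.
 * Returns 1 on success, 0 if no digest-looking token was found. */
int extract_md5(const char *reply, char *digest)
{
    const char *p = reply;
    while (*p) {
        size_t n = 0;
        while (isxdigit((unsigned char)p[n]))
            n++;
        if (n == 32 && !isalnum((unsigned char)p[n])) {
            memcpy(digest, p, 32);
            digest[32] = '\0';
            return 1;
        }
        p += n ? n : 1;
    }
    return 0;
}
```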

> 2. I also have a similar question about uploading: how can the client
> guarantee that the last successfully uploaded copy on the server is
> intact before continuing?

You could use one of the commands above to generate an MD5 digest and compare
it to what you expect. Alternatively, you could just download the file and
compare it!
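The download-and-compare fallback can be this literal (helper names are
mine): re-download the server copy next to the file you uploaded, then
compare the two local files byte for byte:

```c
#include <stdio.h>

/* Byte-for-byte comparison of two local files, e.g. the copy you uploaded
 * and a freshly re-downloaded one. Returns 1 if identical, 0 if they
 * differ or either file cannot be opened. */
int files_identical(const char *path_a, const char *path_b)
{
    FILE *a = fopen(path_a, "rb");
    FILE *b = fopen(path_b, "rb");
    int result = 0;

    if (a && b) {
        int ca, cb;
        do {
            ca = fgetc(a);
            cb = fgetc(b);
        } while (ca == cb && ca != EOF);
        result = (ca == cb); /* identical only if both hit EOF together */
    }
    if (a) fclose(a);
    if (b) fclose(b);
    return result;
}

/* Tiny helper for setting up files when trying this out. */
int write_file(const char *path, const char *text)
{
    FILE *f = fopen(path, "wb");
    if (!f)
        return 0;
    fputs(text, f);
    fclose(f);
    return 1;
}
```

For large files this costs a full extra transfer, which is exactly why the
server-side digest commands above are the nicer option when the server
supports one of them.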

Dan

-- 
http://www.MoveAnnouncer.com              The web change of address service
          Let webmasters know that your web site has moved
Received on 2005-11-29

These mail archives are generated by hypermail.

Page updated November 12, 2010.