Re: Bug report: curl win32 ftruncate64 not working correctly.

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Thu, 13 Dec 2007 22:08:40 +0100 (CET)

On Thu, 13 Dec 2007, gdb007 wrote:

> It seems curl built for win32 uses an internal ftruncate64 function to
> truncate the file when retrying a download, but that ftruncate64 only
> does file seeking: it writes 0 bytes at the resize position and then
> seeks back to the original position. That won't actually resize the
> file, so the file still keeps the data that was downloaded before the
> retry.

So it's a truncate function that doesn't truncate? Now that's useful... ;-)

> _chsize_s should work as a replacement for ftruncate64 on win32; it uses a
> 64-bit integer as the file size and can therefore handle file sizes greater
> than 4 GB. See
> http://msdn2.microsoft.com/en-us/library/whx354w1(VS.80).aspx for function
> details.

That page says it is "specific to Microsoft Visual Studio 2005/.NET Framework
2.0". If I click on the link for "Microsoft Visual Studio 2003/.NET Framework
1.1" (long names indeed) it says _chsize() only and that one just takes a
long.

Does this mean that _chsize_s() somehow only works with VS2005 or later?
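
Just to have something concrete to talk about, here's a rough sketch of what
a replacement could look like. The helper name and the _MSC_VER guard are
mine, not actual curl code: use _chsize_s() when built with the VS2005 CRT
or later, and fall back to plain Win32 calls otherwise.

/* A minimal sketch, not curl's actual code: 64-bit truncate for win32 */
#include <io.h>
#include <windows.h>

static int win32_ftruncate64(int fd, __int64 size)
{
#if defined(_MSC_VER) && (_MSC_VER >= 1400)
  /* _chsize_s() takes a 64-bit size, so files > 4 GB work fine */
  return (_chsize_s(fd, size) == 0) ? 0 : -1;
#else
  /* older CRTs only offer _chsize() with a 32-bit 'long' size, so go
     through the underlying Win32 handle instead */
  HANDLE h = (HANDLE)_get_osfhandle(fd);
  LARGE_INTEGER newpos, oldpos, zero;

  if(h == INVALID_HANDLE_VALUE)
    return -1;

  zero.QuadPart = 0;
  /* remember where we are so the position can be restored afterwards */
  if(!SetFilePointerEx(h, zero, &oldpos, FILE_CURRENT))
    return -1;

  newpos.QuadPart = size;
  if(!SetFilePointerEx(h, newpos, NULL, FILE_BEGIN) || !SetEndOfFile(h))
    return -1;

  /* put the position back, like POSIX ftruncate() which leaves it alone */
  if(!SetFilePointerEx(h, oldpos, NULL, FILE_BEGIN))
    return -1;
  return 0;
#endif
}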

> Another possible bug: curl with the --retry option set won't retry if the
> connection is lost during the download. It shows the message "transfer
> closed with xxxx bytes remaining to read" and then exits with
> CURLE_PARTIAL_FILE or CURLE_GOT_NOTHING (a very rare case, when the
> connection is closed before any data has been received). I am not sure if
> this is working as intended or if I am missing some option.

It is intended. I made --retry only retry on transient errors, and these
errors are not really transient, nor are they likely to go away on a retry.
If curl were made to retry on those, where would we draw the line?

> I suggest a feature that allows retrying for files that were not fully
> downloaded (in case you know the file size and the server reports remaining
> bytes) if the connection is lost.

CURLE_PARTIAL_FILE basically? Yeah, I could agree to that if it is made a
separate option.
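
Roughly something like this, just to illustrate the idea (not an existing
option, and the function name is made up): wrap the transfer in a loop that
treats CURLE_PARTIAL_FILE, and the rare CURLE_GOT_NOTHING, as retryable up
to a given limit.

#include <curl/curl.h>

/* illustrative only: retry the transfer when it ends with a partial file */
static CURLcode perform_with_partial_retry(CURL *curl, int max_retries)
{
  CURLcode rc;
  int attempt = 0;

  do {
    rc = curl_easy_perform(curl);
    attempt++;
  } while((rc == CURLE_PARTIAL_FILE || rc == CURLE_GOT_NOTHING) &&
          attempt <= max_retries);

  return rc;
}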

> Also, an option to not throw away the partly downloaded data and truncate
> the file would be helpful when downloading a large file over a slow or
> unstable network. (E.g. you downloaded 100 MB of a 200 MB file and then hit
> a timeout; it is good to keep those 100 MB before retrying.)

I guess that makes sense. I don't really remember exactly why I made the
functionality as it is today.
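
As a rough illustration of that approach at the libcurl level (not what the
curl tool does today, and the function name is made up): check how much of
the file is already on disk, open it for append instead of truncating, and
ask the server to resume from that offset with CURLOPT_RESUME_FROM_LARGE.

#include <stdio.h>
#include <sys/stat.h>
#include <curl/curl.h>

/* sketch only; assumes curl_global_init() has already been called */
int download_keeping_partial(const char *url, const char *path)
{
  struct stat st;
  curl_off_t already = 0;
  FILE *out;
  CURL *curl;
  CURLcode rc;

  if(stat(path, &st) == 0)
    already = (curl_off_t)st.st_size;      /* bytes we already have on disk */

  out = fopen(path, already ? "ab" : "wb"); /* append, don't truncate */
  if(!out)
    return 1;

  curl = curl_easy_init();
  if(!curl) {
    fclose(out);
    return 1;
  }
  curl_easy_setopt(curl, CURLOPT_URL, url);
  curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
  /* ask the server to start sending at the offset we already have */
  curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, already);
  rc = curl_easy_perform(curl);

  curl_easy_cleanup(curl);
  fclose(out);
  return (rc == CURLE_OK) ? 0 : 1;
}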

Improving --retry is not a priority for me (quite honestly, I've _never_ used
the option in real life), so please consider jumping in to help out if you
want this to happen anytime soon.

-- 
  Commercial curl and libcurl Technical Support: http://haxx.se/curl.html
Received on 2007-12-13