Say I want to download a large file, 1 GiB perhaps. I specify `-O`. My network connection isn't very reliable, so I specify `-m` and `--retry`, so it doesn't hang up.
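For concreteness, an invocation along these lines (the URL, timeout, and retry count are illustrative placeholders, not from the original report):

```sh
# Illustrative invocation; URL, timeout, and retry count are placeholders.
# -O       write the output to a local file named after the remote file
# -m       cap the total time allowed for the transfer, in seconds
# --retry  retry the transfer on transient errors
curl -O -m 300 --retry 5 https://example.com/large-file.bin
```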
The problem is, on every retry, the output file is truncated, effectively throwing away all the work so far.
```
Warning: Transient problem: timeout Will retry in 1 seconds. 5 retries left.
Throwing away 327962289 bytes
```
This is wasteful, and makes `--retry` quite useless.
At the very least, when `-C -` is specified, it's pretty clear the user wants curl to resume automatically instead of restarting from scratch.
The workaround is either to drop `--retry` and shell-wrap `curl` in a `for` loop (see the sketch below), or to use `wget`.
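A minimal sketch of the `for`-loop workaround, assuming the server supports byte-range requests; the URL, timeout, and attempt count are placeholders:

```sh
#!/bin/sh
# Manual retry loop: resume with -C - on each attempt instead of --retry.
# Assumes the server supports range requests; URL is a placeholder.
url=https://example.com/large-file.bin
for attempt in 1 2 3 4 5; do
    curl -O -C - -m 300 "$url" && break
done
```

Because `-C -` tells curl to look at the already-downloaded portion of the output file and continue from its current length, each iteration picks up where the previous one left off instead of starting from zero.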
Relevant source, for anyone interested.
curl 7.50.3