Re: Restoring large archive with network failures
- To: email@example.com, firstname.lastname@example.org
- Subject: Re: Restoring large archive with network failures
- From: Fede Pereiro <email@example.com>
- Date: Wed, 3 Apr 2019 09:40:04 +0200
- Cc: Colin Percival <firstname.lastname@example.org>
- In-reply-to: <CANjVpLdxZV9d0_nW8bNQ=ib1_faJGepwYw+ZtLVa+UbidYwzMQ@mail.gmail.com>
- References: <CANjVpLcQHLxYhDuiWBi5nwm0GXZ6aWeWTfJWMgYsayBGpSAmDw@mail.gmail.com> <email@example.com> <20180915221632.GA13609@mac> <CANjVpLdxZV9d0_nW8bNQ=ib1_faJGepwYw+ZtLVa+UbidYwzMQ@mail.gmail.com>
Hi Colin & Graham,
I restored a large archive using the --resume-extract option and it worked beautifully. I ended up writing a throwaway node script for comparing the written files with the metadata in the list and it all looked good. Thank you for the great work!
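The comparison the script performed can be sketched in shell as well. This is a minimal sketch, not the script from the thread: it assumes `tarsnap -tv` output was saved to a file and uses tar-style verbose fields (mode, links, user, group, size, date, path); the function name is made up for illustration.

```shell
# Minimal sketch: report listed files that are missing on disk or whose
# on-disk size differs from the size recorded in the archive listing.
# Assumes tar-style verbose listing lines; check_sizes is a hypothetical name.
check_sizes() {
    listing=$1   # saved output of `tarsnap -tv -f <archive>`
    root=$2      # directory the archive was extracted into
    while read -r mode links user group size mon day time path; do
        # skip directories and any line that isn't a regular listing entry
        case $mode in d*) continue ;; -*) ;; *) continue ;; esac
        file="$root/$path"
        if [ ! -f "$file" ] || [ "$(wc -c < "$file")" -ne "$size" ]; then
            printf '%s\n' "$path"   # missing or size mismatch
        fi
    done < "$listing"
}
```

Any path it prints is a candidate for deletion before re-running the extract.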
Hi Colin & Graham!
I think I never subscribed to the list -- I nonchalantly assumed that no subscription was necessary : ). Just subscribed.
I've just uploaded a PR that adds a --resume-extract option for precisely this
purpose. If you're comfortable recompiling Tarsnap, by all means give it a try!
The usual caveat applies: this is relatively untested code right now.
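Under those assumptions, the re-run workflow can be sketched as a simple retry loop. This is a hedged sketch only: --resume-extract is the flag from the work-in-progress PR, and the archive name, keyfile path, and retry delay are placeholders.

```shell
# Minimal sketch, assuming a tarsnap binary recompiled with the proposed
# --resume-extract flag; archive name and keyfile path are placeholders.
resume_extract() {
    tarsnap --keyfile /root/tarsnap.key -x -f "$1" --resume-extract
}

# Re-run the given command until it exits successfully, pausing between
# attempts (RETRY_DELAY seconds, default 60).
retry_until_done() {
    until "$@"; do
        echo "extract interrupted; retrying..." >&2
        sleep "${RETRY_DELAY:-60}"
    done
}

# usage: retry_until_done resume_extract mybackup
```

Each retry only re-downloads files that were not yet written in full, so the bandwidth cost of a failure stays small.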
On Sat, Sep 15, 2018 at 11:37:41AM -0700, Colin Percival wrote:
> Hi Fede,
> I just found this in my spam filter -- it looks like your mailing list post
> may have been eaten (are you subscribed to the list with this email address?)
> as well.
> Feel free to re-send this to the list, but I think the answer to your question
> is (a) yes you're right that using --keep-newer-files for this has a problem
> relating to files which are partially downloaded when an extract fails, and
> (b) there's a work in progress for adding a --resume-extract option which will
> catch cases like this... I think Graham (CCed) may have a patch you can try,
> if you're comfortable recompiling Tarsnap.
> Colin Percival
> On 9/10/18 8:28 AM, Fede Pereiro wrote:
> > Hi everyone!
> > I'm in the process of restoring (-x) a large archive to my local disk. During
> > download, I have experienced network errors that have interrupted the download.
> > If I start the process again, using --keep-newer-files to only
> > download the missing files, the restore will eventually complete. My main
> > concern, however, is that a network error can leave a file partly downloaded.
> > In this case, when I re-run the command, said file won't be re-downloaded and
> > I will be left with an incomplete file.
> > To avoid this, I would have to manually delete the last file downloaded before
> > the network failure and re-enter the command. Another option would be to
> > concoct a script that compares the tarsnap -t output with that of a recursive
> > ls and spots files with different sizes (then I could manually delete them and
> > run the command again).
> > Before I do either of the above, I ask: is there a more reliable and efficient
> > way (both in personal time and in bandwidth cost) to restore large backups when
> > using a connection that experiences network failures?
> > I take this opportunity to thank cperciva and the community for creating and
> > maintaining Tarsnap.
> > Thanks!
> Colin Percival
> Security Officer Emeritus, FreeBSD | The power to serve
> Founder, Tarsnap | www.tarsnap.com | Online backups for the truly paranoid