Thank you James & Graham for the responses, but lets now put things into perspective, *why* I’m asking what I’m asking:
> On 04 Aug 2020, at 07:21 , Graham Percival <gperciva@tarsnap.com> wrote:
>
> On Mon, Aug 03, 2020 at 06:14:06PM -0700, james young wrote:
>>
>> On Aug 3, 2020, at 6:15 AM, hvjunk <hvjunk@gmail.com> wrote:
>>> Question: should I delete them one by one, or is it cheaper/better/more efficient to bundle them all together in batches?
>>
>> The documentation recommends batches:
>> https://www.tarsnap.com/improve-speed.html#faster-delete
>
> Yes, to quote that page: "Multiple archives can be deleted with
> the same command; this is usually faster (and never slower) than
> using individual delete commands".
Thank you for confirming this.
>>> And if in batches: does tarsnap recover gracefully from connectivity failures during deletion? I ask because, being >250 ms away from the AWS region/zone, I've noticed a "hang" on one of my mass deletions, and I wasn't sure whether it was busy or actually stuck and retrying.
>
> Three options that you might want to add to your delete commands:
> -v (to see which archive is currently being deleted)
> --keep-going (ignore an error from one archive when deleting
> multiple archives)
> --archive-names (to read a list of archive names from a file,
> instead of using multiple -f options)
>
> The --keep-going is particularly useful for recovering if you
> cancel the tarsnap command. If you have
Yes, I'm already using --keep-going, but I wasn't aware of --archive-names; I'll give that a try.
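For reference, this is roughly how I intend to drive it (the file name archives-to-prune.txt and the grep pattern are just my placeholders, not anything official):

  # Build a list of the 2019 weekly archives I want to prune,
  # then delete them all with a single tarsnap invocation.
  tarsnap --list-archives | grep '^tracsdbprod01-2019-' | sort > archives-to-prune.txt
  tarsnap -d -v --keep-going --archive-names archives-to-prune.txt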
However, my real issue starts with this: deleting 7 archives (and at present I need to prune many more) takes around 3 hours?
My traffic utilization also shot through the roof, but I recall that's just how the dice roll with tarsnap, so I'll have to live with that part.
The issue with "recovery" is that tarsnap sometimes seems to be stuck during a deletion. How can I tell whether it's still busy, the server is slow, or the link is slow (I don't really believe the last, but I am on the EU side, so we do have latency ;( )? From what I've seen, "stuck" and "busy deleting" look like the same state.
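What I've resorted to, just to see whether bytes are still moving, is watching the tarsnap process's TCP socket from a second terminal. This is a Linux-specific sketch (ss is from iproute2), and watching the queue counters change is my own heuristic, not an official tarsnap mechanism:

  # If the Send-Q/Recv-Q counters and timers change between refreshes,
  # tarsnap is still talking to the server rather than hung.
  watch -n 5 'ss -tpn | grep tarsnap'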
Oh, and going slightly off course: while this multi-day deletion is in progress I can't run backups, so I'm wondering about a guaranteed interruption mechanism.
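My current workaround idea, sketched under the assumption that --archive-names works as described above (and reusing my hypothetical archives-to-prune.txt), is to delete in small batches so a backup can be slotted in between them:

  # Split the pruning list into batches of 5 archives; after each
  # batch tarsnap exits, leaving a window where a backup can run.
  split -l 5 archives-to-prune.txt prune-batch.
  for batch in prune-batch.*; do
      tarsnap -d -v --keep-going --archive-names "$batch"
      # (a backup job could be triggered here, between batches)
  done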
time tarsnap -v -v -v -d --keep-going \
    -f tracsdbprod01-2019-05-19_06-09-48 \
    -f tracsdbprod01-2019-05-26_06-08-45 \
    -f tracsdbprod01-2019-06-02_06-07-55 \
    -f tracsdbprod01-2019-06-09_06-07-04 \
    -f tracsdbprod01-2019-06-16_06-04-20 \
    -f tracsdbprod01-2019-06-23_06-03-20 \
    -f tracsdbprod01-2019-06-30_06-03-54
Deleting archive "tracsdbprod01-2019-05-19_06-09-48"
Total size Compressed size
All archives 225965898333226 33116540894803
(unique data) 96202552048 15960528756
tracsdbprod01-2019-05-19_06-09-48 789490332813 118785513597
Deleted data 446920883 85989389
Deleting archive "tracsdbprod01-2019-05-26_06-08-45"
Total size Compressed size
All archives 225174121939237 32997494549562
(unique data) 94812088036 15733761124
tracsdbprod01-2019-05-26_06-08-45 791776393989 119046345241
Deleted data 1390464012 226767632
Deleting archive "tracsdbprod01-2019-06-02_06-07-55"
Total size Compressed size
All archives 224452185966072 32889377423202
(unique data) 94433885219 15654951606
tracsdbprod01-2019-06-02_06-07-55 721935973165 108117126360
Deleted data 378202817 78809518
Deleting archive "tracsdbprod01-2019-06-09_06-07-04"
Total size Compressed size
All archives 223727553550787 32780981949299
(unique data) 94015137980 15571887044
tracsdbprod01-2019-06-09_06-07-04 724632415285 108395473903
Deleted data 418747239 83064562
Deleting archive "tracsdbprod01-2019-06-16_06-04-20"
Total size Compressed size
All archives 223001051971806 32672339466735
(unique data) 93586043191 15487088049
tracsdbprod01-2019-06-16_06-04-20 726501578981 108642482564
Deleted data 429094789 84798995
Deleting archive "tracsdbprod01-2019-06-23_06-03-20"
Total size Compressed size
All archives 222272912587289 32563471853664
(unique data) 93205034739 15409849238
tracsdbprod01-2019-06-23_06-03-20 728139384517 108867613071
Deleted data 381008452 77238811
Deleting archive "tracsdbprod01-2019-06-30_06-03-54"
Total size Compressed size
All archives 221542862886540 32454351465027
(unique data) 93166985343 15400063912
tracsdbprod01-2019-06-30_06-03-54 730049700749 109120388637
Deleted data 38049396 9785326
real 155m40.198s
user 4m1.124s
sys 0m5.632s
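For what it's worth, totalling the "Deleted data" column above (delete.log is just wherever I captured the output shown here):

  # Sum the first "Deleted data" column (bytes of unique data freed).
  awk '/^Deleted data/ { sum += $3 } END { print sum }' delete.log
  # => 3482487588, i.e. roughly 3.5 GB freed in just under 156 minutes

So the wall-clock time clearly isn't bandwidth-bound; my guess is it's dominated by the >250 ms round trips.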