
Re: Question about mitigating large storage costs in case of ransomware



Thanks for all of the feedback, everyone!  It sounds like using --maxbw
will work just fine, and I should set up some mail reporting to notify
me if the amount of new archive data gets out of hand.
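
Something along these lines is what I have in mind (the 2 GB cap and
the address are placeholders; --maxbw interrupts the archival once
that many bytes of upstream bandwidth have been used):

    # Cap each run's upload and mail myself the stats either way.
    # --print-stats writes to stderr, hence the 2>&1 before the pipe.
    tarsnap -c --maxbw 2147483648 --print-stats \
        -f "home-$(date +%Y-%m-%d)" /home 2>&1 |
        mail -s "tarsnap report for $(hostname)" me@example.com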

On Wed, 12 Jul 2017 20:04:00 +0000 (UTC)
Garance AE Drosehn <gadcode@earthlink.net> wrote:

> Here's a strategy I use, although I wasn't thinking of ransomware at
> the time.
> 
> I have a script which first does a 'tarsnap --dry-run', and parses
> the summary output from that.  It checks the 'Total size' and
> 'Compressed size' reported for the new data, and will skip making a
> real backup if that size seems unreasonable.  Think about this for a
> second:  If
> you are hit by ransomware, then you probably don't want to upload
> *any* of that encrypted data to tarsnap's servers.
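> 
> In outline the check is something like this (untested as written
> here; the paths, archive name, and 100 MB limit are placeholders):
> 
>     #!/bin/sh
>     # Largest amount of new compressed data we consider "reasonable".
>     LIMIT=104857600    # 100 MB
> 
>     # --dry-run simulates creating the archive; --print-stats writes
>     # the summary table to stderr, so redirect it before parsing.
>     # The "New data" row's 4th field is the compressed size.
>     NEW=$(tarsnap -c --dry-run --print-stats -f check \
>               /home/me/Documents 2>&1 | awk '/^New data/ {print $4}')
> 
>     if [ "${NEW:-0}" -gt "$LIMIT" ]; then
>         echo "Skipping backup: $NEW bytes of new compressed data" >&2
>         exit 1
>     fi
> 
>     # The new data looked sane, so create the real archive.
>     tarsnap -c -f "Documents-$(date +%Y-%m-%d)" /home/me/Documents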
> 
> If the update size seems reasonable, then I go ahead and create a
> real update.  Note that I also break up my backup into multiple
> separate tarsnap archives.  So, for instance, it might be reasonable
> for my "Documents" backup to have a lot of new data in it, but it
> would be extremely suspicious if the backup of all my svn
> repositories had a huge amount of new data.
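> 
> The splitting itself is nothing fancy; each area just goes into its
> own archive (directory names here are invented):
> 
>     # One archive per area, so a surge of new data in any one of
>     # them stands out in that archive's own dry-run numbers.
>     for dir in Documents Mail svn; do
>         tarsnap -c -f "$dir-$(date +%Y-%m-%d)" "/home/me/$dir"
>     done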
> 
> What I have is even more convoluted than what I've described, and I'm
> pretty sure that my scripts wouldn't be useful for others.  But
> that's the direction I would consider if I were you.
>