35 Matching Annotations
  1. Jun 2021
    1. This is where off-site backups come into play. For this purpose, I recommend Borg backup. It has sophisticated features for compression and encryption, and allows you to mount any version of your backups as a filesystem to recover the data from. Set this up on a cronjob as well for as frequently as you feel the need to make backups, and send them off-site to another location, which itself should have storage facilities following the rest of the recommendations from this article. Set up another cronjob to run borg check and send you the results on a schedule, so that their conspicuous absence may indicate that something fishy is going on. I also use Prometheus with Pushgateway to make a note every time that a backup is run, and set up an alarm which goes off if the backup age exceeds 48 hours. I also have periodic test alarms, so that the alert manager’s own failures are noticed.

      Solution for human failures and existential threats:

      • Borg backup on a cronjob
      • Prometheus with Pushgateway
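      The setup described above might look like the following crontab fragment. The repository URL, source paths, schedule, and mail address are placeholder assumptions, not the author's actual configuration; `borg create`, `borg check`, and the `{hostname}-{now:...}` archive-name placeholders are real Borg features.

      ```shell
      # Hypothetical crontab sketch: nightly off-site Borg backup plus a weekly
      # integrity check whose report is mailed, so that a missing mail is itself
      # a signal that something is wrong. Paths and addresses are assumptions.

      # 02:00 daily: create a compressed, encrypted, dated archive off-site
      0 2 * * *  BORG_REPO=ssh://backup@offsite.example/./repo borg create --compression zstd ::'{hostname}-{now:%Y-%m-%d}' /home /etc

      # 04:00 Sundays: verify repository consistency and mail the result
      0 4 * * 0  BORG_REPO=ssh://backup@offsite.example/./repo borg check 2>&1 | mail -s "borg check" admin@example.com
      ```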
  2. Mar 2021
  3. Oct 2020
    1. Creating backup plans automatically

      [Backup-4] Lack of statement (No procedure to enable the backup plan)

      Background: If a backup plan is created with “Backup and Recovery Assistant”, you need to take additional steps on the Acronis website to activate the plan after creating it. Without enabling it, WPH backup data will not be acquired. Issue: The IG does not include this description. Request: Add a procedure.

    2. Nevertheless, if you need to update the release version to reflect settings as changed by Acronis, use this tool to install the latest Acronis agent version.

      [Backup-3] Lack of statement (No alert for executing the script)

      Background: There are two ways to update the Acronis agent:

      a) wph-acronis-agent-maintenance --install-latest (upgrade to the latest agent released by Acronis)
      b) wph-acronis-agent-maintenance --install-release (upgrade to the latest agent supported by WPH)

      With method a, the agent may be updated to a version not supported by WPH, yet the IG instructs upgrading the agent with method a. Issue: The IG carries no warning about this for method a. Request: Add a warning sentence in the two related places.

    3. Acronis agent version maintenance

      [Backup-2] Lack of statement (It is not described as an optional procedure)

      Issue: “Acronis agent version maintenance” is not marked as optional even though the work is optional. Request: Write (optional) in the title.

    4. Updating an Acronis account

      [Backup-1] Lack of statement (It is not described as an optional procedure)

      Issue: “Updating an Acronis account” is not marked as optional even though the work is optional. Request: Write (optional) in the title.

  4. Jun 2020
  5. May 2020
  6. Mar 2020
  7. Jan 2020
    1. How efficient is deduplication?

      tarsnap, a standalone encrypted backup application

  8. Dec 2019
    1. It is possible to do a successful file system migration by using rsync as described in this article and updating the fstab and bootloader as described in Migrate installation to new hardware. This essentially provides a way to convert any root file system to another one.
    2. rsync provides a way to do a copy of all data in a file system while preserving as much information as possible, including the file system metadata. It is a procedure of data cloning on a file system level where source and destination file systems do not need to be of the same type. It can be used for backing up, file system migration or data recovery.
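      The annotated article's exact invocation isn't quoted here, but a file-system-level copy of this kind is commonly done with a command along the following lines; the mount point and exclude list are illustrative assumptions:

      ```shell
      # Sketch of a file-system-level clone: archive mode plus ACLs, extended
      # attributes, and hard links preserves as much metadata as rsync can.
      # Pseudo-filesystems and the new root's mount point are excluded/assumed.
      rsync -aAXHv --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/lost+found"} / /mnt/newroot/

      # Afterwards, update /mnt/newroot/etc/fstab and reinstall the bootloader
      # for the new file system, as the linked migration guide describes.
      ```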
    1. I am familiar with using rsync to back up various files from my system, but what is the best way to completely restore a machine?
    1. If you want to keep several days worth of backups, your storage requirements will grow dramatically with this approach. A tool called rdiff-backup, based on rsync, gets around this issue.
    2. Agreed, I use rdiff-backup because I found my rsync backups were getting cluttered with old files, and sometimes the lack of versioned backups was problematic. I'm a big fan of rdiff-backup. I don't think it actually leverages rsync, as such, but librsync. It's a great tool.
    3. I think that rsync is great but tools like dar, attic, bup, rdiff-backup or obnam are better. I use obnam, as it does "snapshot backups" with deduplication.
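      On the storage problem these comments raise: rdiff-backup itself stores reverse deltas via librsync, but the space savings of versioned backups can also be sketched with plain rsync hard-link snapshots, where unchanged files in each day's snapshot are hard links into the previous one. Paths below are illustrative assumptions:

      ```shell
      # Hard-link snapshot sketch: a new dated snapshot costs little more than
      # the changed bytes, because --link-dest hard-links unchanged files to
      # the previous snapshot instead of copying them again.
      today=$(date +%Y-%m-%d)
      rsync -a --delete --link-dest=/backups/latest /home/ "/backups/$today/"
      ln -sfn "/backups/$today" /backups/latest
      ```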
    4. I run the script daily, early every morning, as a cron job to ensure that I never forget to perform my backups.
    5. There are many options for performing backups. Most Linux distributions are provided with one or more open source programs specially designed to perform backups. There are many commercial options available as well. But none of those directly met my needs so I decided to use basic Linux tools to do the job.
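      A daily “basic Linux tools” job of the kind described might be as simple as the following sketch; the source, destination, and 14-day retention window are assumptions for illustration, not the author's actual script:

      ```shell
      #!/bin/sh
      # Minimal daily backup with basic tools: a dated, compressed tar archive
      # plus simple age-based rotation. Paths and retention are assumptions.
      set -eu
      SRC=/home
      DEST=/var/backups
      STAMP=$(date +%Y-%m-%d)

      # Archive the source tree, relative to / so paths restore predictably
      tar -czf "$DEST/backup-$STAMP.tar.gz" -C / "${SRC#/}"

      # Drop archives older than 14 days so the destination doesn't fill up
      find "$DEST" -name 'backup-*.tar.gz' -mtime +14 -delete
      ```

      Run from cron (e.g. `0 3 * * * /usr/local/bin/backup.sh`), this matches the “daily, early every morning” pattern the comment describes.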
    1. CloneZilla works perfectly. It produces small image files, has integrity checks, and works fast. If you want to use a third device as the image repository, choose device-image when creating the image of the first disk, then image-device when restoring it to the second disk. If you want to use only two disks, use device-device mode. Optionally, you may want to generate new UUIDs and a new SSH key (if an SSH server is installed), and change the hostname.
    1. While there are many tools to back up your systems, I find this method super easy and convenient, at least for me. Also, this method is far better than disk cloning with the dd command, because it doesn’t matter if your hard drive is a different size or uses a different filesystem.
  9. Jul 2019
  10. Jun 2019
    1. Barman (Backup and Recovery Manager) is an open-source administration tool for disaster recovery of PostgreSQL servers.
  11. Oct 2017
  12. Jan 2014
    1. An effective data management program would enable a user 20 years or longer in the future to discover, access, understand, and use particular data [3]. This primer summarizes the elements of a data management program that would satisfy this 20-year rule and are necessary to prevent data entropy.

      Who cares most about the 20-year rule? This is an ideal that appeals to some, but in practice even the most zealous adherents can't picture what it looks like in any concrete way, except in the most traditional forms: physical paper journals in libraries are tangible examples of the 20-year rule.

      Until we have a digital equivalent for data, I don't blame people looking for tenure or jobs for not caring about this ideal, since we can't provide a clear picture of how to achieve it widely at an institutional level. For digital materials, I think the picture people have in their minds is of tape backup. Maybe this is generational? Only once new generations are no longer widely exposed to the cassette tapes, DVDs, and other physical media that "old people" remember will it be possible to have a new ideal that people can see in their mind's eye.