- Sep 2024
-
github.com
-
In cases where I've been concerned about the migration of data (e.g. copying my entire home directory from one system to another), I've used fingerprint to generate a transcript on the source machine and then run it on the destination machine, to reassure me that the data was copied correctly and completely.
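The same reassurance can be had with a short script: hash every file under the source tree, then re-hash on the destination and diff the results. A minimal sketch of the idea (function names are mine, not fingerprint's):

```python
import hashlib
from pathlib import Path


def make_transcript(root):
    """Walk a directory tree and record a SHA-256 digest for every file."""
    transcript = {}
    root = Path(root)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            transcript[str(path.relative_to(root))] = digest
    return transcript


def verify_transcript(root, transcript):
    """Return relative paths that are missing or whose contents changed."""
    current = make_transcript(root)
    return sorted(rel for rel in transcript if current.get(rel) != transcript[rel])
```

Run `make_transcript` on the source, carry the resulting dict (or a serialized copy) to the destination, and an empty list from `verify_transcript` means every file arrived intact.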
-
-
-
rotate
-
prune
-
- Sep 2023
- Feb 2023
-
-
Related here is the horcrux problem of note-taking, or even social media: the mental friction of "where did I put that thing?" As a result, it's best to put it all in one place.
How can you build on a single foundation if you're spread across multiple locations? The primary (only?) benefit of multiple locations is redundancy in case of loss.
Ryan Holiday and Robert Greene are counterexamples, though Greene's books are generally distinct projects while Holiday's work has a lot of overlap.
-
- Nov 2022
-
brainbaking.com
-
Preserving web content never really left my mind ever since taking screenshots of old sites and putting them in my personal museum. The Internet Archive’s Wayback Machine is a wonderful tool that currently stores 748 billion webpage snapshots over time, including dozens of my own webdesign attempts, dating back to 2001. But that data is not in our hands. Should it? It should. Ruben says: archive it if you care about it: The only way to be sure you can read, listen to, or watch stuff you care about is to archive it. Read a tutorial about yt-dlp for videos. Download webcomics. Archive podcast episodes.
Should people have their own web archive? A long list of pros and cons comes to mind. For several purposes a third-party archive is key; for others, having things locally is good enough; in still other situations an off-site location is of interest. Is this less a question of web archiving and more a question of how wide the scope of one's own 3-2-1 backup choices should be? I find myself more frequently thinking about the processes at e.g. the National Archive in The Hague, where a lot comes down to knowing what you will not keep.
-
- Oct 2022
-
www.indxd.ink
-
https://www.indxd.ink/
A digital, web-based index tool for your analog notebooks. Ostensibly it allows one to digitally index their paper notebooks (page numbers optional).
It emails you weekly text updates, so you've got a backup of your data if the site/service disappears.
This could potentially be used by those who have analog zettelkasten practices but want digital search and some backup of their system.
ᔥ sgtstretch in @Gaby @pimoore so a good friend of mine makes INDXD (https://www.indxd.ink/) which is for indexing analog notebooks and being able to find things. I don't personally use it, but I know @patrickrhone has written about it before. (2022-10-27 17:59:32)
-
-
www.dla-marbach.de
-
»Bei Feuer sind die schwarzeingebundnen Exzerpten zuerst zu retten«, wies der Dichter Jean Paul seine Frau vor Antritt einer Reise im Jahr 1812 an.
"In the event of a fire, the black-bound excerpts are to be saved first," the poet Jean Paul instructed his wife before setting out on a journey in 1812.
-
-
www.reddit.com
-
https://www.reddit.com/r/antinet/comments/ur5xjv/handwritten_cards_to_a_digital_back_up_workflow/
For those who keep a physical pen-and-paper system and either want to be a bit on the hybrid side or just have a digital backup "just in case", I thought I'd share a workflow that I outlined back in December and that works reasonably well. (Backups or emergency plans for one's notes are important, as evidenced by poet Jean Paul's admonition to his wife before setting off on a trip in 1812: "In the event of a fire, the black-bound excerpts are to be saved first.") It's framed as posting to a website/digital garden, but it works just as well for many of the digital text platforms one might have or consider. For those on other platforms (like iOS) there are some useful suggestions in the comments section.
Handwriting My Website (or Zettelkasten) with a Digital Amanuensis
-
- Jul 2022
-
www.reddit.com
-
I can't reverse it, but maybe somebody who understands how Chrome does the decryption can. The ability is there; it's not that Chrome can't decrypt them, it's that Chrome won't decrypt them due to false "security". And if Chrome actually, genuinely can no longer decrypt passwords after they have been restored from backup, then that is a shockingly bad bug in their password manager.
-
- Nov 2021
-
wiki.debian.org
-
Take advantage of LVM snapshots: take snapshots before and after an upgrade. If the system ends up in an unrecoverable state, roll back to the last snapshot from a system rescue LiveCD. A useful program for this, as well as for regular system backups, is Timeshift.
-
- May 2020
-
codeguard.zendesk.com
-
Not necessarily. Hosting companies tend to keep your backups in the same place as your primary files. You don’t carry around a copy of your birth certificate along with the actual one – you keep the real one safe at home for emergencies. So why not do the same for your backups? CodeGuard provides safe, offsite backup that is 100% independent from your hosting provider.
-
-
en.wikipedia.org
-
involve a combination of Local backup for fast backup and restore, along with Off-site backup for protection against local disasters
-
Recent backups are retained locally, to speed data recovery operations.
-
-
-
www.hostgator.com
-
After the initial backup, future backups are differential, both in the files that are transferred and the files that are stored on your behalf.
I guess git can help with the differential part of the backup, but how exactly does it? In order for it to help with transfer from the subject server, wouldn't it have to keep the git repo on that server? Otherwise wouldn't it have to transfer everything to the remote cloud git repo so that git can do the diff there?
Makes me wonder if simple rsync wouldn't be more efficient than all this.
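For intuition on the rsync question: rsync's default quick check decides what to transfer by comparing each file's size and modification time against the receiving side, with no repository needed on either end. A rough sketch of that selection logic (helper names are hypothetical, not rsync's internals):

```python
from pathlib import Path


def snapshot_metadata(root):
    """Record (size, mtime) for every file: the quick check rsync uses
    to decide whether a file needs to be transferred at all."""
    root = Path(root)
    return {
        str(p.relative_to(root)): (p.stat().st_size, int(p.stat().st_mtime))
        for p in root.rglob("*")
        if p.is_file()
    }


def files_to_transfer(root, previous):
    """Paths that are new or whose size/mtime differ from the last run."""
    current = snapshot_metadata(root)
    return sorted(rel for rel, meta in current.items() if previous.get(rel) != meta)
```

With git, by contrast, either the repo lives on the subject server (so the diff happens locally) or full content has to cross the wire before the remote can deduplicate it, which is the annotator's point.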
-
- Apr 2020
-
queue.acm.org
-
Of course, just because your users want a copy of their data doesn't necessarily mean that they're abandoning your product. Many users just feel safer having a local copy of their data as a backup.
-
- Dec 2019
-
github.com
-
For any type of full backup on an active server, it's recommended that you snapshot the filesystem first, then run your backup script on the contents of the snapshot. Your filesystem has to be on LVM to use snapshots, though, and have enough free physical extents (un-partitioned space) to cover the amount of expected changed data during the snapshot window (more for very busy servers, less for not-so-busy servers). Look here... http://www.tldp.org/HOWTO/LVM-HOWTO/snapshots_backup.html
-
-
serverfault.com
-
Both /proc and /sys are virtual filesystems which reflect the state of the system, and allow you to change several runtime parameters (and sometimes do more dangerous things, like directly writing to the memory or to a device). You should never backup or restore them.
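A backup script can enforce this with an exclusion list, in the spirit of rsync's --exclude option. A minimal sketch (the function name is mine, and /dev and /run are also commonly skipped alongside the two mentioned above):

```python
# Virtual filesystems that reflect kernel state and must not be backed up.
VIRTUAL_FILESYSTEMS = ("/proc", "/sys", "/dev", "/run")


def should_backup(path):
    """True unless the absolute path lives under a virtual filesystem mount."""
    return not any(
        path == mount or path.startswith(mount + "/")
        for mount in VIRTUAL_FILESYSTEMS
    )
```

Note the `mount + "/"` check: it keeps a path like `/system` from being rejected just because it shares a prefix with `/sys`.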
-
-
www.taobackup.com
-
You might think that a one-line configuration file is not worth backing up. However, if it took you three hours to figure out how to set that configuration, it will probably take you three hours again in six months time.
-
-
unix.stackexchange.com
-
I am not concerned with any need to exactly restore the system if there is a crash. I have had three previous computers; none of them failed, but I kept manual backups on external hard drives and now use them as a source for any material on them that I still need.
-
-
www.pointsoftware.ch
-
So if you create one backup per night, for example with a cronjob, then this retention policy gives you 512 days of retention. This is useful, but it can require too much disk space, which is why we have included a non-linear distribution policy. In short, we keep only the oldest backup in the range 257-512, likewise in the range 129-256, and so on. This exponential distribution of backups in time retains more backups in the short term and fewer in the long term; it keeps only 10 or 11 backups but spans a retention of 257-512 days.
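The policy described above amounts to keeping one backup per power-of-two age bucket: with one backup per night, each backup's age in days doubles through the ranges 3-4, 5-8, ..., 257-512, and within each range only the oldest survives. A sketch of the selection (the function name is mine, not the site's):

```python
import math


def retained_ages(ages, max_age=512):
    """Given backup ages in days (1 = last night), keep one backup per
    power-of-two bucket: [1], [2], [3-4], [5-8], ..., [257-512].
    Within a bucket, the oldest backup is the one retained."""
    buckets = {}
    for age in sorted(ages):
        if age > max_age:
            continue
        bucket = math.ceil(math.log2(age)) if age > 1 else 0
        buckets[bucket] = age  # older ages overwrite, so the oldest wins
    return sorted(buckets.values())
```

Fed 512 consecutive nightly backups, this keeps exactly 10 of them, which matches the "10 or 11 backups" figure in the quote.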
-
-
opensource.com
-
But just creating the backups will not save your business. You need to make regular backups and keep the most recent copies at a remote location, one that is not in the same building or even within a few miles of your business location, if at all possible. This helps ensure that a large-scale disaster does not destroy all of your backups.
-
No backup regimen would be complete without testing. You should regularly test recovery of random files or entire directory structures to ensure not only that the backups are working, but that the data in the backups can be recovered for use after a disaster. I have seen too many instances where a backup could not be restored for one reason or another and valuable data was lost because the lack of testing prevented discovery of the problem.
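Restore testing of random files, as recommended above, is easy to automate: sample a few paths from the source, compare them byte-for-byte against the restored copies, and alert on any mismatch. A minimal sketch (names are illustrative):

```python
import filecmp
import random
from pathlib import Path


def spot_check_restore(source_root, restore_root, sample_size=5, seed=None):
    """Restore-test a backup by comparing a random sample of restored
    files byte-for-byte against the originals. Returns failing paths."""
    source_root, restore_root = Path(source_root), Path(restore_root)
    files = [
        p.relative_to(source_root)
        for p in source_root.rglob("*")
        if p.is_file()
    ]
    rng = random.Random(seed)
    sample = rng.sample(files, min(sample_size, len(files)))
    failures = []
    for rel in sample:
        restored = restore_root / rel
        if not restored.is_file() or not filecmp.cmp(
            source_root / rel, restored, shallow=False
        ):
            failures.append(str(rel))
    return sorted(failures)
```

Run from cron against each night's restore target, a non-empty result is exactly the "backup could not be restored" discovery the author wants made before a disaster rather than after.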
-