pyratelog

personal blog
git clone git://git.pyratebeard.net/pyratelog.git

commit 767c23b074dcdf1b34703cb218984e1d8aac9dad
parent 6d2469e0a3725c1050b12fe5cc782da2ca73db2a
Author: pyratebeard <root@pyratebeard.net>
Date:   Tue, 18 Oct 2022 22:19:16 +0100

smoke_me_a_kipper

Diffstat:
M entry/smoke_me_a_kipper.md | 13 ++++++++++---
1 file changed, 10 insertions(+), 3 deletions(-)

diff --git a/entry/smoke_me_a_kipper.md b/entry/smoke_me_a_kipper.md
@@ -1,6 +1,6 @@
 ## i'll be back(up) for breakfast
 
-Earlier this year I wrote about my [backup setup](20220414-speak_of_the_dedup.html) and this last week I had to put it to the test.
+Earlier this year I wrote about my [backup setup](20220414-speak_of_the_dedup.html) and recently I had to put it to the test.
 
 My PC is a tower that I have on a small stand next to my desk. In the past I had kept the case (an Antec 1200) on my desk but it is rather large and dominates the space a bit too much, I don't have a very big desk. The other day my 1 year old toddled into the study and started pushing the power button on my PC. This power cycled the machine a few times in quick succession. At the time I wasn't aware of this. The next morning I booted up my PC but noticed it was very sluggish. It crashed trying to open my browser. After it happened again I started digging through the logs and noticed some filesystem corruption.
 
@@ -21,9 +21,16 @@ Once the RAID array was reformatted I began the data copy from my external drive
 
 This got me to a relatively good position. Okay I had lost some random downloads, and a little bit of code that hadn't been pushed to my git server, but nothing serious. It is a little disappointing though, my backup setup is not good enough.
 
-The reason I don't do a full nightly backup to the cloud is because `rclone` takes so long to copy the data. I decided to look into this, to see if it could be sped up. Reading the man page shows that `rclone` has an option to only transfer files younger than a specified age, `--max-age=`. Using `dedup` means I don't have to transfer everything each time `rclone` runs, only the most recent archive. Testing this brought my nightly backup time down to TK.
+I decided I needed more regular backups of my $HOME, so I needed some more storage. I purchased another external drive which now sits permanently plugged into my PC. I was going to use `dedup` again but decided it would be better to use an alternative so I am not relying on only one tool. I opted for [borg](TK){target="_blank" rel="noreferrer"}.
 
-I decided I needed more regular backups of my $HOME, so I needed some more storage. I purchased another external drive which now sits permanently plugged into my PC. I was going to use `dedup` again but decided it would be better to use an alternative tool so I am not relying on only one tool. I opted for `rsnapshot`. The first backup did take a long time, but now each evening I can run `rsnapshot` to backup my $HOME to the external drive and `rclone` that latest archive to the cloud storage.
+After installing `borg` I initialised a new repo and kicked off a full backup
+```
+ ──── ─ borg init -e repokey /media/backup/new_repo
+ ──── ─ borg create -v --stats /media/backup/new_repo::backup1 $HOME
+```
+The first backup did take a long time, but now each evening I can run `borg` to backup my $HOME to the external drive.
+
+and `rclone` that latest archive to the cloud storage.
 
 Another full backup will still be done to the drive in my bug out bag, I just have to be better at doing it more regularly. At least now if I need to restore I will be able to recover all of $HOME and not only the important things.
 
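
As an aside, the nightly flow the entry ends up with (a `borg` archive to the external drive, then `rclone` pushing the latest changes to cloud storage) could be wired together in a small script along these lines. This is only a sketch: the `cloud:` remote name, the archive naming, and the passphrase handling are assumptions rather than details from the post.
```
#!/bin/sh
# sketch of a nightly backup job: borg to the external drive, then rclone to the cloud
# assumptions: repo path as in the post, "cloud:" is an already configured rclone remote

REPO=/media/backup/new_repo      # borg repo on the always-attached external drive
ARCHIVE="home-$(date +%Y%m%d)"   # one archive per night, named by date

# a repokey-encrypted repo needs its passphrase in the environment for unattended runs
# export BORG_PASSPHRASE=...

# archive $HOME to the external drive
borg create -v --stats "$REPO::$ARCHIVE" "$HOME"

# copy only repo files changed in the last day to cloud storage,
# keeping the nightly transfer small (--max-age is rclone's age filter)
rclone copy --max-age 24h "$REPO" cloud:backup/new_repo
```
Run from cron (or a systemd timer) each evening, something like this keeps the external drive current and the cloud copy at most a day behind.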