
off-site data backup?

Posted: 2018-03-22 16:25
by putzpie
Hey y'all, just thought I'd see what people recommend for a cloud/offsite backup solution. I was looking at Linode, but that's not really meant for storage; ideally I don't want to use Google services, and I need to store a lot of data. I want to back up my whole collection of everything, which is well over a TB.

I checked out MaidSafe and it looks fairly promising. I haven't installed it yet, and the whole free-market fundamentalism is a turn-off tbh, but I guess it makes sense given the intended goal of the project, so whatever. Not a huge deal if it works well; I just wanted to see what people recommend for making an online backup of my data in case I ever lost all my hardware.

Thanks

Re: off-site data backup?

Posted: 2019-03-29 19:12
by Head_on_a_Stick
https://www.tarsnap.com/

Disclaimer: I don't back up.
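
For anyone who hasn't tried it, a minimal sketch of basic Tarsnap usage (the key file path, email, machine name, archive name and backed-up directory below are just placeholders, and the keyfile/cachedir are assumed to be set in tarsnap.conf):

Code:

# one-time setup: register the machine and generate its key file
tarsnap-keygen --keyfile /root/tarsnap.key --user you@example.com --machine myhost

# create a dated archive of /home (deduplicated and encrypted client-side)
tarsnap -c -f home-$(date +%Y-%m-%d) /home

# list archives / restore one
tarsnap --list-archives
tarsnap -x -f home-2019-03-29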

Re: off-site data backup?

Posted: 2019-03-30 01:54
by bester69
I'm using free accounts on Box and Mega.

Mega gives 50 GB free. I have two free accounts (50+50), and I switch between them with a script that switches profiles.
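
A minimal sketch of that kind of profile-switching script, done with rclone (the remote names mega1/mega2, the state file and the synced directory are placeholders, not my real setup):

Code:

#!/bin/sh
# pick whichever Mega account was NOT used last time, assuming two rclone
# remotes named mega1 and mega2 (placeholder names) and a small state file
STATE=/home/user/.mega_last_remote
LAST=$(cat "$STATE" 2>/dev/null)

if [ "$LAST" = "mega1" ]; then
    REMOTE=mega2
else
    REMOTE=mega1
fi
echo "$REMOTE" > "$STATE"

echo "Uploading dedup repo to $REMOTE"
rclone sync /home/user/LINUXDEBS/dedup "$REMOTE:dedup"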

My backup configuration is as follows:

0- Attic (deduplicates home) > to disk
1- duplicity/rclone (the Attic backup) > to cloud, Mega or Box (though the script below refers to Google)

Here is my attic/rclone cron.daily backup script:

Code:

#!/bin/sh
#
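# run the backup at low I/O and CPU priority so it does not slow down interactive use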
ionice -c2 -n7 -p$$
renice +15 -p $$
###
DEDUP=/home/user/LINUXDEBS/dedup/
PAR2=/home/user/LINUXDEBS/dedup_par2/
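# passphrases are pulled from a local helper script rather than being hard-coded here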
export  ATTIC_UNKNOWN_UNENCRYPTED_REPO_ACCESS_IS_OK=yes
export  ATTIC_PASSPHRASE=$(sudo /home/user/scripts/secrets.sh XXYY2)


# Rclone Environment
export RCLONE_CONFIG_PASS=$(sudo /home/user/scripts/secrets.sh XXYY6)
dfile=`date +%Y-%m-%d_%H:%M:%S`

attic init --encryption='passphrase' $DEDUP
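# (on later runs, once the repository already exists, the init above just prints an error and the script continues)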
# directories to back up, plus a few exclusions
attic create --stats -v                           \
    $DEDUP::hostname-$dfile      \
    /home/user/LINUXDEBS/PDF\ linux/	\
    /home/user/LINUXDEBS/PDF\ docs/		\
    /home/user/LINUXDEBS/Accesos/		\
    /home/user/LINUXDEBS/scripts/	\
    /home/user/LINUXDEBS/config/	\
    /home/user/LINUXDEBS/LIBROS_ONENOTE/	\
    /home/user/LINUXDEBS/MONDOTRANS/	\
    /home/user/LINUXDEBS/BUSSINESS/	\
    /home/user/LINUXDEBS/TRABAJO/	\
    /home/user/LINUXDEBS/rip/	\
    /media/sda5/MegaSync/Dropbox/SEGUROUT/	\
    /media/sda5/MegaSync/Dropbox/MIOS/teacher/	\
    /media/sda5/MegaSync/SEGURCRYP/	\
    /home/user/.mpd/		\
    --exclude /home/user/.config/opera/    \
    --exclude /home/user/.config/google-chrome/    \
    --exclude '*.pyc'
# Use the `prune` subcommand to maintain 7 daily, 4 weekly
# and 6 monthly archives.
attic prune -v $DEDUP --keep-daily=7 --keep-weekly=4 --keep-monthly=6

# Upload to Google cloud
echo "Uploading to cloud: google SEGURCRYP"
rm -f /tmp/rclone$dfile.log
rclone  sync --delete-before --exclude=.directory  --exclude=.Trash-1000 --checksum  --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /media/sda3/MegaSync/SEGURCRYP GoogleDom2:SEGURCRYP --log-level DEBUG --log-file /tmp/rclone$dfile.log

echo "Subiendo a Cloud:  google SEGOUT"
rm /tmp/rclone$dfile.log
rclone  sync --delete-before --exclude=.directory --checksum  --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /media/sda3/MegaSync/Dropbox/SEGUROUT GoogleDom2:SEGUROUT --log-level DEBUG --log-file /tmp/rclone$dfile.log

echo "Subiendo a Cloud:  google DEDUP"
rm /tmp/rclone$dfile.log
rclone   sync --delete-before --exclude=.directory --checksum  --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /home/user/LINUXDEBS/dedup GoogleDom2:LINUX/dedup --log-level DEBUG --log-file /tmp/rclone$dfile.log
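
# note: all three rclone runs share the same log file name and it is removed
# before each run, so only the last sync's log is left once the script finishes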