off-site data backup?

Off-Topic discussions about science, technology, and non-Debian-specific topics.

off-site data backup?

#1 Post by putzpie »

Hey y'all, I just thought I'd see what people recommend for a cloud/off-site backup solution. I was looking at Linode, but that's not really meant for storage. I'd rather not use Google services, and I need to store a lot of data. Ideally I want to store my whole collection of everything, which is well over a TB.

I checked out MaidSafe and it looks fairly promising. I haven't installed it yet, and the whole free-market fundamentalism is a turn-off tbh, but I guess it makes sense given the intended goal of the project, so whatever. Not a huge deal if it works well; I just wanted to see what people recommend for making an online backup of my data in case I ever lost all my hardware.

Thanks


Re: off-site data backup?

#2 Post by Head_on_a_Stick »

https://www.tarsnap.com/
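
For anyone who hasn't used it: Tarsnap is a paid, client-side-encrypted, deduplicating backup service driven entirely from the command line. A rough sketch of typical use, assuming you have already created an account (the key path, e-mail address and archive names below are only placeholders, and the key/cache locations are assumed to be set in tarsnap.conf):

Code: Select all

# One-time setup: register this machine and store its key locally
tarsnap-keygen --keyfile /root/tarsnap.key \
    --user you@example.com --machine mybox

# Create an encrypted, deduplicated archive of /home
tarsnap -c -f home-$(date +%Y-%m-%d) /home

# List archives, and restore one later if needed
tarsnap --list-archives
tarsnap -x -f home-2018-03-20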

Disclaimer: I don't back up.
deadbang


Re: off-site data backup?

#3 Post by bester69 »

I'm using free Box and Mega accounts.

Mega gives 50 GB free. I have two free accounts (50 + 50 GB) and switch between them with a script that switches profiles.
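
A minimal sketch of that kind of switch, assuming the two accounts are configured as two rclone remotes (the remote names mega1:/mega2: and the paths below are made up):

Code: Select all

#!/bin/sh
# Pick one of two Mega accounts by choosing the matching rclone remote
# (remote names and paths here are only examples)
case "$1" in
    1) REMOTE=mega1: ;;
    2) REMOTE=mega2: ;;
    *) echo "usage: $0 1|2" >&2; exit 1 ;;
esac
rclone sync /home/user/backups "${REMOTE}backups"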

My backup configuration is as follows:

0- Attic (deduplicated backup of home) > to disk
1- rclone (upload of the Attic repository) > to cloud, Mega or Box (though the script below refers to Google)

Here you can see my attic/rclone crontab.daily backup script:

Code: Select all

#!/bin/sh
#
# Run the backup at low I/O and CPU priority
ionice -c2 -n7 -p$$
renice +15 -p $$
###
DEDUP=/home/user/LINUXDEBS/dedup/
PAR2=/home/user/LINUXDEBS/dedup_par2/   # not used below
export ATTIC_UNKNOWN_UNENCRYPTED_REPO_ACCESS_IS_OK=yes
export ATTIC_PASSPHRASE=$(sudo /home/user/scripts/secrets.sh XXYY2)


# Rclone environment
export RCLONE_CONFIG_PASS=$(sudo /home/user/scripts/secrets.sh XXYY6)
dfile=$(date +%Y-%m-%d_%H:%M:%S)

# Initialise the repository (fails harmlessly if it already exists)
attic init --encryption='passphrase' $DEDUP
# Create a new archive of the directories listed below; the --exclude
# options at the end skip browser profile directories and *.pyc files
attic create --stats -v                           \
    $DEDUP::$(hostname)-$dfile      \
    /home/user/LINUXDEBS/PDF\ linux/	\
    /home/user/LINUXDEBS/PDF\ docs/		\
    /home/user/LINUXDEBS/Accesos/		\
    /home/user/LINUXDEBS/scripts/	\
    /home/user/LINUXDEBS/config/	\
    /home/user/LINUXDEBS/LIBROS_ONENOTE/	\
    /home/user/LINUXDEBS/MONDOTRANS/	\
    /home/user/LINUXDEBS/BUSSINESS/	\
    /home/user/LINUXDEBS/TRABAJO/	\
    /home/user/LINUXDEBS/rip/	\
    /media/sda5/MegaSync/Dropbox/SEGUROUT/	\
    /media/sda5/MegaSync/Dropbox/MIOS/teacher/	\
    /media/sda5/MegaSync/SEGURCRYP/	\
    /home/user/.mpd/		\
    --exclude /home/user/.config/opera/    \
    --exclude /home/user/.config/google-chrome/    \
    --exclude '*.pyc'
# Use the `prune` subcommand to maintain 7 daily, 4 weekly
# and 6 monthly archives.
attic prune -v $DEDUP --keep-daily=7 --keep-weekly=4 --keep-monthly=6

# Upload to Google cloud storage
echo "Uploading to cloud: google SEGCRYPT"
rm -f /tmp/rclone$dfile.log
rclone sync --delete-before --exclude=.directory --exclude=.Trash-1000 --checksum --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /media/sda3/MegaSync/SEGURCRYP GoogleDom2:SEGURCRYP --log-level DEBUG --log-file /tmp/rclone$dfile.log

echo "Uploading to cloud: google SEGOUT"
rm -f /tmp/rclone$dfile.log
rclone sync --delete-before --exclude=.directory --checksum --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /media/sda3/MegaSync/Dropbox/SEGUROUT GoogleDom2:SEGUROUT --log-level DEBUG --log-file /tmp/rclone$dfile.log

echo "Uploading to cloud: google DEDUP"
rm -f /tmp/rclone$dfile.log
rclone sync --delete-before --exclude=.directory --checksum --no-update-modtime --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s /home/user/LINUXDEBS/dedup GoogleDom2:LINUX/dedup --log-level DEBUG --log-file /tmp/rclone$dfile.log
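
In case you ever need the data back, restoring from the Attic repository looks roughly like this (the archive name is only an example):

Code: Select all

# List the archives in the repository, then extract one
# into an empty directory (archive name below is just an example)
attic list /home/user/LINUXDEBS/dedup
mkdir -p /tmp/restore && cd /tmp/restore
attic extract /home/user/LINUXDEBS/dedup::myhost-2018-03-20_04:00:01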

bester69 wrote:STOP 2030 globalists demons, keep the fight for humanity freedom against NWO...
