Full Cloud Strategy Incremental Backup

Off-Topic discussions about science, technology, and non-Debian-specific topics.
Message
Author
User avatar
bester69
Posts: 2072
Joined: 2015-04-02 13:15
Has thanked: 24 times
Been thanked: 14 times

Full Cloud Strategy Incremental Backup

#1 Post by bester69 »

Hi,

Right now I'm using fsarchiver + Duplicity (sent to Box over WebDAV) for a full system backup to the cloud (around 7 GB). I usually store a system backup in the cloud once or twice a year.
Now I'm considering using Duplicity directly to send incremental backups, taking advantage of btrfs system snapshots.

What do you think? Is it a proper and trustworthy backup strategy for the system, making incremental backups in the cloud? :roll:

The idea is to make incremental copies of a COW btrfs filesystem snapshot directly in the cloud with Duplicity.
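The snapshot-then-upload idea could be sketched roughly like this. This is a minimal sketch, not tested config: the snapshot path, the Box WebDAV URL, and the retention choices are all illustrative assumptions.

```shell
#!/bin/sh
# Sketch: take a read-only btrfs snapshot, then let duplicity send an
# incremental backup of that frozen view to Box over WebDAV.
# Paths and the WebDAV URL below are illustrative assumptions.
set -e

SNAP=/mnt/btrfs/snapshots/system-$(date +%F)

# A read-only snapshot gives duplicity a consistent view of the system.
btrfs subvolume snapshot -r / "$SNAP"

# duplicity picks full vs incremental by itself; force a fresh full
# backup once a month with --full-if-older-than.
duplicity --full-if-older-than 1M \
    --volsize 100 \
    "$SNAP" webdavs://user@dav.box.com/dav/system-backup

# Drop the snapshot once the upload has finished.
btrfs subvolume delete "$SNAP"
```

One design point worth noting: backing up the snapshot rather than the live filesystem means a long cloud upload can't be corrupted by files changing mid-run.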
bester69 wrote:STOP 2030 globalists demons, keep the fight for humanity freedom against NWO...

ruffwoof
Posts: 298
Joined: 2016-08-20 21:00

Re: Full Cloud Strategy Incremental Backup

#2 Post by ruffwoof »

System files ... IMO I don't need to back them up as they're easily replaced. System config files ... yes, but they're relatively small. I have a "script" in which I record all the system configuration changes I've made. Sometimes that can be run straight on a new/fresh reinstall; more often I just add notes into that script and periodically tidy it up so it is a proper script.

Data, in contrast, is potentially far more valuable. I tend to store data away from system files, i.e. system and home are pretty much just configuration/run areas; data files are pulled in from elsewhere. As part of that I run an OpenBSD server to which regular backups are made/stored, but then, like you, every once in a while I back that data up off-site/geographically disconnected. I prefer alternatives to cloud-based storage, i.e. removable disks, for those geographically disconnected backups.

If I were doing cloud backups I'd probably create a squashfs (mksquashfs after installing squashfs-tools) and encrypt that before uploading it to the cloud, just so if the cloud storage provider did get hacked the content wouldn't be easily viewed.

Typically my home/system config backups run through in seconds. Data backups take longer, i.e. at LAN speeds. Disconnected backups take about the same amount of time, perhaps an hour (i.e. I tend to just go off to lunch and leave it running).
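That squash-then-encrypt step might look something like the sketch below; the source directory, output names, and cipher choice are assumptions for illustration.

```shell
#!/bin/sh
# Sketch: pack a data directory into a compressed squashfs image, then
# encrypt it with gpg before it ever leaves the machine.
# /srv/data and the output paths are illustrative assumptions.
set -e

# Build a compressed, read-only image of the data tree.
mksquashfs /srv/data /tmp/data.sqfs -comp xz -noappend

# Symmetric AES256 encryption; gpg prompts for a passphrase.
gpg --symmetric --cipher-algo AES256 \
    --output /tmp/data.sqfs.gpg /tmp/data.sqfs

# Upload only data.sqfs.gpg; remove the plaintext image locally.
rm /tmp/data.sqfs
```

To restore, decrypt with `gpg --decrypt` and loop-mount the resulting image (`mount -o loop data.sqfs /mnt`), which also makes spot-checking individual files cheap.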

User avatar
bester69
Posts: 2072
Joined: 2015-04-02 13:15
Has thanked: 24 times
Been thanked: 14 times

Re: Full Cloud Strategy Incremental Backup

#3 Post by bester69 »

ruffwoof wrote:System files ... IMO I don't need to back them up as they're easily replaced. System config files ... yes, but they're relatively small. I have a "script" in which I record all the system configuration changes I've made. Sometimes that can be run straight on a new/fresh reinstall; more often I just add notes into that script and periodically tidy it up so it is a proper script.

Data, in contrast, is potentially far more valuable. I tend to store data away from system files, i.e. system and home are pretty much just configuration/run areas; data files are pulled in from elsewhere. As part of that I run an OpenBSD server to which regular backups are made/stored, but then, like you, every once in a while I back that data up off-site/geographically disconnected. I prefer alternatives to cloud-based storage, i.e. removable disks, for those geographically disconnected backups.

If I were doing cloud backups I'd probably create a squashfs (mksquashfs after installing squashfs-tools) and encrypt that before uploading it to the cloud, just so if the cloud storage provider did get hacked the content wouldn't be easily viewed.

Typically my home/system config backups run through in seconds. Data backups take longer, i.e. at LAN speeds. Disconnected backups take about the same amount of time, perhaps an hour (i.e. I tend to just go off to lunch and leave it running).
Hi, thanks for answering.

I have several backup layouts on my computer:

BTRFS
- One fixed home snapshot (user data lives in a subvolume above it)
- One fixed system snapshot
- A rotating user-data snapshot (several weeks of retention)

ATTIC (deduplicating backup)
- dedup.sh >> daily deduplicating backup of the main user-data folders >> every few weeks I upload it to Mega Cloud and mirror a copy to Google Drive
- backres.sh >> whole-home deduplicating backup excluding critical files/folders >> every few months I upload it to Box.
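A daily attic run like dedup.sh might be sketched as follows; the repository path, archive naming, source folders, and retention flags are illustrative assumptions rather than the poster's actual script.

```shell
#!/bin/sh
# Sketch of a daily deduplicating attic run (dedup.sh-style).
# Repository path, folders, and retention are illustrative assumptions.
set -e

REPO=/backup/attic/datauser.attic

# One-time setup:  attic init "$REPO"

# Create a dated, deduplicated archive of the main user-data folders;
# unchanged chunks cost almost nothing in repository space.
attic create "$REPO"::data-$(date +%F) \
    /home/user/Documents /home/user/Projects

# Keep a few weeks of history, as described above.
attic prune "$REPO" --keep-daily 7 --keep-weekly 4
```

Because the repository deduplicates at chunk level, uploading it to Mega/Drive every few weeks moves mostly new chunks rather than full copies.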

FSARCHIVER (filesystem backup) + DUPLICITY (deduplicating backup)
- fsbackup.sh >> every five or six months I create a full system backup, excluding home, into a folder on another partition (around 5 GB)
- dupres.sh >> uses duplicity (WebDAV) to push the previous filesystem backup to Box in 100 MB volume files. You can interrupt and resume the upload without any integrity risk, so in three or four days the full-system upload is complete.
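The fsbackup.sh/dupres.sh pair described above might look roughly like the sketch below; the device name, archive path, and Box WebDAV URL are assumptions for illustration.

```shell
#!/bin/sh
# Sketch of the two-stage flow: fsarchiver makes the filesystem image,
# duplicity ships it to Box in resumable 100 MB volumes.
# Device, paths, and the WebDAV URL are illustrative assumptions.
set -e

# fsbackup.sh stage: archive the root filesystem (home lives elsewhere)
# into a compressed .fsa file on another partition.
fsarchiver savefs -z7 /backup/system.fsa /dev/sda2

# dupres.sh stage: push the archive folder to Box in 100 MB volumes.
# An interrupted run can simply be restarted; duplicity picks up
# from the volumes already uploaded.
duplicity --volsize 100 /backup \
    webdavs://user@dav.box.com/dav/fullsystem
```

Splitting into 100 MB volumes is what makes a multi-day upload practical: each volume either arrives whole or is re-sent, so a dropped connection never corrupts the backup set.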

CP -alr (undelete NTFS backup)
cp_sda1.sh >> daily hard-link backups of the NTFS root partitions with 6 days of retention, so if I accidentally move or delete any file/folder, I can restore it from the hard links.
http://forums.debian.net/viewtopic.php?f=3&t=137863
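The hard-link trick behind a cp_sda1.sh-style script can be demonstrated with nothing but coreutils: `cp -al` copies the directory tree but hard-links the files, so each daily "copy" costs almost no space, yet survives deletion of the original. The names below are made up for the demo.

```shell
#!/bin/sh
# Demo of the cp -al hard-link snapshot idea in a throwaway directory.
set -e

work=$(mktemp -d)
mkdir "$work/live"
echo "important" > "$work/live/file.txt"

# "Daily" snapshot: same inodes, new directory entries, near-zero space.
cp -al "$work/live" "$work/day1"

# Accidentally delete the live file...
rm "$work/live/file.txt"

# ...and restore it from the hard-link snapshot.
cp -al "$work/day1/file.txt" "$work/live/file.txt"

cat "$work/live/file.txt"   # prints "important"
rm -rf "$work"
```

Note that a hard-link snapshot protects against deletion and moves, not against in-place edits: both names point at the same inode, so overwriting the file changes the "snapshot" too.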
bester69 wrote:STOP 2030 globalists demons, keep the fight for humanity freedom against NWO...
