Full Cloud Strategy Incremental Backup


Postby bester69 » 2018-06-22 21:04

Hi,

Right now I'm using fsarchiver + Duplicity (sent to Box over WebDAV) to push a full system backup to the cloud (around 7 GB). I usually store a system backup in the cloud once or twice a year.
Now I'm considering using Duplicity directly to send incremental backups, taking advantage of the btrfs system snapshots.

How do you see it? Is making incremental system backups in the cloud a proper and trustworthy backup strategy? :roll:

The idea is to push incremental copies of a read-only btrfs (COW filesystem) snapshot directly to the cloud with Duplicity, roughly as in the sketch below.
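A minimal sketch of what I have in mind: it assumes the root filesystem is a btrfs subvolume, that a /.snapshots directory exists, and that Duplicity already has Box WebDAV credentials; the snapshot path, target URL and passphrase below are placeholders, not my real values.

Code: Select all
#!/bin/bash
# Take a read-only btrfs snapshot of the root subvolume (example path).
SNAP=/.snapshots/root-$(date +%F)
btrfs subvolume snapshot -r / "$SNAP"

# Push it to Box over WebDAV with Duplicity. The first run is a full backup;
# later runs against the same target URL are incremental by default.
# Duplicity encrypts with GnuPG; the passphrase here is only a placeholder.
export PASSPHRASE='change-me'
duplicity --volsize 100 "$SNAP" webdavs://user@dav.box.com/dav/system-backup
unset PASSPHRASE

# Drop the local snapshot once the upload has finished.
btrfs subvolume delete "$SNAP"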

Re: Full Cloud Strategy Incremental Backup

Postby ruffwoof » 2018-06-27 15:06

System files ... IMO I don't need to back them up, as they're easily replaced. System config files ... yes, but they're relatively small. I keep a "script" in which I record all the system configuration changes I've made; sometimes it could be run straight on a fresh reinstall, but more often I just add notes to it and periodically tidy them up so it stays a proper script.

Data, in contrast, is far more valuable. I tend to store data away from system files, i.e. system and home are pretty much just configuration/run areas, and data files are pulled in from elsewhere. As part of that I run an OpenBSD server to which regular backups are made, and then, like you, every once in a while I back that data up to somewhere off-site/geographically disconnected. For those geographically disconnected backups I prefer alternatives to the cloud, i.e. removable disks.

If I were doing cloud backups I'd probably create a squashfs (mksquashfs, after installing squashfs-tools) and encrypt it before uploading it to the cloud, just so that if the cloud storage provider did get hacked the content couldn't be easily viewed; something like the sketch below.
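For example, along these lines; the paths are only illustrative, and it assumes squashfs-tools and gnupg are installed.

Code: Select all
# Pack the data directory into a compressed, read-only squashfs image.
mksquashfs /srv/data backup.sqfs -comp xz

# Encrypt it symmetrically with GnuPG before it leaves the machine;
# this produces backup.sqfs.gpg, which is what actually gets uploaded.
gpg --symmetric --cipher-algo AES256 backup.sqfs

# To restore later: decrypt, then loop-mount the image read-only.
#   gpg --decrypt backup.sqfs.gpg > backup.sqfs
#   mount -o loop,ro backup.sqfs /mnt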

Typically my home/system config backups run through in seconds. Data backups take longer, since they run at LAN speeds, and the disconnected backups take about the same amount of time, perhaps an hour or so (I tend to just go off to lunch and leave them running).

Re: Full Cloud Strategy Incremental Backup

Postby bester69 » 2018-06-27 15:59

ruffwoof wrote:System files ... IMO I don't need to back them up, as they're easily replaced. [...]


Hi, thanks for answering,

I've got quite a few backup layouts on my computer:

BTRFS
- One fixed home snapshot (user data lives in a subvolume folder above it)
- One fixed system snapshot
- A rotating datuser snapshot (several weeks' retention); a rotation sketch is below
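The rotation is roughly like this (the subvolume path, snapshot directory and retention count are examples; my real script differs in the details):

Code: Select all
#!/bin/bash
# Take a dated read-only snapshot of the datuser subvolume.
SNAPDIR=/.snapshots/datuser          # example location
KEEP=4                               # example retention: roughly a month of weeklies
btrfs subvolume snapshot -r /home/datuser "$SNAPDIR/$(date +%F)"

# Delete the oldest snapshots beyond the retention count.
ls -1d "$SNAPDIR"/20* | head -n -"$KEEP" | while read -r old; do
    btrfs subvolume delete "$old"
done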

ATTIC (deduplicating backup)
- dedup.sh >> daily deduplicated backup of the main datauser folders; every few weeks I upload it to the Mega cloud and mirror a copy to Google Drive (a rough sketch is below)
- backres.sh >> deduplicated backup of the whole home, excluding critical files/folders; every few months I upload it to the Box cloud.
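Roughly what dedup.sh does, assuming an Attic repository already initialised with "attic init"; the repository path, archive name, source folders and prune policy are placeholders:

Code: Select all
#!/bin/bash
# Daily deduplicated backup of the main data folders into an Attic repository.
REPO=/backup/attic                   # example repository path
attic create "$REPO::datauser-$(date +%F)" /home/datauser /home/projects

# Keep a bounded history so the repository does not grow without limit.
attic prune "$REPO" --keep-daily 7 --keep-weekly 4 --keep-monthly 6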

FSARCHIVER (filesystem backup) + DUPLICITY (deduplicating backup)
- fsbackup.sh >> every five or six months I create a full system backup, excluding home, on a data partition (around 5 GB)
- dupres.sh >> uses Duplicity (WebDAV) to push that filesystem backup to Box in 100 MB volumes. The upload can be interrupted and resumed without any integrity risk, so within three or four days the full system upload is complete. (An outline of the pair is below.)
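A rough outline of the fsbackup.sh / dupres.sh pair; the device, archive path, exclude pattern and Box WebDAV URL are placeholders for my actual values:

Code: Select all
#!/bin/bash
# fsbackup.sh: archive the system filesystem with fsarchiver, skipping /home.
fsarchiver savefs -z 7 -e '/home/*' /backup/fsa/system.fsa /dev/sda2

# dupres.sh: push the archive folder to Box over WebDAV in 100 MB volumes.
# An interrupted run can simply be restarted; Duplicity resumes the upload.
export PASSPHRASE='change-me'        # placeholder only
duplicity --volsize 100 /backup/fsa webdavs://user@dav.box.com/dav/fsarchiver
unset PASSPHRASE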

CP -alr (undelete backup for the NTFS partition)
- cp_sda1.sh >> daily hard-link backups of the NTFS partitions, with a 6-day retention, so if I accidentally move or delete any file or folder I can restore it from the hard links (a rotation sketch follows below):
viewtopic.php?f=3&t=137863
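The hard-link rotation works roughly like this; the mount point and destination are examples, and both must be on the same filesystem for the hard links to work:

Code: Select all
#!/bin/bash
# cp_sda1.sh: keep 6 daily hard-link copies of a data directory on the partition.
SRC=/mnt/sda1/data                   # example source directory
DST=/mnt/sda1/.undelete              # example destination on the same filesystem
mkdir -p "$DST"

# Shift the older copies: day.5 is dropped, day.4 becomes day.5, and so on.
rm -rf "$DST/day.5"
for i in 4 3 2 1 0; do
    [ -d "$DST/day.$i" ] && mv "$DST/day.$i" "$DST/day.$((i+1))"
done

# New daily copy: every file is hard-linked, so it costs almost no extra space.
cp -al "$SRC" "$DST/day.0"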

