
Sysadmin rclone backup scripts

Discussion in 'System Administration' started by Sunka, Oct 30, 2016.

  1. Sunka

    Sunka Well-Known Member

    All day I have been trying to create a script to rclone several folders from the server.
    I ended up with this configuration.

    Btw, I have permissions backed up on a daily basis and they are saved in /home/nginx/domains/pijanitvor.com/, so they are backed up too.
    I cannot tar the directories first, because there would not be enough space for both the files and the tarred files, so I am using rclone on uncompressed files.
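
    (Side note: newer rclone versions include an "rclone rcat" command that reads from standard input, so in theory a tar stream could be uploaded without needing local space for the archive. A rough, untested sketch, with an example target path on the same remote:)

    Code:
    # Stream a gzipped tar straight to the remote; no local .tar.gz is written.
    # Requires an rclone version that ships the rcat command.
    tar czf - /home/nginx/domains/pijanitvor.com | /usr/sbin/rclone rcat ptdrive:'##PT - Rclone'/archives/pijanitvor.com.tar.gz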

    It is a pity that incremental backups are not an option in rclone like they are in rsync, so my script is just a classic "add another command" affair.
    Also, there is still(?) a bug where sync does not delete empty directories on the remote server.
    Also, it is recommended (I do not know whether that bug has been fixed or not) to delete and create directories on the remote server with the classic commands (purge and mkdir).
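
    A possible workaround for the empty-directory issue is the "rclone rmdirs" command (present in newer rclone versions), which removes empty directories under a path; an untested sketch, run after the sync:

    Code:
    # Clean up empty directories left behind on the remote after a sync.
    /usr/sbin/rclone rmdirs ptdrive:'##PT - Rclone'/0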

    So, I have 4 directories where everything (I think) that needs to be backed up is covered.
    First I run the initial "0 days old" backup of all 4 directories.
    Because there are 10,000 attachments in there, this backup is still running (more than 4 hours so far, and the whole backup will be about 37 GB), but I started it through cron with a bash script, so I can close the CLI session and shut down my computer.

    Code:
    /usr/sbin/rclone sync /home/nginx/domains/pijanitvor.com/ ptdrive:'##PT - Rclone'/0/pijanitvor.com
    /usr/sbin/rclone sync /usr/local/nginx/conf/ ptdrive:'##PT - Rclone'/0/conf
    /usr/sbin/rclone sync /mnt/xenforo-data/ ptdrive:'##PT - Rclone'/0/xenforo-data
    /usr/sbin/rclone sync /var/lib/redis/ ptdrive:'##PT - Rclone'/0/redis
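
    (For a long one-off run like this, an alternative to a temporary cron entry is starting the script detached from the terminal with nohup; a sketch, where /root/rclone-initial.sh stands in for whatever the script is actually called:)

    Code:
    # Run the initial backup script detached from the SSH session and log its
    # output, so the terminal can be closed while the transfer keeps going.
    nohup bash /root/rclone-initial.sh > /root/rclone-initial.log 2>&1 &
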
    When that is finished, I will manually create another 13 folders on Gdrive, so I will have folders 0-13. The folder number shows how many days old that backup is (similar to what rsync setups do), and after that I will create the script below and make it run via cron once a day.

    Code:
    #!/bin/bash
    /usr/sbin/rclone purge ptdrive:'##PT - Rclone'/13
    /usr/sbin/rclone mkdir ptdrive:'##PT - Rclone'/13
    /usr/sbin/rclone copy ptdrive:'##PT - Rclone'/12/ ptdrive:'##PT - Rclone'/13
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/11/ ptdrive:'##PT - Rclone'/12
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/10/ ptdrive:'##PT - Rclone'/11
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/9/ ptdrive:'##PT - Rclone'/10
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/8/ ptdrive:'##PT - Rclone'/9
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/7/ ptdrive:'##PT - Rclone'/8
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/6/ ptdrive:'##PT - Rclone'/7
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/5/ ptdrive:'##PT - Rclone'/6
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/4/ ptdrive:'##PT - Rclone'/5
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/3/ ptdrive:'##PT - Rclone'/4
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/2/ ptdrive:'##PT - Rclone'/3
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/1/ ptdrive:'##PT - Rclone'/2
    /usr/sbin/rclone sync ptdrive:'##PT - Rclone'/0/ ptdrive:'##PT - Rclone'/1
    /usr/sbin/rclone sync /home/nginx/domains/pijanitvor.com/ ptdrive:'##PT - Rclone'/0/pijanitvor.com
    /usr/sbin/rclone sync /usr/local/nginx/conf/ ptdrive:'##PT - Rclone'/0/conf
    /usr/sbin/rclone sync /mnt/xenforo-data/ ptdrive:'##PT - Rclone'/0/xenforo-data
    /usr/sbin/rclone sync /var/lib/redis/ ptdrive:'##PT - Rclone'/0/redis
    It needs to run in that order, from 13 down to 0, so each backup gets moved into the directory that is one day older. In the end, that gives backups for the last 2 weeks, one day per folder, about 500 GB in total.
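
    The same 13-to-0 rotation can also be written as a loop, which keeps the script short if the retention period ever changes; an untested sketch using the same remote layout:

    Code:
    #!/bin/bash
    REMOTE="ptdrive:##PT - Rclone"

    # Drop the oldest day, recreate its directory, then seed it from day 12.
    /usr/sbin/rclone purge "$REMOTE/13"
    /usr/sbin/rclone mkdir "$REMOTE/13"
    /usr/sbin/rclone copy "$REMOTE/12/" "$REMOTE/13"

    # Shift every remaining day one slot older (11 -> 12, 10 -> 11, ..., 0 -> 1).
    for day in $(seq 12 -1 1); do
      /usr/sbin/rclone sync "$REMOTE/$((day - 1))/" "$REMOTE/$day"
    done

    # Finally refresh day 0 from the live directories.
    /usr/sbin/rclone sync /home/nginx/domains/pijanitvor.com/ "$REMOTE/0/pijanitvor.com"
    /usr/sbin/rclone sync /usr/local/nginx/conf/ "$REMOTE/0/conf"
    /usr/sbin/rclone sync /mnt/xenforo-data/ "$REMOTE/0/xenforo-data"
    /usr/sbin/rclone sync /var/lib/redis/ "$REMOTE/0/redis"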

    Also, I back up the same folders (a 7-day rotation) on another server with rsync (incremental backup).

    I hope this will be helpful to someone.

    Any suggestions, thoughts, bugs?

     
  2. Sunka

    Sunka Well-Known Member

    But this is my rclone usage.

    This is a bash script for rclone, run once a day through cron. It keeps 14 days of backups.
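
    For reference, a daily cron entry for that could look something like this (a sketch; the script path and the time of day are just examples):

    Code:
    # /etc/crontab entry: run the rotation script every day at 03:30 as root.
    30 3 * * * root /bin/bash /root/rclone-rotate.sh >> /var/log/rclone-rotate.log 2>&1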

    There are also two rclone "bugs" that I found on the rclone forum; I think it is worth mentioning them here so other users keep them in mind.

    I posted it here so someone can maybe use it, or it can serve as a starting point for another script.
    Feel free to move these few posts of mine from this thread to another one (y)
     
  3. eva2000

    eva2000 Administrator Staff Member

    some hints for ways of backing up via shell script (google for more info)

    using a for loop array
    Code (Text):
    BACKUPDIR_SOURCES=(/home/nginx/domains/demodomain.com /usr/local/nginx/conf)

    for d in "${BACKUPDIR_SOURCES[@]}"; do
      echo "backup: $d"
    done

    Resulting output would be
    Code (Text):
    backup: /home/nginx/domains/demodomain.com
    backup: /usr/local/nginx/conf
    
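
    Applied to the rclone commands from the earlier posts, the loop could look like this (an untested sketch reusing the ptdrive remote layout from above):

    Code (Text):
    BACKUPDIR_SOURCES=(/home/nginx/domains/pijanitvor.com /usr/local/nginx/conf /mnt/xenforo-data /var/lib/redis)

    for d in "${BACKUPDIR_SOURCES[@]}"; do
      # Use each directory's base name as the destination folder on the remote.
      /usr/sbin/rclone sync "$d" "ptdrive:##PT - Rclone/0/$(basename "$d")"
    done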
     
  4. eva2000

    eva2000 Administrator Staff Member

    another hint, for determining the age of a directory on the remote storage

    Google Drive remote = gdrive1 with remote directory called centos7.localdomain
    Code (Text):
    rclone -q lsd gdrive1:
    -1 2016-10-28 23:08:06 -1 centos7.localdomain
    

    Getting just the directory's date/time
    Code (Text):
    rclone -q lsd gdrive1: | awk '{print $2,$3}'
    2016-10-28 23:08:06
    

    Converting that date/time to unix epoch time
    Code (Text):
    date "+%s" -d "$(rclone -q lsd gdrive1: | awk '{print $2,$3}')"
    1477696086
    

    determining remote directory's age
    Code (Text):
    today=$(date +%s)
    remote_date=$(date "+%s" -d "$(rclone -q lsd gdrive1: | awk '{print $2,$3}')")
    remote_age=$(($today-$remote_date))
    remote_dirname=$(rclone -q lsd gdrive1: | awk '{print $5}')
    echo "Remote directory $remote_dirname is $remote_age seconds old"
    Remote directory centos7.localdomain is 89682 seconds old
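
    Putting the pieces together, the age check could drive an automatic cleanup, for example purging the remote directory once it is older than 14 days (a rough sketch; assumes a single top-level directory on the gdrive1 remote, as in the listing above):

    Code (Text):
    #!/bin/bash
    today=$(date +%s)
    remote_date=$(date "+%s" -d "$(rclone -q lsd gdrive1: | awk '{print $2,$3}')")
    remote_age=$(($today-$remote_date))
    remote_dirname=$(rclone -q lsd gdrive1: | awk '{print $5}')

    # Purge the remote directory if it is older than 14 days (1209600 seconds).
    if [ "$remote_age" -gt 1209600 ]; then
      echo "purging $remote_dirname (age: $remote_age seconds)"
      rclone purge "gdrive1:$remote_dirname"
    fi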