Scheduled website data backup scripts (keeping the last 30 backups)


Backups are one of the most basic day-to-day tasks for operations staff, and doing them well is a key part of running systems reliably. Below are two simple backup scripts I have used:

1) Website data backup
Back up the site data under /var/www/vhosts/www.kevin.com and /var/www/vhosts/www.grace.com to
/Data/code-backup/www.kevin.com and /Data/code-backup/www.grace.com respectively.

[root@huanqiu_web5 code-backup]# cat web_code_backup.sh
#!/bin/bash
  
# back up the website data
/bin/tar -zvcf /Data/code-backup/www.kevin.com/www.kevin.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.kevin.com
/bin/tar -zvcf /Data/code-backup/www.grace.com/www.grace.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.grace.com
  
# delete backup files older than one week
find /Data/code-backup/www.kevin.com -type f -mtime +7 -exec rm -f {} \;
find /Data/code-backup/www.grace.com -type f -mtime +7 -exec rm -f {} \;
  
[root@huanqiu_web5 ~]# crontab -l
# back up website data at 5:00 every morning
0 5 * * * /bin/bash -x /Data/code-backup/web_code_backup.sh > /dev/null 2>&1
 
The result after a backup run:
[root@huanqiu_web5 ~]# ls /Data/code-backup/www.kevin.com/
www.kevin.com_20170322_174328.tar.gz
[root@xqsj_web5 ~]# ls /Data/code-backup/www.grace.com/
www.grace.com_20170322_174409.tar.gz
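The two tar/find pairs above can be folded into a single reusable function. A minimal sketch, where `backup_site` is an illustrative helper (not part of the original script) and the paths in the usage comment mirror the ones above:

```shell
#!/bin/bash
# backup_site SRC DST: tar SRC into DST with a timestamp in the archive
# name, then prune archives in DST older than 7 days.
backup_site() {
    local src="$1" dst="$2"
    local ts
    ts=$(date +%Y%m%d_%H%M%S)
    mkdir -p "$dst"
    # -C keeps the archive paths relative to the parent of SRC
    tar -zcf "${dst}/$(basename "$src")_${ts}.tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
    find "$dst" -type f -name '*.tar.gz' -mtime +7 -exec rm -f {} \;
}

# e.g. backup_site /var/www/vhosts/www.kevin.com /Data/code-backup/www.kevin.com
```

Adding a new site then only requires one extra `backup_site` call instead of a new tar line and a new find line.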

2) Database backup (automatically deleting backup files older than 10 days)
The database runs on Alibaba Cloud MySQL. A scheduled full backup is pulled remotely to the local machine, just in case. Remotely dumped MySQL data is best packed and compressed.

[root@huanqiuPC crontab]# pwd
/Data/Mysql_Bakup/crontab
[root@huanqiuPC crontab]# cat backup_db_wangshibo.sh
#!/bin/bash
MYSQL="/usr/bin/mysql"
MYSQLDUMP="/usr/bin/mysqldump"
BACKUP_DIR="/Data/Mysql_Bakup"
#DB_SOCKET="/var/lib/mysql/mysql.sock"
DB_hostname="110.120.11.9"
DBNAME="wangshibo"
DB_USER="db_wangshibo"
DB_PASS="mhxzk3rfzh"
TIME=`date +%Y%m%d%H%M%S`
LOCK_FILE="${BACKUP_DIR}/lock_file.tmp"
BKUP_LOG="${BACKUP_DIR}/${TIME}_bkup.log"
DEL_BAK=`date -d '10 days ago' '+%Y%m%d'`
## bail out if a previous run is still holding the lock file
if [[ -f $LOCK_FILE ]];then
    exit 255
else
    echo $$ > $LOCK_FILE
fi

##dump databases##
echo ${TIME} >> ${BKUP_LOG}
echo "=======Start Bakup============" >>${BKUP_LOG}
${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} |gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
echo "=======Finished Bakup============" >>${BKUP_LOG}
/bin/rm -f ${LOCK_FILE}

##del back 10 days before##
/bin/rm -f ${BACKUP_DIR}/${DEL_BAK}*.gz
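One weakness of the lock handling above: if the dump is killed mid-run, the stale lock_file blocks every later run until someone deletes it by hand. A sketch of the same guard with a trap that guarantees cleanup; `with_lock` is an illustrative helper, not part of the original script:

```shell
#!/bin/bash
# with_lock LOCKFILE CMD...: refuse to run if LOCKFILE exists, otherwise
# create it, run CMD, and remove the lock on any exit (normal or killed).
with_lock() {
    local lock="$1"; shift
    if [ -f "$lock" ]; then
        return 255                   # another run is still active
    fi
    echo $$ > "$lock"
    trap "rm -f '$lock'" EXIT        # fires even on failure or SIGTERM
    "$@"
}

# e.g. with_lock /Data/Mysql_Bakup/lock_file.tmp /bin/bash /path/to/dump_cmd.sh
```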

Schedule the backup:

[root@huanqiuPC Mysql_Bakup]# crontab -l
10 0,6,12,18 * * * /bin/bash /Data/Mysql_Bakup/crontab/backup_db_wangshibo.sh >/dev/null 2>&1

The backup produced after the script runs:

[root@huanqiuPC crontab]# cd /Data/Mysql_Bakup
[root@huanqiuPC Mysql_Bakup]# ls
20161202061001.wangshibo.gz

Syncing the production database to the beta database (overwriting the beta database):
Copy the scheduled backup archive above to the beta machine, unpack it, log in to MySQL, and apply it manually with the source command.
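As a rough sketch of that manual restore, where the host and user names are placeholders (none of them come from the scripts above): `restore_dump` decompresses a copied archive and prints the command that would apply it.

```shell
#!/bin/bash
# restore_dump FILE.gz: decompress a backup copied over from the backup
# host and print the command that would apply it to the beta database.
# Host "beta-db" and user "beta_user" are placeholders. Because the dump
# was taken with --databases, it carries its own CREATE DATABASE/USE
# statements and overwrites the target database when applied.
restore_dump() {
    local gz="$1" sql="${1%.gz}"
    gzip -dc "$gz" > "$sql"
    echo "mysql -h beta-db -u beta_user -p < $sql"
}

# e.g. scp backup:/Data/Mysql_Bakup/20161202061001.wangshibo.gz /tmp/ && \
#      restore_dump /tmp/20161202061001.wangshibo.gz
```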

Another example

[root@backup online_bak]# cat rsync.sh      (the sync in this script is rate-limited to 3MB/s and keeps roughly the last month of backups)
#!/bin/bash

# ehr data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.27/tomcat_data/
NUM1=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I1=$( /usr/bin/expr $NUM1 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I1 p"|xargs rm -rf

# zp data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/data/tomcat8/webapps /data/bak/online_bak/192.168.34.33/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.33/tomcat_data/
NUM2=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I2=$( /usr/bin/expr $NUM2 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I2 p"|xargs rm -rf

cd /data/bak/online_bak/192.168.34.33/upload
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/home/zrx_hr/upload /data/bak/online_bak/192.168.34.33/upload/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.33/upload
NUM3=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I3=$( /usr/bin/expr $NUM3 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I3 p"|xargs rm -rf

# zabbix mysql backup----------------------------------------------------------
/bin/mkdir /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`
/data/mysql/bin/mysqldump -hlocalhost -uroot -pBKJK-@@@-12345 --databases zabbix > /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`/zabbix.sql

cd /data/bak/online_bak/192.168.16.21/mysql_data/
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.16.21/mysql_data/
NUM4=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I4=$( /usr/bin/expr $NUM4 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I4 p"|xargs rm -rf
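The four pruning blocks above hard-code `grep 2017`, so they silently stop matching once the year rolls over to 2018. A year-agnostic sketch of the same keep-the-newest-30 logic; `keep_newest` is an illustrative helper, and `head -n -N` assumes GNU coreutils:

```shell
#!/bin/bash
# keep_newest DIR N: delete all but the N newest date-named entries in DIR.
# Names stamped like 20170819.tar.gz sort chronologically as plain strings,
# so an ordinary sort works and no year needs to be hard-coded.
keep_newest() {
    local dir="$1" keep="$2"
    ls -1 "$dir" | grep -E '^[0-9]{8}' | sort | head -n -"$keep" | \
        while read -r f; do rm -rf "${dir:?}/$f"; done
}

# e.g. keep_newest /data/bak/online_bak/192.168.34.27/tomcat_data 30
```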

[root@backup online_bak]# pwd
/data/bak/online_bak
[root@backup online_bak]# ls
192.168.16.21    rsync.sh
192.168.34.27  192.168.34.33  
[root@backup online_bak]# ll
total 10K
drwxr-xr-x   3 root root   23 Aug 19 17:47 192.168.16.21
drwxr-xr-x   4 root root   41 Aug 19 18:30 192.168.34.27
drwxr-xr-x   4 root root   37 Aug 19 18:17 192.168.34.33
-rwxr-xr-x   1 root root 6.3K Aug 19 19:20 rsync.sh

[root@backup online_bak]# ll 192.168.16.21/
total 4.0K
drwxr-xr-x  2 root root   28 Aug 19 19:43 mysql_data

[root@backup online_bak]# ll 192.168.16.21/mysql_data/
total 1.5G
-rw-r--r-- 1 root root 1.5G Aug 19 19:43 20170819.tar.gz

[root@backup online_bak]# ll 192.168.34.27
total 4.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data

[root@backup online_bak]# ll 192.168.34.27/tomcat_data/
total 3.9G
......
-rw-r--r-- 1 root root 140M Aug 19 11:06 20170818.tar.gz
-rw-r--r-- 1 root root 140M Aug 19 19:26 20170819.tar.gz

[root@backup online_bak]# ll 192.168.34.33
total 8.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data
drwxr-xr-x  2 root root   28 Aug 19 19:30 upload

[root@backup online_bak]# crontab -l
# online backup
0 2 * * * /bin/bash -x /data/bak/online_bak/rsync.sh >/dev/null 2>&1


Sort the entries in a directory by modification time and pick the most recently modified one:
[work@qd-op-comm01 xcspam]$ ls
bin                    xcspam-20170802145542  xcspam-20170807204545  xcspam-20170814115753  xcspam-20170818115806  xcspam-20170824162641  xcspam-20170831173616  
xcspam                 xcspam-20170802194447  xcspam-20170808163425  xcspam-20170815191150  xcspam-20170821122949  xcspam-20170824165020  xcspam-20170831191347
xcspam-20170731154018  xcspam-20170803113809  xcspam-20170808195340  xcspam-20170815210032  xcspam-20170821153300  xcspam-20170829100941  xcspam-20170904105109
xcspam-20170801190647  xcspam-20170807150022  xcspam-20170809103648  xcspam-20170816141022  xcspam-20170822173600  xcspam-20170831135623  xcspam-20170911120519
xcspam-20170802142921  xcspam-20170807164137  xcspam-20170809111246  xcspam-20170816190704  xcspam-20170823101913  xcspam-20170831160115  xcspam-20170911195802
[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -1
xcspam-20170911195802

[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -2|head -1   # the second-to-last modified entry
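The same pair of commands generalizes to picking the Nth most recently modified entry. A small sketch, where `nth_newest` is an illustrative helper:

```shell
#!/bin/bash
# nth_newest PREFIX N: print the Nth most recently modified entry whose
# name starts with PREFIX (N=1 is the newest). ls -t sorts newest first,
# so sed selects the wanted row directly instead of a tail|head pipeline.
nth_newest() {
    local prefix="$1" n="$2"
    ls -td "${prefix}"* 2>/dev/null | sed -n "${n}p"
}

# nth_newest xcspam- 2    # the second most recently modified xcspam-* entry
```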


Automatically delete backup data beyond the last 30 copies, i.e. keep the most recent 30 backups. The script below can serve as a general-purpose rotation script:

[root@qw-backup01 caiwu]# cat delete_30days_before.sh 
#!/bin/bash

cd `pwd`    # effectively a no-op: the script operates on the current directory

NUM=`ls -l|awk '{print $9}'|wc -l`
I=$( /usr/bin/expr $NUM - 31 )
ls -l|awk '{print $9}'|sed -n "1,$I p"|xargs rm -rf

[root@qw-backup01 caiwu]# ls
201901100.des3  20190141.des3  20190150.des3  20190159.des3  20190168.des3  20190177.des3  20190186.des3  20190195.des3
20190133.des3   20190142.des3  20190151.des3  20190160.des3  20190169.des3  20190178.des3  20190187.des3  20190196.des3
20190134.des3   20190143.des3  20190152.des3  20190161.des3  20190170.des3  20190179.des3  20190188.des3  20190197.des3
20190135.des3   20190144.des3  20190153.des3  20190162.des3  20190171.des3  20190180.des3  20190189.des3  20190198.des3
20190136.des3   20190145.des3  20190154.des3  20190163.des3  20190172.des3  20190181.des3  20190190.des3  20190199.des3
20190137.des3   20190146.des3  20190155.des3  20190164.des3  20190173.des3  20190182.des3  20190191.des3  delete_30days_before.sh
20190138.des3   20190147.des3  20190156.des3  20190165.des3  20190174.des3  20190183.des3  20190192.des3
20190139.des3   20190148.des3  20190157.des3  20190166.des3  20190175.des3  20190184.des3  20190193.des3
20190140.des3   20190149.des3  20190158.des3  20190167.des3  20190176.des3  20190185.des3  20190194.des3

Run the script:
[root@qw-backup01 caiwu]# sh -x delete_30days_before.sh 
+ cd /data/backup/caiwu
++ ls -l
++ awk '{print $9}'
++ wc -l
+ NUM=70
++ /usr/bin/expr 70 - 31
+ I=39
+ ls -l
+ awk '{print $9}'
+ sed -n '1,39 p'
+ xargs rm -rf

Checking again, only the most recent 30 backups remain:
[root@qw-backup01 caiwu]# ls
20190170.des3  20190174.des3  20190178.des3  20190182.des3  20190186.des3  20190190.des3  20190194.des3  20190198.des3
20190171.des3  20190175.des3  20190179.des3  20190183.des3  20190187.des3  20190191.des3  20190195.des3  20190199.des3
20190172.des3  20190176.des3  20190180.des3  20190184.des3  20190188.des3  20190192.des3  20190196.des3  delete_30days_before.sh
20190173.des3  20190177.des3  20190181.des3  20190185.des3  20190189.des3  20190193.des3  20190197.des3
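For reference, a slightly more defensive variant of delete_30days_before.sh: take the directory and count as arguments, list bare names with `ls -1` (so the "total" row of `ls -l` cannot skew the count), and match only the date-named .des3 files so the script can never delete itself. `keep_last` is an illustrative name, and `head -n -N` assumes GNU coreutils:

```shell
#!/bin/bash
# keep_last DIR N: keep the N newest date-named .des3 backups in DIR and
# delete the rest; other files (including this script) are left alone.
keep_last() {
    local dir="${1:?usage: keep_last DIR N}" keep="$2"
    ls -1 "$dir" | grep -E '^[0-9]+\.des3$' | sort | head -n -"$keep" | \
        while read -r f; do rm -f "${dir:?}/$f"; done
}

# e.g. keep_last /data/backup/caiwu 30
```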
posted @ 2016-12-07 19:09  散尽浮华