A shell script for scheduled daily downloads of GFS data

In numerical weather prediction applications, various input datasets have to be downloaded regularly, and NCEP's GFS data is one of the most commonly used analysis fields. For operational runs it must be fetched from the NCEP site at fixed times every day, so I wrote a shell script to automate the job. The script is shown below, followed by a sample cron setup:

#!/bin/bash

export LANG=C
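# use the C locale so the downloader messages parsed by grep below are not localized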

# date setting
if [ $# -eq 0 ]; then
  echo "+++++ Error: missing cycle hour argument! +++++"
  echo "  Usage: $0 00/06/12/18 [yyyymmdd]"
  exit 1
else
  hh=$1
  shift
fi

# compare as strings: numeric -ne would choke on non-numeric input
if [ "$hh" != "00" ] && [ "$hh" != "06" ] && [ "$hh" != "12" ] && [ "$hh" != "18" ]; then
  echo "+++++ Error: invalid cycle hour '$hh'! +++++"
  echo "  Usage: $0 00/06/12/18 [yyyymmdd]"
  exit 1
fi

# optional second argument: run date as yyyymmdd (defaults to today, UTC)
if [ $# -ne 0 ]; then
  rundate=$1
else
  rundate=$(date -u +%Y%m%d)
fi

gdate=$rundate

# download gfs files to a tmp dir; /dev/shm is RAM-backed tmpfs, so
# partial downloads never touch disk
tmppath=/dev/shm/gfs/${gdate}${hh}
rm -rf "$tmppath" && mkdir -p "$tmppath"
cd "$tmppath" || exit 1
for i in $(seq 0 6 72)
do
  filepath=ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.${gdate}${hh}
  filename=gfs.t${hh}z.pgrb2f$(printf %02d $i)
  # use axel as the first-choice downloader, fall back to prozilla, then wget
  /usr/local/bin/axel -n 8 -v ${filepath}/${filename} >& log.${filename}
  if ! grep -q "^Downloaded.*megabytes" log.${filename}; then
    echo "axel download failed, retrying with prozilla"
    /usr/local/bin/proz -k=8 -r -f --no-curses --no-netrc --no-getch -v ${filepath}/${filename} >& log.${filename}
    if ! grep -q "All Done" log.${filename}; then
      /usr/bin/wget ${filepath}/${filename} >& log.${filename}
    fi
  fi
done

gfspath=/data/gfs/${gdate}${hh}
mkdir -p "$gfspath"

# move downloaded gfs files to the final dir, warning on missing files
cd "$tmppath" || exit 1
for i in $(seq 0 6 72)
do
  filename=gfs.t${hh}z.pgrb2f$(printf %02d $i)
  if [ -s "$filename" ]; then
    cp "$filename" "$gfspath" && rm -f "$filename"
  else
    echo "warning: ${filename} is missing or empty, all downloaders failed"
  fi
done
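
# optional (sketch): verify each moved file is readable GRIB2; wgrib2 exits
# non-zero on a truncated or corrupt file. The wgrib2 path below is an
# assumption; adjust it to the local installation.
for i in $(seq 0 6 72)
do
  f=$gfspath/gfs.t${hh}z.pgrb2f$(printf %02d $i)
  /usr/local/bin/wgrib2 "$f" -s > /dev/null || echo "warning: $f failed GRIB2 check"
done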

echo
echo "GFS data files downloaded successfully!"
date
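
To run this automatically every day, call the script from cron, once per GFS cycle. A minimal crontab sketch, assuming the script is installed as /home/op/dl_gfs.sh (the path and start times are placeholders; each entry is delayed a few hours past the nominal cycle time because NCEP publishes the files with a lag):

# m h  dom mon dow  command
30 03  *   *   *    /home/op/dl_gfs.sh 00 >> /home/op/log/dl_gfs.log 2>&1
30 09  *   *   *    /home/op/dl_gfs.sh 06 >> /home/op/log/dl_gfs.log 2>&1
30 15  *   *   *    /home/op/dl_gfs.sh 12 >> /home/op/log/dl_gfs.log 2>&1
30 21  *   *   *    /home/op/dl_gfs.sh 18 >> /home/op/log/dl_gfs.log 2>&1

Since the script derives its default run date with date -u (UTC), either run cron on a machine set to UTC as well or pass the yyyymmdd argument explicitly, to avoid a date mismatch around 00 UTC.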
