Multi-part simultaneous download of a single file with cURL, with resume support (revised)
Taken from http://bbs.chinaunix.net/thread-917952-1-1.html
Tested on Ubuntu; suited to downloading from sites that allow several simultaneous connections per file.
Can be used together with FlashGot in Firefox.
Usage: ./mycurl url [refererUrl]
The first argument, url, is the address of the file to download. The second, refererUrl, is a referring page to pass along (usually unnecessary, but some sites, such as Huajun, require it).
For example:
./mycurl ftp://xx.xxx.xxx/xxx.rar
or
./mycurl http://xx.xxx.xx/xxx.rar http://www.xxx.xxx/yy.htm
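Before running the script, it helps to confirm that the server honors byte-range requests at all. A minimal pre-flight sketch (the URL below is a placeholder) looks for the Accept-Ranges header in a HEAD response:

#!/bin/bash
# Hypothetical pre-flight check: servers that allow multi-part download
# normally answer a HEAD request with "Accept-Ranges: bytes".
url="http://xx.xxx.xx/xxx.rar"
if curl -s -I "$url" | grep -qi "accept-ranges: *bytes"; then
  echo "server honors byte ranges; multi-part download should work"
else
  echo "no Accept-Ranges header found; the script may not help here"
fi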
The code follows:
#!/bin/bash
####################################################################
#
# Script for curl to support resumable multi-part download.
#
# Tested on Ubuntu
#
url="$1"
# How many "parts" will the target file be divided into?
declare -i parts=5
read -ep "Please input the target directory: " targetdir
read -ep "Please input the outfile name: " outfile
[ -z "$targetdir" ] && targetdir="./"
cd "$targetdir" || exit 2
[ -z "$outfile" ] && outfile=$(basename "$url")
#Set the referer url
if [ -n "$2" ]; then
refurl="-L -e $2"
else refurl=""
fi
length=$(curl $refurl -s -I "$url" | grep -i Content-Length | tail -n 1 | sed 's/[^0-9]//g')
if [ -z "$length" ]; then
echo "cannot get the length of the target file"
exit 1
fi
let "length = $length"
#lsession is used to record how many bytes of each subpart should be downloaded
declare -i lsession=$(($length/$parts))
finished="false"
#Assume the available maximum connections on server can reach "parts" at first
maxconn=$parts
while true;
do
for (( i=1; i<=parts ; i=i+1 ))
do
#Array offsetold is used to record how many bytes have been downloaded of each subpart
if [ -e "$outfile$i" ]; then
offsetold[$i]=$(ls -l "$outfile$i" | awk '{print $5}')
else offsetold[$i]=0
fi
let "offsetold[$i] = ${offsetold[$i]}"
done
curr=0
for (( i=1; i<=parts && maxconn>0; i=i+1 ))
do
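# Part $i covers bytes [curr, curr+lsession-1] of the file (the last part
# runs to the end); downloading resumes at curr+offsetold[i], i.e. just
# past what is already in $outfile$i, and appends to that part file.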
if [ $i -lt $parts ]; then
if [ ${offsetold[$i]} -lt $lsession ]; then
curl $refurl -r $(($curr+${offsetold[$i]}))-$(($curr+$lsession-1)) "$url" >> "$outfile$i" &
maxconn=$(($maxconn-1))
fi
else
if [ ${offsetold[$i]} -lt $(($length-$lsession*($parts-1))) ]; then
curl $refurl -r $(($curr+${offsetold[$i]}))- "$url" >> "$outfile$i" &
maxconn=$(($maxconn-1))
fi
fi
curr=$(($curr+$lsession))
done
#To wait for all curl processes to terminate.
wait
finished="true"
maxconn=0
for (( i=1; i<=parts; i=i+1 ))
do
#Array offsetnew is used to record how many bytes have been downloaded of each subpart
if [ -e "$outfile$i" ]; then
offsetnew[$i]=$(ls -l "$outfile$i" | awk '{print $5}')
else offsetnew[$i]=0
fi
let "offsetnew[$i] = ${offsetnew[$i]}"
if [ $i -lt $parts ]; then
if [ ${offsetnew[$i]} -lt $lsession ]; then
finished="false"
fi
else
if [ ${offsetnew[$i]} -lt $(($length-$lsession*($parts-1))) ]; then
finished="false"
fi
fi
#Calculate the "real" available maximum connections supported by server
if [ ${offsetnew[$i]} -gt ${offsetold[$i]} ]; then
maxconn=$(($maxconn+1))
fi
done
if [ "$finished" == "true" ]; then
break
elif [ $maxconn -eq 0 ]; then
echo "Some errors may occur. retry 10 sec later..."
sleep 10
maxconn=parts
fi
done
echo "All parts have been downloaded. Merging..."
mv --backup=t "${outfile}1" "$outfile"
for (( i=2; i<=parts; i=i+1))
do
cat "$outfile$i" >> "$outfile"
rm "$outfile$i"
done
echo "Done."
#3, replying to #2 一梦如是:
Thanks for the pointers, still learning... I'll go try axel as well.
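For reference, axel bundles the same multi-connection idea into a single command; a hypothetical invocation (-n sets the connection count, -o the output file name):
axel -n 5 -o xxx.rar http://xx.xxx.xx/xxx.rar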
#5: Wow, +1!
#7, replying to mmx384 (posted 2007-4-1 19:28):
I originally wrote this script to pair with FlashGot in Firefox, heh. I use wget a lot too.
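For comparison, plain wget can resume an interrupted download with -c, though over a single connection:
wget -c http://xx.xxx.xx/xxx.rar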
#9, replying to baif (posted 2007-4-1 23:38):
Thanks. With the library they provide, one can build friendlier and more capable tools; I'll give it a try when I have time.