Proxy IPs for Web Crawlers

Notes on using proxy IPs when crawling: where to get them, how to verify them, and several ways to route a request through one.

A site for obtaining proxy IPs:

http://www.xicidaili.com/
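
The snippets below assume each proxy has already been scraped into a pair i = (host, port). A rough sketch of collecting such pairs follows; the regex and the hypothetical fetch_proxies helper are assumptions about the list page's markup, not a guaranteed fit:

import re
import requests

def fetch_proxies(list_url="http://www.xicidaili.com/"):
    # Assumes the IP and the port sit in adjacent <td> cells; adjust the
    # pattern to the page's actual HTML if it differs.
    html = requests.get(list_url, headers={"User-Agent": "Mozilla/5.0"}, timeout=5).text
    return re.findall(r"<td>(\d+\.\d+\.\d+\.\d+)</td>\s*<td>(\d+)</td>", html)

proxy_list = fetch_proxies()           # list of (host, port) string pairs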

One way to verify that a proxy is usable:

globalUrl = "http://ip.chinaz.com/getip.aspx"
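
That page echoes back the IP the server sees, so a proxy works if the response shows the proxy's address instead of your own. A minimal sketch; the header dict (reused by the snippets below) is an assumed browser-like header, not something the page requires:

import requests

globalUrl = "http://ip.chinaz.com/getip.aspx"
header = {"User-Agent": "Mozilla/5.0"}

def check_proxy(host, port):
    proxies = {"http": "http://%s:%s" % (host, port)}
    try:
        text = requests.get(globalUrl, headers=header, proxies=proxies, timeout=3).text
        return host in text            # the echoed IP should be the proxy's
    except Exception:
        return False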

How to use the proxy:

1. Using requests:

ip = "http://" + i[0]+":"+i[1]
requests.get(globalUrl,headers = header,proxies = ipdict,timeout = 3).text
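
In practice you would run this over every scraped pair and keep the ones that answer; a short sketch reusing the hypothetical proxy_list and check_proxy from above:

working = [p for p in proxy_list if check_proxy(p[0], p[1])]
print "%d of %d proxies usable" % (len(working), len(proxy_list))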

2. Using urllib2:

ip = "http://" + i[0]+":"+i[1]
try:
except Exception,e:
proxy_info = {'host': i[0],
proxy_support = urllib2.ProxyHandler({"http":"http://%(host)s:%(port)s" % proxy_info})
urllib2.install_opener(opener)
3. Using urllib2 with a proxy that requires authentication:

import urllib2

proxy_info = {"host": "xxx",
              "port": 8080,            # example port; substitute your proxy's
              "user": "xxx",
              "pass": "xxx"}
proxy_support = urllib2.ProxyHandler({"http": "http://%(user)s:%(pass)s@%(host)s:%(port)d" % proxy_info})
opener = urllib2.build_opener(proxy_support)
urllib2.install_opener(opener)
try:
    print urllib2.urlopen(globalUrl, timeout=3).read()
except Exception, e:
    print "%s can not use" % proxy_info["host"]

4. Using urllib2's Request object:

import urllib2

ip = i[0] + ":" + i[1]
request = urllib2.Request(globalUrl, headers=header)
request.set_proxy(ip, "http")          # apply the proxy to this request only
try:
    print urllib2.urlopen(request, timeout=3).read()
except Exception, e:
    print "%s can not use" % ip

5. Using httplib:

import httplib

conn = httplib.HTTPConnection(i[0], int(i[1]))   # port must be an int
try:
    conn.connect()
    conn.request("GET", globalUrl, headers=header)
    response = conn.getresponse()
    print response.read()
except:
    print "%s can not use" % i[0]