Using proxy IPs in Scrapy
1. Add an IP pool in the settings.py file:
IPPOOL = ['xxx.xx.xx.xx', 'xxx.xx.xxx.xx']
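Since the pool is just a plain Python list, picking an entry is a single random.choice call. A quick standalone check (the addresses below are hypothetical placeholders; real proxy entries normally include a port, i.e. "host:port"):

```python
import random

# Hypothetical pool; real entries are usually "host:port" strings
IPPOOL = ['203.0.113.10:8080', '198.51.100.7:3128']

ip = random.choice(IPPOOL)
proxy_url = "http://" + ip  # the scheme prefix is required by Scrapy's proxy meta key
print(proxy_url)
```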
2. Add your own proxy middleware class in the middlewares.py file (first import IPPOOL from settings, along with the random module):
import random
from projectname.settings import IPPOOL

class MyProxyMiddleware(object):
    def __init__(self, ip=''):
        self.ip = ip

    def process_request(self, request, spider):
        # Pick a random proxy from the pool for every outgoing request
        ip = random.choice(IPPOOL)
        request.meta['proxy'] = "http://" + ip
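Because process_request only touches request.meta, the middleware logic can be sanity-checked outside a running crawl. A minimal sketch, where FakeRequest is a hypothetical stand-in for scrapy.http.Request (not a Scrapy class) with just the attribute the middleware needs:

```python
import random

IPPOOL = ['203.0.113.10:8080', '198.51.100.7:3128']  # hypothetical pool

class MyProxyMiddleware(object):
    def __init__(self, ip=''):
        self.ip = ip

    def process_request(self, request, spider):
        # Same logic as the middleware above: random proxy per request
        ip = random.choice(IPPOOL)
        request.meta['proxy'] = "http://" + ip

# Hypothetical stand-in for a Scrapy request, just enough for this check
class FakeRequest(object):
    def __init__(self):
        self.meta = {}

req = FakeRequest()
MyProxyMiddleware().process_request(req, spider=None)
print(req.meta['proxy'])
```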
3. In settings.py, comment out any conflicting middleware entries and enable the built-in proxy middleware in DOWNLOADER_MIDDLEWARES, then add your own middleware. Note that the old scrapy.contrib path has been removed in current Scrapy versions; use scrapy.downloadermiddlewares instead:
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 543,
4. Enable the proxy middleware you wrote (the class name here must match the one defined in step 2):
'projectname.middlewares.MyProxyMiddleware': 125,
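Putting the two settings entries together, the DOWNLOADER_MIDDLEWARES dict in settings.py would look roughly like this (projectname is a placeholder for your project's package; in Scrapy, middlewares with lower order values have their process_request called first, so the custom middleware at 125 sets the proxy before HttpProxyMiddleware at 543 applies it):

```python
# settings.py (sketch) -- lower order values run process_request earlier
DOWNLOADER_MIDDLEWARES = {
    # Built-in middleware that actually applies request.meta['proxy']
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 543,
    # Our middleware runs first (125 < 543) and chooses the proxy
    'projectname.middlewares.MyProxyMiddleware': 125,
}
```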