Adding headers in Scrapy
Method 1: the default request headers in settings.py
USER_AGENT = 'scrapy_runklist (+http://www.yourdomain.com)'
DEFAULT_REQUEST_HEADERS = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en',
}
DOWNLOADER_MIDDLEWARES = {
    'scrapy_runklist.middlewares.ScrapyRunklistDownloaderMiddleware': 543,
}
- These entries are commented out by default; if you uncomment them, remember to change the values.
- The default User-Agent identifies your crawler, which makes it easy for the target site to block you.
- You can verify the headers inside a downloader middleware by printing them: print("request", request.headers)
- The output shows exactly the headers configured above.
- But this is a global setting shared by every spider in the project. How do we customize the header fields per spider?
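The printing step above can be sketched as a minimal downloader middleware. The class name `HeadersLoggingMiddleware` and the stand-in request object below are hypothetical, used only to illustrate the hook; in Scrapy the middleware receives a real `scrapy.Request` whose headers were merged from the settings.

```python
# Minimal sketch of a downloader middleware that prints the request headers,
# so you can confirm what DEFAULT_REQUEST_HEADERS actually produced.
# Register it via DOWNLOADER_MIDDLEWARES just like any other middleware.
class HeadersLoggingMiddleware:
    def process_request(self, request, spider):
        print("request", request.headers)
        return None  # returning None lets Scrapy keep processing the request


# Hypothetical stand-in for a scrapy.Request, for illustration only.
class FakeRequest:
    headers = {'Accept-Language': 'en'}

result = HeadersLoggingMiddleware().process_request(FakeRequest(), spider=None)
```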
Method 2: adding your own request headers
- You can add a custom_settings attribute directly in the spider file:
custom_settings = {
    'LOG_LEVEL': 'DEBUG',
    'LOG_FILE': '5688_log_%s.txt' % time.time(),  # log file for this spider (requires `import time`)
    'DEFAULT_REQUEST_HEADERS': {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36',
    },
}  # per-spider request headers
- Requests now go out with our own configured headers.
- The entries in settings.py no longer need to be commented out; they are simply overridden and no longer take effect.
- You can add any other header fields here as well.
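The override behaviour can be illustrated with a small sketch: custom_settings carries "spider" priority, which beats the "project" priority of settings.py, so for any key defined in both places the spider's value wins. The dicts below are hypothetical stand-ins, not Scrapy's actual Settings class.

```python
# Hypothetical stand-ins for the two settings sources.
project_settings = {  # what settings.py provides (project priority)
    'DEFAULT_REQUEST_HEADERS': {'Accept-Language': 'en'},
    'LOG_LEVEL': 'INFO',
}
custom_settings = {  # what the spider declares (spider priority, higher)
    'DEFAULT_REQUEST_HEADERS': {'User-Agent': 'Mozilla/5.0 ...'},
}

# A higher-priority source shadows a lower one key by key.
effective = {**project_settings, **custom_settings}
print(effective['DEFAULT_REQUEST_HEADERS'])  # the spider's headers win
```

Note that keys the spider does not override (such as LOG_LEVEL here) still fall through to the project-wide value.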
Method 3: using random request headers
- Step 1: add a list of User-Agent strings in the settings file (the list below was collected from other sources):
USER_AGENT_LIST = [
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
    "Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
    "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24"
]
- Also set DOWNLOADER_MIDDLEWARES in the settings file:
DOWNLOADER_MIDDLEWARES = {
    # 'lagou.middlewares.LagouDownloaderMiddleware': 543,
    'lagou.middlewares.RandomUserAgentMiddleware': 400,  # replace "lagou" with your project name
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,  # disable the built-in middleware
}
- Step 2: in middlewares.py, import USER_AGENT_LIST from the settings module:
import random

from lagou.settings import USER_AGENT_LIST

class RandomUserAgentMiddleware(object):
    def process_request(self, request, spider):
        # Pick a random User-Agent for each outgoing request
        rand_use = random.choice(USER_AGENT_LIST)
        if rand_use:
            request.headers.setdefault('User-Agent', rand_use)
- That is all that's needed.
- Running the spider, we can see that a User-Agent is indeed chosen at random for each request.
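The random selection itself can be checked outside of Scrapy. The sketch below reuses the same random.choice logic on a trimmed-down stand-in for USER_AGENT_LIST:

```python
import random

# Trimmed-down stand-in for USER_AGENT_LIST from settings.py
USER_AGENT_LIST = [
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
]

def pick_user_agent():
    # Same selection logic as RandomUserAgentMiddleware.process_request
    return random.choice(USER_AGENT_LIST)

# Simulate 50 requests; across many picks, different UAs get selected.
picks = {pick_user_agent() for _ in range(50)}
print(len(picks), "distinct user agents chosen")
```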