Running Multiple Scrapy Spiders at Once
1. Create a new commands folder inside the project directory.
2. Inside the commands folder, create a file named crawlall.py.
3. In crawlall.py, define a Command class that inherits from scrapy.commands.ScrapyCommand:
```python
from scrapy.commands import ScrapyCommand


class Command(ScrapyCommand):
    requires_project = True

    def syntax(self):
        return '[options]'

    def short_desc(self):
        return 'Runs all of the spiders'

    def run(self, args, opts):
        # List every spider registered in the project. Current Scrapy
        # versions expose this as spider_loader; older ones used the
        # since-removed .spiders alias.
        spider_list = self.crawler_process.spider_loader.list()
        # Schedule all spiders on the same CrawlerProcess, forwarding any
        # parsed command-line options to each spider as keyword arguments.
        for name in spider_list:
            self.crawler_process.crawl(name, **opts.__dict__)
        # Start the reactor once; it runs every scheduled spider
        # concurrently and blocks until all of them have finished.
        self.crawler_process.start()
```
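A minimal sketch of the resulting project layout (the project name myproject is a placeholder for your own package; note the empty __init__.py that makes the commands folder an importable package):

```
myproject/
├── scrapy.cfg
└── myproject/
    ├── __init__.py
    ├── settings.py
    ├── commands/
    │   ├── __init__.py   # empty file; required so the folder is a package
    │   └── crawlall.py
    └── spiders/
        └── ...
```

4. Register the commands folder in settings.py via the COMMANDS_MODULE setting, otherwise Scrapy will not discover the new command:

```python
# settings.py -- point Scrapy at the custom commands package;
# replace 'myproject' with your actual project package name
COMMANDS_MODULE = 'myproject.commands'
```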
- Run from the command line to start all spiders: scrapy crawlall
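A note on why this works: because every spider is scheduled on the same CrawlerProcess before start() is called, all spiders run concurrently inside a single process and a single Twisted reactor, and start() blocks until the last one finishes. The **opts.__dict__ forwarding also means any options the command parses are handed to each spider's __init__ as keyword arguments.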