Starting Scrapy spiders with a custom command

I. Running a single spider

Typing the scrapy command into the terminal for every run is tedious, so wrap it in a script.

Create manager.py (any name works) in the project's root directory:

from scrapy.cmdline import execute

if __name__ == '__main__':
    # Equivalent to running `scrapy crawl quote --nolog` in the terminal;
    # "quote" is the spider's name attribute, --nolog suppresses log output.
    execute(["scrapy", "crawl", "quote", "--nolog"])

II. Running all spiders

1. Create a commands directory (any name) at the same level as spiders
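The step above can be sketched in the shell. The toscrapy package name comes from the settings snippet later in this post (substitute your own project name); the __init__.py is an addition on my part so the directory is a regular, importable Python package:

```shell
# Run from the project root (the directory containing scrapy.cfg).
# "toscrapy" is the example project package name used in this post.
mkdir -p toscrapy/commands
touch toscrapy/commands/__init__.py  # make the directory an importable package
```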

2. Inside it, create a crawlall.py file; its Command class defines what the command does (the filename becomes the command name, here crawlall)

from scrapy.commands import ScrapyCommand


class Command(ScrapyCommand):
    requires_project = True  # the command only works inside a Scrapy project

    def syntax(self):
        return '[options]'

    def short_desc(self):
        return 'Runs all of the spiders'

    def run(self, args, opts):
        # spider_loader replaces the deprecated crawler_process.spiders alias
        spider_list = self.crawler_process.spider_loader.list()
        for name in spider_list:
            # schedule each spider; command options are passed through as kwargs
            self.crawler_process.crawl(name, **opts.__dict__)
        # start the reactor -- this blocks until every spider has finished
        self.crawler_process.start()
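To follow the control flow of run() without a full Scrapy project, here is a toy stand-in. SpiderLoaderStub and CrawlerProcessStub are made-up names for illustration; only the list()/crawl()/start() call pattern mirrors Scrapy's API:

```python
class SpiderLoaderStub:
    """Stands in for Scrapy's spider loader: knows the registered spider names."""
    def __init__(self, names):
        self._names = names

    def list(self):
        return list(self._names)


class CrawlerProcessStub:
    """Stands in for crawler_process: crawl() schedules, start() runs everything."""
    def __init__(self, names):
        self.spider_loader = SpiderLoaderStub(names)
        self.scheduled = []

    def crawl(self, name, **kwargs):
        # Real Scrapy only queues the crawl here...
        self.scheduled.append(name)

    def start(self):
        # ...and runs all queued crawls here, blocking until done.
        return self.scheduled


process = CrawlerProcessStub(["quote", "author"])
for name in process.spider_loader.list():
    process.crawl(name)
print(process.start())  # ['quote', 'author']
```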

3. Settings file (settings.py): register the commands module

# COMMANDS_MODULE = '<project name>.<directory name>'
COMMANDS_MODULE = 'toscrapy.commands'

4. manager.py: invoke the new command

from scrapy.cmdline import execute

if __name__ == '__main__':
    # Equivalent to running `scrapy crawlall --nolog` in the terminal
    execute(["scrapy", "crawlall", "--nolog"])

posted @ 2019-10-28 23:11  市丸银