Scrapy crawlers: resuming interrupted crawls and running multiple spiders at once
Abstract:
from scrapy.commands import ScrapyCommand
from scrapy.utils.project import get_project_settings
# Resume an interrupted crawl: scrapy crawl spider_name -s JOBDIR=crawls/spider_
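The resume command in the abstract is cut off mid-path. A minimal sketch of how Scrapy's job persistence is typically used, assuming a spider named `spider_name` and a job directory `crawls/spider_name-run1` (both names are illustrative, not taken from the truncated source):

```shell
# First run: the JOBDIR setting tells Scrapy where to persist the scheduler's
# request queue and seen-request state on disk.
scrapy crawl spider_name -s JOBDIR=crawls/spider_name-run1

# Stop the crawl with a single Ctrl-C (one SIGINT) so Scrapy shuts down
# gracefully and flushes its state, then re-run the exact same command to
# resume from where the crawl left off:
scrapy crawl spider_name -s JOBDIR=crawls/spider_name-run1
```

The same JOBDIR must be reused across runs for one logical crawl; pointing two simultaneous spiders at one JOBDIR is not supported.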
posted @ 2018-03-20 10:04 Dbass Views(3192) Comments(0) Recommended(0)