Passing parameters to a Scrapy spider, and to a spider deployed with scrapyd

References:
https://blog.csdn.net/c0411034/article/details/81750028
https://blog.csdn.net/Q_AN1314/article/details/50748700

Receiving the parameter inside the spider:

    def __init__(self, pid=None, *args, **kwargs):
        super(yourSpider, self).__init__(*args, **kwargs)
        self.pid = pid  # keep the passed-in value for use elsewhere in the spider
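For context, a minimal sketch of how the stored pid might then be used in the spider; the class name yourSpider and the spider name spider_p2 come from the post, while the URL pattern and the logging in parse are hypothetical placeholders:

    import scrapy

    class yourSpider(scrapy.Spider):
        name = "spider_p2"

        def __init__(self, pid=None, *args, **kwargs):
            super(yourSpider, self).__init__(*args, **kwargs)
            self.pid = pid

        def start_requests(self):
            # build the first request from the pid passed in with -a
            url = "https://example.com/item/{}".format(self.pid)  # hypothetical endpoint
            yield scrapy.Request(url, callback=self.parse)

        def parse(self, response):
            self.logger.info("fetched %s for pid=%s", response.url, self.pid)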

Launching locally with the parameter:

    from scrapy.cmdline import execute

    execute(["scrapy", "crawl", "spider_p2", "-a", "pid=3687695"])

After deploying with scrapyd, put the parameter in the POST body of the scheduling request:

    project=your_project&spider=your_spider&pid=3687696
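A sketch of sending that request with the requests library, assuming a default scrapyd instance on localhost:6800 and the project/spider names above; scrapyd forwards any extra POST fields of schedule.json to the spider as spider arguments:

    import requests

    # schedule the deployed spider and pass pid through to its __init__
    resp = requests.post(
        "http://localhost:6800/schedule.json",
        data={
            "project": "your_project",
            "spider": "your_spider",
            "pid": "3687696",  # extra field, handed to the spider like an -a argument
        },
    )
    print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}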
