Python crawlers: Scrapy logging

Scrapy logging

Log levels

Scrapy provides five logging levels, from highest to lowest severity:

CRITICAL - critical errors

ERROR - regular errors

WARNING - warning messages

INFO - informational messages

DEBUG - debugging messages
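The levels above come straight from Python's standard logging module, and the chosen level acts as a threshold: only messages at that level or above are recorded. A minimal standalone sketch (plain logging, not Scrapy; the logger name and handler are illustrative):

```python
import logging

# With the level set to WARNING, only WARNING/ERROR/CRITICAL get through.
logger = logging.getLogger("level_demo")
logger.setLevel(logging.WARNING)

records = []

class ListHandler(logging.Handler):
    """Collects the level names of emitted records for inspection."""
    def emit(self, record):
        records.append(record.levelname)

logger.addHandler(ListHandler())

logger.debug("dropped")     # below threshold, filtered out
logger.info("dropped")      # below threshold, filtered out
logger.warning("kept")
logger.error("kept")
logger.critical("kept")

print(records)  # ['WARNING', 'ERROR', 'CRITICAL']
```

Setting LOG_LEVEL = "DEBUG" in Scrapy therefore records everything, while "ERROR" keeps only errors and critical messages.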

Configure it in settings.py:

import datetime

# Build the log file name and location from today's date.
# Note: the log/ directory must already exist; logging will not create it.
to_day = datetime.datetime.now()
log_file_path = "log/scrapy_{}_{}_{}.log".format(to_day.year, to_day.month, to_day.day)
LOG_FILE = log_file_path
LOG_LEVEL = "DEBUG"
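The same dated filename can be built with strftime. One difference worth knowing (a sketch, using a fixed date for comparison): strftime zero-pads the month and day, while the .format() version above does not, so the two paths diverge for single-digit months or days.

```python
import datetime

# Fixed date so the two variants can be compared directly.
fixed_day = datetime.datetime(2020, 2, 22)

# The post's approach: no zero-padding on month/day.
format_path = "log/scrapy_{}_{}_{}.log".format(
    fixed_day.year, fixed_day.month, fixed_day.day)

# strftime variant: %m and %d are always zero-padded.
strftime_path = fixed_day.strftime("log/scrapy_%Y_%m_%d.log")

print(format_path)    # log/scrapy_2020_2_22.log
print(strftime_path)  # log/scrapy_2020_02_22.log
```

Either works; just pick one convention so log files sort consistently.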

In pipelines.py:

import logging

logger = logging.getLogger(__name__)

class DcdAppPipeline:
    def process_item(self, item, spider):
        logger.warning(item)
        return item  # pipelines must return the item (or raise DropItem)
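The point of getLogger(__name__) is that the logger takes its module's name, so every log line identifies where it came from. A small sketch of what ends up in the log (the logger name stands in for what __name__ would be inside pipelines.py; the format string is illustrative):

```python
import logging

# Inside autospider/pipelines.py, __name__ == "autospider.pipelines".
logger = logging.getLogger("autospider.pipelines")

# Build one record by hand and format it, to show the rendered log line.
record = logger.makeRecord(
    logger.name, logging.WARNING, __file__, 0, "dropped item", None, None)
line = logging.Formatter("%(name)s %(levelname)s %(message)s").format(record)

print(line)  # autospider.pipelines WARNING dropped item
```

This is why Scrapy's own log lines are prefixed with names like scrapy.core.engine: each component logs through its module-named logger.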

Configuring logging per spider via custom_settings:

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    custom_settings = {
        # Route scraped items through this pipeline
        'ITEM_PIPELINES': {
            'autospider.pipelines.DcdAppPipeline': 300,
        },
        # Per-spider log settings (override settings.py for this spider only)
        'LOG_LEVEL': 'DEBUG',
        'LOG_FILE': './Log/dcdapp_log.log'
    }
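Besides LOG_LEVEL and LOG_FILE, Scrapy exposes a few more logging settings that can go in settings.py or custom_settings the same way (a sketch; the values shown are Scrapy's documented defaults):

```python
# Additional Scrapy logging settings (defaults shown for illustration)
LOG_ENABLED = True      # disable to turn off logging entirely
LOG_ENCODING = 'utf-8'  # encoding used when writing LOG_FILE
LOG_FORMAT = '%(asctime)s [%(name)s] %(levelname)s: %(message)s'
LOG_STDOUT = False      # if True, print() output is redirected into the log
```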
posted @ 2020-02-22 11:06  corei5tj