I decided to use the Python logging module, because the messages Twisted generates on standard error are too long, and I want meaningful messages (such as those generated by the StatsCollector) logged at INFO level to a separate log file while keeping the on-screen messages.
from twisted.python import log
import logging

# Write INFO-and-above records to buyerlog.txt, truncating it on each run
logging.basicConfig(level=logging.INFO, filemode='w', filename='buyerlog.txt')

# Route Twisted's log events into the Python logging system
observer = log.PythonLoggingObserver()
observer.start()
Well, this works and I get my messages, but the drawback is that I have no idea which spider generated them! Here is my log file, with %(name)s showing "twisted":
INFO:twisted:Log opened.
INFO:twisted:Scrapy 0.12.0.2543 started (bot: property)
INFO:twisted:scrapy.telnet.TelnetConsole starting on 6023
INFO:twisted:scrapy.webservice.WebService starting on 6080
INFO:twisted:Spider opened
INFO:twisted:Spider opened
INFO:twisted:Received SIGINT, shutting down gracefully. Send again to force unclean shutdown
INFO:twisted:Closing spider (shutdown)
INFO:twisted:Closing spider (shutdown)
INFO:twisted:Dumping spider stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/scrapy.exceptions.IgnoreRequest': 3,
 'downloader/request_bytes': 9973,
Compare this with the messages Twisted generates on standard error:
2011-12-16 17:34:56+0800 [expats] DEBUG: number of rules: 4
2011-12-16 17:34:56+0800 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
2011-12-16 17:34:56+0800 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
2011-12-16 17:34:56+0800 [iproperty] INFO: Spider opened
2011-12-16 17:34:56+0800 [iproperty] DEBUG: Redirecting (301) to <GET http://www.iproperty.com.sg/> from <GET http://iproperty.com.sg>
2011-12-16 17:34:57+0800 [iproperty] DEBUG: Crawled (200) <
I have tried %(name)s and %(module)s, but I can't seem to get the spider name to show up. Does anyone know the answer?
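For illustration, a minimal sketch of one possible workaround (untested, and the class name SpiderAwareObserver is made up for this example): PythonLoggingObserver forwards every Twisted event to a single Python logger, named "twisted" by default, which is why %(name)s always shows "twisted". The spider name travels in the event's "system" key instead (the [iproperty] part of the standard-error output), so a custom observer could prepend it to each message:

from twisted.python import log
import logging

class SpiderAwareObserver(log.PythonLoggingObserver):
    # Hypothetical subclass: prefixes the Twisted 'system' key,
    # which carries the spider name, to every forwarded message.
    def emit(self, eventDict):
        # textFromEventDict() is the same helper the stock observer uses
        text = log.textFromEventDict(eventDict)
        if text is None:
            return
        if eventDict.get('isError'):
            level = logging.ERROR
        else:
            level = eventDict.get('logLevel', logging.INFO)
        system = eventDict.get('system', '-')
        self.logger.log(level, '[%s] %s' % (system, text))

observer = SpiderAwareObserver()
observer.start()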
Edit: The problem with using LOG_FILE and LOG_LEVEL in the settings is that messages below that level will not be shown on standard error.
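Again only a sketch of what I am after, using just the standard logging module: attach an INFO-level file handler and a DEBUG-level stream handler to the root logger, so the file stays compact while standard error still shows the lower-level messages:

import logging
import sys

root = logging.getLogger()
root.setLevel(logging.DEBUG)  # let the handlers do the filtering

# INFO and above go to the file (same buyerlog.txt as above)
file_handler = logging.FileHandler('buyerlog.txt', mode='w')
file_handler.setLevel(logging.INFO)
root.addHandler(file_handler)

# everything, including DEBUG, still reaches the screen
stream_handler = logging.StreamHandler(sys.stderr)
stream_handler.setLevel(logging.DEBUG)
root.addHandler(stream_handler)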
https://stackoverflow.com/questions/8532252