
Scrapy throws an error after adding scrapy_redis?

Asked on 2020-03-05 21:29:20
1 answer · 0 followers · 621 views

I've recently been learning the Scrapy framework and added the following scrapy_redis settings to settings.py:

# Enables scheduling storing requests queue in redis.
SCHEDULER = "scrapy_redis.scheduler.Scheduler"

# Ensure all spiders share same duplicates filter through redis.
DUPEFILTER_CLASS = "scrapy_redis.dupefilter.RFPDupeFilter"


# Don't cleanup redis queues, allows to pause/resume crawls.
SCHEDULER_PERSIST = True

# Specify the full Redis URL for connecting (optional).
# If set, this takes precedence over the REDIS_HOST and REDIS_PORT settings.
REDIS_URL = 'redis://127.0.0.1:6379'

After adding them, the crawl fails with TypeError: can't pickle SelectorList objects. The full output is below:

Unhandled error in Deferred:

2020-03-05 21:13:44 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "/python/pachong/lib/python3.6/site-packages/twisted/internet/base.py", line 1283, in run
    self.mainLoop()
  File "/python/pachong/lib/python3.6/site-packages/twisted/internet/base.py", line 1292, in mainLoop
    self.runUntilCurrent()
  File "/python/pachong/lib/python3.6/site-packages/twisted/internet/base.py", line 913, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "/python/pachong/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick
    taskObj._oneWorkUnit()
--- <exception caught here> ---
  File "/python/pachong/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit
    result = next(self._iterator)
  File "/python/pachong/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in <genexpr>
    work = (callable(elem, *args, **named) for elem in iterable)
  File "/python/pachong/lib/python3.6/site-packages/scrapy/core/scraper.py", line 184, in _process_spidermw_output
    self.crawler.engine.crawl(request=output, spider=spider)
  File "/python/pachong/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl
    self.schedule(request, spider)
  File "/python/pachong/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule
    if not self.slot.scheduler.enqueue_request(request):
  File "/python/pachong/lib/python3.6/site-packages/scrapy_redis/scheduler.py", line 167, in enqueue_request
    self.queue.push(request)
  File "/python/pachong/lib/python3.6/site-packages/scrapy_redis/queue.py", line 99, in push
    data = self._encode_request(request)
  File "/python/pachong/lib/python3.6/site-packages/scrapy_redis/queue.py", line 43, in _encode_request
    return self.serializer.dumps(obj)
  File "/python/pachong/lib/python3.6/site-packages/scrapy_redis/picklecompat.py", line 14, in dumps
    return pickle.dumps(obj, protocol=-1)
  File "/python/pachong/lib/python3.6/site-packages/parsel/selector.py", line 65, in __getstate__
    raise TypeError("can't pickle SelectorList objects")
builtins.TypeError: can't pickle SelectorList objects

2020-03-05 21:13:44 [scrapy.core.engine] INFO: Closing spider (finished)

Has anyone run into a similar error? Any help would be much appreciated!
