Cluttered libraries in /usr/lib/python2.7/site-packages/scrapy
My project directory:

.../projects/scrapy
.../projects/parser_module
.../projects/parser_module/parser
.../projects/parser_module/parser/spiders/.....
.../projects/parser_module/parser/<files etc>
.../projects/parser_module/scrapy.cfg

In the directory .../projects/parser_module/ I run the command scrapy crawl parser and get this result:
Traceback (most recent call last):
  File "/usr/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/lib/python2.7/site-packages/scrapy/cmdline.py", line 109, in execute
    settings = get_project_settings()
  File "/usr/lib/python2.7/site-packages/scrapy/utils/project.py", line 60, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/usr/lib/python2.7/site-packages/scrapy/settings/__init__.py", line 108, in setmodule
    module = import_module(module)
  File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named settings

Can you tell me how to fix this?
Posted on 2014-07-04 22:58:18
To avoid this kind of problem, create the project folder with scrapy startproject parser_module.

To fix your current problem, you can either start over, or create a dummy project with scrapy startproject and copy its settings.py into yours. Then, if the next error complains about another missing file, you will find it in that same dummy folder as well.
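Scrapy resolves the project settings module from the `[settings]` section of scrapy.cfg, so "No module named settings" means the settings.py file (or the package's __init__.py next to it) is missing. A minimal settings.py sketch, assuming the package is named `parser` as in the question (the values shown are illustrative defaults, not the asker's actual configuration):

```python
# Minimal settings.py sketch for a Scrapy project whose package is
# named "parser" (hypothetical; match your own package name).
BOT_NAME = "parser"

# Module paths where Scrapy looks for spider classes.
SPIDER_MODULES = ["parser.spiders"]
NEWSPIDER_MODULE = "parser.spiders"
```

With this file in place (plus an empty __init__.py in the same directory), get_project_settings() can import the module and the crawl command proceeds.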
Here is the typical structure of a Scrapy project.
.
├── scrapy.cfg
└── project_name
    ├── __init__.py
    ├── items.py
    ├── settings.py
    └── spiders
        ├── __init__.py
        └── spider.py

Source: https://stackoverflow.com/questions/24570960
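The scrapy.cfg at the project root is what ties the command line to that package; a sketch of its settings section, assuming the package name project_name from the tree above:

```
[settings]
default = project_name.settings
```

If this `default` entry points at a package that has no settings module, you get exactly the ImportError shown in the question.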