I have a Python virtualenv in a directory, and inside that directory I have created a Scrapy project with the startproject command.
My directory structure:
root
| settings.py
| env (virtual env)
| requirements.txt (has scrapy, splash and other modules)
| __init__.py
|
| shared (shared code)
| | __init__.py
| | helper.py
|
| scrapy_project
| | scrapy.cfg
| | WebCrawler
| | | __init__.py
| | | settings.py
| | | items.py
| | | spiders
| | | | SplashSpider.py
When I want to run the scrapy_project spider, I cd scrapy_project and then run scrapy crawl splash_spider. The problem is that inside SplashSpider.py I need to import some code from the shared module in the root directory.
I tried importing helper.py with from shared import helper, but I get this error: ModuleNotFoundError: No module named 'shared'
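To double-check the mechanism, I ran this minimal, self-contained reproduction (the package name shared_demo and the VALUE constant are just for the demo, not my real shared module): a top-level package is only importable once its parent directory is on sys.path, and the root directory is not on it when Scrapy runs from inside scrapy_project.

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package layout: <tmp>/shared_demo/{__init__.py, helper.py}
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "shared_demo")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "helper.py"), "w") as f:
    f.write("VALUE = 42\n")

# Before the parent directory is on sys.path, the import fails,
# just like `from shared import helper` does in my spider.
try:
    importlib.import_module("shared_demo.helper")
    found_before = True
except ModuleNotFoundError:
    found_before = False

sys.path.insert(0, tmp)  # make the package's parent directory visible
helper = importlib.import_module("shared_demo.helper")
print(found_before, helper.VALUE)  # → False 42
```

So the import error is purely a path-resolution issue, which is why I'm asking what the cleanest project layout is rather than hard-coding sys.path hacks in every spider.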
I'm trying to have multiple Scrapy projects under the root directory and use the shared module in each project's spiders. What's the best way to do that?