I am working on a Python project which contains two modules. The modules are very closely related, and hence I want them to live in the same git repo and to be able to develop them together in my IDE:
- `module1` depends on `module2`
- `module1` has lots of other heavy dependencies; `module2` does not
- `module1` and `module2` will be used in different runtime environments
- `module2` should be installable separately so it can run in e.g. an AWS Lambda
Consequently, I have tried to set up a project structure which contains the two modules in two folders within one repo, each folder having its own setup.py so it can be packaged. Is this a reasonable approach?
\module1
    setup.py
    \module1
        __init__.py
        [scripts].py
\module2
    setup.py
    \module2
        __init__.py
        [scripts].py
The above structure should allow me to work on the project locally, treating `module2` as a regular module, while its setup.py means it can also be distributed as its own package, right?
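For concreteness, I'd expect `module2`'s setup.py to be a minimal sketch along these lines (the version number is a placeholder):

from setuptools import setup

setup(
    name="module2",
    version="0.1",
    packages=['module2'],
    # module2 deliberately has no heavy dependencies,
    # so it stays small enough for an AWS Lambda
    install_requires=[]
)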
The problem is that I can't express the `module2` dependency in `module1`'s setup.py:
from setuptools import setup

setup(
    name="module1",
    version="0.1",
    packages=['module1'],  # I really need to include module2 scripts here, right?...
    install_requires=['pandas', 'numpy', ...]
)
Does anyone have any advice on how to approach this problem? The two solutions I can see (both sketched below) are:
- Packaging and publishing `module2` on its own before depending on it from `module1`. This makes development much more inflexible.
- Nesting `module2` within `module1` so I can include it in the `packages` argument to the `setup(...)` call. This breaks the clarity that `module1` should treat `module2` as if it were essentially an external dependency...
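To make the trade-off concrete, here is roughly what `module1`'s setup.py would look like under each option (the `module2` version pin is a placeholder):

from setuptools import setup

# Option 1: module2 is packaged and published separately
# (e.g. to PyPI or a private index) and declared like any
# other external dependency:
setup(
    name="module1",
    version="0.1",
    packages=['module1'],
    install_requires=['pandas', 'numpy', 'module2==0.1']
)

# Option 2: module2's package folder is nested inside this
# project, so both packages ship in one distribution:
#
# setup(
#     name="module1",
#     version="0.1",
#     packages=['module1', 'module2'],
#     install_requires=['pandas', 'numpy']
# )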