I am trying to figure out a good way to package and deploy a number of Python packages I created. Eventually, I would like to publish them to a package repository or provide some kind of setup script.
The structure of my project is as follows: I have two subprojects, A and B, that both use tools from another self-created package, C. The tools in C are for internal use only and not of much interest to a general audience. However, A and B shall be deployed. I want users to be able to install A and B independently of each other, but I do not need/want to deploy C as a standalone package.
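To illustrate, the source layout currently looks roughly like this (the names are placeholders):

    my_project/
        A/    # deployable subproject
        B/    # deployable subproject
        C/    # internal tools used by both A and B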
In the best case, I would like users to be able to install the packages with something along the lines of

    pip install my_project.A

or

    pip install my_project.B
Furthermore, in A, I would like to import C as follows:

    import my_project.C
Would I have to package A, B, and C independently and work with install_requires in setuptools.setup? If all projects belong to one large "meta-project", is there a way to bundle them together loosely while maintaining their general independence? (In my example, C contains "tools". I do not want to deploy a package with such a generic name. Would I then have to deploy it as my_project_tools?)
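For concreteness, here is a rough sketch of what I imagine the setup.py for A might look like if C were deployed separately as my_project_tools (all names are hypothetical, and I am not sure this is the right approach):

    # setup.py for subproject A -- a sketch of what I have in mind, not tested
    from setuptools import setup, find_namespace_packages

    setup(
        name="my_project.A",  # hypothetical distribution name
        version="0.1.0",
        # pick up my_project.A and its subpackages as namespace packages
        packages=find_namespace_packages(include=["my_project.A", "my_project.A.*"]),
        install_requires=[
            "my_project_tools",  # the shared package C, under a less generic name
        ],
        python_requires=">=3.7",
    )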
Is there a way to make all packages from that meta-project install into the same folder hierarchy while still allowing its components to be installed separately?
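My understanding (possibly wrong) is that namespace packages (PEP 420) could achieve this: if my_project/ contains no __init__.py, separately installed distributions can each contribute their subpackage to the same my_project folder in site-packages, e.g.:

    site-packages/
        my_project/    # namespace package: no __init__.py
            A/         # installed by "pip install my_project.A"
            C/         # pulled in as a dependency (deployed as my_project_tools)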
There is a related answer here, but my problem differs in that A and B share the common dependency C.
I am new to packaging, so I appreciate answers that do not assume too much background knowledge.
Additional info: I am using Python 3.7, and the program is not compatible with lower versions. It should run platform-independently, though. Some modules require compilation with Cython.
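In case it matters for the answer, I assume the compiled parts would be declared along these lines (a sketch based on the Cython docs; the module name is made up):

    # additions to setup.py for a subproject containing Cython modules -- a sketch
    from setuptools import setup
    from Cython.Build import cythonize

    setup(
        # ... name, packages, install_requires as above ...
        ext_modules=cythonize(["my_project/A/fast_module.pyx"]),  # hypothetical module
        setup_requires=["cython"],  # so the .pyx files can be compiled at build time
        python_requires=">=3.7",
    )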