I have a Python project I've been working on for three months. It consists of several modules our team uses to deploy our software and run scale tests on AWS.
Now I want to take the source code, build it into a distribution, and bundle all the modules in it, the way Python libraries are packaged. This is for Unix systems.
So, suppose one of the modules is called cluster.py, another scale.py, and so on. If I wanted to create a cluster on AWS, I would currently run:
python cluster.py <args>
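Each module is basically a standalone script with its own main(); for illustration, something like this (the options here are made up, not my real ones):

# cluster.py -- simplified sketch of one module; the real options are more involved
import argparse

def main():
    parser = argparse.ArgumentParser(description="Create a cluster on AWS")
    parser.add_argument("--name", required=True, help="name of the cluster")
    parser.add_argument("--size", type=int, default=3, help="number of nodes")
    args = parser.parse_args()
    # ... the actual AWS provisioning logic lives here ...
    print("Creating cluster %s with %d nodes" % (args.name, args.size))

if __name__ == "__main__":
    main()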
I've been reading about distutils/setuptools, and I want to do exactly what Python libraries do:
The user downloads the package, let's say as a .zip file, and runs:
python setup.py build
then
python setup.py install
so that the user could then execute from the command line:
cluster <args>
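From what I've read, console_scripts entry points might be the mechanism that makes this possible. Something along these lines, assuming each module defines a main() as in the sketch above (this is untested, just my understanding so far):

# Possible entry_points approach (a sketch, not my current setup.py)
from setuptools import setup

setup(
    name="Tools",
    version="1.0",
    py_modules=["cluster", "scale", "deploy"],  # top-level modules to include
    entry_points={
        "console_scripts": [
            "cluster = cluster:main",  # `cluster <args>` would call cluster.main()
            "scale = scale:main",      # `scale <args>` would call scale.main()
        ],
    },
)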
That way, as I add more functionality, it would all be bundled up in the package. However, to install all the libraries and other dependencies, I use another Python script called deploy.py, so the build step would consist pretty much of running deploy.py (see the rough sketch after my setup.py below). Here is the setup.py I have so far:
from setuptools import setup, find_packages

setup(
    name = "Tools",
    version = "1.0",
    packages = find_packages(),
    scripts = ['deploy.py', 'cluster.py'],

    # Project uses reStructuredText, so ensure that the docutils get
    # installed or upgraded on the target machine
    install_requires = ['docutils>=0.3'],

    package_data = {
    },

    # metadata for upload to PyPI
    author = "Me",
    author_email = "me@example.com",
    description = "This is an Example Package",
    license = "PSF",
    keywords = "hello world example examples",
    url = "http://example.com/HelloWorld/",  # project home page, if any

    # could also include long_description, download_url, classifiers, etc.
)
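And here is roughly what I had in mind for hooking deploy.py into the build step: a custom build command. This is only a sketch of the idea, I haven't verified it:

# Sketch: run deploy.py as part of `python setup.py build` (untested idea)
import subprocess
import sys

from setuptools.command.build_py import build_py


class build_with_deploy(build_py):
    """Run deploy.py (which installs libraries and other dependencies) before the normal build."""

    def run(self):
        subprocess.check_call([sys.executable, 'deploy.py'])
        build_py.run(self)

I would then pass cmdclass={'build_py': build_with_deploy} to the setup() call above, so that the build step runs deploy.py first.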