I have a directory structure like the following in my serverless application (a minimal app to avoid clutter), which I created using AWS SAM with Python 3.8 as the runtime:
├── common
│   └── a.py
├── hello_world
│   ├── __init__.py
│   ├── app.py
│   └── requirements.txt
└── template.yaml
I would like to import the common/a.py module inside the Lambda handler, hello_world/app.py. I know I can import it normally in Python by adding its path to PYTHONPATH or sys.path, but that doesn't work when the code runs in Lambda inside a Docker container. When invoked, the Lambda handler runs from the /var/task directory, and the original folder structure is not preserved.
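For reference, a throwaway handler along these lines (just a sketch for inspection, not my real code) shows what actually ends up in /var/task and on sys.path at runtime:

import os
import sys

def lambda_handler(event, context):
    # Report what was packaged and where Python will look for imports.
    return {
        "cwd": os.getcwd(),                      # typically /var/task
        "task_contents": os.listdir("/var/task"),
        "sys_path": sys.path,
    }

Since sam build packages only the function's CodeUri directory (hello_world here), common/ is not included in what gets deployed.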
I tried inserting /var/task/common, /var/common, /var and even /common into sys.path programmatically, like this:
import sys
sys.path.insert(0, '/var/task/common')
from common.a import Alpha
but I still get the error:
ModuleNotFoundError: No module named 'common'
I am aware of Lambda Layers, but in my scenario I would like to reference the common code directly from multiple Lambda functions without having to upload it to a layer. I want something like the serverless-python-requirements plugin from the Serverless framework, but for AWS SAM.
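In other words, I would like the handler to be able to do something like this (the handler body is purely illustrative; Alpha is the class in common/a.py):

# hello_world/app.py - the import I would like to work, assuming
# common/ gets packaged alongside the handler in /var/task
from common.a import Alpha

def lambda_handler(event, context):
    # Just proving the import resolves; the response body is illustrative.
    return {"statusCode": 200, "body": Alpha.__name__}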
So my question is: how should I add the path to common to PYTHONPATH or sys.path? Or is there an alternative workaround, along the lines of serverless-python-requirements, that lets me directly import a module from a parent folder without using Lambda Layers?