I had the same problem and solved it with source control.
My workflow is GitLab CI > AWS CodePipeline (S3 source and CodeDeploy).
So in my development branch, my AppSpec file looks like this:
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html/my-project-dev
hooks:
  AfterInstall:
    - location: scripts/after_install.sh
      timeout: 400
      runas: root
And in my staging branch, identical except for the destination:
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html/my-project-staging
hooks:
  AfterInstall:
    - location: scripts/after_install.sh
      timeout: 400
      runas: root
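Since both branches reference the same scripts/after_install.sh, one script can serve both environments. For completeness, here is a minimal hypothetical sketch of what it could look like (the deployment group names and the nginx service are assumptions, not part of my actual setup); CodeDeploy exports DEPLOYMENT_GROUP_NAME to lifecycle hook scripts, which lets you tell the environments apart:
#!/bin/bash
# Hypothetical after_install.sh sketch; adapt to your project.
set -euo pipefail

# CodeDeploy sets DEPLOYMENT_GROUP_NAME for lifecycle hook scripts,
# so a single script can branch on environment (group names assumed).
if [ "$DEPLOYMENT_GROUP_NAME" = "my-project-staging-group" ]; then
    APP_DIR=/var/www/html/my-project-staging
else
    APP_DIR=/var/www/html/my-project-dev
fi

# Example post-install steps: fix ownership and restart the web server
chown -R www-data:www-data "$APP_DIR"
systemctl restart nginx    # assumes nginx; replace with your service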
My GitLab CI runner uses a shell executor on my EC2 instance; the pipeline compresses the project folder and uploads the archive to S3.
.gitlab-ci.yml
stages:
  - deploy
setup dependencies:
  stage: .pre
  script:
    - echo "Setup Dependencies"
    - pip install awscli
deploy to s3:
  stage: deploy
  script:
    - tar -cvzf "/tmp/artifact_${CI_COMMIT_REF_NAME}.tar.gz" ./*
    - echo "Copy artifact to S3"
    - aws s3 cp "/tmp/artifact_${CI_COMMIT_REF_NAME}.tar.gz" s3://project-artifacts/
clean up:
  stage: .post
  script:
    - echo "Removing generated artifact"
    - rm "/tmp/artifact_${CI_COMMIT_REF_NAME}.tar.gz"
Note that $CI_COMMIT_REF_NAME is used to differentiate the artifact files: on the development branch it is artifact_development.tar.gz, and on the staging branch artifact_staging.tar.gz.
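As a quick sanity check after a pipeline run, you can list the bucket from the runner to confirm the branch-specific artifact arrived (this assumes the instance's IAM credentials allow s3:ListBucket):
# Confirm the branch-specific artifact reached the bucket
aws s3 ls s3://project-artifacts/ | grep artifact_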
Then I have two pipelines listening for the two respective artifacts, each deploying to a different CodeDeploy application.
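"Listening" here just means each pipeline's S3 source action is configured with its own object key. You can inspect that configuration like this (the pipeline name is an assumption, and the query path assumes the source action is the first action of the first stage):
# Inspect the dev pipeline's S3 source configuration (name assumed)
aws codepipeline get-pipeline --name my-project-dev-pipeline \
  --query 'pipeline.stages[0].actions[0].configuration'
# For this setup it should show the bucket and the branch-specific key,
# e.g. "S3Bucket": "project-artifacts", "S3ObjectKey": "artifact_development.tar.gz"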
I'm not sure if this is the best way, but I'd certainly welcome any suggestions for a better approach.