I am using the following code in my bitbucket-pipelines.yml file to remotely deploy code to a staging server.
image: php:7.1.1
pipelines:
  default:
    - step:
        script:
          # install ssh
          - apt-get update && apt-get install -y openssh-client
          # get the latest code
          - ssh user@domain.com -F ~/.ssh/config "cd /path/to/code && git pull"
          # update composer
          - ssh user@domain.com -F ~/.ssh/config "cd /path/to/code && composer update --no-scripts"
          # optimise files
          - ssh user@domain.com -F ~/.ssh/config "cd /path/to/code && php artisan optimize"
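For context, the ~/.ssh/config that -F points at is along these lines (the key path here is just a placeholder):
Host domain.com
    User user
    IdentityFile ~/.ssh/id_rsa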
This all works, except that each time the pipeline runs, the ssh client is downloaded and installed again (adding ~30 seconds to the build time). Is there a way I can cache this step? Specifically, how can I go about caching the apt-get install?
For example, would something like the following work, or what changes are needed to make it work:
pipelines:
  default:
    - step:
        caches:
          - aptget
        script:
          - apt-get update && apt-get install -y openssh-client
definitions:
  caches:
    aptget: which ssh
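Or, since the custom cache definitions I've seen all point at a directory rather than a command, would pointing the cache at apt's package directory be closer to what's needed? Something like this (the aptget path is just my guess):
pipelines:
  default:
    - step:
        caches:
          - aptget
        script:
          - apt-get update && apt-get install -y openssh-client
definitions:
  caches:
    aptget: /var/cache/apt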