I have been trying to wrap my head around how to use BitBucket Pipelines to auto-deploy my (Laravel) application onto a Vultr server instance.
I have the following steps that I currently do manually and am trying to automate (the full sequence is sketched as a single SSH call just after this list):
- I commit my changes and push to my BitBucket repo
- I log into my server using Terminal: ssh root@ipaddress
- I cd to the correct directory: cd /var/www/html/app/
- I then pull from my BitBucket repo: git pull origin master
- I then run some commands: composer install, php artisan migrate, etc.
- I then log out: exit
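As far as I can tell, that whole session boils down to a single non-interactive SSH call, something along these lines (untested on my side; the IP and path stand in for my real values):

  # the manual steps above, chained into one remote command
  # (php artisan migrate would probably need --force when run non-interactively)
  ssh root@ipaddress "cd /var/www/html/app \
    && git pull origin master \
    && composer install \
    && php artisan migrate"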
My understanding is that you can use Pipelines to automate this; is that true?
So far, I have set up an SSH key pair for Pipelines and my server, so my server's authorized_keys file contains the public key from BitBucket Pipelines.
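For reference, the public key was added on the server roughly like this (the key value below is a placeholder, not the real key):

  # append the Pipelines public key to root's authorized_keys (placeholder key value)
  echo "ssh-rsa AAAA...placeholder... pipelines@bitbucket" >> /root/.ssh/authorized_keys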
My bitbucket-pipelines.yml file is as follows:
image: atlassian/default-image:latest
pipelines:
  default:
    - step:
        deployment: staging
        caches:
          - composer
        script:
          - ssh root@ipaddress
          - cd /var/www/html/app/
          - git pull origin master
          - php artisan down
          - composer install --no-dev --prefer-dist
          - php artisan cache:clear
          - php artisan config:cache
          - php artisan route:cache
          - php artisan migrate
          - php artisan up
          - echo 'Deploy finished.'
When the pipeline executes, I get the error: bash: cd: /var/www/html/app/: No such file or directory.
I read that each script step is run in its own container:
> Each step in your pipeline will start a separate Docker container to run the commands configured in the script.
The error I get makes sense if cd /var/www/html/app is not actually being executed on the VPS after logging into it over SSH.
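If that is what is happening, the lines after ssh root@ipaddress are presumably running in the pipeline's own container rather than on my server, and I would instead need to pass everything to ssh as one remote command, something like this (just my untested guess, showing only a couple of the commands):

        script:
          - ssh root@ipaddress "cd /var/www/html/app && git pull origin master && composer install --no-dev --prefer-dist"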
Could someone point me in the right direction?
Thanks