Using BitBucket Pipelines to Deploy onto VPS via SSH Access

I have been trying to wrap my head around how to utilise BitBucket's Pipelines to auto-deploy my (Laravel) application onto a Vultr Server instance.

I have the following steps I do manually, which I am trying to replicate autonomously:

  • I commit my changes and push to BitBucket repo
  • I log into my server using Terminal: ssh root@ipaddress
  • I cd to the correct directory: cd /var/www/html/app/
  • I then pull from my BitBucket repo: git pull origin master
  • I then run some commands: composer install, php artisan migrate, etc.
  • I then log out: exit
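In principle, the whole sequence above is a single remote command, which is what an automated step would need to issue. A sketch (using the host and path from my steps above; chaining with && stops at the first failing command):

```shell
# One-shot version of the manual steps above.
# && aborts the chain as soon as any command fails.
ssh root@ipaddress 'cd /var/www/html/app/ \
  && git pull origin master \
  && composer install \
  && php artisan migrate'
```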

My understanding is that you can use Pipelines to automate this; is that correct?

So far, I have set up an SSH key pair for Pipelines and my server, so my server's authorized_keys file contains the public key from Bitbucket Pipelines.

My pipelines file bitbucket-pipelines.yml is as follows:

image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        caches:
          - composer
        script:
          - ssh root@ipaddress
          - cd /var/www/html/app/
          - git pull origin master
          - php artisan down
          - composer install --no-dev --prefer-dist
          - php artisan cache:clear
          - php artisan config:cache
          - php artisan route:cache
          - php artisan migrate
          - php artisan up
          - echo 'Deploy finished.'

When the pipeline executes, I get the error: bash: cd: /var/www/html/app/: No such file or directory.

I read that each script step is run in its own container:

Each step in your pipeline will start a separate Docker container to run the commands configured in the script

The error I get makes sense if it's not executing cd /var/www/html/app within the VPS after logging into it using SSH.

Could someone guide me into the correct direction?

Thanks

Thermopile answered 27/4, 2018 at 0:39 Comment(1)
I don't understand why your server would be the one with the public key. Don't you want to SSH into the server with a public key, not vice versa? – Assumed

The commands you define under script are going to run in a Docker container, not on your VPS.

Instead, put all your commands in a bash file on your server.

1 - Create a bash file pull.sh on your VPS, to do all your deployment tasks

#!/bin/sh
# located at /var/www/html/pull.sh
php artisan down
git pull origin master
composer install --no-dev --prefer-dist
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate
php artisan up
echo 'Deploy finished.'

2 - Create a script deploy.sh in your repository, like so

echo "Deploy script started"
cd /var/www/html
sh pull.sh
echo "Deploy script finished execution"

3 - Finally update your bitbucket-pipelines.yml file

image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        script:
          - cat ./deploy.sh | ssh <user>@<host>
          - echo "Deploy step finished"

I would recommend already having your repo cloned on your VPS in /var/www/html, and testing your pull.sh file manually first.

Examination answered 1/5, 2018 at 21:36 Comment(5)
Thanks a bunch! With some tweaking I got it to work. The git pull command was missing. – Thermopile
Fantastic. Because I only needed to run a couple of lines, I skipped Step #1 and put everything into deploy.sh. Also, the image does need to be atlassian, as shown. For those wanting to run their deployment on a specific branch, this shows examples of the syntax: confluence.atlassian.com/bitbucket/… – Depository
I just logged in to upvote this answer! Anyway, to make this comment useful, I found this Stack Overflow thread from here: community.atlassian.com/t5/Bitbucket-questions/… – Loaiasis
Starting from v5.1 you can add this command to the mix: php artisan view:clear – Anatol
I know this is old now, but what about getting the files onto the VPS initially? I mean, will this work for a new server where the repo isn't on the VPS yet? – Conchoidal

The problem with the answer marked as the solution is that the sh script won't stop if any of the commands inside it fails.

The php artisan route:cache command, for instance, can fail easily, not to mention the git pull!

Even worse, the sh script will keep executing the remaining commands after a failure.

I can't use any Docker commands because the CI process stops after each one, and I haven't figured out how to keep those commands from exiting the CI process. I'm sticking with the sh script, but I'll start adding conditionals based on the exit code of the previous command, so we know if anything went wrong during the deploy.
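One way to make the script fail fast, as a sketch only (assuming pull.sh runs under a POSIX sh on the VPS): enable set -e so the first failing command aborts the run, and use a trap so the site is taken back out of maintenance mode either way:

```shell
#!/bin/sh
# Abort on the first command that exits non-zero.
set -e

# Whatever happens after `artisan down`, bring the site back up on exit.
trap 'php artisan up' EXIT

php artisan down
git pull origin master
composer install --no-dev --prefer-dist
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate

echo 'Deploy finished.'
```

With set -e, the ssh invocation in the pipeline also returns a non-zero exit status when a step fails, so the Bitbucket build is marked as failed instead of silently succeeding.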

Mistral answered 22/11, 2019 at 10:46 Comment(0)

I know this may be an old thread, but Bitbucket does provide a pipe to do all that is mentioned above in a much cleaner way.

Please have a look at https://bitbucket.org/product/features/pipelines/integrations?p=atlassian/ssh-run
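For reference, a step using that pipe might look roughly like this. This is a sketch only: the pipe version, user, host, and command below are illustrative and should be checked against the page above.

```yaml
image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        script:
          - pipe: atlassian/ssh-run:0.4.1
            variables:
              SSH_USER: 'deployer'   # illustrative user
              SERVER: 'ipaddress'    # illustrative host
              COMMAND: 'cd /var/www/html/app && sh pull.sh'
```

The pipe handles the SSH connection itself; the key is configured once under the repository's SSH keys settings rather than scripted by hand.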

Hope this helps.

Soemba answered 9/10, 2022 at 0:12 Comment(0)
