Automated deployment from Bitbucket to IIS on Windows Server

I want to use Bitbucket as middleware between my local server and the live server I'm deploying to.

I'm using Windows Server 2012 and IIS 8, and I'm working on an ASP.NET MVC 5 project.

Basically, I want to recreate the same workflow Azure has for continuous integration:

Work on the application locally, commit the changes to a branch in Bitbucket (not necessarily master), and have the live server synchronize with that branch to reflect the changes.

I assume the starting point would be to have a copy of the Bitbucket repository on both the local and live servers, but I'm struggling with how to link them to Bitbucket.

Thetisa answered 3/12, 2015 at 21:1 Comment(1)
As a starting point, if anyone is facing the same issue, look at the Kudu project; it's open source and it's what powers Azure's continuous integration: github.com/projectkudu/kudu – Thetisa

Here is the scenario: you have a .NET Core application, it is in a Bitbucket repository, you are hosting it on a Windows server using IIS, and you want to set up CI/CD using Bitbucket Pipelines. If this is your case, continue reading.

Before you get too excited, make sure the following prerequisites have been met:

  • You already have a .NET Core project
  • You have configured SSH on your server
  • You have generated SSH keys and copied the public key to the server
  • Your project is in a Bitbucket repository
  • You have Pipelines enabled in your repository settings

To get started, create a bitbucket-pipelines.yml file at the root of your project.

Step 1.

image: mcr.microsoft.com/dotnet/core/sdk:3.1

At the top of the file, specify the image you want to use. In this case, choose the dotnet/core/sdk:3.1 image, which allows you to run dotnet commands later in your steps.

Then use the pipelines keyword and set the first step in your pipeline.

image: mcr.microsoft.com/dotnet/core/sdk:3.1
pipelines:
  default:
    - step:
        name: Build App
        caches:
          - dotnetcore
        script:
          - dotnet restore
          - dotnet build --no-restore
          - dotnet publish --no-restore -c Release -o $BITBUCKET_CLONE_DIR/release
        artifacts:
          - release/**

The default keyword defines the steps that run when the branch doesn't match any of the branches specified in the file, or when no branches are specified at all. If you wish to limit a step to certain branches, you can use the branches keyword, as shown below.

branches:
  master:
    - step: 
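
For example, a minimal sketch of a pipeline that only runs the build on pushes to master could look like the following; the step body simply mirrors the build step from Step 1:

pipelines:
  branches:
    master:
      - step:
          name: Build App
          caches:
            - dotnetcore
          script:
            - dotnet restore
            - dotnet build --no-restore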

The step keyword introduces the first step in the pipeline and runs the commands listed under script; name is simply a label you give the step. Use caches to cache external dependencies and speed up build times. You can read more about caches here.

Next, run the dotnet commands to restore, build, and publish your project. $BITBUCKET_CLONE_DIR is an environment variable that holds the path where the repository is cloned when the pipeline runs. To make sure the published files end up in that directory, the publish output is set to a directory called “release” inside the clone directory ($BITBUCKET_CLONE_DIR/release).

Artifacts are files created during a step that are saved and can be passed to later steps. You will need the release artifacts in the next step to move the published files to the server.

Step 2.

image: mcr.microsoft.com/dotnet/core/sdk:3.1
pipelines:
  default:
    - step:
        name: Build App
        caches:
          - dotnetcore
        script:
          - dotnet restore
          - dotnet build --no-restore
          - dotnet publish --no-restore -c Release -o $BITBUCKET_CLONE_DIR/release
        artifacts:
          - release/**
    # 2nd step to deploy to the server
    - step:
        name: Deploy to server
        deployment: staging
        script:
          - pipe: atlassian/scp-deploy:0.3.9
            variables:
              USER: $USER
              SERVER: $SERVER
              REMOTE_PATH: $REMOTE_PATH
              LOCAL_PATH: 'release/*'
              DEBUG: 'true'

In the second step, the deployment keyword is used so the step runs with config specific to that environment. You can set variables for each deployment in your repository settings. The pipe atlassian/scp-deploy uses scp (Secure Copy Protocol) to move the published files to the server. The pipe takes a few required parameters to SSH into your server.

  • USER is the user on the server
  • SERVER is the IP or URL of your server
  • REMOTE_PATH is the path on the server to copy the files to; it must exist before the pipeline runs
  • LOCAL_PATH is the path to the files to copy; in this case, the relative path to the artifact produced in the previous step
  • DEBUG is optional and false by default, but it is useful to keep it set to true while working on the pipeline

Documentation for this pipe can be found here.

The SSH key is saved as a repository setting and added to a default location that the scp pipe automatically checks and uses if it exists. The pipe also has an optional SSH_KEY parameter that lets you use a specific key; if you go this route, you will need to base64-encode your private key first.
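
If you do pass the key explicitly, a minimal sketch of that pipe configuration might look like the following; $SSH_KEY_B64 is an assumed repository variable holding the base64-encoded private key, not something the pipe creates for you:

- pipe: atlassian/scp-deploy:0.3.9
  variables:
    USER: $USER
    SERVER: $SERVER
    REMOTE_PATH: $REMOTE_PATH
    LOCAL_PATH: 'release/*'
    SSH_KEY: $SSH_KEY_B64  # assumed repository variable containing the base64-encoded private key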

Step 3.

image: mcr.microsoft.com/dotnet/core/sdk:3.1
pipelines:
  default:
    - step:
        name: Build App
        caches:
          - dotnetcore
        script:
          - dotnet restore
          - dotnet build --no-restore
          - dotnet publish --no-restore -c Release -o $BITBUCKET_CLONE_DIR/release
        artifacts:
          - release/**
    # 2nd step to deploy to the server
    - step:
        name: Deploy to server
        deployment: staging
        script:
          - pipe: atlassian/scp-deploy:0.3.9
            variables:
              USER: $USER
              SERVER: $SERVER
              REMOTE_PATH: $REMOTE_PATH
              LOCAL_PATH: 'release/*'
              DEBUG: 'true'
    # 3rd step runs a script on the server
    - step:
        name: Run batch script
        caches:
          - docker
        script:
          - pipe: docker://accessefm/bat-script-runner:latest
            variables:
              HOST: $SERVER
              USER: $USER
              PORT: $PORT
              SCRIPT_REMOTE_PATH: $DEPLOY_SCRIPT_PATH
              ARGUMENTS: '$SRC $DEST'

In the final step, a batch script is executed on the server. This is necessary because the IIS service needs to be stopped before the files published by the pipeline are moved into the project directory on the server, and restarted after doing so.

In my case, I needed to run a batch script because the Windows Server version did not support running Bash scripts, so I had to create a custom pipe to do so. If your server supports Bash, you could use the atlassian/ssh-run pipe (the server must have WSL installed). If you are in the same boat as me, you could use the accessefm/bat-script-runner pipe I created or make your own custom pipe. Documentation for creating a custom pipe can be found here.
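
If your server can run Bash (for example through WSL), a minimal sketch of that alternative might look like the following; the pipe version and the deploy.sh script are assumptions, and the script would need to stop IIS, copy the release into the site directory, and restart IIS:

- step:
    name: Run deploy script over SSH
    script:
      - pipe: atlassian/ssh-run:0.4.1  # version shown here is an assumption; pin to a current release
        variables:
          SSH_USER: $USER
          SERVER: $SERVER
          MODE: 'script'
          COMMAND: 'deploy.sh'  # hypothetical Bash script in the repo that performs the stop/copy/restart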

That is it! A simple solution for deploying a .NET Core application to a server using only Bitbucket Pipelines. Keep in mind this is a basic example; you may need to beef up your pipelines to match your team's standards.

Answer credit Miguel A. Delgado

Sussman answered 29/12, 2022 at 7:2 Comment(0)

You should use Jenkins.

I use Jenkins to create an "Automated Nightly Build" that pulls from my Bitbucket repo, runs a build, and then moves those files to a server.

It is free, and it is awesome and easy to set up!

https://jenkins.io/

Gaye answered 13/11, 2018 at 5:7 Comment(2)
Is using Jenkins the only way to get this to work with this kind of setup? – Trachoma
Jenkins is just one way. The topic you should research is CI/CD. Here are a few of my other favorite tools to deploy code based on Git hooks: AWS (CodePipeline), Heroku, Terraform. – Gaye
