However, I am struggling with deploying the SQL part, as I do not have a .dacpac file in my build artifacts. How do I generate it? Every option I have tried ended up with the process failing. I can access the database, as I am marked as DBO on the machine.
Firstly, you have to create a SQL Server Database Project using SSDT (or the Azure Data Studio insiders preview) by importing the objects of the live database.
The database project is then placed into a repository.
The pipeline (classic or YAML) needs the MSBuild@1 build task. Here is a YAML example; it generates the dacpac:
- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln'
  inputs:
    solution: 'src/YourDatabase.sln'
This task compiles the database project and produces the dacpac file(s).
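If you need a different build configuration (for example Release), MSBuild@1 also accepts configuration and platform inputs. A minimal sketch, assuming the same solution path; note that the output folder used by the copy step below then changes from bin\Debug to bin\Release:

- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln (Release)'
  inputs:
    solution: 'src/YourDatabase.sln'
    configuration: 'Release'   # dacpac is written to ...\bin\Release\ instead of ...\bin\Debug\
    platform: 'Any CPU'        # assumed platform name; adjust to your project settings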
- Then the produced files are to be extracted:
- task: CopyFiles@2
  displayName: 'Extract DACPACs'
  inputs:
    SourceFolder: '$(agent.builddirectory)\s\src\YourDatabase\bin\Debug\'
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
    CleanTargetFolder: false
- And finally, they are published as the artifact:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Artifact'
  inputs:
    targetPath: '$(build.artifactstagingdirectory)'
    artifact: 'drop'
- Deployment of the dacpac is the final goal and can be done using SqlDacpacDeploymentOnMachineGroup@0; however, this is out of the scope of the original question (a minimal sketch follows below).
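For completeness, here is a minimal sketch of that deployment task; the artifact path, server name, database name, and authentication scheme are assumptions you would replace with your own values:

- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Deploy YourDatabase.dacpac'
  inputs:
    TaskType: 'dacpac'                                              # deploy a dacpac rather than a SQL script
    DacpacFile: '$(Pipeline.Workspace)\drop\YourDatabase.dacpac'    # assumed location of the downloaded artifact
    TargetMethod: 'server'
    ServerName: 'localhost'                                         # assumed target SQL Server instance
    DatabaseName: 'YourDatabase'                                    # assumed target database name
    AuthScheme: 'windowsAuthentication'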
My question also is: do I need to generate this DACPAC file through a build every time, or can it be generated only once, stored on the machine, and then pointed to from the deployment process?
It depends.
Classic pipelines separate the BUILD and RELEASE phases. In this case, you can build the dacpac once and reuse it for many future releases.
In the case of multi-stage YAML pipelines, it is common for every pipeline run to trigger both the build and the deployment stages, because they belong to the same pipeline and run as a single unit of work.
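That said, even with YAML you can reuse a previously built dacpac by declaring the publishing pipeline as a resource in a separate deployment pipeline. A minimal sketch, assuming a build pipeline named 'YourDatabase-CI' that publishes the 'drop' artifact shown above:

resources:
  pipelines:
    - pipeline: dbBuild             # local alias for the resource
      source: 'YourDatabase-CI'     # assumed name of the build pipeline publishing 'drop'

steps:
  - download: dbBuild               # downloads to $(Pipeline.Workspace)/dbBuild
    artifact: 'drop'
  # the dacpac is now at $(Pipeline.Workspace)/dbBuild/drop/YourDatabase.dacpac
  # and can be deployed with SqlDacpacDeploymentOnMachineGroup@0 as sketched earlier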