How to run different pre- and post-publish SSDT scripts depending on the deploy profile
I am using SSDT in Visual Studio 2013.

I have created some pre and post publish scripts for the development server. The pre-deployment scripts empty data from tables and re-set all the auto identity fields. The post-deployment scripts populate the tables with static and sample data.

Now I need to publish the database to our staging and live database servers. I have created new "publish.xml" profiles for these servers. But obviously I don't want the same pre and post scripts to run.

How can I either specify different scripts depending on the publish profile, or make the scripts aware of the target so they perform different actions?

My biggest concern is publishing to the live server and accidentally destroying data.

Thanks in advance.

Doug

Dusen answered 3/2, 2016 at 10:11 Comment(0)
You have a few options:

1 - Wrap your data changes in a check of @@servername or something else unique to the environment, so you would have something like:

if @@servername = 'dev_server'
begin
    delete blah
    insert blah
end

2 - You can also achieve something similar using SQLCMD variables: pass in a variable such as "/v:DestroyData=true" and then reference it in your script.
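As a sketch of that option (the variable name, table name, and values are examples, not from the question), the pre-deployment script might guard its destructive part like this:

```sql
-- Pre-deployment fragment: only runs when the publish profile or command line
-- passes /v:DestroyData=true. DestroyData and dbo.SampleTable are example names.
IF '$(DestroyData)' = 'true'
BEGIN
    DELETE FROM dbo.SampleTable;
    DBCC CHECKIDENT ('dbo.SampleTable', RESEED, 0);
END
```

Profiles for staging and live would then simply omit the variable or set it to anything other than 'true'.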

3 - Don't use pre/post-deploy scripts at all, but have your own mechanism for running them, e.g. use a batch file to deploy your dacpacs with a call to sqlcmd before and after. The downside is that when deploying, changes to a table result in any foreign keys being disabled before the built-in pre-deploy step and re-enabled after the post-deploy step, and your external scripts fall outside that window.
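For example, such a batch file might look like this (a sketch only; the server, database, and file names are assumptions):

```bat
REM Run our own pre-deploy script, publish the dacpac, then run our post-deploy script
sqlcmd -S DevServer -d MyDb -i PreDeploy.sql
SqlPackage.exe /Action:Publish /SourceFile:MyDb.dacpac /TargetServerName:DevServer /TargetDatabaseName:MyDb
sqlcmd -S DevServer -d MyDb -i PostDeploy.sql
```

You would keep one such batch file (or one set of script arguments) per environment, so the live deployment simply never calls the destructive scripts.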

4 - Edit the dacpacs. The pre/post-deploy scripts are just text files inside the dacpac, which is essentially a zip file that follows the Microsoft packaging format, and there is a .NET packaging API that lets you modify it.
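As a rough sketch of that last option (the entry name "predeploy.sql" and the file names are assumptions; inspect your own package contents first), you could swap the pre-deploy script from PowerShell using the .NET zip API:

```powershell
# Sketch: replace the pre-deployment script inside a dacpac (a dacpac is a zip file).
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::Open('MyDb.dacpac', 'Update')
$old = $zip.GetEntry('predeploy.sql')    # entry name is an assumption - check your package
if ($old) { $old.Delete() }
$writer = New-Object System.IO.StreamWriter((($zip.CreateEntry('predeploy.sql')).Open()))
$writer.Write((Get-Content 'Staging.PreDeploy.sql' -Raw))
$writer.Dispose()
$zip.Dispose()
```

Deployment tooling may validate package contents, so verify the edited dacpac still publishes correctly before relying on this in a pipeline.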

I think that is about it, please ask if anything is unclear :)

Ed

Siskin answered 3/2, 2016 at 11:4 Comment(2)
I would argue with statement number 3; it is only partially true. Post-deployment is a great place to populate static dictionaries, insert initial data, check existing data and its integrity (e.g. whether foreign key and check constraints are trusted), reset sequences (in MSSQL 2012 and later), and generate test data. Jeroldjeroma
Hey, great step-by-step answer on using cmdvars :) - I did point out the downsides to number 2, but they are useful to be aware of so that you can use them occasionally :) Siskin
I would suggest using SQLCMD variables for your conditional script execution.

If you right-click a DB project and choose Properties, there is a tab "SQLCMD Variables":

[Screenshot: DB project properties, SQLCMD Variables tab]

Enter "$(ServerName)" as the variable and something as the default value.

Then you need to open EVERY one of your .publish.xml files in the XML editor and insert the following code after the PropertyGroup part:

<ItemGroup>
    <SqlCmdVariable Include="ServerName">
        <Value>[YourVersionOfServer]</Value>
    </SqlCmdVariable>
</ItemGroup>

[YourVersionOfServer] should equal the result of @@servername on each of your servers.

The final .publish.xml might look like:

[Screenshot: the completed .publish.xml]
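For reference, a completed profile might look roughly like this (a sketch; the database name, connection string, and server value will differ per project):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>MyDatabase</TargetDatabaseName>
    <TargetConnectionString>Data Source=dev_server;Integrated Security=True</TargetConnectionString>
  </PropertyGroup>
  <ItemGroup>
    <SqlCmdVariable Include="ServerName">
      <Value>dev_server</Value>
    </SqlCmdVariable>
  </ItemGroup>
</Project>
```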

Then you should wrap your conditional code in the pre- and post-deployment files with:

if @@servername = '$(ServerName)'
begin
    ... code for specific server
end

Thus you can guarantee that the right code hits the right server.

Jeroldjeroma answered 5/2, 2016 at 12:3 Comment(0)
First, set up SQLCMD variables by right-clicking the project, going to Properties, and opening the SQLCMD Variables tab: [Screenshot: project properties]

Then set up a folder structure to organize the scripts you want to run for a specific server (or anything else you want to switch on, like customer). Each server gets a folder. I like to barrel all the scripts I want to run for that folder into an index file in that folder. The index lists a :r command for each script in the folder that should run, organized by filename with a numerical prefix so the order can be controlled.

[Screenshot: folder structure]

The index file in the folder that groups all the server folders does something different: instead of listing a call to each server's index file, it switches which index file to run based on the SQLCMD variable passed in from the publish profile. It does so with the following simple code:

:r .\$(Customer)\Index.sql
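A per-server index file, in turn, might look like this (the script names are invented for illustration; the numeric prefixes control execution order):

```sql
-- Example contents of .\DevServer\Index.sql; each :r pulls in one script, in order
:r .\01-ClearTables.sql
:r .\02-StaticData.sql
:r .\03-SampleData.sql
```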

The reason to set things up with folders and index files like this is that it not only keeps everything organized, it also allows you to use GO statements in all of your files. You can then use one script with :r statements for all the other scripts you want to run, nesting to your heart's content. You could instead set up your sqlcmd file the following way, which doesn't require folders or specific file names, but it does require you to remove all GO statements. Doing it the above way, you don't have to remove any.

if @@servername = '$(ServerName)'
begin
    ... code for specific server
end

Then when I right-click the project and hit Publish, it builds the project and pops up the following dialog. I can change which scripts get run by changing the SQLCMD variable. [Screenshot: Publish Database dialog]

I like to save the most commonly used settings as a separate publish profile; then I can get back to them just by clicking its .xml file in Solution Explorer. This process makes everything easy, and there is no need to modify the XML by hand: just click Save Profile As, Load Profile, or Create Profile in the publish database dialog shown above.

Also, you can generate your index files with the following PowerShell script:

foreach($directory in (Get-ChildItem -Directory -Recurse) | Get-Item)
{
    # Skip writing to any index file that contains ---Ignore.
    if(-Not ((Test-Path $directory\Index.sql) -and (Get-Content $directory\Index.sql | %{$match = $false}{ $match = $match -or $_ -match "---Ignore" }{$match})))
    {
        $output = "";
        foreach($childitem in (Get-ChildItem $directory -Exclude Index.sql, *.ps1 | Sort-Object | Get-Item))
        {
            $output= $output + ":r .\" + $childitem.name 
            if($childitem -is [system.io.directoryinfo])
            {
                $output = $output + "\Index.sql";
            }
            $output = $output + "`r`n"
        }
        Out-File $directory\Index.sql -Encoding utf8 -InputObject $output       
    }
}
Genetic answered 1/2, 2019 at 15:22 Comment(1)
I like this approach, particularly the :r .\$(Customer)\Index.sql part. However, I can't get this to work fully. I've got a variable called $(ServerName) with a default value, but when publishing and adding in a different value, it still only references the Index.sql of the default path. I've put a question on SO: #57506883 Armbrecht
For anyone wondering, you could also do something like this:

Publish profile:

<ItemGroup>
    <SqlCmdVariable Include="Environment">
        <Value>Dev</Value>
    </SqlCmdVariable>
</ItemGroup>

Post deployment script:

if '$(Environment)' = 'Dev'
begin
    ... code for specific server
end
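If you publish from the command line instead of Visual Studio, the same variable can be supplied to SqlPackage, overriding whatever the profile contains (a sketch; the file names and values are examples):

```bat
SqlPackage.exe /Action:Publish /SourceFile:MyDb.dacpac /Profile:Staging.publish.xml /v:Environment=Staging
```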

It seemed a bit more natural to me this way compared to the "ServerName" semantics. I also had issues trying to use @@servername, but I'm not sure why.

Droppings answered 8/4, 2021 at 17:51 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.