DACPAC schema compare runs before pre-deployment scripts during publish
When publishing a dacpac with sqlpackage.exe, Schema Compare runs first, followed by the pre-deployment scripts. This causes a problem when, for instance, you need to drop a table or rename a column: the schema compare was done before the object was modified, so the deployment fails and the publish must be repeated to pick up the new schema.

Anyone have a work-around for this that does not involve publishing twice?

Rajah answered 1/9, 2015 at 11:41 Comment(2)
Renames should be handled natively if you use the Refactor capability. Dropping tables is also supported in the project. However Ed's suggestion is valid for times when the native functionality won't work. What are you trying to drop/rename/accomplish that needs to be done before the schema compare?Munt
In this particular case, I am dropping a foreign key, altering the FK column from BIGINT to INT, and pointing the FK to a different table that is itself being refactored. The deployment fails the first run because schema compare is done before these changes. It deploys on the second run.Rajah
Gert Drapers called it a pre-pre-deployment script here

It is indeed a challenge. If you need to add a non-nullable, foreign-key column to a table full of data, you can only do it with a separate script.

If you are the only developer, that is not a problem; but on a large team that "separate script" somehow has to be executed before every DB publish.

The workaround we used:

  • Create a separate SQL "Before-publish" script (in the DB project) with the property [Build action = None]
  • Create a custom MSBuild task that calls the SQLCMD.EXE utility, passing the "Before-publish" script as a parameter, and then calls the SQLPACKAGE.EXE utility, passing DB.dacpac
  • Add a call to the custom MSBuild task in the db.sqlproj file. For example:
<UsingTask 
        TaskName="MSBuild.MsSql.DeployTask" 
        AssemblyFile="$(MSBuildProjectDirectory)\Deploy\MsBuild.MsSql.DeployTask.dll" />

<Target Name="AfterBuild">
    <DeployTask 
        Configuration="$(Configuration)" 
        DeployConfigPath="$(MSBuildProjectDirectory)\Deploy\Deploy.config" 
        ProjectDirectory="$(MSBuildProjectDirectory)" 
        OutputDirectory="$(OutputPath)" 
        DacVersion="$(DacVersion)">
    </DeployTask>
</Target>

MsBuild.MsSql.DeployTask.dll above is the custom MSBuild task.

This way the "Before-publish" script can be executed from Visual Studio.

For CI we used a batch file (*.bat) in which the same two utilities (SQLCMD.EXE & SQLPACKAGE.EXE) were called.
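
A minimal sketch of such a CI step, written here as a POSIX shell script. The server, database, and file names are placeholder assumptions, and a `DRY_RUN` flag (on by default) is added so the commands can be inspected without the SQL Server tooling installed:

```shell
#!/bin/sh
# Sketch of a CI step: run the "Before-publish" script with SQLCMD,
# then publish the dacpac with SQLPACKAGE. All names are placeholders.
SERVER="myserver"                         # target instance (assumption)
DATABASE="MyDb"                           # target database (assumption)
BEFORE_SCRIPT="Deploy/BeforePublish.sql"  # pre-pre-deployment script
DACPAC="bin/Release/MyDb.dacpac"

# Dry-run by default so the commands are only printed;
# set DRY_RUN=0 to actually execute them.
run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "$@"
    else
        "$@"
    fi
}

# 1) Execute the "Before-publish" script: drops and renames happen here,
#    before the schema compare. -b aborts the batch on a SQL error.
run sqlcmd -S "$SERVER" -d "$DATABASE" -b -i "$BEFORE_SCRIPT"

# 2) Publish the dacpac; the schema compare now sees the modified schema.
run sqlpackage /Action:Publish \
    /SourceFile:"$DACPAC" \
    /TargetServerName:"$SERVER" \
    /TargetDatabaseName:"$DATABASE"
```

The same two calls, minus the dry-run wrapper, are what the original *.bat file would contain.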

The final process we ended up with is a little complicated and deserves a separate article; here I have only outlined the direction. :)

Leslielesly answered 3/9, 2015 at 13:1 Comment(1)
An experiment with the "Before-publish" and deployment without Visual Studio: github.com/lobodava/dac-deployerLeslielesly
Move from using Visual Studio to scripts that drive sqlpackage.exe, and you have the flexibility to run scripts before the compare:

https://the.agilesql.club/Blog/Ed-Elliott/Pre-Deploy-Scripts-In-SSDT-When-Are-They-Run

ed

Libenson answered 1/9, 2015 at 12:17 Comment(1)
How does your existing CI process call msdeploy?Libenson
We faced a situation where we needed to transform data from one table into another during deployment of the database project. This is awkward to do with the DB project alone, because in the pre-deployment script the destination table (or column) does not exist yet, while in the post-deployment script the source table (or column) is already gone.

To transform data from TableA to TableB we used the following idea (this approach can be used for any data modification):

  1. The developer adds the destination table (dbo.TableB) to the DB project and deploys it to the local DB (without committing to SVN)
  2. He or she creates a pre-deployment transformation script. The trick is that the script puts the resulting data into a temporary table: #TableB
  3. The developer deletes dbo.TableA in the DB project. It is assumed that the table will be dropped during execution of the main generated script.
  4. The developer writes a post-deployment script that copies the data from #TableB to dbo.TableB, which was just created by the main script.
  5. All of the changes are committed to SVN.

This way we don't need the pre-pre-deployment script, because we store the intermediate data in a temporary table.

Note that the approach using a pre-pre-deployment script involves the same intermediate (temporary) data; it is just stored in real tables rather than temporary ones. That data exists between the pre-pre-deployment and pre-deployment steps, and after the pre-deployment script executes it disappears.

Moreover, the approach with temporary tables lets us handle the following complicated but realistic situation. Imagine we have two transformations in our DB project:

  1. TableA -> TableB
  2. TableB -> TableC

Apart from that we have two databases:

  1. DatabaseA, which has TableA
  2. DatabaseB, where TableA has already been transformed into TableB. TableA is absent in DatabaseB.

Nonetheless we can deal with this situation; we need just one new action in the pre-deployment script. Before the transformation we try to copy the data from dbo.TableA into #TableA, and the transformation script then works with temporary tables only.

Let me show you how this idea works in DatabaseA and DatabaseB. It is assumed that the DB project has two pairs of pre- and post-deployment scripts: "TableA -> TableB" and "TableB -> TableC".

Below is an example of the scripts for the "TableB -> TableC" transformation.

Pre-deployment script

----[The data preparation block]---
--We must prepare for a possible transformation
--The condition should verify the existence of the necessary columns
IF  OBJECT_ID('dbo.TableB') IS NOT NULL AND
    OBJECT_ID('tempdb..#TableB') IS NULL
BEGIN
    CREATE TABLE #TableB
    (
        [Id] INT NOT NULL PRIMARY KEY, 
        [Value1] VARCHAR(50) NULL, 
        [Value2] VARCHAR(50) NULL
    )

    INSERT INTO [#TableB]
    SELECT [Id], [Value1], [Value2]
    FROM dbo.TableB
END

----[The data transformation block]---
--The condition that starts the transformation
--It is very important: it must be as strict as possible to ward off wrong executions.
--The condition should verify the existence of the necessary columns
--Note that the condition and the transformation must use #TableB instead of dbo.TableB
IF  OBJECT_ID('tempdb..#TableB') IS NOT NULL 
BEGIN

    CREATE TABLE [#TableC]
    (
        [Id] INT NOT NULL PRIMARY KEY, 
        [Value] VARCHAR(50) NULL
    )

    --Data transformation. The source and destination tables must be temporary tables.
    INSERT INTO [#TableC]
    SELECT [Id], Value1 + ' '+ Value2 as Value
    FROM [#TableB]

END

Post-deployment script

--There must be a strict condition here to ward off failures
--Checking the existence of the fields is a good idea
IF  OBJECT_ID('dbo.TableC') IS NOT NULL AND
    OBJECT_ID('tempdb..#TableC') IS NOT NULL
BEGIN

    INSERT INTO [TableC]
    SELECT [Id], [Value]
    FROM [#TableC]

END

In DatabaseA, the "TableA -> TableB" scripts have already created #TableA (in the preparation block) and #TableB (in the transformation block). For "TableB -> TableC", the data preparation block won't execute, because there is no dbo.TableB in the database; but the data transformation will execute, because #TableB is present, having been created by the transformation block of "TableA -> TableB".

In DatabaseB, the data preparation and transformation blocks of the "TableA -> TableB" script won't be executed. However, we already have the transformed data in dbo.TableB, so the data preparation and transformation blocks of "TableB -> TableC" will execute without any problem.

Marked answered 28/10, 2015 at 8:22 Comment(0)
I use the workaround below in such scenarios:

  1. If you would like to drop a table
  • Retain the table within the dacpac (under the Tables folder).
  • Create a post-deployment script to drop the table.
  2. If you would like to drop a column
  • Retain the column in the table definition within the dacpac (under the Tables folder).
  • Create a post-deployment script to drop the column.

This way you can drop tables and columns from your database, and on the next deployment (maybe after a few days or even months) exclude that table/column from the dacpac so that the dacpac is up to date with the latest schema.
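
As a minimal sketch of that workaround: the post-deployment drop should be guarded with `IF OBJECT_ID` so it stays idempotent across repeated deployments. The shell helper below generates such a guarded T-SQL batch; the table name and output file name are illustrative assumptions:

```shell
#!/bin/sh
# Emit a guarded post-deployment drop batch for a table that is still
# retained in the dacpac. The table name below is a placeholder.
make_drop_script() {
    table="$1"
    cat <<EOF
-- Post-deployment: drop dbo.${table} only if it still exists, so the
-- script stays idempotent across repeated deployments.
IF OBJECT_ID('dbo.${table}', 'U') IS NOT NULL
    DROP TABLE dbo.${table};
EOF
}

# Write the script next to the project's other post-deployment scripts
# (the path is an assumption).
make_drop_script "ObsoleteTable" > PostDeploy_DropObsoleteTable.sql
```

Dropping a column works the same way, with `COL_LENGTH('dbo.Table', 'Column') IS NOT NULL` as the guard instead of `OBJECT_ID`.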

Belize answered 10/11, 2021 at 13:50 Comment(0)