
Controlling a dacpac deployment


I have been thinking quite a lot recently (ok, not that much, but my thinking has changed) about how to deploy dacpacs. In particular I am talking about how to control the deployment: what tool is used to deploy the dacpac, what connection details are passed in, and which of the hundreds of parameters are passed in and how.

As I have configured and deployed databases using dacpacs in a number of different environments, my approach has evolved like this:

In the beginning there was sqlpackage.exe

sqlpackage is great: it gives you a well documented command line and you can pass whatever arguments you need. I have typed sqlpackage /action:script or sqlpackage /action:publish so many times that the a, c and t keys on my keyboard are a little bit faded (they aren't, but you get the picture!).
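To give a flavour, a typical publish looks something like this (a minimal sketch - the server, database and dacpac names are made up, but the parameters are standard sqlpackage ones):

    sqlpackage.exe /Action:Publish `
        /SourceFile:"MyDatabase.dacpac" `
        /TargetServerName:"." `
        /TargetDatabaseName:"MyDatabase" `
        /p:BlockOnPossibleDataLoss=False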

What I used to do was check into my unit test SSDT projects a directory containing sqlpackage and all of its dependent dlls, so I could run the deploy from any build agent without having to have the DacFx or SSDT installed. This worked really well, but there were a couple of downsides. Firstly, the version checked in would hardly ever be updated, and with an almost monthly (actual?) release cycle for SSDT builds, you get behind pretty quickly. The other main issue is that you have to build up a lot of command line arguments, so invariably I would end up with a PowerShell script or a piece of .NET code to manage that complexity. This is made even more complex when you have to consider the pain that is spaces in Windows paths in an automated deployment process. There is an alternative to passing arguments on the command line, which is to use a publish profile - I completely discounted these at the time; I am not sure why, but it was a long time ago. Let's just agree that the important thing is that this approach to using sqlpackage.exe left me feeling a little uneasy in a few places.
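The wrapper scripts ended up looking something like this (a sketch from memory - the parameter names and paths are illustrative rather than from a real project):

    param(
        [string]$ServerName,
        [string]$DatabaseName,
        [string]$DacpacPath
    )

    # sqlpackage.exe and its dlls checked in next to the script
    $sqlpackage = Join-Path $PSScriptRoot "sqlpackage\sqlpackage.exe"

    # Build the arguments as an array so paths with spaces survive quoting
    $sqlpackageArgs = @(
        "/Action:Publish",
        "/SourceFile:$DacpacPath",
        "/TargetServerName:$ServerName",
        "/TargetDatabaseName:$DatabaseName",
        "/p:BlockOnPossibleDataLoss=False"
    )

    & $sqlpackage $sqlpackageArgs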

After sqlpackage.exe there came code

The next approach was to write some code, either PowerShell or C#. Typically PowerShell would be part of a production CI/CD process - I would write a script to call the DacFx and then call the script from Jenkins/VSTS etc. I also found myself calling the DacFx from C#, but that was typically limited to integration tests for tools that I was writing.
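A cut-down version of the sort of script I mean (a sketch, assuming Microsoft.SqlServer.Dac.dll has been made available next to the script; the variable names are mine):

    param(
        [string]$ServerName,
        [string]$DatabaseName,
        [string]$DacpacPath
    )

    # Load the DacFx from a known location rather than relying on an install
    Add-Type -Path (Join-Path $PSScriptRoot "Microsoft.SqlServer.Dac.dll")

    $options = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
    $options.BlockOnPossibleDataLoss = $false

    $package = [Microsoft.SqlServer.Dac.DacPackage]::Load($DacpacPath)
    $services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=$ServerName;Integrated Security=True;"

    # Deploy, upgrading the database in place if it already exists
    $services.Deploy($package, $DatabaseName, $true, $options)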

I liked the scripted approach because it meant that I could still pass some arguments like the server name and user credentials on the command line (or encrypt them where necessary) and put a whole load of arguments in the script to be shared by each environment. There were still a couple of problems. Firstly, as with the sqlpackage.exe approach, the DacFx needed to be installed and available, so I would check the files into source control (or make them available some other way). There was also one additional problem that I did not foresee: when you use sqlpackage.exe you can load contributors from a sub-folder called "Extensions", but when you use the DacFx yourself you have to install them into Program Files, which went against my (pretty strong) desire to be able to deploy from any build agent (Windows for now, but I'm hoping!).

Then came the flip-flopping

For a while I meandered between the two approaches, until the SSDT team announced that they had released a NuGet package containing the DacFx. I decided to move over to that, as it meant I no longer had to check the dlls into source control, which in itself is a big win. I also decided to fix the extensions issue, and figured out a (slightly hacky) way to get the DacFx dlls in the NuGet package to behave like sqlpackage and load contributor dlls from a sub-directory - I fixed that using this PowerShell module that wraps a .NET dll (https://the.agilesql.club/blogs/Ed-Elliott/DacFxed-Nugetized-DacFx-Power...). Now I had sorted both not having to check in dlls and still being able to load contributors without installing into Program Files, BUT I still had the problem of lots of command line args, which I was sharing in PowerShell scripts and passing in some custom bits like server/db names etc.

The final piece of the puzzle

<blink>Publish Profiles</blink>* ta daaaa.

I had actually been using publish profiles all along: I normally had at least one as part of my unit test project, which I would use to deploy my whole database locally and run the tests before I did a git push (obviously during dev I would do a quick deploy rather than constant full deploys). My current grand scheme is to put everything I need in publish profiles and use those to deploy dacpacs. I realise that this isn't a great revelation, but I am happy with them and pleased with where I am now - who knows where I will end up!
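For reference, a publish profile is just an MSBuild-format XML file holding the target and the deployment options; a minimal one looks something like this (the names are made up, and which options you set is up to you):

    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="14.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <TargetDatabaseName>MyDatabase</TargetDatabaseName>
        <TargetConnectionString>Data Source=.;Integrated Security=True</TargetConnectionString>
        <BlockOnPossibleDataLoss>False</BlockOnPossibleDataLoss>
      </PropertyGroup>
    </Project>

You can pass one to sqlpackage with /Profile:MyDatabase.publish.xml, or load it in code with [Microsoft.SqlServer.Dac.DacProfile]::Load, and then just override the odd per-environment value like the target server.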

*I know blink doesn't work 🙂
