SSAS Deployment Utility

  • Has anyone had any success using the SSAS Deployment Utility for SQL Server 2012 (Microsoft.AnalysisServices.Deployment.exe)? It should be possible to run it in 3 modes:

    1. Silent mode (/s) - no user interaction, logging to console or a specified log file

2. Answer mode (/a) - this launches the Deployment Wizard; no deployment takes place, but the choices made are written back to the files originally created by the Build process.

3. Output mode (/o) - no deployment takes place, but the XMLA deployment script is written to a user-specified file.
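For reference, the documented syntax is `Microsoft.AnalysisServices.Deployment [ASdatabasefile] {[/s[:logfile]] | [/a] | [[/o[:scriptfile]] [/d]]}`. A small sketch of how an automated process would build the silent- and output-mode invocations (all paths are placeholders; the actual `subprocess` call is commented out since the utility is what's failing here):

```python
# Sketch: building the documented command lines for the SSAS Deployment
# Utility (Microsoft.AnalysisServices.Deployment.exe). The EXE path and
# file names are illustrative placeholders.

EXE = r"C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"

def silent_mode(asdatabase, logfile=None):
    """Silent mode (/s): deploy with no UI, optionally logging to a file."""
    switch = "/s" if logfile is None else f"/s:{logfile}"
    return [EXE, asdatabase, switch]

def output_mode(asdatabase, script_file):
    """Output mode (/o): write the XMLA deployment script instead of
    deploying. /d tells the utility not to connect to the target."""
    return [EXE, asdatabase, f"/o:{script_file}", "/d"]

cmd = output_mode(r"bin\MyCube.asdatabase", r"bin\MyCube.xmla")
# An installer would then run: subprocess.run(cmd, check=True)
print(" ".join(cmd))
```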

    The only mode in which I can get it to do anything is 'answer mode', but that is not much use for inclusion in an automated deployment process.

    It makes no difference whether running on a 32-bit or 64-bit environment.

    I found a post on another site from someone who was having exactly the same problem and in the end they ditched the utility and did everything via a PowerShell script! I don't mind going down that route if I have to, but I just thought I'd put the question out there.

    I can feel a Connect issue coming on....;-)

    Regards

    Lempster

  • Ok, so no one uses the Deployment Utility to deploy SSAS projects then?

So, question: how do you deploy your 2012 SSAS projects in as automated a way as possible, particularly when, as a Developer, you may not have access to all the environments to which the package should be deployed (Production, for instance)?

    Thanks

    Lempster

    P.S. I've logged a Connect issue. If any of you have experienced the same issue or are able to recreate it, please vote.

    https://connect.microsoft.com/SQLServer/feedback/details/814749/ssas-2012-deployment-utility-only-works-in-a-answer-mode

  • Unfortunately, I have not really bothered with it. I usually automate my Tabular Model Processing using XMLA called via a SQL Server Agent job or by using AMO.
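For reference, the XMLA called from a SQL Server Agent job step (step type "SQL Server Analysis Services Command") can be as small as a single Process command. This is a sketch; the database ID is a placeholder for the ID of your deployed model:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- Placeholder: use the DatabaseID of your deployed model -->
    <DatabaseID>MyTabularModel</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
```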

The Deployment Wizard is nice for generating XMLA scripts to hand off to the DBA, but once you've got your script it's very easy to automate using the tool of your choice (examples: run it straight from SQL Server Agent, the Analysis Services Execute DDL Task in SSIS, or ASCMD).

  • sneumersky (1/29/2014)


    Unfortunately, I have not really bothered with it. I usually automate my Tabular Model Processing using XMLA called via a SQL Server Agent job or by using AMO.

    Sure, that's for processing a tabular model or multidimensional cube, but I'm talking about deployment of the project.

  • Andrew Notarian (1/29/2014)


The Deployment Wizard is nice for generating XMLA scripts to hand off to the DBA, but once you've got your script it's very easy to automate using the tool of your choice (examples: run it straight from SQL Server Agent, the Analysis Services Execute DDL Task in SSIS, or ASCMD).

    Thanks Andrew.

I agree that the Deployment Wizard can do that, but I'm talking about the Deployment Utility, which is run from the command line and does not require any user interaction when run in either Silent or Output mode.

What I want to be able to do, ideally, is to enable the IT Operations team to deploy an entire solution (containing a Database Project, an SSIS Project and an SSAS Project) to multiple environments via an installer process such as WiX. To do that, the installer needs to run the individual utilities to deploy each project: for Database Projects that's sqlpackage.exe, for SSIS Projects it's the SSIS Deployment Utility, and for SSAS Projects it should be the SSAS Deployment Utility. But as I can't get the latter to work in the mode I need, that option does not seem to be available to me.
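To make the intended installer flow concrete, here is a sketch of the three per-project command lines such a process would run. The switch names are from the SQL Server 2012 tooling as I recall them (the ISDeploymentWizard switches in particular should be verified against the utility's own /? output), and all paths and server names are placeholders:

```python
# Sketch of the orchestration an installer (e.g. a WiX custom action)
# would perform: one silent command-line deployment per project type.
# Switch names are from memory of the 2012 tooling; verify each against
# the utility's /? help. Paths/server names are placeholders.

def dacpac_cmd(dacpac, server, database):
    # Database Project: sqlpackage.exe publishes a .dacpac
    return ["sqlpackage.exe", "/Action:Publish",
            f"/SourceFile:{dacpac}",
            f"/TargetServerName:{server}",
            f"/TargetDatabaseName:{database}"]

def ispac_cmd(ispac, server, catalog_path):
    # SSIS Project: ISDeploymentWizard.exe has a documented silent mode
    return ["ISDeploymentWizard.exe", "/Silent",
            f"/SourcePath:{ispac}",
            f"/DestinationServer:{server}",
            f"/DestinationPath:{catalog_path}"]

def ssas_cmd(asdatabase, logfile):
    # SSAS Project: the Deployment Utility's silent mode -- the step
    # this thread reports as not working
    return ["Microsoft.AnalysisServices.Deployment.exe",
            asdatabase, f"/s:{logfile}"]

for cmd in (dacpac_cmd("DWH.dacpac", "PRODSQL01", "DWH"),
            ispac_cmd("ETL.ispac", "PRODSQL01", "/SSISDB/DWH/ETL"),
            ssas_cmd("Cube.asdatabase", "deploy.log")):
    print(" ".join(cmd))  # installer would run these via subprocess
```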

I suppose I could run the Deployment Wizard in the DEV environment, get it to produce an XMLA deployment script and then notify the Ops team of the changes they will need to make to the script to deploy into other environments, but that introduces the possibility of mistakes being made. Maybe I'll have to rely on that and make sure that the accompanying documentation is up to scratch! 😉

    Regards

    Lempster

  • In that case I think you are not going to find too many people who have used that. I don't think most people deploy the whole solution to other environments.

  • Andrew Notarian (1/30/2014)


    In that case I think you are not going to find too many people who have used that. I don't think most people deploy the whole solution to other environments.

Interesting philosophy, Andrew. Presumably most people would want to test their ETL process (SSIS-based) and cube processing/browsing in at least one environment prior to Production, so are you saying that each composite part would be deployed piecemeal, i.e. create and populate the relational DWH database as one step, deploy and test the ETL packages as another, and then deploy and test the cube as a third step?

    If one is developing using SSDT-BI inside the Visual Studio shell (or the full-blown Visual Studio), one has the option to Deploy the entire Solution. What I am seeking is a method by which people without access to Visual Studio and SSDT-BI can do that. Maybe I should just seek the Holy Grail instead!

    Regards

    Lempster

  • Lempster (1/30/2014)


    Andrew Notarian (1/30/2014)


    In that case I think you are not going to find too many people who have used that. I don't think most people deploy the whole solution to other environments.

Interesting philosophy, Andrew. Presumably most people would want to test their ETL process (SSIS-based) and cube processing/browsing in at least one environment prior to Production, so are you saying that each composite part would be deployed piecemeal, i.e. create and populate the relational DWH database as one step, deploy and test the ETL packages as another, and then deploy and test the cube as a third step?

    If one is developing using SSDT-BI inside the Visual Studio shell (or the full-blown Visual Studio), one has the option to Deploy the entire Solution. What I am seeking is a method by which people without access to Visual Studio and SSDT-BI can do that. Maybe I should just seek the Holy Grail instead!

    I came to BI work from a development background. I treat SSIS and SSAS projects as if they were ASP.NET projects. In ASP.NET, you build your project and only deploy the DLLs and ASPX pages to the server. The source code (i.e. the solution) does not get deployed. For SSIS, we build the project and only the DTSX files get deployed to the server.

    SSAS is more analogous to traditional database development. You pass off your DDL to the DBA who then reviews and runs it to support your changes to the application. For SSAS, we use the Deployment Wizard to generate XMLA instead of T-SQL.

The data warehouse piece would just be however you deal with regular old OLTP databases in your company.

    With the data warehouse structure up to date, your new SSIS ETL in place and your latest OLAP cube structure ready to go, you should be able to run your SSIS packages to load up your data warehouse properly and process your cube. If that works in QA, then you have some level of confidence that you can repeat those deployment steps in Production with success.

Using that approach you would not really want to have your solutions and projects (i.e. your source code) on a server, be it Test, QA, Stage or Production.

    At this exact moment I am working on the best way to deal with configuration changes from environment to environment so that the compiled ETL code (i.e. the DTSX files) can remain the same with only the config changes. That's also like ASP development where you ignore your web.config file when deploying new compiled .NET code.

  • One more thing, the people deploying your changes should NOT need Visual Studio. Even Microsoft is not interested in selling Visual Studio to just sit on a server doing nothing 99% of the time.

    For the cube, you could bundle it all up in a batch file so that your XMLA will run from the command line using ASCMD.
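A sketch of what that batch file might look like. ASCMD is the sample command-line tool shipped with the SQL Server samples, and the server and file names here are placeholders; check its -? output for the exact switches in your build:

```bat
REM deploy_cube.bat -- sketch; server and file names are placeholders.
REM ascmd switches should be verified against ascmd -? for your build.
ascmd -S PRODSSAS01 -i MyCube.xmla -o deploy_result.xml
```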

    For SSIS you have the Package Installation Wizard.

  • I don't disagree with you in general and that's the way I've worked in the past. I think it is a situation peculiar to my current place of work where everything gets deployed using a one-click approach - a .MSI file that runs all the deployment utilities under the covers. This is probably because there is no DBA (yes, you read that right :w00t:) and so they want as little manual involvement as possible.

    Andrew Notarian (1/30/2014)


    At this exact moment I am working on the best way to deal with configuration changes from environment to environment so that the compiled ETL code (i.e. the DTSX files) can remain the same with only the config changes.

    If you are using SQL 2012 you can do this using Environments, Environment Variables and Parameters. If you're using 2005/2008 then Package Configurations are the way to go.
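As a sketch of the 2012 approach: an Environment holds per-environment variable values, and a project parameter is bound to a variable by reference, so the same compiled .ispac runs unchanged everywhere. Folder, project, parameter and variable names below are placeholders:

```sql
-- Sketch (SQL 2012 SSIS catalog): bind a project parameter to an
-- environment variable so only the environment differs per server.
DECLARE @env_ref bigint;

EXEC SSISDB.catalog.create_environment
     @folder_name = N'DWH', @environment_name = N'Prod';

EXEC SSISDB.catalog.create_environment_variable
     @folder_name = N'DWH', @environment_name = N'Prod',
     @variable_name = N'SourceServer', @data_type = N'String',
     @sensitive = 0, @value = N'PRODSQL01';

EXEC SSISDB.catalog.create_environment_reference
     @folder_name = N'DWH', @project_name = N'ETL',
     @environment_name = N'Prod', @reference_type = 'R',
     @reference_id = @env_ref OUTPUT;

-- @object_type = 20 means a project-level parameter;
-- @value_type = 'R' means "referenced": the value is the variable name.
EXEC SSISDB.catalog.set_object_parameter_value
     @object_type = 20, @folder_name = N'DWH', @project_name = N'ETL',
     @parameter_name = N'SourceServer',
     @parameter_value = N'SourceServer', @value_type = 'R';
```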

    Regards

    Lempster

  • MSI makes sense for file & registry updates, but are you running SQL updates that way too? Yikes. It sounds like a nice idea but must take a lot of work.

  • Andrew Notarian (1/30/2014)


    ...Yikes. It sounds like a nice idea but must take a lot of work.

    It does. 🙂

    Edit: changed 'is' to 'does'

Dude, I really feel for you getting put through the wringer like that.

  • sneumersky (1/30/2014)


Dude, I really feel for you getting put through the wringer like that.

    Thanks! 🙂

    I have to be phlegmatic about it and think that at least I'll learn something along the way.
