Package or Project deployment

  • My experience is more with SQL2008R2 SSIS than SQL2012 SSIS, and I'm struggling to see the benefit of the Project deployment.

    The solution I'm working on requires reading source data from over 250 tables and performing a straight dump into a 'landing' environment; from there the new data is shuffled into a 'difference' database, from where it is transformed and combined with other tables into a 'staging' database and then into a Data Vault structure. To ensure sufficient throughput, we spawn a package multiple times via CMD, passing parameters specifying which tables to process. It seems to me that with the Project deployment approach, the SSIS server requires SQL Server to also be installed, and logging is performed into the SSISDB. Using Project deployment, can I have a central server for logging and holding the packages? Do I lose any scale-out capability if we go with Project deployment?

    It seems to me that much has been lost with Project deployment. With Package deployment, I can run packages on multiple servers by reading from a common share. With Project deployment, I'm unable to spawn multiple instances via CMD. Am I missing something? I want to avoid having to develop in excess of 1,000 packages (data flows); currently we have 6 packages and a bunch of XML files that contain the source/destination information.

    Have others made the switch to Project deployments and is it worth the effort, or should I stick with Package deployments?
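
    The fan-out described above can be sketched roughly like this (the package path, variable name, and batch names are invented for illustration; in a real run you would drop the DRYRUN prefix and append `&` to each dtexec call, followed by `wait`, to run the instances concurrently):

```shell
# Sketch of spawning one package multiple times (package deployment model),
# passing each instance its slice of tables via a package variable.
# DRYRUN=echo prints the commands instead of invoking dtexec.
DRYRUN=echo
LAUNCHED=0
for BATCH in Tables_001_050 Tables_051_100 Tables_101_150
do
  # /SET overrides the package variable that drives which tables are processed
  $DRYRUN dtexec /File "\\\\fileshare\\ssis\\LoadLanding.dtsx" \
    /SET "\\Package.Variables[User::TableBatch].Properties[Value];$BATCH"
  LAUNCHED=$((LAUNCHED + 1))
done
echo "spawned $LAUNCHED dtexec instances"
```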

  • pcd_au (8/21/2014)


    Interesting solution architecture. How many SSIS servers are we talking about here, just for interest? Do you have centralised logging of package executions?

    I've been using the project deployment model since 2012 came out. Initially, it took some getting used to, but now I much prefer that all packages in a project are treated together rather than in isolation, and can share connections and parameters. .ispac deployments mean that you always know exactly which version of a package is deployed.

    Your solution sounds so well engineered (and, at six packages, very manageable) that you should stick with what works and maybe start 'trying out' the project deployment model for any new projects.

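    One thing worth noting on the 'unable to spawn via CMD' point: with the project model, dtexec can still launch catalog-deployed packages via the /ISServer switch, so a CMD-based fan-out remains possible. A rough sketch (the server, folder, project, and parameter names are invented; DRYRUN=echo prints the commands rather than running them):

```shell
# Hypothetical: launch one catalog execution per table batch via dtexec.
# /ISServer takes the SSISDB path; /Par sets a project parameter.
DRYRUN=echo
COUNT=0
for BATCH in Tables_001_050 Tables_051_100
do
  $DRYRUN dtexec /ISServer "\\SSISDB\\ETL\\LandingProject\\LoadLanding.dtsx" \
    /Server "MySsisServer" \
    /Par "\$Project::TableBatch(String);$BATCH"
  COUNT=$((COUNT + 1))
done
echo "launched $COUNT executions"
```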

  • Phil

    Currently only one, but we have proven that we can run multiple SSIS servers, with centralised logging / monitoring / reporting.

    As for the number of packages, I have a model to get this down to a total of 3.

    The problem I'm facing is that others (calling themselves architects) are complaining that we are not using the full capability/power of SSIS, such as Data Flows and the new logging and reporting aspects of 2012.

    If I go with Project deployment do I lose the scale out capability?

    I believe that I can pass parameters, but need to check that out.

    From a logging perspective, I need to check whether 'volume' is being logged; just having start and end times is totally useless unless you have some concept of the volume processed. If I have to log my own volume then I may as well log the duration too!

    pcd
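
    On the volume question: the catalog does record rows passed down each data-flow path, though only when the execution runs at the Verbose logging level, via the catalog.execution_data_statistics view in SSISDB. A query along these lines should surface it (a sketch; the grouping is illustrative):

```sql
-- Rows sent down each data-flow path for a given catalog execution.
-- Requires the execution to have run with the Verbose logging level.
SELECT  package_name,
        task_name,
        source_component_name,
        destination_component_name,
        SUM(rows_sent) AS total_rows_sent
FROM    SSISDB.catalog.execution_data_statistics
WHERE   execution_id = @execution_id   -- id from catalog.executions
GROUP BY package_name, task_name,
         source_component_name, destination_component_name;
```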

    About a year ago, I was introduced to a new project using the catalog model, and my personal transition has been difficult. My final verdict is that there's no additional benefit, and it has often caused more deployment issues, since it requires tighter integration and coordination with SQL Agent configurations. It also gets more complicated with source control (we use Team Foundation) and with deploying to multiple SQL instances.

    I can maybe see some benefit of the new Catalog Model over the previous file/package deployments if you're brand new to SSIS. But there's nothing it offers that I haven't done before through configuration files and smart parameter and variable utilization (which is still required with the Catalog Model).

    A few of my colleagues are of the same opinion, and the ramp-up time and training have cost us more time than the model should have saved. I would stick with what's working and wait until the Catalog Model matures.

    In terms of scaling, your XML configurations should still be very viable at least through SQL 2014, I imagine.

    It might depend on your team, but for the teams I'm working with, the ROI on the transition has not been good: retraining and refactoring costs wasted more time and money overall.
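
    For reference, the sort of XML configuration mentioned here is just a dtsConfig file mapping values onto package properties; a trimmed example (variable and connection names invented, header element omitted):

```xml
<DTSConfiguration>
  <!-- Point a package variable at the slice of tables to process -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::TableBatch].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>Tables_001_050</ConfiguredValue>
  </Configuration>
  <!-- Override a connection string per environment -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[LandingDB].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=MyServer;Initial Catalog=Landing;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```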
