Create one loop for 27 tables

  • Hi there, everyone.

    I know this might sound stupid, but I need to ask it anyway.

    I have to get data across from one server to another via SSIS. All 27 transfers use the same logic: a SQL task that truncates the destination, a data flow task with an OLE DB source and destination, then another SQL task, and finally one more SQL task. It's the same process for every table.

    How can I set it up so that it essentially loops through and runs that process for each table, instead of having to create 27 packages? Each SQL task will be different, as it pertains to its own table.

    I should add that the tables vary vastly, so during the workflow I have to match up the columns manually, as the column names I am given are nonsensical to end users. I have created understandable field names, so I have to map the source columns to the columns that will be used on the destination table.

    • This topic was modified 2 years, 8 months ago by  darkangelBDF.
  • This depends on how you've set things up. You say here that you're using a SQL Task, which, I assume, means you aren't using Data Flow tasks to migrate the data from one server to the other. Are you therefore using linked servers to migrate the data from one instance to the other? If so, I don't really see the need for SSIS at all.

    If, however, you are using Data Flows and Data Sources, you can't use a Foreach Loop in SSIS to do this. Data in SSIS needs to be well defined; it can't have a dynamic definition. This means you need to explicitly define the columns and their data types in the data flow, and your tables aren't all going to have the same definition (I would be concerned if they did, as that would likely mean a design flaw). For such scenarios, you need to build a separate flow for each table. That could be one Data Flow task with many flows defined in it (which would run in parallel), multiple Data Flow tasks that run sequentially, or a mixture. The latter may well be the most efficient, as (I assume) you have primary and foreign key constraints. This means you can first load the tables that aren't bound by FKs, and then migrate the tables that are bound by them afterwards.
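    The ordering described above amounts to a topological sort of the tables over their foreign-key references. A minimal sketch in Python, with an invented dependency map (every table name here is hypothetical; in practice the map would be queried from sys.foreign_keys):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical example: each table maps to the set of tables its
# foreign keys reference. In practice this would be queried from
# sys.foreign_keys on the destination server.
fk_depends_on = {
    "dbo.Customers":  set(),
    "dbo.Products":   set(),
    "dbo.Orders":     {"dbo.Customers"},
    "dbo.OrderLines": {"dbo.Orders", "dbo.Products"},
}

# static_order() yields each table only after every table it
# references, i.e. a safe order in which to run the Data Flow tasks.
load_order = list(TopologicalSorter(fk_depends_on).static_order())
```

    The resulting list puts FK-free tables first, so the matching Data Flow tasks can simply be wired up in that sequence.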

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Thanks for the advice. I have data flow tasks. I've come to the realization that the only way to do it is to have a separate workflow for each table. The SQL tasks just do things like updating a staging table before the data flow task runs. After the data flow task, one runs an update on the destination DB and then another calls a merge sproc. There is essentially no other way of doing the data flow tasks, though. Thank you again for your response.
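    Since only the SQL tasks vary per table (the data flows themselves have to stay hand-built, per the above), those statements can at least be generated from a single table list rather than typed 27 times, e.g. to feed SSIS variables/expressions. A rough Python sketch; every object name below (the stg schema, LoadControl, the Merge_* procs, LoadedAt) is invented for illustration, not taken from the actual packages:

```python
# Build the four per-table statements used around one Data Flow task.
# All object names are hypothetical -- the real statements differ.
def build_task_sql(table: str) -> dict:
    return {
        "truncate":    f"TRUNCATE TABLE stg.{table};",
        "pre_update":  f"UPDATE stg.LoadControl SET StartedAt = GETDATE() "
                       f"WHERE TableName = '{table}';",
        "post_update": f"UPDATE dbo.{table} SET LoadedAt = GETDATE();",
        "merge":       f"EXEC dbo.Merge_{table};",
    }

tables = ["Customers", "Orders", "Products"]  # 27 names in practice
sql_per_table = {t: build_task_sql(t) for t in tables}
```

    Each generated string could then be assigned to the corresponding Execute SQL task via an expression, keeping the per-table boilerplate in one place.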

  • One of the reasons you would not want to loop over a list of tables is that a loop serializes the extract and load process. Using separate data flows in a single package allows SSIS to execute multiple data flows at the same time (how many depends on available resources).

    Using separate packages, each with multiple data flows (grouped, for example, so that dependent tables reside in the same package), and then using a master package to control when each child package runs, allows multiple extracts/loads to run at the same time (again, up to available resources).

    I personally prefer the master/child method, as it allows for scheduling each distinct set of processes as needed. If one of the child packages fails, you only need to worry about rerunning that one package. If you have everything in a single package and one part of it fails, it becomes much more difficult to rerun just that portion.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Well, technically you *could* do this with SSIS, but the only ways I can think of involve a lot of hacky workarounds, like doing it all from code in a Script Task. But if you're going to do that, why bother using SSIS in the first place?

    If you don't want to create the whole thing manually, you might try using SSMS once to do the copy via the Import/Export wizard, and within the wizard, tick the boxes to save it all as a new SSIS package; then import that package into your SSIS project. SSMS will build all the individual transfers for you, in a data flow task, so you don't have to do them all yourself. I'm not sure whether the end result would be acceptable, but you might try it and take a look at the package it creates. Clearly, internally, SSMS knows how to build a package dynamically, but it's not a "loop": it just creates a data flow task that contains all the individual data flows for the tables you select in the wizard. So it's not exactly what you want, but it might save you the drudgery of creating all the flows yourself if you have lots of tables and it's just a straight copy (or simple mapping).

