DTS Transformation Task and Execute SQL Task in one package

  • Hi,

    I want to do the following, but I'm not sure whether it is possible in a single DTS package.

    Basically I need to export several anomalies (like cost < 0, price < 0, etc.) to an Excel sheet, and then I need to update the database.

    First I want to perform a Data Transformation task from SQL Server to an Excel sheet, which will report one anomaly. After that I need to update the database using an Execute SQL task.

    Then again I want to perform a Data Transformation task from SQL Server to the Excel sheet, which will report another anomaly. After that I need to update the database using an Execute SQL task.

    The figure would be like this...

    SQL Server ---> Excel ---> Execute SQL Task ---> SQL Server ---> Excel ---> Execute SQL Task ---> SQL Server ---> Excel ---> Execute SQL Task

    So my question now is: can I do all this in a single DTS package?

    Thanks,

    Sridhar!!

  • You can certainly use both tasks in a single package. Do you want to loop back and execute the same tasks again, or will each Transform Data Task and Execute SQL Task be unique, i.e. use different source tables and/or different Excel files?

    Greg

  • Hi Greg,

    Thanks for the reply. Each Data Transform Task and Execute SQL Task is unique. Each will export the anomalies of a table into the Excel sheet, and after that the Execute SQL Task will update the table. Please let me know how to do it.

    Thanks in Advance,

    Sridhar!!

  • Hi Sridhar,

    Create a package in DTS Designer. Add Connection objects for each source and destination to the workspace. Add a Transform Data Task between each pair of source and destination Connections. Add an Execute SQL Task for each table and connect it to its transformation with Workflow (use On Success or On Completion).
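    In case it helps to see the same wiring as a script instead of pictures, here is a rough sketch using the DTS object model from VBScript. Every name in it (server, database, workbook, sheet, table, query) is made up for illustration, and the column-level transformations for the data pump are left out because they are much easier to set up in the Designer:

    ' Rough sketch only - all names, queries and paths below are hypothetical.
    Option Explicit
    Dim oPkg, oConnSql, oConnXls, oTask, oPump, oSqlTask, oSql
    Dim oStepExport, oStepUpdate, oPC

    Set oPkg = CreateObject("DTS.Package")
    oPkg.Name = "ExportAnomalies"

    ' SQL Server connection (ID = 1)
    Set oConnSql = oPkg.Connections.New("SQLOLEDB")
    oConnSql.ID = 1
    oConnSql.Name = "SqlConn"
    oConnSql.DataSource = "(local)"
    oConnSql.Catalog = "MyDatabase"
    oConnSql.UseTrustedConnection = True
    oPkg.Connections.Add oConnSql

    ' Excel connection (ID = 2)
    Set oConnXls = oPkg.Connections.New("Microsoft.Jet.OLEDB.4.0")
    oConnXls.ID = 2
    oConnXls.Name = "ExcelConn"
    oConnXls.DataSource = "C:\Anomalies.xls"
    oConnXls.ConnectionProperties("Extended Properties").Value = "Excel 8.0"
    oPkg.Connections.Add oConnXls

    ' Transform Data Task: export one anomaly to a sheet
    Set oTask = oPkg.Tasks.New("DTSDataPumpTask")
    Set oPump = oTask.CustomTask
    oPump.Name = "ExportCostAnomaly"
    oPump.SourceConnectionID = 1
    oPump.SourceSQLStatement = "SELECT * FROM dbo.Products WHERE Cost < 0"
    oPump.DestinationConnectionID = 2
    oPump.DestinationObjectName = "CostAnomaly$"
    ' Column transformations still have to be added before this task will run.
    oPkg.Tasks.Add oTask

    ' Execute SQL Task: fix up the table after the export
    Set oSqlTask = oPkg.Tasks.New("DTSExecuteSQLTask")
    Set oSql = oSqlTask.CustomTask
    oSql.Name = "FixCostAnomaly"
    oSql.ConnectionID = 1
    oSql.SQLStatement = "UPDATE dbo.Products SET Cost = 0 WHERE Cost < 0"
    oPkg.Tasks.Add oSqlTask

    ' One step per task
    Set oStepExport = oPkg.Steps.New
    oStepExport.Name = "Step_ExportCostAnomaly"
    oStepExport.TaskName = "ExportCostAnomaly"
    oPkg.Steps.Add oStepExport

    Set oStepUpdate = oPkg.Steps.New
    oStepUpdate.Name = "Step_FixCostAnomaly"
    oStepUpdate.TaskName = "FixCostAnomaly"
    oPkg.Steps.Add oStepUpdate

    ' Workflow: run the Execute SQL Task only after the export succeeds (On Success)
    Set oPC = oStepUpdate.PrecedenceConstraints.New("Step_ExportCostAnomaly")
    oPC.PrecedenceBasis = 1   ' DTSStepPrecedenceBasis_ExecResult
    oPC.Value = 0             ' DTSStepExecResult_Success
    oStepUpdate.PrecedenceConstraints.Add oPC

    ' Save so the package can be opened and finished in DTS Designer
    oPkg.SaveToStorageFile "C:\ExportAnomalies.dts"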

    Look for examples at http://www.sqldts.com

    Good luck,

    Greg

  • Hi Greg,

    Thanks for the reply. The problem is they should not be executed in parallel; they should be executed in series, i.e. after the first Data Transformation and Execute SQL task are done, the second one should start. I hope you understand what I mean.

    Thanks,

    Sridhar!!

  • If I am understanding what you mean...

    It is possible to use workflow to activate one task 'on completion', 'on success', etc. of a previous task.

    This is done by selecting the start task, Ctrl-clicking the end task, then right-clicking, choosing Workflow, and picking the appropriate action.

  • Sridhar,

    I think rossc and I are trying to describe the same thing.  I wish I could copy an image into the message forum.  Let's try this:

    Transform Task1 --S--> SQL Task1 --C--> Transform Task2 --S--> SQL Task2

    --S--> = On Success

    --C--> = On Completion

    SQL Task1 won't start until Transform Task1 completes successfully, and Transform Task2 won't start until SQL Task1 completes.
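    If you ever end up wiring this in code instead of the Designer, the two kinds of arrows map onto precedence constraints on the package steps. A hedged sketch, continuing the oPkg object from the script in my earlier post, with made-up step names:

    ' Sketch only - oPkg is the package object from before; step names are hypothetical.
    Dim oOnSuccess, oOnCompletion

    ' "On Success" arrow: Transform Task1 --S--> SQL Task1
    Set oOnSuccess = oPkg.Steps("Step_SqlTask1").PrecedenceConstraints.New("Step_TransformTask1")
    oOnSuccess.PrecedenceBasis = 1   ' DTSStepPrecedenceBasis_ExecResult
    oOnSuccess.Value = 0             ' DTSStepExecResult_Success
    oPkg.Steps("Step_SqlTask1").PrecedenceConstraints.Add oOnSuccess

    ' "On Completion" arrow: SQL Task1 --C--> Transform Task2
    Set oOnCompletion = oPkg.Steps("Step_TransformTask2").PrecedenceConstraints.New("Step_SqlTask1")
    oOnCompletion.PrecedenceBasis = 0   ' DTSStepPrecedenceBasis_ExecStatus
    oOnCompletion.Value = 4             ' DTSStepExecStatus_Completed
    oPkg.Steps("Step_TransformTask2").PrecedenceConstraints.Add oOnCompletion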

    Hope this helps,

    Greg

  • Hi,

    I will try that. Thanks to both of you for the help.

    Thanks,

    Sridhar!!

  • Hi,

    I have one more question related to this... I need to do several DTS Transformation tasks and Execute SQL Tasks.

    The order would be like this...

    SqlConn ---> ExcelConn ---> ExecuteSqlTask

    SqlConn1 ---> ExcelConn1 ---> ExecuteSqlTask ---> ExecuteSqlTask ---> ExecuteSqlTask

    SqlConn2 ---> ExcelConn2 ---> ExecuteSqlTask ---> ExecuteSqlTask ---> ExecuteSqlTask

    I have many of these. My question is: do I need to have more than one SqlConn to achieve this? Is there a way to reuse a single SQL connection? And if I do it like the above, will all the SQL connections be open? That would be bad performance-wise.

    Thanks,

    Sridhar!!

  • You can use the same connections since you want your tasks to execute serially. If you had tasks that you wanted to execute in parallel, you would use separate connections.

    To reuse the connections, have your workflow arrows loop back around from the SQL Task to the SQL connection. 

    FYI, you can copy connections on the workspace by right-clicking the connection icon, clicking Copy, then right-clicking in open space and clicking Paste.

    Greg

  • Hi Greg,

    Thanks for the reply. The problem is the Excel connection would be different; that is, the Excel file name would be the same, but the sheet name and the columns in the sheet would be different. So I think I cannot use the loop-back option, or can I still use it? I'm drawing my sequence again here:

    SqlConn ---> ExcelConn ---> ExecuteSql ---> SqlConn (same as before) ---> ExcelConn (which will use a different stored procedure and a different data sheet name) ---> ExecuteSql?

    Will it affect performance if I include more SQL connections, and is there a way to close them? Thanks for all the help.

    Thanks,

    Sridhar!!

  • Oh yeah. I forgot that each export is going to a different Excel destination (same workbook, different sheets). In that case, the easiest thing to do is create copies of the connections as I described before.

    As for performance, I believe that a connection is only open for the duration of the task that is using it rather than for the duration of the package.  In other words, your SQL and Excel connections are open only while each Transform Data Task is executing. 
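    If it helps, here is what that reuse looks like in the same scripted sketch as before: the second export points at the same SQL connection ID, while the copied Excel connection gets its own ID even though it targets the same workbook. All names, IDs and queries here are made up again:

    ' Sketch only, continuing the oPkg object - names, IDs and queries are hypothetical.
    Dim oConnXls2, oTask2, oPump2

    ' Copy of the Excel connection: same workbook, new connection ID
    Set oConnXls2 = oPkg.Connections.New("Microsoft.Jet.OLEDB.4.0")
    oConnXls2.ID = 3
    oConnXls2.Name = "ExcelConn2"
    oConnXls2.DataSource = "C:\Anomalies.xls"
    oConnXls2.ConnectionProperties("Extended Properties").Value = "Excel 8.0"
    oPkg.Connections.Add oConnXls2

    ' Second export reuses SQL connection ID 1 but writes to a different sheet
    Set oTask2 = oPkg.Tasks.New("DTSDataPumpTask")
    Set oPump2 = oTask2.CustomTask
    oPump2.Name = "ExportPriceAnomaly"
    oPump2.SourceConnectionID = 1
    oPump2.SourceSQLStatement = "SELECT * FROM dbo.Products WHERE Price < 0"
    oPump2.DestinationConnectionID = 3
    oPump2.DestinationObjectName = "PriceAnomaly$"
    oPkg.Tasks.Add oTask2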

    Greg

  • Thanks Greg. I have created the DTS package now, but I haven't tested it. I need to test it once the new data comes. I will ask you if any problems come up.

    Thanks Again,

    Sridhar!!
