March 26, 2009 at 2:54 pm
Hello all,
I'm moving our databases from SQL Server 2000 over to SQL Server 2005. Currently, I'm recreating DTS packages as SSIS packages, and one of the very first packages (probably the easiest to migrate) takes 4 minutes to run, whereas the DTS version takes only 4 seconds... quite a markup!
The package is quite simple.
1) Execute SQL Task: truncate table.
2) Data Flow Task: OLE DB Source insert to OLE DB Destination.
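For reference, here's roughly what those two steps do in plain T-SQL (the table and column names are made up for illustration; in SSIS the second step actually runs through the OLE DB Source/Destination pipeline rather than a single statement):

```sql
-- Step 1: clear the destination table (minimally logged, fast)
TRUNCATE TABLE dbo.DestTable;

-- Step 2: the net effect of the data flow -- copy rows from
-- the source table into the freshly truncated destination
INSERT INTO dbo.DestTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable;
```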
Also, I'm getting a little popup in the lower right-hand corner of the screen telling me VS 2005 (waiting for an internal process blah, blah, blah) is busy quite often. Do I have a crappy install?
I am quite obviously new to this... any help is greatly appreciated!!!
Thanks,
M
March 26, 2009 at 3:07 pm
It sounds like you are running the SSIS package on your desktop. If so, that is going to be part of the problem.
In the OLEDB Source - are you referencing the table/view directly? If so, there is a known issue with that setup and you should change it to use a SQL Command and select from the table (e.g. SELECT * FROM dbo.MyTable).
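As a side benefit, the SQL Command lets you pull only the columns the data flow actually needs, which keeps the pipeline buffers smaller (the column names below are just placeholders, not anything from your package):

```sql
-- Instead of SELECT * (which drags every column into the pipeline),
-- list only the columns the destination actually uses:
SELECT CustomerID, OrderDate, Amount
FROM dbo.MyTable;
```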
And finally, what options do you have set in the OLEDB Destination? There are some options there that can affect performance also. Basically, you are using a BULK INSERT task and, depending upon the number of rows - could be filling the transaction log with one large transaction.
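If you want to see whether a load is bloating the log, you can check log usage before and after the package runs. This is a standard DBCC command, nothing package-specific:

```sql
-- Reports the log size and percent of log space used for each database
DBCC SQLPERF(LOGSPACE);
```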
And last - how many rows are we talking about?
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
March 26, 2009 at 4:08 pm
Jeffrey Williams (3/26/2009)
In the OLEDB Source - are you referencing the table\view directly? If so, there is a known issue with that setup and you should change it to use a SQL Command and select from the table (e.g. SELECT * FROM dbo.MyTable).
Nailed it. Using SQL Command, BIDS doesn't hang. Thanks!
Jeffrey Williams (3/26/2009)
And finally, what options do you have set in the OLEDB Destination? There are some options there that can affect performance also. Basically, you are using a BULK INSERT task and, depending upon the number of rows - could be filling the transaction log with one large transaction.
And last - how many rows are we talking about?
The package runs much quicker now - but still takes 1:15 to run what only takes 4 seconds in DTS.
OLEDB Destination settings:
Table lock checked
Check constraints checked
(Keep identity and Keep nulls unchecked)
Data access mode: Table or view - fast load.
Name of the table or the view: the table name.
Rows per batch: blank
Maximum insert commit size: 0
This particular insert only deals with 800 rows. I'm hoping to use the same setup for much larger inserts, though, in the 15,000-20,000 row range. Do you see any problem with that?
Thanks again for your help!
M
March 26, 2009 at 4:17 pm
I don't see any issues with the setup for this small set. When you get to loading millions of rows you are going to want to modify the batch and commit options.
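For example (these numbers are only a starting point to experiment with, not a rule), on a multi-million-row load you might set the destination to commit in chunks, so the log can clear between batches instead of holding one giant transaction:

```
Rows per batch:              10000
Maximum insert commit size:  10000
```

With "Maximum insert commit size" left at 0, the entire load is committed as a single transaction.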
I think you are looking at the difference between running on your desktop and running on the server. When you are working on your project in BIDS - and testing it, it runs on your desktop with your desktop resources.
One of the other things to realize is that it does take some time for the process to actually start. There are validation steps being performed on startup that will take some time. On something this small, you probably will see a drop in performance - but, when you start working with a lot larger sets of data you are going to be pleasantly surprised.
Jeffrey Williams
March 26, 2009 at 4:42 pm
Jeffrey Williams (3/26/2009)
I don't see any issues with the setup for this small set. When you get to loading millions of rows you are going to want to modify the batch and commit options.
I think you are looking at the difference between running on your desktop and running on the server. When you are working on your project in BIDS - and testing it, it runs on your desktop with your desktop resources.
One of the other things to realize is that it does take some time for the process to actually start. There are validation steps being performed on startup that will take some time. On something this small, you probably will see a drop in performance - but, when you start working with a lot larger sets of data you are going to be pleasantly surprised.
I've been running it in SQL Server Management Studio.
I figure you're right about the validation steps - they seem to be the source of the extra time. Strange that it flies right through the truncate statement - it doesn't slow down until the data flow steps. Probably pretty normal, eh?
Thanks again for such quick feedback.