January 12, 2007 at 10:15 am
Is it possible? If so, how?
Here's the situation. I have approx. seventy packages that all do the same thing with a different source file and a different destination table. For various reasons, it will be a LOT easier (and save us weeks of work per year) if I can make it one package that loops seventy times, changing the source text file and the destination table with each iteration.
I've figured out how to do both of those things in ActiveX/VBScript, and they're changing nicely. Unfortunately, when I hit the data pump task itself, I get the error "The data pump task requires transformations to be specified."
Now, in every case, it's going to be a straight copy, col001 in the text file to col001 in the table, col002 in the text file to... etc. Is there any bit of code I can throw in to tell the package to just do a straight copy with whatever number of columns happen to be in the table?
Thanks.
January 12, 2007 at 10:29 am
If it's just a simple text file import, do you even need DTS?
Have you tried T-SQL BULK INSERT? Or BCP from the command line?
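For example, a minimal BULK INSERT sketch -- the table name, file path, and field terminator here are placeholders for whatever your files actually use:

BULK INSERT dbo.Staging001
FROM 'C:\Import\file001.txt'
WITH (FIELDTERMINATOR = ',')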
January 12, 2007 at 10:36 am
Take a look at "Looping, Importing and Archiving" with DTS at http://www.sqldts.com/246.aspx
SQL = Scarcely Qualifies as a Language
January 12, 2007 at 11:35 am
>Take a look at "Looping, Importing and Archiving" with DTS at http://www.sqldts.com/246.aspx<
It's the package I'm using as a template for the loop, but each file gets loaded into the same table. I've tried using "Changing the Data Pump Source and Destination Tables" (http://www.sqldts.com/213.aspx) as the solution to this, but it specifies that each change must have the same transformations: "The structure of the source and destination can obviously be different but the old source table must be the same structure as the new one, and the same applies for the destination table."
In my case, I have seventy different destination tables with seventy different schemas, but I can give the package whatever assurance is necessary that the source schema and destination schema will be identical.
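If it helps, here's the kind of thing I'm hoping will work, assuming the standard DTS object model behaves the way I think it does (the task name and transform name below are placeholders). The theory is that a single blank copy transform, with no columns specified, makes the data pump pair source and destination columns by ordinal position:

Function Main()
    Dim oPkg, oPump, oTransform, i

    Set oPkg = DTSGlobalVariables.Parent
    Set oPump = oPkg.Tasks("DTSTask_DTSDataPumpTask_1").CustomTask

    ' Drop whatever transformations are left over from the previous iteration
    For i = oPump.Transformations.Count To 1 Step -1
        oPump.Transformations.Remove i
    Next

    ' One copy transform with no columns specified; the data pump should
    ' then map col001 -> col001, col002 -> col002, and so on by position
    Set oTransform = oPump.Transformations.New("DTS.DataPumpTransformCopy")
    oTransform.Name = "CopyAllColumns"
    oPump.Transformations.Add oTransform

    Main = DTSTaskExecResult_Success
End Function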
January 12, 2007 at 11:36 am
>If it's just simple text file import, do you even need DTS ?<
Step one is import. There's some validation and checking before the staging table's data gets inserted into the production table.
Though you do have me wondering if I can just use BCP in this step instead of a data pump. Off to look into that, thanks...
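(Guessing at the command for my own notes -- server, database, and path below are placeholders; -c is character mode, -t and -r set the field and row terminators, -T is a trusted connection:)

bcp MyDB.dbo.Staging001 in C:\Import\file001.txt -c -t, -r\r -S MYSERVER -T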
January 12, 2007 at 11:51 am
Typically I don't use a data pump unless I absolutely have to. It's good for some types of data movement, but if it's just a simple field-to-field import with no transforms or lookups, why use a sledgehammer to drive in a half-inch nail?
January 12, 2007 at 11:53 am
[smacks self in head]
Now I remember why I didn't use BULK INSERT... non-standard row terminator. I'm still trying to figure out how to type ROWTERMINATOR = 'CR' (or 'LF', I can't remember which offhand) into a BULK INSERT statement and get it to run without a syntax error in Query Analyzer...
January 12, 2007 at 12:05 pm
ROWTERMINATOR = '\r' - carriage return only
ROWTERMINATOR = '\n' - CR/LF pair
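So the full statement ought to run in Query Analyzer as something like this (table, path, and field terminator being placeholders):

BULK INSERT dbo.Staging001
FROM 'C:\Import\file001.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\r'  -- carriage return only, per the above
)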