December 16, 2004 at 8:39 am
I am extremely new to SQL, and I am trying to load data into a SQL database. I want to make the upload process through DTS much smarter.
The problem is that my source files (.txt) are huge, around 700 MB, and sometimes the DTS package fails, especially in the last step when it tries to insert data into the final table.
The situation is this: I am loading data from the text files into a work table, say ABC. From ABC I am selecting the data and inserting it into table XYZ, which is my final table. I am using an SQL task for that insert.
Now, due to discrepancies in the data files, certain steps might fail. The first steps will mostly not fail because I am not doing any transformations there, but the last step might fail (it is failing now) for various reasons. Is it possible to log the errors, remove each problematic record (row) from the work table into an exception table, and still carry on with the SQL task? In other words, while my last step (the SQL task) is running, it should not stop on a problematic record: the inserting of records into XYZ should continue, and every problematic record, with or without its error message, should end up in some kind of exception table. Later on, I can check the exception table, fix the errors, and put the records back into my final table XYZ.
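For reference, the kind of exception table I have in mind would look something like this (the name and columns here are just made up to illustrate the idea):

CREATE TABLE ABC_Exceptions (
    Col001 varchar(50),        -- raw values copied straight from the work table
    Col002 varchar(50),
    ErrorMessage varchar(255), -- why the row was rejected
    LoadDate datetime DEFAULT GETDATE()
)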
Maybe it is very simple for some of you guys, but I am pretty new to SQL. Could you help me out with this?
As an example, here is part of my SQL task (the FROM clause shown is just a placeholder to show the shape of the statement):
INSERT INTO XYZ
SELECT [ABC].[Col001], cast([ABC].[Col002] AS datetime), [ABC].[Col003], cast([ABC].[Col004] AS datetime),
cast([ABC].[Col005] AS datetime), [ABC].[Col006], cast([ABC].[Col007] AS money), [ABC].[Col008], [ABC].[Col009],
[ABC].[Col010], [ABC].[Col011], [ABC].[Col012], cast([ABC].[Col015] AS money), [PQR].[sendersref], [PQR].[benref],
[PQR].[docref],
CASE WHEN [PQR].[startdate] = '00000000' OR [PQR].[startdate] = '00010101' THEN NULL ELSE cast([PQR].[startdate] AS datetime) END,
CASE WHEN [PQR].[enddate] = '00000000' OR [PQR].[enddate] = '00010101' THEN NULL ELSE cast([PQR].[enddate] AS datetime) END,
cast([PQR].[rate] AS float)
FROM ABC INNER JOIN PQR ON [ABC].[Col001] = [PQR].[sendersref] -- placeholder join condition
Regards,
Shoaib
December 16, 2004 at 9:50 am
Hi Shoaib,
You might try using a Transform Data Task instead of an SQL Task. You can use the same SELECT statement on the Source tab. Then on the Options tab, in the File Type box, uncheck "7.0 format" and check "Error text", "Source error rows", and "Dest error rows". That will write the exception rows to files. In the Data Movement box, set the Max error count to a value greater than zero so the task will continue even when it encounters exceptions in the source data.
If you're not doing anything else with the ABC work table, you could eliminate it and just use the Transform Data Task between your source text file and your destination table.
Greg
December 21, 2004 at 1:25 am
Hi Greg,
Thanks for the reply.
But I cannot use a Transform Data Task in the final step, since I use the same work table for multiple final tables. Moreover, I need the same DTS package to append to the final table depending on certain criteria. That is why I am using an SQL task instead of a Transform Data Task.
So in other words, I want the same functionality as the Transform Data Task inside my SQL task: write the exception rows to a file or table and keep going. Right now, if there is any error in the SQL task, nothing at all is written to the final table; everything just rolls back. That is the situation I want to avoid.
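The only workaround I have thought of so far is to pre-screen the work table before the INSERT runs, something along these lines (the ISDATE check and the ABC_Exceptions table are only a rough sketch, not my actual code):

-- First move the rows whose date column will not cast into the exception table
INSERT INTO ABC_Exceptions (Col001, Col002, ErrorMessage)
SELECT Col001, Col002, 'Col002 is not a valid datetime'
FROM ABC
WHERE ISDATE(Col002) = 0

-- Then insert only the rows that pass the check, so the cast cannot fail
INSERT INTO XYZ (Col001, Col002)
SELECT Col001, CAST(Col002 AS datetime)
FROM ABC
WHERE ISDATE(Col002) = 1

But that means writing a check for every column I cast. Is there a cleaner way to keep the SQL task running past the bad rows?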
Could you help me out please?
Regards,
Shoaib