June 19, 2006 at 12:56 am
Hi,
I've got a source file with "NULL" as the value in columns that are mapped as DT_DBTIMESTAMP in the connection manager,
e.g.
1,NULL,1993-07-01 00:00:00.000,1993-07-08 00:00:00.000,NULL,NULL
I was hoping that I could use a Derived Column component, but my SSIS project won't get past the Flat File Source in the data flow because it can't resolve the NULLs.
If I manually search and replace the 'NULL' values with a space, then everything is OK. However, there are close to 2 million records, so I'm not that keen on going down this path.
Thanks.
June 19, 2006 at 1:25 am
Looks like setting the error output for the column to 'Ignore failure' does the trick. Not so elegant, though.
Any better ways?
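One alternative to ignoring the conversion failures is to pre-process the file before the data flow runs. A minimal sketch in Python (hypothetical file names; it assumes the file is comma-delimited and that the literal token NULL should simply become an empty field, as in the sample row above):

```python
import csv

def blank_nulls(src_path, dst_path):
    """Stream the file row by row, replacing literal 'NULL' fields
    with empty strings so DT_DBTIMESTAMP columns convert cleanly."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            writer.writerow("" if field == "NULL" else field for field in row)
```

Because it streams one row at a time, 2 million records are no problem; the Flat File Source then reads the cleaned copy.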
September 13, 2010 at 5:25 am
Hi,
Did you find a solution for this requirement? I'm facing the same problem, and I also found a way to resolve it using a Derived Column.
But there's another problem: if the trailing fields of a row are not separated by any delimiter, how do you identify which column a particular value belongs to?
Eg:
EmpID Name Age City
1,Sabari,27,Chennai
2,Mahesh
3,Shanker,25
If the source file contains the above records, how can we handle this?
Can you please help me if you know a solution for this?
Regards,
Sabarinathan
September 13, 2010 at 5:38 am
sabarichandru (9/13/2010)
Hi, did you find a solution for this requirement? I'm facing the same problem, and I also found a way to resolve it using a Derived Column.
But there's another problem: if the trailing fields of a row are not separated by any delimiter, how do you identify which column a particular value belongs to?
Eg:
EmpID Name Age City
1,Sabari,27,Chennai
2,Mahesh
3,Shanker,25
If the source file contains the above records, how can we handle this?
Can you please help me if you know a solution for this?
Regards,
Sabarinathan
This is not the same problem. Your source data should (by convention) look like this:
EmpID Name Age City
1,Sabari,27,Chennai
2,Mahesh,,
3,Shanker,25,
All I had to do to make it valid was add three commas.
What would happen, I wonder, if a record had a city but not an age? Would it be this:
1,Sabari,Chennai
or (correctly) this
1,Sabari,,Chennai
Can you get whoever or whatever creates your source file to correct the format? If you can, your problems go away.
If not, you have a few options:
1) Read each line in as a single field and then split it as part of your package (you may be able to achieve this with derived columns - if not, a script component will do it easily)
2) Pre-process the file to add in the missing commas and then feed the processed file into a standard data flow task.
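Option 2 can be sketched in a few lines of Python (hypothetical file names; it assumes exactly four columns and that the missing fields are always the trailing ones, as in the sample above):

```python
EXPECTED_FIELDS = 4  # EmpID, Name, Age, City

def pad_missing_commas(src_path, dst_path, n_fields=EXPECTED_FIELDS):
    """Append commas so every row has the same number of fields,
    making the file safe for a standard Flat File Source."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            line = line.rstrip("\n")
            # A row with k commas has k+1 fields; pad the difference.
            missing = n_fields - (line.count(",") + 1)
            dst.write(line + "," * max(missing, 0) + "\n")
```

Note this only works if the "what if?" question above never arises, i.e. fields are only ever missing from the end of a row.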
If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
September 13, 2010 at 5:55 am
Phil Parkin (9/13/2010)
sabarichandru (9/13/2010)
Hi, did you find a solution for this requirement? I'm facing the same problem, and I also found a way to resolve it using a Derived Column.
But there's another problem: if the trailing fields of a row are not separated by any delimiter, how do you identify which column a particular value belongs to?
Eg:
EmpID Name Age City
1,Sabari,27,Chennai
2,Mahesh
3,Shanker,25
If the source file contains the above records, how can we handle this?
Can you please help me if you know a solution for this?
Regards,
Sabarinathan
This is not the same problem. Your source data should (by convention) look like this:
EmpID Name Age City
1,Sabari,27,Chennai
2,Mahesh,,
3,Shanker,25,
All I had to do to make it valid was add three commas.
What would happen, I wonder, if a record had a city but not an age? Would it be this:
1,Sabari,Chennai
or (correctly) this
1,Sabari,,Chennai
Can you get whoever or whatever creates your source file to correct the format? If you can, your problems go away.
If not, you have a few options:
1) Read each line in as a single field and then split it as part of your package (you may be able to achieve this with derived columns - if not, a script component will do it easily)
2) Pre-process the file to add in the missing commas and then feed the processed file into a standard data flow task.
A conventional Script Task/Component (transformation or pre-process) would do it...
Raunak J
September 13, 2010 at 6:51 am
Hi,
This source file comes from a PeopleSoft database, so you can't correct the format; the client will send the CSV as the source.
You gave some suggestions on how to resolve the issue using Derived Columns or a Script Component.
I'm basically a BI developer and I don't know .NET. Can you help me achieve this with both approaches, using my data as an example?
By using the delimiter I handled the NULL values, but I've been trying since this morning to handle the rows without delimiters and still can't achieve it.
Please help me.
Regards,
Sabarinathan
September 13, 2010 at 7:00 am
sabarichandru (9/13/2010)
Hi, this source file comes from a PeopleSoft database, so you can't correct the format; the client will send the CSV as the source.
You gave some suggestions on how to resolve the issue using Derived Columns or a Script Component.
I'm basically a BI developer and I don't know .NET. Can you help me achieve this with both approaches, using my data as an example?
By using the delimiter I handled the NULL values, but I've been trying since this morning to handle the rows without delimiters and still can't achieve it.
Please help me.
Regards,
Sabarinathan
Sabari,
Trash in is trash out...
If your CSV file itself is missing data, it is the client who is at fault...
You simply cannot build a warehouse on assumptions.
Raunak J
September 13, 2010 at 7:22 am
From the source, NULL values will come in some columns; they just need to be replaced with either 0 or 1. We already discussed that. It's the rows without delimiters that I'm struggling with.
Please just help me with how to resolve the case where we get a source file with NULL values and missing delimiters,
because the client wants a solution for this kind of scenario.
We already discussed this with the client, but they are not agreeing, so I need to go ahead with this option.
So please, please help me resolve the issue.
Regards,
Sabarinathan
September 13, 2010 at 7:28 am
Sabari, as you said you have close to 2 million records, I suggest you stick with the Derived Column transformation with CRLF as the row delimiter.
Raunak J
September 13, 2010 at 7:30 am
You never responded to my 'what if?' question regarding what would happen if a record had a city but not an age. Assuming that the answer is that there would be only one delimiter, there is no solution other than to point out to the data provider how unreasonable they are being.
If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
September 13, 2010 at 7:35 am
Raunak Jhawar (9/13/2010)
sabari as you said you have close to 2 million records. I suggest ...
That was Sheckster.
If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
September 13, 2010 at 7:37 am
Phil Parkin (9/13/2010)
Raunak Jhawar (9/13/2010)
sabari as you said you have close to 2 million records. I suggest ...
That was Sheckster.
My mistake...Sorry:-D:-D:-D
Raunak J
September 13, 2010 at 8:54 am
Sheckster (6/19/2006)
looks like setting the error output for the column to 'ignore failure' does the trick. not so elegant though.
any better ways?
I would redirect the errors instead of ignoring them. Redirecting your records should allow the source to upload all the data. Then, if you want to fix these, set up logic in the redirected path to fix the NULL values using a Derived Column transform. After that, you can merge the fixed records back into the default path.
You may be able to fix these records, but that may not be the correct thing to do. Only someone with a good understanding of the business process will know that.
For best practices on asking questions, please read the following article: Forum Etiquette: How to post data/code on a forum to get the best help