
Exporting and Versioning Lookup Data: A Real-World Use of the CLR
Posted Monday, June 23, 2008 11:17 PM


Old Hand

Comments posted to this topic are about the item Exporting and Versioning Lookup Data: A Real-World Use of the CLR




SQL# - http://www.SQLsharp.com/
Post #522329
Posted Wednesday, June 25, 2008 7:17 AM
SSCertifiable

Nice and cool one....


Post #523282
Posted Wednesday, June 25, 2008 11:46 AM
Grasshopper

You've read my mind in terms of what I was looking for. Pointing out the rabbit trails, and the reasons for the final result, was a nice touch, too.

My situation may be a little different, but maybe you can provide some insight. There is a plan to import all (with exceptions) user tables from a Progress database (part of a commercial application) into SQL Server 2005. There are over 500 tables, and when upgrades come out there could be added tables, deleted tables, and structurally changed tables.

Ideally, I'd like to (a rough sketch follows this list):
cycle through all the tables from the Progress database,
exclude the ones in the Exception table,
compare the structure to the table definition in SQL Server 2005,
note if the table is new,
note if the table is no longer in Progress,
note any differences in table structure, and
import the data if there are no differences.
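
Something like the following is roughly what I have in mind. This is just a sketch: dbo.ProgressColumns and dbo.TableExceptions are hypothetical staging and exception tables (the Progress metadata would have to be pulled into the staging table first, e.g. via an ODBC driver), and the real comparison would also need to map Progress data types to SQL Server types.

-- Sketch only: dbo.ProgressColumns and dbo.TableExceptions are made-up names.
-- Tables present in Progress but not yet in SQL Server (new tables).
SELECT DISTINCT pc.TABLE_NAME
FROM   dbo.ProgressColumns pc
WHERE  pc.TABLE_NAME NOT IN (SELECT TABLE_NAME FROM dbo.TableExceptions)
  AND  pc.TABLE_NAME NOT IN (SELECT TABLE_NAME
                             FROM   INFORMATION_SCHEMA.TABLES
                             WHERE  TABLE_TYPE = 'BASE TABLE');

-- Tables still in SQL Server but no longer in Progress (deleted tables).
SELECT t.TABLE_NAME
FROM   INFORMATION_SCHEMA.TABLES t
WHERE  t.TABLE_TYPE = 'BASE TABLE'
  AND  t.TABLE_NAME NOT IN (SELECT TABLE_NAME FROM dbo.TableExceptions)
  AND  t.TABLE_NAME NOT IN (SELECT DISTINCT TABLE_NAME FROM dbo.ProgressColumns);

-- Columns that exist on only one side, or whose data types differ.
SELECT COALESCE(pc.TABLE_NAME,  c.TABLE_NAME)  AS TABLE_NAME,
       COALESCE(pc.COLUMN_NAME, c.COLUMN_NAME) AS COLUMN_NAME,
       pc.DATA_TYPE AS ProgressType,
       c.DATA_TYPE  AS SqlServerType
FROM   dbo.ProgressColumns pc
FULL OUTER JOIN INFORMATION_SCHEMA.COLUMNS c
       ON  c.TABLE_NAME  = pc.TABLE_NAME
       AND c.COLUMN_NAME = pc.COLUMN_NAME
WHERE  pc.COLUMN_NAME IS NULL
   OR  c.COLUMN_NAME  IS NULL
   OR  pc.DATA_TYPE <> c.DATA_TYPE;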

Your article seems to point in the direction of SQL#, something I've unfortunately found too little information about. Please send me your thoughts on the above issue.

TIA
Andre
Post #523593
Posted Wednesday, June 25, 2008 1:42 PM


Right there with Babe

Neat article. Thanks for posting it.

---
Timothy A Wiseman
SQL Blog: http://timothyawiseman.wordpress.com/
Post #523682
Posted Wednesday, June 25, 2008 3:10 PM


SSC-Dedicated

Very good article... you did a heck of a good job on it. :)

Just a couple of observations...

You did all of this just because BCP won't export column headers... I believe I'd have fought that particular requirement... and won. ;) Even if I lost, it's still a lot easier to gen the headers from the Information_Schema (or whatever) and use the COPY command to prepend them to the data file created by BCP than to go through all of what you did to make it happen.
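
For example, something roughly like this would do it. Just a sketch: the table and file names are made up, it assumes xp_cmdshell is available, and the variable-concatenation trick for building the header line is the usual shortcut rather than a guaranteed-ordered construct.

DECLARE @Headers VARCHAR(8000),
        @Cmd     VARCHAR(8000);

-- Build a single tab-delimited header line from the metadata.
SELECT @Headers = COALESCE(@Headers + CHAR(9), '') + COLUMN_NAME
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_SCHEMA = 'dbo'
  AND  TABLE_NAME   = 'MyLookupTable'
ORDER BY ORDINAL_POSITION;

-- Write the header line to its own file...
SET @Cmd = 'echo ' + @Headers + '> C:\Temp\Header.txt';
EXEC master..xp_cmdshell @Cmd;

-- ...then prepend it to the file BCP already created.
EXEC master..xp_cmdshell
     'copy /b C:\Temp\Header.txt + C:\Temp\MyLookupTable.dat C:\Temp\MyLookupTable_WithHeaders.dat';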

Also, you don't need a text qualifier if you have a Tab delimited file and it shouldn't have been as much of a concern as the requirements (and you) made it... after all, you're not importing to a spreadsheet.



--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #523747
Posted Wednesday, June 25, 2008 4:27 PM


Old Hand

andreq1 (6/25/2008)
My situation may be a little different, but maybe you can provide some insight. There is a plan to import all (with exceptions) user tables from a Progress database (that's part of a commercial application) into SQL Server 2005. There are over 500 tables, and when upgrades come out, there could be added tables, deleted tables, structurally changed tables.


Hello Andre. I am not sure what your options are. Have you done a Google search? I just looked up "Progress DB" since I am unfamiliar with it and found this:

http://techrepublic.com.com/5208-6230-0.html?forumID=101&threadID=210931&start=0

so other people are trying to do the same thing as you. However, the solution in my article, and SQL#, are only for getting data out of SQL Server 2005+. There was mention of an ODBC driver in that post on TechRepublic, so maybe there is a third-party app, or maybe a complex SSIS package could work? Sorry I cannot be of more help.





SQL# - http://www.SQLsharp.com/
Post #523773
Posted Wednesday, June 25, 2008 5:04 PM


Old Hand

Jeff Moden (6/25/2008)
Very good article... you did a heck of a good job on it. :)

Just a couple of observations...

You did all of this just because BCP won't export column headers... I believe I'd have fought that particular requirement... and won. ;) Even if I lost, it's still a lot easier to gen the headers from the Information_Schema (or whatever) and use the COPY command to prepend them to the data file created by BCP than to go through all of what you did to make it happen.

Also, you don't need a text qualifier if you have a Tab delimited file and it shouldn't have been as much of a concern as the requirements (and you) made it... after all, you're not importing to a spreadsheet.



Thanks. For the column headers, yes, they were required, and I did not want to do a DOS COPY to concatenate the column headers onto the data file (which I have seen done before). That is even more shelling out to DOS than just calling BCP (or it means a .cmd file to do the several steps), and there are many hundreds of tables to deal with here. Yes, it did take a little longer to accomplish, but I still prefer this approach, which I feel is more elegant.

As far as the text qualifier goes, I did a test and at first it appeared to work. We have lots of embedded tabs here, so I created a two-column table (INT, NTEXT), since that mirrors a lot of what we have (not all of it has moved over to 2005 yet). One row had no tabs, and the other rows had various numbers of tabs. I am not sure how the import distinguished between column delimiters and column data, but it did seem to figure it out.

However, I did a BCP out and was not able to re-import that file without getting a Unicode data error. So I made sure to BCP the data out into a Unicode file (the -w option), and it still did not work when re-importing into the table the data came from. If I imported the data into a table that the Import Data wizard created, even with an NTEXT column, it worked just fine; and the data that I exported from my CLR BulkExport proc imported just fine into an existing table. Then I added a VARCHAR(50) column to the table, put data without tabs in one row and various numbers of tabs in the others, and now the Import Data wizard got it completely wrong and could not figure out which were the real delimiters.

The problem we have is that just about any character, including foreign-language characters, shows up in our data. We have fields that capture HTML and/or JavaScript, etc., so tabs, commas, and quotes are all part of the data to begin with. Hence my need to a) have a text qualifier, and b) find one that would not naturally be in our data. Now, if there is something I am not doing correctly in terms of using BCP (and native format is not acceptable), then I would certainly like to know.
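
For reference, the round trip I tried looked roughly like this (illustrative only: the table, server, and file names are made up, and the commands are wrapped in xp_cmdshell here just for convenience). BCP's default field terminator is a tab, which is exactly what collides with the tabs embedded in the data.

-- Export as Unicode (-w) using a trusted connection (-T).
EXEC master..xp_cmdshell
     'bcp MyDB.dbo.TabTest out C:\Temp\TabTest.txt -w -T -S MyServer';

-- Re-import into the same table; this is the step that gave the
-- Unicode data error described above.
EXEC master..xp_cmdshell
     'bcp MyDB.dbo.TabTest in C:\Temp\TabTest.txt -w -T -S MyServer';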






SQL# - http://www.SQLsharp.com/
Post #523787
Posted Wednesday, June 25, 2008 5:30 PM


SSC-Dedicated

We have lots of embedded tabs


That would certainly be a fly in the ointment. Thanks for the feedback.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #523792
Posted Thursday, June 26, 2008 9:19 AM


Old Hand

Jeff Moden (6/25/2008)
We have lots of embedded tabs


That would certainly be a fly in the ointment.


Yeah, it really did constrain me, and I truly did try to find a more "natural" way of dealing with this, but I wasn't able to find anything that met all of my needs. I do understand about the column headers, but there was no way I could have left them out, and given the text-qualification issue it just seemed best to find a comprehensive solution. If I didn't need the text qualifier, then I would likely have done what you suggested to get the column headers, as I agree that was a lot of effort to go through for that alone ;). And again, I am not a BCP expert by any means; I tried what I could, and if there is a feature that gets me closer to the goal then I would certainly want to know for future reference.





SQL# - http://www.SQLsharp.com/
Post #524296
Posted Friday, June 27, 2008 10:13 AM
Ten Centuries

Good article on a good use of the CLR. Nice to include code samples. This line from the article, though, gave me a bit of a chuckle:
"We have quite a bit of static data here at ChannelAdvisor (.com) and with the amount of development being done it can change quite frequently."

TroyK



Post #525155