Exporting and Versioning Lookup Data: A Real-World Use of the CLR

  • Comments posted to this topic are about the item Exporting and Versioning Lookup Data: A Real-World Use of the CLR

    SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
    Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
    Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
    Info sites: Collations     •     Module Signing     •     SQLCLR

  • Nice and cool one....

  • You've read my mind in terms of what I was looking for. Pointing out the rabbit trails, and the reasons for the final result, was a nice touch, too.

    My situation may be a little different, but maybe you can provide some insight. There is a plan to import all (with exceptions) user tables from a Progress database (that's part of a commercial application) into SQL Server 2005. There are over 500 tables, and when upgrades come out, there could be added tables, deleted tables, structurally changed tables.

    Ideally, I'd like to:

    1. cycle through all the tables from the Progress database,
    2. exclude the ones in the Exception table,
    3. compare the structure to the table definition in SQL Server 2005,
    4. note if the table is new,
    5. note if the table is no longer in Progress,
    6. note any differences in table structure, and
    7. import the data if there are no differences.
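    The comparison steps above can be sketched roughly as follows. This is a hypothetical illustration, not part of SQL# or the article's code: it assumes the column metadata for each side has already been collected into `{table: {column: type}}` dictionaries (e.g. from the Progress data dictionary and SQL Server's INFORMATION_SCHEMA.COLUMNS), and the function name `diff_schemas` is invented for the example.

```python
# Hypothetical sketch: classify tables by comparing column metadata
# gathered from each side into {table: {column: type}} dicts.
def diff_schemas(progress, sqlserver, exceptions=frozenset()):
    # Step 1-2: all source tables, minus the ones in the Exception table.
    source = {t: c for t, c in progress.items() if t not in exceptions}
    # Step 4: tables that exist in Progress but not yet in SQL Server.
    new = sorted(set(source) - set(sqlserver))
    # Step 5: tables that exist in SQL Server but are gone from Progress.
    dropped = sorted(set(sqlserver) - set(source))
    # Step 6: tables present on both sides whose column definitions differ.
    changed = sorted(t for t in set(source) & set(sqlserver)
                     if source[t] != sqlserver[t])
    # Step 7: tables whose structures match and are safe to import.
    importable = sorted(t for t in set(source) & set(sqlserver)
                        if source[t] == sqlserver[t])
    return {"new": new, "dropped": dropped,
            "changed": changed, "importable": importable}
```

    The real work, of course, is in building those metadata dictionaries from each system; the set arithmetic afterward is the easy part.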

    Your article seems to point in the direction of SQL#, something I've unfortunately found too little of. Please send me your thoughts on the above issue.

    TIA

    Andre

  • Neat article. Thanks for posting it.

    ---
    Timothy A Wiseman
    SQL Blog: http://timothyawiseman.wordpress.com/

  • Very good article... you did a heck of a good job on it. 🙂

    Just a couple of observations...

    You did all of this just because BCP won't export column headers... I believe I'd have fought that particular requirement... and won. 😉 Even if I lost, it's still a lot easier to gen the headers from the Information_Schema (or whatever) and use the COPY command to prepend them to the data file created by BCP than to go through all of what you did to make it happen.

    Also, you don't need a text qualifier if you have a Tab delimited file and it shouldn't have been as much of a concern as the requirements (and you) made it... after all, you're not importing to a spreadsheet.
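    The header-prepend approach described above can be sketched like this. It is only an illustration of the idea: the function name and file paths are invented, and Python stands in for the DOS `COPY headers.txt+data.txt combined.txt` step, with the column names assumed to have already been pulled from INFORMATION_SCHEMA.COLUMNS.

```python
# Sketch of prepending a generated header row to a BCP data file,
# the equivalent of: COPY headers.txt+data.txt combined.txt
def prepend_headers(columns, data_path, out_path, delimiter="\t"):
    with open(out_path, "w", encoding="utf-8") as out:
        # Write the header row first (column names joined by the delimiter).
        out.write(delimiter.join(columns) + "\n")
        # Then append the BCP-exported data file unchanged.
        with open(data_path, "r", encoding="utf-8") as data:
            for line in data:
                out.write(line)
```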

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • andreq1 (6/25/2008)


    My situation may be a little different, but maybe you can provide some insight. There is a plan to import all (with exceptions) user tables from a Progress database (that's part of a commercial application) into SQL Server 2005. There are over 500 tables, and when upgrades come out, there could be added tables, deleted tables, structurally changed tables.

    Hello Andre. I am not sure what your options are. Have you done a Google search? I just looked up "Progress DB" since I am unfamiliar with it and found this:

    http://techrepublic.com.com/5208-6230-0.html?forumID=101&threadID=210931&start=0

    So other people are trying to do the same thing as you. However, the solution in my article, and SQL#, are only for getting info out of SQL Server 2005+. There was mention of an ODBC driver in that post on TechRepublic, so maybe there is a 3rd-party app, or maybe a complex SSIS package could work? Sorry I cannot be of more help.


  • Jeff Moden (6/25/2008)


    Very good article... you did a heck of a good job on it. 🙂

    Just a couple of observations...

    You did all of this just because BCP won't export column headers... I believe I'd have fought that particular requirement... and won. 😉 Even if I lost, it's still a lot easier to gen the headers from the Information_Schema (or whatever) and use the COPY command to prepend them to the data file created by BCP than to go through all of what you did to make it happen.

    Also, you don't need a text qualifier if you have a Tab delimited file and it shouldn't have been as much of a concern as the requirements (and you) made it... after all, you're not importing to a spreadsheet.

    Thanks. For the column headers, yes, they were required, and I did not want to do a DOS COPY to concatenate column headers (which I have seen done before). That is even more shelling out to DOS than just calling BCP (now I would need a .cmd file to do several steps), and there are many hundreds of tables to deal with here. Yes, it did take a little longer to accomplish, but I still prefer this approach, which I feel is more elegant.

    As far as the text qualifier goes, I did a test and at first it appeared to work. We have lots of embedded tabs here, so I created a two-column table (INT, NTEXT), since that mirrors a lot of what we have (not all of it has moved over to 2005 yet). In one row I had no tabs, and in the other rows I had various amounts of tabs. I am not sure how it distinguished between column delimiters and column data, but it did seem to figure it out.

    However, I did a BCP out and was not able to re-import that file without getting a Unicode data error. So I made sure to BCP the data out into a Unicode file (-w option), and it still did not work when trying to re-import into the table the data came from. If I imported the data into a table that the Data Import wizard created, even with an NTEXT column, it worked just fine. On the other hand, the data that I exported from my CLR BulkExport proc imported just fine into an existing table.

    THEN, I added a VARCHAR(50) column to the table, and on one row added data without tabs and on the other rows added various amounts of tabs. Now, when using the Import Data wizard, the result was completely wrong; it could not figure out which were the real delimiters.

    The problem we have is that most characters, even foreign-language characters, appear in our data. We have fields that capture HTML and/or JavaScript, etc.; tabs, commas, and quotes are all part of the data to begin with. Hence my need to a) have a text qualifier, and b) find one that would not naturally be in our data. Now, if there is something I am not doing correctly in terms of using BCP (and native format is not acceptable), then I would certainly like to know.
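    The ambiguity described above can be demonstrated in a few lines. This is just an illustration using Python's csv module as a stand-in text qualifier, not anything BCP itself does:

```python
import csv, io

# A two-field row whose second field contains a literal tab.
row = ["42", "line with\tan embedded tab"]

# Without a text qualifier, the embedded tab is indistinguishable
# from a field delimiter: the row parses as three fields, not two.
unqualified = "\t".join(row)
assert len(unqualified.split("\t")) == 3

# With a qualifier (here, csv's quoting), the row round-trips intact.
buf = io.StringIO()
csv.writer(buf, delimiter="\t", quoting=csv.QUOTE_ALL).writerow(row)
parsed = next(csv.reader(io.StringIO(buf.getvalue()), delimiter="\t"))
assert parsed == row
```

    And of course, once the data can also contain quote characters, the qualifier itself has to be something that never occurs naturally in the data, which is exactly the constraint described above.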


  • We have lots of embedded tabs

    That would certainly be a fly in the ointment. Thanks for the feedback.

    --Jeff Moden



  • Jeff Moden (6/25/2008)


    We have lots of embedded tabs

    That would certainly be a fly in the ointment.

    Yeah, it really did constrain me, and I truly tried to find a more "natural" way of dealing with this, but wasn't able to find anything that met all of my needs. And I do understand about the column headers, but there was no way I could have left them out, and given the text-qualification issue it just seemed best to find a comprehensive solution. If I didn't need the text qualifier, then I would likely have done what you suggested to get the column headers, as I do agree that was a lot of effort to go through for that alone ;). And again, I am not a BCP expert by any means; I tried what I could, and if there is a feature that gets me closer to the goal then I would certainly want to know for future reference.


  • Good article on a good use of the CLR. Nice to include code samples. This line from the article, though, gave me a bit of a chuckle:

    "We have quite a bit of static data here at ChannelAdvisor (.com) and with the amount of development being done it can change quite frequently."

    TroyK

  • cs_troyk (6/27/2008)


    Good article on a good use of the CLR. Nice to include code samples. This line from the article, though, gave me a bit of a chuckle:

    "We have quite a bit of static data here at ChannelAdvisor (.com) and with the amount of development being done it can change quite frequently."

    TroyK

    Thanks. And yes, it is amusing. It is supposed to be LookUp data, and it is considered static because the application doesn't update it. I am not sure why we call it "static" outside of that fact, but that is what we call it here (and 98% - 99% of it really is static). I prefer to call it LookUp, but that removes the source of amusement when considering how often some of the "static" data changes ;).

