• Jeff Moden (10/17/2010)


    I'm not a GUI/App programmer so I'll have to find another way to test what you propose, Solomon. Shouldn't be too difficult, though.

    If I were an App programmer, I'd write code to make whatever log file you speak of more easily fit the idea of a simple BCP import. Normalization of data isn't reserved to just tables in a database nor any less important outside a database.

    Hey there. I do have some sample code in the article I noted above, so you can do a simple test and then expand on it if you like. I did not test the XML route, but someone else did and posted their findings in the discussion forum.

    As far as normalizing the log file, that is not always in your control. For example, where I currently work we take web server log files, parse various parts of each row using RegEx, get current DB info, and then INSERT or UPDATE various tables. This process was designed when we were on SQL Server 2000; with the new streaming method in 2008, it would be possible to do this in a more set-based manner. But the point is that we only have so much control over the format of the log file.

    Also, doing BCP still requires a staging or queue table in the DB to do the set-based work on after the loading is done. The streaming method I have been mentioning bypasses the staging table, and it even bypasses the need to wait until the entire dataset is loaded into the table: it processes each row as it comes in, but in a way that appears to be set-based. At that point, the BCP route takes more time and requires additional schema. So even if one could control the format of the input file, you would still be stuck with a slower and more complicated system.
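    To illustrate the "process each row as it comes in" idea outside of SQLCLR specifics, here is a minimal sketch in Python. The log format, regex pattern, and function names are all my own illustrative assumptions (a simplified Apache-style line), not the actual format or code used at any particular site; the point is only that a lazy generator lets a downstream bulk-load or TVP-style consumer pull parsed rows one at a time with no staging buffer for the whole file.

    ```python
    import re
    from typing import Iterator

    # Hypothetical pattern for a simplified Apache-style log line;
    # a real site's log format will differ.
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
    )

    def parse_log_stream(lines) -> Iterator[dict]:
        """Lazily parse log rows one at a time. The consumer (e.g. a
        streaming bulk-insert API) pulls rows as they are produced, so
        neither a staging table nor the whole file in memory is needed."""
        for line in lines:
            m = LOG_PATTERN.match(line)
            if m:
                row = m.groupdict()
                row["status"] = int(row["status"])
                row["bytes"] = 0 if row["bytes"] == "-" else int(row["bytes"])
                yield row  # hand off this row immediately; do not accumulate

    sample = [
        '10.0.0.1 - - [17/Oct/2010:12:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
        'garbage line that does not match',
        '10.0.0.2 - - [17/Oct/2010:12:00:01 +0000] "POST /login HTTP/1.1" 302 -',
    ]

    rows = list(parse_log_stream(sample))
    print(rows)  # two parsed rows; the malformed line is skipped
    ```

    In the SQLCLR/ADO.NET case the same shape applies: the row source is enumerated lazily and each row is handed to the database as it is produced, which is what lets the streaming method skip both the staging table and the load-then-process wait.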

    SQL# https://SQLsharp.com/ ( SQLCLR library of over 340 Functions and Procedures )
    Sql Quantum Lift https://SqlQuantumLift.com/ ( company )
    Sql Quantum Leap https://SqlQuantumLeap.com/ ( blog )
    Info sites: Collations     •     Module Signing     •     SQLCLR