XML Workshop XII - Parsing a delimited string


jacob sebastian
Comments posted to this topic are about the item XML Workshop XII - Parsing a delimited string
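For reference, the technique in the article boils down to converting the delimited string into an XML fragment and shredding it with nodes(). A minimal sketch, assuming a comma delimiter and SQL Server 2005 or later (the element name "v" is illustrative, not the article's exact markup):

DECLARE @list VARCHAR(8000), @x XML;
SET @list = 'red,green,blue';
-- Wrap each value in <v>...</v> by replacing the delimiters, then cast.
SET @x = CAST('<v>' + REPLACE(@list, ',', '</v><v>') + '</v>' AS XML);

-- Shred the fragment back into rows.
SELECT t.v.value('.', 'VARCHAR(100)') AS Item
FROM @x.nodes('/v') AS t(v);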

Phil Factor
I really enjoy these XML workshops.

Of course, the parsing of delimited data is an important issue for data feeds. If data can be converted into XML first, using AWK or GREP or whatever, it then becomes much easier to gulp it into SQL Server. The biggest questions I've had are:

Performance. Whenever I've used this XML technology for data feeds, or for passing data between routines, it has been very fast, but others using a very similar system have reported it as being very slow. I'm not at all sure why the difference.

Resilience. Maybe I've been unlucky, but I've had many feeds to deal with that occasionally spit out data that crashes the simplest, and most obvious, data-import systems. I reckon that the best systems can isolate corrupt or incorrect data before the bulk insert and allow the DBA to inspect it.

Any thoughts?


Best wishes,

Phil Factor
Simple Talk
jacob sebastian
I agree with you on the first point. Wherever I used the XML approach in our applications, it just worked well. However, I have read some posts where people complained about performance problems. I am not sure whether it is because they are using it incorrectly, or something else. Or it could be the volume of data; I am not sure.

On the next point, do you think a schema could help?
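For context, server-side validation would look something like this: a minimal sketch of an XML schema collection (the collection name, element names and types are illustrative, not from any real feed):

CREATE XML SCHEMA COLLECTION FeedSchema AS N'
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="row">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="id" type="xs:int"/>
        <xs:element name="amount" type="xs:decimal"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>';

-- A typed variable rejects any document that does not conform.
DECLARE @feed XML(FeedSchema);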


jacob

Phil Factor
A typical problem I've had to face in the past might be that I've got a million or so rows of data from a switch that have to be imported. If they are not imported, then the business runs the risk of leaving a fraud or intrusion undetected. Right in the middle of the million rows is a record or two that is mangled. Rejecting the file isn't an option. The import routine needs to be able to import all the good stuff and leave the bad stuff in a 'limbo' file for manual intervention. Cleaning the data manually before import isn't a good idea either, as such things are usually scheduled for the early hours of the morning when the server isn't so busy.

Could a schema solve this sort of problem by filtering 'sheep-from-goats' on a record-by-record basis, rather than a document basis?

I'd love to know what causes slow XML processing but, like you, I'll have to wait until it happens to me! I ran some timings a while back with the various parameter-passing techniques and found XML to be as fast as the 'helper-table'/'number table' technique, which is far faster than any iterative technique.
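For comparison, the 'number table' technique looks roughly like this: a sketch assuming a pre-built dbo.Numbers table of sequential integers starting at 1 (the table and column names are illustrative):

DECLARE @list VARCHAR(8000);
-- Pad both ends with the delimiter so every value sits between two commas.
SET @list = ',' + 'red,green,blue' + ',';

-- Each N that lands on a comma marks the start of the next value.
SELECT SUBSTRING(@list, n.N + 1,
                 CHARINDEX(',', @list, n.N + 1) - n.N - 1) AS Item
FROM dbo.Numbers n
WHERE n.N < LEN(@list)
  AND SUBSTRING(@list, n.N, 1) = ',';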


Best wishes,

Phil Factor
Simple Talk
jacob sebastian
I understand the problem now. A schema will validate an entire document; I do not think a schema can be used to filter out the "bad" records and process the "good" ones. I guess the only option available is to query the XML data and retrieve the "good" records (and dump the "bad" ones to a table or XML file for manual review).
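A sketch of that record-by-record triage with untyped XML, flagging rows whose values don't convert. The attribute names are made up, and ISNUMERIC has well-known quirks (it accepts currency signs and scientific notation), so treat this as illustrative:

DECLARE @feed XML;
SET @feed = N'<row id="1" amount="10.50"/><row id="2" amount="oops"/><row id="3" amount="7.25"/>';

SELECT r.x.value('@id', 'VARCHAR(20)') AS id,
       r.x.value('@amount', 'VARCHAR(20)') AS amount,
       CASE WHEN ISNUMERIC(r.x.value('@amount', 'VARCHAR(20)')) = 1
            THEN 'good' ELSE 'review' END AS disposition
FROM @feed.nodes('/row') AS r(x);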

regards
Jacob

Jeff Moden
Quoting Phil Factor: "Right in the middle of the million rows is a record or two that is mangled. Rejecting the file isn't an option. The import routine needs to be able to import all the good stuff and leave the bad stuff in a 'limbo' file for manual intervention."


BCP will do just that very nicely... second in speed only to Bulk Insert, which does not have such a capability.
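For the record, the relevant BCP switches are -e (write rejected rows to an error file for later inspection) and -m (raise the tolerated error count so a couple of mangled records don't abort the load). The server, database, table and file names below are hypothetical:

bcp MyDb.dbo.SwitchFeed in feed.txt -c -t"," -e rejects.err -m 100 -T -S MYSERVER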

--Jeff Moden

RBAR is pronounced ree-bar and is a Modenism for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
If you think it's expensive to hire a professional to do the job, wait until you hire an amateur. -- Red Adair

Helpful Links:
How to post code problems
How to post performance problems
Forum FAQs
LP-181697
This method is good, but it adds 6 additional characters around each real data value. That means your string may suddenly not fit into the allocated number of characters, so you have to be careful.
jacob sebastian
I don't think this approach is good for large chunks of data. It is handy when you have a small delimited string and you want to break it into a relational table quickly.

Jack Corbett
I thought this was a very interesting article, and it certainly presents a new way to handle delimited strings without looping. For new applications I would just have the application pass XML as the parameter, but this is certainly a good way to handle existing applications and SSRS 2005 multi-select parameters, as in the sketch below.
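For instance, wrapping the split in an inline table-valued function makes it easy to consume an SSRS multi-select parameter that has been joined into a single comma-separated string in the report. The function name and column sizes are illustrative:

CREATE FUNCTION dbo.SplitCsv (@list VARCHAR(8000))
RETURNS TABLE
AS RETURN
(
    SELECT t.v.value('.', 'VARCHAR(100)') AS Item
    FROM (SELECT CAST('<v>' + REPLACE(@list, ',', '</v><v>') + '</v>' AS XML) AS x) AS s
    CROSS APPLY s.x.nodes('/v') AS t(v)
);

-- e.g. WHERE Region IN (SELECT Item FROM dbo.SplitCsv(@Regions))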



Jack Corbett

Applications Developer

Don't let the good be the enemy of the best. -- Paul Fleming
At best you can say that one job may be more secure than another, but total job security is an illusion. -- Rod at work

Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
How to Post Performance Problems
Crosstabs and Pivots or How to turn rows into columns Part 1
Crosstabs and Pivots or How to turn rows into columns Part 2
M Gnat
I agree it's an interesting "concept", but in the real world, where I have to process MILLIONS of records in a routine, I don't see it working...