• Can anyone help me build a FLWOR-based query that produces the same result as the pure XPath one?

    I don't know if that is possible. I thought I saw an article out there where someone replaced the nodes() method (which you are using in your first query) with a FLWOR statement, but I can't find it. I gave this a shot but failed :laugh:

    The best I could do for you is this solution:

    WITH xml_txt(txt) AS
    (
        SELECT CAST(myXmlData.query('
            for $drop in /order/Product/Drops/Drop[@Number=1]/Area/@Name
            return string($drop)
        ') AS varchar(100))
        FROM #data d
    )
    SELECT item
    FROM xml_txt
    CROSS APPLY dbo.DelimitedSplit8K(txt, ' ') s;

    This uses Jeff Moden's splitter function (referenced in my signature). I tested this with 100,000 XML records in #data, however, and the FLWOR/splitter solution runs for about 30 seconds, whereas your original query runs in 8 seconds. Your query, when processing thousands of records, produces a parallel query plan; my solution did not, so I updated it to include Adam Machanic's make_parallel() function, which cut the query time by about a third but was still slower than the original solution you posted.
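    In case anyone wants to reproduce the timing test, here is a minimal sketch of the sample data I assumed. The table and column names (#data, myXmlData) come from the queries above, but the XML content itself is only an illustrative guess at the structure implied by the XPath:

    ```sql
    -- Hypothetical test data; the XML shape is inferred from the path
    -- /order/Product/Drops/Drop[@Number=1]/Area/@Name used above.
    IF OBJECT_ID('tempdb..#data') IS NOT NULL DROP TABLE #data;
    CREATE TABLE #data (myXmlData xml NOT NULL);

    INSERT INTO #data (myXmlData)
    VALUES ('<order>
              <Product>
                <Drops>
                  <Drop Number="1">
                    <Area Name="North" />
                    <Area Name="South" />
                  </Drop>
                  <Drop Number="2">
                    <Area Name="East" />
                  </Drop>
                </Drops>
              </Product>
            </order>');
    ```

    To get to 100,000 rows for the timing comparison, just repeat the INSERT in a loop or cross join a tally table against it.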

    The final solution looked like this:

    WITH xml_txt(txt) AS
    (
        SELECT CAST(myXmlData.query('
            for $drop in /order/Product/Drops/Drop[@Number=1]/Area/@Name
            return string($drop)
        ') AS varchar(100))
        FROM #data d
    )
    SELECT item
    FROM make_parallel()
    CROSS APPLY xml_txt
    CROSS APPLY dbo.DelimitedSplit8K(txt, ' ') s;

    All that said, you could clean your query up to look like this:

    select data.mynodes.value('.', 'varchar(50)') Name
    from #data
    cross apply myXMLData.nodes('(/order/Product/Drops/Drop[@Number="1"]/Area/@Name)') data(mynodes);
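    Incidentally, if you still want an actual FLWOR expression in there, .nodes() will accept one directly. A sketch (untested against your data) that returns the Area elements from the FLWOR and reads the attribute afterwards:

    ```sql
    -- FLWOR expression inside .nodes(); shreds one row per matching Area element
    select data.mynodes.value('@Name', 'varchar(50)') AS Name
    from #data
    cross apply myXMLData.nodes('
        for $a in /order/Product/Drops/Drop[@Number=1]/Area
        return $a') data(mynodes);
    ```

    It is still .nodes() doing the shredding, so it does not really replace the method the way the question asks, but the path logic itself is expressed as FLWOR.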

    "I can't stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001