Is this possible in T-SQL?

  • I have a stock table with a FIFO queue column which is colon delimited. To complicate things, the colon separates the values within a record as well as the FIFO records themselves.

    For example:

    Part [FIFO QUE]

    xxx 1:2.0:2:2.5

    This means part xxx has two FIFO records:

    Record 1: Qty 1, Price 2.0

    Record 2: Qty 2, Price 2.5

    What I am trying to achieve is splitting that out into a table like this:

    PART SEQ QTY PRICE

    xxx 1 1 2.0

    xxx 2 2 2.5

    What I have so far is

    SELECT Part, [FIFO Queue], LEN([FIFO Queue]) - LEN(REPLACE([FIFO Queue], ':', '')) AS frec FROM [Parts PO]

    This gives me the number of colons in the string.

    Then if I do:

    SELECT
        Part,
        CASE frec
            WHEN 0  THEN 0
            WHEN 1  THEN 1
            WHEN 3  THEN 2
            WHEN 5  THEN 3
            WHEN 7  THEN 4
            WHEN 9  THEN 5
            WHEN 11 THEN 6
            WHEN 13 THEN 7
            WHEN 15 THEN 8
            WHEN 17 THEN 9
        END AS fiforecords,
        [FIFO Queue]
    FROM (SELECT Part, [FIFO Queue], LEN([FIFO Queue]) - LEN(REPLACE([FIFO Queue], ':', '')) AS frec FROM [Parts PO]) AS t

    This gives me the number of FIFO records, but I'm not happy using a CASE statement. Any ideas on how to change it? (An arithmetic alternative is sketched at the end of this post.)

    The third step would be splitting out the data and the records as shown above. The pattern is that odd-numbered colons are data separators and even-numbered colons are record separators, but I have no idea how to do this in T-SQL.
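
    (A possible arithmetic alternative to the CASE above; a rough, untested sketch that assumes the same [Parts PO] table and column names. A string with N colons holds N + 1 delimited values, and each FIFO record uses two of them, so integer division gives the record count directly.)

    SELECT  Part,
            [FIFO Queue],
            (LEN([FIFO Queue]) - LEN(REPLACE([FIFO Queue], ':', '')) + 1) / 2 AS fiforecords -- integer division
    FROM    [Parts PO];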

  • Yes, it is very possible to split out the individual values from that delimited column. See Jeff's excellent string splitter at http://www.sqlservercentral.com/articles/Tally+Table/72993/. If you aren't familiar with it yet, this will usher in a new way of looking at strings for you. Enjoy.

  • As mentioned, the splitter will split your column into separate values and will put one value in each row. You'll get your quantity on odd rows and your price on even rows. You'll just need to use a cross tab/pivot (with the MAX function) and generate the groups (which can be done with a simple formula; a quick sketch follows at the end of this post).

    Would you need help with that?

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
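
    (To illustrate the kind of "simple formula" meant above: once the splitter returns one value per row with an ItemNumber, integer division maps ItemNumbers 1 and 2 to FIFO record 1, ItemNumbers 3 and 4 to record 2, and so on. A rough, untested sketch over made-up item numbers:)

    SELECT  ItemNumber,
            (ItemNumber + 1) / 2 AS SEQ  -- integer division: 1,2 -> 1; 3,4 -> 2
    FROM    (VALUES (1), (2), (3), (4)) v (ItemNumber);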
  • Hi

    Thanks for the quick reply. I am going through the splitter info at the moment, reading up on tally tables (interesting).

    It does look like I have an extra complication, in that the splitter needs to split both the data and the records in the string, as the FIFO string is mixed. If you can give any input I would be grateful, but I may also find the answer once I've finished reading the suggested article, which is very good.

    Once again thank you for your quick replies.

    N.B. The FIFO part table has 15,000 records in it, with some parts having up to 17 FIFO records in the queue to be split out in order.

  • Am I right in thinking I can create two tally tables:

    TallyEven with IDENTITY(INT,0,2)

    TallyOdd with IDENTITY(INT,1,2)

    and cross join those results?

  • Take this for a spin. It seems to match your desired output...

    -- Test Data --
    IF OBJECT_ID('tempdb..#temp') IS NOT NULL
        DROP TABLE #temp;

    SELECT
        Part,
        FifoCode
    INTO #temp
    FROM
        (VALUES ('xxx', ' 1:2.0:2:2.5'), ('yyy', ' 10:20.0:20:20.5')) x (Part, FifoCode);

    -- View the original data --
    SELECT * FROM #temp t;

    -- The Solution --

    WITH Split AS (-- Uses a renamed version of Jeff Moden's Delimited 8k Splitter

    SELECT

    t.Part,

    sc.ItemNumber,

    sc.Item

    FROM

    #temp t

    CROSS APPLY dbo.SplitCSVToTable8K(t.FifoCode, ':') sc

    ), Quantity AS (-- Uses the mod value if the ItemNumber to divide the Split rows into "Quantity" rows and "Price" rows.

    SELECT

    s.Part,

    ROW_NUMBER() OVER (PARTITION BY s.Part ORDER BY s.ItemNumber) AS rn,

    CAST(s.Item AS INT) AS Qty

    FROM

    Split s

    WHERE

    s.ItemNumber % 2 = 1

    ), Price AS (-- the price section of the above comment...

    SELECT

    s.Part,

    ROW_NUMBER() OVER (PARTITION BY s.Part ORDER BY s.ItemNumber) AS rn,

    CAST(s.Item AS DECIMAL(19,4)) AS Price

    FROM

    Split s

    WHERE

    s.ItemNumber % 2 = 0

    )-- Bring it all back together to display the desired output

    SELECT

    q.Part,

    q.rn AS FIFO_Order,

    q.Qty,

    p.Price

    FROM

    Quantity q

    JOIN Price p

    ON q.Part = p.Part

    AND q.rn = p.rn

    Output...

    Part FIFO_Order Qty Price
    ---- ---------- --- --------
    xxx  1          1   2.0000
    xxx  2          2   2.5000
    yyy  1          10  20.0000
    yyy  2          20  20.5000

    Note: the splitter being referenced in the "Split" CTE can be found here... Tally OH! An Improved SQL 8K "CSV Splitter" Function

    HTH,

    Jason

  • Jason,

    I find your solution overcomplicated. This was my original approach.

    -- Using Jason's sample data
    SELECT Part,
           (ItemNumber + 1) / 2 AS SEQ,
           MAX(CASE WHEN split.ItemNumber % 2 = 1 THEN Item END) AS QTY,
           MAX(CASE WHEN split.ItemNumber % 2 = 0 THEN Item END) AS PRICE
    FROM #temp
    CROSS APPLY dbo.DelimitedSplit8K(FifoCode, ':') split
    GROUP BY Part,
             (ItemNumber + 1) / 2
    ORDER BY Part, SEQ;

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Luis Cazares (5/21/2015)

    Jason, what Luis did is a technique called a cross tab. It's a very useful technique, quite performant, and well worth the time to learn if you don't know it yet (a minimal generic sketch follows below).
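
    (For anyone new to the pattern: a cross tab uses an aggregate such as MAX over CASE expressions to turn rows into columns, producing one output row per group. A minimal, generic sketch over made-up rows, separate from the thread's actual solution:)

    SELECT  id,
            MAX(CASE WHEN attrib = 'Qty'   THEN val END) AS Qty,
            MAX(CASE WHEN attrib = 'Price' THEN val END) AS Price
    FROM    (VALUES (1, 'Qty', '1'), (1, 'Price', '2.0'),
                    (2, 'Qty', '2'), (2, 'Price', '2.5')) x (id, attrib, val)
    GROUP BY id;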

  • I agree... I definitely like Luis' solution better.

    Yes, I'm very familiar with how to do a cross tab query using aggregated CASE expressions... Unfortunately, that simply wasn't the first solution that came to mind...

    I have implemented Jason's solution and understood it after reading the article; thank you, Jason. I see Luis has improved on it, which I will give a try tomorrow and digest.

    Thank you guys for your help; you have really taught me loads.

    For the record, Jason's solution took 15,485 stock records and split them out into 38,626 rows in under a second. Something I did not think possible until today.

  • MikeTomkies (5/21/2015)

    Luis' solution should be even faster. If the estimated execution plan is an indicator (not always the case), his should be much faster.

    In either case, the "magic sauce" they both have in common is DelimitedSplit8K function...

  • Jason A. Long (5/21/2015)

    In either case, the "magic sauce" they both have in common is DelimitedSplit8K function...

    I find that function has the tendency to change the way people look at data and their expectations of performance after using it a few times. 😉

  • Ed Wagner (5/21/2015)

    I find that function has the tendency to change the way people look at data and their expectations of performance after using it a few times. 😉

    I think that's the polite way of saying, "It completely destroys other SQL based splitters"...

  • Jason A. Long (5/21/2015)

    I think that's the polite way of saying, "It completely destroys other SQL based splitters"...

    Yes it does. However, it does more than that. The techniques the article teaches and the approach to testing can be used to make many other things more efficient. It's also fun. 😉
