Concatenate Rows

  • I am trying to concatenate an instruction field across multiple rows. Once I have the result set I will insert it into another table. I want to do this as a set-based operation, but I am having a hard time getting "looping" out of my head.

    Here is sample data.

    --===== If the test table already exists, drop it
    IF OBJECT_ID('TempDB..#mytable','U') IS NOT NULL
        DROP TABLE #mytable

    --===== Create the test table
    CREATE TABLE #mytable
    (
        pms_id        INT,
        eq_id         VARCHAR(20),
        bl_id         VARCHAR(8),
        fl_id         VARCHAR(2),
        rm_id         VARCHAR(8),
        interval_type VARCHAR(4),
        instructions  VARCHAR(1000)
    )

    --===== Insert the test data into the test table
    INSERT INTO #mytable
        (pms_id, eq_id, bl_id, fl_id, rm_id, interval_type, instructions)
    SELECT 1, '346-BSC-022', '346', '01', NULL,  'mm', 'Do step 1'      UNION ALL
    SELECT 2, '346-BSC-022', '346', '01', NULL,  'yy', 'Do step 2'      UNION ALL
    SELECT 3, '346-BSC-021', '346', '01', NULL,  'wk', 'Do step 3'      UNION ALL
    SELECT 4, '346-ASQ-01',  '346', '01', '107', 'yy', 'Do something'   UNION ALL
    SELECT 5, NULL,          '346', '01', '187', 'mm', 'Clean Area'     UNION ALL
    SELECT 6, NULL,          '346', '01', '187', 'wk', 'Wipe sink Area' UNION ALL
    SELECT 7, NULL,          '346', '01', NULL,  'mm', 'Check Drains'   UNION ALL
    SELECT 8, NULL,          '346', '01', NULL,  'wk', 'Inspect door'

    The output would look something like:

    eq_id        bl_id  fl_id  rm_id  instructions
    346-BSC-022  346    01     NULL   Do step 1 & Do step 2
    346-BSC-021  346    01     NULL   Do step 3
    346-ASQ-01   346    01     107    Do something
    NULL         346    01     187    Clean Area & Wipe sink Area
    NULL         346    01     NULL   Check Drains & Inspect door

    I know I need to group by eq, bl, fl, and rm, but I can't figure out how to get a single field for the instructions containing all of the grouped values.

    What's the best approach to do something like this?

    Thanks,

    Craig

  • It works over your small data set; I'm not sure you can get rid of the DISTINCT with so many columns you are grouping by.

    Here is the solution:

    WITH RN AS
    (
        SELECT ROW_NUMBER() OVER (ORDER BY pms_id) RN1,
               ROW_NUMBER() OVER (PARTITION BY eq_id, bl_id, fl_id, rm_id ORDER BY pms_id) RN2,
               *
        FROM #mytable
    )
    SELECT DISTINCT RN1 - RN2 OrderColumn, eq_id, bl_id, fl_id, rm_id,
           REPLACE(STUFF((SELECT ', ' + instructions
                          FROM RN i
                          WHERE i.RN1 - i.RN2 = o.RN1 - o.RN2
                          FOR XML PATH('')), 1, 2, ''), ',', ' &') instructions
    FROM RN o
    ORDER BY RN1 - RN2

    If you do not care about the order, you can omit OrderColumn from the result set and remove the ORDER BY clause.

    EDIT: Thanks for the DDL and sample data in your first post.

    EDIT2: Explaining the code:

    The CTE gives us two row numbers whose difference stays constant within each group, so RN1 - RN2 serves as a grouping id.

    The outer SELECT picks out that grouping id plus the key columns, and uses FOR XML PATH('') to concatenate the instructions together with a comma separator. I use a comma because it stays the same through the conversion to XML; STUFF strips the leading separator, and finally REPLACE changes the commas to the requested " &".
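    If you do not need the pms_id-adjacency behaviour of the RN1 - RN2 trick (on the sample data the groups are already contiguous, so the results come out the same), the same FOR XML PATH concatenation can be done while grouping directly on the four key columns, which removes the DISTINCT. An untested sketch against the #mytable from the first post — the EXISTS/INTERSECT subquery is just a NULL-safe way to compare the keys, since eq_id and rm_id can be NULL:

    ```sql
    -- Sketch: group directly on the four key columns, NULL-safe.
    -- Assumes the #mytable DDL and data from the first post.
    SELECT o.eq_id, o.bl_id, o.fl_id, o.rm_id,
           REPLACE(STUFF((SELECT ', ' + i.instructions
                          FROM #mytable i
                          -- INTERSECT treats NULL = NULL as a match,
                          -- unlike a plain i.eq_id = o.eq_id comparison
                          WHERE EXISTS (SELECT i.eq_id, i.bl_id, i.fl_id, i.rm_id
                                        INTERSECT
                                        SELECT o.eq_id, o.bl_id, o.fl_id, o.rm_id)
                          ORDER BY i.pms_id
                          FOR XML PATH('')), 1, 2, ''), ',', ' &') AS instructions
    FROM #mytable o
    GROUP BY o.eq_id, o.bl_id, o.fl_id, o.rm_id;
    ```

    The comma-then-REPLACE step is kept for the same reason as above: an "&" emitted inside FOR XML PATH('') would be entitized to "&amp;", while a comma passes through unchanged. Note this (like the original) assumes the instructions themselves contain no commas.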


    For faster help in answering any problems, please read How to post data/code on a forum to get the best help - Jeff Moden for the best way to ask your question.

    For performance issues, see how we like them posted here: How to Post Performance Problems - Gail Shaw

    Need to split some strings? Jeff Moden's DelimitedSplit8K
    Jeff Moden's Cross tab and Pivots Part 1
    Jeff Moden's Cross tab and Pivots Part 2

  • Jeremy,

    Thanks for the reply. Sorry I am responding late but you know how it goes, one problem to the next and all that.

    Took me a little bit to wrap my brain around your solution, but it got me to look at "STUFF" and "FOR XML PATH", which are very useful. Which Holiday Inn Express did you stay at? I think I need to be there for a week 🙂

    This works quite well for the simple sample I set up, but as I started getting some real data from the user it has gotten a bit more complicated, and I started down a different path. I think I am going to create a function that does the work and returns a VARCHAR. I am leaning this way because I have a feeling the requirements for the "instructions" will be changing often, and if they do I just have to modify the function. I suspect I will take a performance hit, but this will be part of an "off hours" job.

    No problem on the sample data. You guys are doing me a favor; the least I can do is make it easier. Besides, I saw Jeff Moden's post about "question etiquette", and he made it so simple to put together that I would have felt guilty if I didn't.

    Thanks again,

    Craig

  • I'd be interested in seeing your final query and the function. Pretty sure you are thinking of a scalar function, but keep in mind that will give you a performance hit. I would need to see what you create to determine whether it could be modified into an inline table-valued function, which could improve the performance of your final query.

  • Hi Lynn,

    Thanks for the reply. Any guidance toward better solutions is always welcome. I have to say I am glad to have found this community, because the contributors (read: experts) are very supportive.

    I am trying to keep things as simple as possible, and the data loads (~1000 records once a week) are not that heavy. Unfortunately, it means I will make some sacrifices in scalability (performance) in order to get it done simply (quickly) with my knowledge of T-SQL.

    The function is pretty simple, but it seems to keep getting modified as more people see the output. That's expected, I guess.

    CREATE FUNCTION [dbo].[GetPmSteps]
    (
        -- Add the parameters for the function here
        @eq_id    char(12),
        @bl_id    char(8),
        @fl_id    char(4),
        @rm_id    char(8),
        @dateToDo datetime
    )
    RETURNS VARCHAR(MAX)
    AS
    BEGIN
        -- Declare the return variable here
        DECLARE @steps VARCHAR(MAX)

        SELECT @steps = COALESCE(@steps + CHAR(13) + CHAR(13), '')
                        + '====== ' + UPPER(pm_procedure.description) + ' ======'
                        + CHAR(13) + pm_procedure.instructions
        FROM pm_schedule
        INNER JOIN pm_procedure ON pm_schedule.pmp_id = pm_procedure.pmp_id
        WHERE (pm_schedule.date_next_todo <= @dateToDo) AND (pm_schedule.active = 1)
          AND (@eq_id IS NULL OR pm_schedule.eq_id = @eq_id)
          AND (@bl_id IS NULL OR pm_schedule.bl_id = @bl_id)
          AND (@fl_id IS NULL OR pm_schedule.fl_id = @fl_id)
          AND (@rm_id IS NULL OR pm_schedule.rm_id = @rm_id)
        ORDER BY pm_schedule.interval_type, pm_schedule.interval_freq DESC

        -- Return the result of the function
        RETURN @steps
    END

    I then just call a SELECT statement:

    SELECT eq_id, bl_id, fl_id, rm_id,
           dbo.GetPmSteps(eq_id, bl_id, fl_id, rm_id, GETDATE()) AS steps
    FROM pm_schedule
    WHERE (active = 1) AND (date_next_todo <= GETDATE())
    GROUP BY eq_id, bl_id, fl_id, rm_id

    The problem is that I never know how many separate procedures will show up for a given date until run time. The majority will be just one procedure, but there are cases where a monthly, quarterly, semi-annual, and yearly procedure could all show up in the same date range. The idea is to have one "ticket" that has all the steps.

    Thanks,

    Craig

  • Can you post some sample data that better reflects your setup? The scalar UDF can most likely be rewritten as an inline scalar UDF (an inline table-valued function that returns a table containing one record).

    I would also like to point you to http://www.sqlservercentral.com/articles/T-SQL/91724/, which has more information about the subject - a great article by Jeff Moden.
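    A rough, untested sketch of what that rewrite might look like, assuming the pm_schedule and pm_procedure columns from Craig's post (the function name GetPmSteps_Inline is illustrative, and the TYPE/.value() step is there to undo XML entitization of the CR characters and any special characters in the instructions):

    ```sql
    -- Inline TVF version of GetPmSteps: the whole body is a single SELECT,
    -- so the optimizer can expand it into the calling query.
    CREATE FUNCTION dbo.GetPmSteps_Inline
    (
        @eq_id    char(12),
        @bl_id    char(8),
        @fl_id    char(4),
        @rm_id    char(8),
        @dateToDo datetime
    )
    RETURNS TABLE
    AS
    RETURN
        SELECT steps = STUFF(
            (SELECT CHAR(13) + CHAR(13)
                    + '====== ' + UPPER(p.description) + ' ======' + CHAR(13)
                    + p.instructions
             FROM pm_schedule s
             INNER JOIN pm_procedure p ON s.pmp_id = p.pmp_id
             WHERE s.date_next_todo <= @dateToDo
               AND s.active = 1
               AND (@eq_id IS NULL OR s.eq_id = @eq_id)
               AND (@bl_id IS NULL OR s.bl_id = @bl_id)
               AND (@fl_id IS NULL OR s.fl_id = @fl_id)
               AND (@rm_id IS NULL OR s.rm_id = @rm_id)
             ORDER BY s.interval_type, s.interval_freq DESC
             FOR XML PATH(''), TYPE).value('.', 'VARCHAR(MAX)'),
            1, 2, '');  -- strip the leading CHAR(13) + CHAR(13) separator
    GO
    ```

    The caller would then use CROSS APPLY instead of a scalar call, for example:

    ```sql
    SELECT k.eq_id, k.bl_id, k.fl_id, k.rm_id, f.steps
    FROM (SELECT DISTINCT eq_id, bl_id, fl_id, rm_id
          FROM pm_schedule
          WHERE active = 1 AND date_next_todo <= GETDATE()) k
    CROSS APPLY dbo.GetPmSteps_Inline(k.eq_id, k.bl_id, k.fl_id, k.rm_id, GETDATE()) f;
    ```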


