Updating specific rows

  • I am having a problem, but I think I may be making it harder on myself. I need to update a table with new estimates. However, the table to be updated has a column for "benchmarking" data (in 1980, data = 42; in 1985, data = 45; etc.). This benchmarking column was well intentioned, but it was not maintained well and has become essentially useless. That doesn't mean I want to delete the old benchmarks, though.

    So, I need to update the table, but only want the latest benchmark date to be updated. Here is an example of data:

    area  year  benchmark  NbrEmployed

    0     1976  1980       1758544

    0     1976  2005       1804304   << This is the row to update.

    Now, I know if I just want a result set, I can use the query:

    SELECT area, [year], MAX([benchmark]), MIN([NbrEmployed])

    FROM Table

    WHERE [year] = '1976'

    GROUP BY area, [year]

    However, I don't see how this would work using an UPDATE or MERGE query. Does anyone have any ideas? Thank you -- Amy

  • Hi,

    I think the following statement should help:

    UPDATE t

    SET <column_name> = <new_value>

    FROM Table t

    INNER JOIN (

        SELECT area,

               [year],

               MAX(benchmark) AS benchmark,

               NbrEmployed

        FROM Table

        GROUP BY area,

                 [year],

                 NbrEmployed

    ) x ON t.area = x.area

       AND t.[year] = x.[year]

       AND t.benchmark = x.benchmark

       AND t.NbrEmployed = x.NbrEmployed

    Kind regards,

    Patrick Fiedler
    Consultant

    Fiedler SQL Consulting

    www.fiedler-sql-consulting.de

  • Not a lot to go on, but maybe something like this:

    DECLARE @rows_updated TABLE (

        area <same_datatype_as_in_table>,

        year <same_datatype_as_in_table>,

        benchmark <same_datatype_as_in_table>

    )

    UPDATE tn

    SET <column_name> = <new_value>

    OUTPUT INSERTED.area, INSERTED.year, INSERTED.benchmark INTO @rows_updated

    FROM dbo.tablename tn

    INNER JOIN (

        SELECT area,

               [year],

               MAX([benchmark]) AS [benchmark],

               MIN([NbrEmployed]) AS [NbrEmployed]

        FROM dbo.tablename

        WHERE year = '1976'

        GROUP BY area, year

    ) AS matching_rows ON

        matching_rows.area = tn.area AND

        matching_rows.year = tn.year AND

        matching_rows.benchmark = tn.benchmark

    SQL DBA, SQL Server MVP (07, 08, 09)

    A socialist is someone who will give you the shirt off *someone else's* back.

  • Amy.G (2/25/2013)


    ...Now, I know if I just want a result set, I can use the query...

    If you can produce the result set, then you can produce an update statement using a CTE, like so:

    -- Sample data

    DECLARE @testData TABLE (id int identity primary key, txt1 varchar(10), yr int, updateme varchar(20));

    DECLARE @old varchar(10) = 'old data';

    INSERT INTO @testData VALUES ('xxx',1999,@old), ('xxx',1998,@old), ('yyy',1995,@old), ('xyz',1998,@old);

    -- Before

    SELECT * FROM @testData;

    -- Update using a CTE

    ;WITH IDs AS

    (

        SELECT txt1, MAX(yr) AS yr

        FROM @testData

        GROUP BY txt1

    ),

    UpdateThis AS

    (

        SELECT updateme

        FROM @testData td

        JOIN IDs x ON td.txt1 = x.txt1 AND td.yr = x.yr

    )

    UPDATE UpdateThis

    SET updateme = 'new estimate';

    -- After

    SELECT * FROM @testData;

    "I can't stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • Alan.B. -- not only did it work, but I learned something completely new. Thank you.

    Amy

  • No problem Amy.

    That's actually a newer technique for me too and has been very helpful. You can do deletes in the same way...

    ;WITH X AS (SELECT * FROM dbo.emp WHERE active=0)

    DELETE FROM X
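
    Since the original question also mentioned MERGE: the same latest-benchmark targeting can be sketched that way too. This is only a sketch using the table/column names from the first post, with <new_estimate> as a placeholder -- only the row holding the highest benchmark per area/year gets updated:

    MERGE dbo.tablename AS t

    USING (

        SELECT area, [year], MAX([benchmark]) AS benchmark

        FROM dbo.tablename

        GROUP BY area, [year]

    ) AS latest

        ON  t.area = latest.area

        AND t.[year] = latest.[year]

        AND t.benchmark = latest.benchmark

    WHEN MATCHED THEN

        UPDATE SET t.NbrEmployed = <new_estimate>;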

    "I can't stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001
