Update a field with Stored Procedure output

  • Hello All,

    I'm new here, first post. I hope someone can help me with this. Here's the problem.

    My company has bought a new system based on SQL Server 2005 and I need to convert our data. Many of their tables have a column called pKey which needs to be populated. Their system has a stored procedure that they use to get the next number, as follows:

    declare @pKey int
    exec dbo.getnextkey 'tablename.fieldname', NumberOfKeys, @pKey output

    Typically you would use the value 1 for NumberOfKeys, but the procedure was coded to allow a large number of keys to be "reserved" in one call.
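
    For example, reserving a block of 100 keys in one call would look something like this (assuming the output parameter comes back with the first number of the reserved block, which I would have to verify in their code):

    declare @pKey int

    -- reserve 100 keys in a single call
    -- assumption: @pKey is returned as the first key of the reserved block
    exec dbo.getnextkey 'tablename.fieldname', 100, @pKey output

    -- the reserved range would then be @pKey through @pKey + 99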

    Now here's the problem I have with this: how can I copy records from my old tables to the new ones and populate the pKey column properly without using a cursor?

    I know I can't call a procedure to update a field within an insert or update statement. What I've been doing so far is:

    declare @pKey int
    declare @field1 char(x)
    declare @field2 char(x)

    declare xCurs cursor for select field1, field2 from OldTable

    open xCurs
    fetch next from xCurs into @field1, @field2

    while @@fetch_status = 0
    begin
        /* get the next key, then insert the row with it */
        exec dbo.getnextkey 'NewTable.pKey', 1, @pKey output
        insert into NewTable (pKey, Field1, Field2) values (@pKey, @field1, @field2)

        fetch next from xCurs into @field1, @field2
    end

    close xCurs
    deallocate xCurs

    This is a very simple example of what I have to do. On certain tables we have quantities by size stored horizontally in 24 columns, where the new system stores them vertically, so for each record in the cursor I have to run a loop from 1 to 24 within the Begin/End of the cursor loop. On a table with 10,000 records I end up creating 240,000 rows in the new database, and that takes 2 hours.

    I know it would run in a fraction of the time if I could use something like:

    Select Field1 into #Temp from OldTable
    Union All
    Select Field2 from OldTable
    Union All
    Select Field3 from OldTable
    ...

    up to 24 times, and then re-read my temp table to populate my new table. My only problem is how/where/when to call my stored procedure to get the next pKey for each new record.
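
    (Side note: since the new system is SQL 2005, I suppose the 24 UNION ALLs could also be written as an UNPIVOT, something like the sketch below with made-up column names - though that still leaves the question of where the pKey comes from. Keep in mind UNPIVOT drops NULLs where UNION ALL keeps them.)

    select SizeQty
    into #Temp
    from OldTable
    unpivot (SizeQty for SizeColumn in (Field1, Field2, Field3 /* ... Field24 */)) as upvt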

  • The first question is: Are you the only person currently adding data to these tables?

    If so, my second question is: Can you find a range of viable PK values, include those in your temp table (as outlined in your post) and insert those?

    If, for example, the highest PK value currently in the table is 1025, and you're inserting 5,000 rows, then you start with ID 1026 and end with ID 6025, run your insert in one chunk, and you're good to go.
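
    Something along those lines could be done in one set-based insert with ROW_NUMBER (you're on 2005, so it's available). This is just a rough sketch with made-up table and column names:

    declare @maxKey int
    select @maxKey = isnull(max(pKey), 0) from NewTable

    -- assign contiguous keys starting right after the current maximum
    insert into NewTable (pKey, Field1, Field2)
    select @maxKey + row_number() over (order by Field1), Field1, Field2
    from OldTable

    -- you'd still want to bump the vendor's key counter afterwards
    -- so their getnextkey proc doesn't hand out duplicates later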

    If, however, you aren't the only one adding data, that won't work, and you might have to skip a few numbers ahead to make it work - for example, start at 2000 in the above example, if you can be sure the table won't have reached a PK value of 2000 by the time you run your insert.

    (By the way, I can't be sure, but it sure sounds like the database was designed by someone who was probably trying to prevent exactly what you're talking about. The vendor is probably looking for a "migration fee" of some sort.)

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • If I understand the problem properly I think that this would work.

    SELECT
        IDENTITY(INT, 1, 1) AS pKey
        ,*
    INTO ##T1
    FROM [YOUR TABLE]

    This will give you a temporary table with pKeys.

    Then you could use the union query that you were describing to convert the columns to rows.

    Check out the online help for details of the identity function (not the identity property). You can use it to set the starting value and increment as well as the data type.
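
    For example, to continue the numbers from the earlier example (highest existing key 1025), something like this sketch would seed the function at 1026 with an increment of 1 (written into a second temp table so it doesn't collide with ##T1):

    SELECT
        IDENTITY(INT, 1026, 1) AS pKey   -- data type, seed, increment
        ,*
    INTO ##T2
    FROM [YOUR TABLE]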

    Hope this is what you're looking for.

  • Thank you both for the replies. I finally came up with a solution last night that works well. From the 2 hours that it took with the cursor I'm down to 1.5 minutes.

    Since this thread can still be useful to others I will answer the question and post my solution.

    GSquared, this will be a one-shot deal to migrate our data when we go live with the new system, so when I run it I will be the only one writing to the tables.

    I don't think the database was designed to prevent migration, since they weren't interested in doing it themselves. They actually recommended that our users re-enter everything. :w00t: It's just the way they designed it: they have pKeys and fKeys in many of their tables, but none of them are IDENTITY columns. They have a table somewhere to keep their counters and a stored procedure that returns the next number and updates the counter.

    MJL, I don't have much experience with MS SQL yet, so I didn't know about the Identity() function. It's definitely a neat way of doing it. However, I'd have to find their counters and make sure to set the next number to one past my highest pKey once I'm done.
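
    If I did go that route, syncing their counter afterwards would be something along these lines. This is purely a guess at how their counter table might look - I'd have to find the real table and column names first:

    /* hypothetical counter table and column names */
    update dbo.KeyCounters
    set NextNumber = (select max(pKey) + 1 from NewTable)
    where KeyName = 'NewTable.pKey'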

    Now here's how I finally did it.

    declare @pKey int

    /* "Verticalize" my 24 columns */
    select Field1 into #1 from InputTable
    union all
    select Field2 from InputTable
    union all
    select Field3 from InputTable
    .....
    up to Field24

    /* Add my pKey field */
    alter table #1 add pKey int

    /* Add an index on pKey - more on this below */
    create index #ix on #1 (pKey)

    /* Get the number of records in my temp table */
    declare @Count int
    set @Count = (select count(*) from #1)

    /* Loop through to update my pKey column */
    while @Count > 0
    begin
        /* Get the next key */
        exec dbo.getnextkey 'tablename.pkey', 1, @pKey output

        /* Update the first record that hasn't been updated yet */
        update top (1) #1 set pKey = @pKey where pKey is null

        /* Decrement counter and loop */
        set @Count = @Count - 1
    end

    /* Finally I can populate the table in the new database */
    insert into NewTable
    (
        [Fields List]
    )
    select [Fields List] from #1

    The first time I ran the loop I didn't have an index on my #1 table, and after 5 minutes it was still running. Then it occurred to me that as pKey gets updated, SQL Server has to read more records on each pass of the loop to find the first one that still has a NULL pKey. Creating an index on the pKey column fixed that problem: there is always a record with a NULL pKey at the top of the index, so each pass only reads one record and the loop runs extremely fast.
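
    One more thought: since getnextkey can reserve a block of keys in one call, the whole update loop could probably be replaced by a single call plus ROW_NUMBER, assuming the output parameter returns the first key of the reserved block (I haven't verified that). A rough sketch:

    declare @Count int, @firstKey int
    select @Count = count(*) from #1

    /* one call reserves the whole block; @firstKey assumed to be the first key of the range */
    exec dbo.getnextkey 'tablename.pkey', @Count, @firstKey output

    /* number the rows and hand out the reserved keys in one update */
    ;with numbered as
    (
        select pKey, row_number() over (order by (select null)) as rn
        from #1
    )
    update numbered set pKey = @firstKey + rn - 1;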

    I hope this can help somebody else in the future.

  • Glad you got it working.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
