September 16, 2009 at 2:08 am
Hi,
I'm sure this has been answered before, but I'm not sure what to search for.
If I have a stored procedure such as:
...
SELECT id
FROM sometable
WHERE columnA = 1
-- returns 10 rows
UPDATE sometable
SET someothercolumn = someothervalue
WHERE columnA = 1
-- updates 10 rows
...
What happens if the data in "sometable" is changed by another sproc between these two statements (e.g. an extra row is added where columnA = 1)? Is the set of rows changed by the UPDATE statement the same as the set of rows returned by the first SELECT, or is the extra row updated as well?
How can I ensure the same set of rows is addressed by each statement?
Thanks for any help! I just need a definitive answer, since I'm a bit lost with the transaction isolation options.
September 16, 2009 at 2:14 am
It depends on your transaction isolation level:
http://msdn.microsoft.com/en-us/library/ms173763.aspx
The question is what you want to happen.
Do you want the other process blocked?
Do you want the update to see the new data?
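If it's the blocking behaviour you want, one option (a rough sketch only, using the table and column names from your post, not tested against your schema) is to wrap both statements in a SERIALIZABLE transaction so no other process can add or change a qualifying row until you commit:
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRANSACTION

    SELECT id
    FROM sometable
    WHERE columnA = 1
    -- range locks are now held on the qualifying rows

    UPDATE sometable
    SET someothercolumn = someothervalue
    WHERE columnA = 1
    -- updates exactly the rows the SELECT returned

COMMIT TRANSACTION
The trade-off is that any other process touching those rows, or trying to insert a new row where columnA = 1, is blocked until the transaction commits.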
September 16, 2009 at 2:36 am
Thanks Dave. It seems I'd need to add
SET TRANSACTION ISOLATION LEVEL SNAPSHOT
since I'd like to disregard any changes made outside the current process. It appears that I can set this isolation level for the sproc and it will return to the default isolation level once the sproc has completed, is that correct?
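Something like this is what I have in mind (just a sketch, and I realise SNAPSHOT also has to be enabled at the database level first):
-- one-off database setting, otherwise SNAPSHOT can't be used
-- "MyDatabase" is just a placeholder for the real database name
ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON

-- inside the sproc
SET TRANSACTION ISOLATION LEVEL SNAPSHOT
BEGIN TRANSACTION

    SELECT id
    FROM sometable
    WHERE columnA = 1
    -- sees the data as it was when the transaction started

    UPDATE sometable
    SET someothercolumn = someothervalue
    WHERE columnA = 1
    -- ignores rows added by other processes after the transaction started

COMMIT TRANSACTION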
September 16, 2009 at 2:56 am
You could do, although I believe that SNAPSHOT has a fairly high overhead.
Arguably this could be better:
SELECT id
INTO #IdsToUpdate
FROM sometable
WHERE columnA = 1

UPDATE sometable
SET someothercolumn = someothervalue
FROM #IdsToUpdate
WHERE #IdsToUpdate.id = sometable.id
September 16, 2009 at 3:22 am
That's exactly what I was doing, but I thought there had to be another way. Do you think the snapshot solution would incur too much overhead? The sproc is run once every 10 minutes and doesn't operate on a vast amount of data.
Thanks for the help!
September 16, 2009 at 4:39 am
As ever, it depends. Personally, I've never used snapshot isolation, so I can't really comment further.
Try this link, though:
http://www.sqlnewsgroups.net/group/microsoft.public.sqlserver.server/topic11164.aspx
As to whether it's acceptable in your environment with your data, it's all down to testing.
September 16, 2009 at 8:00 pm
bill36email (9/16/2009)
That's exactly what I was doing, but I thought there had to be another way. Do you think the snapshot solution would incur too much overhead? The sproc is run once every 10 minutes and doesn't operate on a vast amount of data.
Thanks for the help!
There is another way... look up the OUTPUT clause in Books Online and forget about using the SELECT statement altogether. Lots of examples in that thread, as well.
--Jeff Moden
Change is inevitable... Change for the better is not.
September 16, 2009 at 8:14 pm
I still recommend you look up and study OUTPUT in Books Online, but here's a working example...
--===== Create and populate a 1,000,000 row test table.
-- Column "SomeID" has a range of 1 to 1,000,000 unique numbers
-- Column "SomeInt" has a range of 1 to 50,000 non-unique numbers
-- Column "SomeLetters2" has a range of "AA" to "ZZ" non-unique 2 character strings
-- Column "SomeMoney" has a range of 0.0000 to 99.9999 non-unique numbers
-- Column "SomeDate" has a range of >=01/01/2000 and <01/01/2010 non-unique date/times
-- Column "SomeCSV" contains 'Part01,Part02,Part03,Part04,Part05,Part06,Part07,Part08,Part09,Part10'
-- for all rows.
-- Column "SomeHex12" contains 12 random hex characters (ie, 0-9,A-F)
-- Jeff Moden
SELECT TOP 1000000
SomeID = IDENTITY(INT,1,1),
SomeInt = ABS(CHECKSUM(NEWID()))%50000+1,
SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID()))%26+65)
+ CHAR(ABS(CHECKSUM(NEWID()))%26+65),
SomeCSV = CAST('Part01,Part02,Part03,Part04,Part05,Part06,Part07,Part08,Part09,Part10' AS VARCHAR(80)),
SomeMoney = CAST(ABS(CHECKSUM(NEWID()))%10000 /100.0 AS MONEY),
SomeDate = CAST(RAND(CHECKSUM(NEWID()))*3653.0+36524.0 AS DATETIME),
SomeHex12 = RIGHT(NEWID(),12)
INTO dbo.JBMTest
FROM Master.dbo.SysColumns t1,
Master.dbo.SysColumns t2 --Lack of join criteria makes this a CROSS-JOIN
--===== A table is not properly formed unless a Primary Key has been assigned
-- Takes about 1 second to execute.
ALTER TABLE dbo.JBMTest
ADD PRIMARY KEY CLUSTERED (SomeID)
--===== Do the combination of the SELECT and UPDATE using just an UPDATE
UPDATE dbo.JBMTest
SET SomeCSV = 'Changed'
OUTPUT Deleted.SomeID,
Deleted.SomeCSV AS BeforeChange,
Inserted.SomeCSV AS AfterChange
WHERE SomeInt = 1000
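And if you need to hang on to the list of rows the UPDATE actually touched, the OUTPUT can be sent to a table variable instead of being returned to the client. A rough sketch using the same test table (the @Updated name is just for illustration):
--===== Capture the affected rows in a table variable instead of returning them
DECLARE @Updated TABLE
        (SomeID INT, BeforeChange VARCHAR(80), AfterChange VARCHAR(80))

 UPDATE dbo.JBMTest
    SET SomeCSV = 'Changed'
 OUTPUT Deleted.SomeID,
        Deleted.SomeCSV,
        Inserted.SomeCSV
   INTO @Updated (SomeID, BeforeChange, AfterChange)
  WHERE SomeInt = 2000

--===== These are exactly the rows the UPDATE changed
 SELECT SomeID, BeforeChange, AfterChange
   FROM @Updated
Because the OUTPUT is part of the UPDATE itself, the rows captured in @Updated are exactly the rows that were changed, no matter what any other process does in between.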
--Jeff Moden
Change is inevitable... Change for the better is not.