January 23, 2008 at 5:34 am
Hi All,
If I have a delete query that would delete many thousands of rows, is there a way to cause it to delete just 100 rows at a time? I have no idea how to do this in T-SQL, but if this were another language I would probably use a loop. Is this possible in SQL?
Many thanks,
Phil
January 23, 2008 at 5:50 am
Yes... not only is it possible, but it's a frequent recommendation... just change the SET ROWCOUNT value to what you want in the code that follows... note that this is a complete test example including a test table so you can "play" with it... details for everything are in the comments...
--===== If the test table exists, drop it
IF OBJECT_ID('dbo.JBMTestDetail','U') IS NOT NULL
DROP TABLE dbo.JBMTestDetail
GO
--===== Create and populate a 1,000,000 row test table.
-- Column "ConnID" has a range of 1 to 100,000 non-unique numbers stored as VARCHAR(30)
-- Column "Key1" has a range of 1 to 100,000 non-unique numbers stored as VARCHAR(20)
-- Column "SomeMoney" has a range of 0.0000 to 99.9999 non-unique numbers
-- Column "Time_Stamp" has a range of >=01/01/2005 and <01/01/2015 non-unique date/times
-- Column "SomeCSV" contains 'Part01,Part02,Part03,Part04,Part05,Part06,Part07,Part08,Part09,Part10'
-- for all rows.
-- Column "SomeHex12" contains 12 random hex characters (ie, 0-9,A-F)
SELECT TOP 1000000
ConnID = ISNULL(CAST(ABS(CHECKSUM(NEWID()))%100000+1 AS VARCHAR(30)),''), --(10 rows per connection)
Key1 = ISNULL(CAST(ABS(CHECKSUM(NEWID()))%100000+1 AS VARCHAR(20)),''), --just to test index with
SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID()))%26+65)
+ CHAR(ABS(CHECKSUM(NEWID()))%26+65),
SomeCSV = CAST('Part01,Part02,Part03,Part04,Part05,Part06,Part07,Part08,Part09,Part10' AS VARCHAR(80)),
SomeMoney = CAST(ABS(CHECKSUM(NEWID()))%10000 /100.0 AS MONEY),
Time_Stamp = ISNULL(CAST(RAND(CHECKSUM(NEWID()))*3652.0+38351.0 AS DATETIME),0),
SomeHex12 = RIGHT(NEWID(),12)
INTO dbo.JBMTestDetail
FROM Master.dbo.SysColumns t1,
Master.dbo.SysColumns t2 --Lack of join criteria makes this a CROSS-JOIN
--===== Create indexes similar to Troy's
CREATE CLUSTERED INDEX IXC_JBMTestDetail_Time_Stamp_ConnID ON dbo.JBMTestDetail (Time_Stamp,ConnID)
CREATE NONCLUSTERED INDEX IX_JBMTestDetail_ConnID_Key1 ON dbo.JBMTestDetail (ConnID,Key1)
GO
--===== Setup to measure performance...
SET STATISTICS TIME ON
--========================================================================
-- Demo the delete crawler on the test table constructed above
--========================================================================
--===== Define the cutoff date with a time of "midnight" or, if you will,
-- define the cutoff date with no time so we only delete whole days.
DECLARE @CutoffDate DATETIME
SELECT @CutoffDate = DATEADD(dd,DATEDIFF(dd,0,GETDATE()),-94)
--===== Limit all further queries, including deletes, to 25,000 rows
-- (about 1 second worth of deletes, like I said before)
SET ROWCOUNT 25000
--===== See if any rows qualify for deletion. If even just one exists,
-- then there's work to do and @@ROWCOUNT will be > 0.
-- DO NOT PUT ANY CODE BETWEEN THIS SELECT AND THE WHILE LOOP OR
-- YOU'LL MESS UP THE ROWCOUNT FOR THE WHILE LOOP
SELECT TOP 1 1 FROM dbo.JBMTestDetail WHERE Time_Stamp < @CutoffDate
--===== If the rowcount from the above is greater than 0,
-- then delete 25,000 rows at a time until there's nothing
-- left to delete
WHILE @@ROWCOUNT > 0
BEGIN
--===== Just a "marker" to separate the loop in the output
PRINT REPLICATE('=',78)
--===== This delay gives other processes breathing room
WAITFOR DELAY '00:00:10'
--===== Do the delete. Will be limited by the SET ROWCOUNT above.
-- DO NOT PUT ANY CODE BETWEEN THIS DELETE AND THE "END" OR
-- YOU'LL MESS UP THE ROWCOUNT FOR THE WHILE LOOP.
DELETE dbo.JBMTestDetail WITH (TABLOCKX)
WHERE Time_Stamp < @CutoffDate
END
--===== Restore the ability to process more than 25,000 rows
SET ROWCOUNT 0
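A note for later readers: from SQL Server 2005 on, using SET ROWCOUNT to limit INSERT/UPDATE/DELETE is deprecated, and the same "crawler" can be written with DELETE TOP (n) instead. Here's a sketch of that variant against the same test table (not Jeff's original code, just the same idea in the newer syntax):

```sql
--===== Same batched delete, using DELETE TOP (n) instead of SET ROWCOUNT
--      (SET ROWCOUNT no longer limits DML statements in newer versions)
DECLARE @CutoffDate DATETIME
SELECT @CutoffDate = DATEADD(dd,DATEDIFF(dd,0,GETDATE()),-94)

WHILE 1 = 1
BEGIN
        --===== Delete at most 25,000 qualifying rows per pass
        DELETE TOP (25000) dbo.JBMTestDetail
         WHERE Time_Stamp < @CutoffDate
        --===== When nothing was deleted, we're done
        IF @@ROWCOUNT = 0 BREAK
        --===== This delay gives other processes breathing room
        WAITFOR DELAY '00:00:10'
END
```

One nicety of this form is that there's no session-level setting to remember to turn back off afterward.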
--Jeff Moden
Change is inevitable... Change for the better is not.
January 23, 2008 at 6:42 am
Hi Jeff,
Thanks for that. I must congratulate you on how well documented and well presented your examples are - it makes learning so much easier.
Regards,
Phil
January 23, 2008 at 6:51 am
Thanks for the feedback, Phil. Really appreciate your comments.
By the way (almost forgot)... they haven't fixed the code windows on this forum, yet. If you just copy and paste, they come out all on one line.
If you put your cursor one line above the code window, click and drag to one line below the code window, copy, paste into Word, replace ^l with ^p, and copy/paste that into SQL, all formatting including leading spaces will be preserved...
--Jeff Moden
Change is inevitable... Change for the better is not.
January 23, 2008 at 6:55 am
I did discover this, but just typed the code!
Cheers,
Phil
January 23, 2008 at 8:15 am
By the way (almost forgot)... they haven't fixed the code windows on this forum, yet. If you just copy and paste, they come out all on one line.
If you put your cursor one line above the code window, click and drag to one line below the code window, copy, paste into Word, replace ^l with ^p, and copy/paste that into SQL, all formatting including leading spaces will be preserved...
hence my little art. "How to handle lost LF/CR when copying - SQL Server Central" at http://www.sqlservercentral.com/articles/SQLServerCentral.com/61520/
:w00t::cool:
Johan
Learn to play, play to learn !
Dont drive faster than your guardian angel can fly ...
but keeping both feet on the ground wont get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of Powershell? How about this
Who am I ? Sometimes this is me but most of the time this is me
January 23, 2008 at 8:39 pm
Dang... you wrote an article on it, they published it, and they still don't get the hint???
By the way, the "bad" character is actually a CHAR(11) or "Vertical Tab" in the ASCII character set.
And, I hadn't thought of using "Paste Special"... that's a great time saver! Thanks!
--Jeff Moden
Change is inevitable... Change for the better is not.
January 25, 2008 at 3:47 am
Hi Jeff,
Is this principle of limiting the number of rows also applicable to update operations? I have to update several hundred thousand rows and I want to do this in the most effective (and stable) way!
Thanks for all your help thus far,
Phil
January 25, 2008 at 3:59 am
just my 2 ct
- with mass deletes:
- limit the number of deletes per transaction to avoid (dead)locking.
- perform full table maintenance after the operation to optimize
space consumption and statistics.
- with mass updates:
- determine your impact scope, i.e. can you perform the update in
chunks (like with delete) and still have your data semantically correct?
- keep in mind the log overhead, and the I/O overhead due to page splits
if the row no longer fits in the original page, ...
- perform full table maintenance after the operation to optimize
space consumption and statistics.
- maybe you're better off creating a staging table, truncating
the original, and inserting the data according to the clustering key
sequence.
- prepare the operation, determine impact timeframe(s), involve your data consumers !
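The chunked-update idea above can be sketched much like the delete crawler; the table and the Processed flag below are made up purely for illustration. The key is a WHERE clause that the UPDATE itself falsifies, so each pass naturally picks up the next batch:

```sql
--===== Chunked update sketch (hypothetical table and column names).
--      Each pass updates up to 25,000 rows the predicate still matches,
--      so the loop stops when no qualifying rows remain.
WHILE 1 = 1
BEGIN
        UPDATE TOP (25000) dbo.SomeTable   --hypothetical table
           SET Processed = 1               --hypothetical flag column
         WHERE Processed = 0
        --===== When nothing was updated, we're done
        IF @@ROWCOUNT = 0 BREAK
        --===== Short delay gives other processes a chance at the table
        WAITFOR DELAY '00:00:02'
END
```

If the update doesn't change a column you can filter on, a common alternative is to walk the clustering key in ranges instead of using a flag.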
Johan
January 26, 2008 at 8:06 am
Philip Barry (1/25/2008)
Hi Jeff, is this principle of limiting the number of rows also applicable to update operations? I have to update several hundred thousand rows and I want to do this in the most effective (and stable) way!
Thanks for all your help thus far,
Phil
Basically, yes. Along with some of the considerations ALZDBA has above...
--Jeff Moden
Change is inevitable... Change for the better is not.