delete a large amount of rows in a table
Posted Wednesday, February 27, 2013 1:18 PM
Forum Newbie

Please help me make this query more efficient. I've been waiting more than an hour for it to complete and it still hasn't finished, and I have about 25 large tables to delete from.

This is what I wrote, but it is very slow:

SELECT *
INTO WORKTBL
FROM [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL
WHERE m_idplayer NOT IN (SELECT DISTINCT(m_idPlayer) COLLATE DATABASE_DEFAULT
                         FROM [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL]
                         WHERE End_Time > '20130123015613')

DROP TABLE [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL

SELECT * INTO [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL FROM WORKTBL

DROP TABLE WORKTBL


Do you know a better method? Something much faster than hours and hours of waiting for it to complete?
Post #1424722
Posted Wednesday, February 27, 2013 2:08 PM


SSC Eights!

What percentage of the rows are you keeping and how many rows are there in the tables?

At some point it's easier just to loop a delete statement, depending on how much data you are getting rid of.



For faster help in answering any problem, please read How to post data/code on a forum to get the best help - Jeff Moden for the best way to ask your question.

For performance issues, see how we like them posted here: How to Post Performance Problems - Gail Shaw

Need to split some strings? Jeff Moden's DelimitedSplit8K
Jeff Moden's Cross tab and Pivots Part 1
Jeff Moden's Cross tab and Pivots Part 2

Jeremy Oursler
Post #1424743
Posted Wednesday, February 27, 2013 2:13 PM
Forum Newbie

133,723 rows total, and I want to keep 77,646.

I want to delete rows according to the query you see above, but when I tried to run a count of the remaining rows, that also took a long time :S

I'm really confused about the best solution for this...

Please help me however you can.

My main problem is that I have 10 tables with about 133k rows each, of which I want to keep about 77k, so this will take ages unless I find a solution or you guys tell me one.
Post #1424744
Posted Wednesday, February 27, 2013 2:25 PM


SSC Eights!

insanityflyff (2/27/2013)
133,723 rows total, and I want to keep 77,646. [...]


133k rows, is that the size of [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL?

At that size it may just be easier to run a simple delete. Correct me if I am wrong, but it looks like your initial query is selecting the rows that are in SKILLINFLUENCE_TBL but not in the logging table? So if an m_idplayer exists in the logging table, we can delete it from the main table?
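A minimal sketch of that single DELETE, reusing the table and column names from the first post; it removes the same rows the worktable approach dropped, i.e. those whose m_idplayer does appear in the recent-logins subquery, so it is only right if those really are the rows to get rid of:

DELETE s
FROM [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL AS s
WHERE EXISTS (SELECT 1
              FROM [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL] AS l
              WHERE l.m_idPlayer COLLATE DATABASE_DEFAULT = s.m_idplayer
                AND l.End_Time > '20130123015613');
-- One statement, no second copy of the table, and the existing indexes and constraints are kept.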



Post #1424749
Posted Wednesday, February 27, 2013 2:31 PM
Forum Newbie

DELETE FROM [CHARACTER_01_DBF_OLD].[dbo].CHARACTER_TBL
WHERE m_idPlayer NOT IN (SELECT DISTINCT(m_idPlayer) COLLATE DATABASE_DEFAULT
                         FROM [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL])

This query should delete all the m_idPlayers that don't appear in LOG_LOGIN_TBL, right?
Post #1424750
Posted Wednesday, February 27, 2013 2:35 PM


SSC Eights!

It should.

How many rows are in [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL]?

Your subquery could be causing the issue if there are only 133k rows in [CHARACTER_01_DBF].[dbo].SKILLINFLUENCE_TBL and it's still taking hours.

If the subquery is the issue, any of the deletes that use it will take a while as well.
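If the NOT IN subquery is the bottleneck, one common rewrite (a sketch against the same tables, equivalent only while m_idPlayer is never NULL in the logging table) is NOT EXISTS, which drops the DISTINCT and lets the optimizer stop probing the logging table at the first match:

DELETE c
FROM [CHARACTER_01_DBF_OLD].[dbo].CHARACTER_TBL AS c
WHERE NOT EXISTS (SELECT 1
                  FROM [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL] AS l
                  WHERE l.m_idPlayer COLLATE DATABASE_DEFAULT = c.m_idPlayer);
-- NOT IN returns no rows at all if the subquery ever yields a NULL; NOT EXISTS avoids that trap.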



Post #1424752
Posted Wednesday, February 27, 2013 2:38 PM
Forum Newbie

By subquery you mean WHERE End_Time > '20130123015613', right?
This means it will delete all the data from before 2013-01-23 01:56:13 (the value is in year-month-day hour-minute-second format).

And that subquery is causing the issue?
Post #1424755
Posted Wednesday, February 27, 2013 2:41 PM


SSC Eights!

That's what I mean by the subquery.

However, how many rows are in the logging table, and what indexes it has, will affect this a lot. Also, is End_Time a DateTime column or a varchar? You might be having an issue with your date format as well.
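Assuming the logging table has no index on m_idPlayer (the thread doesn't say), every probe from the subquery has to scan all of its rows. A sketch of a supporting index; the name is made up, and End_Time is included so the date-filtered form of the subquery is covered too:

CREATE NONCLUSTERED INDEX IX_LOG_LOGIN_TBL_idPlayer_EndTime
    ON [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL] (m_idPlayer, End_Time);

If the logging table is large, building this once is usually far cheaper than scanning it again for every row being checked.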



Post #1424757
Posted Wednesday, February 27, 2013 2:46 PM
Forum Newbie

CapnHector (2/27/2013)
How many rows are in the logging table, and what indexes it has, will affect this a lot. Also, is End_Time a DateTime column or a varchar? [...]


End_Time is a char(14) column.

LOG_LOGIN_TBL has 981,057 rows.
Post #1424764
Posted Wednesday, February 27, 2013 2:46 PM


SSCrazy

In general, when I have a lot of rows to delete, I do it in a loop with a "wait". The advantages for me are:
- Small chunks, so I can see it progress.
- It does not hog the system completely.
- It can be stopped and restarted without starting over.

It may not be the fastest method, but in my case I needed to run it on a production server with minimal impact on others. Experiment with the delay time and delete size.

Basically, I set it up like this:

DeleteMore:

WAITFOR DELAY '00:00:05';   -- give others some CPU

DELETE TOP (10000) FROM tableA WHERE ......;

IF @@ROWCOUNT > 0 GOTO DeleteMore;
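A sketch of the same batch-and-wait pattern applied to the DELETE from post #1424750; the 10,000-row batch size and 5-second delay are arbitrary starting points worth tuning:

DeleteMore:

WAITFOR DELAY '00:00:05';   -- pause between batches so other sessions get CPU

DELETE TOP (10000)
FROM [CHARACTER_01_DBF_OLD].[dbo].CHARACTER_TBL
WHERE m_idPlayer NOT IN (SELECT DISTINCT(m_idPlayer) COLLATE DATABASE_DEFAULT
                         FROM [LOGGING_01_DBF].[dbo].[LOG_LOGIN_TBL]);

IF @@ROWCOUNT > 0 GOTO DeleteMore;   -- keep looping until a batch deletes nothing

Keeping each batch small also keeps the transaction log from growing the way one huge delete would.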



Post #1424765