SQLServerCentral is supported by Red Gate Software Ltd.
Delete duplicate records in a table
chitturiii
Posted Wednesday, December 1, 2010 2:33 AM
Forum Newbie


Hi,

Can you please suggest the best query to delete duplicate records from a table with a large number of records?

Thanks in advance.

--Chandra
Post #1028479
Posted Wednesday, December 1, 2010 2:45 AM


SSChasing Mays



Hi,

Try this one:


declare @tab table (id int, name varchar(10))
insert into @tab (id, name)
select 1, 'Sumit'  union all
select 1, 'Sumit'  union all
select 2, 'Sumit2' union all
select 2, 'Sumit2'

-- number the rows in each (id, name) group; rows numbered 2 or higher are duplicates
;with numbered as
(
    select row_number() over (partition by id, name order by id) as row, id, name
    from @tab
)
delete from numbered
where row > 1

select id, name from @tab -- one row left per (id, name) pair
Post #1028486
Posted Wednesday, December 1, 2010 2:59 AM


SSCertifiable


chitturiii (12/1/2010)
Hi,

Can you please suggest the best query to delete duplicate records from a table with a large number of records?

Thanks in advance.

--Chandra


You may have to delete in batches if there are a very large number of rows to be deleted. Do you have a query which identifies the rows to delete? If not, post the structure of the table as a CREATE TABLE statement along with any indexes, and say which columns are used to identify dupes.
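For the batching itself, one common pattern looks roughly like this; a minimal sketch only, where the table name (dbo.BigTable), key column (SomeKey), and batch size are placeholders rather than anything from this thread:

```sql
-- Hypothetical sketch: delete duplicates in batches so each statement
-- (and the log space it ties up) stays small.
declare @BatchSize int = 10000;

while 1 = 1
begin
    ;with numbered as
    (
        -- rows numbered 2 or higher within a SomeKey group are duplicates
        select row_number() over (partition by SomeKey order by SomeKey) as rn
        from dbo.BigTable
    )
    delete top (@BatchSize) from numbered
    where rn > 1;

    if @@ROWCOUNT = 0 break;  -- no duplicates left
end;
```

In the FULL recovery model you would also take log backups between batches so the log space can be reused.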


“Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
Exploring Recursive CTEs by Example Dwain Camps
Post #1028490
Posted Wednesday, December 1, 2010 9:15 AM
SSC-Enthusiastic


If you want to find duplicates, just use a GROUP BY clause with HAVING to filter the duplicate rows.

select column_name, min(PK_Column_name) as PrimaryKeyColumn
from tablename
group by column_name
having count(*) > 1

The above script identifies duplicate values and shows the minimum primary key value for each. If you intend to keep one row per group, you can keep the row with the minimum primary key value (of course you can change the condition).

The script below deletes all duplicate rows, leaving one distinct row per group (the row with the minimum primary key value):

with Tempinfo (column_name, PrimaryKeyColumn)
as
(
    select column_name, min(PK_Column_name) as PrimaryKeyColumn
    from tablename
    group by column_name
    having count(*) > 1
)
delete a
from tablename a
join Tempinfo b
  on  a.column_name = b.column_name
  and a.PK_Column_name <> b.PrimaryKeyColumn
Post #1028762
Sachin Nandanwar
Posted Friday, December 3, 2010 10:47 PM


Old Hand


Also, it is a better idea to set the recovery model to SIMPLE, or else your log file will bloat. But make sure you take a full backup and a T-log backup before changing the recovery model to SIMPLE.
Post #1030238
Jeff Moden
Posted Saturday, December 4, 2010 11:25 AM


SSC-Dedicated


Sachin Nandanwar (12/3/2010)
Also, it is a better idea to set the recovery model to SIMPLE, or else your log file will bloat. But make sure you take a full backup and a T-log backup before changing the recovery model to SIMPLE.


Agh... be careful now. Changing to SIMPLE breaks the backup chain. This is normally one of the worst things you can do.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1030298
Sachin Nandanwar
Posted Tuesday, December 7, 2010 3:01 AM


Old Hand


Jeff Moden (12/4/2010)
Sachin Nandanwar (12/3/2010)
Also, it is a better idea to set the recovery model to SIMPLE, or else your log file will bloat. But make sure you take a full backup and a T-log backup before changing the recovery model to SIMPLE.


Agh... be careful now. Changing to SIMPLE breaks the backup chain. This is normally one of the worst things you can do.


But that's the reason I suggested taking a FULL backup before changing it to SIMPLE. Then delete the records, change it back to FULL, and take a full backup again.
Correct me if I am missing something.
Post #1031094
Hardy21
Posted Tuesday, December 7, 2010 3:12 AM


Ten Centuries


Sachin Nandanwar (12/7/2010)
Jeff Moden (12/4/2010)
Sachin Nandanwar (12/3/2010)
Also, it is a better idea to set the recovery model to SIMPLE, or else your log file will bloat. But make sure you take a full backup and a T-log backup before changing the recovery model to SIMPLE.


Agh... be careful now. Changing to SIMPLE breaks the backup chain. This is normally one of the worst things you can do.


But that's the reason I suggested taking a FULL backup before changing it to SIMPLE. Then delete the records, change it back to FULL, and take a full backup again.
Correct me if I am missing something.

Execute the query inside a BEGIN TRANSACTION ... COMMIT/ROLLBACK block. If you are satisfied with the result, commit it; otherwise roll it back. That way nothing is changed in your database in case of an error or issue, whereas a backup and restore would take more time.

Once you change the recovery model, your backup chain is no longer valid. You need to take a FULL backup once again and then do differential/transaction log backups as per your backup strategy.
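The transaction wrap described above might look like this; a sketch only, with hypothetical table and column names (dbo.BigTable, SomeKey, Id) standing in for the real schema:

```sql
-- Hypothetical sketch: run the delete inside an explicit transaction,
-- inspect the result, then commit or roll back by hand.
begin transaction;

delete a
from dbo.BigTable a
join dbo.BigTable b
  on  a.SomeKey = b.SomeKey
  and a.Id > b.Id;          -- keep the lowest Id in each SomeKey group

-- any groups still duplicated?
select SomeKey, count(*) as cnt
from dbo.BigTable
group by SomeKey
having count(*) > 1;

-- commit transaction;      -- if the result looks right
-- rollback transaction;    -- otherwise
```

Note this addresses safety only: the delete is still fully logged, and the log space cannot be reused until the transaction is committed or rolled back.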


Thanks
Post #1031098
Jeff Moden
Posted Sunday, December 12, 2010 7:27 PM


SSC-Dedicated


Hardy21 (12/7/2010)
Sachin Nandanwar (12/7/2010)
Jeff Moden (12/4/2010)
Sachin Nandanwar (12/3/2010)
Also, it is a better idea to set the recovery model to SIMPLE, or else your log file will bloat. But make sure you take a full backup and a T-log backup before changing the recovery model to SIMPLE.


Agh... be careful now. Changing to SIMPLE breaks the backup chain. This is normally one of the worst things you can do.


But that's the reason I suggested taking a FULL backup before changing it to SIMPLE. Then delete the records, change it back to FULL, and take a full backup again.
Correct me if I am missing something.

Execute the query inside a BEGIN TRANSACTION ... COMMIT/ROLLBACK block. If you are satisfied with the result, commit it; otherwise roll it back. That way nothing is changed in your database in case of an error or issue, whereas a backup and restore would take more time.

Once you change the recovery model, your backup chain is no longer valid. You need to take a FULL backup once again and then do differential/transaction log backups as per your backup strategy.


The whole point is to try to avoid sending the LOG file through the roof with a bazillion deletes and to try to accelerate the rate of the deletions. Simply adding an explicit transaction will have neither of those effects.


--Jeff Moden
Post #1033549