Slows system performance
Posted Monday, April 15, 2013 4:31 AM
SSC Veteran

Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
Using sp_spaceused, I get:

DB size: 10011.13 MB
Unallocated space: 288.22 MB
Data: 3779736 KB
Index: 5912608 KB
Unused: 140264 KB
Reserved: 9832608 KB

(Note: I am doing performance testing in my DB, where I have only 2 tables.)
Post #1442240
Posted Monday, April 15, 2013 4:55 AM


SSCertifiable

Group: General Forum Members
Last Login: Yesterday @ 9:49 AM
Points: 7,234, Visits: 13,719
Can you run this please?

EXEC sp_spaceused 'mytable'

Change the table name to the name of the table you are deleting from.

Also, run this:

DBCC SQLPERF ( LOGSPACE )



“Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
Exploring Recursive CTEs by Example Dwain Camps
Post #1442250
Posted Monday, April 15, 2013 5:05 AM
SSC Veteran

Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556

Result of EXEC sp_spaceused 'tablename':
Rows: 12309864
Reserved: 6976112 KB
Data: 5038728 KB
Index: 1795848 KB
Unused: 141536 KB

Result of DBCC SQLPERF ( LOGSPACE ):
Log Size (MB): 7584.117
Log Space Used (%): 97.98335
Post #1442254
Posted Monday, April 15, 2013 5:22 AM


SSCertifiable

Group: General Forum Members
Last Login: Yesterday @ 9:49 AM
Points: 7,234, Visits: 13,719
Last of all, can you post the actual batch please? Change the table name if you need to.

Post #1442258
Posted Monday, April 15, 2013 5:40 AM
SSC Veteran

Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
BEGIN TRY

CREATE TABLE #aaa
(
Id BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
xyzmasterCode VARCHAR(15) NOT NULL,
NumberOfRows INT NOT NULL
);

INSERT INTO #aaa (xyzmasterCode, NumberOfRows)
SELECT xyzmasterCode, COUNT(*)
FROM xyz
WHERE xyzmasterCode BETWEEN '0' AND 'Z'
AND filterDate <= @filterDate
GROUP BY xyzmasterCode;

SET @Id = 0;

WHILE 1 = 1
BEGIN
BEGIN TRAN;

DELETE a
FROM xyz a, #aaa
WHERE a.xyzmasterCode = #aaa.xyzmasterCode
AND #aaa.Id BETWEEN @Id AND @Id + 100
AND filterDate <= @filterDate;

-- capture how many rows this batch removed; the loop exits when a batch deletes nothing
SET @DeleteRowcount = @@ROWCOUNT;

DELETE J
FROM xyzmaster J, #aaa
WHERE J.Code = #aaa.xyzmasterCode
AND #aaa.Id BETWEEN @Id AND @Id + 100;

COMMIT TRANSACTION;
CHECKPOINT;

IF @DeleteRowcount <= 0
BREAK;

SET @Id = @Id + 100;

END

END TRY
Post #1442265
Posted Monday, April 15, 2013 6:25 AM


SSCertifiable

Group: General Forum Members
Last Login: Yesterday @ 9:49 AM
Points: 7,234, Visits: 13,719
That looks ok to me, other than the joins to the temp table not being optimized. You don't really need the temp table anyway:

SET @RowsDeleted = 1000;
WHILE @RowsDeleted > 0
BEGIN

BEGIN TRAN;

DELETE TOP(1000) s
FROM [Experimental].[dbo].[SampleData] s
WHERE xyzmasterCode BETWEEN '0' AND 'Z'
AND filterDate <= @filterDate;

SET @RowsDeleted = @@ROWCOUNT;

COMMIT TRAN;

CHECKPOINT;


BEGIN TRAN;

DELETE TOP(1000) s
FROM [Experimental].[dbo].[SampleData] s
WHERE xyzmasterCode BETWEEN '0' AND 'Z';

SET @RowsDeleted = @RowsDeleted + @@ROWCOUNT;

COMMIT TRAN;

CHECKPOINT;

END

You could run it as one transaction spanning both deletes, or as two as shown. I'd try both.
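
A minimal sketch of the single-transaction variant, using the same illustrative table name and batch size as above:

WHILE @RowsDeleted > 0
BEGIN

BEGIN TRAN;

DELETE TOP(1000) s
FROM [Experimental].[dbo].[SampleData] s
WHERE xyzmasterCode BETWEEN '0' AND 'Z'
AND filterDate <= @filterDate;

SET @RowsDeleted = @@ROWCOUNT;

DELETE TOP(1000) s
FROM [Experimental].[dbo].[SampleData] s
WHERE xyzmasterCode BETWEEN '0' AND 'Z';

SET @RowsDeleted = @RowsDeleted + @@ROWCOUNT;

-- one commit covers both deletes, so each pair of batches succeeds or fails together
COMMIT TRAN;

CHECKPOINT;

END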


Post #1442280
Posted Wednesday, April 17, 2013 2:22 AM


Ten Centuries

Group: General Forum Members
Last Login: Today @ 12:43 AM
Points: 1,287, Visits: 1,118
ChrisM@Work (4/15/2013)
That looks ok to me, other than the joins to the temp table not being optimized. You don't really need the temp table anyway: [batched delete loop snipped - see above] You could run it as one transaction spanning both deletes, or as two as shown. I'd try both.

Add a WAITFOR DELAY statement after the commit. This will help to reduce the blocking.
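
For example, something like this (the five-second delay is just an illustrative value):

COMMIT TRAN;

-- pause between delete batches so other sessions get a chance to acquire their locks
WAITFOR DELAY '00:00:05';

CHECKPOINT;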
Post #1443142
Posted Wednesday, April 17, 2013 3:20 AM
Ten Centuries

Group: General Forum Members
Last Login: Today @ 12:41 AM
Points: 1,090, Visits: 6,554
baabhu (4/17/2013)
Add a WAITFOR DELAY statement after the commit. This will help to reduce the blocking.


Can you explain what you mean by this, baabhu? I can understand why a delay might be useful - removing obsolete log data isn't instantaneous - but I don't understand what this has to do with blocking. Cheers.



Low-hanging fruit picker and defender of the moggies

For better assistance in answering your questions, please read this.

Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
Post #1443156
Posted Sunday, April 21, 2013 2:18 PM


SSCrazy

Group: General Forum Members
Last Login: 2 days ago @ 11:12 AM
Points: 2,934, Visits: 2,959
Hi

From your results it seems you have a lot of schema locks.
I had a similar scenario yesterday: no locks, but high CPU. Some update scripts had been running for what seemed like forever. We stopped them, rebuilt the indexes, and ran them again, and they completed fine. Maybe this will help you.

Regards,
IgorMi




Igor Micev,
SQL Server developer at Seavus
www.seavus.com
Post #1444820
Posted Sunday, April 21, 2013 2:50 PM


SSC-Forever

Group: General Forum Members
Last Login: Yesterday @ 3:52 PM
Points: 42,849, Visits: 35,978
IgorMi (4/21/2013)
Just had similar scenario yesterday, without locks but with high cpu. Some update scripts were running for 'forever'. We stopped them, rebuilt the indexes and run again and they passed well.


That was stale statistics resulting in a sub-optimal execution plan (probably estimating one row and getting thousands or millions). Next time, just update the stats; don't waste time doing a full index rebuild.
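
For example (table name illustrative):

UPDATE STATISTICS dbo.xyz WITH FULLSCAN;

-- or, more coarsely, refresh statistics across the whole database:
EXEC sp_updatestats;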

p.s. All queries take shared schema locks when they run; this is normal behaviour. Hence having lots of schema locks is not a problem (unless you have a bunch of schema modification locks too).
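
A quick way to check for those, as a sketch against the locks DMV:

SELECT resource_type, request_mode, request_session_id
FROM sys.dm_tran_locks
WHERE resource_type = 'OBJECT'
AND request_mode = 'Sch-M'; -- schema modification locks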



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1444822