Slows system performance
Posted Monday, April 8, 2013 8:11 AM
SSC Veteran


Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
Thanks for the link, it was really informative. I changed the setting after reading the article.

My log file is growing a lot, about 10 MB every 8-9 seconds. What could be the reason?

I am reading records from a table with millions of rows and updating one column in the same table, at most 10,000 rows at a time. Each update is wrapped in BEGIN TRAN and COMMIT TRAN, so why should the log file grow?

Note: I have one index that has the updated column in its INCLUDE list.
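
A minimal sketch of the batched-update pattern described above; the table name (dbo.BigTable), column (Col1), and filter are illustrative placeholders, not details from this post:

DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRAN;

    -- Update at most 10,000 rows per transaction; every changed row is still fully logged
    UPDATE TOP (10000) dbo.BigTable
    SET Col1 = 'new value'
    WHERE Col1 <> 'new value';   -- touch only rows that still need the change

    SET @rows = @@ROWCOUNT;      -- capture before COMMIT, which resets @@ROWCOUNT

    COMMIT TRAN;
END;

Even in small batches, each committed transaction still writes its changes to the transaction log, which is why the log keeps growing during the run.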




Post #1439857
Posted Monday, April 8, 2013 9:12 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:05 AM
Points: 7,177, Visits: 13,629
Your transaction log is currently too small to accommodate logging of the changes you are making to your tables in this series of statements.
Check which recovery model you are using. If your local instance is for training or testing, you probably don't need log backups, in which case setting your recovery model to SIMPLE is likely to help. For a detailed look at transaction logs and an explanation of why this is the case, try this article by Gail Shaw.
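
A quick way to check the current setting and, if appropriate, switch it (MyDb is a placeholder database name):

-- Check the recovery model
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'MyDb';

-- Switch to SIMPLE if point-in-time restores are not required
ALTER DATABASE MyDb SET RECOVERY SIMPLE;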



“Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
Exploring Recursive CTEs by Example Dwain Camps
Post #1439897
Posted Tuesday, April 9, 2013 10:50 AM
SSC Veteran


Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
Thanks for the link. Autogrowth was set to 10%, so the log was growing every few seconds. After I changed it to a fixed growth of a few MB, performance has improved.
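
For reference, a fixed growth increment can be set roughly like this; the database name (MyDb) and logical log file name (MyDb_log) are placeholders:

-- Grow the log in fixed 64 MB steps instead of 10% of the current file size
ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_log, FILEGROWTH = 64MB);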

I have the following code:

WHILE 1 = 1
BEGIN
    BEGIN TRAN;

    DELETE TOP (1000)
    FROM dbo.MyTable                          -- placeholder for the actual table name
    OUTPUT deleted.* INTO dbo.DeleteHistory
    WHERE PrimaryKeyCol = @variable;

    IF @@ROWCOUNT < 1000
    BEGIN
        COMMIT TRAN;
        BREAK;
    END;

    COMMIT TRAN;
END;

The output table (DeleteHistory) is also in the same database. I noticed that in this case the tempdb data and log files are now growing as well. If I am deleting records and adding the same information to another table, why should the data and log files grow?
Also, there is a COMMIT TRAN after every 1,000 records, but if the process breaks it takes a long time to recover.
Post #1440471
Posted Wednesday, April 10, 2013 12:59 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:05 AM
Points: 7,177, Visits: 13,629
Which recovery model is your db set to?

Post #1440669
Posted Wednesday, April 10, 2013 1:06 AM
SSC Veteran


Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
Recovery model: SIMPLE
Post #1440670
Posted Wednesday, April 10, 2013 1:29 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:05 AM
Points: 7,177, Visits: 13,629
Two key points: don't shrink the log after the load (it will only cost you next time around), and issue a CHECKPOINT after each COMMIT to clear completed transactions from the log.
Grant Fritchey has an excellent blog entry which may help.


Post #1440680
Posted Wednesday, April 10, 2013 1:59 AM
SSC Veteran


Group: General Forum Members
Last Login: Sunday, April 27, 2014 7:55 PM
Points: 218, Visits: 556
Thanks. I will check the performance and the size of the log after adding the checkpoint.

How do you write it?

BEGIN TRAN

COMMIT TRAN
CHECKPOINT

If I have two BEGIN TRAN / COMMIT TRAN blocks, do I have to put CHECKPOINT twice?
Post #1440688
Posted Wednesday, April 10, 2013 3:12 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:05 AM
Points: 7,177, Visits: 13,629
Krishna1 (4/10/2013)
Thanks. I will check the performance and the size of the log after adding the checkpoint.

How do you write it?

BEGIN TRAN

COMMIT TRAN
CHECKPOINT

If I have two BEGIN TRAN / COMMIT TRAN blocks, do I have to put CHECKPOINT twice?


Yes, that's correct. Successful transactions in the transaction log are of no further interest to you in simple recovery mode. SQL Server will get around to checkpointing in its own good time, but you don't want to wait; you want the log space to be reusable for the next batch.
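
A minimal sketch of the pattern being confirmed here, with two transactions and a CHECKPOINT after each COMMIT; the table and column names are placeholders:

BEGIN TRAN;
UPDATE dbo.TableA SET Col1 = 'x' WHERE Col1 <> 'x';
COMMIT TRAN;
CHECKPOINT;   -- in SIMPLE recovery, lets the log space used by this batch be reused

BEGIN TRAN;
DELETE FROM dbo.TableB WHERE Col2 = 'y';
COMMIT TRAN;
CHECKPOINT;   -- repeat after the second transaction as well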


Post #1440707
Posted Thursday, April 11, 2013 4:11 AM
Old Hand


Group: General Forum Members
Last Login: Tuesday, August 12, 2014 7:56 AM
Points: 318, Visits: 385
Just to add in my 2p-worth (after running into similar issues), if you're doing the insert/update/delete as separate transactions, have you considered using a MERGE statement to do everything in one go?
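
A minimal sketch of what a single MERGE could look like; the table names, join key, and columns are illustrative placeholders, not taken from this thread:

-- One MERGE statement instead of separate INSERT / UPDATE / DELETE passes
MERGE dbo.Target AS t
USING dbo.Source AS s
    ON t.PrimaryKeyCol = s.PrimaryKeyCol
WHEN MATCHED THEN
    UPDATE SET t.Col1 = s.Col1
WHEN NOT MATCHED BY TARGET THEN
    INSERT (PrimaryKeyCol, Col1) VALUES (s.PrimaryKeyCol, s.Col1)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE                                   -- removes target rows with no matching source row
OUTPUT $action, deleted.PrimaryKeyCol;       -- record what changed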
Post #1441169
Posted Friday, April 12, 2013 7:55 AM
SSC Eights!


Group: General Forum Members
Last Login: Thursday, August 7, 2014 7:16 AM
Points: 811, Visits: 1,161
Use batched INSERT/UPDATE/DELETE statements (smaller chunks) instead of doing it all in one big batch.
Post #1441711