Measure Transaction size

Posted Monday, January 7, 2013 12:00 AM
All,

I have two queries, shown below.

#1)

DECLARE c1 CURSOR FOR ...

OPEN c1

FETCH c1 INTO ...

WHILE @@FETCH_STATUS = 0
BEGIN

    BEGIN TRAN

    INSERT t1 ...
    INSERT t2 ...
    INSERT t3 ...
    UPDATE t4 ...

    COMMIT

    FETCH c1 INTO ...

END

CLOSE c1
DEALLOCATE c1

The second version uses the same statements, but the location of BEGIN TRAN has changed:

#2)

BEGIN TRAN

DECLARE c1 CURSOR FOR ...

OPEN c1

FETCH c1 INTO ...

WHILE @@FETCH_STATUS = 0
BEGIN

    INSERT t1 ...
    INSERT t2 ...
    INSERT t3 ...
    UPDATE t4 ...

    FETCH c1 INTO ...

END

CLOSE c1
DEALLOCATE c1

COMMIT


Which one is better to have? How can I measure the transaction size for the second one? And which one is better if 2,000,000 records are processed within the cursor?

Note: a cursor is not the right tool for processing 2,000,000 records, but the requirement is like that.


karthik
Post #1403430
Posted Monday, January 7, 2013 3:18 AM


It depends.

If the inserts and updates are not inter-related, then I would suggest that you put them into separate transactions, as this means holding transaction locks for a shorter continuous stretch at any one time.

Whether or not the inserts are inter-related, if other transactions are supposed to read data from the tables being inserted into and updated, then it also depends on what isolation level you want to implement. From the sample code you provided, I see that you are working with four tables: you are inserting into three and updating one. IMHO, you should use separate transactions for the inserts and the update, as that holds locks for less time.

If you use the approach you posted, the transaction holds its locks until the entire transaction completes, and that can be really costly.
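
One rough way to see the difference in lock footprint: while the loop runs, count the locks its session is holding from a second connection. A sketch using sys.dm_tran_locks (the session id 52 is hypothetical; substitute the SPID running the cursor):

SELECT request_session_id,
       resource_type,
       request_mode,
       COUNT(*) AS lock_count
FROM sys.dm_tran_locks
WHERE request_session_id = 52    -- hypothetical SPID of the cursor session
GROUP BY request_session_id, resource_type, request_mode

With per-row transactions the counts should stay small and stable; with one big transaction they keep growing until the COMMIT (or until lock escalation collapses them into a table lock).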

The following links may be useful:

Transaction & Locking Architecture

Locking in the Database Engine


Vinu Vijayan

For better and faster solutions please check..."How to post data/code on a forum to get the best help" - Jeff Moden
Post #1403509
Posted Monday, January 7, 2013 4:38 AM


There's almost no difference between the two. Both are likely to cause HUGE issues. Updating that much data through a cursor AND individual INSERT and UPDATE statements is going to create a giant transaction. You'll need a very large log file. You'll also probably need a much larger tempdb. Plus, you're going to take a very large number of locks inside your database, leading to massive blocking of other processes.

In general, this is a very poor choice. The requirements may say this, but this is where you, as the technical resource, push back against the requirements in order to arrive at a technically supportable solution.
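
To put a number on "giant transaction", you can measure how much log space the open transaction has generated while the loop runs. A sketch using the transaction DMVs, run from the session that owns the transaction:

SELECT s.session_id,
       d.database_transaction_log_bytes_used,
       d.database_transaction_log_bytes_reserved
FROM sys.dm_tran_session_transactions AS s
JOIN sys.dm_tran_database_transactions AS d
     ON s.transaction_id = d.transaction_id
WHERE s.session_id = @@SPID

For the second version those numbers should keep climbing until the final COMMIT; for the first, they reset on every loop iteration.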


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1403539
Posted Monday, January 7, 2013 5:04 AM


Karthik, could you brief us on the requirement a bit? It won't take much effort, and you might get opinions that make the requirement easier for you, because in any given case this approach is going to kill your performance completely.

PS: Don't forget to set up some sample data and show us a working example.


Vinu Vijayan

For better and faster solutions please check..."How to post data/code on a forum to get the best help" - Jeff Moden
Post #1403548
Posted Monday, January 7, 2013 1:03 PM
I suggest a hybrid method.

Do something like the first method, except that you begin a new transaction only at certain record counts, for example, every 2,000 rows processed:

record #1; record #2001; record #4001; etc.

You can adjust the number up or down as needed to reduce locking.

Putting every row in a separate transaction is huge overhead in creating and committing transactions. Putting everything in one transaction is huge overhead if it has to roll back after doing a lot of inserts/updates. A sketch of the batching idea follows.
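
A minimal sketch of the batching, reusing the placeholder names from the original post (the source query, fetch list, and DML bodies are elided there, so they stay elided here; the batch size of 2000 is just an example):

DECLARE @rows int = 0

DECLARE c1 CURSOR LOCAL FAST_FORWARD FOR
    SELECT ...               -- source query elided in the original post

OPEN c1
FETCH c1 INTO ...

BEGIN TRAN                   -- open the first batch

WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT t1 ...
    INSERT t2 ...
    INSERT t3 ...
    UPDATE t4 ...

    SET @rows = @rows + 1

    -- commit and restart the transaction every 2000 rows
    IF @rows % 2000 = 0
    BEGIN
        COMMIT
        BEGIN TRAN
    END

    FETCH c1 INTO ...
END

COMMIT                       -- commit the final partial batch

CLOSE c1
DEALLOCATE c1

Committing mid-loop is safe here because the cursor is declared LOCAL and stays open across COMMITs under the default CURSOR_CLOSE_ON_COMMIT OFF setting.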


SQL DBA,SQL Server MVP('07, '08, '09)
"In America, every man is innocent until proven broke!" Brant Parker
Post #1403836