Breaking a large transaction into smaller transactions

  • I work in a small group with access to a small SQL Server instance (on VMware). Most transactions we perform are very small (< 20K records). Every few weeks I must import and work with a large dataset of roughly 2.5 million records. Every time I do this I blow out our transaction log, and the DBA gets bent out of shape when he is forced to step in and fix the issue, not counting the "timeout" he gives us by fixing it only when it is convenient for him. :-)

    He has repeatedly asked us to break the work up into smaller transactions, but he hasn't explained how to do it... Can any DBAs out there give me guidance on how to take a large dataset and split the inserts/updates into smaller transactions? Adding extra storage is not an option, since we need the additional space so infrequently.

  • Break the insert up into smaller batches of rows, and run them sequentially.

    How you go about doing that depends on your import method.

    If what you're doing is a single bulk import, then maybe the database should be put into the BULK_LOGGED recovery model to minimize the impact on the log, and then you may not have to break up the import at all.
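    For example, a minimal sketch of wrapping a bulk load in a recovery model switch (MyDatabase and the backup path are placeholders; switching out of FULL recovery breaks the point-in-time log chain, so clear it with your DBA first):

        -- Placeholder database name; coordinate with your DBA before doing this.
        ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED;

        -- ... run the bulk import here ...

        ALTER DATABASE MyDatabase SET RECOVERY FULL;
        -- Take a log backup to restart the log backup chain (path is a placeholder).
        BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn';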

    Can you provide some details on what you're currently doing? If so, we can provide more specific advice.
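    In the meantime, here's one common pattern, sketched with made-up names (dbo.Staging as the table the raw import lands in, dbo.Target as the real destination): move the rows in batches, committing each batch so no single transaction has to log all 2.5 million rows.

        -- Sketch only: dbo.Staging, dbo.Target, and the column names are assumptions.
        DECLARE @BatchSize int = 50000;
        DECLARE @RowsMoved int = 1;

        WHILE @RowsMoved > 0
        BEGIN
            BEGIN TRANSACTION;

            -- Move one batch from staging to the target and remove it from
            -- staging in a single statement, keeping each transaction small.
            DELETE TOP (@BatchSize)
            FROM dbo.Staging
            OUTPUT deleted.Col1, deleted.Col2
            INTO dbo.Target (Col1, Col2);

            SET @RowsMoved = @@ROWCOUNT;

            COMMIT TRANSACTION;
        END;

    Note that in the FULL recovery model, committing in batches only helps if log backups run often enough to reclaim the space between batches, so your DBA may want to schedule those during the load.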

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • How do you do the import?

    If you use BULK INSERT (http://msdn.microsoft.com/en-us/library/ms188365.aspx), there's a BATCHSIZE parameter, which loads N rows in a transaction, commits, and then starts the next batch.
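    For instance, a sketch with placeholder names (the table, file path, and delimiters are assumptions):

        -- BATCHSIZE = 50000 commits every 50,000 rows as its own transaction,
        -- so the log can be backed up and reused between batches.
        BULK INSERT dbo.ImportTarget
        FROM 'C:\Imports\bigfile.csv'
        WITH (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n',
            BATCHSIZE = 50000
        );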

    There are similar options in SSIS (for example, the "Maximum insert commit size" setting on the OLE DB Destination).
