SQLServerCentral is supported by Red Gate Software Ltd.
Will using temp tables as working / staging tables reduce the log backup size?
Author
Message
Posted Tuesday, July 23, 2013 9:22 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: 2 days ago @ 7:20 PM
Points: 100, Visits: 501
I just want to clarify something.
I am trying to reduce the size of the log file backups created by certain processes at certain times of the day.

I understand logging will and should occur, but:

If a process inserts, updates, etc. data in tempdb temp tables for staging, then I assume that although this activity is logged to the tempdb log file, it will not get backed up (presumably because the tempdb log file is never backed up).

So my question is: if a user is creating large log files, and therefore large log file backups, by using normal tables as working / staging tables, can I reduce this by requesting that working / staging work be done in tempdb temporary tables?

thanks
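For illustration, here is a minimal sketch of the pattern being asked about: staging the work in a tempdb temp table rather than a permanent table. All table and column names are hypothetical, not from any real system.

```sql
-- Hypothetical staging step. The #temp table lives in tempdb,
-- so its log records go to the tempdb log, which is never backed up.
CREATE TABLE #StagingPoints
(
    CustomerID int NOT NULL,
    Points     int NOT NULL
);

INSERT INTO #StagingPoints (CustomerID, Points)
SELECT CustomerID, Points
FROM   dbo.LoyaltyTransactions;   -- hypothetical source table

-- ... intermediate transforms / updates against #StagingPoints ...

-- Only this final write is logged in the user database,
-- so only it contributes to the user database's log backups.
INSERT INTO dbo.PointsSummary (CustomerID, Points)
SELECT CustomerID, Points
FROM   #StagingPoints;

DROP TABLE #StagingPoints;
```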

Post #1476863
Posted Wednesday, July 24, 2013 4:12 AM
SSCrazy


Group: General Forum Members
Last Login: Friday, May 30, 2014 6:27 PM
Points: 2,808, Visits: 7,175
Rather than put more work on tempdb, you could create them a dedicated staging database and set it to the SIMPLE recovery model, so that log backups are not needed for it.
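A minimal sketch of that suggestion, assuming the database name Staging and default file settings:

```sql
-- Dedicated staging database in the SIMPLE recovery model.
CREATE DATABASE Staging;
GO
ALTER DATABASE Staging SET RECOVERY SIMPLE;
GO
-- Work done here is still fully logged, but under SIMPLE recovery
-- the log truncates on checkpoint and the database takes no log
-- backups, so none of this activity inflates log backup sizes.
```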
Post #1476959
Posted Wednesday, July 24, 2013 5:02 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 9:47 AM
Points: 13,872, Visits: 28,270
Are they inserting data into the regular tables and then updating it in place as part of a single operation?

If so, your approach could possibly work. But what about just eliminating the secondary update process instead? Figure out all the data up front and make it a single batch process rather than multi-pass operations.
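As a rough illustration of the single-pass idea (all object names hypothetical): instead of inserting rows and then running a second UPDATE to fill in a derived column, compute everything in one INSERT ... SELECT, so each row is logged once instead of twice.

```sql
-- Multi-pass version (each pass is a separate logged operation):
--   INSERT INTO dbo.Target (CustomerID, Points) SELECT ... ;
--   UPDATE dbo.Target SET Tier = ... ;

-- Single-pass version: derive the tier during the insert.
INSERT INTO dbo.Target (CustomerID, Points, Tier)
SELECT  CustomerID,
        SUM(Points),
        CASE WHEN SUM(Points) >= 10000 THEN 'Gold'
             WHEN SUM(Points) >=  5000 THEN 'Silver'
             ELSE 'Bronze'
        END
FROM    dbo.LoyaltyTransactions   -- hypothetical source table
GROUP BY CustomerID;
```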


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server Query Performance Tuning
SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1476985
Posted Wednesday, July 24, 2013 1:46 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: 2 days ago @ 7:20 PM
Points: 100, Visits: 501
Rather than put more work on tempdb, you could create them a dedicated staging database and set it to the SIMPLE recovery model, so that log backups are not needed for it.


Yes, that is a good idea and would probably work.

If so, your approach could possibly work. But what about just eliminating the secondary update process instead? Figure out all the data up front and make it a single batch process rather than multi-pass operations.


The process is quite complex from a business perspective, i.e. working out loyalty points expiry for various periods and tiers.
So although it might be possible to do it in a single batch, I guess it has been broken down into many staging tables for debugging / checking reasons.
Post #1477249