SQLServerCentral Article

How to Set Up Log Shipping for Multiple Databases


Log shipping is a very useful tool for accomplishing an array of tasks, such as offloading reporting activity or initializing database mirroring. In my environment I am using it as part of a high availability / disaster recovery setup.

When using log shipping for one or just a few databases, Microsoft's out-of-the-box wizard setup works great. However, when you need to log ship a large number of databases, issues can arise with the default setup. The main problem I found with setting up log shipping for a large number of databases is that each database participating in log shipping requires 3 SQL Agent jobs:

  • One to backup the log on the primary database
  • One to copy that log backup to another location accessible to the secondary database, and
  • One to restore that log backup to the secondary database.

In my environment I have 130 databases participating in log shipping, so that's a total of 390 jobs. I originally set up these jobs to run every 15 minutes, so that's 390 jobs attempting to launch at the same time. I have found 2 distinct issues with this setup.

  1. In a regular maintenance plan or scripted backup of transaction log files, each log is backed up 1 at a time. With log shipping, all of the logs are backed up at the same time. This can create a significant burden on system resources like CPU and memory.
  2. With the backup, copy, and restore jobs all set to the same interval (every 15 minutes in my case), a log backup must wait one full interval to be copied and another to be restored, so the secondary database only restores a given backup 3X the interval value after it was taken (45 minutes).

The Solution

The solution that I have found to this issue is to set up only a single job for each of the backup, copy, and restore tasks and to stagger the intervals at which they run. To accomplish this I start by creating the following 3 jobs.

LSBackup_ALL_DBs

This job runs on the primary server at an interval of every 15 minutes. Here's the code it runs:

DECLARE @ID uniqueidentifier;
DECLARE @CMD varchar(4000);

-- Loop over every primary database registered for log shipping
DECLARE cursor_log CURSOR FAST_FORWARD FOR
SELECT primary_id FROM msdb.dbo.log_shipping_primary_databases;
OPEN cursor_log;
FETCH NEXT FROM cursor_log INTO @ID;
WHILE @@FETCH_STATUS = 0
BEGIN
 -- Invoke the log shipping backup for this database via sqllogship.exe
 SET @CMD = '"D:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqllogship.exe" -Backup ' + CAST(@ID AS varchar(50)) + ' -server SQL_Primary_Server';
 EXEC xp_cmdshell @CMD;
 FETCH NEXT FROM cursor_log INTO @ID;
END
CLOSE cursor_log;
DEALLOCATE cursor_log;

I realize some of you may balk at the cursor above, but this is a perfectly acceptable use for a cursor given the intended goal.

LSCopy_ALL_DBs

This job runs on the secondary server at an interval of every 5 minutes. Here's the code it runs:

DECLARE @ID uniqueidentifier;
DECLARE @CMD varchar(4000);

-- Loop over every secondary database registered for log shipping
DECLARE cursor_log CURSOR FAST_FORWARD FOR
SELECT secondary_id FROM msdb.dbo.log_shipping_secondary_databases;
OPEN cursor_log;
FETCH NEXT FROM cursor_log INTO @ID;
WHILE @@FETCH_STATUS = 0
BEGIN
 -- Copy this database's latest log backups to the secondary's copy folder
 SET @CMD = '"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqllogship.exe" -Copy ' + CAST(@ID AS varchar(50)) + ' -server SQL_Secondary_Server';
 EXEC xp_cmdshell @CMD;
 FETCH NEXT FROM cursor_log INTO @ID;
END
CLOSE cursor_log;
DEALLOCATE cursor_log;

LSRestore_ALL_DBs

This job also runs on the secondary server at an interval of every 10 minutes. Here's the code it runs:

DECLARE @ID uniqueidentifier;
DECLARE @CMD varchar(4000);

-- Loop over every secondary database registered for log shipping
DECLARE cursor_log CURSOR FAST_FORWARD FOR
SELECT secondary_id FROM msdb.dbo.log_shipping_secondary_databases;
OPEN cursor_log;
FETCH NEXT FROM cursor_log INTO @ID;
WHILE @@FETCH_STATUS = 0
BEGIN
 -- Restore any copied log backups that have not yet been applied
 SET @CMD = '"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqllogship.exe" -Restore ' + CAST(@ID AS varchar(50)) + ' -server SQL_Secondary_Server';
 EXEC xp_cmdshell @CMD;
 FETCH NEXT FROM cursor_log INTO @ID;
END
CLOSE cursor_log;
DEALLOCATE cursor_log;

Looking at the jobs above, you can see that we are simply taking advantage of the tables SQL Server uses to manage log shipping.
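If you want to see what those cursors iterate over, you can query the log shipping catalog tables in msdb directly. A quick sketch (run the first query on the primary server, the second on the secondary):

-- On the primary server: one row per log shipped database
SELECT primary_id, primary_database, backup_job_id
FROM msdb.dbo.log_shipping_primary_databases;

-- On the secondary server: one row per log shipped database
SELECT secondary_id, secondary_database, restore_delay
FROM msdb.dbo.log_shipping_secondary_databases;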

NOTE: In the scripts above replace "SQL_Primary_Server" and "SQL_Secondary_Server" with the instance name for your primary and secondary servers respectively.
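One assumption worth calling out: the jobs above shell out via xp_cmdshell, which is disabled by default on SQL Server. If it is off in your environment, it needs to be enabled first (requires sysadmin; review this against your security policy):

-- Enable xp_cmdshell so the jobs above can call sqllogship.exe
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;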

The next step is to set up all our databases to use log shipping. To do this I used the wizard in Management Studio to create the first one by right clicking on the database, selecting Properties, and then selecting the Transaction Log Shipping tab. I set this up normally, letting Management Studio create the additional backup, copy, and restore jobs. The only difference is that I set the jobs to be disabled in the wizard so that they wouldn't interfere with the custom jobs I created above.

Once I completed the first database, I used the same wizard in Management Studio to script the solution out. Then I just needed to change the database name in the script for each database I was setting up log shipping for and run the script, with one important exception: the script does not automatically create the new database for you on the secondary server or restore the backup. Below is the piece I added to the script to get this done.

This script is run on the secondary server before you setup log shipping and requires a linked server to the primary. Also, obviously you'll need to change the path to where the database gets restored to match your environment.

I should also note that this will not work for databases that have more than one data file, since the script below only captures a single logical data file name.

-- ****** CREATE DATABASE FROM LATEST BACKUP ON PRODUCTION *******
DECLARE @backup_set_id int, @media_set_id int
DECLARE @backup nvarchar(260);
DECLARE @log nvarchar(128), @data nvarchar(128)
SELECT TOP 1 @backup_set_id = backup_set_id, @media_set_id = media_set_id
 FROM [SQL_Primary_Server].msdb.dbo.backupset
 WHERE database_name = '<db,,AddYourDBNameHere>'
 AND type = 'D'
 ORDER BY backup_finish_date DESC;
SELECT @backup = physical_device_name 
 FROM [SQL_Primary_Server].msdb.dbo.backupmediafamily 
 WHERE media_set_id = @media_set_id
SELECT @log = logical_name 
 FROM [SQL_Primary_Server].msdb.dbo.backupfile 
 WHERE backup_set_id = @backup_set_id
 AND file_type = 'L'
SELECT @data = logical_name 
 FROM [SQL_Primary_Server].msdb.dbo.backupfile
 WHERE backup_set_id = @backup_set_id
 AND file_type = 'D'
RESTORE DATABASE [<db,,AddYourDBNameHere>] FROM DISK = @backup WITH FILE = 1, 
 MOVE @data TO N'S:\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\<db,,AddYourDBNameHere>.MDF',
 MOVE @log TO N'S:\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\<db,,AddYourDBNameHere>_1.LDF',
 STANDBY = N'S:\MSSQL10_50.MSSQLSERVER\MSSQL\Backup\ROLLBACK_UNDO_<db,,AddYourDBNameHere>.BAK',
 NOUNLOAD, STATS = 10
GO
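Since the script above only handles a single data file, you may want to check up front (on the primary server) which databases would need a hand-written restore. Something like this, using sys.master_files, should surface them:

-- Databases with more than one data (ROWS) file need a manual RESTORE ... MOVE
SELECT d.name, COUNT(*) AS data_files
FROM sys.master_files AS mf
JOIN sys.databases AS d ON d.database_id = mf.database_id
WHERE mf.type_desc = 'ROWS'
GROUP BY d.name
HAVING COUNT(*) > 1;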

The last issue you need to deal with while setting up the "default" log shipping is the transaction log backups taken between your last full backup and the time log shipping started. Copy these log backups to the same location where your copy job is putting them; once they are there, the restore job will pick them up and apply them.
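As a sketch of that one-time catch-up copy (both paths here are placeholders for the folders in your environment), you could use robocopy through xp_cmdshell, just as the jobs above do:

-- One-time copy of existing log backups to the copy job's destination folder.
-- Replace both paths with the ones used in your environment.
DECLARE @CMD varchar(4000);
SET @CMD = 'robocopy "\\SQL_Primary_Server\LogBackups" "\\SQL_Secondary_Server\LogCopies" *.trn';
EXEC xp_cmdshell @CMD;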

The final step in setting up log shipping for multiple databases is to point all the log shipped databases at the new job IDs and then delete the now-useless per-database jobs. The script below sets all the job IDs in the log shipping tables to point to the single jobs we created.

-- ********* RUN THIS ON THE PRIMARY SERVER ************
DECLARE @bkup uniqueidentifier;
SELECT @bkup = job_id FROM msdb.dbo.sysjobs WHERE name = 'LSBackup_ALL_DBs';
UPDATE msdb.dbo.log_shipping_primary_databases 
SET backup_job_id = @bkup;
GO
-- ********* RUN THIS ON THE SECONDARY SERVER **************
DECLARE @cpy uniqueidentifier,@restore uniqueidentifier;
SELECT @cpy = job_id FROM msdb.dbo.sysjobs WHERE name = 'LSCopy_ALL_DBs';
SELECT @restore = job_id FROM msdb.dbo.sysjobs WHERE name = 'LSRestore_ALL_DBs';
UPDATE msdb.dbo.log_shipping_secondary 
SET copy_job_id = @cpy
 , restore_job_id = @restore;
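To finish the cleanup, the disabled per-database jobs the wizard created can be dropped. A sketch of that step (this assumes the wizard's default LSBackup_/LSCopy_/LSRestore_ naming and that the only jobs ending in _ALL_DBs are the three custom ones; run it on each server):

-- Drop the disabled wizard-created jobs, keeping the three _ALL_DBs jobs
DECLARE @job sysname;
DECLARE cursor_jobs CURSOR FAST_FORWARD FOR
SELECT name FROM msdb.dbo.sysjobs
WHERE (name LIKE 'LSBackup[_]%' OR name LIKE 'LSCopy[_]%' OR name LIKE 'LSRestore[_]%')
 AND name NOT LIKE '%[_]ALL[_]DBs';
OPEN cursor_jobs;
FETCH NEXT FROM cursor_jobs INTO @job;
WHILE @@FETCH_STATUS = 0
BEGIN
 EXEC msdb.dbo.sp_delete_job @job_name = @job;
 FETCH NEXT FROM cursor_jobs INTO @job;
END
CLOSE cursor_jobs;
DEALLOCATE cursor_jobs;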

I hope you guys found this useful. If you have any questions or comments let me know.
