Log Shipping of Production DBs in an Off-Site Disconnected Environment

  • Hi,

We have some standalone Windows workstations in a workgroup, all running a SCADA application with SQL Server Express 2016 that writes real-time alarms to a SQL database (with occasional reads). We are procuring a server machine to be used as a centralized database/log server. It will also run SQL Server (Standard or Express, depending on the solution) and the SCADA application for reading the alarm DBs from all workstations. Communication between the workstations and the centralized station is strictly unidirectional, with no direct connection.

    We are considering custom Log Shipping to achieve synchronization.

For backing up, we can make use of Ola Hallengren's scripts and Task Scheduler for automated backups and copying to local drives.

For automated restores of all DBs, we are looking for good, reliable PowerShell or T-SQL scripts that can initiate a restore without closing the active connection (there will be only one). I don't have scripting experience in PowerShell or T-SQL.

  • I'm no DBA, but doesn't log shipping require a single primary DB and one or more secondary DBs, with the secondary DBs being (effectively) copies of the primary?

    Your requirement seems to be the opposite, with multiple secondaries feeding into a single primary. If so, I don't think log shipping can do that.

    Your comments about backing up the databases seem solid.

    Under what circumstances is an automated restore required?


  • Hi Phil,

My idea of log shipping is multiple primary databases (on the workstations, each with its own SQL Server instance) feeding multiple secondary databases on one physical server machine (a single SQL Server instance). It looks possible in theory, but I have no practical experience.

On the centralized server machine, regular restores are required to keep the DBs near real-time. One option is to run a restore of the full backup (if any) and all related T-log backups for each DB (possibly multi-threaded) and trigger this task every hour. Restores would be applied to 5 DBs at a time, so roughly by the 50-minute mark all 14 DBs would be restored before looping again for the next cycle. Do you think it's a good, workable strategy?
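To make the idea concrete, my understanding is that each cycle per DB would boil down to T-SQL along these lines (database and file names here are made up, and I have not tested this):

```sql
-- One cycle for one example database (all names/paths are placeholders).
-- Assumes the DB was initialized earlier with RESTORE DATABASE ... WITH NORECOVERY.
RESTORE LOG [AS1_VfiAlarm]
    FROM DISK = N'D:\Backups\AS1_VfiAlarm\LOG\AS1_VfiAlarm_0100.trn'
    WITH STANDBY = N'D:\Backups\AS1_VfiAlarm\AS1_VfiAlarm.tuf';

RESTORE LOG [AS1_VfiAlarm]
    FROM DISK = N'D:\Backups\AS1_VfiAlarm\LOG\AS1_VfiAlarm_0200.trn'
    WITH STANDBY = N'D:\Backups\AS1_VfiAlarm\AS1_VfiAlarm.tuf';
```

WITH STANDBY would keep the database readable by the SCADA application between cycles; WITH NORECOVERY would be used instead if no reads are needed until the chain is finished.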

     

If you are using standby mode to read from the log-shipped databases, you will need to kill any connections to the standby databases before you can restore additional logs.  You can choose not to kill the connections, but then the restore is prevented and the database falls behind.
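A minimal sketch of that kill step, assuming a hypothetical database name of AS1_VfiAlarm:

```sql
-- Build and run KILL statements for every session using the standby DB,
-- so the next RESTORE LOG can get the exclusive access it needs.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'KILL ' + CAST(session_id AS nvarchar(10)) + N'; '
FROM sys.dm_exec_sessions
WHERE database_id = DB_ID(N'AS1_VfiAlarm');  -- example database name

EXEC sys.sp_executesql @sql;
```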

For implementing a custom log shipping solution, you might find this tool I created useful; it can handle restores from a UNC path, Azure blob storage, or S3, and it's multi-threaded.  You will probably also want some monitoring.  I also created DBA Dash, which can be used to monitor log shipping (including custom implementations).

    Hope this helps

    DBA Dash - Free, open source monitoring for SQL Server

  • Hi David,

    Thanks for the tool.

    I will test it in a small setup and update.

     

  • Hi David,

I tested the tool in a lab environment, and I have a question. I could not find this info on the GitHub wiki, so I am posting it here.

Where can I enter the path of the target server's user DBs for the restore? In my setup, each user DB on the target server has a different path, and it is not the same path where the system DBs are stored.

A basic config would look like this, where we configure the destination (where we want the DBs to be restored) and the path of the log backups:

    LogShippingService.exe --Destination "Data Source=LOCALHOST;Integrated Security=True;Encrypt=True;Trust Server Certificate=True" --LogFilePath "\\BACKUPSERVER\Backups\SERVERNAME\{DatabaseName}\LOG"

Note that the path includes a {DatabaseName} placeholder. The actual paths to your DBs would be something like this:

    \\BACKUPSERVER\Backups\SERVERNAME\DB1\LOG

    \\BACKUPSERVER\Backups\SERVERNAME\DB2\LOG

    etc

The format of the paths can vary, but you should have a folder named after the database somewhere, e.g.

    \\BACKUPSERVER\Backups\SERVERNAME\LOG\{DatabaseName}

    The tool will loop through the databases in a restoring state on the destination server and apply the log backups from the LogFilePath, replacing the placeholder with the name of the database.
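To see which databases qualify on the destination, a query along these lines lists those in a restoring or standby state (this is just an illustration, not the tool's exact query):

```sql
-- Databases still accepting log restores: RESTORING state, or online
-- in read-only standby mode.
SELECT name, state_desc, is_in_standby
FROM sys.databases
WHERE state_desc = N'RESTORING'
   OR is_in_standby = 1;
```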

    The databases would already need to be initialized with a full backup restore, but the tool also has options to handle the initialization.  This is just a case of specifying --FullFilePath (and optionally --DiffFilePath).

    LogShippingService.exe --FullFilePath "\\BACKUPSERVER\Backups\SERVERNAME\{DatabaseName}\FULL" --DiffFilePath "\\BACKUPSERVER\Backups\SERVERNAME\{DatabaseName}\DIFF"

    There are other options for initialization.  A comma-separated list of paths can also be used.
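For reference, the manual equivalent of that initialization would be something like the following (the backup file names here are examples):

```sql
-- Restore the latest full backup, leaving the DB able to accept further restores.
RESTORE DATABASE [DB1]
    FROM DISK = N'\\BACKUPSERVER\Backups\SERVERNAME\DB1\FULL\DB1_FULL.bak'
    WITH NORECOVERY;

-- Optionally apply the latest differential on top of the full.
RESTORE DATABASE [DB1]
    FROM DISK = N'\\BACKUPSERVER\Backups\SERVERNAME\DB1\DIFF\DB1_DIFF.bak'
    WITH NORECOVERY;
```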

    Hope this helps

    DBA Dash - Free, open source monitoring for SQL Server

  • Hi David,

In my environment, the path for the backups (where copies of the backups are available) and the path where we would like to restore the DBs locally on the target server are different for each DB.

Is it possible to define the restore folder location for each DB in the tool as well?

I started the log shipping service for testing purposes, but the log file shows continuous iterations of log restores without any change to the DBs. Attaching the log file as reference. What do you suggest?

     

Hi, the tool works a bit differently from native log shipping.  There is no copy job that copies the logs from a share on the source instance to a folder on the target server; it restores the log backups directly from the path specified.  This is similar to how sp_AllNightLog used to work (it's now deprecated, and I wouldn't advise using it).

Looking at the log, you specified a list of databases, so only AS3_VfiAlarm & AS5_VfiAlarm will be restored.

    Do you have database backups in these folders?  Also check that the account running the service has access to these folders.

    \\DESKTOP-8DJV43U\SHARED\AS3_VfiAlarm\FULL

    \\DESKTOP-8DJV43U\SHARED\AS5_VfiAlarm\FULL

    \\DESKTOP-8DJV43U\SHARED\AS3_VfiAlarm\LOG

    \\DESKTOP-8DJV43U\SHARED\AS5_VfiAlarm\LOG

Do the databases already exist on DESKTOP-8DJV43U\CM?  If they don't, the tool should initialize them from the backups in the FULL folders and then start restoring the logs.

    Hope this helps

    DBA Dash - Free, open source monitoring for SQL Server

  • Hi David,

    Thanks for the quick reply and noted the info.

    Yes. The backups are already available on configured locations.

    Yes. The NT Service account has Read access to shared path (as per wiki). Tried with Full access as well.

    Yes. The Databases already exist on DESKTOP-8DJV43U\CM.

EDIT: I did one more test yesterday and removed the AS3 database from the target server instance to see if LogShippingService is able to create the AS3 DB from the full backup path. I got errors related to accessing the master DB. Please note that the NT Service account already has the dbcreator and sysadmin roles.

Can you please advise?

Hi, there is a problem with the permissions assigned to the account running the service.  If you are getting this error:

    CREATE DATABASE permission denied in database 'master'.

The account it's running as is not sysadmin. Ideally I would recommend using a domain user account (a gMSA is ideal), but I can get it working with Local System or Network Service.  Are you running the log shipping service on the destination SQL instance?  If you are running it on a different system, it might be the computer account it's running from that you need to grant access to.

    To install as NETWORK SERVICE you would use:

sc create "LogShippingService" binpath= "C:\LogShippingService\LogShippingService.exe" start= delayed-auto obj= "NT Authority\NetworkService"

    You would then need to create a login.
    CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
    -- using sysadmin for testing.  See https://github.com/trimble-oss/sql-log-shipping-service/wiki/Permissions

    ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT AUTHORITY\NETWORK SERVICE]

To install as Local System, which is the default, you can simply omit the account name:

sc create "LogShippingService" binpath= "C:\LogShippingService\LogShippingService.exe" start= delayed-auto

This worked out of the box for me in my lab.  I think this is because BUILTIN\Administrators has sysadmin there.

If a login has sysadmin, it has permission to do everything and doesn't require any additional permissions.  If the account is definitely sysadmin and you are getting permission errors, then the service must be connecting as a different account than the one you think it is.

For troubleshooting purposes, it's possible to just run LogShippingService as a console application.  It will run in the context of your own account, though.
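A quick sanity check you can run over any given connection to see which account SQL Server actually authenticated and whether it's sysadmin:

```sql
-- What login is this connection using, and is it sysadmin? (1 = member)
SELECT SUSER_SNAME()                  AS login_name,
       IS_SRVROLEMEMBER(N'sysadmin') AS is_sysadmin;
```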

    Hope this helps

    DBA Dash - Free, open source monitoring for SQL Server

  • Hi David,

I am using LogShippingService on the secondary server only. For the primary, I am using Ola Hallengren's scripts to take the backups, and the backup job is running smoothly.

I tried using a local admin account for LogShippingService, but it gives the same result.

I then tried running the console application under an admin account; it is unable to read the header info from the backup files and gives an access-denied error.

But if I do it manually from SQL Server Management Studio, I am able to restore the full backup and transaction log backups without any problem.

I'm not sure exactly what's going on, but if you are getting CREATE DATABASE permission denied in database 'master'. it seems like the user it's running as isn't sysadmin.  With a local admin account, you might need to use the "Run as administrator" option for it to get sysadmin when connecting to the SQL instance.
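As a comparison point, you could also reproduce the header read in T-SQL (the backup file name below is just an example).  Note that when issued through SQL Server, it's the database engine's service account that needs read access to the share, not the account running the query:

```sql
-- Read just the backup header; fails with an OS access-denied error
-- if the engine service account cannot read the share.
RESTORE HEADERONLY
    FROM DISK = N'\\DESKTOP-8DJV43U\SHARED\AS3_VfiAlarm\FULL\AS3_VfiAlarm.bak';
```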

As a test, you could try creating a SQL login and updating the connection string, e.g.

    CREATE LOGIN [LogShipping] WITH PASSWORD=N'Your_Password_Here'
    GO
    -- using sysadmin for testing. See https://github.com/trimble-oss/sql-log-shipping-service/wiki/Permissions
    ALTER SERVER ROLE [sysadmin] ADD MEMBER [LogShipping]
    GO

    Connection string:

    Data Source=LOCALHOST;UID=LogShipping;Password=Your_Password_Here;Encrypt=True;Trust Server Certificate=True

The password would be stored in plain text in this case, so I would recommend switching back to integrated security (a group managed service account is ideal) and reducing the permissions from sysadmin.  Still, it could be useful as a quick test to prove whether the issue is related to the user.

    Note: A restart of the service is required to pick up any changes in the config.

    DBA Dash - Free, open source monitoring for SQL Server
