SQLServerCentral is supported by Redgate

Log Shipping Cannot Access the File

By Bryan Bitzer

Recently I encountered a problem with log shipping: the SQL Server restore log job was trying to restore a log file that was still being copied to the network share. The job returned the error: “The process cannot access the file because it is being used by another process.” The restore log job on the destination server was scheduled to run every hour, and the source production server backed up and shipped the logs every 15 minutes.

Normally this would not be a problem because the files were generally small enough to finish copying before the restore log job ran. However, on the production server I also had a nightly maintenance job that re-indexed five tables. That job produced a much larger transaction log backup file to ship, which in turn caused the problem for the restore log job on the destination server.

I had two options. One was to schedule the restore log job to run for only 18 hours a day, leaving it idle for the remaining 6 hours while the larger logs were expected to ship to the network share. The other was to test whether each file was accessible before restoring it. Because I like things to run consistently, I chose the second option. Below is the simple SSIS package I created to manage this.

  • Create a new folder called “Restore Logs” on the same physical drive as your Shipped Logs folder (the folder that receives the shipped logs from the source server). This is important because we are going to MOVE the files from the Shipped Logs directory to the Restore Logs directory. When the folders are on the same drive, the OS does not physically copy the file data; the move is just a rename.
  • Create a new SSIS package that will check and move logs.
  • Add three package scoped variables:
    • BKPFileName (String), set to xxx – receives the path and name of the current log file.
    • FileIsReady (Boolean), set to False – receives the status of the current log file.
    • RestoreLogs (String) – contains the directory the ready log files are moved to.
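As an aside, the same-drive move behavior mentioned above is easy to demonstrate. A short Python sketch (folder and file names here are illustrative, not from the package): on the same filesystem, `shutil.move` is just a rename, so no bytes are copied.

```python
import os
import shutil
import tempfile

# Illustrative stand-ins for the Shipped Logs and Restore Logs folders.
base = tempfile.mkdtemp()
shipped = os.path.join(base, "ShippedLogs")
restore = os.path.join(base, "RestoreLogs")
os.makedirs(shipped)
os.makedirs(restore)

# A dummy transaction log backup file.
src = os.path.join(shipped, "MyDb_tlog.bak")
with open(src, "wb") as f:
    f.write(b"dummy transaction log backup")

# shutil.move falls back to copy-and-delete across filesystems, but on
# the same filesystem it performs a rename: an O(1) metadata operation.
dst = shutil.move(src, restore)
```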


On the Control Flow Editor, add a For Each Loop Container and then edit the container with the following:

For the Collection

  • The Enumerator - For Each File Enumerator
  • Folder – The folder that contains the shipped logs
  • Files - *.bak (or whatever extension you use for transaction log backup files).

Foreach Loop Editor

For the Variable Mappings, in the dropdown under Variable, select User::BKPFileName. Leave Index as 0.

You then need to add a Script Task to the For Each Loop Container. Edit the Script Task to set:

  • ReadOnlyVariables – User::BKPFileName
  • ReadWriteVariables – User::FileIsReady
  • Next, click "Design Script" to open the script editor. Here we want to replace the code in ScriptMain with:

     Imports System
     Imports Microsoft.SqlServer.Dts.Runtime

     Public Class ScriptMain
         Public Sub Main()
             Dts.Variables("User::FileIsReady").Value = OpenFile(CStr(Dts.Variables("User::BKPFileName").Value))
             Dts.TaskResult = Dts.Results.Success
         End Sub

         ' Try to open the file; if another process still holds it
         ' (for example, the copy has not finished), the open fails.
         Private Function OpenFile(ByVal sFileName As String) As Boolean
             Dim bCanOpen As Boolean = False
             Try
                 FileOpen(1, sFileName, OpenMode.Binary, OpenAccess.Read, OpenShare.LockRead)
                 bCanOpen = True
             Catch ex As Exception
                 ' The file is still locked; leave bCanOpen = False.
             Finally
                 ' Release our own lock so the restore can read the file.
                 If bCanOpen Then FileClose(1)
             End Try
             Return bCanOpen
         End Function
     End Class

    The code above is setting the value of FileIsReady to the return value (True or False) from the function OpenFile(). If the OpenFile() function can open the file, it returns True otherwise it returns the default of False.
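    To make the logic concrete outside of SSIS, here is a rough Python sketch of the whole check-and-move loop. The function and folder names are my own, and the plain `os.open()` test is only a portable approximation of the stricter Windows file-sharing check the VB.NET code performs with `OpenShare.LockRead`.

    ```python
    import os
    import shutil

    def file_is_ready(path):
        """Approximate the script task's OpenFile(): try to open the file
        and report whether we succeeded. A file still being copied to the
        share will typically raise an OSError on Windows."""
        try:
            fd = os.open(path, os.O_RDONLY)
        except OSError:
            return False
        os.close(fd)
        return True

    def move_ready_logs(shipped_dir, restore_dir):
        """Mirror of the For Each Loop plus File System Task: move every
        *.bak file that can be opened from shipped_dir to restore_dir."""
        moved = []
        for name in sorted(os.listdir(shipped_dir)):
            src = os.path.join(shipped_dir, name)
            if name.lower().endswith(".bak") and file_is_ready(src):
                shutil.move(src, os.path.join(restore_dir, name))
                moved.append(name)
        return moved
    ```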

    Now we can save this task and return to the Control Flow tab and the For Each Loop. We need to add a File System Task to the For Each Loop Container. Then connect the Script Task to the File System Task, right-click the connector, and edit the precedence constraint as follows:

    • Evaluation Operation - Expression and Constraint
    • Value – Success
    • Expression - @[User::FileIsReady] == True

    Precedence Constraint Editor

    Next you need to edit the General tab of the File System Task. Set these values:

    • IsDestinationPathVariable – True
    • DestinationVariable – User::RestoreLogs
    • OverwriteDestination – True
    • Operation – Move File
    • IsSourcePathVariable – True
    • SourceVariable – User::BKPFileName

    File System Task Editor

      Now click OK and save your package. The control flow should now contain the For Each Loop Container with the Script Task flowing into the File System Task.


      You can save this package as you normally do; I usually save this type of package as a file in a separate directory alongside other SQL job-related folders. Finally, edit your log shipping restore job to run this package as its first step, then edit the second step of the job to restore the logs from the Restore Logs directory.

      There you have it: a fairly simple test for files before trying to open them. This little package is handy in other situations as well. For instance, if a SQL job refreshes an Excel report every night on a network share but the file was left open on an employee's PC, making it inaccessible, this package can catch that problem and take an alternative action.
