PowerShell Database Backup Script

  • I agree with the points made in the previous posts, but I still think this was an excellent article. It was clear and concise without getting distracted by too much detail, and it allowed for the undefined location variable. It laid out the general sequence of the script and the objects and syntax needed to get started. We can all then look up the finer switches to add more control.

    I understand that running backups within the SQL Server GUI gives useful info, but that same info can also be collected by the script and logged to a simpler stats or monitoring DB. Having managed over 350 servers in one role, I believe that scripted processes are more controllable, predictable, repeatable, testable, and documentable than GUI tools. I also know that I may put a solution in place but am rarely around to run and support that process, so I also agree with Jeffrey's point re Keep It Simple ...

    Thanks again for the great article, Jimmy.

    JK


    Tks,

    JK

  • I'd like to make it perfectly clear that I never once said to run your backups through the GUI. Anyone who knows me knows I would never advocate such a thing.

    Watch my free SQL Server Tutorials at:
    http://MidnightDBA.com
    Blog Author of:
    DBA Rant – http://www.MidnightDBA.com/DBARant

    Minion Maintenance is FREE:

  • David Bird (1/5/2011)


    I could see using this script on a SQL Express installation without SQL Agent installed. Windows Scheduled Tasks would be used for scheduling it.

    I am not sure how I would use a SQL script to delete the old backups without using xp_cmdshell. So if I need to use DOS commands to delete the old backups, why not use PowerShell for the file deletes and the backups?

    Thanks Jimmy for the article

    If you're interested in using SQL Server to delete your old backup files without using xp_cmdshell, you could look into master.dbo.xp_delete_file:

    exec master.dbo.xp_delete_file
        0,               -- File type [(0) FileBackup | (1) FileReport]
        N'<folderPath>', -- Folder path to look in
        N'<ext>',        -- File extension to be deleted (do not precede with a period)
        N'<date>',       -- Date: all files meeting the criteria created before this date will be deleted
        1                -- Subfolder flag: indicates to delete from subfolders [1 | 0]

    Keep in mind that this procedure is undocumented and unsupported; use it at your own risk.

    I use it and it works fine for deleting native SQL backups. This procedure is what runs when a maintenance task for deleting backup files is scripted. Please do plenty of testing before implementation.
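
    Alternatively, to David's earlier point, the deletes can be done in PowerShell itself with no xp_cmdshell at all. A minimal sketch; the root path, extension, and 30-day retention are illustrative values, not from the article:

    $backupRoot = "D:\Backups"
    $cutoff = (Get-Date).AddDays(-30)
    Get-ChildItem -Path $backupRoot -Filter *.bak -Recurse |
        Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
        Remove-Item -WhatIf   # drop -WhatIf once you have confirmed what it selects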

    ______________________________________________________________________________________________
    Forum posting etiquette. Get your answers faster.

  • KenpoDBA (1/5/2011)


    Thanks Jeffrey...

    However, in the scenario you described, doing a native SQL backup and putting it into an Agent job is still the best solution. All you have to do now is call the job from a PS script. This way the DBAs can support the job from their end, and the Ops guys can support it from theirs. And you still get all the benefits of having it in SQL.

    Just because a process can use PS doesn't mean the entire solution has to be written in PS. You can use it just as a step in the process.

    KenpoDBA,

    I did not state this in my previous post, but we were just calling the native backup via PowerShell. The only disadvantage is that the user in question does not get a progress bar. For me that is not an issue, because I have a pretty good idea of how long a process will take to complete, but for the Ops guys that may not be the case.

    In all honesty, I'm not as worried about the Ops guys as I am about the business users. It is almost worth writing a WinForm front end that can display the progress and maybe the time remaining. I've had a number of business users panic when an ad hoc backup of a 150 GB database did not finish in 3 minutes!
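
    For what it's worth, the SMO Backup object does raise a PercentComplete event, so a script (or a WinForm front end) can surface progress without the GUI. A minimal sketch, assuming the SMO assemblies are already loaded; the server, database, and file path are illustrative:

    $server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
    $backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
    $backup.Database = "AdventureWorks"                    # illustrative database name
    $backup.Devices.AddDevice("D:\Backups\AdventureWorks.bak", "File")
    $backup.PercentCompleteNotification = 10               # fire the event every 10 percent
    $backup.add_PercentComplete({ param($sender, $e) Write-Host "$($e.Percent)% complete" })
    $backup.SqlBackup($server)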

    Ah, well. Have to love the users or I would not be employed.

    Regards, Irish 

  • Yeah, I understand that, but it's not a good idea to allow business users to back up DBs to begin with. They put backups in odd locations and fill up the drive. One guy doesn't know another guy just took a backup, and the next thing you know your resources are being spent taking one backup after another.

    A much better way to handle it is to train them to email the DBAs when they need a backup... or better yet, schedule it for them. A progress bar isn't necessary then, and a DBA can tune the backup to perform better anyway. I try never to let anyone outside the DBA group back up a DB if at all possible. It only brings heartbreak later when they call us to fix whatever they've done wrong. So I put in some procedures that make it easy for us to back them up should they need something outside the regular scheduled process, and then just respond to those emails quickly. If you do it right, it only takes a few secs to kick off the backup, so it's little burden on you.

    So just explain to them that you've got some intricate things in place that have to be done a certain way (space mgmt, log backups, etc.) and you need to make sure nobody is taking backups out-of-cycle... and explain to them that they don't wanna be spending their time taking backups anyway. They usually agree as long as they get good service. Have them try it your way on a trial basis if they're concerned. The first couple of times are the hardest; then they find they actually like not having to mess with it.

    As for the article, I agree with the others that it was quite well-written. It was clear, methodical, and said what it needed to say. The only problem I have is with the content. It's just not something that needs to be taught. It's like having a very well-written article on how to turn on autoShrink for all your DBs.

    Watch my free SQL Server Tutorials at:
    http://MidnightDBA.com
    Blog Author of:
    DBA Rant – http://www.MidnightDBA.com/DBARant

    Minion Maintenance is FREE:

  • KenpoDBA (1/6/2011)


    Yeah, I understand that, but it's not a good idea to allow business users to back up DBs to begin with. They put backups in odd locations and fill up the drive. One guy doesn't know another guy just took a backup, and the next thing you know your resources are being spent taking one backup after another.

    A much better way to handle it is to train them to email the DBAs when they need a backup... or better yet, schedule it for them. A progress bar isn't necessary then, and a DBA can tune the backup to perform better anyway. I try never to let anyone outside the DBA group back up a DB if at all possible. It only brings heartbreak later when they call us to fix whatever they've done wrong. So I put in some procedures that make it easy for us to back them up should they need something outside the regular scheduled process, and then just respond to those emails quickly. If you do it right, it only takes a few secs to kick off the backup, so it's little burden on you.

    So just explain to them that you've got some intricate things in place that have to be done a certain way (space mgmt, log backups, etc.) and you need to make sure nobody is taking backups out-of-cycle... and explain to them that they don't wanna be spending their time taking backups anyway. They usually agree as long as they get good service. Have them try it your way on a trial basis if they're concerned. The first couple of times are the hardest; then they find they actually like not having to mess with it.

    As for the article, I agree with the others that it was quite well-written. It was clear, methodical, and said what it needed to say. The only problem I have is with the content. It's just not something that needs to be taught. It's like having a very well-written article on how to turn on autoShrink for all your DBs.

    Oh, how I wish it were that easy to limit the ability to create backups to database admins. While I agree it is risky, sometimes there is little choice.

    I work for a software company and support a large number of corporations in a specific industry (I cannot be more specific than that). Due to the nature of the business and the varying sizes of the corporations (500 to 40,000 employees), each one has a different skill set.

    I could tell you some nightmarish stories about recoveries I have done in the past, but we all have war stories. We have to apply the KISS principle very frequently just to ensure that backups are being created in the first place! 😀

    Anyway, I too agree that the article is well written and provides a great base to start from. PowerShell is a fantastic tool that I find to be underutilized.

    Regards, Irish 

  • KenpoDBA (1/5/2011)


    Thanks Jeffrey...

    However, in the scenario you described, doing a native SQL backup and putting it into an Agent job is still the best solution. All you have to do now is call the job from a PS script. This way the DBAs can support the job from their end, and the Ops guys can support it from theirs. And you still get all the benefits of having it in SQL.

    Just because a process can use PS doesn't mean the entire solution has to be written in PS. You can use it just as a step in the process.

    While I do see a place for PowerShell and SQL to mingle in some respects (basically where there is an order-of-magnitude benefit to justify the effort of using PowerShell), for the most part I do agree to let SQL do what it does best. There is a place for PowerShell, for example in creating an archiving strategy after the SQL backups are completed (where the zip feature would be useful, combined with retrieval of timestamps).

    However, the one glaring case where this script can be very useful is SQL Server Express, which rules out SQL maintenance jobs because it does not let you run SQL Agent.
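
    For that Express case, the Windows task scheduler fills the gap, as David mentioned earlier in the thread. A rough sketch, run from an elevated prompt; the task name, time, and script path are illustrative:

    # Run the backup script nightly at 2 AM via Windows Task Scheduler
    schtasks /create /tn "SqlExpressBackup" /sc daily /st 02:00 /tr "powershell.exe -NoProfile -File C:\Scripts\Backup-Databases.ps1"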

    Gaby
    ________________________________________________________________
    "In theory, theory and practice are the same. In practice, they are not." - Albert Einstein

  • This script is genius, but I'm having some issues with it. I've managed to get it to run on my XP machine running SQL 2005, but when I change out the DB names and the server and run it on a Server 2008 box, I run into issues.

    I've got Server 2008 running SQL Express 2005 and another 2008 machine running SQL Express 2008. I ran into this same error on my XP machine, but there it was a result of the path not existing. I have created the path on the two servers.

    The error message I'm getting is:

    Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server 'server\instance,port'."
    At line:1 char:20
    + $dbbackup.sqlbackup <<<< ($server);
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

    Any assistance would be much appreciated.

    As a side note, I've already tried this: http://sqlaj.wordpress.com/2010/10/10/error-running-sql-2008-smo-sqlbackup/

  • Off the top of my head, that looks like it is looking for a Service Principal Name. What security context are you using to connect to the remote box? If it is Kerberos, you need to set up an SPN for the SQL service. See this article for a start ...

    http://blogs.technet.com/b/askds/archive/2008/06/11/kerberos-authentication-problems-service-principal-name-spn-issues-part-3.aspx

    JK


    Tks,

    JK

  • After much searching, I found a post by Allen White that suggested using a trap to catch the inner exception.

    http://sqlblog.com/blogs/allen_white/archive/2009/06/08/handling-errors-in-powershell.aspx

    Turns out the Microsoft.SqlServer.BatchParser assembly, Version=9.0.242.0, was "missing." Apparently it's a 64-bit box with 32-bit SQL, and I was using the 32-bit PowerShell. Switching to 64-bit PowerShell was the ultimate fix.
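
    For anyone who hits the same wall, the trap approach from Allen's post looks roughly like this. A sketch only, assuming $dbbackup and $server are set up as in the article:

    trap [Exception] {
        # Walk the InnerException chain down to the real SQL error
        $ex = $_.Exception
        while ($ex.InnerException) { $ex = $ex.InnerException }
        Write-Host "Backup failed: $($ex.Message)"
        continue   # resume the script instead of aborting
    }
    $dbbackup.SqlBackup($server)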

  • Actually, there are business cases where such a script is quite useful. I work for a global corporation with over 700 instances of SQL Server spread across the globe, and we are fewer than 20 SQL DBAs.

    We don't like turning on xp_cmdshell on all our servers, as it is a security problem. But our backup script needs to create subdirectories, create a log file for each day, move those log files to a central logging server, archive old log files on that server, delete even older log files, delete old backups, etc.

    As you can see, pure T-SQL without xp_cmdshell will not help. A stored procedure written in .NET could, but I don't like loading assemblies on every server either. And if I have to run a T-SQL script from a batch file which does all the work for me, then why not use PS? It's much more comfortable to write such operations in PS than in standard DOS batches.
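
    To illustrate, the directory and log handling is only a few lines of PS (the old-backup purge was sketched earlier in the thread). The local paths and the UNC share here are illustrative:

    $today = Get-Date -Format yyyyMMdd
    New-Item -ItemType Directory -Path "D:\Backups\$today" -Force | Out-Null   # dated subdirectory
    Add-Content -Path "D:\Logs\backup_$today.log" -Value "$(Get-Date) starting backups"
    # Ship older logs to the central logging server
    Get-ChildItem "D:\Logs" -Filter *.log |
        Where-Object { $_.Name -ne "backup_$today.log" } |
        Move-Item -Destination "\\logserver\sqlbackuplogs"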

  • That's what I'm talkin' about ...


    Tks,

    JK

  • The only thing I don't know yet is error handling. Does SMO forward errors that happen in SQL Server while the backup runs to PowerShell, so you can log them?

  • Here is a link to a good discussion on that subject. It seems SMO passes a value back, from the Restore at least, which the code uses to test for success. JK

    http://social.msdn.microsoft.com/Forums/en-US/sqldisasterrecovery/thread/2eb6810d-160f-4e65-a1ff-1d25eb1b0a5e
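
    In short, Restore.SqlVerify() hands back a Boolean you can test or log. A short sketch, assuming $server is already built as in the article; the backup path is illustrative:

    $restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
    $restore.Devices.AddDevice("D:\Backups\AdventureWorks.bak", "File")
    if ($restore.SqlVerify($server)) { "Backup verified OK" } else { "Verification FAILED" }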


    Tks,

    JK

  • Hi all,

    I am using the idea of the script you mentioned to verify the status of backup files in different locations, much like RESTORE VERIFYONLY.

    The file 'C:\Temp\Paths.txt' holds the backup file locations of different servers. The code works perfectly when I use one path, but it fails to execute when more than one path is specified.

    Please help me with what modifications need to be made to the script in order to verify the backup file status of the different locations.

    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

    $server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ("servername")
    $dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
    $dbRestore = New-Object ("Microsoft.SqlServer.Management.Smo.Restore")

    $a = Get-Content "C:\Temp\paths.txt"
    foreach ($path in $a)
    {
        $dbRestore.Devices.AddDevice($path, "File")
        if (!($dbRestore.SqlVerify($server)))
        {
            $body = $path
            $emailFrom = "email"
            $emailTo = "email"
            $subject = "Backup Files Health"
            $smtpServer = "serverip"
            $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
            $smtp.Send($emailFrom, $emailTo, $subject, $body)
            Exit
        }
    }

    Thanks

    Vinay
