Backups to a network share or local disk

  • The only real way to tell if a backup is good is to restore the file.

Gethyn Ellis, www.gethynellis.com

  • Gethyn is correct. There is advice from various sources, including SQL CAT, that you restore each backup, each day, to a test machine to ensure it will work. I think that's a bit of overkill, but you ought to restore them all periodically to test. You can also run RESTORE VERIFYONLY (http://msdn.microsoft.com/en-us/library/ms188902.aspx) as a quick way of checking things, but I would not rely exclusively on this.
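    As a sketch of that two-level check, a verify-then-test-restore pass might look like this in T-SQL (the share path, file names, and database names here are placeholders, not anything from this thread):

    ```sql
    -- Quick sanity check of the backup file: readable header and, if the backup
    -- was taken WITH CHECKSUM, the page checksums. Not proof the restore works.
    RESTORE VERIFYONLY
    FROM DISK = N'\\backupshare\sql\MyDatabase_full.bak';

    -- The real test: restore to a scratch database on a test server.
    RESTORE DATABASE MyDatabase_Test
    FROM DISK = N'\\backupshare\sql\MyDatabase_full.bak'
    WITH MOVE N'MyDatabase'     TO N'D:\TestRestore\MyDatabase_Test.mdf',
         MOVE N'MyDatabase_log' TO N'D:\TestRestore\MyDatabase_Test.ldf',
         REPLACE;
    ```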

  • I have a question.

    If a database becomes suspect, I set it to emergency mode and issue these commands:

    ALTER DATABASE db1 SET EMERGENCY

    ALTER DATABASE db1 SET SINGLE_USER

    DBCC CHECKDB (db1, REPAIR_ALLOW_DATA_LOSS)

    ALTER DATABASE db1 SET MULTI_USER

    But the database is still in emergency mode. How can I return it to normal mode? And by mistake I have lost the backup.

    Please guide me.

    Thanks.

  • Use the ALTER DATABASE command to set the database back to online mode.

    http://msdn.microsoft.com/en-US/library/ms174269%28v=SQL.90%29.aspx
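    A minimal sketch, using the db1 name from the question:

    ```sql
    -- Bring the database back from EMERGENCY to normal (ONLINE) state
    ALTER DATABASE db1 SET ONLINE;

    -- If it is still in single-user mode from the repair step, also run:
    ALTER DATABASE db1 SET MULTI_USER;
    ```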

  • I have done that, but when I execute

    ALTER DATABASE db1 SET ONLINE

    the query runs for 2-3 hours and never completes.

  • This is a different problem; repost it in the corruption forum and you will get a better response. If you don't have a backup, your options will be limited.

    Find out from the error logs why the database is suspect; it might just be that a database file has been moved.
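    A first look at the state and the error log might be something like this (xp_readerrorlog is undocumented but widely used; the search string is the database name from the question):

    ```sql
    -- Current state of the database (SUSPECT, EMERGENCY, RECOVERY_PENDING, ...)
    SELECT name, state_desc FROM sys.databases WHERE name = N'db1';

    -- Search the current SQL Server error log for messages mentioning it
    EXEC sys.xp_readerrorlog 0, 1, N'db1';
    ```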


  • One thing to keep in mind is that the account used to run the job (the account the SQL Server Agent service runs under) has to have access to that network share. This is one reason Microsoft recommends the service account be a domain user, so it can be granted access to those resources.

    With a local machine account it is very difficult to get access to a network share to send data to.

    Here is my nightly (and hourly for tran logs) backup process for each database:

    1) set a cleanup task to cull the backups in the specific database's backup directory to just 2 days' worth.

    2) set backups to run on server at certain times (say, this database at 1am, this one at 1:15am, that one at 1:30am, etc.).

    3) run a job every hour at 45 minutes past the hour that runs a program that I wrote in VB.net that looks in each given directory in a command line parameter and checks each file to see if it is in the destination directory (file share). If not, it copies it there. If it is already there, it ignores it and goes to the next file.

    That way, the backups go offsite (in my case 1/2 mile away to a Data Domain dedup system) every hour. But, I still have 2 days' worth of backups on the server for speed in case anything happens. I can keep the backups on the Data Domain as long as I want (that I have room for). I wrote a program to cull that out, also (former programmer, has come in handy as a DBA).
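    The check-and-copy logic in step 3 can be sketched in a few lines. This is a generic sketch, not the author's VB.net program; directory names are parameters:

    ```python
    import shutil
    from pathlib import Path

    def sync_new_backups(source_dir: str, dest_dir: str) -> list[str]:
        """Copy every file from source_dir that is not already in dest_dir.

        Files already present at the destination are skipped, mirroring the
        'copy only what is missing' rule in step 3. Returns the names copied.
        """
        dest = Path(dest_dir)
        dest.mkdir(parents=True, exist_ok=True)
        copied = []
        for src in sorted(Path(source_dir).iterdir()):
            if src.is_file() and not (dest / src.name).exists():
                shutil.copy2(src, dest / src.name)  # copy2 preserves timestamps
                copied.append(src.name)
        return copied
    ```

    Scheduled at 45 minutes past each hour, this gives the same offsite-every-hour behaviour described above.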

    Hope this helps.

  • rhlangley (7/6/2010)


    3) run a job every hour at 45 minutes past the hour that runs a program that I wrote in VB.net that looks in each given directory in a command line parameter and checks each file to see if it is in the destination directory (file share). If not, it copies it there. If it is already there, it ignores it and goes to the next file.

    So you reinvented the robocopy wheel 😉
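    For reference, the robocopy equivalent would be something like this (paths and file masks are placeholders; by default robocopy skips files that already exist unchanged at the destination):

    ```
    robocopy D:\SQLBackups \\datadomain\SQLBackups *.bak *.trn /S /R:3 /W:10
    ```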

