Backups not completing 100%

  • I have a monthly full backup job on my prod instance serverinstance1. It runs fine and reports a successful job status, but for some reason it backs up only a few of the databases to the folder location. I compared the code with the backup job on another prod instance and it is the same, and I debugged the current code on serverinstance1 and everything looks fine. Does anyone have any ideas?

    --Pra:-):-)--------------------------------------------------------------------------------

  • First, look at the job's code itself.

    Is it using a cursor to get the list of databases, or is it a static list?

    For example, if I use Red Gate to generate a job for backing up my databases, it lists the six specific databases, so new databases do not get added to the plan; I have to edit it and add them manually.
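
    A cursor-driven job, by contrast, typically looks something like this. This is only a rough sketch to show the pattern; the folder path, file naming and WITH options are placeholders, not your actual job code:

    DECLARE @db   sysname,
            @sql  nvarchar(max),
            @path nvarchar(260) = N'X:\Backups\';      -- placeholder folder

    DECLARE db_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT name
        FROM   sys.databases
        WHERE  database_id > 4                         -- skip the system databases
          AND  state_desc = 'ONLINE';

    OPEN db_cur;
    FETCH NEXT FROM db_cur INTO @db;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@db)
                 + N' TO DISK = ''' + @path + @db + N'_FULL.bak'''
                 + N' WITH INIT, CHECKSUM;';

        EXEC sys.sp_executesql @sql;                   -- or PRINT @sql while debugging

        FETCH NEXT FROM db_cur INTO @db;
    END

    CLOSE db_cur;
    DEALLOCATE db_cur;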

    Is that the question?

    Or is the question related to the COPYTO to an alternate location, when you said:

    but it backs up only a few of the databases to the folder location for some reason.

    In that case, the backups might work, but if there is no space, the file might not get copied; depending on your code, any warning or error might be suppressed.
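
    One quick way to tell whether the backups themselves actually ran is to ask msdb rather than look at the folder. Something like this (just a sketch; adjust the one-month window to your schedule):

    -- user databases with no full backup recorded in the last month
    SELECT d.name,
           MAX(b.backup_finish_date) AS last_full_backup
    FROM   sys.databases d
    LEFT JOIN msdb.dbo.backupset b
           ON b.database_name = d.name
          AND b.type = 'D'                 -- D = full database backup
    WHERE  d.database_id > 4
    GROUP BY d.name
    HAVING MAX(b.backup_finish_date) IS NULL
        OR MAX(b.backup_finish_date) < DATEADD(MONTH, -1, GETDATE());

    If every database shows up there as recently backed up, the problem is on the copy/file side; if some are missing, the job is skipping them.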

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With that and your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Lowell,

    I am not using raw database names; I'm using a cursor to loop through all the databases and back them up to the folder location. There is no space issue at all. I changed the code a little bit (commented out the sp_executesql statement that executes the dynamic SQL for backing up the databases and added a PRINT statement), and then it gives this warning:

    Warning! The maximum key length is 900 bytes. The index 'PK__#630D628__5F9AD70964F5AAFC' has maximum length of 2000 bytes. For some combination of large values, the insert/update operation will fail.

    But it does work! Don't know if this might be helpful.
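
    For what it's worth, a temp table keyed on a wide nvarchar column throws that same warning, so maybe it is just from the temp table that holds the database names rather than from the backups themselves. A tiny repro (illustration only, not the real job code):

    -- nvarchar(1000) can hold up to 2000 bytes, wider than the 900-byte clustered key limit
    CREATE TABLE #dbs (dbname nvarchar(1000) NOT NULL PRIMARY KEY);
    DROP TABLE #dbs;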

    --Pra:-):-)--------------------------------------------------------------------------------

  • Well, if you can show us the code, that would be helpful.

    Does the current script gracefully handle weird database names with QUOTENAME?

    A dynamically created backup command might fail, and the loop would just move on to the next one, for databases with spaces or dashes in their names.
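
    What I'd expect to see is something along these lines, so the name is delimited and a failure gets reported instead of being silently skipped (again just a sketch, not your actual code):

    DECLARE @db   sysname       = N'Some-Weird Name',    -- example of an awkward name
            @path nvarchar(260) = N'X:\Backups\',        -- placeholder folder
            @sql  nvarchar(max);

    SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@db)      -- QUOTENAME copes with spaces and dashes
             + N' TO DISK = ''' + @path + @db + N'_FULL.bak'''
             + N' WITH INIT;';

    BEGIN TRY
        EXEC sys.sp_executesql @sql;
    END TRY
    BEGIN CATCH
        -- surface the problem instead of letting the loop swallow it
        PRINT N'Backup failed for ' + QUOTENAME(@db) + N': ' + ERROR_MESSAGE();
    END CATCH;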

    That's the next thing I'd wonder about, but I think the script itself would help the most.

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With that and your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • That's what I am wondering: how should I send it? A private message?

    I know it's pretty well known to everyone, but just wondering... unless you want me to post it here.

    I am okay with doing so.

    --Pra:-):-)--------------------------------------------------------------------------------

  • Lowell (5/5/2014)


    For example, if I use Red Gate to generate a job for backing up my databases, it lists the six specific databases, so new databases do not get added to the plan; I have to edit it and add them manually.

    Both Red Gate SQL Backup and Quest LiteSpeed have a parameter for all user databases, which avoids the need to edit the list.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉
