Archiving Performance... Need help

  • Hi,

    I have a production database and a reporting database, and I have to move some table data from production to reporting. We have a stored procedure that queries the production table data, compares it with the reporting table, and inserts into the reporting database any records that do not exist there. This stored procedure is run from a scheduled job, and the job takes a huge amount of time; my transaction log also fills up, which costs me even more time.

    I need suggestions on any other way we can achieve this. Please help...

    regards

    Hari
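
(For reference, a minimal sketch of the kind of batched copy described above - inserting only the rows missing from the reporting table in small chunks so the transaction log stays manageable. All object names, columns, and the batch size are hypothetical:)

```sql
-- Hypothetical sketch: copy rows that are missing from the reporting table in
-- batches of 10,000 so each transaction (and its log usage) stays small.
-- Assumes both databases are on the same server and OrderID is the key;
-- the table and column names here are made up for illustration.
DECLARE @BatchSize int;
DECLARE @RowsCopied int;
SET @BatchSize = 10000;
SET @RowsCopied = 1;

WHILE @RowsCopied > 0
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO Reporting.dbo.Orders (OrderID, CustomerID, OrderDate, Amount)
    SELECT TOP (@BatchSize) p.OrderID, p.CustomerID, p.OrderDate, p.Amount
    FROM Production.dbo.Orders AS p
    WHERE NOT EXISTS (SELECT 1
                      FROM Reporting.dbo.Orders AS r
                      WHERE r.OrderID = p.OrderID);

    SET @RowsCopied = @@ROWCOUNT;

    COMMIT TRANSACTION;
END;
```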

  • How big are the transactions, and how often are you running this SP that moves data from production to reporting? Can you post the SP so we can see if there is any room for improvement? 🙂

  • Hey, thanks for the reply,

    The transaction size averages 100k rows/day... this job is scheduled to run every day! 🙂

    Will post the SP shortly. I need to know the best methods for doing this kind of job.

  • Could you also post the production and reporting table definitions please?

  • 100k/day is not that much data; this shouldn't cause any problems, and the way you do it is fine. Another option is a backup and restore, which will leave you with downtime 🙂
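
(A minimal sketch of that backup-and-restore alternative, with hypothetical database names, logical file names, and paths:)

```sql
-- Hypothetical sketch: refresh the reporting database from a full backup of
-- production. The reporting database is unavailable to users during the restore.
-- The logical file names and disk paths below are assumptions.
BACKUP DATABASE Production
    TO DISK = N'D:\Backups\Production.bak'
    WITH INIT;

RESTORE DATABASE Reporting
    FROM DISK = N'D:\Backups\Production.bak'
    WITH REPLACE,
         MOVE 'Production'     TO N'D:\Data\Reporting.mdf',
         MOVE 'Production_log' TO N'D:\Data\Reporting_log.ldf';
```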

  • CrazyMan (1/22/2009)


    100k/day is not that much data; this shouldn't cause any problems, and the way you do it is fine. Another option is a backup and restore, which will leave you with downtime 🙂

    Um - while it's hard to say for certain without seeing some table definitions and code, if he's having the problems stated moving 100k rows, my personal guess is that there probably is an issue here.

  • That's right Andrew, but it depends on how large the table is; if it's properly designed and indexed and the SP is tuned, then this is a small task. As you said, we have to see the structure as well 😎
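
(For instance, making sure the key used by the existence check is indexed on the reporting side - a sketch reusing the same hypothetical names as above:)

```sql
-- Hypothetical sketch: index the comparison key so the NOT EXISTS lookup is a
-- seek rather than a scan of the whole reporting table.
CREATE UNIQUE NONCLUSTERED INDEX IX_Orders_OrderID
    ON Reporting.dbo.Orders (OrderID);
```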
