data replication software

  • Is using Double-Take, or a similar type of software, advisable for SQL Server database replication?

    I am most concerned about the software interfering with the I/O path, and about the possibility of data corruption and performance degradation on the source server.

  • We are in a similar position.  MS-SQL replication consistently fails due to the design of our database, which unfortunately I have inherited.  Currently we cannot change the database structure in any way because of the applications that are in use.  I have both Veritas and Double-Take software, and would like to know from the gurus out there whether there is any better mirroring software available?

    Thanks in advance

  • James, what types of failures & design problems are you having?

  • Hi there,  We have had numerous problems.  First, a bit about our environment: we have a 100GB MS-SQL2K SP4 database which we are trying to replicate to another machine.  We have conformed to all MS recommendations for the database (as best we can). Some of the errors we have experienced are as follows:

    • Timeouts:  We have set the Agent to time out after 10 minutes, and have also tried setting it not to time out at all.
    • Error 18773: The replication Log Reader fails with the following 18773 error message: The process could not execute 'sp_replcmds' on 'PUBLISHER'. Status: 0, code: 18773, text: 'Could not locate text information records for column # during command construction.'  This was a known problem with 6.5 and 7.0, and should have been resolved with SP2 of SQL 7.0!
    • We have also had replication complete and work for 5 hours or so, and then errors start cropping up with the Log Reader, which cascades to the Distributor. These errors range from missing parameters in various tables' replication stored procedures to the 18773 error above.

    Bear in mind that each time we defragment the indexes or change any object, the entire snapshot gets regenerated, which takes anywhere from 8 to 16 hours!  Right now I am ready to throw SQL out the Window(s)!

    Maybe you can suggest something?

    Kind regards

    James

  • How many tables are you replicating?  What type of replication are you doing?  Transactional, I assume. What are your business requirements for your subscriber DB in terms of uptime & latency?

    We replicate a subset of our DB, about 60 GB out of 140 GB, but only about 30 tables out of 1,700. I break them out into different publications based on application, because I encountered problems when one publication tried to push too much. I still had to increase the agent timeout.  I wouldn't think defragging indexes would cause the snapshot to regenerate.  I run dbcc dbreindex during inactive periods and change the recovery model to simple to avoid huge transaction logs, roughly as sketched below.
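
    A minimal sketch of that maintenance routine, assuming a hypothetical database name (MyDB) and backup path. Note that sp_MSforeachtable is undocumented but widely used, and that switching to simple recovery breaks the log backup chain, so a full backup afterwards restarts it:

        -- Run during an inactive period; MyDB and the backup path are placeholders
        ALTER DATABASE MyDB SET RECOVERY SIMPLE;
        EXEC sp_MSforeachtable 'DBCC DBREINDEX (''?'')';  -- rebuild all indexes on every table
        ALTER DATABASE MyDB SET RECOVERY FULL;
        BACKUP DATABASE MyDB TO DISK = 'D:\Backup\MyDB_full.bak';  -- restart the log backup chain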

    Maybe one of the experts around here will have some ideas of a better way.

  • Hi Homebrew01, 

    We have about 600 tables.  12 of these tables are over 10 million records in size, and yes, it is transactional.  The business requirements are twofold: first, to create a reporting server for the data (moving reports off the live server); second, to create multiple replicated machines for various hard and long-running processes that typically require huge amounts of reads across the database (namely the large tables I mentioned).  I have tried at least 5 times to split the tables into the last two years' records, which is a business requirement, but this caused errors each time, from timeouts and missing-parameter errors to cannot-run-SP errors.

    I run dbcc dbreindex once a month; it was immediately after this that the Log Reader raised errors.

    I will try creating multiple publications as you suggest!

    Thanks!

  • " ....... I have tried at least 5 times to split the tables into the last two years records ...... "

    Does this mean you only need the last 2 years' data at the subscriber?  If you only need a subset of data, you can use row filtering to limit what you replicate. A row filter is a "where clause" that you can write. That's what we do for some of our tables, only bringing over data from the last few years for the users to query.
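
    For a transactional article, the row filter can be attached with sp_articlefilter and sp_articleview. Here is a minimal sketch, assuming a hypothetical publication SalesPub, article Orders, and an OrderDate column; a static date literal is used because the clause is not re-evaluated over time, so the cutoff needs periodic updating, and changing the filter invalidates the snapshot:

        -- Hypothetical names throughout: SalesPub, Orders, OrderDate
        EXEC sp_articlefilter
            @publication   = N'SalesPub',
            @article       = N'Orders',
            @filter_name   = N'FLT_Orders_2yrs',
            @filter_clause = N'OrderDate >= ''20040101''';  -- static two-year cutoff

        -- Regenerate the article's synchronization view so the filter takes effect
        EXEC sp_articleview
            @publication   = N'SalesPub',
            @article       = N'Orders',
            @view_name     = N'SYNC_Orders_2yrs',
            @filter_clause = N'OrderDate >= ''20040101''';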

  • Yes - that's exactly what I mean.  I have checked the tables that we are trying to replicate, and I believe these tables were designed without replication in mind.  We have numerous TEXT/NTEXT-type fields and wide tables (a large number of columns).

    The original developers seemed to ignore normalisation in many ways, which I have a feeling contributes to the problem.  With your experience, did you also go through this type of teething problem?

    I think our situation is somewhat simpler than yours, so I went through a learning curve but haven't had much trouble since. I'm no expert; I've just managed to get what I need working through trial & error.

    You can also filter on columns if you don't need them all at the subscriber; that would cut down on the amount of data to transfer. Something like the sketch below drops a column from an article.
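
    A minimal sketch using sp_articlecolumn, with hypothetical names again; dropping a column changes the vertical partition, which invalidates the snapshot, and key columns cannot be excluded:

        -- Hypothetical names: SalesPub publication, Orders article, Notes column
        EXEC sp_articlecolumn
            @publication = N'SalesPub',
            @article     = N'Orders',
            @column      = N'Notes',
            @operation   = N'drop';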
