• I had a bad experience with Double-Take some years back, when I was asked to sort out a mess created by a systems guy (I'm a DBA) who had installed Double-Take against a couple of SQL Servers, intending to create a failover set-up. In fairness to the product, I'm not sure Double-Take was ever configured correctly, but this is what I found: (a) Double-Take corrupted several SQL databases, and (b) data was lost at failover time. I limited the effects of (a) by ensuring DT was only ever configured to handle the one application database it was intended for, and by ensuring that database was only online to one SQL Server at a time. However, when we did stress testing and a failover test, we found that not all the data made it to the DR site, perhaps because of network latency.

    Anyway, my exposure to DT was brief and some years back, so it might not be relevant now. All I would say is: understand how the product works, and thoroughly test that it delivers what you expect. In our case it was configured badly and became a source of problems itself.