Data Loss or Downtime
Posted Thursday, January 6, 2011 9:24 PM


SSC-Dedicated

Group: Administrators
Comments posted to this topic are about the item Data Loss or Downtime






Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1044119
Posted Friday, January 7, 2011 1:27 AM
SSC Rookie

Group: General Forum Members
Being slightly pedantic, I think the question:

So what is more important to you: downtime or data loss?


is wrong and not a reflection of the spirit of the article; more correctly it should be along the lines of:

So what is more important to you: downtime or partial/non-current data?

But, in answer to the spirit of the article, the answer is, as always, it depends :). I can think of several systems I have written or support that could function on a partial, non-current data set, and others that are completely dependent on results from within the preceding tens of seconds. I think the defining requirement is to understand how the customer works with a system, and the impact of every scenario, so that some form of SLA is in place which allows the most efficient and cost-effective reinstatement of a fully working system.
Post #1044212
Posted Friday, January 7, 2011 2:42 AM
SSC-Enthusiastic

Group: General Forum Members
Hi,

It depends on the nature of the system. I work in a financial institution with a lot of trading; in a downtime vs. partial data loss situation, the business has to weigh the cost of being down until the data is recovered, with its larger risk of reputational loss and lost revenue, against the smaller reputational risk and lesser revenue loss of running with partial data.

In most cases (in this scenario) it would be better to accept partial data loss until the data can be recovered, and to get the business up and running to mitigate the additional reputational and financial loss.
Post #1044237
Posted Friday, January 7, 2011 4:00 AM


SSC-Dedicated

Group: General Forum Members
It depends. Among other things, it depends on what data is going to be missing.

Thinking back to the bank, there were some tables that we could do without during business hours but that were critical for the overnight processes. There were other tables that we could do without for three weeks, but they had to be there (and complete) during the last week of the month. And there were other tables where, if the information in them was incomplete, it was worse than having the system completely offline.



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1044283
Posted Friday, January 7, 2011 4:09 AM


Ten Centuries

Group: General Forum Members
Hello!

The strategy I have followed is (a rough sketch follows the list):

1. Get the database online as soon as possible - non-current data is allowed
2. Attempt to restore the most recent data first - OLTP systems mostly need transactions from within the last month, about 80% of the time
3. Attempt to restore the historical data - the remaining bulk of the data, which is used only about 20% of the time
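One way to implement this on SQL Server (Enterprise Edition) is an online piecemeal restore: bring the primary and most recent filegroups online first, then restore the historical filegroups while users keep working. A minimal sketch, assuming a hypothetical "Sales" database split into CURRENT and HISTORY filegroups and hypothetical backup file names:

-- Phase 1: partial restore - get PRIMARY and CURRENT online as quickly as possible.
RESTORE DATABASE Sales
    FILEGROUP = 'PRIMARY', FILEGROUP = 'CURRENT'
    FROM DISK = N'X:\Backups\Sales_full.bak'
    WITH PARTIAL, NORECOVERY;
RESTORE LOG Sales FROM DISK = N'X:\Backups\Sales_log.trn' WITH RECOVERY;
-- The database is now online; the HISTORY filegroup is still offline (RESTORING).

-- Phases 2 and 3: restore the remaining filegroups while the system is in use.
RESTORE DATABASE Sales
    FILEGROUP = 'HISTORY'
    FROM DISK = N'X:\Backups\Sales_full.bak'
    WITH NORECOVERY;
RESTORE LOG Sales FROM DISK = N'X:\Backups\Sales_log.trn' WITH RECOVERY;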

Going off on a side track here, but I find the above strategy useful in data cleanup projects as well.

Have a great day!


Thanks & Regards,
Nakul Vachhrajani.
http://nakulvachhrajani.com
Be courteous. Drive responsibly.

Follow me on
Twitter: @sqltwins
Google Plus: +Nakul
Post #1044285
Posted Friday, January 7, 2011 4:48 AM
SSC Rookie

Group: General Forum Members
Fortunately our OLTP database fits on one tape, and we also restore to a backup database for immediate rollover in case of failure.

However, we were recently hacked, and although our DB was untouched, some system files were corrupted, so we could not load the DB.

We dump all our major tables to CSV files overnight from our OLTP DB; these were initially used as a pseudo data warehouse until we got our current full-blown MS SQL version. Because these files were available we knew our customers' addresses, we knew what stock we had, we knew where the stock was, and we knew all the prices.
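(As a point of reference, a nightly dump like that is commonly done with bcp. A minimal sketch, assuming hypothetical database, table, and path names, and that xp_cmdshell is enabled - otherwise the same bcp commands can run from a scheduled task:)

-- Nightly CSV export of a couple of key tables via bcp; all names are placeholders.
DECLARE @cmd varchar(1000);

SET @cmd = 'bcp SalesDB.dbo.Customers out D:\Exports\Customers.csv -S . -T -c -t,';
EXEC master..xp_cmdshell @cmd;

SET @cmd = 'bcp SalesDB.dbo.StockLevels out D:\Exports\StockLevels.csv -S . -T -c -t,';
EXEC master..xp_cmdshell @cmd;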

Now, it wasn't as smooth and efficient as normal trading, and the system says we didn't process anything that day, but we got out all the orders that were due. So the impact on the business was minimal, and once the system files were restored all was well.

So how well you prepare for downtime is as important as getting the data back.
Post #1044299
Posted Friday, January 7, 2011 5:45 AM
Grasshopper

Group: General Forum Members
It Depends...
For example, I used to work on a product that displays prices and lets people make purchases from there.
If I bring things back online without taking downtime, customers may see bad prices, which would create a bad impression of my site just because of the bad data. So it is better to take the downtime, recover all of my data, and then bring things back to normal.
Thanks
Vineet
Post #1044328
Posted Friday, January 7, 2011 5:48 AM


SSChampion

Group: General Forum Members
Obviously, I want my cake and I want to eat it too (and I'll have as much of yours as I can)... or at least that's how most businesses would approach it.

When I'm setting up DR for various systems, I ask the business: how much data loss are you prepared to handle? My assumption is that if stuff goes south, you're going to lose data. So then it's a question of minimizing downtime. Yeah, I try to get the data back too, but if an app is mission-critical, the question quickly comes up: everyone offline, or one user with incomplete data? I know how most businesses are going to answer that one.


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1044329
Posted Friday, January 7, 2011 6:41 AM
Grasshopper

Group: General Forum Members
Nakul Vachhrajani (1/7/2011)
Hello!

To me, the strategy that I have followed is:

1. Get the database online as soon as possible - non-current data is allowed
2. Attempt to restore the most recent data first - OLTP systems mostly need transactions from within the last month, about 80% of the time
3. Attempt to restore the historical data - the remaining bulk of the data, which is used only about 20% of the time

Going off on a side track here, but I find the above strategy useful in data cleanup projects as well.

Have a great day!


I agree with Nakul here. In most cases, I'll go out on a limb and say uptime is more important than retrieving your existing data. I say that because downtime puts future revenue at risk (assuming your system is revenue-generating). At least if the system is up, even in a hobbled and empty state, future transactions can be processed.

Also, data loss is defined by your backup strategy. If you take transaction log backups every 30 minutes, then that's your potential data loss: up to 30 minutes' worth of data. Everything else is just offline until it can be restored, assuming you have good backups. There's a big difference between the two. I consider data lost only if it's truly lost, meaning there's no way to recover it. Everything else is just temporarily offline.
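As a concrete (hypothetical) illustration, that exposure comes down to how often something like the following runs; a SQL Agent job scheduled every 30 minutes caps the loss at roughly 30 minutes of transactions. The database name and backup path are placeholders:

-- Transaction log backup; scheduling it every 30 minutes limits potential data loss to ~30 minutes.
DECLARE @file nvarchar(260);
SET @file = N'X:\Backups\SalesDB_log_'
          + CONVERT(nvarchar(8), GETDATE(), 112)
          + N'_' + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', '')
          + N'.trn';

BACKUP LOG SalesDB TO DISK = @file WITH CHECKSUM;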
Post #1044363
Posted Friday, January 7, 2011 6:43 AM


Ten Centuries

Group: General Forum Members
GilaMonster (1/7/2011)
It depends. Among other things it depends on what data is going to be missing.



I think this is one of the most important points made so far. Also, how easy is the missing data to rebuild? A bad lookup table is one thing, but a table that holds someone's personal account records and balance is quite a different animal.


"Technology is a weird thing. It brings you great gifts with one hand, and it stabs you in the back with the other. ..."
Post #1044364