Oops, I deleted that data

  • Comments posted to this topic are about the item Oops, I deleted that data

    MVDBA

  • Thanks for the post. I agree with you as I have run into this exact scenario numerous times.

    One caveat though to keep in mind:

    The commented-out ROLLBACK / COMMIT sections might come back and bite you if the person executing the code forgets to run either of those lines. The result is an uncommitted transaction that can become the head blocker in a long, long list of blocked sessions, causing all sorts of issues if left unchecked.

    I usually suggest keeping the ROLLBACK line uncommented, but this depends on the task at hand and any possible "repercussions" (e.g. gaps in identity columns if the script contains INSERT statements).
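
    Something along these lines, as a minimal sketch (the table and predicate are only placeholders):

    BEGIN TRAN;

    UPDATE dbo.account SET active_yn = 0 WHERE account_code = '4c';
    SELECT @@ROWCOUNT AS rows_affected;   -- sanity-check how many rows were touched

    ROLLBACK TRAN;    -- left in by default, so running the whole script changes nothing
    -- COMMIT TRAN;   -- once happy, swap the comments on these two lines and rerun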

    "My gosh! It's full of [SELECT] stars!"
  • I've always used database snapshots for this kind of thing. But that only works when such updates to a table are planned and controlled, so a snapshot can be created right before and then dropped as soon as the change is confirmed to be correct.

    The performance degradation of having multiple snapshots on a busy OLTP system was too much to make snapshots viable as continual protection against this kind of thing. This was on a 20 TB database with 500-600 GB of T-Log backups per day.
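
    For a planned change, the snapshot approach is roughly this (the database names and file path are placeholders):

    -- take the snapshot right before the planned change
    -- NAME must be the logical data-file name of the source database
    CREATE DATABASE Sales_Snapshot ON
        ( NAME = Sales_Data, FILENAME = 'D:\Snapshots\Sales_Data.ss' )
    AS SNAPSHOT OF Sales;

    -- ...run and verify the planned update...

    -- if it went wrong, revert the whole database to the snapshot
    RESTORE DATABASE Sales FROM DATABASE_SNAPSHOT = 'Sales_Snapshot';

    -- once the change is confirmed correct, drop the snapshot
    DROP DATABASE Sales_Snapshot;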

  • I've used the same approach for many years and it's saved me more times than I can remember.

    I agree with @simoesp that I'd have the rollback by default. I also tend to show the after state as well as the before state so I can see what the changes were. For a simple scenario this may be overkill, but it's particularly useful for complex updates (cases) and/or updates/deletes with joins to other tables.

    My variation goes along the lines of:

    DECLARE @doCommit bit = 0;   -- set to 1 to commit the change, leave as 0 to roll back

    -- before state
    SELECT 'BEFORE' AS [BEFORE], * FROM dbo.account WHERE account_code = '4c';
    SELECT @@ROWCOUNT;           -- row count of the BEFORE select

    BEGIN TRAN;

    UPDATE dbo.account SET active_yn = 0 WHERE account_code = '4c';

    -- after state
    SELECT 'AFTER' AS [AFTER], * FROM dbo.account WHERE account_code = '4c';
    SELECT @@ROWCOUNT;           -- row count of the AFTER select

    IF @doCommit = 1
    BEGIN
        SELECT '!! COMMITTED !!';
        COMMIT TRAN;
    END
    ELSE
    BEGIN
        SELECT '!! ROLLBACK !!';
        ROLLBACK TRAN;
    END
    GO


  • I sometimes back up the table inside the database with a SELECT statement, so it's relatively easy to compare and, if necessary, restore the data without needing a full backup: SELECT * INTO products20191029 FROM products WHERE … You could also do it in tempdb.
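
    A sketch of the same idea kept in tempdb (the columns and the predicate below are hypothetical):

    -- copy the rows about to be changed into a dated table in tempdb
    SELECT *
    INTO   tempdb.dbo.products20191029
    FROM   dbo.products
    WHERE  product_group = 42;          -- hypothetical predicate

    -- if the change needs backing out, restore the affected column with a join
    UPDATE p
    SET    p.price = b.price
    FROM   dbo.products AS p
    JOIN   tempdb.dbo.products20191029 AS b
           ON b.product_id = p.product_id;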

  • I always back up the data I'm going to update, usually in a database created especially to keep backouts. I don't think I've ever had to use them for a rollback, but it's a very good habit to keep.

  • Our methodology depends on the size of the data set. For something relatively small, we select the set of data and validate that it is what we want, then use SELECT INTO to copy it into a schema and table named for the ticket or issue number, and wrap all of this, together with the delete, in a BEGIN TRY...CATCH. For large sets of data we determine which takes less time to restore - the database, the table, or the saved set of data - and choose the method accordingly.
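
    A sketch of the small-data path (the ticket schema, table, and predicate are hypothetical, and the schema is assumed to exist already):

    -- assumes CREATE SCHEMA Ticket4711; has been run beforehand in its own batch
    SELECT *
    INTO   Ticket4711.account_deleted_rows
    FROM   dbo.account
    WHERE  account_code = '4c';

    BEGIN TRY
        BEGIN TRAN;

        DELETE FROM dbo.account
        WHERE  account_code = '4c';

        COMMIT TRAN;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRAN;
        THROW;
    END CATCH;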

  • Great comments throughout

    One of the issues I'm looking at is that for a table with 50 million records, backing up the entire table might not be practical - and how long do you retain the data, and how do you identify what has been changed? That's why I advocated taking just those few rows and attaching them to the support ticket.

    It's in no way perfect and I really want to make it better

    MVDBA

  • I agree with @Mark.hausmann and @Freddie-304292 about keeping a backup of the data before attempting any change. What I do not understand is why developers have free rein over the data and, especially, the right to execute unsupervised updates at such ungodly hours. A strict change control process could include a review of the SQL code and the assignment of a DBA slot for executing the update. The review would ensure that the code contains the correct transaction syntax and that a temporary storage place is included, e.g. a temporary table as suggested by @Mark.hausmann.

  • What about temporal tables for your tables that need to be recovered very fast?
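
    For an existing table it would look roughly like this (only a sketch - the dbo.account table is borrowed from the script above, and the AS OF timestamp is just an example):

    -- add the period columns and switch on system versioning
    ALTER TABLE dbo.account
        ADD valid_from datetime2 GENERATED ALWAYS AS ROW START NOT NULL
                CONSTRAINT DF_account_valid_from DEFAULT SYSUTCDATETIME(),
            valid_to   datetime2 GENERATED ALWAYS AS ROW END NOT NULL
                CONSTRAINT DF_account_valid_to DEFAULT CONVERT(datetime2, '9999-12-31 23:59:59.9999999'),
            PERIOD FOR SYSTEM_TIME (valid_from, valid_to);

    ALTER TABLE dbo.account
        SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.account_history));

    -- after a bad update, read the rows as they were just before it
    SELECT *
    FROM   dbo.account FOR SYSTEM_TIME AS OF '2019-10-29 02:00:00'
    WHERE  account_code = '4c';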

  • You can just insert the data you need into a table, and you can use joins to easily identify the before and after values. I don't think attaching the data outside the database is a good idea to follow - it compromises security and you have no control over what happens to the data once it leaves the database… I think you will end up adding a lot of complexity in the end.
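
    For example (the backup table here is hypothetical):

    -- compare backed-up values against the current rows with a join
    SELECT b.account_code,
           b.active_yn AS before_value,
           a.active_yn AS after_value
    FROM   dbo.account_backup_20191029 AS b
    JOIN   dbo.account                 AS a
           ON a.account_code = b.account_code
    WHERE  a.active_yn <> b.active_yn;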

  • That's the downside of Dev-Ops I guess, where developers are allowed to change live data.

  • ildjarn.is.dead wrote:

    What about temporal tables for your tables that need to be recovered very fast?

    Good shout, worth looking into. I think our data churn is way too big for that, though - I'd rather get our devs doing it better 🙂

    MVDBA

  • Charlie Wongy wrote:

    That's the downside of Dev-Ops I guess, where developers are allowed to change live data.

    Interesting point. Part of the DevOps process is about automated builds (including stopping people developing on live) and also data cloning/provisioning, which helps with GDPR issues in terms of data masking and anonymity - that is a huge pipe dream for a lot of us, though.

    I'm hoping Steve weighs in on this, as it was either him or Grant who mentioned it.

    MVDBA
