The Human Touch

  • Comments posted to this topic are about the item The Human Touch

  • I understand what you're saying, Steve. But there is another side to the equation.

    First, on using the script button in SSMS.

    Using this button to generate code can be dangerous - especially if you trust the script blindly. I know that older versions of SSMS often created scripts that were buggy. (The worst example I have seen, I think from the SQL2005 version of SSMS, is a script that "changed" a table by creating a new one, copying the rows, then dropping the old table. I noticed that some of the BEGIN TRAN / COMMIT TRAN statements were in the wrong places - a server failure at the worst possible moment could result in losing all data in the table!) Of course, the same risk is present when executing from the SSMS dialogs (since that uses the same script, just without showing it to the user). And I have to add that the scripts have improved, though I have not really inspected them in detail since SQL2008.
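
    To make the risk concrete, here is a simplified sketch of the rebuild pattern (reconstructed from memory, not the exact SSMS output, and with made-up table names). The copy, the drop and the rename have to stand or fall together, so they belong inside one transaction:

    -- Simplified sketch of the SSMS "rebuild" pattern (table names are made up).
    BEGIN TRANSACTION;

    CREATE TABLE dbo.Tmp_MyTable (Id int NOT NULL, Payload varchar(100) NULL);

    -- Copy the existing rows into the new table.
    INSERT INTO dbo.Tmp_MyTable (Id, Payload)
    SELECT Id, Payload
    FROM dbo.MyTable;

    -- Only now is it safe to drop the original and rename the copy.
    DROP TABLE dbo.MyTable;
    EXEC sp_rename 'dbo.Tmp_MyTable', 'MyTable';

    COMMIT TRANSACTION;

    In the buggy versions the transaction boundaries fell in the wrong places, so the drop of the original table and the commit of the copied data were not atomic - and a crash in between could lose the table's contents.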

    So my take would be that the script button is very important - not as a way to easily create reusable scripts, as you indicate in the article, but as a way to double check the script generated by SSMS and correct it if it's wrong, unsafe, or inefficient.

    Then, on using scripts and/or PowerShell in general.

    You push using this kind of automation as a way to prevent human error, because you eliminate the error-prone process of a human filling out a dialog and pressing buttons. However, humans working with a dialog *know* that they might make errors. They will usually try to prevent this by working carefully and checking what they do. Using a script gives a (sometimes false) sense of security. When people use a script, they expect it to work, period. But what if there is an error in the script? Or if the script fails to handle an unanticipated situation appropriately? If people just fire off the script and expect it to work, it can take a long time, and a lot of executions, before someone finds out. And by then, repairing the damage may be impossible.

    I'm not saying that your arguments have no value. They do. But there's another side to it as well. No black, no white, just different shades of grey. (And I guess I'd better resist the temptation to count how many...)


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • I have to agree with Hugo. Relying on a script is all very well, but a human being was involved at some point in the script's creation (even if it was just entering the details into the SSMS dialogs) and therefore the script itself could be incorrect. Heck, the script could work fine for years, then fail after everyone who knows how it works has left the company, making for a very awkward situation indeed!

  • Hugo Kornelis (8/1/2013)


    Then, on using scripts and/or PowerShell in general.

    You push using this kind of automation as a way to prevent human error, because you eliminate the error-prone process of a human filling out a dialog and pressing buttons. However, humans working with a dialog *know* that they might make errors. They will usually try to prevent this by working carefully and checking what they do. Using a script gives a (sometimes false) sense of security. When people use a script, they expect it to work, period. But what if there is an error in the script? Or if the script fails to handle an unanticipated situation appropriately? If people just fire off the script and expect it to work, it can take a long time, and a lot of executions, before someone finds out. And by then, repairing the damage may be impossible.

    We really need to treat scripting like programming in general. We want to cast as wide a net as possible when testing the script, using a variety of input, even some that is downright bogus. Yes, this does mean generating or procuring test data and putting effort into evaluating the conditions the script will execute under. It also means that scripting (or programming in general) will take more effort than running one instance of the steps manually.
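
    For example (just a sketch; dbo.ArchiveOrders and @CutoffDate are invented names standing in for whatever the script actually does), you can exercise the script with sensible and bogus input alike, inside a transaction that is rolled back so the test leaves no trace:

    -- Throwaway test sketch (dbo.ArchiveOrders and @CutoffDate are invented names).
    -- Everything runs inside a transaction that is rolled back, so nothing persists.
    BEGIN TRY
        BEGIN TRANSACTION;
        EXEC dbo.ArchiveOrders @CutoffDate = '2013-01-01';   -- normal input
        EXEC dbo.ArchiveOrders @CutoffDate = '9999-12-31';   -- edge case
        EXEC dbo.ArchiveOrders @CutoffDate = NULL;           -- downright bogus
        ROLLBACK TRANSACTION;   -- keep the database unchanged
        PRINT 'All cases ran without raising an error.';
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        PRINT 'A case failed: ' + ERROR_MESSAGE();
    END CATCH;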

    The advantage of scripting is that you can invest significantly more time into developing the script than doing the steps manually would take, because once the script is working, you recoup the cost of developing it over the subsequent executions that replace the manual steps. Every run of the manual steps carries the possibility of error, and a properly running script eliminates this factor.

    I think we forget how errors creep into our manual work, because when we repetitively run manual operations, we start to take them for granted. Sure, as you say, we're diligent the first time through the manual steps because we anticipate that we can make errors, but how diligent do we remain by the 50th run? With scripting we also need to be diligent and watchful for errors while developing the script, but once the script is working, that diligence lives on in the script as tests for erroneous conditions, and it does not "age" or "decrease" the way it does with repeated manual execution.
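
    To make that concrete (again only a sketch, with invented table names): the kind of check a careful operator does by eye on the first run can be written into the script once, and then it still happens on run number 500:

    -- A check that never gets tired (invented table names).
    DECLARE @SourceRows bigint = (SELECT COUNT_BIG(*) FROM dbo.MyTable);
    DECLARE @CopiedRows bigint = (SELECT COUNT_BIG(*) FROM dbo.Tmp_MyTable);

    IF @SourceRows <> @CopiedRows
    BEGIN
        RAISERROR('Row counts do not match (%I64d vs %I64d); not dropping the original table.',
                  16, 1, @SourceRows, @CopiedRows);
        RETURN;
    END;

    -- Only reached when the copy is verified.
    DROP TABLE dbo.MyTable;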

    And the advantage of scripting is that you can test that script to whatever extent you like, with the additional benefit that you can introduce edge cases (and downright bogus conditions) into your testing and see whether you've covered the conditions your script will subsequently encounter. With manual processes, well, I'm not so sure.

  • That's all true, but I come from a programming background, and I know that you will pretty much never, ever anticipate every daft or incorrect input a user is capable of providing. 🙂

  • I would also firmly point you towards Sidney Dekker's Field Guide to Understanding Human Error.

    In my opinion, it's one of the cross-disciplinary must-reads for any DBA.

  • paul.knibbs (8/1/2013)


    That's all true, but I come from a programming background, and I know that you will pretty much never, ever anticipate every daft or incorrect input a user is capable of providing. 🙂

    In my case, instead of ceasing programming, I looked for ways to simulate daft input, and have generalised this further into putting filters on input that require conditions to be "non-daft" in nature 🙂

    It's not a perfect situation; I've been surprised before and made errors before. But in my opinion, during the course of scripting a task we can compress all of the analysis and synthesis of error handling into the exercise of scripting, whereas Hugo is stuck repeating all that analysis and synthesis DURING EACH MANUAL RUN OF HIS TASK.
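
    As a trivial sketch of what I mean by a filter (the procedure, schema and parameter names here are all invented), the "non-daft" check sits at the top and refuses to act on anything it doesn't recognise:

    -- Invented example of a "non-daft input only" filter at the top of a procedure.
    CREATE PROCEDURE dbo.PurgeStagingTable
        @SchemaName sysname,
        @TableName  sysname
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Refuse anything that is not an existing user table in the staging schema.
        IF @SchemaName <> N'staging'
            OR OBJECT_ID(QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName), N'U') IS NULL
        BEGIN
            RAISERROR('Refusing to purge %s.%s: not an existing table in the staging schema.',
                      16, 1, @SchemaName, @TableName);
            RETURN;
        END;

        DECLARE @sql nvarchar(max) =
            N'TRUNCATE TABLE ' + QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName) + N';';
        EXEC sys.sp_executesql @sql;
    END;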

  • RichB (8/1/2013)


    I would also firmly point you towards Sidney Dekker's Field Guide to Understanding Human Error.

    In my opinion, it's one of the cross-disciplinary must-reads for any DBA.

    Awesome, I'm going to get that one!

    I also like reading the "risks" digest http://en.wikipedia.org/wiki/RISKS_Digest

  • patrickmcginnis59 10839 (8/1/2013)


    whereas Hugo is stuck repeating all that analysis and synthesis DURING EACH MANUAL RUN OF HIS TASK.

    Heh! You must have misunderstood my point. I never said I am opposed to scripts, or that I don't use them. I just pointed out that they, too, are imperfect.

    I hate having to repeat tasks manually; as soon as I anticipate that I might have to repeat a task, I almost always invest the time to script it.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Hugo Kornelis (8/1/2013)


    patrickmcginnis59 10839 (8/1/2013)


    whereas Hugo is stuck repeating all that analysis and synthesis DURING EACH MANUAL RUN OF HIS TASK.

    Heh! You must have misunderstood my point. I never said I am opposed to scripts, or that I don't use them. I just pointed out that they, too, are imperfect.

    I'm glad I just misunderstood it and that you didn't really mean what you typed!

  • Hugo Kornelis (8/1/2013)


    I understand what you're saying, Steve. But there is another side to the equation.

    First, on using the script button in SSMS.

    Using this button to generate code can be dangerous - especially if you trust the script blindly. I know that older versions of SSMS often created scripts that were buggy. (The worst example I have seen, I think from the SQL2005 version of SSMS, is a script that "changed" a table by creating a new one, copying the rows, then dropping the old table. I noticed that some of the BEGIN TRAN / COMMIT TRAN statements were in the wrong places - a server failure at the worst possible moment could result in losing all data in the table!) Of course, the same risk is present when executing from the SSMS dialogs (since that uses the same script, just without showing it to the user). And I have to add that the scripts have improved, though I have not really inspected them in detail since SQL2008.

    So my take would be that the script button is very important - not as a way to easily create reusable scripts, as you indicate in the article, but as a way to double check the script generated by SSMS and correct it if it's wrong, unsafe, or inefficient.

    Then, on using scripts and/or PowerShell in general.

    You push using this kind of automation as a way to prevent human error, because you eliminate the error-prone process of a human filling out a dialog and pressing buttons. However, humans working with a dialog *know* that they might make errors. They will usually try to prevent this by working carefully and checking what they do. Using a script gives a (sometimes false) sense of security. When people use a script, they expect it to work, period. But what if there is an error in the script? Or if the script fails to handle an unanticipated situation appropriately? If people just fire off the script and expect it to work, it can take a long time, and a lot of executions, before someone finds out. And by then, repairing the damage may be impossible.

    I'm not saying that your arguments have no value. They do. But there's another side to it as well. No black, no white, just different shades of grey. (And I guess I'd better resist the temptation to count how many...)

    Good points, and this is a complex subject. When using SSMS, you have issues either way, but with a script, at least you can see what's being done. I've actually done this and then avoided running the script because I could see it was a table drop and re-create, which was an issue.

    In terms of PoSH and other scripting languages, I'd still assume you did testing, or expect you to test, even with your own scripts, and I would hope you take more care. You can certainly make a mess of things much quicker with a script than you can with the GUI.

  • RichB (8/1/2013)


    I would also firmly point you towards Sidney Dekker's Field Guide to Understanding Human Error.

    In my opinion, it's one of the cross-disciplinary must-reads for any DBA.

    Interesting. Never heard of it, but added to my reading list.

    It's here on Amazon: http://www.amazon.com/Field-Guide-Understanding-Human-Error/dp/0754648265

  • patrickmcginnis59 10839 (8/1/2013)


    paul.knibbs (8/1/2013)


    That's all true, but I come from a programming background, and I know that you will pretty much never, ever anticipate every daft or incorrect input a user is capable of providing. 🙂

    In my case, instead of ceasing programming, I looked for ways to simulate daft input, and have generalised this further into putting filters on input that require conditions to be "non-daft" in nature 🙂

    It's not a perfect situation; I've been surprised before and made errors before. But in my opinion, during the course of scripting a task we can compress all of the analysis and synthesis of error handling into the exercise of scripting, whereas Hugo is stuck repeating all that analysis and synthesis DURING EACH MANUAL RUN OF HIS TASK.

    There are risks either way, but you have to choose some method. The thing I'd say with a script is that I have the advantage of logging what happened and being able to debug it when I've made a mistake.

    If I've clicked on a button, I am relying on my own (highly suspect and faulty) witness testimony to trace back the actions.

  • Steve Jones - SSC Editor (8/1/2013)


    patrickmcginnis59 10839 (8/1/2013)


    paul.knibbs (8/1/2013)


    That's all true, but I come from a programming background, and I know that you will pretty much never, ever anticipate every daft or incorrect input a user is capable of providing. 🙂

    In my case, instead of ceasing programming, I looked for ways to simulate daft input, and have generalised this further into putting filters on input that require conditions to be "non-daft" in nature 🙂

    It's not a perfect situation; I've been surprised before and made errors before. But in my opinion, during the course of scripting a task we can compress all of the analysis and synthesis of error handling into the exercise of scripting, whereas Hugo is stuck repeating all that analysis and synthesis DURING EACH MANUAL RUN OF HIS TASK.

    There are risks either way, but you have to choose some method. The thing I'd say with a script is that I have the advantage of logging what happened and being able to debug it when I've made a mistake.

    If I've clicked on a button, I am relying on my own (highly suspect and faulty) witness testimony to trace back the actions.

    This is one of the biggest pros of using a script. It is reproducible and provides a "logging" mechanism of what has been done (it is almost self-documenting). If that logging is not adequate, you can modify the script to output what has been done to a log (file or table) and record timing measures.

    Though it is not foolproof, it can be easier to "debug" and discover what went wrong and where.
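
    Something as simple as this goes a long way (just a sketch; the log table, columns and step name are invented):

    -- Minimal sketch of script-side logging with timings (names are invented).
    CREATE TABLE dbo.ScriptRunLog
    (
        LogId        int IDENTITY(1,1) PRIMARY KEY,
        StepName     varchar(100)   NOT NULL,
        StartedAt    datetime2(3)   NOT NULL,
        EndedAt      datetime2(3)   NULL,
        Outcome      varchar(20)    NULL,
        ErrorMessage nvarchar(2048) NULL
    );

    DECLARE @Start datetime2(3) = SYSDATETIME();

    BEGIN TRY
        -- ... the actual work of this step goes here ...
        INSERT INTO dbo.ScriptRunLog (StepName, StartedAt, EndedAt, Outcome)
        VALUES ('Rebuild indexes', @Start, SYSDATETIME(), 'Succeeded');
    END TRY
    BEGIN CATCH
        INSERT INTO dbo.ScriptRunLog (StepName, StartedAt, EndedAt, Outcome, ErrorMessage)
        VALUES ('Rebuild indexes', @Start, SYSDATETIME(), 'Failed', ERROR_MESSAGE());
    END CATCH;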

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • Thanks Steve.

    To err a few times is human; to err repeatedly doing the same thing is insane! To write a script to replicate the error is well beyond the pale.

    Miles...

    Not all gray hairs are Dinosaurs!
