No Works of Art

  • Comments posted to this topic are about the item No Works of Art

  • This is a very hard sell for me at work. Has anyone gone through the process of changing a culture from GUI-based installers to one using a text/scripting process?

  • We've moved to the point where everything developed in the past 12 months is deployed using Puppet and Vagrant. You trigger a build and a pre-configured system materialises either on your desktops or your servers.

    The challenge for us has been getting from something that happily builds from a single master to something that can build on many hundreds of machines without a single-master dependency.

    Technologies like Docker look incredibly promising.

    Treating infrastructure as software means that the full configuration goes under version control!

    It's a brave and better new world.
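
    The post names Puppet and Vagrant; on the Windows side, the same "configuration as versioned code" idea can be sketched with PowerShell Desired State Configuration. This is only an illustrative sketch - the node name, feature, and paths below are placeholders, not anything from the thread:

        # A desired-state description that can sit in version control
        # alongside the application code it supports.
        Configuration AppServer {
            Import-DscResource -ModuleName PSDesiredStateConfiguration

            Node 'localhost' {
                # Make sure IIS is installed on the target node.
                WindowsFeature IIS {
                    Name   = 'Web-Server'
                    Ensure = 'Present'
                }

                # Make sure the deployment folder exists.
                File DeployRoot {
                    DestinationPath = 'C:\Deploy'
                    Type            = 'Directory'
                    Ensure          = 'Present'
                }
            }
        }

        # Compile the configuration to MOF documents and apply them.
        AppServer -OutputPath 'C:\DscConfigs\AppServer'
        Start-DscConfiguration -Path 'C:\DscConfigs\AppServer' -Wait -Verbose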

  • Just like many readers of this editorial, I am sure, halfway through I was thinking "scripting". It is a great way to maintain documentation (please tell me you add comments to your scripts!!!) in a way that assists manual ad hoc tasks as well as enabling automation.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!
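
    As a rough illustration of a script doubling as documentation - every name and path below is made up - the comments describe the manual procedure and the code performs it:

        # --- Refresh the Reporting database from the latest production backup ---
        # Manual procedure this replaces:
        #   1. Copy the newest .bak file from the backup share.
        #   2. Restore it over the Reporting database.
        #   3. Record that the refresh happened.

        # 1. Find the newest backup file.
        $latest = Get-ChildItem '\\backupserver\sql\Prod' -Filter *.bak |
                  Sort-Object LastWriteTime -Descending |
                  Select-Object -First 1

        # 2. Restore it (Invoke-Sqlcmd ships with the SQL Server client tools).
        Invoke-Sqlcmd -ServerInstance 'REPORTSQL01' -Query "
            RESTORE DATABASE Reporting
            FROM DISK = N'$($latest.FullName)'
            WITH REPLACE;"

        # 3. Record what was done.
        "Restored $($latest.Name) at $(Get-Date)" | Add-Content 'C:\Logs\refresh.log'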

  • Interesting Topic

    We have a very complicated process to deploy our product. We use a combination of MSI, PowerShell and manual configuration. Talk about a work of art: this belongs in MoMA 🙂

    Seriously, because the product relies on so many technologies (SQL Server, IIS, SSIS, MongoDB), the deployment is very involved. But credit to the developers: they are moving towards a simpler version that is more scripted. It is just taking a while.

    This is OK, but the deployment tends to be more of a "black box", and when things go wrong it takes more than one person to deal with it. So we have an issue of confidence in the deployment.

    We also suffer from a two-week deployment cycle, which means we have very little time to test before a live release. We don't even have an environment that mirrors live... shocking, I know.

    All these things make for quite an anxious time. But amazingly we continue to release.

    I am all for a scripted approach, or at least an automated process that removes decisions from the deployment engineer. But can we really get there when so many technologies are involved? And are we widening the knowledge gap between the creators of the software and those who are supposed to support it? (I know this will vary from company to company, but in my company we tend to keep developers away from Customer Support 😉 )

    Graeme

  • I am also a big fan of (preferably scriptable) environment-testing utilities, often PowerShell cmdlets.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!
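
    For a concrete (if made-up) flavour of the kind of environment checks meant here, a few lines of PowerShell go a long way; the server name, service name, and share below are placeholders:

        $server = 'SQLPROD01'

        # Is the machine reachable at all?
        Test-Connection -ComputerName $server -Count 2 -Quiet

        # Is the SQL Server service (default instance) running?
        Get-Service -ComputerName $server -Name 'MSSQLSERVER' |
            Select-Object Name, Status

        # Is the share we deploy artifacts to actually in place?
        Test-Path "\\$server\Deploy"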

  • I'm a student of Master Sean McCown and his attitude towards being an Enterprise DBA: Let's script or automate everything!

  • Scripting and automation are great tools. They are perfect for lazy DBAs (we like not having to do the same manual thing over and over). Sure, it may take a little extra effort to get there, but hopefully that effort does not exceed the effort of doing things manually.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events
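
    The classic "lazy DBA" example is looping a repetitive task instead of clicking through it once per database. A minimal sketch, with server names and the backup share invented for illustration:

        # Back up every user database on each server in the list.
        $servers = 'SQL01', 'SQL02', 'SQL03'

        foreach ($server in $servers) {
            # database_id > 4 skips master, tempdb, model and msdb.
            $dbs = Invoke-Sqlcmd -ServerInstance $server -Query "
                SELECT name FROM sys.databases WHERE database_id > 4;"

            foreach ($db in $dbs) {
                $file = "\\backupserver\sql\$server\$($db.name)_$(Get-Date -Format yyyyMMdd).bak"
                Invoke-Sqlcmd -ServerInstance $server -Query "
                    BACKUP DATABASE [$($db.name)] TO DISK = N'$file' WITH INIT;"
            }
        }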

  • Sometimes it is not wasted time but mistakes that we are trying to avoid.

    Of course, if the script is not robust or tested enough, you can repeat the same mistake loads of times pretty quickly!!!

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • Documents never get revved to keep up with changes. Human memory is faulty, and certainly we find ourselves clicking the wrong button at times.

    Alas, scripts (especially the 'auto-answer' variety) suffer from the same problem, perhaps more dangerously, because you never easily see what answers are being given.

    ...

    -- FORTRAN manual for Xerox Computers --

  • Data state is something that is tough to automate, which I suppose is job security. There is also something of a mismatch between Windows automation, which is generally a run-once approach, and the *nix side of the world, where state is constantly enforced through Puppet, Chef, etc., but it's bridgeable. Our 2014 migration will be entirely puppetized. But the individual databases are still gonna stay snowflakes. Scripted snowflakes, but still pretty melty.

  • jay-h (5/29/2014)


    Documents never get revved to keep up with changes. Human memory is faulty, and certainly we find ourselves clicking the wrong button at times.

    Alas, scripts (especially the 'auto-answer' variety) suffer from the same problem, perhaps more dangerously, because you never easily see what answers are being given.

    True, but scripts break. Then you fix them, and they work for long periods of time. All too often a simple change in a screen means that a document no longer works. What's more, you'll correct the issue in the software, but not in the document. If you correct the script, it works from that point forward.

  • A key script is the check-out script: verify after you make changes that the changes are as expected.
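
    A minimal sketch of that idea in PowerShell (the server, object names, and expected settings below are invented for illustration):

        $server = 'SQLPROD01'

        # Did the new stored procedure actually arrive?
        Invoke-Sqlcmd -ServerInstance $server -Database 'Sales' -Query "
            SELECT name, modify_date
            FROM sys.procedures
            WHERE name = 'usp_LoadOrders';"

        # Is the configuration value we changed now in effect?
        Invoke-Sqlcmd -ServerInstance $server -Query "
            SELECT name, value_in_use
            FROM sys.configurations
            WHERE name = 'max degree of parallelism';"

        # Is the dependent service back up after the change window?
        Get-Service -ComputerName $server -Name 'SQLSERVERAGENT' | Select-Object Status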

  • I remember when things shifted from telling the user what to type on the command line to the graphical GUI. Going from simply "type this" to "left-click", "double-click", "right-click", etc. definitely complicated things. And then there were customers who reversed the left and right buttons on their mouse!

    My favorite gauge of a user's computer savvy was "dir *.exe". If they came back with "Bad command or file name", I knew from then on I'd have to say "dir space *.exe".

  • We used to push all of our changes and do lots of management through the GUI or by manually running scripts, running installers, copying bits, etc. We slowly (and sometimes painfully) moved towards automation for just about everything. We use Chef to set up our Windows VMs: the configurations, which features to install, how to configure them, and even what extra software to install and what scripts to use when installing it. We use SSDT to push our DB changes, which has been interesting at times, but very useful for making sure the target looks the way we want. We push our apps through Jenkins tasks.

    We had a lot of issues with our manual processes. The releases took hours, and we would regularly miss a step or two along the way - not all of the time, but far too often. Sometimes we'd forget to include a critical change or script because it had been released some time ago in some environment but not production. We rarely had a good rollback plan in place and tried to push through, which led to changes on the fly. After moving to scripted and automated releases we don't tend to have those issues. We push carefully through the environments and test to make sure we have everything along the way. That has made life much easier for our teams and our customers. We can usually even release during business hours while the system is in use, with little or no customer impact.

    For those who haven't gone through it - it's not an easy path, but you'll really appreciate the work in the future when you can resolve issues, release software, or fix server issues with minimal effort.
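
    For readers who haven't seen the SSDT side of this, the database push is typically a dacpac publish driven by SqlPackage.exe; the sketch below is illustrative only (paths, server and database names are placeholders, and a CI job such as Jenkins would normally supply them):

        # Publish an SSDT dacpac to a target database.
        $sqlPackage = 'C:\Program Files\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe'

        & $sqlPackage '/Action:Publish' `
            '/SourceFile:C:\Build\Sales.dacpac' `
            '/TargetServerName:SQLPROD01' `
            '/TargetDatabaseName:Sales' `
            '/p:BlockOnPossibleDataLoss=True'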
