SQLServerCentral Editorial

Does Speed Compromise Quality?


This editorial was originally published on Apr 11, 2017. It is being re-run as Steve is out of town.

One of the most hyped parts of DevOps is the speed and frequency of releases. From Flickr and their 10 deployments a day to Etsy deploying 50 times a day, we've seen companies showcase their deployment frequency. Amazon has reported they deploy code every 11.7 seconds on average. That seems crazy, but with so many applications and lots of developers, not to mention each change being smaller, perhaps it's not completely crazy. With the forum upgrade here at SQLServerCentral, we had two developers (with occasional other sets of eyes reviewing code changes), and while we were bug fixing, we deployed multiple times per day.

Is that a good idea? Does a rapid set of changes mean that quality is lower and more bugs are released? It certainly can. If you're a development shop that struggles with releases and code quality, producing software faster is not going to help you. If management pressures you to adopt DevOps and deliver code faster without a culture change, without automated testing (including for your database code), and without automated scripts or tools to deploy software, then you are going to get more bugs out faster. You'll still get to change direction quicker if you find you're building the wrong software, but you'll also end up less efficient because of bugs (and technical debt).

There's a fantastic (long) video about refactoring code in two minutes. A bit of an oxymoron, since the presentation is nearly two hours long, but it comes from a real project. The presenters' approach is that good unit testing allows them to refactor code, to change things, without introducing bugs. That's a big part of the DevOps philosophy. I always note in my DevOps presentations that if you can't implement unit testing, meaning you won't bother, then you don't get much benefit from CI, CD, or any other DevOps ideas. Tests protect you from yourself (and others).
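As a concrete illustration of what that looks like for database code, here's a minimal sketch using the tSQLt framework. The dbo.OrderLines table and dbo.GetOrderTotal function are hypothetical examples, not from any real project; the point is that FakeTable isolates the test from existing data, so the test can run anywhere.

-- A minimal tSQLt sketch; dbo.OrderLines and dbo.GetOrderTotal are hypothetical
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test GetOrderTotal sums line items]
AS
BEGIN
    -- Replace the real table with an empty fake so the test controls its own data
    EXEC tSQLt.FakeTable 'dbo.OrderLines';
    INSERT INTO dbo.OrderLines (OrderId, Amount)
    VALUES (1, 10.00), (1, 5.50);

    DECLARE @actual MONEY = dbo.GetOrderTotal(1);

    -- Fails the test (and the build) if the function's behavior ever changes
    EXEC tSQLt.AssertEquals @Expected = 15.50, @Actual = @actual;
END;
GO
-- Run every test in the class; a CI server can call this after each deployment
EXEC tSQLt.Run 'OrderTests';

With a suite of tests like this, a refactor that breaks the function fails the build before it ever reaches production. That's what lets teams move fast without fear.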

In many of the DevOps reports, companies that release faster report fewer bugs and less downtime. Since Amazon increased their deployment speed, they have had 75% fewer outages over the last decade, 90% less downtime, and far fewer deployments causing issues. TurboTax made over 100 production changes during tax season and increased their conversion rates. The State of DevOps reports bear this out (2016 here). Thousands of responses show that speed doesn't cause more bugs.

Why? Because these companies work differently.

If your management won't let you change the way you work, if you don't implement automated unit tests (and other types of tests), if you don't take advantage of version control, if you don't ensure every change is scripted, then you won't work differently, and speed will bring bugs.
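For example, "scripted" doesn't have to mean a heavyweight toolchain; it can be as simple as making every change rerunnable and checking it into version control. A minimal sketch (the dbo.Customers table and Email column are hypothetical):

-- An idempotent change script: safe to run on any environment, any number of times
IF NOT EXISTS (
    SELECT 1 FROM sys.columns
    WHERE object_id = OBJECT_ID('dbo.Customers')
      AND name = 'Email'
)
BEGIN
    ALTER TABLE dbo.Customers ADD Email NVARCHAR(256) NULL;
END;
GO

Checked into version control, that one file is the change, the documentation, and the deployment.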

You can do better. Your company can do better. Will they?
