Most teams building software seem to move a little too fast to ensure their code is both secure and of high quality. I don't think it matters whether you work in a waterfall process, agile, lean, or any other methodology. Whether fast or slow, humans make mistakes, and new code can introduce a vulnerability. Even if you follow great practices, hackers and criminals seem to find new attack vectors all the time. I'm not sure we can go slowly enough and stay in business.
Those of us working as data professionals know that protecting the data in our databases is important. We are reluctant to allow too much change too quickly, especially when there might be changes that affect security. However, is limiting change the best idea?
I'd argue no. DevOps preaches the ability to update on demand, often as soon as code is complete. This doesn't mean we skip testing, pen testing, security scans, or anything else. It does try to limit the work in progress, which means we aim to allow regular updates to our live systems.
An article for CIOs notes that DevOps helps us improve security, precisely because we can fix things quickly. This might be especially important in high-security environments, like government systems. The ability to patch, correct faulty code immediately, and respond to threats is important. There could be breakage from fast-moving code, but another part of DevOps is improving your knowledge and skills, working to raise not only the quality of existing code, but also the quality of all future code as it is first written.
I would rather manage database systems that back applications being updated on demand in a DevOps flow. I'd rather be able to patch and update libraries, platforms, and frameworks quickly. The Equifax breach showed us the problems with systems that aren't kept up to date. We should learn from that incident and ensure we can patch and update systems on demand, whenever we need to do so.