Jason Massie called me out for a new SQL Quiz that was the diabolical idea of Chris Shaw. The gist is to post two mistakes you've made in your database career. I'm normally a very cautious person when it comes to making changes, and I tend to be meticulous in these situations, so I don't have any catastrophic stories, but here are two that I definitely felt the pain from.
This is from back when I was primarily a web developer. I was working on a complete rewrite of an existing intranet portal, upgrading it from Access to SQL Server 7.0. Needless to say, I was doing a ton of T-SQL coding, and I was periodically taking backups of my work because this kind of development was relatively new for the organization and we didn't have a source control platform yet. After a marathon coding session I forgot to take a full backup. The next day I went in, something was wrong (I don't recall what anymore), and I grabbed the latest backup and restored... not thinking about the fact that it had been taken before all that code was written. That code was gone. About 7-8 hours' worth of data structures and stored procedures. *poof*
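The lesson I took away: before you restore over a database, take a fresh backup of what's there, even if you think nothing has changed since the last one. A minimal T-SQL sketch of the habit (the database name and file paths here are made up for illustration):

```sql
-- Take a safety backup of the CURRENT state first, so recent work
-- can't be lost if the backup you restore turns out to be older.
BACKUP DATABASE IntranetPortal
    TO DISK = 'D:\Backups\IntranetPortal_PreRestore.bak'
    WITH INIT;

-- Only then restore the backup you actually want.
RESTORE DATABASE IntranetPortal
    FROM DISK = 'D:\Backups\IntranetPortal_Full.bak'
    WITH REPLACE;
```

If the restore turns out to be the wrong call, the pre-restore backup gets you right back to where you were.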
Did I mention I was a contractor at the time? My mistake had cost 7-8 hours of work, and it had been an extremely productive 7-8 hours at that. Now, I could have just said, "Oh, well," and billed the company for the time it took to get back on schedule, but it was my mistake in the first place. My mistake, my time. So for the next 3 days I worked 2-3 hours "off the clock" catching back up. The good news was that since I had written the code once, and recently, I was able to rewrite it all faster than the first time around. At the end of the day, though, I still lost that time.
This one is more recent. We were doing a power down of a bunch of servers, and I automated the shutdown of our "admin" SQL cluster along with many of the other Windows systems. It went down smoothly. And I didn't think anything of it until the VMware admin yelled out over the cube walls, "Hey, what happened to Virtual Center?!?" And then it hit me: the Virtual Center database was on my nicely shut down cluster. Oops. That meant running down to the data center, restarting one of the nodes, verifying the DBs came online, calling the VMware admin on his cell, and saying, "Restart the service." Service restarted, and they were back in and able to begin shutting down the host ESX servers.

Now, I need to point out that with the older versions of ESX this isn't a concern, as you can SSH in and execute the full command set. However, the newest versions restrict what you can do unless you're at the console. You can override these settings, but why would you? Virtual Center is the answer, as it allows you to shut down without having to get to a console. It's also faster than logging on to every ESX server and issuing the shutdown command. The power down was on a timetable because we were redoing power to those racks and there were folks waiting on us. Total cost of the delay, from discovery of the problem to resolution, was about 15 minutes. We made our shutdown window by the skin of our teeth. Nothing like having your boss and your boss's boss looking at you for almost causing the whole timeline (which was tight to begin with) to slip.
I'll pass the torch on to: