SQLServerCentral is supported by Redgate

Observe Carefully When Things Are Running Fine

Cross-posted from The Goal Keeping DBA blog:

As a worker in information technology, I’ve always been a big proponent of studying systems when they are working, and I’ve often said that doing so helps me immensely when troubleshooting. However, I had never studied why that’s true. There’s a scientific explanation, covered at a layman’s level in How We Decide by Jonah Lehrer. It explains why Lt. Cmdr. Michael Riley was able to make the right decision in a critical situation with lives on the line. If you aren’t familiar with Riley, here’s what happened, taken from How We Decide:

During the ground invasion of Desert Storm, Lt. Cmdr. Riley was a radar officer on a British vessel providing fleet defense for the US battleship USS Missouri. He saw a radar blip that didn’t look right. The problem he faced was that he didn’t have enough information to determine whether it was an Iraqi missile, a Silkworm, or a returning US attack aircraft, an A-6. The radar contact followed the same route returning A-6s took on their way back to the carriers. And the A-6 pilots had a bad habit of not re-enabling their IFF (Identification, Friend or Foe) systems once they cleared a certain point; they kept them off because an active IFF could make it easier for Iraqi anti-air batteries to locate and target the aircraft. Riley was in a quandary. To make matters worse, the one radar system that might have settled the question, one that measured altitude, was down.

Riley had to make a decision, and he didn’t have enough information. His intuition told him it wasn’t an A-6. If he fired and was wrong, two American pilots would die. If he didn’t fire and was wrong, a Silkworm would slam into the USS Missouri and the loss of life would be even greater. Either way, if he was wrong, people died. He went with his intuition and ordered the launch of missiles to intercept. They did their job, and whatever it was was shot down. Then came the waiting. The scene investigation proved him right: he had shot down a Silkworm missile. The question was, “How did he know?” He couldn’t explain it. The investigation that followed couldn’t pinpoint a reason either, and the conclusion was that Lt. Cmdr. Riley had just “gotten lucky.”

Enter a behavior specialist a few years later, one who had worked with the US Marine Corps. He studied all the data, and he found how Riley knew. Because the Silkworm flew lower than the returning A-6s, it stayed hidden in ground clutter for a period of time after entering the sweep range of the ship’s radar. The A-6s were picked up about 8 seconds earlier than the Silkworm had been. Riley didn’t consciously recognize this, and the initial investigators didn’t pick up on it either. Riley’s brain, however, had registered it subconsciously. That’s what gave him the intuition that the radar hit wasn’t an A-6, and that’s why he made the correct decision.

There’s a part of our brains that learns what “normal” looks like; it trains itself on expected behavior, and Riley’s brain had trained itself on the behavior of the returning A-6s. The reason our brains do this is so that when something isn’t right, they can detect that something unexpected is out there and immediately alert us. It’s easy to see how helpful that would be in a survival situation. That’s what happened with Lt. Cmdr. Riley: his brain had learned the regular pattern of the A-6s, so when a different pattern showed up, it altered his emotions and triggered his intuition. That’s how he knew to make the right call.

This is why it’s important to constantly look at our systems and study them when everything is going fine, and it explains why the best troubleshooters in IT insist on observing their systems in that state. They train their brains to see what the system looks like, and what the diagnostics read, when things are fine. That way, when things aren’t, they are immediately drawn to the diagnostics that point to the problem. We can apply this to many areas of our lives, not just computers. Make it a point to study things when everything is right. You’ll be training your brain to detect when things aren’t, you’ll spot the problem before others might, and that will set you apart.
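In monitoring terms, the habit described above is baselining: record what a metric looks like while the system is healthy, then flag readings that fall well outside that range. Here is a minimal sketch in Python; the metric values are hypothetical, and a three-standard-deviation threshold is just one common, assumed choice:

```python
import statistics

def build_baseline(samples):
    """Summarize 'normal' readings of a metric (e.g. CPU %, batch requests/sec)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the baseline mean."""
    mean, stdev = baseline
    return abs(value - mean) > threshold * stdev

# Hypothetical readings collected while the system was known to be healthy
healthy = [41.0, 39.5, 40.2, 42.1, 40.8, 39.9, 41.5, 40.4]
baseline = build_baseline(healthy)

print(is_anomalous(40.7, baseline))  # False: within the normal range
print(is_anomalous(95.0, baseline))  # True: an outlier worth investigating
```

The point mirrors the Riley story: the detector is only as good as the time spent observing the system while it was fine.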

K. Brian Kelley - Databases, Infrastructure, and Security

IT Security, MySQL, Perl, SQL Server, and Windows technologies.


Posted by chuck.hamilton on 14 January 2013

Interesting story and very true. Let me share something similar.

How do you teach someone to recognize the difference between real money and counterfeit money? You get familiar with the look and feel of real money. A counterfeit bill will then be immediately recognizable. You probably won't know why, but your instinct tells you it's fake.

Posted by GeorgeCopeland on 14 January 2013

The quality and relevancy of the data being observed is critical.
