Sunday 1 December 2013

Nuclear Launch Codes and an Idiot's Luggage: The Danger of False Assumptions

I saw a fascinating article this week by Karl Smallwood on the website www.TodayIFoundOut.com. Apparently for 20 years, the Minuteman nuclear silos all had the same launch code of 00000000. Truth really is stranger than fiction.

It reminds me of the Spaceballs scene where the king is coerced into revealing his secret combination of 12345. Rick Moranis' character shouts back, "That's the stupidest combination I've ever heard in my life! That's the kind of thing an idiot would have on his luggage!" Apparently the luggage of idiots was about as secure as the American nuclear arsenal for many years.

The launch code was revealed in 2004 by a former Minuteman launch officer. Apparently this passcode requirement was put in place by JFK to prevent a rogue general from starting a nuclear war on his own. The passcode was backed by a security system that could not be hot-wired or otherwise bypassed -- without the correct code, the missile simply could not launch. However, that entire safeguard was undermined by generals who ignored the president's order and required the passcodes to be reset to all zeroes after each inspection, to ensure the missiles could be launched quickly if the Russians ever fired first.

I experienced a similar situation at a large organization that had very tight security on its enterprise data warehouse, although unlike the generals in charge of missiles, this was an unintentional breach of security policy. Because a portion of the data warehouse contained sensitive client information, separate passwords were issued only to those employees who had a justifiable business need to access the data. The complex password could not be changed by the user, and it expired after a few months, so the user had to repeat the application process to prove a continuing business need. It was a fairly complex security system that was completely undermined by a single failed assumption.

A new security analyst was transferred from the helpdesk to the data warehouse team to issue passwords, and he thought the process was the same as at the helpdesk: issuing temporary passwords to users who had forgotten theirs. No manager got around to telling him what his new role actually entailed. When I got my data warehouse password, I was shocked to see it was abc123. I soon found out that everyone in the department had the same password and that this had been happening for a few weeks already. Because no user liked the old complex passwords they couldn't remember, everyone was thrilled with the new simple password, and no one had any incentive to notify the data security manager of the obvious flaw in the process. Despite very strict data security processes and policies, sensitive client data had been made as secure as an idiot's luggage.

What's the lesson? 

It is so easy to make assumptions that will undermine the greatest of plans. Everyone assumed that generals would always obey an order from the president. That turned out to be a false assumption. Everyone assumed that each new data security analyst would be trained in the process of issuing new data warehouse passwords. On one occasion that turned out to be a false assumption. Simple human failures can undermine even the most complex security processes when assumptions are not identified and tested.

I've designed data processes for many years, and I've found it a helpful discipline to ask myself three questions:

  1. What assumptions have I made this time?  
  2. If any of those assumptions turn out to be false, will that have a significant effect on my process?
  3. If the effect is significant, can I build control processes to identify when those assumptions are false?

For example, in the data warehouse password situation above, the assumption was that the person issuing passwords would be trained on the procedures for doing so. If that assumption is false, the effect is very significant. Therefore a simple control process would be to generate a report each day of all newly issued passwords. One glance at such a report would have shown that all the passwords were suddenly identical and that an urgent problem existed.
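To make that concrete, here's a minimal sketch of such a daily control report in Python. The log format, field names, and the `audit_issued_passwords` function are all hypothetical, and a real report would compare password hashes rather than the passwords themselves:

```python
from collections import Counter
from datetime import date

# Hypothetical daily control report: flag when newly issued passwords
# look suspiciously uniform. The log structure below is assumed for
# illustration; compare hashes, never plaintext passwords.

def audit_issued_passwords(issued, today):
    """Return warnings if passwords issued today are identical."""
    todays = [row["password_hash"] for row in issued
              if row["issued_on"] == today]
    if not todays:
        return []
    most_common_hash, count = Counter(todays).most_common(1)[0]
    if count > 1:  # any duplicate among supposedly unique passwords
        return [f"{count} of {len(todays)} passwords issued on {today} "
                f"share hash {most_common_hash[:8]}...: "
                f"investigate the issuing process."]
    return []

# Example: three users all issued the same password on the same day.
log = [
    {"issued_on": date(2013, 12, 1), "password_hash": "6367c48d"},
    {"issued_on": date(2013, 12, 1), "password_hash": "6367c48d"},
    {"issued_on": date(2013, 12, 1), "password_hash": "6367c48d"},
]
for warning in audit_issued_passwords(log, date(2013, 12, 1)):
    print(warning)
```

Even a check this crude would have surfaced the abc123 problem on day one instead of weeks later.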

The challenge is that answering question #1 is incredibly difficult. We simply aren't used to identifying our assumptions -- they're automatic. One way to get around that is to skip question #1 entirely and rephrase question #2 to be:

    2. If my process fails, will that have a significant effect on the business?

If yes, then build some control processes to identify process failures as early as possible. The failed assumptions will become clear later once problems are encountered.
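As a rough illustration, a control process of this kind can be as simple as comparing each day's output against recent history and raising an alert on a large deviation. The sketch below is hypothetical -- the row-count metric, the 50% tolerance, and the `check_process_output` function are assumptions for illustration, not a prescription:

```python
import statistics

# Hypothetical control process: compare today's output metric (here, the
# row count of a daily load) against recent history and alert on a large
# deviation. The tolerance is an arbitrary illustrative threshold.

def check_process_output(history, today, tolerance=0.5):
    """Return an alert if today's count strays too far from the recent mean."""
    if len(history) < 5:
        return None  # not enough history to judge what "normal" looks like
    mean = statistics.mean(history)
    if mean == 0:
        return "Recent mean is zero: the process may never have produced output."
    deviation = abs(today - mean) / mean
    if deviation > tolerance:
        return (f"Today's count {today} deviates {deviation:.0%} from the "
                f"recent mean {mean:.0f}: possible process failure.")
    return None

# A day with zero rows loaded trips the alert before any user notices.
alert = check_process_output([10500, 9800, 10200, 10100, 9900], today=0)
if alert:
    print(alert)
```

The point isn't the specific check; it's that any cheap, automatic comparison against "normal" gives a failed assumption somewhere to show itself early.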