Riding home for a quick lunch yesterday, I heard part of an NPR broadcast about David Hoffman’s book, The Dead Hand.  A correspondent for the Washington Post, Hoffman has chronicled Cold War relations between the then Soviet Union and the USA, to show us that we still have some serious work to do.

So, upon my return to campus, I ran to the library and grabbed the book.  I was taken with Hoffman’s description of Stanislav Petrov (he appears very early in the story), commander of a missile-attack early-warning station.  On 26 September 1983, he had to make a fateful decision.  Keep in mind that, back then, the Soviet Union and the USA had thousands of nuclear warheads aimed at each other.  The Soviets knew that American missiles could strike the Kremlin about 30 minutes after launch.  A decision to respond had to be made in minutes.

On this shift, in the middle of the night, Petrov had to respond to what appeared to be an American missile attack.  A siren went off.  The map on the wall lit up.  As post commander, the response would be his call.  Now, I’ll let Hoffman tell the story:

The board said “high reliability.”  This had never happened before.  The operators at the consoles on the main floor jumped up, out of their chairs.  They turned and looked at Petrov, behind the glass.  He was the commander on duty.  He stood, too, so they could see him.  He started to give orders.  He wasn’t sure what was happening.  He ordered them to sit down and start checking the system.  He had to know whether this was real, or a glitch.  The full check would take ten minutes, but if this was a real missile attack, they could not wait ten minutes to find out.  Was the satellite holding steady?   Was the computer functioning properly? (The Dead Hand, 10.)

Running down the list of options (an actual attack, an accidental launch, a technical glitch), he drew on his years of experience.  Something (his gut?) told him that this was not an actual attack, even though the system said it was.  He also had to draw on his moral courage.  And he did.  He reported to his crew: “This is a false alarm.”

Petrov had made a horrendously difficult strategic decision on the basis of ambiguous and confusing evidence.  He had to reach down for something more than calculational logic.  And this one decision averted what most likely would have become a devastating nuclear confrontation.

I’ve been thinking quite a bit lately about how large-scale problems get solved.  We can offer lots of helpful, sophisticated analysis.  But most of the time, it comes down to one person with the vision, strength, and moral courage to say and do the right thing at the right time.

This is a spiritual matter.  It still comes down to the kind of person you are, and I am.  By God’s grace, every day, let us be the kind of people who demonstrate character, wisdom, and moral courage.

It Still Comes Down to the Kind of Person You Are

2 thoughts on “It Still Comes Down to the Kind of Person You Are”

  • August 10, 2010 at 10:22 pm

    Good post, Steve. This is an important word to us as Christians. We have to trust in something (Someone) that may not line up quite so logically. It takes spiritual maturity to know when the illogical is actually the better path.

  • August 21, 2010 at 6:13 pm

    While American manufacturing may be down from its glory days, those who spend day and night manufacturing outrage, for one thing or another, have a booming trade. When we’re supposed to be outraged about nearly everything (a Mosque in NYC, the President, the Republicans, Wall Street, the Environment, etc.), the role of ignoring “false alarms” becomes more important in our moral lives.
