**7. That the risk was foreknown and knowledge of it was transmitted**

In the case of the *Challenger* disaster, nothing should blind us to the point that the most senior engineer involved was keenly aware of the fatal risk and sent red-flagged warnings to this effect. That these warnings were not heeded is sometimes obscured by the "argument" that one cannot hold up actions on the basis of warnings, since every possible action carries risks and it is next to impossible to take note of every single warning that comes across one's desk. The claim that some warnings cannot realistically be noticed is known as the hypothesis of "weak signals",20 which is offered up as a rationale for why warnings cannot always be well noted.

This "weak signals" hypothesis is easily refuted when one considers the *source*, the *content* and the *form* of the signals. The *source* of the signal in this case was the project's most senior engineer, the person who knew the most about the O-rings. It was not a crank call made by a tourist at the space center's information desk. It was a warning from the inside, by an insider who knew the most about the very technology he was warning against.

The *content of the signal* warns of the danger being fatal to all aboard. There is no weakness in the content of the message. In Boisjoly's famous memorandum of July 31, 1985, he warned before the fact that if there were a launch, 'The result would be a catastrophe of the highest order – loss of human life.' In his earlier warning of July 22, 1985, he warned of a horrifying flight failure unless a solution were implemented to prevent O-ring erosion.21 There is no mincing of words here that might allow the danger to be understood as anything less than absolutely extraordinary. The consequences, in terms of life-and-death danger, are spelled out in detail. The specific risk factor is named. One could not possibly ask for a stronger signal.

<sup>18</sup> *Ibid*., p. 192. (emphasis his)

<sup>19</sup> *Ibid*., pp. 172-3. The author of this chapter was gratified to receive a hand-written letter from the famous Harvard sociologist David Riesman, author of *The Lonely Crowd*, commending the merits of this author's first review of Diane Vaughan's book, *The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA*, which was to come out in *Society*. Riesman was later prompted by this author's second review, which took a very different tack and appeared in *Business Ethics Quarterly*, to read her book, after which he wrote to this author to say that Vaughan's book was simply, in his words, 'a bad book'.

<sup>20</sup> The hypothesis of "weak signals" to describe the warnings of the failure of the O-rings was put forward by Diane Vaughan in her book, *The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA*.

Finally, the *form* of the message is red-flagged, indicating that the most serious attention must be given to it. There is no rational way that this warning can be construed as a "weak" signal. It is the strongest possible signal in all three respects: its source, its content, and its form. Moreover, the warning was multiple, not single: in Boisjoly's case there were two such red-flagged warnings, which further undercuts the claim that this was a lone signal that might have gone unnoticed.

It should be noted that this signal was not a one-time occurrence. *This signal was continuously sounded for eight years.* It was not a one-time message that could conceivably have been missed. There is a memorandum written in 1978 by John Q. Miller, Chief of the Solid Rocket Motor Branch, to the Project Manager, George Hardy, in which, referring to the Thiokol field joint design that was chosen, he writes that the design was so hazardous that it could produce ' … hot leaks and resulting catastrophic failure.'22 *It was known eight years prior to the Challenger launch that this design choice was a dangerously faulty one which could end in catastrophe!*

Regarding the form of the signals, it should also be noted that when it came to the timing of the launch, 14 managers and engineers alike *unanimously* voted against it.23 One could not conceive of a stronger signal than this. That this decision was overturned in a meeting of 4 managers (with no non-manager engineers present) does not take away from the fact that these 4 managers were fully aware of the previous signal of 14 unanimous votes against the launch.24 There is no possibility that the signals made were not the strongest possible signals that could be made. To refer to these warnings as "weak signals" is to turn a red light into a green light.

142 Risk Management – Current Issues and Challenges

Vaughan makes light of Feynman's famous demonstration with a piece of O-ring rubber in a glass of ice water during the televised Commission hearings and of his astonishment over the concept of 'acceptable risk'. In contrast, Boisjoly argued that Feynman's scientific experiment proved exactly what Feynman was trying to prove: rubber gets stiff at freezing temperatures. In Boisjoly's words, 'As the temperature decreases the sealing performance of the O-ring gets worse. At freezing temperature or below, it will get much worse. IT'S REALLY THAT SIMPLE.'18 Whose opinion should we accept: that of a Nobel laureate physicist and of the senior engineer who knew the most about the O-rings, or that of a sociologist with a Master's Degree who is evaluating the validity of a scientific experiment carried out by a Nobel laureate physicist, the validity of which is confirmed by a rocket scientist?19

These three elementary characteristics of warning signals, their source, their content and their form, illustrate that the risk was foreknown and that knowledge of it was transmitted. To focus on risky technology is to put the cart before the horse. Risky choice is the horse: it pulls the technology along behind it. Without the horse, the cart would not move. Technology has no power on its own. It is the servant of decision making.

<sup>21</sup> *Op. cit*., p. 170.

<sup>22</sup> *Ibid*., p. 151.

<sup>23</sup> *Ibid*., pp. 174, 195.

<sup>24</sup> *Ibid*., for an extended analysis of the unsound and unethical decision-making process engaged in by the four middle managers.

**8. Two straw men**

It should be emphasized that the Committee on Shuttle Criticality Review and Hazard Analysis did not focus on the life-and-death consequences to the principals involved in taking the risk of launching the Challenger, but rather on the general, abstract case of space flight as an opportunity for learning about the universe. In short, the Committee examined a "straw man" and not the real case in front of it.

In the case of the red-flagged warnings regarding the launching of the *Challenger*, the consequences were not those marked out by the Committee on Shuttle Criticality Review and Hazard Analysis. The consequences marked by that Committee were outlined in terms of the risk to be accepted in general space exploration. The implication is that those who accepted such a risk were space explorers and that they were fully aware of the risk they were taking. Neither of these implications is accurate.

On the Very Idea of Risk Management: Lessons from the Space Shuttle *Challenger* 145

The first straw man was to examine the case as if all aboard were astronauts. This assertion was contrary to fact. Not every person aboard the shuttle was an astronaut. In addition to the five astronauts, one of whom, Judith Resnik, was a woman, there were two passengers: one, Christa McAuliffe, a junior high and high school teacher and a 37-year-old mother of two, and the other, Greg Jarvis, an engineer from Hughes Aircraft (not a member of the Astronaut Corps) who had been given a ride on a space flight as a prize for winning a company competition. Mrs. McAuliffe was scheduled to deliver a nationwide "lesson from space" called the "Ultimate Field Trip" to the nation's school children. She was also supposed to receive a telephone call from President Reagan in mid-flight.

Part of the consequences, then, was the risk of the death of two civilians who were there not to operate the spacecraft, but one to act as a Teacher in Space and the other to claim his contest prize. Both civilians were given the deceptive, camouflage designation of "payload specialists," which implied crew responsibilities that did not exist. For cover, Mr. Jarvis was given the task of conducting a fluid dynamics experiment and Mrs. McAuliffe was to videotape six science demonstrations of the effects of weightlessness.25 The Teacher-in-Space portion of the mission was to be featured in President Reagan's State of the Union address on the evening of Tuesday, January 28, though the White House later denied it.

The second straw man argument is to assume that knowing the risk of space flight in general was equivalent to knowing the risk of a launch under weather conditions that were known to be unsafe with the O-ring technology in use. Again, neither the astronauts nor the two civilians aboard were made aware of the risk that they were taking. They may have been aware of the existential, general, unknown risk of space exploration, but they were not aware of the specific, needless risk they were actually committing to take by being launched into space with a known, faulty technology.26 Indeed, would not the inclusion of the schoolteacher Christa McAuliffe, who had been given to understand that the launch was safe, create the vivid impression for all aboard, and for the public at large, that this was a very safe flight? The inclusion of non-essential personnel on the shuttle caused the genuine risk being taken to be perceived as minimal around the world.

Was Mrs. Christa McAuliffe made aware of the risk that she was taking? Grace Corrigan wrote:

With respect to the Challenger launch, ' … Christa felt no anxiety about the flight. 'I don't see it as a dangerous thing to do,' she said, pausing for a moment. 'Well, I suppose it is, with all those rockets and fuel tanks. But if I saw it as a big risk, I'd feel differently.'27

It was not only the case that Christa felt no pre-flight fears. She was never even informed that there was any real problem about which she should have been informed. Corrigan relates Christa McAuliffe's account of what the President and the pilot from a previous launch told her in the White House:

*'They were told about the dangers of the space program. She said that one could be intimidated thinking of all that he had said until you realize that NASA employed the most sophisticated safety features, and they would never take any chances with their equipment, much less an astronaut's life.'*

When interviewed by *Space News Roundup*, she said that, 'When the Challenger had the problem back in the summer with the heat sensors on the engine … and … one of Boston's papers called me and asked me what I thought was wrong, … I said, 'I have no idea. What has NASA said?'28

It is obvious that Christa McAuliffe was not informed of any O-ring faults and the consequent life-and-death risk she was committed to taking by her participation. Is the statement from the Committee on Shuttle Criticality Review and Hazard Analysis, that 'The risks of space flight must be accepted by those who are asked to participate in each flight,' even relevant when no one has been informed about the specific risks to which this particular flight will be prone? The fallacy of conflating the general, unknown risks of space flight with the specifically known risks of this flight makes "risk management" here into an unethical practice. There is, strictly speaking, no risk management being practiced. There is simply wanton *risk taking* with human life.

**9. Subjective judgment versus performance data**

The Nobel laureate physicist Richard Feynman was shocked to learn that NASA management had claimed that the risk of a launch crash was 1 in 100,000, a figure arrived at through subjective engineering judgment without reliance on any actual past performance data. If one calculated the risk from actual past performance data, the risk was, according to Professor Feynman, 1 in 100.29 While management, in defending its decision to launch, pointed to a risk of 1 in 100,000, there was no examination of how that figure had been generated. If one takes the actual performance data of rocket engines in the past, as Feynman did, the risk was far greater. One can then more clearly weigh the possibility of incidence against the actuality of consequence. Does one wish to risk the lives of the astronauts and the civilians when the chance of their death is 1 in 100?

<sup>25</sup> Richard C. Cook, *Challenger Revealed, An Insider's Account of How the Reagan Administration Caused the Greatest Tragedy of the Space Age*, New York: Avalon, 2006, pp. 177-8.

<sup>26</sup> *Op. cit*., p. 156.

<sup>27</sup> Grace George Corrigan, *A Journal for Christa, Christa McAuliffe, Teacher in Space*, Lincoln and London: University of Nebraska Press, 1993, pp. 115, 118.

<sup>28</sup> *Op. cit*., p. 185.

<sup>29</sup> *Ibid*., p. 183.
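The arithmetic behind the contrast in section 9 between management's subjective figure (1 in 100,000) and Feynman's data-based figure (1 in 100) can be made vivid with a short calculation. The sketch below is an illustration added to this discussion, not drawn from the chapter's sources; the mission counts of 25 and 100 flights are assumptions chosen only to show how quickly the two estimates diverge.

```python
# Compare the two per-flight risk estimates discussed in section 9:
# NASA management's subjective figure (1 in 100,000) versus
# Feynman's estimate from actual past performance data (1 in 100).
# Illustrative sketch only; the flight counts below are assumed.

def prob_at_least_one_failure(per_flight_risk: float, n_flights: int) -> float:
    """Probability of at least one catastrophic failure in n independent flights."""
    return 1.0 - (1.0 - per_flight_risk) ** n_flights

management_estimate = 1 / 100_000  # subjective engineering judgment
feynman_estimate = 1 / 100         # based on past performance data

for n in (25, 100):  # hypothetical numbers of missions
    p_mgmt = prob_at_least_one_failure(management_estimate, n)
    p_data = prob_at_least_one_failure(feynman_estimate, n)
    print(f"{n} flights: management estimate {p_mgmt:.2%}, "
          f"performance-data estimate {p_data:.2%}")
```

On these assumptions, the data-based estimate puts the chance of at least one catastrophic loss over 100 flights at roughly 63 percent, while management's figure puts it at about one tenth of one percent: the difference between a near coin flip and a negligible hazard. This is the force of Feynman's insistence that risk be computed from performance data rather than subjective judgment.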
