**6. Risky technology versus risky management**

The *Challenger* disaster makes it an ideal case study for the purposes of examining the concept of risk management.

In the classic case of the space shuttle *Challenger*, two years after the horrific event, when the official U.S. government Committee on Shuttle Criticality Review and Hazard Analysis examined the risk that had been taken in launching the *Challenger*, the Chairman of that committee wrote in the very first paragraph of chapter four of its report, entitled 'Risk Assessment and Risk Management':

*Almost lost in the strong public reaction to the Challenger failure was the inescapable fact that major advances in mankind's capability to explore and operate in space – indeed, even in routine, atmospheric flight – will only be accomplished in the face of risk.* 

And, later, in the body of that same report, the Committee wrote: 'The risks of space flight must be accepted by those who are asked to participate in each flight …'12

It is rather easy to spot the fallacy being made in this case. It is the fallacy of equating a general unknown risk (space flight in general) with a specifically foreknown fatal risk (the flawed design of the O-rings, about which the senior engineer Roger Boisjoly had issued red-flagged warnings).13 If the Committee were pressed on this matter and replied that they were aware that there was some risk in the use of the technology employed at the time (the hazardous O-rings), what the Committee on Shuttle Criticality Review and Hazard Analysis would then seem to be saying is that it was justifiable to employ the 'risky technology' of the day because, in order to make progress in space exploration, one was required to take chances with very risky technology. Is this actually the case? Was it necessary to take cavalier chances with risky technology in order to make progress in the arena of space exploration? The implication is that the risk would have to be taken, and the "management" of the risk would seem to amount to something on the order of crossing one's fingers and hoping that nothing would happen.

<sup>12</sup> Preface by Alton Slay, Chairman, Committee on Shuttle Criticality Review and Hazard Analysis, *Post-Challenger Evaluation of Space Shuttle Risk Assessment and Management*, Washington, D.C.: National Academy Press, 1988, p. v; p. 33.

<sup>13</sup> Robert Elliott Allinson, *Saving Human Lives, Lessons in Management Ethics*, Dordrecht: Springer, 2005, p. 138.

138 Risk Management – Current Issues and Challenges

One can argue, fallaciously, that whenever an astronaut goes into space, that astronaut is subject to the general, unknown, universal-style risk of space travel. This, however, is not comparable to an astronaut going into space equipped with full knowledge of the existence of a real, specific and pre-existing mechanical fault that could be potentially fatal to her or his spacecraft. It is only on the latter kind of risk, known fully to those taking it, that a discussion of risk management should focus. Such was not the case with the astronauts and civilian passengers of the U.S. space shuttle, *Challenger*.

In the case of the *Challenger* launch, the overwhelming evidence has revealed that the astronauts were completely unaware of the specific dangers that the O-rings posed. According to Malcolm McConnell, the science reporter, no one in the astronaut corps had been informed of any problem with the SRB field joints.8 According to Charles Harris, Michael Pritchard and Michael Rabins, authors of *Engineering Risks, Concepts and Cases*, '… no one presented them [the astronauts] with the information about the O-ring behavior at low temperatures. Therefore, they did not give their consent to launch despite the O-ring risk, since they were unaware of the risk.'

Claus Jensen writes in *No Downlink, A Dramatic Narrative about the Challenger Accident and Our Time* that when the Rogers Commission summoned a group of space shuttle astronauts, 'During this session, the astronauts reiterated that they had never been told about the problems with the solid rocket booster.' And, in private correspondence with the present author, Roger Boisjoly, the late senior scientist and the engineer who knew the most about the O-rings, wrote, 'I KNOW for a FACT that the astronauts on Challenger did NOT KNOW about the problem with the O-rings at temperatures below 50 degrees F.' (emphasis his)9 According to Richard Lewis' book, *Challenger, The Final Voyage*, 'Along with the general public, the astronauts who were flying the shuttle were unaware of the escalating danger of joint seal failure. So were the congressional committees charged with overseeing the shuttle program. NASA never told them that the shuttle had a problem.' Later, in the same work, Lewis pointedly quotes from the Presidential Commission report:

*Chairman Rogers raised the question of whether any astronaut office representative was aware [of the O-ring problem]. Weitz [an astronauts' representative] answered: "We were not aware of any concern with the O-rings, let alone the effect of weather on the O-rings."*10

Despite the very clear declarations above, nowhere in the 575 pages of Diane Vaughan's book, *The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA*, is it ever mentioned that the astronauts and the civilians were not informed of the O-ring dangers.11 By never mentioning this crucial point of information, her book leaves one with the impression either: (i) that the astronauts knew about the risk they were taking and

<sup>8</sup> Robert Elliott Allinson, *Saving Human Lives, Lessons in Management Ethics*, Dordrecht: Springer, 2005, p. 156.

<sup>9</sup> *Ibid*., p. 184. <sup>10</sup> *Ibid*., pp. 184-5, 187. <sup>11</sup> *Ibid*., p. 187.

When one examines the *Challenger* case more closely, one discovers that there was no need to choose the technology that was chosen in the first place. The risky technology in this case was the engineering design of the O-rings. Of four designs submitted, the one chosen was the least safe (and the cheapest). This choice was a case of risky judgment on the part of the managers: it was their selection of this design, and not any obligation to employ a risky design in order to venture into space, that was an initiating cause of the *Challenger* disaster. Why was this risky design chosen? Because financial cost factors, read profit, were given priority over safety. This decision to place cost ahead of safety is an example of ethical misjudgment. Thus, we may consider the decision to choose this unsafe design a case of management malpractice. There was no necessity to choose this particular design. Alternative designs were available.

The O-ring design of giant rubber gaskets keeping combustible hot gases from leaking out in actual fact ranked fourth out of four submitted engineering designs, and, according to an important article co-written by Trudy Bell, Senior Editor of the engineering journal *IEEE Spectrum*, and Karl Esch, the selection of this design was the chief cause of the *Challenger* disaster.14 For the next space flight, this design was replaced with the safest design, demonstrating that the safest design could have been chosen in the first place and that it was economically feasible to have done so.

There was also no need to choose a design for the space shuttle that did not include an abort system, that is, explosive bolt hatches, space pressure suits and a parachute descent system. That it did not include such a safety system was a matter of policy, not impossibility, since earlier spacecraft had been equipped with launch escape systems. Again, nowhere, in any place, in the 575 pages of Ms. Vaughan's *The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA*, is it ever mentioned that there was no necessity to omit an escape system for the crew and passengers. Omission of this detail can readily be regarded as ethically unpardonable. It implies that there was no way to save the lives of the crew and passengers when in fact *they were indeed alive when the spacecraft broke apart* (the space shuttle did not explode, as was popularly reported – the spectacular image of the smoke was due to the chemical reaction of hydrogen with oxygen). The astronauts were conscious as they hit the ocean at the tremendous impact that caused their death (some of the crew had actually activated and used their emergency air packs). By omitting this incredibly important fact in her volume, Vaughan leaves one with the impression that the life-and-death risk that was taken was one that could not possibly have been prevented!15 That it could, indeed, have been prevented places an entirely different perspective on the kind of unnecessary, and therefore incredibly unwarranted, risk that the astronauts were forced to take. *The Challenger space shuttle astronauts never needed to take a risk with their lives. The risk did not exist: it was created by risky and unethical management decisions.* The false impression created by interpreting the plume of smoke to mean that the space shuttle *Challenger* had exploded blinds one to the fact that the crew compartment had separated from the orbiter, and thus the lives of all passengers could have been saved with a parachute descent system. There was no need to take a risk with their lives! The continuing belief that the shuttle exploded continues to play its role in veiling the real issue of the ethics of risky decision making.

<sup>14</sup> Robert Elliott Allinson, 'Risk Management: demythologizing its belief foundations,' *International Journal of Risk Assessment and Management*, Volume 7, No. 3, 2007, p. 302.

<sup>15</sup> *Op. cit*., pp. 188-189.

On the Very Idea of Risk Management: Lessons from the Space Shuttle *Challenger* 141

The above examples make it clear that the problem of risk lies in the *choices* to be made by the risk takers, or, more precisely, the risk decision makers, not in the *management* of risk already taken. By focusing on the term 'management', one takes for granted that a risk must exist in the first place and needs only to be managed. It is not even clear in this case how the risk was managed, unless crossing one's fingers counts as management. In the case of the O-rings, there was no need for this risk to exist in the first place.

One can look into the matter in more detail. Suffice it to say for the present discussion and analysis that there were two levels of risk taking that were matters of choice, not management. The first level was the choice of design. The choice of design could have been altered to prevent the breaking apart of the space shuttle and the death of the crew and passengers. It was design *choices* that determined the *cause* of the disaster and the *fatal consequences* of the disaster. The second level was the choice to fly under weather conditions that heightened the risk involved. These were both management choices or decisions. The focus should be on risky management, in the sense that there was risky choice and risky decision-making, not on the technology involved, because the risky technology only came into being in the first place because of these two very risky management choices: the choice of technology and the choice of launch timing.

There has been much erroneous discussion of the technical issues of the temperature and the O-rings and, as a result, massive misconceptions have created complicated layers of confusion around the issue.16 Diane Vaughan's book, *The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA*, has been responsible for generating much of this confusion. Note the first phrase in her sub-title: 'Risky Technology'. The clear implication is that the fault lay not in the decision to employ such technology but in the problem of having to rely upon risky technology.

According to Roger Boisjoly, the senior scientist at NASA and the one who knew the most about the O-rings, she routinely mixed up data relating to field joints and nozzle joints and did not have a clue about the difference between the two joints.17 Diane Vaughan is dismissive of Professor Feynman's famous gesture of dipping a piece of an O-ring into a glass of ice water during the televised Commission hearings, and of his astonishment over the concept of 'acceptable risk'. In contrast, Boisjoly argued that Feynman's scientific experiment proved exactly what Feynman was trying to prove: rubber gets stiff at freezing temperatures. In Boisjoly's words, 'As the temperature decreases the sealing performance of the O-ring gets worse. At freezing temperature or below, it will get much worse. IT'S REALLY THAT SIMPLE.'18 Whose opinion should we accept: that of a Nobel laureate physicist and of the senior scientist who knew the most about the O-rings, or that of a sociologist with a Master's degree who is evaluating the validity of a scientific experiment carried out by a Nobel laureate physicist, the validity of which is confirmed by a rocket scientist?19

<sup>16</sup> The mass of confusion surrounding the technical issues is, to the best of this author's knowledge, discussed in proper factual detail in the two chapters on the *Challenger* disaster in the author's *Saving Human Lives, Lessons in Management Ethics* (now in Kindle and Google Books). The author is indebted to extensive private correspondence with Roger Boisjoly for clearing up the confusion about the technical issues that made the *Challenger* issues murky and seemingly resistant to plain understanding. To the best of the author's knowledge, such clear explanations as Boisjoly gives, sorting out the mistakes in previous analyses of the *Challenger* disaster such as Diane Vaughan's confused studies of the issue, exist in print only in the author's *Saving Human Lives*. *Cf*., especially Chapter Seven, 'The Space Shuttle Challenger Disaster,' and Chapter Eight, 'Post-Challenger Investigations', pp. 107-197.

<sup>17</sup> *Saving Human Lives*, pp. 192-3 *et passim*.

To focus on risky technology is to put the cart before the horse. Risky choice is the horse: it pulls the technology along behind it. Without the horse, the cart would not move. Technology has no power on its own. It is the servant of decision making.

**7. The three characteristics of warning signals**

We can point to three elementary characteristics of warning signals to illustrate that the risk was foreknown and that knowledge of it was transmitted.

… the highest order – loss of human life.' In his earlier warning of July 22, 1985, he warned of a horrifying flight failure unless a solution were implemented to prevent O-ring erosion.21 One can readily see that there is no mincing of words to minimize the possibility that the danger might be understood to be less than absolutely extraordinary. The consequences, in terms of life-and-death danger, are spelled out in detail. The specific risk factor is named.

Finally, the *form* of the message is red-flagged to indicate that the most serious attention must be given to it. There is no rational way that this warning can be construed as a "weak" signal. It is the strongest possible signal in three respects: its source, its content, and its form. Since the warning was multiple and not single, the fact of multiple warnings should further make the case that this was not a single, and therefore possibly unnoticed, "weak" signal. In the case of Boisjoly's warnings, there were two such red-flagged warnings.

It should be noted that this signal was not a one-time occurrence. *This signal was continuously sounded for eight years.* It was not a one-time message that could conceivably have been missed. There is a memorandum written in 1978 by John Q. Miller, Chief of the Solid Rocket Motor Branch, to the Project Manager, George Hardy, referring to the Thiokol field joint design that was chosen, in which he writes that this design was so hazardous that it could produce '… hot leaks and resulting catastrophic failure.'22 *It was known eight years prior to the Challenger launch that this design choice was a dangerously faulty one which could end in a catastrophe!* One could not possibly ask for a stronger signal.

In terms of the "weak signals" hypothesis, it should also be noted that when it came to the timing of the launch, 14 managers and engineers alike *unanimously* voted against it.23 One could not conceive of a stronger signal than this. That this decision was overturned in a meeting of 4 managers (with no engineers present who were not managers) does not take away from the fact that these 4 managers were fully aware of the previous signal of 14 unanimous votes against the launch.24 There is no possibility that the signals that were made were not the strongest possible signals to be made. To refer to these warnings as "weak signals" is to turn a red light into a green light.

**8. Two straw men** 

It should be emphasized that the Committee on Shuttle Criticality Review and Hazard Analysis did not focus on the life-and-death consequences to the principals involved in taking the risk of launching the *Challenger*, but rather focused on the general, abstract case of space flight as an opportunity for learning about the universe. In short, they examined a "straw man" and not the real case in front of them.

<sup>21</sup> *Op. cit*., p. 170. <sup>22</sup> *Ibid.*, p. 151. <sup>23</sup> *Ibid*., pp. 174, 195.

<sup>24</sup> *Ibid*., for an extended analysis of the unsound and unethical decision-making process engaged in by the four middle managers.
