**4.2 Reasonable-care model**

Also known as a due-care model, reasonable care builds on the minimalist model and goes a step further: the technologist must take reasonable precautions and provide care in practice (especially in the engineering profession). Interestingly, every major philosophical theory of ethics includes such a provision, including the three mentioned previously, i.e. the categorical imperative in duty ethics, the harm principle in utilitarianism, and the veil of ignorance in social contract ethics.

Determining what is reasonable can be quite subjective, so this model applies a mechanism borrowed from the legal profession, i.e. the *reasonable person standard*. Applied to emerging technologies, the decision to design or use a new technology is measured by the degree to which the design and use would be seen as ethical or unethical according to a "standard of reasonableness as seen by a normal, prudent nonprofessional" (ASME 2012).

In a highly technical area, this mechanism should be based on a more knowledgeable *and* reasonable person, i.e. a "reasonable engineer standard" or "reasonable scientist standard." If the innovator is an engineer, for example, this affiliation adds a professional onus. Not only should an action be acceptable to the majority of users; it should also be acceptable to one's peers in the profession, as well as to scientists and designers outside of engineering, depending on the technology itself.

An action could very well be legal, and even professionally permissible, but may still fall below the ethical threshold if reasonable people consider it to be wrong.

**4.3 Good works model**

A truly ethical model goes beyond obeying the law or preventing harm. An ethical innovator excels beyond the standards and codes and does the right thing to improve product safety, public health or social welfare. Doing what is "right" is more than simply avoiding what is "wrong." Much as peace is more than the absence of war, research integrity is more than avoiding immoral acts. It requires a proactive and preventive approach. Ethics must be integrated throughout the design life cycle (Vallero and Vesilind 2007). Certainly, the rules are important, but legality and morality do not completely overlap (see Fig. 6).

Fig. 6. Ethical decisions differ from legal decisions. Legality can be a subset of ethical decisions (A). An example is a biomedical researcher's responsibility to abide by copyright and intellectual property laws. In addition, ethics also includes extra-legal responsibilities, such as maintaining competence in the scientific discipline. Other situations may arise where a certain amount of legal behavior is, in fact, unethical (B). An example is the paying of bribes or gratuities or conducting research without appropriate informed consent of the subjects, which in certain cultures may be legal or even mandated by regulations. On rare occasions, the laws may be completely unethical (C), such as slavery. Some would argue, for example, that current research like embryonic stem cell and blastocyst research, cloning, and animal experimentation falls into this category.

Ethical Decisions in Emergent Science, Engineering and Technologies 41

The innovator is often in the best position to predict problems and to foresee low probability, negative outcomes. In other words, the innovator may see the flaws of existing performance criteria and rules in assuring safety, health and welfare. Knowing this places the onus on the innovator, since the regulators may not be aware of the potential risks (sometimes in ethical inquiries, the conclusion is that an engineer "knew or should have known" the risks and declared them before the problems occurred).

The good works model is rooted in moral development theories such as those expounded by Kohlberg (1981), Piaget (1965), Rest (1986), and Rest et al. (1999), who noted that moral action is a complex process entailing four components: moral sensitivity, judgment, motivation and character. They and others (e.g., Duska and Whelan, 1975) have noted that age and education level are the two most important factors in determining a person's likelihood of making moral judgments, prioritizing moral values, and following through on moral decisions. Consistent with Aristotle's argument that the way to achieve excellence is through practice, experience is particularly critical regarding moral judgment: a person's ability to make moral judgments tends to grow with maturity in pursuit of further education, generally reaching its final and highest stage of development in early adulthood. This theory of moral development is illustrated in Table 2.


| Level | Stage |
| --- | --- |
| Pre-conventional | 1. Punishment-obedience orientation |
| Pre-conventional | 2. Personal reward orientation |
| Conventional | 3. "Good boy"-"nice girl" orientation |
| Conventional | 4. Law-and-order orientation |
| Post-conventional | 5. Social contract orientation |
| Post-conventional | 6. Universal ethical principle orientation |

Table 2. Kohlberg's (1981) stages of moral development.

During the two earliest stages of moral development, i.e. the "pre-conventional level," a person is primarily motivated by the desire to seek pleasure and avoid pain. This is similar to the malpractice model. The "conventional level" consists of stages three and four: in stage three, a person weighs the consequences that actions have for peers and their feelings about these actions; in stage four, he or she considers how the wider community will view the actions and be affected by them. The parallel here is with the reasonable-care model. Only a minority reach the "post-conventional" level, wherein they have an even broader perspective: their moral decision making is guided by universal moral principles (Kant 1785); that is, by principles which reasonable people would agree should bind the actions of all people who find themselves in similar situations. This stage tracks closely with the good works model.

A normative model can be applied to emerging technologies. The moral need to consider the impact that the use of a technology will have on others forms the basis for the normative model. Pursuing a technological advancement merely with the goal of obeying the law may lead to avoiding punishment for wrongdoing, but it is not usually sufficient for any technological pursuit, let alone one with the uncertainties of emerging technologies. Pursuing a technology with the goal of improving profitability is clearly in line with investors' desires; but presumably customers', suppliers', and employees' desires must also be met at some level. And finally, pursuing an activity with the goal of "doing the right thing," behaving in a way that is morally right and just, can be the highest level of engineering behavior. This normative model of ethical engineering can be illustrated as Fig. 7.

There is a striking similarity between Kohlberg's model of moral development and growth within a technical profession. Avoiding punishment in the moral development model is similar to the need to avoid problems early in one's career. The pre-conventional level and early career experiences have similar driving forces.

The second level focuses on peers and community. The engineer must balance the needs of clients and fellow professionals with those of society at large. Engineering services and products must be of high quality and be profitable, but the focus is shifting away from self-centeredness and personal well-being toward external goals.

Universal moral principles begin to govern actions at the highest level of moral development. The driving force or motivation is trying to do the right thing on a moral (not legal, financial or even advancement of science) basis. These behaviors set an example, now and in the future.


Fig. 7. Comparison of Kohlberg's moral development stages to technological ethics (Vallero 2007).

Ethical content is not an afterthought, but is integrated within the decision-making process. That is, engineering exemplars recognize the broad impacts their decisions may have, and they act in such a way that their actions will be in the best interest of not only themselves and the organization they represent, but also the broader society and even future generations.

Much of the ethics training to date has emphasized pre-conventional thinking; that is, adherence to codes, laws and regulations within the milieu of profitability for the organization. This benefits the technologist and the organization, but is only a step toward full professionalism. Those who teach engineering ethics must stay focused on the engineer's principal client, "the public." One interpretation of the "hold paramount" provision mentioned previously is that it has primacy over all the others. So, anything the professional engineer does cannot violate this canon. No matter how competent, objective, honest, and faithful, the engineer must not jeopardize public safety, health or welfare. This is a challenge for such a results-oriented profession.

Technical professionals must navigate through their professional codes. The NSPE code, for instance, reminds its members that "public health and welfare are paramount considerations."


NG = (goodness of each consequence) × (importance) × (likelihood)   (3)

These analyses sometimes use ordinal scales, such as 0 through 3, where 0 is nonexistence (e.g. zero likelihood or zero importance) and 1, 2 and 3 are low, medium and high, respectively. Thus, there may be many small consequences that are near zero in importance and, since NG is a product, the overall net goodness of the decision is driven almost entirely by one or a few important and likely consequences. There are two cautions in using this approach. First, although it appears to be quantitative, the approach is very subjective. Second, as we have seen many times in cases involving health and safety, even a very unlikely but negative consequence is unacceptable.
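Eq. 3 can be sketched in code. The consequence names, the 0-3 ordinal scores, and the convention of treating "goodness" as a bare sign (+1 for a benefit, -1 for a harm) are all hypothetical illustrations, not values from the text:

```python
# Sketch of a net-goodness tally per Eq. 3: each consequence contributes
# (goodness) x (importance) x (likelihood), scored on the 0-3 ordinal
# scale described above (0 = nonexistent; 1, 2, 3 = low, medium, high).
# All consequence names and scores are hypothetical assumptions.

def net_goodness(consequences):
    """Sum the signed NG terms over all consequences."""
    return sum(sign * importance * likelihood
               for sign, importance, likelihood in consequences.values())

scores = {
    "improved crop yield":    (+1, 3, 3),  # important and likely benefit
    "reduced pesticide use":  (+1, 2, 3),
    "non-target insect harm": (-1, 3, 1),  # important but judged unlikely
    "consumer distrust":      (-1, 1, 2),
}

print(net_goodness(scores))  # 10: driven almost entirely by the high-score terms
```

As the text cautions, the result looks quantitative but every score is a subjective judgment, and a very unlikely negative term may still be decisive on its own.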

The tool can be modified from a purely ethical decision making tool to a risk management tool by incorporating the net goodness into a decision tree. For example, Fig. 8 shows a hypothetical decision on whether to use a GMO (Vallero 2010). The decision is based on the likelihood of various beneficial and adverse outcomes, with ranked importance to three receptors: the environment; public health; and food production. The analysis is qualitative, but can help to identify important factors, as well as potential downstream impacts and artifacts of an immediate decision. The difficulty will be to arrive at probabilities to fill the "likelihood" column. Sometimes these are published, but often they will have to be derived from focus groups and expert elicitation. Often, likelihood is presented as an ordinal scale (e.g. high, medium, or low – or 1, 2, or 3).
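A minimal sketch of this decision-tree variant follows. The outcomes, likelihoods and importance ranks below are hypothetical assumptions for illustration; they are not the values in Fig. 8:

```python
# Each terminal outcome of the decision tree carries a likelihood and an
# ordinal importance rank (1 = best, 5 = worst) for each receptor.
# Outcomes, likelihoods and ranks are hypothetical assumptions.

outcomes = [
    # (label, likelihood, {receptor: importance rank})
    ("efficacious, no impacts",         0.81, {"environment": 1, "public_health": 1, "food_production": 1}),
    ("efficacious, ecological impacts", 0.10, {"environment": 5, "public_health": 2, "food_production": 3}),
    ("nonefficacious",                  0.09, {"environment": 3, "public_health": 3, "food_production": 5}),
]

def expected_rank(outcomes, receptor):
    """Likelihood-weighted mean importance rank for one receptor (lower = better)."""
    return sum(p * ranks[receptor] for _, p, ranks in outcomes)

for receptor in ("environment", "public_health", "food_production"):
    print(f"{receptor}: {expected_rank(outcomes, receptor):.2f}")
```

Ranks such as these can populate the "importance" columns, while the likelihoods come from published data or from focus groups and expert elicitation, as noted above.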

Fig. 8. Decision tree and net goodness analysis of a decision to insert *Bacillus thuringiensis* genetic material into crops near an ecosystem. The tree ranks first-order outcomes (efficacious with no impacts; efficacious with no human health impacts, but with ecological impacts; efficacious with human health impacts, but without ecological impacts; efficacious with agricultural effects; nonefficacious) and their second-order outcomes (e.g. pest resistance, crop damage, non-target and biodiversity effects, direct poisoning\*, indirect contamination such as track-in, cross-resistant bacteria, transgenic food problems) by likelihood and by importance (1 = best, 5 = worst) to three receptors: the environment, public health and food production. Data are hypothetical (Vallero 2010).

\*This has its own decision tree according to vulnerability index, i.e. percentile exposure (high to no exposure) and sensitive subpopulations (children, elderly, asthmatic, etc.)
Public safety and health considerations affect the design process directly. Almost every design now requires at least some attention to sustainability and environmental impacts. For example, recent changes in drug delivery have been required, such as moving away from the use of greenhouse gas propellants like chlorofluorocarbons (CFCs) and instead using pressure differential systems (such as physical pumps) to deliver medicines. This may seem like a small thing or even a nuisance to those who have to use them, but it reflects an appreciation for the importance of incremental effects. One inhaler does little to affect the ozone layer or threaten the global climate, but millions of inhalers can produce enough halogenated and other compounds that the threat must be considered in designing medical devices.

Technologists must consider how sustainable the technology will be over its useful lifetime. This requires thinking about the life cycle, not only during use, but when the use is complete. Such programs as "design for recycling" (DFR) and "design for disassembly" (DFD) allow the engineer to consider the consequences of various design options in space and time. They also help designers to pilot new systems and to consider scale effects when ramping up to full production of devices. However, if such a change inordinately affects a vulnerable population, this must be weighted properly in the decision. For example, if a more environmentally acceptable delivery system is less efficacious and places asthmatics at additional risk, it is likely not the best alternative. That is, the risk tradeoff between biomedical and environmental values leans more heavily toward the biomedical value (treating asthma effectively).

This illustrates that, like virtually everything else in engineering, best serving the public is a matter of optimization. The variables that we choose to give large weights will often drive the design. The technologist must continue to advance the state of the science in high-priority areas, but any possible adverse effects must also be recognized and properly weighted when we optimize benefits, weighing those benefits against possible hazards and societal costs.
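The optimization point can be sketched as a weighted-sum score over design alternatives. The alternatives, criteria, scores and weights below are hypothetical, loosely echoing the inhaler example above:

```python
# Weighted-sum scoring of design alternatives: the weights the designer
# chooses largely determine which alternative "wins". All names, scores
# (1 = worst, 5 = best) and weights are hypothetical assumptions.

def weighted_score(scores, weights):
    """Weighted sum of criterion scores (higher = better)."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

alternatives = {
    "CFC propellant": {"efficacy": 5, "environment": 1, "cost": 4},
    "pump delivery":  {"efficacy": 3, "environment": 5, "cost": 3},
}

# Weighting biomedical efficacy heavily (e.g. protecting asthmatic users)...
w_biomedical  = {"efficacy": 0.6, "environment": 0.2, "cost": 0.2}
# ...versus weighting environmental impact heavily.
w_environment = {"efficacy": 0.2, "environment": 0.6, "cost": 0.2}

for weights in (w_biomedical, w_environment):
    best = max(alternatives, key=lambda a: weighted_score(alternatives[a], weights))
    print(best)  # "CFC propellant" under w_biomedical, "pump delivery" under w_environment
```

The flip in the winning alternative under different weightings is exactly the point: the heavily weighted value, here biomedical versus environmental, drives the design.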
