36 Emerging Informatics – Innovative Concepts and Applications

considered to be a libertarian, introduced another modulation, i.e. the "veil of ignorance." That is, the technologist must project himself or herself behind a veil, not knowing who is harmed the most by the new technology. In fact, the technologist may be the one most harmed. Thus, the ethics of the technology should be based on its impact on the most vulnerable members of society (pregnant women, the unborn, neonates, children, the infirm, the elderly), including those in the future. These constructs have in common the need to consider a technology's potential impacts on future and distant people, both the ends and the means during research, development and use of a technology, and the responsibility unique to the developer, since he or she is the only one likely to be aware of an ethical breach in its early stages.

Thus, ethics begins with awareness, followed by decisions, and ultimately behavior growing out of these ethical decisions. For engineers, these three steps are codified, at least at the most fundamental level, reflective of duty ethics. The canons of the National Society of Professional Engineers (NSPE 2006) code of ethics capture what engineers "ought" to do. It states that engineers, in the fulfillment of their professional duties, shall:

1. Hold paramount the safety, health and welfare of the public.
2. Perform services only in areas of their competence.
3. Issue public statements only in an objective and truthful manner.
4. Act for each employer or client as faithful agents or trustees.
5. Avoid deceptive acts.
6. Conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession.

Such professional canons transcribe "morality," i.e. societal norms of acceptability (virtuous/good decisions and acts) and unacceptability (vicious/bad decisions and acts). These norms are shared by members of society to provide stability as determined by consensus (Beauchamp and Childress 2001). Professional codes of ethics and their respective canons are designed to provide *normative ethics*, i.e. classifying actions as right and wrong without bias. Normative ethics is contrasted with *descriptive ethics*, which is the study of what a group actually believes to be right and wrong, and how it enforces conduct. Normative ethics regards ethics as a set of norms related to actions; descriptive ethics deals with what "is," whereas normative ethics addresses "what should be."

Gert (2004) categorizes behaviors into what he calls a "common morality," which is a system that thoughtful people use implicitly to make moral judgments. Humans strive to avoid five basic harms: death; pain; disability; loss of freedom; and loss of pleasure. Arguably, the impetus for many emerging technologies is that they address society's needs and desires. With this in mind, Gert proposes ten moral rules of common morality. The first five directly prohibit the infliction of harm on others; the next five indirectly lead to the prevention of harm. Interestingly, these rules track quite closely with the tenets and canons of the engineering profession (see Table 1).

Numerous ethical theories can form the basis for emerging technologies. In large measure, engineering ethics is an amalgam of elements of many theories. As evidence, the American Society of Mechanical Engineers (ASME 2012) has succinctly bracketed ethical behavior into three models, discussed in the next sections.

**4.1 Malpractice model**

Also known as the "minimalist" model, the malpractice model may not actually be an ethical model at all, in that the engineer is acting only in ways required to keep his or her license or professional membership. It is more accurately described as a *legalistic* model. The engineer operating within this framework is concerned exclusively with adhering to standards and meeting the requirements of the profession and any other applicable rules, laws or codes. Minimalism tends to be retrospective in view, finding fault after failures, problems or accidents have happened. Any ethical breach is assigned based upon design, construction, operation or other engineering steps that failed to meet recognized professional standards. This is a common approach in failure engineering and in ethical review board deliberations, and it is the basis of numerous engineering case studies.

The minimalist approach to integrity may be particularly problematic if applied to emerging technologies. A failure could be catastrophic, since there are few if any precedents for the technology. That is, meeting performance criteria designed for normal, well-tested technologies may well not prevent problems associated with the possible outcomes of untested technologies. The innovator may be the only one with the understanding of the technology needed to predict problems, and is often in the best position to foresee low-probability, negative outcomes. In other words, the innovator may see the flaws of existing performance criteria and rules in assuring safety, health and welfare. Knowing this places the onus on the innovator, since the regulators may not be aware of the potential risks (in ethical inquiries, the conclusion is sometimes that an engineer "knew or should have known" the risks and should have declared them before the problems occurred).

Fig. 6. Ethical decisions differ from legal decisions. Legality can be a subset of ethical decisions (A). An example is a biomedical researcher's responsibility to abide by copyright and intellectual property laws; in addition, ethics includes extra-legal responsibilities, such as maintaining competence in the scientific discipline. Other situations may arise where a certain amount of legal behavior is, in fact, unethical (B). An example is the paying of bribes or gratuities, or conducting research without appropriate informed consent of the subjects, which in certain cultures may be legal or even mandated by regulations. On rare occasions, the laws themselves may be completely unethical (C), such as those permitting slavery. Some would argue, for example, that current research such as embryonic stem cell and blastocyst research, cloning, and animal experimentation falls into this category.
