**4. Ethical constructs**

34 Emerging Informatics – Innovative Concepts and Applications

The uncertain, yet looming threat of global climate change can be attributed in part to technological and industrial progress. Emergent technologies can help to assuage these problems by using alternative sources of energy, such as wind and solar, to reduce global demand for fossil fuels. However, these technologies can have side effects of their own, such as the low-probability but highly consequential outcomes of genetic engineering, e.g. genetically modified organisms (GMOs) used to produce food. GMOs may well help with world food and energy needs, but they are not a panacea.

The renowned physicist Martin Rees (2003) has voiced an extreme perspective related to the apprehension about nanotechnology, particularly its current trend toward producing "nanomachines." Biological systems, at the subcellular and molecular levels, can very efficiently produce proteins, as they already do for their own purposes. By tweaking some genetic material at a scale of a few angstroms, the parts of the cell that synthesize molecules (e.g. the ribosome) could start producing myriad molecules designed by scientists, such as pharmaceuticals and nanoprocessors for computing. Rees is concerned that such assemblers could start self-replicating, as biological systems always have, but without any "shut-off." Some have called this the "gray goo" scenario, i.e. the creation of an "extinction technology" from the cell's unchecked ability to replicate itself exponentially if part of its design is to be completely "omnivorous," using all matter as food! No other "life" on earth would exist if this "doomsday" scenario were to occur.

Though extreme and (hopefully) unlikely, this scenario calls attention to the problem that ethics usually follows technological advancement. All of the events that lead to even this extreme outcome are individually possible. Most life systems survive within a fairly narrow range of conditions, and slight modifications can be devastating. So, emerging technologies call for even more vigilance and foresight. Engineers and scientists are expected to push the envelopes of knowledge; we are rewarded for our eagerness and boldness. The Nobel Prize, for example, is not given to the chemist or physicist who has merely calculated important scientific phenomena aptly, with no new paradigms. It would be rare indeed for an engineering society to bestow its awards on an engineer who, for an entire career, used only proven technologies to design and build structures. This begins with our general approach to contemporary scientific research. Technologists are often rugged individualists on a quest to add new knowledge. For example, aspirants seeking Ph.D.s must endeavor to add knowledge to their specific scientific disciplines. Scientific journals are unlikely to publish articles that do not at least contain some modicum of originality and newly found information.

Innovation is rewarded. Unfortunately, there is little natural incentive for innovators to stop what they are doing to "think about" possible ethical dilemmas propagated by their discoveries. However, the engineering profession is beginning to come to grips with this issue; for example, in emergent "macroethical" areas like nanotechnology, neurotechnology, and even sustainable design approaches (National Academy of Sciences 2004).

Thus, those engaged in emerging technologies are expected to push the envelopes of possible applications and simultaneously to investigate likely scenarios, from the very beneficial to the worst-case ("doomsday") outcomes. This link between fundamental work and outcomes becomes increasingly crucial as such research reaches the marketplace relatively quickly and cannot be confined to the "safety" and rigor of the laboratory and highly controlled scale-ups.

Technological development thrusts the innovator into uncomfortable venues. Rarely is there a simple answer to the questions "How healthy is healthy enough?" and "How protected is protected enough?"


For those involved in technologies, there are two general paths to ethical decisions: duty and outcome. Duty is at the heart of Immanuel Kant's (1785) "categorical imperative":

*Act only according to that maxim by which you can at the same time will that it should become a universal law.* 

The categorical imperative is at the heart of duty ethics (so-called "deontology"), invoking the question as to whether one's action (or inaction) will make for a better world if all others in that same situation were to act in the same way. Thus, the technology itself can be ethically neutral, whereas the individual action's virtue or vice is seen in a comprehensive manner. The unknowns surrounding emerging technologies may cause one to add safeguards or even to abandon a technology or a particular use of the technology. The obligation of the technologist is to consider the effects of universalizing one's new technology from an all-inclusive perspective, considering all the potential good and all the potential bad.

Outcome-based ethics (so-called "teleology") can be encapsulated in the axiom of John Stuart Mill's (1863) utilitarianism: the "greatest good for the greatest number of people." Even the most extreme forms of outcome-based ethics are moderated. For example, Mill added a "harm principle," which requires that no one be harmed in the pursuit of a noble outcome. That is, even though an emerging technology is expected to lead to benefits for the majority, it may still be unethical if it causes undue harm to even one person.





John Rawls, who can be considered an egalitarian, introduced another modulation, i.e. the "veil of ignorance." That is, the technologist must project himself or herself behind a veil, not knowing who is harmed the most by the new technology; in fact, the technologist may be the one most harmed. Thus, the ethics of the technology should be based on its impact on the most vulnerable members of society (pregnant women, the unborn, neonates, children, the infirm, the elderly), including those in the future. These constructs have in common the need to consider a technology's potential impacts on future and distant people, both the ends and the means during research, development and use of a technology, and the responsibility unique to the developer, since he or she is the only one likely to be aware of an ethical breach in its early stages.
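The three outcome-based tests described above can be contrasted in a brief illustrative sketch. This is a minimal sketch only: the group names and numeric "impact scores" are hypothetical devices for illustration, not part of any established assessment method.

```python
# Hypothetical "impact scores" for a proposed technology,
# one number per affected group (positive = benefit, negative = harm).
impacts = {"general public": 8, "workers": 3, "most vulnerable": -2}

# Pure utilitarian test: maximize the aggregate good.
utilitarian_ok = sum(impacts.values()) > 0

# Mill's harm principle: a net benefit does not excuse harming anyone.
harm_principle_ok = all(score >= 0 for score in impacts.values())

# Rawls' veil of ignorance: judge by the worst-off group.
rawlsian_ok = min(impacts.values()) >= 0

print(utilitarian_ok, harm_principle_ok, rawlsian_ok)  # True False False
```

The same proposal passes the aggregate test yet fails both moderating tests, which is precisely the tension the harm principle and the veil of ignorance are meant to expose.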

Thus, ethics begins with awareness, followed by decisions, and ultimately behavior growing out of these ethical decisions. For engineers, these three steps are codified, at least at the most fundamental level, reflective of duty ethics. The canons of the National Society of Professional Engineers (NSPE 2006) code of ethics capture what engineers "ought" to do. They state that engineers, in the fulfillment of their professional duties, shall:

1. Hold paramount the safety, health and welfare of the public.
2. Perform services only in areas of their competence.
3. Issue public statements only in an objective and truthful manner.
4. Act for each employer or client as faithful agents or trustees.
5. Avoid deceptive acts.
6. Conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession.


Such professional canons transcribe "morality," i.e. societal norms of acceptability (virtuous/good decisions and acts) and unacceptability (vicious/bad decisions and acts). These norms are shared by members of society to provide stability as determined by consensus (Beauchamp and Childress 2001). Professional codes of ethics and their respective canons are designed to provide *normative ethics*, i.e. classifying actions as right and wrong without bias. Normative ethics is contrasted with *descriptive ethics*, which is the study of what a group actually believes to be right and wrong, and how it enforces conduct. Normative ethics regards ethics as a set of norms related to actions. Descriptive ethics deals with what "is" and normative ethics addresses "what should be."

Gert (2004) categorizes behaviors into what he calls a "common morality," which is a system that thoughtful people use implicitly to make moral judgments. Humans strive to avoid five basic harms: death; pain; disability; loss of freedom; and loss of pleasure. Arguably, the impetus for many emerging technologies is that they address society's needs and desires. With this in mind, Gert proposes ten moral rules of common morality. The first five directly prohibit the infliction of harm on others. The next five indirectly lead to prevention of harm. Interestingly, these rules track quite closely with the tenets and canons of the engineering profession (See Table 1).

Numerous ethical theories can form the basis for emerging technologies. In large measure, engineering ethics is an amalgam of various elements of many theories. As evidence, the American Society of Mechanical Engineers (ASME 2012) has succinctly bracketed ethical behavior into three models discussed in the next sections.


| Engineers shall: | Most Closely Linked to Rules of Morality Identified by Gert |
|---|---|
| 1. Hold paramount the safety, health and welfare of the public. | Do not kill. Do not cause pain. Do not disable. Do not deprive of pleasure. Do not deprive of freedom. |
| 2. Perform services only in areas of their competence. | Do not deceive. Keep your promises. Do not cheat. Obey the law. Do your duty. |
| 3. Issue public statements only in an objective and truthful manner. | Do not deceive. Do not cheat. Do your duty. |
| 4. Act for each employer or client as faithful agents or trustees. | Do not deprive of pleasure. Keep your promises. |
| 5. Avoid deceptive acts. | Do not deceive. Keep your promises. Do not cheat. |
| 6. Conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession. | Do your duty. Obey the law. Keep your promises. |

Table 1. Canons of the National Society of Professional Engineers (2006) Compared to Gert's (2001) Rules of Morality.

**4.1 Malpractice model**

Also known as the "minimalist" model, it may not actually be an ethical model in that the engineer is only acting in ways that are required to keep his or her license or professional membership. It is more accurately defined as a *legalistic* model. The engineer operating within this framework is concerned exclusively with adhering to standards and meeting the requirements of the profession and any other applicable rules, laws or codes. Minimalism tends to be retrospective in view, finding fault after failures, problems or accidents happen. Any ethical breach is assigned based upon design, building, operation or other engineering steps that have failed to meet recognized professional standards. This is a common approach in failure engineering and in ethical review board considerations. It is also the basis of numerous engineering case studies.

The minimalist approach to integrity may be particularly problematic if applied to emerging technologies. A failure could be catastrophic, since there are few if any precedents for this technology. That is, meeting performance criteria designed for normal and well-tested technologies may well not prevent problems associated with possible outcomes of untested technologies. The innovator may be the only one with the understanding of the technology
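The minimalist model's weakness for emerging technologies can be sketched in code. This is a hedged illustration under assumed inputs: the standard names and the `minimalist_review` helper are hypothetical, invented only to show that a compliance check against existing standards passes by default when no standard yet covers a novel technology.

```python
# Hypothetical illustration of the minimalist (legalistic) model:
# ethics is reduced to checking the applicable recognized standards.
applicable_standards = {
    "bridge design": ["load rating", "weld inspection"],
    # No recognized standards yet exist for the novel technology.
    "nano-assembler": [],
}

def minimalist_review(technology, passed_checks):
    """Return True if every applicable standard was met."""
    required = applicable_standards.get(technology, [])
    return all(std in passed_checks for std in required)

# A conventional design is held to its standards...
print(minimalist_review("bridge design", {"load rating"}))  # False

# ...but an untested technology sails through a minimalist review,
# because its real risks are simply not yet codified.
print(minimalist_review("nano-assembler", set()))  # True
```

The empty standards list makes the review vacuously successful, mirroring the text's point that performance criteria designed for well-tested technologies may not prevent the possible outcomes of untested ones.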
