**1. Introduction**

20 Emerging Informatics – Innovative Concepts and Applications


Emerging technologies present unique societal challenges. The public may be reluctant to accept them. Their market niches are not always clear. They may have few precedents. They may rely on obscure knowledge and science that is not sufficiently understood outside the laboratory. These and other aspects of emerging technologies can present distinctive challenges to ethics.

Often, research and development of emerging technologies involves a very small group of experts in an esoteric enterprise. This often entails self-enforcement of difficult decisions. It also involves very dedicated and sharply focused researchers and advocates, who may have little incentive or aptitude to be completely objective about the potential problems associated with their project. This is certainly understandable, given that those engaged in advancing technologies have committed substantial intellectual and capital resources to the effort. Indeed, a key reason that many technologists are so successful is their laser-like focus. This is great for advancing the science, but it can detract from considering the downsides of a new technology.

Another reason for lack of objectivity is motivation. Researchers at the cutting edge have much to lose if the technologies are delayed or stopped. For example, consider the dilemma of a doctoral student well into dissertation research who discovers a potential misuse of the technology. Reporting it could delay the research, or even require retrenchment and introduce significant uncertainty in completing the doctorate. The problem is that doctoral students engaged in cutting-edge research likely know more about the details than even the dissertation advisor and the other experts on the committee. Indeed, even the ethics experts at the university will not know enough about the details of the research to see the ethical problems.

A third potential reason for missing possible ethical problems with an emerging technology can be traced to the scientific method itself, or at least the manner in which it is applied in cutting-edge research and development. Scientists often rely on weight-of-evidence. Evidence is gathered to support or refute a hypothesis. This often means that in order to keep the research from becoming unwieldy, all but one or a few variables are held constant, i.e. the laboratory condition.

The laboratory mentality can lead to looking at a very tightly confined data set, akin to looking for lost keys only under the light of the lamppost.

Ethical Decisions in Emergent Science, Engineering and Technologies 23

[Figure 1 here: information to be communicated divides into perceptual communication, directed at the senses (visual, audible, olfactory, tactile, taste), and interpretive communication, directed at the intellect (verbal and symbolic, ranging from informal diagrams through graphs, equations and models to formal mathematics), with technical complexity increasing toward the formal end.]

Fig. 1. Human communications. The right side of the figure is the domain of technical communication, but not of most people. Miscommunication can occur when members of the public are overwhelmed by perceptive cues or do not understand the symbolic, interpretive language being used by an engineer. The potential for misunderstanding an emerging technology at a public meeting will differ from a more technical setting, depending on the type of communication employed. Source: Myers and Kaposi (2004).

Add to this the fact that mathematics is the language of science: any non-mathematical communication is lost, or at least valued less than quantitative information. Much of the ethical information is qualitative (e.g. honesty, integrity, justice, transparency, long-term impacts). When the good and bad aspects of a project are added up, it is not surprising that many of the potentially bad outcomes are underreported.

### **1.1 Transparency and open communication**

Responsible research depends on reliable communication and oversight. That is, there needs to be a set of checks and balances beyond the innovator to ensure that research is not violating scientific and ethical standards. This serves the potential users, the general public and the innovator, since it could well prevent mistakes and misuses, with attendant liabilities for the innovator and sponsors.

Technical communication can be seen as a critical path, where the engineer sends a message and the audience receives it (see Fig. 1). The means of communication can be either perceptual or interpretive (Myers and Kaposi 2004). Perceptual communications are directed toward the senses. Human perceptual communications are similar to those of other animals (Green 1989); that is, we react to sensory information (e.g. reading body language or assigning meaning to gestures, such as a hand held up with palms out meaning "stop," or a smile conveying approval).

Interpretive communications encode messages that require intellectual effort by the receiver to understand the sender's meanings. This type of communication can either be verbal or symbolic. Scientists and engineers draw heavily on symbolic information when communicating amongst themselves. Walking into a seminar covering an unfamiliar technical topic, using unrecognizable symbols and vernacular, is an example of potential symbolic miscommunication. In fact, the experts may be using words and symbols that are used in your area of expertise, but with very different meanings. For example, a biosensor may draw from both electrical engineering and microbiology. Both fields use the term "resistance," but they apply very different meanings. Such dual meanings can be problematic in technical communication. With emerging technologies, such ambiguity is not only frustrating, it can be dangerous.

Technical communication is analogous to the signal-to-noise ratio (S/N) in a transceiver. S/N is a measure of signal strength compared to background noise. The signal is the electrical or electromagnetic energy traversing from one location to another. Conversely, noise is any energy that degrades the quality of a signal. In other words, for ideal transmission, most of the energy of the signal finds its way to the receiver. Similarly, in perfect communication, the message intended by the sender is exactly what is collected by the receiver (see Fig. 2). In other words, S/N = ∞, because N = 0. This is the goal of any technical communication, but this is seldom, if ever, the case.
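The transceiver arithmetic above can be made concrete with a short sketch. This is a toy illustration only: the power values are invented, and `snr_db` is a hypothetical helper, not something from the chapter or a standard library beyond `math`.

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels; infinite when there is no noise."""
    if noise_power == 0:
        return math.inf  # the ideal, noiseless channel: S/N = infinity
    return 10 * math.log10(signal_power / noise_power)

# Ideal transmission: all of the sender's energy reaches the receiver.
print(snr_db(100.0, 0.0))             # inf

# Each "filter" (the sender's and receiver's perspectives, contexts and
# biases) adds noise, degrading the message.
print(round(snr_db(100.0, 1.0), 1))   # 20.0
print(round(snr_db(100.0, 10.0), 1))  # 10.0
```

As the invented noise term grows, S/N falls: the same message arrives progressively more distorted, which is the point of the filter analogy that follows.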

There is always noise. A message differs from what was meant to be sent (i.e. it is "noisy") because of problems anywhere in the transceiver system. For starters, each person has a unique set of perspectives, contexts, and biases. We can liken these to "filters" through which our intended and received messages must pass. Since both the sender and the receiver are people, each has a unique set of filters. So, even if the message were perfect, the filters will distort it (i.e. add noise). The actual filters being used depend on the type of message


[Figure 2 here: a sender's message passes through the sender's filters, becoming a noisy message, and then through the receiver's filters, arriving as an even noisier message with low S/N.]

Fig. 2. Transceiver analogy for communications, consisting of three main components: the sender, the message and the receiver. The distortion (noise) that decreases the S/N is caused by filtering at either end of the message. Source: Vallero and Vesilind (2007).

being conveyed. In purely technical communications, the effect of cultural nuances should be minimal compared to most other forms of communication. Translating a highly technical report written in Spanish or another non-English language might be much easier and more straightforward than translating literature or poetry.

One worst-case scenario for an emerging technology, or even a novel use of an existing technology, is actually a matter of justice: those least able to decode the message may be excluded from it. For example, uneducated people, those unfamiliar with a dominant culture's norms, and even well-educated people unfamiliar with technical jargon may be easily ignored.

A tragic example occurred in Bangladesh in the 1990s, where an engineering solution to one problem played a major role in exacerbating another. Surface water sources in Bangladesh, especially standing ponds, have historically contained significant microbial pathogens, causing acute gastrointestinal disease in infants and children. To address this problem, the United Nations Children's Fund (UNICEF) in the 1970s began working with Bangladesh's Department of Public Health Engineering to fabricate and install tube wells to provide an alternative and safer source of water, i.e. groundwater. Tube wells consist of a series of 5 cm diameter tubes inserted into the ground, usually to depths of less than 200 m, with a metal hand pump at the top of each tube to extract water (Smith et al. 2000). Unbeknownst to the engineers, however, as many as 77 million of the 125 million Bangladeshi people have been exposed to elevated concentrations of arsenic in their drinking water, resulting in thousands of debilitating skin lesions, with chronic diseases expected to increase with time (World Health Organization, 2000).

The engineering solution appeared to be a straightforward application of the physical sciences, but societal warnings were ignored. The tube wells did indeed solve the pathogen problem, but the local people's protests against using groundwater, which in some locations they called "the devil's water," were dismissed. The water was not tested for arsenic, and indigenous folklore suggesting problems with the aquifer was ignored. Indeed, this case provides another unfortunate example of misreading an application of an emerging technology. The World Health Organization (WHO) responded by installing thousands of ion exchange resin canisters to remove the arsenic ion. The system worked well, until the villagers began asking what to do with the used canisters, which had reached arsenic concentrations characteristic of a hazardous waste. The WHO engineers failed to consider the disposition and disposal stages of the life cycle, and Bangladesh now has tens of thousands of these canisters with the potential to cause acute human health problems (Vallero and Vesilind 2007).

### **1.2 Transparency and self-enforcement**

Design flaws are often identified and corrected only at the very end of the project: the software crashes, the device fails in a real-world test, the project is grossly overbid, or the sensor explodes. This is followed by a search for what went wrong. Eventually the truth emerges, and often the problems can be traced to the initial level of engineering design, the development of data and the interpretation of test results. This is why innovative designers must be extremely careful in their work. It is one thing to make a mistake (everyone does), but misinformation is clearly unethical. Fabricated or spurious test results can lead to catastrophic failures because engineering often lacks a failure detection mechanism until the project is completed. Without trust and truthfulness in engineering, the system will fail. Bronowski (1958) framed this challenge succinctly:

*All engineering projects are communal; there would be no computers, there would be no airplanes, there would not even be civilization, if engineering were a solitary activity. What follows? It follows that we must be able to rely on other engineers; we must be able to trust their work. That is, it follows that there is a principle which binds engineering together, because without it the individual engineer would be helpless. This principle is truthfulness.* 

Thus, responsible conduct in cutting-edge research requires equipping the researcher to be aware of actual or potential ethical problems, to make the right decisions even at a cost in time and resources, and to follow through with behavior that carries across one's entire career.

Socrates is said to have defined ethics as "how we ought to live." The "ought" becomes rather complicated in the rapidly advancing and highly competitive world of emerging technologies. Socrates might suggest that the first step toward the proper unfolding of new technologies is a blend of science and ethics: doing what is right and doing it in the right way. Technologists must learn how to survive and thrive, not only as innovators, but as fellow citizens.


[Figure 3 here: three curves, A, B and C, of percent of learned information (100% down to 0%) versus time since the initial learning event.]

Fig. 3. Hypothetical memory extinction curves. Curve A represents the most memorable case, and Curves B and C less memorable ones. Curve C is extinguished completely with time. While the events in Curves A and B are remembered, less information about the event is retained in Curve B because the event is less dramatic. The double arrow represents the difference in the amount of information retained in long-term memory. Source: D.A. Vallero (2007). Biomedical Ethics for Engineers: Ethics and Decision Making in Biomedical and Biosystem Engineering. Elsevier Academic Press, Burlington, MA.

Training programs employ some measures to overcome, or at least ameliorate, extinction. Annual or recurring training programs addressing ethics and responsible conduct are common at many institutions (see Fig. 4).

Governing bodies are increasingly stressing the importance of responsible research. Thus, universities and research institutions have instituted training programs to ensure that research is conducted in a responsible manner. In the United States, for example, the Office of Research Integrity (ORI 2011) requires that all publicly funded entities include Responsible Conduct of Research (RCR) training. This is an important first step in instilling and enforcing ethical behavior, but ethical awareness is merely the first step in decision making related to emerging technologies.
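Hypothetical extinction curves like those in Fig. 3 can be sketched with a simple exponential decay model. This is a minimal sketch under assumed parameters: the decay constants and long-term "floors" below are invented for illustration and are not taken from the figure.

```python
import math

def retention(t: float, strength: float, floor: float = 0.0) -> float:
    """Fraction of learned information retained at time t, under a
    hypothetical exponential extinction model: retention decays from
    1.0 (all information) toward a long-term floor."""
    return floor + (1.0 - floor) * math.exp(-t / strength)

# Curve A: a dramatic, memorable event -- slow decay, high long-term floor.
# Curve B: less dramatic -- faster decay, lower floor.
# Curve C: extinguished completely with time -- floor of zero.
for label, strength, floor in [("A", 50.0, 0.6), ("B", 20.0, 0.3), ("C", 5.0, 0.0)]:
    curve = [round(retention(t, strength, floor), 2) for t in (0, 10, 100)]
    print(label, curve)
```

The gap between the long-term values of Curves A and B corresponds to the double arrow in the figure: the difference in information retained in long-term memory, which recurring training (Fig. 4) aims to shrink.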
