**1. Introduction**


The way we interact with computers strongly influences how we use the technology. Traditional input methods such as the keyboard and the computer mouse (Forlines et al, 2007) still dominate. Meanwhile, new methods have been explored and developed to provide more diversified or more intuitive user experiences. These include touch screens (Albinsson & Zhai, 2003), voice recognition (Krishnaraj et al, 2010) and brain wave detection (Li et al, 2010; Kaul, 2008). Inspired by the market success of the Palm Pilot, and more recently the Apple iPhone and iPad, touch screen based devices are becoming increasingly popular (Ostashewski & Reid, 2010) and are expanding into many traditionally non-computing-intensive fields, such as health care (Astell et al, 2010; Clark et al, 2009), driving (Lenné et al, 2011) and education (Willis & Miertschin, 2004; Zurn & Frolik, 2004).

A traditional touch screen covers the entire display surface with a matrix of resistors or capacitors (Ritchie & Turner, 1975; IEEE Software, 1991). Special circuits capture the changes in resistance or capacitance of the matrix caused by the user's touch, and those changes are then converted into cursor positions. Resistive and capacitive technologies require special materials, such as indium tin oxide, that are both transparent and conductive. However, the supply of such materials is dwindling fast and their cost is rising dramatically. Moreover, the amount of material used in such a touch screen is roughly proportional to its area. In compact equipment such as PDAs and cell phones, or in special applications such as public information kiosks, the cost of such touch screens can be justified. For regular office or household use, however, price is often a hurdle that prevents them from being widely adopted.
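To make the matrix-scanning idea concrete, the sketch below shows one way a controller could turn a grid of sensor readings into a cursor position. This is our own simplified illustration, not the scheme of any particular controller: real hardware interpolates between neighbouring cells and filters noise, whereas this sketch only picks the cell whose reading changed most from its untouched baseline.

```python
def locate_touch(baseline, reading, screen_w, screen_h, threshold=5.0):
    """Hypothetical sketch: find a touch in a sensor matrix.

    baseline/reading: 2D lists (rows x cols) of sensor values.
    Returns (x, y) in screen coordinates, or None if no cell's
    change from baseline exceeds the detection threshold.
    """
    rows, cols = len(baseline), len(baseline[0])
    best, best_cell = threshold, None
    for r in range(rows):
        for c in range(cols):
            delta = abs(reading[r][c] - baseline[r][c])
            if delta > best:
                best, best_cell = delta, (r, c)
    if best_cell is None:
        return None
    r, c = best_cell
    # Map the centre of the winning cell to screen coordinates.
    x = (c + 0.5) / cols * screen_w
    y = (r + 0.5) / rows * screen_h
    return (x, y)
```

The resolution of this naive version is limited to one cell; weighting the position by the deltas of adjacent cells would give sub-cell accuracy.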

Different touch screen approaches have been developed to overcome the difficulties associated with the resistive and capacitive technologies. For example, an ultrasonic method was proposed to take advantage of surface acoustic waves (Katsuki et al, 2003). Meanwhile, optical touch screen technology has gained renewed interest. It was introduced early on but did not gain momentum because digital cameras were costly decades ago. More recently, the prices of single-chip digital cameras have dropped significantly. At around \$1 apiece, cameras have become feasible building blocks for today's input technology.

Several algorithms have been introduced within optical input technology. Examples include using frustrated total internal reflection (Han, 2005) or using two cameras to transform an acrylic plastic sheet into a touch screen (Wilson, 2004). However, these two methods require the cameras to be placed on the opposite side of the screen from the user. They inevitably increase the thickness of the display and are not suitable for household, office, or classroom use.

Another popular approach is to put the camera on the user's side and use it to track a laser pointer or a fingertip (Cheng & Takatsuka, 2006). Meanwhile, a popular do-it-yourself project that hacks the infrared camera of a Wii remote to track the motion of an infrared pointer (Lee, 2007) has won applause on the Internet. Nevertheless, the view of a user-side camera can easily be blocked by the user's body, and the system needs recalibration whenever the display is moved. Except for fixed large projector screens, these methods are difficult to adopt on regular desktops or laptops.

Meanwhile, stereovision has long been used in robot navigation to control manipulation or maneuvering (Hager, 1995). A virtual touch screen using stereovision and a see-through head-mounted display has also been proposed (Koh et al, 2008). However, the setup of these approaches is not suitable for daily applications, and the image processing is complicated by the unstructured and noisy background.

In this article, we describe the concept of using stereo- or mono-vision from the corners of the display to track the motion of the pointer from the sides. As we will explain later, this approach is simple to implement, inexpensive to equip, and insensitive to the wear and tear of the screen. Compared to existing touch screen technologies, it is also easy to scale up or down for different sizes or height/width ratios. Further, we can even generate virtual forces in the active input space to provide more vivid and intuitive user experiences. What is more, one of the most outstanding capabilities of the optical method is that it superimposes on, but does not obstruct, the existing surface. This makes it ideal for nontraditional display applications, such as a whiteboard, a desktop, or even a regular writing pad, a staple stationery item in a student's backpack.

Further, we will briefly introduce our application of the touch screen in a next generation Classroom Response System (CRS) (Langman & Fies, 2010; Suchman et al, 2006). It takes advantage of the superimposing capability of the optical touch screen on a regular writing surface and can obtain instantaneous feedback from students beyond the multiple-choice questions provided by traditional clickers (Nicol et al, 2003; Siau et al, 2006). That is, students can write or sketch their answers using touch screen devices, or writing pads modified with the technology described in this chapter.

The structure of this article is as follows. After this introduction, we present two approaches to optical touch screens, one based on stereovision and the other on pseudo-stereovision. We then introduce the idea of virtual force in touch screen input, and further describe the superimposed optical touch screen as a next generation CRS. A brief conclusion and discussion follow in the final section.

**2.1 System set up**

In a PLUS (Pointer Locator Using Stereovision) system, each of the two cameras comes with a viewing angle that is equal to or greater than 90 degrees. As shown in Figure 1, a supporting structure encloses the flat screen display. At two adjacent corners, two cameras are mounted just above the surface of the display. They are positioned toward the center of the display such that the overlapped viewing field covers the entire screen surface. For convenience of illustration, we assume the cameras are located at the lower left and lower right corners as depicted in Figure 2. In real-life applications, as we will show later in this article, the cameras are generally put at the top two corners to prevent occlusion by the hand and to keep the camera lenses from collecting dust.

Fig. 1. Set up of the pointer locator using stereovision.

Fig. 2. Coordinate system for the pointer locator using stereovision.

For convenience of analysis we first build a coordinate system. As seen in Figure 2, the origin of the system is at the focal point of the lower left camera. Its X and Y axes are parallel to the horizontal and vertical edges of the screen respectively. In this section, we only analyze the pointer positions. That is, we will only study the planar (X-Y) movement of the pointer projected onto the surface of the display. Therefore, we assume a linear camera method
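In the corner-camera coordinate system described above, each camera reports only the bearing angle at which it sees the pointer, and the (X, Y) position follows from intersecting the two viewing rays. The sketch below is our own minimal illustration of that triangulation, with hypothetical function and variable names; it assumes the angles have already been recovered from pixel columns through a per-camera calibration, and that the cameras sit at the lower corners as in Figure 2.

```python
import math

def locate_pointer(alpha, beta, baseline):
    """Triangulate the pointer's (x, y) position on the screen plane.

    alpha: angle (radians) of the pointer ray at the lower-left camera,
           measured from the X axis (the bottom edge of the screen).
    beta:  angle (radians) of the pointer ray at the lower-right camera,
           also measured from the X axis, toward the screen interior.
    baseline: distance between the two camera focal points along X.
    """
    ta, tb = math.tan(alpha), math.tan(beta)
    if ta + tb == 0:
        raise ValueError("rays are parallel; no intersection")
    # Ray from the left camera:  y = x * tan(alpha)
    # Ray from the right camera: y = (baseline - x) * tan(beta)
    x = baseline * tb / (ta + tb)
    y = x * ta
    return x, y
```

For example, with the cameras a baseline of 3 units apart, a pointer seen at 45 degrees by the left camera and at arctan(1/2) by the right camera lies at (1, 1). With the cameras mounted at the top corners instead, the same formula applies after mirroring the Y axis.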
