

**7**

**Personalization of Virtual Environments Navigation and Tasks for Neurorehabilitation**

Dani Tost, Sergi Grau and Sergio Moya
*CREB-Polytechnical University of Catalonia, Spain*

**1. Introduction**

The use of "serious" computer games, designed for purposes other than pure leisure, is becoming a recurrent research topic in diverse areas, from professional training and education to psychiatric and neuropsychological rehabilitation. In particular, 3D computer games have been introduced into the neuropsychological rehabilitation of cognitive functions to train daily activities. In general, these games are based on the first-person paradigm: patients control an avatar that moves around a 3D virtual scenario, and they manipulate virtual objects in order to perform daily activities such as cooking or tidying up a room. The underlying hypothesis of Cognitive Neuropsychological Virtual Rehabilitation (CNVR) systems is that 3D Interactive Virtual Environments (VEs) can provide good simulations of the real world, yielding an effective transfer of virtual skills to real capacities (Rose et al., 2005). Other potential advantages of CNVR are that it is highly motivating, safe and controlled, and able to recreate a diversity of scenarios (Guo et al., 2004). Virtual tasks are easy to document automatically. Moreover, they are reproducible, which is useful for accurate analyses of the patients' behavior.

Several studies have shown that leaving tasks unfinished can be counterproductive in a rehabilitation process (Prigatano, 1997). Therefore, CNVR systems tend to provide error-free rehabilitation tasks. To this end, in contrast to traditional leisure games, they integrate different intervention mechanisms to guide patients toward the fulfillment of their goals, ranging from instruction messages to the automatic completion of part of, or even the entire, task.

The main drawback of CNVR is that the use of technology introduces a complexity factor alien to the rehabilitation process. In particular, it requires patients to acquire spatial abilities (Satalich, 1995) and navigational awareness (Chen & Stanney, 1999) in order to find their way through the VE and perceive distances relative to their avatar. In addition, interacting with virtual objects involves recognizing their shape and semantics (Nesbitt et al., 2009). Moreover, pointing at, picking up and putting down virtual objects is difficult, especially if the environments are complex and the objects are small (Elmqvist & Fekete, 2008). Finally, steering virtual objects through VEs requires spatio-temporal skills to update the perception of the relative distance between objects through motion (Liu et al., 2011). These skills vary from one individual to another, and they can be strongly affected by a patient's neuropsychological impairments. Therefore, it is necessary to design strategies to make technology more usable

