**Part 3**

**Novel AR Applications in Daily Living and Learning** 

*Augmented Reality – Some Emerging Application Areas*


**11**

**Augmented Reality Platform for Collaborative E-Maintenance Systems**

Samir Benbelkacem1, Nadia Zenati-Henda1, Fayçal Zerarga1, Abdelkader Bellarbi1, Mahmoud Belhocine1, Salim Malek1 and Mohamed Tadjine2

*1Development Centre of Advanced Technologies*
*2Polytechnic National School*
*Algeria*

**1. Introduction**

One of the recent design goals in Human-Computer Interaction is the extension of the sensory-motor capabilities of computer systems, enabling a combination of the real and the virtual in order to assist the user in performing a task in a physical setting. Such systems are called Augmented Reality (AR) systems. The growing interest of designers in this paradigm is due to the dual need of users to benefit from computers and to interact with the real world.

AR is a new interactive approach in which virtual objects (such as texts, 2D images and 3D models) are added to real scenes in real time using sensing and visualization technology. The computer-generated digital information is overlaid on the user's physical environment so that the user perceives important information where it is needed.

Augmented Reality is derived from Virtual Reality (VR), in which the user is completely immersed in an artificial world. In VR systems, there is no way for the user to interact with objects in the real world. Using AR technology, users can instead interact with mixed virtual and real worlds in a natural way (Zhong & Boulanger, 2002).

AR research is of major interest in several domains. Azuma (Azuma, 1997) describes various applications of AR systems, including medical visualisation, manufacturing and repair, robot path planning, entertainment and military aircraft.

Maintenance tasks in industrial environments are an excellent application domain for AR (Changzhi et al., 2006). AR allows the user to see virtual objects superimposed on real-world scenes through display devices such as PCs, laptops, Pocket PCs, video projectors or Head-Mounted Displays (HMDs). The technician can interact with the virtual world and may draw on additional information in various forms; for instance, maintenance task instructions may be given as text, image, video or audio augmentations. Several AR-based maintenance platforms have been developed: ARVIKA (Marsot et al., 2009) introduces AR into the life cycle of industrial products, AMRA implements a mobile AR system in an industrial setting (Didier & Roussel, 2005), STARMATE (Schwald et al., 2001) assists the operator during maintenance tasks on complex mechanical systems, and ULTRA (Riess & Stricker, 2006) has developed a software architecture that enables production of
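The core overlay idea described above — adding a computer-generated annotation on top of a live camera frame — can be illustrated with a minimal sketch. This is not the implementation of any of the platforms cited here; it is a simplified, hypothetical example that alpha-blends a 2D label onto a frame at a given position, standing in for the tracking and rendering pipeline a real AR system would provide.

```python
import numpy as np

def overlay_annotation(frame, annotation, top_left, alpha=0.6):
    """Alpha-blend a virtual 2D annotation onto a camera frame.

    frame       : H x W x 3 uint8 array (the real scene)
    annotation  : h x w x 3 uint8 array (the virtual object)
    top_left    : (row, col) where the annotation is anchored;
                  in a real AR system this would come from tracking
    alpha       : opacity of the virtual layer
    """
    y, x = top_left
    h, w = annotation.shape[:2]
    out = frame.copy()
    region = out[y:y + h, x:x + w].astype(float)
    blended = alpha * annotation.astype(float) + (1.0 - alpha) * region
    out[y:y + h, x:x + w] = blended.astype(frame.dtype)
    return out

# Hypothetical data: a uniform grey "camera frame" and a white "label".
frame = np.full((480, 640, 3), 100, dtype=np.uint8)
label = np.full((20, 40, 3), 255, dtype=np.uint8)
out = overlay_annotation(frame, label, (10, 10))
```

In a deployed maintenance system the `top_left` position would be updated every frame from marker or feature tracking, and the annotation would typically be a rendered 3D instruction rather than a static image; the blending step itself, however, stays essentially this simple.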
