## **4. Different models for the intelligent user interface**

User interfaces are mostly developed using languages such as XML, UML and UIML, as they must deal with different types of users and their requirements in an open setting [11].

Several models are used in the development of these intelligent interfaces.


## **5. Challenges for intelligent user interfaces**

The success of an IUI depends on many factors. Misinterpretation of cognitive interaction can lead to incorrect understanding by the user, and erroneous interpretation of instructions by the device can cause the user to lose control of the system. Depending on the length of the query issued by the user, security and privacy may also be compromised, leading to complex situations [12, 13].

**Figure 6.** *Samsung SUR40 tabletop intelligent user interface [15].*

An important factor governing the functioning of an IUI is the dynamism of user demand, which depends on the environment the system is exposed to at the time. One solution is to develop robust algorithms that adapt to the dynamics of the situation around the user.
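The adaptation idea above can be sketched as a simple context-to-behaviour mapping. This is an illustrative sketch only; the function name, context keys and thresholds are assumptions, not taken from the chapter.

```python
# Hypothetical sketch: adapting interface behaviour to the user's current
# context. All keys ("noise_level", "hands_busy", ...) are illustrative.

def adapt_response(query: str, context: dict) -> dict:
    """Choose an output modality and verbosity from the environment context."""
    settings = {"modality": "text", "verbosity": "full"}
    if context.get("noise_level", 0.0) > 0.7:
        settings["modality"] = "text"        # speech output is unreliable in noise
    elif context.get("hands_busy", False):
        settings["modality"] = "speech"      # e.g. the user is driving
    if context.get("screen_size", "large") == "small":
        settings["verbosity"] = "brief"      # shorten answers on small screens
    return settings
```

A real IUI would learn such rules from interaction data rather than hard-code them, but the structure, sensing context and adjusting behaviour, is the same.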

One of the most widely used IUIs is the tabletop device [14, 15].

The tabletop design is an example of an IUI used in fields such as education and ambient assistance, offering augmented-reality facilities to meet present-day needs, when students as well as elderly people are exposed to distance modes of learning and working with multimodal facilities around them. This will drive the growth of interactive tabletop device development to serve both academia and industry. It has been enabled by the rise of digitally supported interactive systems bringing digital power to table surfaces to support human activities intelligently. The core aspects of interactive tabletop computing devices are (i) hardware development of horizontal touch surfaces, and (ii) multimodal interfacing facilities, such as touch, speech, voice and gesture, meeting the novel modes of use by the user. A snapshot of the Samsung SUR40 intelligent user interface is shown in **Figure 6** [15].
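The multimodal-interfacing aspect above can be sketched as a dispatcher that routes events from several input channels to one shared handler table, so touch and speech can trigger the same application action. This is not Samsung SUR40 code; the class and event names are illustrative.

```python
# Illustrative sketch of multimodal event routing on a tabletop device.
from typing import Callable, Dict, Tuple

class MultimodalDispatcher:
    def __init__(self) -> None:
        # (modality, event_name) -> handler
        self._handlers: Dict[Tuple[str, str], Callable[[dict], str]] = {}

    def register(self, modality: str, event: str,
                 handler: Callable[[dict], str]) -> None:
        self._handlers[(modality, event)] = handler

    def dispatch(self, modality: str, event: str, payload: dict) -> str:
        handler = self._handlers.get((modality, event))
        if handler is None:
            return "unhandled"
        return handler(payload)

dispatcher = MultimodalDispatcher()
# Touch and speech are bound to the same application action.
dispatcher.register("touch", "tap", lambda p: f"open item {p['item']}")
dispatcher.register("speech", "open", lambda p: f"open item {p['item']}")

print(dispatcher.dispatch("touch", "tap", {"item": 3}))   # open item 3
print(dispatcher.dispatch("gesture", "swipe", {}))        # unhandled
```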

#### **6. Interface design in Indian language perspectives**

One can design an interface intelligent enough to cater to the needs of the user, but the variety of user categories is a major challenge in developing such devices. Intelligent user interface design depends largely on speech processing, image processing and natural language processing tools.

*Indian Language Compatible Intelligent User Interfaces. DOI: http://dx.doi.org/10.5772/intechopen.98525*

A screenshot of a speech processing tool developed for the Odia language is given below. It is based on the concatenative method applied to the phones used in Odia. It helps a user listen to any text written in Odia being converted to speech, and it uses artificial-intelligence-based techniques to cater to the needs of the user. This technique has already been tested for many Indian languages, including Indian English. Here the phones, the smallest sound units used in any language, are identified and, following pronunciation rules, concatenated to form words and thus sentences. Text may be typed directly or read from an existing file, such as an office order, and the system reads it out for the user [16–19] (**Figure 7**).
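The concatenative idea described above can be sketched in a few lines: each phone has a pre-recorded waveform unit, and a word is synthesised by joining the units in pronunciation order. The phone inventory and sample values here are invented for illustration; a real system would use recorded Odia phone units and smoothing at the joins.

```python
# Minimal sketch of concatenative phone-based synthesis (illustrative data).
PHONE_UNITS = {              # phone -> waveform samples (hypothetical)
    "k": [0.1, 0.2],
    "a": [0.3, 0.3, 0.3],
    "m": [0.05, 0.1],
}

def synthesise(phones):
    """Concatenate the stored unit for each phone into one waveform."""
    waveform = []
    for p in phones:
        if p not in PHONE_UNITS:
            raise KeyError(f"no recorded unit for phone {p!r}")
        waveform.extend(PHONE_UNITS[p])
    return waveform

# A (hypothetical) phone sequence for one word:
samples = synthesise(["k", "a", "m", "a"])
print(len(samples))  # 2 + 3 + 2 + 3 = 10
```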

**Figure 7.** *Odia language phone-based text-to-speech system [16].*


**Figure 8.** *Odia language optical character recognition system, which operates on speech commands and reads out the OCRed text to the listener [29].*

It is observed that many documents exist only in scanned form, which cannot be linked directly to any text-to-speech system. Here is a successful attempt to read out such content for users who need it, such as illiterate and visually challenged persons [20–25].

A snapshot of the intelligent user interface for physically challenged users is shown below; it works on mouse clicks as well as speech commands [26–30] (**Figure 8**).
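The dual-input design just described, where the same action can be triggered by a mouse click or a spoken command, can be sketched as one shared action table serving both channels. All function names and command phrases below are illustrative assumptions, not the chapter's actual implementation.

```python
# Sketch: one action table behind both mouse and speech input channels.
ACTIONS = {
    "read_document": lambda: "reading document aloud",
    "stop_reading": lambda: "stopped",
}

# Spoken phrases (e.g. output of a recogniser) mapped onto the same actions.
SPEECH_COMMANDS = {
    "read": "read_document",
    "stop": "stop_reading",
}

def on_mouse_click(button_id: str) -> str:
    """A button click invokes the action directly."""
    return ACTIONS[button_id]()

def on_speech(utterance: str) -> str:
    """A recognised utterance is normalised, then routed to the same action."""
    action = SPEECH_COMMANDS.get(utterance.strip().lower())
    return ACTIONS[action]() if action else "command not recognised"

print(on_mouse_click("read_document"))  # reading document aloud
print(on_speech("STOP"))                # stopped
```

Keeping one action table ensures the two input modalities can never drift apart in behaviour, which matters for accessibility interfaces.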

For all such intelligent interface development, natural language matters, as the user may not always be an English speaker. To make these interfaces more effective for global use, the language factors, covering both image and speech, must be handled where the natural processes of writing and speaking matter. The relevant factors must therefore be considered from a natural language processing perspective.

Here the author has used several user interfaces that can intelligently manage commands expressed in speech or written form. To interpret such input exactly and convert it into a device command, different aspects of NLP are dealt with, such as machine translation, named entity recognition and grammar-restriction-free commands resembling natural speech in command mode. Some of these works are reported in [31–34].
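The grammar-restriction-free aspect mentioned above can be sketched with simple keyword matching: instead of requiring a fixed phrase, any utterance containing the right content words triggers the command, regardless of word order. The keyword sets are invented for illustration; real systems would add morphology handling and synonym lists per language.

```python
# Sketch of order-free command interpretation via keyword subset matching.
COMMANDS = {
    "open_file": {"open", "file"},
    "read_aloud": {"read", "aloud"},
}

def interpret(utterance: str):
    """Return the first command whose keywords all occur in the utterance."""
    words = set(utterance.lower().split())
    for command, keywords in COMMANDS.items():
        if keywords <= words:      # subset test: all keywords present
            return command
    return None

print(interpret("please open the file now"))   # open_file
print(interpret("file open"))                  # open_file (order-free)
print(interpret("close window"))               # None
```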

Gesture is also important, as many users prefer gestures to express intent: at the cognitive level the brain synthesises the command and expresses it in different forms, such as speech, text or gesture.
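As a minimal illustration of gesture as an input channel, a pointer or hand trajectory can be classified by its dominant direction and mapped to a command. The thresholds, gesture labels and command names below are assumptions for the sketch, not from the chapter.

```python
# Sketch: classify a trajectory into a coarse swipe direction, then map
# the direction to a command (illustrative mapping).
def classify_swipe(points):
    """points: list of (x, y) samples; returns a direction label."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):                 # horizontal movement dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"      # vertical movement dominates

GESTURE_COMMANDS = {"left": "previous page", "right": "next page"}

trace = [(0, 0), (40, 5), (90, 8)]         # mostly horizontal, rightward
direction = classify_swipe(trace)
print(direction, "->", GESTURE_COMMANDS.get(direction, "no action"))
# right -> next page
```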
