**2. Related works**

Much work has been carried out on developing comprehensive web-based and software interfaces that can adapt to end-user needs. Some of this work requires adaptive algorithms that learn about changes in user interests or emotions [4]. An ideal user interface (UI) would automatically adapt its layout and web content elements to suit the needs of its users, while also allowing users themselves to alter the contents of the UI [5].

Users adapt easily to less complex applications because of the cognitive ability to quickly familiarize themselves with friendly, well-designed interfaces, such as those used for information distribution and learning [6]. Visually complex applications change the way users view content [7]. Users' reactions can yield quantitative information when physiological sensors are part of the equipment used to study and interpret perception.

Other state-of-the-art techniques, such as those of Dean C. Karnopp et al. [8] and of Franziska Kretzschmar and Simon P. Liversedge et al. [9, 10], attempt to unravel the mystery surrounding emotions: how they work and how they affect our lives. More recent techniques [11–13] developed systems and methods for detecting emotional states, one of which relies on statistics. Speech is first received, and acoustic parameters are extracted from the speech signal. Statistics, or features, are then computed from samples of the voice using the extracted speech parameters. These features serve as inputs to a classifier, which can be a computer program, a device, or both. The classifier assigns at least one emotional state, out of a finite number of possible states, to the speech signal. Such techniques enable scientists to further debate the real nature of emotions and whether affective states are best explained as evolutionary, physiological, or cognitive. Applying this methodology to real-time data collected from a single subject yielded a recognition rate of 71.4%, which is comparable to the best results achieved. The detection mechanism outlined in this chapter has most of the characteristics required to perform emotion detection on a real-time visual stimulus.
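The statistics-based pipeline described above can be sketched in code. This is a minimal illustration, not any of the cited systems: the feature set (summary statistics over a pitch contour), the emotion labels, and the centroid values are all invented for the example, and a simple nearest-centroid rule stands in for whatever classifier a real system would use.

```python
import math
import statistics

# Hypothetical feature extraction: summary statistics computed over an
# acoustic parameter track (here, a pitch contour in Hz).
def extract_features(pitch_track):
    return [
        statistics.mean(pitch_track),                 # average pitch
        statistics.pstdev(pitch_track),               # pitch variability
        max(pitch_track) - min(pitch_track),          # pitch range
    ]

# Toy nearest-centroid classifier: each emotional state is represented
# by a centroid in feature space (these values are invented).
CENTROIDS = {
    "neutral": [120.0, 10.0, 40.0],
    "angry":   [180.0, 35.0, 120.0],
    "sad":     [100.0, 5.0, 20.0],
}

def classify(features):
    """Assign one emotional state from a finite set to the features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

# Example: a flat, low pitch contour lands nearest the "sad" centroid.
track = [98.0, 102.0, 100.0, 99.0, 101.0]
print(classify(extract_features(track)))  # prints "sad"
```

A production system would replace the hand-set centroids with a trained classifier and would extract many more acoustic parameters (energy, speaking rate, spectral features), but the receive → extract → summarize → classify flow is the same.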
