*Data Acquisition - Recent Advances and Applications in Biomedical Engineering*
*Real-Time Capable Sensor Data Analysis-Framework for Intelligent Assistance Systems*
*DOI: http://dx.doi.org/10.5772/intechopen.93735*

**1. Introduction**

Assistance systems in Ambient Assisted Living (AAL) and in medical care have to recognize relevant situations that require fast assistive intervention. Former projects in this field, such as tecla [1–3] or PAUL [4], focused on the application of new AAL technologies in AAL test beds to obtain information about the acceptance level [5, 6] of the technologies and the different new applications for the patients. Additionally, business models [7, 8] have been drafted to establish a successful AAL business area in the future.

The clinically established measurement technology for diagnostics, monitoring, and risk stratification does not translate directly to the outpatient setting (ambulatory or domestic environments). The key challenge is that many relevant situations only become noticeable when various sensor modalities are merged, such as for the discrimination between a pathological, emotional [9], or stress-induced increase of the heart rate [10]. This is only possible by combining multiple different sensors [11]. The same applies to the analysis of joint kinematics in everyday activities, which requires more inertial sensors with higher accuracy.


The next generation of radio networks (5G) [12] introduces new possibilities for real-time communication in all areas of life, with very low latency and high data rates. One speaks of a so-called tactile Internet. People come into contact with their surroundings through their senses, which involve several different reaction times. Here, muscular, audio-visual, and tactile response times are of particular importance. The typical muscular response time is around 1 second, that of hearing around 100 ms, while the visual response time is in the range of 10 ms [13].

In the case of active control of an object, such as a car or a machine, information must be recorded while a reaction is carried out at the same time. The familiar use of a touch screen requires that the finger be moved in a controlled manner across the screen. The touch screen must therefore achieve a response time of less than 1 ms in order not to produce any noticeable delay in the visual impression. In the case of an active prosthesis, as applied in this study, the response time must be below 10 ms to provide a practical basis for its use in daily life. Therefore, fast sensor data frameworks are needed to identify conditions in real time and subsequently provide medically valid assistance [12, 14].
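As a rough illustration of these constraints, the sketch below checks a summed end-to-end delay against per-application response-time limits. The limits mirror the figures quoted above; the function name and the individual stage delays are purely hypothetical.

```python
# Hypothetical sketch: compare an end-to-end latency budget against the
# application-specific response-time limits quoted in the text.

# Response-time limits in milliseconds (values taken from the text above).
LIMITS_MS = {
    "touch_screen": 1.0,        # no noticeable visual delay
    "active_prosthesis": 10.0,  # usable in daily life
    "audio_feedback": 100.0,    # auditory reaction time
}

def within_budget(application: str, stage_delays_ms: list) -> bool:
    """Return True if the summed stage delays meet the application's limit."""
    return sum(stage_delays_ms) <= LIMITS_MS[application]

# Made-up stages: sensor read + network hop + control loop + actuator drive.
delays = [1.5, 2.0, 3.0, 2.5]  # 9.0 ms in total
print(within_budget("active_prosthesis", delays))  # True: 9.0 ms <= 10 ms
print(within_budget("touch_screen", delays))       # False: 9.0 ms > 1 ms
```

The same check would flag that a budget acceptable for an active prosthesis is still far too slow for visually coupled touch interaction.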

The aim of the fast care project was to develop a real-time sensor data analysis framework [9] for intelligent assistance systems in the areas of Ambient Assisted Living (AAL), eHealth, mHealth, tele-rehabilitation, and tele-care. It provides a medically valid, integrated real-time situation picture based on a distributed, ad hoc networked, everyday-usable, and energy-efficient sensor infrastructure with a latency of less than several milliseconds. The integrated situation picture, which includes physiological, cognitive, and kinematic information about the patient, is generated by the intelligent fusion of sensor data [15, 16]. It can serve as a basis both for the rapid detection of risks and dangerous situations and for everyday medical assistance systems that autonomously intervene in real time [17, 18], and it allows active telemedical feedback [10].

In this chapter, after the introduction, the technical goals and implementation options of a fast sensor network with real-time data analysis are presented, followed by the structure of the overall system. Section 2 presents the details of the technological concept, such as data fusion and telemetry, and discusses all interfaces relevant for real-time applications in detail. The following section covers the hardware, the sensors and actuators, and the specific installation of the demonstrator in laboratory operation. The subsequent part details the individual sensor systems and the corresponding visualization of the sensor data by an avatar. Section 3 analyzes and discusses the acceptance tests for the use of the sensor components of the demonstrator. Finally, a summary with a view of upcoming developments is given at the end.

#### **2. Technical goals and solutions**

#### **2.1 System setup**

The basis of a medically valid, integrated real-time picture of the situation is an ad hoc interconnected sensor infrastructure. Its latency must be very low to meet the constraints of a haptic working network. Here, physiological, cognitive, and kinematic information of a patient is captured with the help of intelligent sensor data fusion. These data can be combined to provide an integrated picture of the patient's physical and mental situation. In this way, it should be ensured that the framework can be used for applications in which feedback has to be embedded synchronously. This can be realized in the visual, auditory, tactile, or proprioceptive channels of perception, such as in the support of motor function and kinematics for rehabilitation and for active prosthetics and orthotics.

**Figure 1** shows an overview of the system concept of the project approach for an integrated sensor infrastructure in the home of an elderly person. It combines GPS data, air pressure and temperature data, vital parameters, cameras, optical sensors, and so-called inertial measurement units (IMUs).

These sensor data are aggregated in real time and buffered in a database system. From this database, an integrated real-time situation analysis is generated that touches on three areas of human life: first, the kinematic data, such as localization, movement, and posture; second, the cognitive sub-area, with awareness, emotionality, and mental clarity; third, the physiological data, in which cardiovascular, metabolic, and neurological data can be recorded and analyzed.
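The three sub-areas of the situation picture described above can be sketched as a simple container. The field names below are illustrative assumptions, not the project's actual data model.

```python
from dataclasses import dataclass

# Illustrative sketch of the integrated situation picture described above.
# All field names are assumptions for illustration, not the project's schema.

@dataclass
class KinematicState:
    location: str          # e.g. "kitchen"
    movement: str          # e.g. "walking"
    posture: str           # e.g. "upright"

@dataclass
class CognitiveState:
    awareness: float       # 0..1
    emotionality: float    # 0..1
    mental_clarity: float  # 0..1

@dataclass
class PhysiologicalState:
    heart_rate_bpm: float
    respiration_rate_bpm: float
    voc_ppb: float         # volatile organic compounds in room air

@dataclass
class SituationPicture:
    kinematic: KinematicState
    cognitive: CognitiveState
    physiological: PhysiologicalState

picture = SituationPicture(
    KinematicState("kitchen", "walking", "upright"),
    CognitiveState(0.9, 0.4, 0.8),
    PhysiologicalState(72.0, 14.0, 120.0),
)
print(picture.physiological.heart_rate_bpm)  # 72.0
```

Grouping the fused values this way keeps the kinematic, cognitive, and physiological sub-areas separable for the downstream situation analysis.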

This entirety of data in the home of the person can be evaluated integratively and can accordingly provide a precise analysis of their health. In this project, all the addressed areas except the emotional and neurological aspects were recorded and evaluated. Based on the situation analysis, actuators are implemented for rehabilitation; in the special case of an active foot prosthesis, these can adjust different heel heights, adapt automatically to different floor conditions, or support rapid walking. Furthermore, the client is provided with a real-time display of his or her vital parameters by a so-called Smart Home Assistant, which can give helpful health support to the client.

**Figure 1.**
*Integrated system concept.*

For a real-time application, it is necessary that the latency between sensor detection and actuator actuation is less than several milliseconds. This ensures the so-called haptic functionality of the system and can be achieved with the help of new radio technologies and fast network technologies such as FTTH and the fifth generation of mobile radio networks (5G). To ensure private data security, all data are stored and evaluated on a so-called home server, which is situated in the client's apartment. Further intervention options are possible via a secure cloud connection to medical services or to the system administrators for updates of the sensor and actuator components.

The objective of a distributed, real-time medical sensor technology and signal processing is to obtain an evaluation of the patient's situation from the available data in real time. The main application focus is on orthopedic devices. For example, the optimization of a leg prosthesis' damping members and active foot positioning points shall be executed online. Currently, these parameters are set offline and by hand by orthopedic technicians, with variable quality. This often leads to suboptimally adapted orthopedic devices, whose functionality and efficacy are correspondingly limited, and therefore to an unsatisfactory rehabilitation outcome. This system approach of integrating the sensors into an active foot prosthesis is called a real-time active prosthetics/orthotics time controller. Another project section describes the online estimation of the cognitive condition, motion analysis for rehabilitation, and cardiopulmonary performance.

#### **2.2 Technological concept**

Based on the project goals, the technical and content requirements of the technological topics to be worked on were specified, categorized, and summarized by the individual partners. The basic requirements fall into the following areas:

1. Hardware/sensors,
2. Network,
3. Data analysis,
4. Actuators/intervention/feedback.

The challenge of a distributed, real-time medical sensor technology and signal processing is addressed by means of sensor-based data processing and sensor hubs, optical sensors, hardware system optimization, and the development of distributed systems as well as interface network sensors. The focus of the project was on the intelligent fusion of sensor and actuator data as well as their evaluation and delivery in real time. In order to meet this objective, the following developments took place in the Ambient Assisted Living (AAL) Lab of the Harz University of Applied Sciences in Wernigerode (**Figure 2**):

• Analysis of requirements

• Data acquisition

• Data fusion

• Data analysis

• Situation detection and assistance in real-time

• Acceptance analysis

**Figure 2.**
*Application of fast care real-time sensor system.*


The system diagram of the research approach of the fast care framework is shown in **Figure 3**. The fast care framework is the technical basis for the realization of the fast care project, which implements the fusion of heterogeneous sensors via heterogeneous networks. The basic idea of the fast care framework is to derive a condition from the past and the current states of the sensory data using several newly developed sensor applications, including the following areas and interfaces (see **Figure 3**). From the network-topological representation, a breakdown of the used network interfaces was made, as specified by the project partners. Based on this, a suitable communication protocol was selected with regard to the individual implementations. Communication via MQTT forms the basis of the communication between the sensor applications and the real-time controller depicted in **Figure 3**. On the left side of the figure, the sensor applications are situated, consisting of a Kinect system for motion data, inertial measurement units (IMUs) for detecting movements of body and objects in a fixed sequence for the analysis of a workout in a kitchen, motion sensors/actuators in an active intelligent prosthesis, a camera-based heart rate and breathing sensor, and finally a special sensor for volatile organic compounds in the room air. Prosthesis, body, and object sensors are connected via smartphone and Bluetooth Low Energy, while the smartphone transfers the data to the real-time controller.

In total, the seven sensor components are listed there on the left. The active prosthesis, the heart rate measurement, the respiratory rate measurement, the detection of VOC components in the breathing air, the detection of movement in the room, the measurement of room temperature and humidity, as well as the emergency button each use the corresponding network structure according to the blocks shown in the sketch.
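The idea of deriving a condition from past and current sensor states can be sketched with a small sliding-window detector: it flags a sustained heart-rate elevation while ignoring single outlier samples. The class name, threshold, and window size are illustrative assumptions, not parameters from the project.

```python
from collections import deque

# Sketch of "deriving a condition from past and current states": flag a
# sustained heart-rate elevation while ignoring single outlier samples.
# Threshold and window size are illustrative assumptions.

class ConditionDetector:
    def __init__(self, threshold_bpm: float = 100.0, window: int = 5):
        self.threshold = threshold_bpm
        self.history = deque(maxlen=window)  # past states

    def update(self, heart_rate_bpm: float) -> bool:
        """Return True once the entire recent window exceeds the threshold."""
        self.history.append(heart_rate_bpm)  # current state joins the past
        return (len(self.history) == self.history.maxlen
                and all(v > self.threshold for v in self.history))

detector = ConditionDetector()
readings = [80, 95, 110, 112, 115, 118, 120]
flags = [detector.update(r) for r in readings]
print(flags)  # only the last window [110, 112, 115, 118, 120] triggers
```

A real deployment would combine several such per-sensor conditions (heart rate, respiration, motion) before raising an assistive intervention.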

After the individual implementations of the interfaces, a suitable software communication server was selected. The MQTT protocol [19] was implemented on a real-time capable Linux variant. Suitable hardware was procured by the project partner Harz University of Applied Sciences, a suitable operating system was installed, and the MQTT software server "mosquitto" [20] was installed and configured. The definition of topics (message channels) and the specification of the data formats were necessary for smooth communication of the individual partner realizations "in themselves" and "with each other." A detailed description of the communication formats between the sensors built by the partners and the MQTT server can be found in the final design plan of the fast care project [21].

At the beginning of the project, the communication protocols to be used between the individual project partners for data exchange were discussed and clearly defined (see **Table 1**). The network interfaces used in the project are essentially Bluetooth LE transmission, Wi-Fi transmission, and wired transmission via Ethernet 802.3. Furthermore, wireless transmission via LTE or 4G+ was used by several partners. This resulted in a very broad transmission application scenario. An overview of the transmission technology from the sensor infrastructure to the real-time controller and the forwarding to the real-time visualization is depicted in **Figure 4**.

**Figure 3.**
*Network topology.*

| N. | Description | Partner | Technology |
|-----|-------------|---------|------------|
| S1 | Communication interface between IMUs (object) and a smartphone | HSH | Bluetooth LE |
| S2 | Communication interface between IMUs (body) and a smartphone | HSH | Bluetooth LE |
| S3 | Communication interface between a prosthesis and a smartphone | OBO, HSH | Bluetooth LE |
| S4 | Communication interface between the camera-based vital sensors and the real-time controller | TUD, HSH | |
| S5 | Communication interface between the Kinect system and the real-time controller | OvGU, HSH | |
| S6 | Communication interface between the VOC air sensor and the real-time controller | HO, HSH | |
| S7 | Communication interface between the smartphone and the real-time controller | HSH | |
| S8 | Communication system between the smartphone and the cloud system | | 4G+ |
| S9 | Communication system between the real-time controller and the cloud system | | WAN |
| S10 | Communication system between the cloud system and the end device | | WAN |
| S11 | Communication system between the Kinect system and the end device | OvGU | WAN |
| S12 | Communication system between the prosthesis and the real-time controller | OBO | |

**Table 1.**
*Overview of network interface parts used in fast care.*
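A minimal sketch of what such topic and payload conventions might look like is shown below. The topic names and JSON fields are invented for illustration; the actual formats are defined in the project's final design plan [21].

```python
import json
import time

# Hypothetical topic layout for the sensor components named above
# (invented names, not the project's actual topic definitions).
TOPICS = {
    "prosthesis": "fastcare/prosthesis/state",
    "heart_rate": "fastcare/vital/heart_rate",
    "respiration": "fastcare/vital/respiration",
    "voc": "fastcare/air/voc",
    "motion": "fastcare/room/motion",
    "climate": "fastcare/room/climate",
    "emergency": "fastcare/emergency/button",
}

def encode_reading(sensor: str, value: float, unit: str):
    """Build a (topic, payload) pair for publishing one sensor reading."""
    payload = {"sensor": sensor, "value": value, "unit": unit,
               "timestamp": time.time()}
    return TOPICS[sensor], json.dumps(payload).encode("utf-8")

def decode_reading(payload: bytes) -> dict:
    """Inverse of encode_reading, as the real-time controller would apply it."""
    return json.loads(payload.decode("utf-8"))

topic, payload = encode_reading("heart_rate", 72.0, "bpm")
print(topic)                             # fastcare/vital/heart_rate
print(decode_reading(payload)["value"])  # 72.0
```

With a broker such as mosquitto [20], these pairs would simply be handed to an MQTT client's publish call; the encoding itself is independent of the client library used.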

