
*Real-Time Capable Sensor Data Analysis-Framework for Intelligent Assistance Systems DOI: http://dx.doi.org/10.5772/intechopen.93735*

*Data Acquisition - Recent Advances and Applications in Biomedical Engineering*

A second sensor that also works via NB-IoT transmission is a motion-sensitive sensor. It was installed to register movements in the room and, in addition, to transmit the room temperature and air pressure to the real-time server via the public radio network. In this case, too, the data are transmitted in JSON format. Further information on the exact specifications of the sensors can be found in the publications by Stege et al. [45–48].

*2.3.9 Real-time controller*

Within the fast care project, the Harz University of Applied Sciences developed a real-time platform for the sensor data fusion of the partial realizations of the partners. For this purpose, a Linux-based application server was configured around the communication protocol selected for the project (MQTT). This "real-time controller", on which all information converges, performs the central sensor data fusion. The device is a rack-mounted server PC with an Intel i7 CPU and 16 GB of 1600 MHz DDR3 memory, depicted in **Figure 14**. The Linux distribution is "Red Hat Enterprise Linux Server release 7.7 (Maipo)". The network interfaces are two 1 Gbit/s IEEE 802.3 ports and a "Realtek Semiconductor Co., Ltd. RTL8192EE PCIe Wireless Network Adapter". More detailed information can be found in the so-called final design plan of the fast care project [21].

**Figure 14.** *Real-time controller with MQTT server.*

**2.4 Sensor data visualization**

The project partners agreed on the technical implementation of the data fusion on the planned real-time server and on the development of a user interface. After the data collection of all partners, these data are evaluated centrally on the real-time controller. The user should receive feedback about the obtained information; this feedback is based on the visualization of the situation analysis. The main view of the real-time visualization is shown in **Figure 15**. With its end customer platform, Exelonix GmbH provides the technological basis for the visualization in the fast care project. All sensor data collected in the MQTT server of the Harz University of Applied Sciences are evaluated and visualized with the Exelonix end customer platform on a web page to which only the project partners had access. The "technical information and data packets" received on the "real-time controller" are transformed and prepared into a form that can be interpreted by those in need of care, relatives and experts; among other things, time courses and histories are added.

**Figure 15.** *Real-time visualization of the measured sensor data.*
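The JSON-over-MQTT data path described above can be sketched in a few lines. The topic name and payload fields below are illustrative assumptions, not the schema actually used in the fast care project, and the MQTT transport itself (e.g. via a client library such as paho-mqtt) is omitted so the sketch stays self-contained:

```python
import json

# Hypothetical payload of the motion-sensitive NB-IoT sensor; the actual
# field names used in the project are not documented in this chapter.
EXAMPLE_PAYLOAD = (
    '{"sensor": "motion-1", "motion": true, "temp_c": 21.4, "pressure_hpa": 1013.2}'
)

def handle_sensor_message(topic: str, payload: str) -> dict:
    """Decode and validate one JSON sensor message, as the real-time
    controller might before the values enter the central sensor data fusion."""
    data = json.loads(payload)
    required = {"sensor", "temp_c", "pressure_hpa"}
    missing = required - data.keys()
    if missing:
        raise ValueError(f"incomplete message on {topic}: missing {missing}")
    return data

reading = handle_sensor_message("fastcare/room/motion", EXAMPLE_PAYLOAD)
```

In a real deployment this handler would be registered as the message callback of an MQTT subscription on the controller; malformed or incomplete packets are rejected before fusion.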

The visualization is shown in **Figure 15**. An avatar appears on the left, in which both the heart rate and the breathing rate are shown optically as movements of the heart and chest. On the right side of the picture there is a heart icon showing the heart rate and a lung icon showing the respiratory rate. Furthermore, the data of the Exelonix sensor are shown: the emergency button status, the room temperature and the room humidity. An indication of the condition of the indoor air is given directly below these displays; in this case, the icon of a green cloud shows that the indoor air is in good condition.
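The coupling of the measured rates to the avatar's heart and chest movement can be illustrated with a small sketch. The function name and the idea of driving each animation cycle by one period are assumptions for illustration only, since the chapter does not describe the implementation:

```python
def animation_periods(heart_rate_bpm: float, breathing_rate_rpm: float) -> tuple:
    """Convert per-minute vital rates into per-cycle animation periods in
    seconds, so one heartbeat / one breath of the avatar matches the rates."""
    if heart_rate_bpm <= 0 or breathing_rate_rpm <= 0:
        raise ValueError("rates must be positive")
    return 60.0 / heart_rate_bpm, 60.0 / breathing_rate_rpm

# At 75 bpm and 15 breaths per minute, the avatar's heart would pulse
# every 0.8 s and the chest would rise and fall every 4.0 s.
heart_period, chest_period = animation_periods(75, 15)
```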

Additional sensor data are depicted on the avatar sketch. In the hip, knee and ankle areas of the legs, the battery charge states of the IMUs used for recording the posture and knee angle are shown. The knee angle measured on the leg with the prosthesis is shown online in the graphic on the right, where the knee angle is plotted in degrees over time while walking.

The gait parameters of the patient, which are also recorded by the IMUs on the hips, knees and ankles (see Section 2.3.3), can be followed online next to the two icons for the step width and the lifting height of the foot. This allows the gait to be assessed and improved in situ for rehabilitation purposes.
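A common simplification in sagittal-plane IMU gait analysis is to take the knee flexion angle as the difference between the thigh and shank segment orientations. Whether the project computed it exactly this way is not stated, so the following is only an illustrative sketch with made-up sample values:

```python
def knee_angle_deg(thigh_pitch_deg: float, shank_pitch_deg: float) -> float:
    """Knee flexion angle as the difference of the two segment pitch angles
    (sagittal-plane simplification; sensor fusion and calibration omitted)."""
    return thigh_pitch_deg - shank_pitch_deg

# Illustrative segment orientations over four samples of a gait cycle.
thigh = [20.0, 25.0, 15.0, 5.0]
shank = [10.0, -15.0, -30.0, -10.0]
knee = [knee_angle_deg(t, s) for t, s in zip(thigh, shank)]
# knee == [10.0, 40.0, 45.0, 15.0], the kind of degrees-over-time curve
# shown online in the real-time visualization.
```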

In addition to this main page of the real-time display, a sub-page has been created for each partner's application, in which the details of the individual sensor elements and their operation are summarized. The details of the partners' real-time visualizations are described in particular in the final design plan, which can be found in the publication of Kußmann et al. [21].

### **3. User acceptance studies**

In addition to the technical development activities, an acceptance analysis was carried out at the AAL lab of the Harz University. As a result of the project, fast care aims to develop feasible products and to create the medical fundamentals for an interaction (feedback) in real time.

The project partners agreed on the technical implementation of the data fusion on the planned real-time server and on the development of a user interface. This was done in addition to the workload of integrating all technical components and the planned example application. After the data collection of all partners, these data are evaluated centrally on the real-time controller. The user should receive feedback about the obtained information; this feedback is based on the visualization of the situation analysis.

For the acceptance analysis of the system, a small sample of 20 subjects from different age groups was interviewed. **Figure 16** shows the distribution by gender and age. Although this study is not representative, it gives a first insight into how the developed technology is valued.

During the survey, the subjects had to assess both the individual systems of the project partners and the overall system. The survey results for the entire system were very positive: 60% of the respondents stated that they would like to use the technology privately, 70% would like to have access to the technology, 35% would be willing to buy the presented technology and 95% see a great benefit for themselves and for others in the tested technology (see **Figure 17**).
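Since the sample comprised only 20 subjects, each of the quoted percentages corresponds to a whole number of respondents. A quick check (the category labels are paraphrased, not the questionnaire's wording):

```python
n_subjects = 20
shares = {
    "would use privately": 0.60,
    "would like access": 0.70,
    "would buy": 0.35,
    "sees great benefit": 0.95,
}

# Convert the reported shares back into respondent counts out of 20.
counts = {item: round(share * n_subjects) for item, share in shares.items()}
# counts == {'would use privately': 12, 'would like access': 14,
#            'would buy': 7, 'sees great benefit': 19}
```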

In another part of the test, the sample's affinity for technology was queried. On average, the confidence in one's own skills when dealing with new technology was rated with 3.33 out of 5 points, the willingness to use new and unknown technology with 4 out of 5 points, and the degree of technical overload with only 2.13 out of 5 points. The test subjects thus showed a great willingness to use new technologies and did not feel overwhelmed by the technology used (see **Figure 18**).


**Figure 16.** *Age and gender distribution of the test persons.*



**Figure 17.** *Use of the presented technologies.*


**Figure 18.** *Technical affinity of the test persons.*

**Figure 19** illustrates that the subsystem of the project partner Ottobock was rated positively by the test subjects. The success of the measurement was rated on average with 4.35 out of 5 points, the success of the calibration with 3.97 out of 5 points and the intelligibility of the display with 3.27 out of 5 points. The women rated the manageability of the system, with 4.08 out of 5 points, slightly better than the men with 3.44 out of 5 points.

The gait analysis of the project partner Otto von Guericke University was rated very positively by the subjects with 4.27 out of 5 points; the technology used by the OvGU Kinect system was rated with 3.9 out of 5 points. The more the test subjects felt overwhelmed by the technology, the more negatively the system was rated (see **Figure 20**).

**Figure 19.** *Evaluation of the application of the active prosthetic foot.*

Analyzing the system of the TU Dresden, the success of the measurement was rated with 4.05 out of 5 points and the comprehensibility of the instructions likewise with 4.05 out of 5 points. The instructions were harder to comprehend for test subjects who felt overwhelmed by the technology. The intelligibility of the display and the results was rated with 3.58 out of 5 points (see **Figure 20**).

**Figure 20.** *Evaluation of the applications of the demonstrators of OvGU and TU Dresden.*

### **4. Conclusion**

The implemented sensor structure records the heart rate, the breathing rate and the VOC content of the room air, analyzes the gait for rehabilitation and measures the temperature and humidity in the room. An emergency button has also been integrated.

An active prosthetic foot was used as a special application of the sensor-actor system. Its running parameters can be measured online, and the prosthesis can automatically adapt to the floor covering and the running demands via the network. This means that users have an intelligent active prosthesis at their disposal to help them cope with everyday life more easily.

It was shown that real-time operation of the AAL components was possible even with a heterogeneous network consisting of the components WiFi, Bluetooth LE, Gigabit LAN and 4G+. Even the display of the measured data, which was transferred to a website via the cloud, showed additional latencies of only a few milliseconds. This made it possible to create a real-time image of all vital parameters in the form of an avatar and to set the active prosthetic foot automatically, which enables the client to notice his physical condition in situ.

In addition to the technical development activities, an acceptance analysis was carried out on the demonstrator in the AAL laboratory. The survey results for the entire system were very positive: 60% of the respondents stated that they would like to use the technology privately, 70% would like to have access to the technology, 35% would be willing to buy the presented technology and 95% see a great benefit for themselves and for others in the tested technology.

Unfortunately, some slow network technologies such as Bluetooth LE had to be used to carry out the project. It is to be expected that with the full expansion of the networks to the fifth generation (5G) there will be a further significant leap in transmission speed and transmission quality, so that eHealth applications in the home area can be implemented in real time in the near future. After the data fusion, further processing with the help of artificial intelligence will bring additional benefits to the client for the prevention of physical and mental health problems.

**Acknowledgements**

The fast care project was supported by the German Federal Ministry of Education and Research in the program "Zwanzig20 – Partnerschaft für Innovation", contract no. 03ZZ0519I. It was carried out as a joint project with eight partners and a project coordinator. We thank all fast care project partners for their contributions to this work: Thomas Kirste, Christian Haubelt, Albert Hein and Florian Grützmacher from the University of Rostock; Ernst Albrecht-Laatsch, Bernhard Graimann, Martin Schmidt and Katharina Olze from Ottobock; Alexander Trumpp, Daniel Wedekind, Martin Schmidt, Sebastian Zaunseder and Hagen Malberg from Technische Universität Dresden; Christian Reinboth and Jens-Uwe Just from HarzOptics; Matthias Stege, Frank Schäfer, Tristan Heinig and Sascha Huth from Exelonix; Rainer Dorsch from Bosch Sensortec; and Lutz Schega, Sebastian Stoutz and Kim-Charline Broscheid from Otto-von-Guericke-Universität Magdeburg.
