*Data Acquisition - Recent Advances and Applications in Biomedical Engineering*


*Real-Time Capable Sensor Data Analysis-Framework for Intelligent Assistance Systems*

*DOI: http://dx.doi.org/10.5772/intechopen.93735*


#### **2.3 Hardware, sensors, actors**

In this part, all of the hardware components developed in the project are described. On the one hand, this includes sensors whose task is to capture a physical measured variable such as motion, VOC gas concentration or heart rate. Furthermore, sensor modules with combined sensors have been developed which form a functional unit with actuators, e.g. the electronically controllable lower-leg prosthesis. For a better overview of the components used by the individual partners, a matrix of all partners and their network interfaces was created (see **Table 2**).


**Table 2.**

*Types of hardware components used by the cooperation partners (Kinect, camera, VOC sensor, smartphone, IMUs on body and object, prosthesis, real-time controller, cloud, terminal) for the partners HSH, TUD, OvGU, URO, EX, OBO, HO and BST.*
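An overview like Table 2 is essentially a binary usage matrix of partners against components. As an illustration only — the partner abbreviations and their assignments below are stand-ins, not the project's actual data — such a matrix can be modeled as a mapping from partner to component set and rendered as a "+" table:

```python
# Hypothetical sketch of a partner/component usage matrix in the style of Table 2.
# Partner abbreviations and assignments are illustrative, not the real project data.
COMPONENTS = ["Kinect", "Camera", "VOC sensor", "IMU (body)", "Real-time controller"]

usage = {
    "OvGU": {"Kinect", "IMU (body)"},
    "TUD": {"Camera", "Real-time controller"},
    "HSH": {"Real-time controller"},
}

def render_matrix(usage, components):
    """Render a '+' matrix: one row per partner, '+' where a component is used."""
    rows = []
    for partner, used in sorted(usage.items()):
        marks = ["+" if c in used else "-" for c in components]
        rows.append(f"{partner:5s} " + " ".join(marks))
    return "\n".join(rows)

print(render_matrix(usage, COMPONENTS))
```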


In the following subsections, all of the hardware used and all sensors/actors are collected and described.

#### *2.3.1 AAL lab installation*

Fast and intelligent sensors and actuators, improved motion pattern recognition and intelligent algorithms for real-time network integration in three demonstrators of the AAL-Lab serve as solution approaches. Within the fast care project, a real-time network integration with demonstrators is to be carried out at the AAL-Lab of the Harz University. The various partial results of the project partners have been collected and integrated in the AAL-Lab. The integration at the AAL-Lab is performed with a focus on user friendliness and interaction with the user by means of a show flat. **Figure 5** illustrates the realized structure of the AAL-Lab with various elements for monitoring and evaluating the measured vital data. The lab includes the following parts: sensors on the walls (pulse, blood pressure, breathing frequency, motion/position, VOC breath analysis), the e-rehabilitation workout and the real-time controller PC.

In **Figure 6** you can see the laboratory, including a sofa, several armchairs, a bed and all the sensor components attached to the room, as shown in **Figure 5**. The room has been deliberately designed like an old-fashioned room to create a pleasant atmosphere for the examinations. After the technology was installed, the acceptance tests were carried out in this environment.

#### *2.3.2 E-rehabilitation system*

After the data has been transferred to the real-time controller, it is available in the form of JSON objects stored on the Linux system of the server. At the same time, an integrative situation analysis of the sensor data is carried out, and the corresponding information is transferred via the public network to a cloud server for real-time visualization, which generates a website presenting the evaluated real-time data in the form of an avatar.

**Figure 4.**

*Network infrastructure [22].*
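The data flow just described — evaluated sensor values packed as JSON objects, published to the real-time controller and retained for the visualization — can be sketched as follows. This is a minimal sketch under stated assumptions: the topic layout (`fastcare/vital/...`) and payload field names are hypothetical, and the publish call is simulated in a dictionary instead of using a real MQTT broker.

```python
import json
import time

def build_payload(sensor_id, values):
    """Pack evaluated sensor readings into a JSON object for the real-time controller."""
    return json.dumps({
        "sensor": sensor_id,      # hypothetical field names, not the project's schema
        "timestamp": time.time(),
        "values": values,
    })

def publish(topic, payload, store):
    """Stand-in for an MQTT publish: the broker retains the last payload per topic."""
    store[topic] = payload

broker_store = {}  # simulates retained messages on the MQTT server
payload = build_payload("cbppg-cam-1", {"heart_rate_bpm": 68, "breath_rate_bpm": 14})
publish("fastcare/vital/cbppg-cam-1", payload, broker_store)

# The visualization side reads the retained JSON object back for the avatar.
restored = json.loads(broker_store["fastcare/vital/cbppg-cam-1"])
```

Retaining the last message per topic is what lets a newly connected visualization client render the current state immediately, without waiting for the next sensor update.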

**Figure 5.**

*AAL lab of the Harz university; sketch of installations; (a) sensors on the walls: Pulse, blood pressure, breathing frequency, skin resistance, motion/position, VOC breath analysis, (b) E-rehabilitation, (c) real-time controller.*

**Figure 6.**

*Photograph of the AAL lab.*

The Kinect sensor used by the Otto von Guericke University in fast care is a physical device with depth-sensor technology, an integrated color camera, an infrared transmitter and a microphone array that detects the position, movement and voices of people. **Table 3** shows the data of the Kinect depth sensor, while **Figure 7** shows the workout scene. The application guides the patient through a therapeutic workout and gives real-time information and helpful feedback so that the patient moves in the right way. Additionally, a gait analysis [23, 24] can be performed using IMUs positioned at the feet, shown in **Figure 7**. More detailed information can be found in Stoutz et al. [25] (**Table 3**).
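Real-time movement feedback of this kind can be derived from the tracked joint positions. The following is a sketch under simplifying assumptions — made-up joint coordinates and a plain three-point angle, not the project's actual algorithm: it computes the knee angle from hip, knee and ankle positions, the kind of quantity an avatar-based feedback loop could check against a therapeutic target.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D points a-b-c, via the dot product."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Made-up skeleton coordinates (meters) for illustration only.
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.0), (0.0, 0.0, 0.2)
angle = joint_angle(hip, knee, ankle)
feedback = "extend the leg further" if angle < 170.0 else "good posture"
```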

#### *2.3.3 Inertial measurement unit (IMU)*

The IMU used by the project partners "Otto Bock HealthCare GmbH", "Otto von Guericke University" and "University of Rostock" is an inertial measurement unit. It is a self-contained measuring system that continuously records, analyzes and, if necessary, pre-processes defined physical parameters (e.g. movement, acceleration, pressure) and forwards them to downstream communication and network protocols (see **Figure 8**). A distinction is made between two application modes. On the one hand, an IMU can be installed on an object, e.g. in a kitchen appliance [26], which describes the "IMU on object" use case and provides measurement data for further analysis. The other area of application is attaching an IMU with suitable holders to the body of a person, which describes the "inertial sensor on body" use case and also provides measurement data for further analysis [27, 28]. The project partner "Bosch Sensortec GmbH" [29, 30] developed and produces the IMUs used in the fast care project [31].
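As a rough illustration of the kind of pre-processing such an IMU can perform before forwarding data — with made-up sample values and a simple magnitude threshold, not Bosch Sensortec's actual firmware logic — accelerometer samples can be reduced to motion flags on the device:

```python
import math

GRAVITY = 9.81  # m/s^2

def accel_magnitude(sample):
    """Magnitude of a 3-axis accelerometer sample (ax, ay, az) in m/s^2."""
    return math.sqrt(sum(a * a for a in sample))

def detect_motion(samples, threshold=1.5):
    """Flag samples whose deviation from gravity exceeds a threshold (m/s^2)."""
    return [abs(accel_magnitude(s) - GRAVITY) > threshold for s in samples]

# Synthetic stream: at rest, then a movement burst, then at rest again.
stream = [(0.0, 0.0, 9.81), (0.1, 0.0, 9.8), (4.0, 3.0, 12.0), (0.0, 0.1, 9.82)]
flags = detect_motion(stream)
```

Forwarding only such condensed events, rather than raw samples, keeps the load on the downstream network protocols low.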

**Figure 7.**

*Setup of the gait measurements for e-rehabilitation of Otto von Guericke university; above left: IMU application at the feet; above right: Therapeutic movements with avatar; lower middle: Presentation of gait analysis measurement.*

| Feature | Description |
|---|---|
| Depth sensor | 512 × 424 pixels, 30 Hz; FOV: 70° × 60°; range (one mode): 0.5–4.5 m |
| Color camera | 1080p resolution, 30 Hz (15 Hz in poor lighting conditions) |
| New active infrared functions | IR functions for lighting-independent observations; optimized 3D visualization, detection of smaller objects in particular and stable body tracking |
| Multi-array microphone | Four microphones to find the sound source and the direction of the audio wave |
| Interfaces | Kinect: AUX (USB); Kinect2: AUX (USB) |

**Table 3.**

*Data of the used KINECT sensor system for e-rehabilitation.*

#### *2.3.4 Camera-based vital parameter sensor*

The camera-based vital sensors [32, 33] used by the project partner "Technical University Dresden" [34–36] are based on one or more camera systems with an associated, spectrally controllable lighting system and generate a spatial image of the surroundings as a database for further evaluations. Camera-based photoplethysmography (cbPPG) remotely detects the volume pulse of cardiac ejection in the peripheral circulation. The system measures the heart rate and the breathing rate contactlessly with a camera system in real time. More detailed information is given in the work of Zaunseder et al. at the Institute of Biomedical Technologies of the Technical University of Dresden [37, 38]. The camera-based system records the change in the movement of the surface of the face in a fast data recording (see **Figure 9**).

Illumination with an LED light source in a special spectral range is necessary to obtain particularly good contrast. The raw image data are sent directly to a controller and evaluated there. The evaluated data (heart rate, respiratory rate) are transferred directly as a JSON object to the real-time controller via Ethernet cabling at 1 Gb/s and stored there in the MQTT server. The respiratory rate and the heart rate are then visualized in real time by the avatar (see Section 2.4, Sensor Data Visualization).
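The extraction of a heart rate from such a camera signal can be illustrated with a toy version: a synthetic pulse waveform sampled at a camera-like frame rate, with the dominant frequency found by a brute-force spectral search over the plausible heart-rate band. This is a sketch only — synthetic data and a simple spectral peak, not the cbPPG pipeline actually used by TU Dresden.

```python
import math

def dominant_bpm(signal, fs, lo_hz=0.7, hi_hz=3.0):
    """Return the strongest frequency in [lo_hz, hi_hz] as beats per minute.

    Brute-force DFT: correlate the mean-removed signal with sin/cos probes
    on a 0.02 Hz grid and pick the frequency with maximal power.
    """
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    best_f, best_p = lo_hz, -1.0
    f = lo_hz
    while f <= hi_hz:
        re = sum(x[k] * math.cos(2 * math.pi * f * k / fs) for k in range(n))
        im = sum(x[k] * math.sin(2 * math.pi * f * k / fs) for k in range(n))
        power = re * re + im * im
        if power > best_p:
            best_f, best_p = f, power
        f += 0.02
    return best_f * 60.0

# Synthetic "skin brightness" pulse at 1.2 Hz (72 bpm), 30 fps camera, 10 s window.
fs = 30.0
signal = [0.5 + 0.05 * math.sin(2 * math.pi * 1.2 * k / fs) for k in range(300)]
```

Restricting the search to 0.7–3.0 Hz (42–180 bpm) suppresses lighting drift and other out-of-band disturbances that a real camera signal would contain.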

