**3.2 IoT sensor data structure**

As described in Section 3.1, the IoT sensor installed on the floor to identify the user's location in the indoor area is a beacon (referred to as a Bluetooth Low Energy (BLE) sensor in this chapter), and sensor placement is based on the map of the indoor area in which the route guidance service is provided. Accordingly, IoT sensor mapping was carried out by designing an appropriate zone for each sensor.


After the map of the indoor area was zoned in this way for the location-based service, the IoT sensors were mapped to each zone, and an identifier code scheme was designed for each mapped sensor. In this chapter, the standard BLE sensor data structure was adopted in consideration of service scalability and terminal compatibility with the platforms (Android and iOS). The universally unique identifier (UUID) field of the data structure carries the identifier that classifies the route guidance service for the visually impaired, the Major field carries the identifier of the region where the indoor area is located, and the Minor field carries the identifier of the facility within the indoor area. The contents of these two fields are configured differently depending on the characteristics of the indoor area, such as a railway station, an underground shopping mall, or a building. In the figure, the allocation range denotes the information for each zone defined in the sensor mapping process. Each BLE sensor carrying such a standard identification code is assigned to a zone in the indoor area map and emits a radio frequency (RF) signal that conveys physical location information. **Table 4** shows an example of the identifier code design for the Major and Minor fields when the indoor area to be serviced is a metro station. If the service area is not a railway station but a different type of area, such as an underground shopping mall, the structure of the Major and Minor fields is adjusted to the characteristics of the target area.

*IoT-Based Route Guidance Technology for the Visually Impaired in Indoor Area DOI: http://dx.doi.org/10.5772/intechopen.105549*

#### **Table 4.**

*Example of a Beacon identifier code for a metro station.*
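The UUID/Major/Minor field design described above can be sketched as a small data structure. This is an illustrative sketch only; the UUID value, region code, and facility code below are hypothetical and are not the chapter's actual identifier assignments.

```python
from dataclasses import dataclass

# Hypothetical field layout mirroring the identifier design described above:
#   UUID  -> classifies the route-guidance service for the visually impaired
#   Major -> region where the indoor area is located
#   Minor -> facility information of the indoor area (e.g., a metro station)
SERVICE_UUID = "f7826da6-4fa2-4e98-8024-bc5b71e0893e"  # example value only

@dataclass(frozen=True)
class BeaconId:
    uuid: str    # service classification identifier
    major: int   # regional information identifier
    minor: int   # facility information identifier

    def advertisement(self) -> dict:
        """Fields as they would appear in the BLE advertisement packet."""
        return {"uuid": self.uuid, "major": self.major, "minor": self.minor}

# Example: a beacon in (hypothetical) region 2, facility 14
beacon = BeaconId(SERVICE_UUID, major=2, minor=14)
print(beacon.advertisement())
```

For a different area type, such as an underground shopping mall, only the semantics of `major` and `minor` would change; the packet layout stays the same, which is what gives the scheme its terminal compatibility.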

### **3.3 IoT-based positioning algorithm in indoor area**

In this chapter, the user's location in the indoor area is determined from the smart braille blocks with built-in BLE sensors that use the data structure presented in Section 3.2. Although the user's location is identified from the received signal strength indicator (RSSI) of a sensor installed on the floor, signals from sensors in adjacent sections can be received simultaneously, so a method is required to determine in which section the user is actually located. In addition, to increase the accuracy of the route guidance information, even after the zone of the sensor nearest the user is determined, it is necessary to monitor how far the user is from that sensor and whether the user deviates from the set route while moving.

To measure the user's direction of movement and distance from the sensor, a hybrid positioning algorithm is applied using pedestrian dead reckoning (PDR), which corrects the position through the various sensors built into the mobile terminal. PDR estimates the relative change from the previous position by detecting the pedestrian's steps, estimating the stride length to determine the distance traveled, and estimating the heading to determine the walking direction, using the measurements of the three sensors of the inertial measurement unit (IMU) built into the smartphone. For positioning error correction, a Kalman filter (KF) was applied to remove the error contained in the RSSI values measured by the mobile terminal, and an algorithm was applied to correct the accumulated error of the smartphone's inertial sensors. During positioning, error correction algorithms are applied according to the situation, such as the position of the terminal, stride length, and speed. The user's current location, movement direction, and movement distance are determined using the map information based on the link information between the nodes of the BLE sensors mapped to the indoor area map together with the hybrid positioning algorithm.

#### **Figure 9.**

*Concept of BLE-based tracking information.*

Route guidance is provided to the user through the app based on the user's current location information determined by this algorithm.
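The hybrid PDR/KF idea above can be illustrated with a minimal one-dimensional sketch: the PDR displacement (detected steps times estimated stride length) drives the prediction step, and an RSSI-derived position along the route drives the correction step. The noise constants, stride length, and measurement values are illustrative assumptions, not the chapter's parameters.

```python
class Kalman1D:
    """Minimal 1-D Kalman filter fusing PDR displacement (prediction)
    with a position measurement derived from BLE RSSI (correction)."""
    def __init__(self, x0=0.0, p0=1.0, q=0.05, r=2.0):
        self.x, self.p, self.q, self.r = x0, p0, q, r  # state, variance, noises

    def predict(self, displacement):
        # PDR step: advance the estimate by the travelled distance
        self.x += displacement
        self.p += self.q              # process noise accumulates per step

    def update(self, z):
        # Correct with the (noisy) RSSI-based position measurement
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)

def pdr_displacement(steps: int, stride_m: float) -> float:
    """Distance travelled = detected steps x estimated stride length."""
    return steps * stride_m

kf = Kalman1D()
# (steps detected, RSSI-derived position in metres) per measurement epoch
for steps, rssi_pos in [(2, 1.6), (2, 3.1), (1, 4.2)]:
    kf.predict(pdr_displacement(steps, stride_m=0.7))
    kf.update(rssi_pos)
print(round(kf.x, 2))  # fused position estimate along the route
```

The variance `p` shrinks as RSSI corrections arrive, which is what bounds the accumulated drift of the inertial sensors that pure PDR would suffer from.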

The overview of the user tracking algorithm based on the BLE signal is shown in **Figure 9**. It illustrates the concept of tracking information and area determination when a user enters area A and then moves to area C via area B. Multiple BLE signals are received simultaneously at the user's current location; by considering the strength of these signals together with the strength of each signal at the previous position, the user's position is estimated and the sensor area the user currently occupies is determined. As shown in **Figure 10**, when the user enters area A and exits to area C through area B, multiple BLE signals may reach the user's terminal, and some of them may fall within the error range of one another. The current user location is estimated by considering the strength of the received BLE signals, the mapped link information between the sensors, and the area information from the previous location.

#### **Figure 10.**

*Calibration concept when ambient signals are measured higher.*


As shown in **Figure 10**, the received sensor information is used to estimate which area the user is currently in, or from which area to which area the user is moving. In the figure, "A → B" means that the user is estimated to be moving from area A to area B but is currently still in area A. **Figure 10** shows the case in which the signal of sensor No. 2 in A and that of sensor No. 3 in B, and likewise the signal of sensor No. 1 in A and that of sensor No. 4 in B, are received within the error range of each other. From these received signals alone it is ambiguous whether the user is in area A or area B; however, since the previous position was in area A, the algorithm determines that the user is still in area A while moving toward area B.
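The tie-breaking rule just described can be sketched as follows. The area names, link table, and error-range threshold are illustrative assumptions; the point is that when two areas' signals are within the error range, the previous area and the mapped link information decide the result.

```python
# Hypothetical RSSI ambiguity threshold (dB) and mapped link information
ERROR_RANGE_DB = 3
AREA_LINKS = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

def determine_area(rssi_by_area: dict, previous_area: str):
    """Return (current_area, transition_label)."""
    ranked = sorted(rssi_by_area.items(), key=lambda kv: kv[1], reverse=True)
    (best, best_rssi), (second, second_rssi) = ranked[0], ranked[1]
    if best_rssi - second_rssi >= ERROR_RANGE_DB:
        return best, best            # unambiguous: strongest area wins
    # Ambiguous: keep the previous area, but report movement toward the
    # linked candidate ("A -> B" = still in A, heading toward B)
    candidate = best if best != previous_area else second
    if candidate in AREA_LINKS[previous_area]:
        return previous_area, f"{previous_area} -> {candidate}"
    return previous_area, previous_area

# Signals from areas A and B received within the error range, previous area A
print(determine_area({"A": -62, "B": -63}, previous_area="A"))
```

With a clear signal difference the strongest area is returned directly; only the ambiguous case falls back on the previous position and the link map.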

As described above, the user's location is first estimated as a zone, and then the specific sensor at which the user is located is estimated by the method shown in **Figure 11**. The algorithm checks whether the sensor signal received by the terminal comes from a valid sensor; if so, it is designated as the first-priority signal for the user's current location based on the received RSSI value through the location correction algorithm described above. The sensor selected after ranking correction is then compared with the previous tracking information to check whether there is a change, and finally the user's tracking information is updated.

**Figure 11** shows an example in which the BLE signal of area C is measured strongly in area A. In this case, since the strongest signal combination is BLE No. 1 and No. 2 of area A, the surrounding BLE information is searched; through this, the nearest BLE after No. 1 and No. 2 is determined to be No. 3, and the signal of No. 5 is ignored. **Figure 12** shows the flowchart for checking the user's tracking information based on the algorithm described so far: the validity of the received BLE signal is checked, the first-priority signal is selected from the corrected RSSI values, the result is compared with the previous tracking information, and the tracking information is updated. That is, the BLE signals for user tracking are processed according to the following order.
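A minimal sketch of this processing order, under assumed sensor IDs and node links (the real system's tables and correction algorithm are not reproduced here): validate the signal, rank by corrected RSSI, discard a strong but non-adjacent sensor such as No. 5 in the example above, and only then update the tracking information.

```python
# Hypothetical registered sensors and node-link (adjacency) information
VALID_SENSORS = {1, 2, 3, 4, 5}
NEIGHBORS = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}

def update_tracking(readings, previous_sensor):
    """readings: list of (sensor_id, corrected_rssi).
    Returns the sensor chosen as the user's current location."""
    # Step 1: keep only signals from valid (registered) sensors
    valid = [(s, r) for s, r in readings if s in VALID_SENSORS]
    if not valid:
        return previous_sensor
    # Step 2: rank by corrected RSSI; strongest is the first-priority candidate
    valid.sort(key=lambda sr: sr[1], reverse=True)
    best = valid[0][0]
    # Step 3: ignore a strong but non-adjacent sensor (e.g., No. 5 measured
    # strongly while the user is near No. 1 and No. 2)
    if best != previous_sensor and best not in NEIGHBORS[previous_sensor]:
        adjacent = [s for s, _ in valid
                    if s == previous_sensor or s in NEIGHBORS[previous_sensor]]
        best = adjacent[0] if adjacent else previous_sensor
    # Step 4: the caller compares this with the previous tracking info
    return best

# Sensor 5 reports the strongest signal, but the user was last at sensor 2:
print(update_tracking([(5, -58), (1, -60), (2, -61), (3, -65)], previous_sensor=2))
```

The adjacency check is what prevents a single anomalously strong reading from teleporting the tracked position across the map.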

**Figure 11.**

*Node link of sensors when the desired route is straight line.*

**Figure 12.**

*Flowchart of user tracking information checking.*

For route guidance in an indoor area, the route is divided into guidance units by the smart braille blocks with built-in IoT sensors, and separate map node information is stored on the server for each unit. When the user arrives at a location where route guidance is available, the map node information of the corresponding unit is downloaded from the server to the mobile terminal. When the user's mobile terminal detects the sensor of a smart braille block, the current location is announced to the user by voice, together with basic information and brief instructions for using the app. A list of facilities reachable from the current location is then provided; when the user selects the facility corresponding to the desired destination, a route to that facility is set up, and route guidance information is provided by image and voice as the user moves.
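The per-unit download and guidance flow can be sketched as below. All names, the node map contents, and the route construction are hypothetical placeholders; in the actual service the map would be fetched from the server over the network and the announcement spoken through the app's text-to-speech.

```python
# Stand-in for the per-unit map node information stored on the server
NODE_MAPS = {
    "unit-1": {"nodes": ["entrance", "ticket-gate", "platform"],
               "links": {("entrance", "ticket-gate"): 12.0,   # metres
                         ("ticket-gate", "platform"): 30.0}},
}
FACILITIES = {"unit-1": ["ticket-gate", "platform"]}  # selectable destinations

def download_node_map(unit_id):
    """In the real service this would be an HTTP request to the server."""
    return NODE_MAPS[unit_id]

def start_guidance(unit_id, destination):
    node_map = download_node_map(unit_id)
    assert destination in FACILITIES[unit_id], "destination not reachable"
    # Toy route: follow the node list up to the destination
    route = node_map["nodes"][: node_map["nodes"].index(destination) + 1]
    return f"Route set: {' -> '.join(route)}"  # spoken via TTS in the app

print(start_guidance("unit-1", "platform"))
```

Keeping the map data per guidance unit means the terminal only downloads the small fragment it needs when a smart braille block is detected, rather than the whole indoor map.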
