#### 4.6. First round of image filtering

This process removes every object that is connected to the image border, since such noise could interfere with tactile detection later in the pipeline. Moreover, pixels connected to the border are unlikely to contain any of the required information, so removing them acts as a first filtering step. Figure 9 shows the image with a cleared border.

Figure 6. Image conversion process. (a) Grayscale image and (b) binary image.

Figure 7. Connected components algorithm.

Figure 8. Image with filled holes inside connected components.
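The border-clearing step above (MATLAB's Image Processing Toolbox provides `imclearborder` for this) can be sketched as a flood fill seeded from foreground pixels on the border. This is a pure-Python illustration, not the chapter's MATLAB code; the function name `clear_border` and the 0/1 row-list image representation are our assumptions:

```python
from collections import deque

def clear_border(image):
    """Remove objects touching the image border from a binary image
    (a list of rows of 0/1 values), analogous to MATLAB's
    imclearborder. Returns a new image; the input is not modified."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    queue = deque()
    # Seed the flood fill with every foreground pixel on the border.
    for r in range(rows):
        for c in range(cols):
            if (r in (0, rows - 1) or c in (0, cols - 1)) and out[r][c]:
                out[r][c] = 0
                queue.append((r, c))
    # Erase everything 4-connected to a border pixel.
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and out[nr][nc]:
                out[nr][nc] = 0
                queue.append((nr, nc))
    return out
```

Components that touch the border are deleted in full, while fully interior components are left untouched, which is exactly the behavior the first filtering round relies on.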

#### 4.7. Second round of image filtering

As shown in Figure 9(a), considerable noise remains in the image even after the first round of filtering. Therefore, a second method is applied to remove it: a threshold is set on component size, and any connected component whose area falls below the threshold is removed. Figure 9(b) shows the image after all components with an area of less than 1000 pixels² have been removed.
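This size-based filtering (MATLAB's `bwareaopen` performs the same operation) can be sketched as a connected-components pass that erases blobs below the area threshold. Again a hedged pure-Python illustration under the same assumed 0/1 row-list format, not the authors' implementation:

```python
from collections import deque

def remove_small_components(image, min_area):
    """Second filtering round: delete 4-connected components whose
    pixel count is below min_area (the chapter uses 1000 pixels),
    analogous to MATLAB's bwareaopen."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    seen = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if out[r][c] and not seen[r][c]:
                # Collect one whole component with BFS.
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    cr, cc = queue.popleft()
                    comp.append((cr, cc))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and out[nr][nc] and not seen[nr][nc]):
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                # Erase the component if it is below the threshold.
                if len(comp) < min_area:
                    for cr, cc in comp:
                        out[cr][cc] = 0
    return out
```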

Vision-Based Tactile Paving Detection Method in Navigation Systems for Visually Impaired Persons 39 http://dx.doi.org/10.5772/intechopen.79886

Figure 9. Filtering process. (a) First round filtered image and (b) second round filtered image.

#### 4.8. Extract required parameters from connected components

In this phase, after the image has been filtered by preprocessing, some parameters are extracted from the remaining connected components. The parameters required by the tactile detection algorithm are the area and perimeter of each region in the filtered image. A built-in MATLAB function is used to determine these parameters. Figure 10 shows the recognized regions, and the parameters found for each connected component are listed in Table 1(a)–(f).
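The measurements named in Figure 10 (area, perimeter, centroid) match what MATLAB's `regionprops` reports, which is presumably the built-in function meant here. As a rough pure-Python sketch of these measurements for a single binary mask: area is the pixel count, the centroid is the mean pixel coordinate, and the perimeter is approximated by counting exposed pixel edges — note this is cruder than `regionprops`' boundary-based perimeter estimate, and the function name `area_and_perimeter` is ours:

```python
def area_and_perimeter(mask):
    """Measure one region in a binary mask (list of 0/1 rows).
    Returns (area, perimeter, centroid). Perimeter is approximated
    as the number of pixel edges exposed to background or the
    image boundary."""
    rows, cols = len(mask), len(mask[0])
    area = 0
    perimeter = 0
    cy = cx = 0.0  # centroid accumulators (row, column)
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                area += 1
                cy += r
                cx += c
                # Count each side of the pixel not shared with
                # another foreground pixel.
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    inside = 0 <= nr < rows and 0 <= nc < cols
                    if not inside or not mask[nr][nc]:
                        perimeter += 1
    centroid = (cy / area, cx / area) if area else None
    return area, perimeter, centroid
```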

#### 4.9. Determining the metric

This phase is the main part that decides whether a connected component in the image potentially corresponds to tactile paving. The metric m is determined using Eq. (1).

$$m = \frac{4\pi A}{p^2} \tag{1}$$

where m indicates the metric, A indicates the area, and p indicates the perimeter. It is therefore important to obtain the required parameters, the area and perimeter of the connected components, before this phase. MATLAB calculates the metric from the parameters obtained earlier using Eq. (1). Table 2 shows the resulting metrics of the connected components.
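Eq. (1) is the standard roundness (circularity) metric: a perfect circle gives m = 1, while long thin bars give much smaller values. A minimal sketch of the calculation (the function name `roundness_metric` is ours, not from the chapter):

```python
import math

def roundness_metric(area, perimeter):
    """Eq. (1): m = 4*pi*A / p**2, using the area and perimeter
    measured from a connected component."""
    return 4 * math.pi * area / perimeter ** 2

# A circle of radius r: A = pi*r**2 and p = 2*pi*r, so m = 1 exactly.
# A 10:1 bar of width w: A = 10*w**2 and p = 22*w, so m is about 0.26,
# which falls in the 0.15-0.30 range used below for directional tactile.
```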

Figure 10. Parameters (area, perimeter, centroid) of the connected pixels.


Table 1. Metric table.


Table 2. Calculation results of metric for each area.

#### 4.10. Producing auditory output based on metric

After the metric for each connected component has been calculated, the "shape" of each one is determined. The results show that connected components with metric values in the range of 0.85–1.0 are most likely circles, representing the warning tactile. In contrast, connected components with metric values in the range of 0.15–0.30 are most likely bars, representing the directional tactile. Figure 11(a) and (b) shows the metric results for warning and directional tactile images. After the metric values have been calculated, the system sends a signal to the auditory output system to notify the visually impaired person of what has been detected.

Figure 12 shows the overall process flow of the system hardware for the auditory output. When a warning tactile is detected in MATLAB (metric value in the range 0.85–1.0), MATLAB sends a signal through the Arduino microcontroller to the voice module, which produces the auditory output WARNING; when a directional tactile is detected (metric value in the range 0.15–0.30), the auditory output is DIRECTION. Before any signal can be sent from MATLAB to the voice module via the Arduino microcontroller, serial communication between MATLAB and the Arduino must first be established. Only after this communication has been established can the Arduino microcontroller receive signals from MATLAB when a command is given.


Figure 11. Final image results. (a) Warning tactile and (b) direction tactile.

Figure 12. System hardware control flowchart.

After the metric has been determined, a corresponding signal is sent to the Arduino microcontroller, to which the control code has been uploaded beforehand through the Arduino I/O interface. There are three cases: metric values in the range 0.15–0.30, metric values in the range 0.85–1.0, and metric values outside both ranges. For example, when a metric value in the range 0.15–0.30 is found, MATLAB sends a signal indicating DIRECTION, which the Arduino microcontroller receives before executing the next command. The other two cases work the same way: the 'WARNING' voice signal is sent for metric values in the range 0.85–1.0, while the 'ERROR' voice signal is sent if neither of the two metric ranges is matched.
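The three-case decision described above can be sketched as a small classifier. This is a hedged pure-Python illustration (the chapter's actual implementation runs in MATLAB with an Arduino over a serial link, and the function name `tactile_signal` is ours):

```python
def tactile_signal(metric):
    """Map a roundness metric to the voice-module command described
    in the text: WARNING for 0.85-1.0 (circular warning tactile),
    DIRECTION for 0.15-0.30 (bar-shaped directional tactile), and
    ERROR for any value outside both ranges."""
    if 0.85 <= metric <= 1.0:
        return "WARNING"
    if 0.15 <= metric <= 0.30:
        return "DIRECTION"
    return "ERROR"
```

In the real system, the returned label would select which signal MATLAB writes to the serial port for the Arduino to forward to the voice module.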
