3. Equidistant projection function for fisheye lenses

Figure 2. The principles of various lenses: (a) shows different lens projections; p, p1, p2, p3 and p4 are, respectively, the perspective, stereographic, equidistant, equisolid angle and orthogonal projections, and the corresponding distances between the image points and the principal point are denoted r, r1, r2, r3 and r4; (b) shows the difference between a pinhole lens and a fisheye lens. For a fisheye lens, the projection of the perspective image on the hemisphere surface into the image plane is the actual image.

Figure 3. Radial distortion in the 2D imaging plane: O represents the image centre, Pu, Pd ∈ R<sup>2</sup> are pixel co-ordinate vectors in the input undistorted and output distorted images, respectively, ru, rd ∈ R are the distances of Pu and Pd from the centre, and θ = θu = θd is the angle of OPu or, equivalently, OPd.

34 Smartphones from an Applied Research Perspective

In order to model the perfect fisheye lens, scene projections are necessary. These can be defined by two main characteristics. Firstly, the field of view covers 2π steradians, creating a circular image whose distortions are symmetrical with respect to the image centre. Secondly, a fisheye lens has an infinite depth of field, so all objects in the image are in precise focus. Two postulates, namely azimuth angle invariability and the equidistant projection rule, therefore govern the formation of non-linear image distortion. These presuppositions describe how object points are projected onto the sensor, and they directly shape the dewarping algorithm that is eventually developed [78].

The first postulate, azimuth angle invariability, governs the projection of points in the content plane (the plane passing through the optical axis, which is perpendicular to the sensor plane). The azimuth angle of object points and of their projections onto the sensor remains unchanged, regardless of differences in object distance or elevation within the content plane [78]. According to Ref. [79], the equidistant lens is 'preferable for measurement of incidence angles (θ) and azimuth angles. The effect of error of lens position is small, and the linear relation of radial distance (rd) and incidence angle (θ) of a ray from the three-dimensional point is convenient to analyse'.

The second postulate, the equidistant projection rule, describes the relationship between the radial distance (rd) of an image point on the sensor plane and the zenith (incidence) angle (θ) formed by the vector from the image centre to the world object point, shown in Figure 4. According to this rule, there is a linear relationship between the radial distance rd of the image point from the centre and the zenith angle θ [78].

As the zenith angle varies from 0 to 90°, the radial distance of the corresponding image point varies linearly from 0 to a maximum value R, determined by the radius of the modelled hemisphere [78].

In the equidistant projection, the radial distance (rd) on the image plane is directly proportional to the incident ray's angle. It equals the length of the arc segment between the z-axis and the projection ray of point P on the sphere in Figure 5 [62].

Thus, the equidistant projection function is given in Eq. (8).

$$r_d = f \cdot \theta \tag{8}$$

where rd is the fisheye radial distance of a projected point from the centre, f is the focal length and θ represents the incidence angle of a ray from the projected three-dimensional

Figure 4. Equidistant projection (θ/90° = dc/R) [43].

Figure 5. Equidistant fisheye projection function representation.

point into the image plane. In fisheye cameras, the projection is performed with the help of this common mapping function; the other mapping functions are stereographic, equisolid and orthogonal [80]. Eq. (9) is derived by substituting the arctangent function for θ in Eq. (8), where ru is the height of the projection on the image plane (the subscript u denoting the undistorted projection) [73].

$$r_d = f \cdot \arctan\left(\frac{r_u}{f}\right) \tag{9}$$

In the equidistant projection model, the distorted radial distance on the image plane is linear in the projected ray's angle in radians. Moreover, the length of the arc segment between the z-axis and xp is equal to the projected distorted distance rd, where xp is the intersection point of the projection ray of point X with the projection sphere [73].
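The mapping functions named in the text can be sketched as simple radius maps from incidence angle to image radius. This is a minimal illustration, not a calibrated model; the function names and the unit focal length are assumptions for the example.

```python
import math

# Hedged sketch: the equidistant mapping of Eq. (8) alongside the other
# common fisheye mappings (stereographic, equisolid, orthographic) and the
# perspective (pinhole) projection. Each maps an incidence angle theta
# (radians) and focal length f to a radial image distance r.
def perspective(theta, f):
    return f * math.tan(theta)

def equidistant(theta, f):
    return f * theta                      # Eq. (8): r_d = f * theta

def stereographic(theta, f):
    return 2.0 * f * math.tan(theta / 2.0)

def equisolid(theta, f):
    return 2.0 * f * math.sin(theta / 2.0)

def orthographic(theta, f):
    return f * math.sin(theta)

# At theta = 90 deg the perspective projection diverges, while every
# fisheye mapping still returns a finite radius.
f = 1.0
theta = math.radians(90.0)
print(equidistant(theta, f))    # pi/2 ~ 1.5708
print(orthographic(theta, f))   # 1.0
```

The finite value at 90° is precisely why a fisheye mapping can cover a full hemisphere in a bounded circular image, as described at the start of this section.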

Most real optical systems exhibit undesirable effects that render the pinhole camera model inaccurate. The most evident of these is radial barrel distortion, particularly noticeable in fisheye camera systems, where its level is relatively extreme [62]. For most applications, the effect of radial distortion is negligible in normal and narrow field-of-view (FOV) cameras. In wide-angle and fisheye cameras, however, radial distortion causes problems both visually and in computer vision applications such as object detection, recognition and classification [73]. Because of radial lens distortion, points on the image plane are displaced non-linearly from the ideal positions they would occupy under the rectilinear pinhole camera model. The displacement occurs along a radial axis from the distortion centre on the equidistant image plane. Owing to this displacement, fisheye optics give the foveal area of the image a higher resolution, while the resolution in the peripheral areas decreases non-linearly [81].
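The non-linear loss of peripheral resolution follows directly from Eq. (9); a short numeric sketch (the focal length value is an illustrative assumption):

```python
import math

# Hedged numeric sketch: under the equidistant model of Eq. (9), equal steps
# in undistorted (pinhole) radius r_u map to ever-smaller steps in distorted
# radius r_d, i.e. resolution falls off non-linearly towards the periphery.
f = 400.0  # focal length in pixels, an illustrative assumption

def distort_radius(r_u):
    return f * math.atan(r_u / f)  # Eq. (9)

# How many distorted pixels a 100-pixel undistorted step occupies,
# measured at increasing distances from the image centre.
steps = [distort_radius(r + 100.0) - distort_radius(r)
         for r in (0.0, 400.0, 800.0, 1600.0)]
print(steps)  # strictly decreasing: central detail keeps more pixels
```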


The additional parameters that compensate for deviations of the geometric fisheye model from physical reality are the same parameters commonly applied to central perspective lenses [69]. Accordingly, the equidistant projection function with additional parameters is given in Eq. (10).

$$r_d = f \cdot \arctan\left(\frac{r_u}{f}\right) + \text{additional parameters} \tag{10}$$

Due to the particularly high levels of distortion present in fisheye cameras, several alternative models have been developed [81], among them the fisheye transform, the field-of-view model, the division model and the polynomial model [82]. The work in Ref. [67] investigates adding the Brown parameters to the basic geometric fisheye model to compensate for the remaining 'systematic effects' [82].
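One of the alternatives named above, the division model, is simple enough to sketch in a few lines. This is a hedged illustration of the single-parameter form of that model, with an assumed distortion parameter; it is not taken from the cited references.

```python
# Hedged sketch of the single-parameter division model: it maps a distorted
# radial distance r_d back to an undistorted radius r_u with one division,
# which makes it attractive for the strong distortion of fisheye lenses.
def undistort_division(r_d, lam):
    # lam < 0 models barrel distortion; lam = 0 reduces to the identity.
    return r_d / (1.0 + lam * r_d * r_d)

# With lam = 0 the radius is unchanged; with a negative lam the
# undistorted radius grows, pushing peripheral points back outwards.
print(undistort_division(0.5, 0.0))
print(undistort_division(0.5, -0.2))
```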

Three co-ordinate systems are used to define the projection of an object point into a hemispherical fisheye image. These are the superordinate Cartesian object co-ordinate system (X, Y, Z) and the camera co-ordinate system (x, y, z) in Figure 6. The image co-ordinate system (x′, y′) is defined as is usual in photogrammetric applications, so the image centre becomes the origin. The x′ and y′ axes are parallel to the x and y axes of the

Figure 6. Geometrical model of a fisheye camera.

camera co-ordinate system [67]. The geometric concept is based on the dependence of the image radius r′ on the angle of incidence θ [61].

Object co-ordinates are transformed into the camera co-ordinate system by Eq. (11), where X is the co-ordinate vector in the object co-ordinate system, x is the co-ordinate vector in the camera co-ordinate system, R is the rotation matrix and X0 is the translation between the object and camera co-ordinate systems:

$$\mathbf{x} = \mathbf{R}^{-1}(\mathbf{X} - \mathbf{X}_0) \tag{11}$$

The incidence angle θ in the camera co-ordinate system is defined as follows:

$$\tan \theta = \frac{\sqrt{x^2 + y^2}}{z} \tag{12}$$
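Eq. (12) can be evaluated robustly with a two-argument arctangent; a minimal sketch (the function name is a hypothetical label for this illustration):

```python
import math

# Hedged sketch of Eq. (12): the incidence angle theta of a ray through a
# point (x, y, z) in the camera co-ordinate system. atan2 is used instead
# of inverting the tangent directly, so points with z = 0 (theta = 90 deg)
# are handled without a division by zero.
def incidence_angle(x, y, z):
    return math.atan2(math.sqrt(x * x + y * y), z)

print(math.degrees(incidence_angle(0.0, 0.0, 1.0)))  # on the optical axis: 0
print(math.degrees(incidence_angle(1.0, 0.0, 1.0)))  # 45
print(math.degrees(incidence_angle(1.0, 0.0, 0.0)))  # horizon ray: 90
```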

Instead of a function for the image radius r′, functions for the image co-ordinates x′ and y′ are required. For this purpose, Eq. (13) is applied:

$$r' = \sqrt{x'^2 + y'^2} \tag{13}$$

After transformation of the equations described above, the final fisheye projection equations for the image co-ordinates are derived. The model equations are finally extended by the co-ordinates of the principal point x′0 and y′0 in Eq. (14) and by the correction terms Δx′ and Δy′ [Eqs. (15) and (16)], which contain additional parameters to compensate for systematic effects.

Equidistant projection:

$$x' = c \cdot \frac{\arctan\frac{\sqrt{x^2 + y^2}}{z}}{\sqrt{\left(\frac{y}{x}\right)^2 + 1}} + x'_0 + \Delta x' \qquad y' = c \cdot \frac{\arctan\frac{\sqrt{x^2 + y^2}}{z}}{\sqrt{\left(\frac{x}{y}\right)^2 + 1}} + y'_0 + \Delta y' \tag{14}$$

$$\Delta x' = x'\left(A_1 r'^2 + A_2 r'^4 + A_3 r'^6\right) + B_1\left(r'^2 + 2x'^2\right) + 2B_2 x' y' + C_1 x' + C_2 y' \tag{15}$$

$$\Delta y' = y'\left(A_1 r'^2 + A_2 r'^4 + A_3 r'^6\right) + 2B_1 x' y' + B_2\left(r'^2 + 2y'^2\right) \tag{16}$$

where:


A1, A2, and A3 are radial distortion parameters,

B1 and B2 are decentric distortion parameters,

C1 and C2 are the horizontal scale factor and the shear factor, respectively, and

c is the camera constant, which equals the focal length.
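The pipeline of Eqs. (11)-(16) can be sketched end to end. This is a minimal illustration under stated assumptions: the function name and default values are hypothetical, all distortion parameters default to zero, and for simplicity the sketch assumes non-zero x and y (the ratio terms in Eq. (14) are undefined on the co-ordinate axes).

```python
import numpy as np

# Hedged end-to-end sketch: project one object point into an equidistant
# fisheye image following Eqs. (11)-(16).
def project_fisheye(X, R, X0, c, x0, y0,
                    A=(0.0, 0.0, 0.0), B=(0.0, 0.0), C=(0.0, 0.0)):
    # Eq. (11): object -> camera co-ordinates.
    x, y, z = np.linalg.inv(R) @ (X - X0)
    # Eq. (12) inside Eq. (14): incidence angle of the ray.
    theta = np.arctan2(np.hypot(x, y), z)
    # Eq. (14) without correction terms (assumes x != 0 and y != 0;
    # the sign handling of the ratios is omitted in this sketch).
    xp = c * theta / np.sqrt((y / x) ** 2 + 1.0)
    yp = c * theta / np.sqrt((x / y) ** 2 + 1.0)
    # Eq. (13): squared radial distance r'^2 in the image plane.
    r2 = xp * xp + yp * yp
    radial = A[0] * r2 + A[1] * r2 ** 2 + A[2] * r2 ** 3
    # Eqs. (15) and (16): radial, decentring and affine corrections,
    # evaluated here from the uncorrected projection for simplicity.
    dx = xp * radial + B[0] * (r2 + 2 * xp * xp) + 2 * B[1] * xp * yp \
        + C[0] * xp + C[1] * yp
    dy = yp * radial + 2 * B[0] * xp * yp + B[1] * (r2 + 2 * yp * yp)
    return xp + x0 + dx, yp + y0 + dy

# Identity pose, unit camera constant, zero distortion: the point (1, 1, 1)
# lands on the image diagonal.
xp, yp = project_fisheye(np.array([1.0, 1.0, 1.0]), np.eye(3),
                         np.zeros(3), c=1.0, x0=0.0, y0=0.0)
print(xp, yp)
```

With all A, B and C parameters set to zero, the result reduces to the pure equidistant projection of Eq. (14), which is a convenient sanity check before fitting calibrated values.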
