**4.1 Environment awareness using a spherical camera**

Rather than using several cameras with a limited Field of View (FoV) to observe the surrounding space, a 360° FoV camera [25] is used. The spherical camera records images in a "spherical format," which consists of two wide-angle frames stitched together to form a virtual sphere [25]. The image can be rectified to the classic distortion-free rectilinear format of a pinhole camera [26]. However, due to the nature of the "spherical format," it is preferable to split the image into smaller segments and rectify each one separately to achieve results closer to the pinhole camera model. Instead of splitting the image into equally sized square segments [27], each image is split into tiles based on orientation-independent circles. Since every tile has a different a priori known calibration, the rectification can be carried out for each one independently and without high computational cost. By applying this solution and rectifying the image in **Figure 7** for a selection of *N* = 12 tiles, the resulting rectified partitions are visualized in **Figure 8**.
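The per-tile rectification can be sketched as a gnomonic (pinhole) reprojection of the spherical frame: for each tile, a sampling map from tile pixels to spherical-image pixels is precomputed once (this plays the role of the a priori known per-tile calibration) and then reused for every frame. The sketch below, a minimal numpy illustration assuming an equirectangular "spherical flat image" layout, is not the exact implementation of the cited work; the function names and the nearest-neighbour lookup are illustrative choices.

```python
import numpy as np

def tile_rectify_map(yaw, pitch, fov_deg, size, img_w, img_h):
    """Precompute the spherical-image sampling map for one pinhole tile.

    The tile looks along (yaw, pitch) with a square FoV of fov_deg degrees.
    The map is computed once per tile (the a priori calibration) and reused
    for every incoming frame. An equirectangular layout is assumed.
    """
    f = (size / 2) / np.tan(np.radians(fov_deg) / 2)  # pinhole focal length
    u, v = np.meshgrid(np.arange(size) - size / 2 + 0.5,
                       np.arange(size) - size / 2 + 0.5)
    # Ray directions in the tile's camera frame (z forward, y up).
    d = np.stack([u, -v, np.full_like(u, f)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    # Rotate the rays to the spherical-camera frame: pitch about x, yaw about y.
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = d @ (Ry @ Rx).T
    # Convert rays to longitude/latitude and then to equirectangular pixels.
    lon = np.arctan2(d[..., 0], d[..., 2])       # [-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))   # [-pi/2, pi/2]
    px = (lon / np.pi + 1) / 2 * (img_w - 1)
    py = (1 - (lat / (np.pi / 2) + 1) / 2) * (img_h - 1)
    return px, py

def rectify_tile(img, px, py):
    """Apply a precomputed map (nearest-neighbour; bilinear in practice)."""
    xi = np.clip(np.round(px).astype(int), 0, img.shape[1] - 1)
    yi = np.clip(np.round(py).astype(int), 0, img.shape[0] - 1)
    return img[yi, xi]
```

For *N* = 12 tiles, twelve such maps with suitably distributed (yaw, pitch) centers would be precomputed, and each frame rectified by twelve cheap lookups.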

For the case of collaborating drones, it is assumed that each one carries passive markers for visual recognition. Subsequently, the rectified images are processed for identification of these markers [28–31], thus estimating the neighboring drone's pose. For improved pose extraction, a multi-marker rhombicuboctahedron arrangement [32] is assumed to be mounted on each target.
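One way such a multi-marker arrangement improves pose extraction is that every detected face yields an independent estimate of the target body's pose, and the estimates can be fused. The sketch below is a minimal numpy illustration of this idea, not the method of [32]: it assumes each detection provides a marker pose in the camera frame, that the marker poses on the body are known from the rhombicuboctahedron geometry, and it fuses the rotations with a chordal (SVD-projected) mean; all function names are hypothetical.

```python
import numpy as np

def body_pose_from_markers(detections, marker_in_body):
    """Fuse per-marker pose detections into one body-pose estimate.

    detections:     list of (R_cm, t_cm), marker poses in the camera frame
    marker_in_body: list of (R_bm, t_bm), known marker poses on the target body
    Returns (R_cb, t_cb): the target body's pose in the camera frame.
    """
    Rs, ts = [], []
    for (R_cm, t_cm), (R_bm, t_bm) in zip(detections, marker_in_body):
        # Invert the known body-to-marker transform: T_cb = T_cm * T_bm^-1.
        R_cb = R_cm @ R_bm.T
        t_cb = t_cm - R_cb @ t_bm
        Rs.append(R_cb)
        ts.append(t_cb)
    t = np.mean(ts, axis=0)
    # Chordal mean of rotations: average the matrices, project back onto SO(3).
    U, _, Vt = np.linalg.svd(np.mean(Rs, axis=0))
    R = U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt
    return R, t
```

With markers on many faces, at least a few remain visible from almost any viewing direction, which is the practical benefit of the rhombicuboctahedron shape.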

The experimental setup for evaluation consists of the spherical camera mounted on a 2.7 m protruding rod, which in turn is attached to the underside of the octarotor using the generic mount base discussed in Section 5. A rhombicuboctahedron arrangement with markers on its faces is attached to a DJI Mavic drone. Both UAVs were located within the MoCaS test volume, as shown in **Figure 9**. The quadrotor drone was flown along a randomized trajectory in the vicinity of the octarotor.

In **Figure 10** the relative 3D flight path between the drones is presented. Both the MoCaS measurements and the visual ones are shown; for the cases where the markers were detected, the relative accuracy of these measurements was 2.2 cm.
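An accuracy figure of this kind is typically reported as the RMS of the per-sample 3D position error between the two time-synchronized tracks. The snippet below is a minimal sketch of that computation, assuming the MoCaS and visual tracks have already been synchronized and expressed in a common frame; it is an illustration, not the paper's evaluation code.

```python
import numpy as np

def relative_accuracy(p_ref, p_est):
    """RMS 3D position error between two synchronized (N, 3) position tracks.

    p_ref: reference positions (e.g. MoCaS ground truth), shape (N, 3)
    p_est: estimated positions (e.g. visual marker-based), shape (N, 3)
    """
    err = np.linalg.norm(np.asarray(p_ref) - np.asarray(p_est), axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```

Samples where no marker was detected would be excluded before the comparison, since the visual pipeline produces no estimate for them.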

**Figure 7.** *Spherical flat image.*

**Figure 8.** *Rectified N* = 12 *partitions for a single "spherical" frame.*

**Figure 9.** *Experimental setup for 360° camera relative localization.*
