**2. In vivo insect experiments and data acquisition**

In this section, a unified methodology is introduced for the surface reconstruction of insect wings during free-flying motion. The proposed method eliminates all rigid-wing assumptions while minimizing the total number of tracking points in the high-speed images output by the photogrammetry system. The objective is to obtain a reconstructed insect model that captures the details of the real insect as closely as possible, which has implications for its flight aerodynamics.

#### **2.1. High-speed videography**

Image sequences of free-flying insects are collected using three synchronized Photron Fastcam SA3 60K high-speed cameras capable of up to 1000 frames per second at a resolution of 1024 × 1024 pixels with a shutter speed of 2 μs. The three cameras are fixed on an aluminum framework, as shown in **Figure 4a**. This setup allows us to reconstruct the insects' motion in a virtual space (**Figure 4b**). The framework and foundation ensure that the cameras are aligned orthogonally to one another on an optical breadboard. Slotted channels in the framework allow us to adjust the distance between the cameras and the insects. To provide excellent temporal and spatial resolution, the cameras are positioned 1.5 m away from the dragonfly based on the body size and flapping frequency of the specimen. The optical breadboard not only provides a sturdy anchor for mounting the hardware but also minimizes vibrations within the system. For the lighting system, two halogen photo optic lamps (OSRAM 54428) are chosen for our experiment.

**Figure 4.** High-speed camera system setup in the laboratory (a) and in a virtual space (b).

After the camera system is installed, we use a daisy-chain method to loop the cameras to one another and trigger them with an external transistor-transistor logic (TTL) signal. This triggering method is an efficient way to minimize camera delay and maximize the response time, allowing images of a free-flying insect to be captured synchronously from three directions. The camera system can be configured from any standard personal computer, and the recorded sequences from the three cameras are stored locally on each camera's internal memory. All the data captured by the cameras are then downloaded to a desktop computer for subsequent analysis. The video sequences are named according to the camera number, and the usable segments are identified for reconstruction based on image quality.

The camera system was validated by evaluating the projection errors of the filming geometries. The validation results show that the error is less than 0.5 degrees over the entire filming data set, and thus the perspective errors can be ignored. Even though this system minimizes the human error associated with triggering the cameras, some difficulties remain during data collection. First, it is impossible to completely eliminate the small frame-to-frame variation in camera delay. Moreover, wing-surface reflections caused by the lighting system sometimes make it difficult to identify the marker points clearly in the images. Since the flight paths of insects are unpredictable, capturing true voluntary flight motion within the cameras' focus range is the greatest challenge.

#### **2.2. 3D surface reconstruction**

The small wing size, fast flapping motion, and unpredictability of insect movement during free flight complicate the tracking of the details of wing kinematics and deformation.

Flight Physics - Models, Techniques and Technologies


To reconstruct the wing kinematics and deformation, each insect wing is marked with a fine-tipped permanent marker before the videos are shot (e.g., a dragonfly as shown in **Figure 5a**). Since the added weight of the ink on the wing surface is small, we assume it is negligible and does not affect flight performance. For an arbitrary point on a dragonfly's wing in each frame, we use the perspective projection method to determine its location in multiple projection planes. The photogrammetry system is used to capture the insect in flight.
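The perspective projection step can be illustrated with a minimal linear triangulation (direct linear transform) sketch. The two projection matrices below are hypothetical stand-ins for the calibrated matrices of the actual cameras, and the point coordinates are invented for illustration:

```python
import numpy as np

def triangulate(proj_mats, pixels):
    """Linear (DLT) triangulation: recover a 3D point from its 2D
    projections in several calibrated cameras.
    proj_mats: list of 3x4 projection matrices; pixels: list of (u, v)."""
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the homogeneous
        # point X: u*(P[2]·X) - P[0]·X = 0 and v*(P[2]·X) - P[1]·X = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.array(rows)
    # The solution is the right singular vector of A with the
    # smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # de-homogenize

def project(P, X):
    """Project a 3D point with projection matrix P (pinhole model)."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical cameras with identity intrinsics:
# camera 1 at the origin looking down +z
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
# camera 2 at (5, 0, 0) looking back toward the origin
R = np.array([[0., 0., 1.], [0., 1., 0.], [-1., 0., 0.]])
C = np.array([5., 0., 0.])
P2 = np.hstack([R, (-R @ C).reshape(3, 1)])

point = np.array([1.0, 2.0, 4.0])                 # invented marker position
est = triangulate([P1, P2], [project(P1, point), project(P2, point)])
```

With noise-free projections the linear system has an exact null space and the marker position is recovered to machine precision; with real image measurements the smallest-singular-vector solution is a least-squares estimate.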

The initial 3D wing template models are generated with Catmull-Clark subdivision surfaces using the computer graphics software Autodesk Maya (as shown in **Figure 5b**). Based on the high-speed films, we align the first-level vertices of the subdivision surface hierarchy with the corresponding marker points on the insects' wings (e.g., the dragonfly's forewings and hindwings). After the initial template surfaces of the wings are generated, they are recorded as a keyframe animation. By repeatedly adjusting the anchor point-based alignment along each axis for each time step, the first-level vertices of the wings are completed. Although the whole process of wing reconstruction is labor-intensive, it is currently the only effective way to reconstruct a deformable, quad-winged insect in free flight. **Figure 6** presents the front and side views of the reconstructed wings overlapping the corresponding high-speed images. Thus, approximations of the 3D wing shapes, such as spanwise bend, chordwise bend, and twist, can be captured with a smooth subdivision surface representation. Compared with tethered insects, free-flying insects present many challenges to the surface reconstruction work due to their nonlinear translational and rotational motion, especially during turning maneuvers. **Figure 7** visualizes the reconstructed motion of the dragonfly during a free-flight maneuver at selected instants.
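The per-time-step alignment of template vertices to tracked markers can be sketched as a least-squares fit. The snippet below uses a rigid Kabsch alignment as an illustrative first stage (per-vertex adjustment for wing deformation would follow it); it is not the authors' exact procedure, and the template vertices and marker positions are invented:

```python
import numpy as np

def kabsch_align(template, markers):
    """Best-fit rotation R and translation t mapping template points onto
    marker points in the least-squares sense (Kabsch algorithm)."""
    ct, cm = template.mean(axis=0), markers.mean(axis=0)
    H = (template - ct).T @ (markers - cm)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cm - R @ ct
    return R, t

# Hypothetical wing template vertices (one slightly out of plane)
template = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.],
                     [0., 1., 0.], [0.5, 0.5, 0.3]])

# Synthetic "tracked markers": template rotated by an assumed 30-degree
# stroke rotation and translated
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.],
                   [np.sin(angle),  np.cos(angle), 0.],
                   [0., 0., 1.]])
markers = template @ R_true.T + np.array([0.2, -0.1, 0.5])

R, t = kabsch_align(template, markers)
aligned = template @ R.T + t
```

In practice this would be repeated for every keyframe, after which the remaining residuals between `aligned` and the markers would drive the non-rigid, per-vertex deformation of the subdivision surface.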

**Figure 5.** Initial configuration of a dragonfly template mesh. (a) Dragonfly with marker points on its wings. (b) Wing and body template models [49].

#### **2.3. Numerical method**

To study the aerodynamics of free-flying insects, the flow fields were generated by direct numerical simulation of the three-dimensional unsteady, viscous, incompressible Navier-Stokes equations, written as follows.

$$\frac{\partial u_i}{\partial x_i} = 0 \tag{1}$$

$$\frac{\partial u_i}{\partial t} + \frac{\partial (u_i u_j)}{\partial x_j} = -\frac{\partial p}{\partial x_i} + \frac{1}{Re}\frac{\partial}{\partial x_j}\left(\frac{\partial u_i}{\partial x_j}\right) \tag{2}$$

where $u_i$ ($i$ = 1, 2, 3) are the velocity components in the x-, y-, and z-directions, respectively, $p$ is the pressure, and $Re$ is the Reynolds number.
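For orientation, the Reynolds number in Eq. (2) is $Re = UL/\nu$ for a reference velocity $U$, length scale $L$, and kinematic viscosity $\nu$. The values below are illustrative order-of-magnitude numbers for a dragonfly wing, not measurements from this study:

```python
# Reynolds number Re = U * L / nu (assumed, illustrative values)
U = 1.0        # reference velocity, m/s (wing-speed scale)
L = 0.01       # reference length, m (mean chord, ~1 cm)
nu = 1.5e-5    # kinematic viscosity of air at room temperature, m^2/s

Re = U * L / nu
# Re falls in the hundreds-to-thousands range typical of dragonfly flight,
# where both viscous and inertial effects matter
```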

The above Navier-Stokes equations are discretized using a cell-centered, collocated (nonstaggered) arrangement, in which the velocity components and the pressure are located at the same physical location. The equations are then solved using the fractional step method. The convective and diffusive terms are discretized with an Adams-Bashforth scheme and an implicit Crank-Nicolson scheme, respectively. The immersed boundary method is a computational method used to simulate fluid flow over bodies embedded in a Cartesian grid. It eliminates the need for the complicated re-meshing algorithms usually employed by conventional body-conformal methods and reduces the computational cost of mesh generation at each time step. More details of the current numerical approach can be found in [50]. The current in-house solver has also been validated by simulating canonical revolving/flapping plates [51–56], the flapping wings of insects [14, 49, 57, 58], and physiological flows [59].

**Figure 6.** Reconstructed wings at a time step where a large amount of twist and camber is present in multiple wings.

Learning from Nature: Unsteady Flow Physics in Bioinspired Flapping Flight http://dx.doi.org/10.5772/intechopen.73091

**Figure 7.** Motion reconstruction of dragonfly turning maneuver. The side panels show 4 of 116 frames recorded by high-speed videography [49].
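The Adams-Bashforth/Crank-Nicolson time discretization can be illustrated on a 1D periodic advection-diffusion model problem: the convective term is advanced explicitly with second-order Adams-Bashforth and the diffusive term implicitly with Crank-Nicolson. This is a minimal sketch of the time-stepping idea only; the pressure projection and immersed boundary treatment of the actual solver are omitted, and all parameters are assumed:

```python
import numpy as np

def ab2_cn_step(u, conv_prev, c, nu, dx, dt):
    """One AB2 (convection) / Crank-Nicolson (diffusion) step for
    u_t + c u_x = nu u_xx on a periodic 1D grid."""
    n = len(u)
    # Central difference for the convective term -c u_x
    conv = -c * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    # Second-order Adams-Bashforth: 1.5*current - 0.5*previous
    rhs_conv = 1.5 * conv - 0.5 * conv_prev
    # Crank-Nicolson: half the Laplacian explicit, half implicit
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    rhs = u + dt * (rhs_conv + 0.5 * nu * lap)
    # Periodic second-difference matrix for the implicit half
    D2 = (-2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)) / dx**2
    D2[0, -1] = D2[-1, 0] = 1 / dx**2
    A = np.eye(n) - 0.5 * dt * nu * D2
    return np.linalg.solve(A, rhs), conv

# Advect and diffuse a sine wave (illustrative parameters)
n, c, nu = 64, 1.0, 0.01
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx, dt = x[1] - x[0], 0.002
u = np.sin(2 * np.pi * x)
# Bootstrap AB2 with the initial convective term (first step = forward Euler)
conv_prev = -c * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
for _ in range(200):
    u, conv_prev = ab2_cn_step(u, conv_prev, c, nu, dx, dt)
```

The implicit treatment of diffusion removes the restrictive viscous time-step limit, while the explicit Adams-Bashforth convection keeps each step a single linear solve; this is the usual rationale for the AB2/CN pairing in fractional-step solvers.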
