nonlinear back-scattering from the bubble. At higher mechanical indexes the tissue will also respond nonlinearly, and it then becomes difficult to differentiate the tissue signal from the bubble signal. For contrast imaging at high frequencies, such as 10-30 MHz, which can be used in minimally invasive interventions where the probe can be close to the object being imaged, conventional contrast imaging techniques often have limitations. The dual band SURF technique then has some advantages: the low frequency manipulation pulse can be tuned to match the bubble resonance frequency (typically around 2-3 MHz), whereas the high frequency imaging pulse can be optimized for the object being imaged and can for example be 20 MHz. The low frequency then manipulates the bubble oscillation and back-scattering, which is interrogated by the high frequency pulse. The high frequency imaging pulse is hence decoupled from the resonance properties of the contrast bubbles.

**Figure 6.** Example of SURF transmit pulse complexes where a low frequency manipulation pulse at 1 MHz is co-propagating with a high frequency imaging pulse at 10 MHz. The high frequency imaging pulse is in the left and right panel placed at low and high manipulation pressure, respectively.

36 Advancements and Breakthroughs in Ultrasound Imaging

**3. Ultrasound-based navigation — Enabling technologies**

State-of-the-art ultrasound imaging is crucial for guiding interventions. However, unlike freehand guidance and guidance based on ultrasound guides (figure 1), having optimal images on the ultrasound scanner is not enough to enable surgical navigation. In order to use ultrasound-based navigation to guide such procedures we usually have to:

**•** Get the images out of the ultrasound scanner and into the navigation software in real time.

**•** Track the position and orientation of the ultrasound probe at all times.

**•** Synchronize the image and tracking streams (temporal calibration) and find the transformation between the tracking sensor mounted on the ultrasound probe and the ultrasound scan plane (spatial calibration), which is the part we are actually interested in tracking.

**•** Reconstruct all the position-tagged ultrasound frames from a conventional 2D ultrasound probe into a regular 3D volume that can be used in the same way as preoperative MR or CT.

#### **3.1. Streaming of ultrasound data**

Convenient ultrasound-based navigation of surgical instruments requires real-time access to the ultrasound data in the navigation software (figure 7). This is required in order to tag the ultrasound frames with position and orientation data from the tracking system (alternatively, the tracking data could be routed directly into the scanner and the ultrasound frames used off-line, e.g. to generate a 3D volume from the tagged 2D frames). The traditional way of getting real-time access to ultrasound frames is to connect the analog output (e.g., composite video, S-video) of the ultrasound scanner to a frame-grabbing card on the navigation computer. Using the analog output might degrade the image quality due to the double digital-to-analog-to-digital conversion, and no metadata (e.g. depth) follow the ultrasound images. Alternatively, digital data can be streamed directly from the ultrasound scanner into the navigation computer. Traditionally this has required some kind of research collaboration between the ultrasound manufacturer and the user, but open ultrasound scanners are becoming available (e.g. the Ultrasonix scanner). These systems usually provide just a one-way streaming interface, but two-way communication protocols, where the scanner can be controlled (e.g. depth) by the navigation system, also exist, making more integrated solutions possible (figure 7). Either way, the protocol (or interface / API) used is typically proprietary, although proposals for real-time standards are starting to emerge (e.g. OpenIGTLink, DICOM in surgery (WG24)). When the link between the ultrasound scanner and the navigation system is digital, ultrasound data at different stages in the processing chain on the scanner can be transferred (e.g. scan-converted, scan-line and RF data). Furthermore, a digital streaming interface will be required in order to use the real-time 3D scanners that are now becoming available for navigation as well: it is difficult to capture the 3D content in the scanner display using a frame grabber, so the data needs to be transferred in real time or tagged with a tracking reference on the ultrasound scanner.

**Figure 7.** Streaming ultrasound data into the navigation system. The interface can either be analog using a frame grabber or digital using a direct link and a proprietary protocol. A digital interface can either be one-way (i.e. streaming) or two-way (i.e. optionally control the scanner from the navigation system as well). In any case the image stream must be tagged with tracking data, and in order to do that the two streams need to be synchronized.

#### **3.2. Tracking of ultrasound probes**

In order to use ultrasound to guide surgical procedures the ultrasound probe must be tracked. Several tracking technologies have been proposed over the years (mechanical, acoustical, optical and electromagnetic), but currently the most widely used solutions are optical or electromagnetic systems (see figure 8). Choosing the best tracking technology depends on the application at hand and the ultrasound probes used. If possible, optical tracking systems should be preferred, as magnetic tracking in the operating room can be challenging due to disturbances from metallic objects, and its accuracy is close to, but not as good as, that of optical systems under favorable conditions. For flexible ultrasound probes, or probes that are inserted into the body, magnetic tracking is required, as the transformation between the sensor and the scan plane must be rigid and optical tracking demands a clear line of sight to the cameras. In addition, the magnetic sensors are very small, which is crucial in order to embed them in instruments and put them into the body. When the ultrasound probe is tracked it becomes one of several tools, and the streamed ultrasound data can either be shown in real time at the right spot in the patient or made into a 3D volume and shown together with other images to the surgeon. A brief description of the two main tracking technologies can be found below [51, 52]:

**•** *Optical tracking systems*: The basic idea is to use one or more cameras with markers distributed on a rigid structure whose geometry is specified beforehand (figure 8A). At least three markers are necessary to determine the position and orientation of the rigid body in space. Additional markers improve the camera visibility of the tracked object and the measurement accuracy. The markers can be infrared light-emitting diodes (active markers), infrared light reflectors (passive markers) or some kind of pattern (usually a checkerboard) that can be identified using visible light and image analysis.

**•** *Electromagnetic tracking systems*: A receiver (sensor) is placed on the ultrasound probe and the system measures the induced electrical currents when the sensor is moved within a magnetic field generated by either an alternating current (AC) or direct current (DC) transmitter / generator (figure 8B). The AC and DC devices are both sensitive to some types of metallic objects placed too close to the transmitter or receiver, and to magnetic fields generated by power sources and devices such as cathode-ray tube monitors. Therefore, both types of electromagnetic systems are challenging to use in an environment such as an operating room, where various metallic objects are moved around in the field [53]. The two metal-related phenomena that influence the performance of electromagnetic tracking systems are ferromagnetism and eddy currents [54]. Ferromagnetic materials (e.g., iron, steel) affect both AC and DC systems, because they change the homogeneity of the tracker-generated magnetic field, although the DC systems may be more sensitive to these effects. In contrast, the AC technology is more affected by the presence of conductors such as copper and aluminum because of distortions caused by eddy currents [53, 55]. DC systems minimize the eddy-current related distortions by sampling the field after eddy currents have decayed.

**•** *Comparisons between optical and magnetic tracking systems - pros and cons:* The main advantages of optical tracking systems are their robustness and high accuracy; the challenges are line-of-sight problems and the relatively big sensor frames. For electromagnetic tracking systems it is basically the other way around.

**Figure 8.** Optical (A) and electromagnetic (B) tracking of ultrasound probes.

Ultrasound-Based Guidance and Therapy http://dx.doi.org/10.5772/55884 39

#### **3.3. Ultrasound probe calibration**

After streaming ultrasound data into the navigation software and tracking the ultrasound probe, calibration is needed in order to integrate the image stream with the tracking stream. Ultrasound probe calibration is an important topic, as this is the main error source for ultrasound-based navigation (see section on accuracy). Two types of calibration are necessary: temporal calibration to find the lag between the image and tracking streams, and spatial calibration [56, 57] to find the transformation between the ultrasound scan plane and the tracking sensor mounted on the ultrasound probe (see figure 9):

**•** *Temporal calibration (find the time lag between the image stream and the tracking stream, see figure 9A)*: The most common way to do this is to move the ultrasound probe up and down in a water bath and extract some feature in the generated ultrasound images (or correlate the images and measure the displacement). This gives us two sinusoid-like curves, one for the vertical position of the extracted feature in the images and one for the vertical component in the tracking data. The two curves are compared and one of them is fitted to the other to find the time lag between the two streams.

**•** *Spatial calibration (find the transformation between the image and the sensor, see figure 9B)*: Considerable effort has been spent on probe calibration over the last decade, and it still seems to be a hot research topic, maybe because it is a challenging task to make it accurate, especially if the same method / phantom is to be used for substantially different probes. It is not possible to measure this transform with a ruler, because the orientation of the scan plane relative to the sensor frame is unknown, we do not know the origin of the ultrasound plane inside the probe housing, and magnetic sensors do not have a known origin. A commonly used approach for probe calibration is to acquire 2D images of a phantom with known geometry and to identify distinct features in the images. Because the location of the same features is known in the global coordinate system, the probe calibration matrix can be found from a relatively simple matrix equation. The probe calibration methods reported in the literature mainly differ with respect to the phantom geometry, whereas the processing of the acquired data is more or less common for all methods. The majority of probe calibration methods can be categorized into one of three different classes: single-point or line; 2D alignment; and freehand methods. The calibration matrix can be calculated as follows. Acquire the necessary amount of calibration images and find the coordinates of all the calibration points in each image. Next, transform the corresponding physical points from global reference coordinates into sensor frame coordinates by using the inverse of the tracking matrix. The rigid body transformation that minimizes the mean Euclidean distance between the two homologous point sets will be the probe calibration matrix. The matrix can be calculated using a direct least squares error minimization technique [58].
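The point-set step in spatial calibration — finding the rigid body transformation that minimizes the mean Euclidean distance between two homologous point sets — can be illustrated with a standard SVD-based least-squares solution. This is a minimal NumPy sketch; the function name and the synthetic data are illustrative, not the specific implementation of [58]. In a real calibration, `P` would hold the detected phantom points in image coordinates and `Q` the same points transformed into the sensor frame via the inverse tracking matrix.

```python
import numpy as np

def rigid_register(P, Q):
    """Find rotation R and translation t minimizing ||R @ P[i] + t - Q[i]||
    in the least-squares sense. P and Q are (N, 3) arrays of homologous points."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)          # 3x3 covariance of the centered point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = Qc - R @ Pc
    return R, t

# Synthetic check: recover a known transform from corresponding points.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))
Q = P @ R_true.T + t_true
R, t = rigid_register(P, Q)
```

With noisy, real calibration points the same code returns the least-squares fit rather than an exact recovery.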


**Figure 9.** Temporal (A) and spatial (B) calibration of the ultrasound probe.
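The temporal calibration of figure 9A — comparing the vertical feature position from the images with the vertical component of the tracking data — can be illustrated with a simple cross-correlation. This is a minimal sketch, assuming both curves have already been extracted and resampled uniformly to a common rate; the names and the synthetic motion are illustrative:

```python
import numpy as np

def estimate_lag(image_curve, tracking_curve, dt):
    """Estimate the time lag between two sinusoid-like curves (vertical
    feature position in the images vs. vertical tracking component),
    both uniformly sampled with period dt seconds."""
    a = (image_curve - np.mean(image_curve)) / np.std(image_curve)
    b = (tracking_curve - np.mean(tracking_curve)) / np.std(tracking_curve)
    xcorr = np.correlate(a, b, mode="full")      # peak gives the sample offset
    offset = np.argmax(xcorr) - (len(b) - 1)
    return offset * dt   # positive: the image stream lags the tracking stream

# Synthetic check: a 1 Hz up-and-down motion sampled at 100 Hz,
# with the image curve delayed by 50 ms relative to the tracking curve.
t = np.arange(0, 5, 0.01)
tracking = np.sin(2 * np.pi * 1.0 * t)
images = np.sin(2 * np.pi * 1.0 * (t - 0.05))
lag = estimate_lag(images, tracking, dt=0.01)
```

Fitting a parametric curve to both signals, as described above, can refine this to sub-sample precision.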

#### **3.4. 3D Ultrasound**

It is difficult to guide an instrument into place using conventional 2D ultrasound only (freehand guidance): in order to know where the instrument is we need to see it in the ultrasound image, and to reach the target we have to know where to go from there, a challenging hand-eye coordination task. It is much more convenient to acquire a 3D ultrasound volume first and let the tracked instrument extract slices from the volume that can be annotated with the position and / or orientation of the instrument (see section on visualization).
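The slice-extraction idea can be sketched as follows: given a reconstructed volume and the tracked instrument's pose, sample the voxel grid along a plane through the instrument. A minimal nearest-neighbour sketch in NumPy; the plane parameterization and the names are illustrative assumptions, not a specific navigation system's API:

```python
import numpy as np

def extract_slice(volume, origin, u_dir, v_dir, size=64, spacing=1.0):
    """Sample an oblique plane from a 3D volume (indexed [x, y, z]).
    origin: plane centre in voxel coordinates; u_dir, v_dir: unit vectors
    spanning the plane. Nearest-neighbour sampling; outside the volume -> 0."""
    u = (np.arange(size) - size / 2) * spacing
    uu, vv = np.meshgrid(u, u, indexing="ij")
    pts = origin + uu[..., None] * u_dir + vv[..., None] * v_dir
    idx = np.rint(pts).astype(int)                       # nearest voxel index
    valid = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=-1)
    out = np.zeros((size, size), dtype=volume.dtype)
    ix, iy, iz = idx[valid].T
    out[valid] = volume[ix, iy, iz]
    return out

# Synthetic check: a volume where every voxel stores its own z index;
# an axial plane at z = 5 should therefore read out a constant 5.
vol = np.tile(np.arange(16), (64, 64, 1))                # vol[x, y, z] == z
sl = extract_slice(vol, origin=np.array([32.0, 32.0, 5.0]),
                   u_dir=np.array([1.0, 0.0, 0.0]),
                   v_dir=np.array([0.0, 1.0, 0.0]))
```

In practice the plane's origin and direction vectors would come from the tracked instrument's pose matrix, and trilinear interpolation would replace the nearest-neighbour lookup.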

3D ultrasound data can be acquired in different ways [59]. A conventional 1D array probe (2D+t) can be moved over the area of interest, either by freehand motion or by a motor. If freehand movement is used, all the ultrasound frames can be put together into a volume using tracking data (figure 10) or correlation. A motor inside the probe housing, or external to it, can also be used to cover the ROI by tilting, translating or rotating the 1D array (figure 11). Furthermore, with a 2D matrix probe the ultrasound beam can be steered in the elevation direction in addition to the lateral (azimuth) direction, so that the ROI can be covered while the probe is standing still, making real-time 3D ultrasound imaging possible (figure 12).
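Putting position-tagged frames together into a volume can be illustrated with a minimal Pixel Nearest Neighbor (PNN) distribution step. The sketch assumes, for illustration only, that each tracked frame comes with a 4x4 pose matrix mapping pixel indices (i, j) into voxel coordinates; the hole-filling step is omitted:

```python
import numpy as np

def pnn_distribute(frames, poses, vol_shape):
    """Distribution step of Pixel Nearest Neighbor reconstruction: run
    through every pixel in every tracked 2D frame, transform it into volume
    coordinates with the frame's 4x4 pose matrix and assign its value to
    the nearest voxel. Pixels landing in the same voxel are averaged."""
    acc = np.zeros(vol_shape)
    cnt = np.zeros(vol_shape)
    for frame, pose in zip(frames, poses):
        h, w = frame.shape
        ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        pix = np.stack([ii.ravel(), jj.ravel(),
                        np.zeros(h * w), np.ones(h * w)])   # homogeneous coords
        vox = np.rint(pose @ pix)[:3].astype(int)           # nearest voxel
        ok = np.all((vox >= 0) & (vox < np.array(vol_shape)[:, None]), axis=0)
        x, y, z = vox[:, ok]
        np.add.at(acc, (x, y, z), frame.ravel()[ok])
        np.add.at(cnt, (x, y, z), 1)
    vol = np.zeros(vol_shape)
    np.divide(acc, cnt, out=vol, where=cnt > 0)   # average; empty voxels stay 0
    return vol

# Synthetic check: one constant frame placed at z = 2 by its pose.
frame = np.full((4, 4), 7.0)
pose = np.eye(4)
pose[2, 3] = 2.0
vol = pnn_distribute([frame], [pose], (4, 4, 4))
```

A subsequent hole-filling step would then interpolate the voxels left empty between the acquired planes.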


**Figure 10.** Reconstruction methods: A) Voxel Nearest Neighbor (VNN), B) Pixel Nearest Neighbor (PNN), Distribution Step (DS) and C) Functional Based Methods (FBM).

**Figure 11.** Motorized / mechanical tilting (A), translation (B) and rotation (C). Source: Fenster [59]

**Figure 12.** Matrix probes. Using a 2D array of elements (A) the beam can be steered in two directions (B) and a truncated pyramid of data is acquired (C).

In practice the following methods are in use:

**•** *Freehand 3D ultrasound:* This is still the most widely used method (mainly because of its flexibility) and usually the method works in the following manner: *scan* the area of interest using a conventional 2D probe that is tracked, and *reconstruct* the position-tagged ultrasound frames into a regular 3D volume that can be used in the same way as preoperative MR or CT. The ultrasound probe is usually tracked by optical or electromagnetic sensors, but other methods have been proposed. Furthermore, different methods exist to reconstruct all the 2D frames into a regular 3D volume. The methods can be categorized into three main groups [60]:

**◦** *Voxel-based methods (VBM)*: VBM traverse each voxel in the target voxel grid and gather information from the input 2D images to be placed in the voxel. One or several pixels may contribute to the value of each voxel. The simplest method in this category is Voxel Nearest Neighbor (VNN), which traverses each voxel in the target volume and assigns the value of the nearest image pixel (see figure 10A).

**◦** *Pixel-based methods (PBM)*: PBM usually consist of two steps: a Distribution Step (DS), where the input pixels are traversed and applied to one or several voxels, and a Hole-Filling Step (HFS), where the voxels are traversed and empty voxels are filled. The simplest method in this category is Pixel Nearest Neighbor (PNN), which runs through each pixel in all the 2D input images and assigns the pixel value to the nearest voxel in the target volume (see figure 10B).

**◦** *Function-based methods (FBM)*: FBM choose a particular function (like a polynomial) and determine the coefficients to make the functions pass through the input pixels. Afterwards, the function can be used to create a regular voxel array by evaluating the function at regular intervals (see figure 10C). These methods produce reconstructed volumes with the highest quality but are very computationally intensive and are in limited use today.

**•** *Motorized (or mechanical) 3D ultrasound*: Instead of using freehand movement of the ultrasound probe over the area of interest, a motor can cover the same region by tilting (figure 11A), translating (figure 11B) or rotating (figure 11C) a conventional 1D ultrasound array. Motorized probes have existed for a long time, and the motor can either be mounted inside the probe housing (easy to use but requires a specially built ultrasound probe) or be applied externally (more flexible, as conventional probes can be used). Many of the benefits of freehand scanning also apply to motorized scanning, e.g. the possibility to use high frequency probes with higher spatial resolution, also in the elevation direction (1.25D/1.5D probes). Motorized scanning can use the same kind of reconstruction methods as freehand scanning, but usually more optimized methods are used, as the movement is known and the probe does not need to be tracked during the acquisition. Compared to freehand ultrasound the motorized probes are easier to use in an intraoperative setting, but on the other hand, they are not as flexible in general.

**•** *Real-time 3D ultrasound using 2D matrix probes* [61-65]: Instead of using a conventional 1D array transducer that is moved by freehand or by a motor to sweep out the anatomy of interest, transducers with 2D phased arrays (figure 12A) that can generate 3D images in real time have been developed. Electronics is used to control and steer the ultrasound beam (figure 12B) and sweep out a volume shaped like a truncated pyramid (figure 12C). The main challenge with this technology is the large and heavy cable that would be required to connect all the elements in the array to a wire. Fortunately, technological achievements in terms of multiplexing, sparse arrays and parallel processing over the last decade have made these systems commercially available. They are used extensively in echocardiology, which requires dynamic three-dimensional imaging of the heart and its valves.

## **3.5. Integrated ultrasound-based navigation solutions**

Ultrasound and navigation can be integrated in different ways, as we have seen. Complete systems can usually be categorized as follows:

**•** *Two-rack systems:* Here the navigation computer with tracking system etc. and the ultrasound scanner are two separate systems. This is most common, especially in a research environment. The main reason for this is flexibility: in principle any ultrasound scanner with an analog output can be used together with a navigation system that is equipped with ultrasound-based navigation software. An example of such a configuration is our in-house research system for ultrasound-based navigation called CustusX (figure 13A). The system is used for different clinical applications (e.g. neurosurgery and laparoscopy); each navigation rack is equipped with both optical and magnetic tracking and can be connected to a variety of ultrasound scanners using analog and digital interfaces.

**•** *One-rack systems:* Here the ultrasound scanner and the navigation computer have been integrated in the same system. These systems are more convenient to use in the operating room but are less flexible. Most commercial solutions belong to this category. Two variations exist:

**◦** *An ultrasound scanner with navigation software integrated*: The PercuNav system from Philips, an integrated solution for navigation and intraoperative imaging, is an example of this (figure 13B).

**◦** *A navigation system with an ultrasound scanner integrated*: The SonoWand system (Trondheim, Norway), where an ultrasound scanner has been embedded in the navigation rack, is an example of this (figure 13C). The system can be used in three distinct ways: 1) as a navigation system based on preoperative MR/CT data, 2) as a standalone ultrasound scanner and 3) as an ultrasound-based navigation system with intraoperative imaging capabilities, which is its main use.

**4. Registration and segmentation in ultrasound-based navigation**

Registration is the process of transforming an image into the coordinate system of a patient, or another image. After registration, the same anatomical features have the same coordinates in both the image and the patient, or in both images. Image-to-patient registration is one of the cornerstones of any navigation system, and is necessary for navigation using pre-operative
