Preface

The first stage of satellite information classification and interpretation is data collection using remote sensing. Space technology includes both satellite and aerial remote sensing applications. In general, these applications operate in different regions of the electromagnetic spectrum: energy from the Sun reaches Earth's surface, where objects reflect, transmit, or absorb it, and the reflected portion is collected by satellite sensors and recorded in the satellite's memory.

The next stage of classification and interpretation is a finer reading of these spectra to identify Earth's features. Significant advances in sensor technology stemmed from subdividing the electromagnetic spectrum into several bands, allowing sensors operating across these bands to form multispectral images and opening up opportunities for bringing Earth's features into sharp relief at high accuracy. In general, there are three types of data products: black-and-white (panchromatic) images formed from a single band, and normal-color and false-color composites formed from multiple bands. A single-band image displays as grayscale, while combining three bands at a time generates a color composite image.
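The three-bands-to-color step can be sketched as follows. This is a minimal illustration, not a production workflow: the three input bands are hypothetical 2-D arrays of raw sensor values, and the assignment of bands to display channels (which determines whether the result is a normal-color or false-color composite) is up to the analyst.

```python
import numpy as np

def color_composite(band_r, band_g, band_b):
    """Stack three single-band images into an RGB color composite.

    Each band is a 2-D array of raw sensor values (hypothetical inputs
    for this sketch). Assigning visible red/green/blue bands to the
    R/G/B channels gives a normal-color image; substituting other bands
    (e.g. near infrared into the red channel) gives a false-color one.
    """
    def stretch(band):
        # Linear stretch of each band to the 0..1 display range.
        b = band.astype(float)
        lo, hi = b.min(), b.max()
        return (b - lo) / (hi - lo) if hi > lo else np.zeros_like(b)

    # Stack along the last axis: shape (rows, cols, 3).
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])
```

A single stretched band displayed alone would be the grayscale product described above; the composite arises only from the three-channel stack.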

Interpretation of satellite information may be visual, digital, or an integration of both modes, comprising the sequential processes of detection, identification, description, and assessment of the detected object.

Visible imagery is produced by radiation in the 0.4–0.7 µm range of the electromagnetic spectrum; it is available during daylight hours and when atmospheric transparency is good. Some satellites can sense low-intensity visible light at night, but these data are not routinely used by operational meteorologists. In general, visible imagery is black and white: white displays the brightest, most reflective energy received by the sensor, whereas black displays the least reflective values. Low brightness is associated with oceans, lakes, and other dark background surfaces of Earth; medium brightness values come from land, including forests and deserts. Clouds produce high brightness, displayed in white or light gray.
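The white-to-black brightness convention amounts to a simple mapping from reflectance to display value. A minimal sketch, assuming reflectance is already calibrated to the 0–1 range:

```python
import numpy as np

def to_grayscale(reflectance):
    """Map reflectance (0..1) to an 8-bit grayscale display value.

    Following the convention above: the most reflective surfaces
    (clouds) render white (255), the least reflective (water) near
    black (0), and land falls in between.
    """
    r = np.clip(np.asarray(reflectance, dtype=float), 0.0, 1.0)
    return np.round(r * 255).astype(np.uint8)
```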

Atmospheric windows are generally used for detecting signals from Earth. One such window is the near infrared, covering wavelengths of 0.75–1.4 µm. Infrared (or thermal infrared, IR) imagery is derived from terrestrial radiation emitted by Earth's surface, cloud tops, and the atmosphere in the 10–12 µm range. This portion of the spectrum is available 24 hours a day and depends little on atmospheric conditions. IR values are a measure of the temperature of the emitting surface, with some modifications due to absorption and reemission as the radiation passes through the atmosphere.
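The link between IR values and the temperature of the emitting surface is Planck's law. The sketch below inverts it to recover a "brightness temperature" from an observed radiance, under idealized blackbody assumptions and ignoring the atmospheric absorption and reemission noted above; the function names and the 11 µm example wavelength are illustrative, not from the text.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light in vacuum, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    return (2 * H * C**2 / wavelength_m**5) / math.expm1(
        H * C / (wavelength_m * K * temp_k))

def brightness_temperature(wavelength_m, radiance):
    """Invert Planck's law: the temperature a blackbody would need
    in order to emit the observed radiance at this wavelength."""
    return (H * C / (wavelength_m * K)) / math.log1p(
        2 * H * C**2 / (wavelength_m**5 * radiance))
```

Because warmer surfaces emit more radiance, this inversion is what lets an IR image be read as a temperature map.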

A complication in interpreting IR images is their lower resolution relative to imagery in the visible spectrum. Healthy vegetation reflects near-infrared radiation much more strongly than it reflects green light, so vegetation appears very bright in near-infrared imagery while water appears dark. In thermal infrared images, in particular, bright tones represent the warmest surfaces and dark tones the coolest.
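The contrast between vegetation's strong near-infrared reflectance and water's absorption is the basis of the widely used Normalized Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch, with illustrative reflectance values standing in for real band arrays:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index.

    Healthy vegetation reflects strongly in the near infrared and
    weakly in the red, giving values approaching +1; water absorbs
    near infrared, giving values near or below 0.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero where both bands are dark.
    return np.where(denom == 0, 0.0,
                    (nir - red) / np.where(denom == 0, 1.0, denom))
```

Thresholding such an index is one simple way the light tone of vegetation and the dark tone of water are separated automatically.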

In radar imagery, rough surfaces and structures that scatter the signal back toward the antenna appear bright, whereas smooth surfaces, which reflect the signal away from the sensor, and areas blocked from the radar beam appear dark. Bridges and cities appear very bright, while calm water, pavement, and dry lake beds appear very dark.
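Radar backscatter is conventionally expressed in decibels, and the bright/dark distinction above can be sketched as a simple tone labeling. The thresholds here are illustrative assumptions, not calibrated values from the text:

```python
import math

def backscatter_db(sigma0):
    """Convert linear radar backscatter (sigma-nought) to decibels."""
    return 10.0 * math.log10(sigma0)

def classify_tone(sigma0_db, dark_db=-15.0, bright_db=-5.0):
    """Label a pixel's tone in a radar image.

    The dB thresholds are hypothetical, chosen only to illustrate the
    qualitative bright/dark reading described in the text.
    """
    if sigma0_db <= dark_db:
        return "dark"      # e.g. calm water, pavement, dry lake beds
    if sigma0_db >= bright_db:
        return "bright"    # e.g. bridges, built-up areas
    return "intermediate"
```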

The basic elements of radar image interpretation are the following:


**Rustam B. Rustamov**
Khazar University, Azerbaijan

Section 1
