**Meet the editor**

Dr. Jill S. M. Coleman is an associate professor in the Department of Geography at Ball State University, USA. Her primary research interests include synoptic climatology and hydroclimatic variability, particularly the relationship between atmospheric teleconnections (e.g., the North Atlantic Oscillation) and eastern North American flood, drought, and snowfall patterns. She also investigates topics in the areas of atmospheric hazards, human biometeorology, and tropical cyclone climatology. Dr. Coleman received her PhD in geography-climatology from The Ohio State University, USA.

## Preface

Hazards are processes and events that have the potential to create loss and are sources of danger to human beings and their environment. Some hazards are largely human induced, stemming from technological failures, dangerous procedures, and human actions, such as transportation accidents, chemical explosions, and building collapses. In contrast, natural hazards are caused by features of the physical environment that operate independently of human activities and include geological, biological, hydrological, and atmospheric processes. Environmental hazards describe events resulting from both the natural and built environment that are usually more global with large-scale ecological implications (e.g., climate change). Regardless of causation, the events produce negative impacts or hazards for human physical and psychological well-being and socioeconomic infrastructure.

Atmospheric hazards focus on those events generated primarily from atmospheric processes, such as tropical cyclones, tornadoes, thunderstorms and lightning, hail, blizzards, and other meteorological extremes. However, the distinction between atmospheric-based hazards and other geophysical hazards is not always clear. For instance, floods are a combination of extreme precipitation, geomorphology, watershed structure, and human development (e.g., flood control measures). Tropical cyclones are generated from complex atmospheric and oceanic thermodynamics whose disaster potential is amplified by shallow continental shelves and high-density coastal populations. For these reasons, the investigation and management of atmospheric and other natural hazards often require a multidisciplinary approach.

In addition to the physical mechanisms that create inclement atmospheric conditions, societal factors are perhaps equally important in determining the resiliency of the population to cope with and/or minimize loss. Risk, or the degree of exposure and vulnerability to a hazard, is determined by the product of the probability of the event and the consequences of the loss. No environment is risk free from atmospheric hazards and large geographic variations in risk exposure exist. Population demographics (e.g., age and gender), resource accessibility (e.g., communication network and capital), and other human vulnerability measures are needed for hazard risk assessment.
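The risk definition above (probability of the event times the consequences of the loss) can be sketched in a few lines. This is a minimal illustrative calculation, not from the book; the function name and the probability and loss figures are invented for the example.

```python
# Illustrative sketch of the risk definition above:
# risk = probability of the hazard event x consequence (loss) of the event.
def risk(annual_probability: float, loss_if_event: float) -> float:
    """Expected annual loss from a hazard (units follow the loss input)."""
    return annual_probability * loss_if_event

# Hypothetical comparison: a rare severe event and a frequent minor one
# can carry the same risk despite very different hazard profiles.
print(risk(0.01, 1_000_000))  # 10000.0
print(risk(0.5, 20_000))      # 10000.0
```

The equal results illustrate why risk assessment needs both terms: neither event frequency nor loss magnitude alone ranks hazards correctly.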

Natural and environmental hazards research comprises a diverse set of subjects and methodologies, and this book is no exception, offering the reader only a small glimpse into the physical and social processes that threaten human interests. *Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts* explores atmospheric-based hazards through focused investigations ranging from a local to global perspective. Within this short compendium, the major scales of atmospheric motion are well represented with topics on microscale turbulent transport of pollutants, mesoscale events stemming from thunderstorm complexes, and synoptic scale extreme precipitation episodes. Chapters include discussions on modeling aspects for investigating hazards (pollution, regional climate models) and the forecasting and structure of high wind events (derechos), whereas others delve into hazard communication, preparedness, and social vulnerability issues (tornadoes, hurricanes, and lightning). The major theme of the first three chapters is weather and societal impacts, whereas the latter chapters have a stronger focus on the physical processes and modeling aspects of hazards. Although the chapters are quite disparate upon first inspection, the topics are united through their interweaving of both the physical and societal mechanisms that create the atmospheric hazard and eventual disaster.

I would like to thank the authors for contributing their work to this collection and their patience during the review process. In addition, I would like to acknowledge the helpful assistance and dedication of InTech publishing managers, Ana Simčić for initiating the project and Romina Rovan for completing the editorial processing.

> **Dr. Jill S. M. Coleman** Associate Professor, Department of Geography, Ball State University, USA

**Part I: Atmospheric Hazards and Societal Implications**


## **Lightning Occurrence and Social Vulnerability**

Ronald L. Holle and Mary Ann Cooper

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/63001

## **Abstract**

The occurrence of lightning in time and space around the world is well known. Lightning fatalities and injuries are well delineated in the United States; however, there is much less information about lightning impacts on people in the developing world. It is estimated that between 6000 and 24,000 people are killed globally per year, and 10 times as many are injured. The fatality rate per capita has become very low in the developed countries during the past century due to the availability of lightning-safe structures and vehicles, less labor-intensive agriculture, and other factors, but this reduction has not occurred where people continue to work and live in lightning-unsafe situations. Lightning safety advice often mistakenly expects that the direct strike is most common, but ground current, direct contact, side flash, and upward streamers are much more frequent mechanisms. In developed countries, the injury:death ratio is approximately 10:1, meaning that 90% survive but may have permanent disabling injuries. The proximate cause of death is cardiac arrest and anoxic brain injury at the time of the lightning strike, and, at this time, the damage from a lightning strike cannot be reversed or decreased in survivors. Lightning vulnerability in many developing countries continues to be a major issue due to widespread exposure during labor-intensive agriculture during the day when thunderstorms are the most frequent and while occupying lightning-unsafe dwellings at night.

**Keywords:** lightning, lightning strikes, lightning fatalities, lightning safety, lightning occurrence

## **1. Introduction**

More lightning occurs in clouds than strikes the ground; as the cloud moves above the earth, opposite charges are induced on the surface of the earth and on objects on the ground under the cloud. Upward streamers, not usually visible from these objects, will reach up and attempt to connect with the downward-moving lightning channel. A cloud-to-ground flash has one or more return strokes. When you see lightning flickering, those are return strokes within the same flash. There are about four strokes per flash, when averaged over a large sample size. The first stroke comes faintly to ground and contacts the surface of the earth. The stroke may attach to a tree, open land, the ocean, or other objects. Then the light fills the channel going upward. The next stroke in the flash will likely come down from the cloud to the ground in the same channel as the first stroke, and so on. Occasionally, one of the subsequent strokes will stray from the pre-existing channel and come to ground one or two kilometers away.

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

One of the important features of this well-known mechanism of lightning strikes is the search radius. The cloud-to-ground channel makes its way downward in step leaders of about 50-m lengths (**Figure 1**). At the lower tip of each step, the channel searches for a feature that makes a convenient connection to ground. Higher up in the cloud, there is nothing to strike, so the step leader keeps coming toward the ground as each step attempts to reach out and connect to the ground. Branching, generally downward, occurs along the way. Only when the lowest tip of the channel is 30–50 m from the surface of the earth does it 'decide' what to strike. This last step leader, not usually visible from a distance, is nearly vertical and connects to one of the upward streamers emanating from objects such as trees, poles, and sometimes open water. The most likely connections will be made to objects that are tall, isolated, and pointed. But if that object is more than 30–50 m away from the lowest tip of the downward leader, the lightning channel may come all the way to ground, close to a tall building or tower.

**Figure 1.** Cloud-to-ground lightning photographed from Oro Valley, Arizona, USA (©R. Holle).

## **2. Lightning occurrence**

The location, time of year and day, and number of lightning occurrences throughout the world are known quite well. **Figure 2** shows the latest 10-year map of the United States cloud-to-ground flash density per square kilometer. With respect to time of day, about two-thirds of lightning occurs during the afternoon from noon to 1800 local time. Cloud-to-ground lightning is the most frequent along the coasts of Florida and the Gulf of Mexico, and generally decreases to the north and west as the number of days with substantial low-level moisture decreases in frequency. Large variations in lightning frequency also occur in the western states where large elevation changes affect thunderstorm formation.


**Figure 2.** Cloud-to-ground flash density per square kilometer per year over the contiguous United States from the National Lightning Detection Network from 2006 through 2015 (Courtesy Vaisala, Inc.).

**Figure 3.** Lightning stroke density per square kilometer per year over the world from the Global Lightning Dataset GLD360 from 2012 through 2015 (Courtesy Vaisala, Inc.).

**Figure 3** shows the global occurrence of lightning strokes. The highest densities are over tropical and subtropical coastlines, near large elevation changes, and east coasts at middle latitudes. The lowest lightning frequency is over oceans, the polar regions, and west coasts at middle latitudes. The dominance of land lightning is due to daytime heating that can produce updrafts rising to altitudes where temperatures are colder than freezing, the altitudes where lightning initiates. Since much of outdoor human activity occurs during daytime, the juxtaposition of lightning with people can be expected to result in fatalities and injuries.

Nearly every region of the world has a season with more lightning than others. In the middle latitudes about two-thirds of lightning occurs during meteorological summer (June, July, and August in the Northern Hemisphere and December, January, and February in the Southern Hemisphere). In the Tropics, the passage of the equatorial trough or the summer monsoon strongly affects the frequency of lightning. The equatorial trough (also known as the Intertropical Convergence Zone, or ITCZ) is a somewhat continuous east-west area of rain and thunderstorms that stays within about 20 degrees latitude of the equator, shifting northward (southward) during the Northern (Southern) Hemisphere summer. For locations near the equator, the equatorial trough crosses twice a year, which can result in two rainy seasons, whereas more subtropical locations have only a single rainy season. The Asian Monsoon can be considered a variation of the equatorial trough that is strongly affected by the large land mass of Asia and the Himalayas. The equatorial trough generally moves northward in the Northern Hemisphere summer and brings tropical moisture from the south, often resulting in higher thunderstorm and lightning frequencies, and then reverses to send dry air flowing from the north during the Northern Hemisphere winter when lightning occurrence is much less likely.

## **3. Lightning fatalities and injuries**

## **3.1. United States fatalities and injuries**

According to the U.S. National Weather Service (NWS), lightning fatalities have averaged 32 per year over the past decade. **Figure 4** shows the latest decade of available fatality data, with the top panel showing the number of fatalities by state and the lower panel the fatality rate per million people. The rate pattern differs considerably from the raw counts, indicating that the number of lightning flashes and the number of fatalities are related, but less directly than might be expected. The high fatality rates in the western United States suggest a region where drier air and lower rainfall rates create the perception that lightning is less of a threat than in locations elsewhere in the United States where it is accompanied by heavy rain.


**Figure 4.** Ranks of lightning fatalities per state (a) and fatality rate per million people (b) from 2005 through 2014 based on the United States National Weather Service website [2].

The necessary ingredients of lightning frequency and population density in the United States have been combined in a recent study that is able to replicate the primary locations of observed lightning fatalities [1]. This approach shows the concentration of fatalities in urban areas that have moderate to high lightning frequencies. Such a study has not yet been attempted in other countries where the lightning risk is very different due to such factors as the availability of lightning-safe buildings and vehicles, agricultural participation, and related societal differences.

Underreporting of lightning casualties has been a problem in past years in the United States, but much less in the past couple of decades. A primary reason for underreporting is that about 90% of all lightning fatalities and injuries are to one person at a time, which leads to the tendency for such events to be reported less often in the media than multiple-casualty events. With the inception of the Lightning Safety Awareness Team, a multidisciplinary group sponsored by the National Weather Service, it is probable that every fatality is documented [2], but an estimate is that only about 70% of the injuries reach the reporting system that is maintained by all NWS offices across the country. An additional issue is that underreporting may seem to be occurring when there is actually a definition issue. The National Weather Service in its *Storm Data* publication does not include secondary casualties due to lightning. For example, a house caught on fire at night due to lightning that results in a fatality or injury is not counted as a lightning impact, since the primary cause is coded as fire not lightning. A useful method for estimating injury underreporting is the ratio of injuries to deaths. An intensive study over three full years across the state of Colorado showed that about ten injuries occur per fatality [3]. When the ratio is less than ten, an indication exists that not enough injuries have been reported and documented. The 10:1 injury:death ratio is assumed to apply to the United States and more developed countries with their similar socio-economic infrastructures, providing widespread availability of lightning-safe buildings and vehicles.
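The benchmark described above, roughly ten injuries per fatality [3], gives a simple way to gauge injury underreporting. The sketch below is not from the chapter; the function name and the example injury count are invented for illustration, while the 32 deaths per year and the roughly 70% reporting figure echo numbers cited in the text.

```python
# Sketch of the injury-underreporting check described above.
# Assumption: the Colorado-derived benchmark of ~10 injuries per
# fatality applies to comparable developed-country settings.
BENCHMARK_INJURIES_PER_DEATH = 10.0

def reporting_completeness(reported_injuries: int, deaths: int) -> float:
    """Estimated fraction of injuries that reached the reporting system."""
    expected_injuries = BENCHMARK_INJURIES_PER_DEATH * deaths
    return reported_injuries / expected_injuries

# Illustrative numbers: 32 deaths/year (the NWS decadal average) and a
# hypothetical 224 reported injuries would imply ~70% of injuries captured.
print(f"{reporting_completeness(224, 32):.0%}")  # 70%
```

A ratio below the benchmark flags likely underreporting rather than proving it, since the true injury:death ratio also varies with the availability of safe locations.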

The lightning fatality rate per million people in the United States has dropped by more than two orders of magnitude since 1900 (**Figure 5**). Similar trends have been observed in many more developed countries of the world with published lightning fatality rates during the same period [4]. Also shown is the rural percentage of the population, which decreased from 60% in 1900 to fewer than 20% at present. The United States population not only transitioned out of a mainly labor-intensive agricultural society a century ago, but also moved into more substantial home and workplace buildings with grounded wiring and plumbing, together with the ready availability of fully enclosed metal-topped vehicles, better medical care, and greatly improved meteorological information about thunderstorms. All of these factors are present in more developed countries that have lower lightning death rates.

**Figure 5.** Solid red line: United States lightning deaths per million people from 1900 to 2013. Dashed blue line: percent rural population.

**Figure 6** graphs United States lightning fatality types roughly a century apart, comparing the 1890s (top) with the 2005–2014 period (bottom). During the late nineteenth century, being indoors was the most common situation for lightning fatalities, while agriculture and outdoor activities also accounted for a large share of the events. During the decade starting in 2005, indoor fatalities have become quite rare, and agricultural events have greatly declined from a century ago as farming became mechanized and lightning-safe buildings and vehicles became common. Recreation and sports situations are relatively more frequent now than earlier. However, the often-overstated golf scenario has been substantially exceeded in recent years by hiking, climbing, boating, and other water-related activities [2].

**Figure 6.** Comparison of the percentage of types of the United States lightning fatalities in the 1890s versus 2005 through 2014 [20].

## **3.2. Global fatalities and injuries**


Worldwide, **Figure 7** shows information on lightning fatality rates that has been collected in a number of countries during the last quarter century; nevertheless, significant gaps exist in our knowledge of the absolute numbers [4]. Studies have estimated the number of deaths per year attributed to lightning globally at anywhere from a few thousand to 24,000 [5–7]; however, much uncertainty exists due to the limited sample size (**Figure 7**). One country of particular interest is Malawi, with a very high rate of 84 lightning deaths per million people per year that far exceeds the rate in any other country [4]. The 1008 annual fatalities in this small but populous country may reflect an unusually complete data collection method, one that adjacent countries in the region should emulate; very large numbers of lightning fatalities may actually be occurring without our knowing it. If the "10:1 (injury:death) rule" is appropriate for developing countries, then lightning is causing a large number of deaths and injuries worldwide, regardless of our inability to state the actual numbers.
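The Malawi figures quoted above are internally consistent, which a line of arithmetic confirms (rate per million = deaths / population × 10⁶). The population below is implied by the chapter's numbers rather than stated in it, and the 10:1 extrapolation is the assumption discussed in Section 3.1, not a reported count.

```python
# Consistency check of the Malawi figures cited above.
deaths_per_year = 1008
rate_per_million = 84.0

# rate = deaths / population * 1e6  =>  population = deaths / rate * 1e6
implied_population = deaths_per_year / rate_per_million * 1_000_000
print(f"implied population: {implied_population:,.0f}")
# implied population: 12,000,000

# If the 10:1 injury:death benchmark held here (an assumption), injuries
# would number roughly ten times the fatalities.
implied_injuries = 10 * deaths_per_year
print(f"implied injuries/year: {implied_injuries}")
# implied injuries/year: 10080
```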

**Figure 7.** Lightning fatality rate per million people per year by continent. Red shading indicates rate >5.0 fatalities per million per year, orange is 0.6 to 5.0, and yellow is 0.5 or less. White indicates no national summaries have been published for datasets ending in 1979 or later (Updated with permission from Holle [4]).

**Figure 8.** Many forms of labor-intensive work are far from lightning safety such as (a) fishermen in open boats on Lake Victoria, the second largest lake in the world, and (b) fields in Nepal (courtesy ACLENet and M.A. Cooper).

In contrast with the United States, many populous less-developed countries have as much as 90% of the population living and working in lightning-unsafe locations and situations. During the day, many people are involved in labor-intensive agriculture, fishing in open boats, walking to market, or inside schools without recourse to safety in an appropriate building or vehicle [8] (**Figures 8** and **9**). At night, people live in lightning-unsafe dwellings without adequate wiring, plumbing, or metal structural components that can carry lightning into the ground without affecting individuals inside [9] (**Figure 10**).


**Figure 9.** Most forms of transportation or going to market in developing countries are not lightning safe such as (a) taxi in India, (b) ox cart in India, and (c) bota-bota taxi in Nepal (©M.A. Cooper).

**Figure 10.** An estimated 90% of sub-Saharan buildings and housing are not lightning-safe, such as (a) farming settle‐ ment with thatch and sheet metal roofs on mud brick walls in Zambia, (b,c) combination shops and homes in India (©M.A. Cooper).

In the developing world, the injury:death ratio is expected to be lower since fewer lightning-safe locations are available, resulting in a higher proportion of deaths. In addition, more people die per lightning event in the developing world than in the United States, particularly in agriculture and school events. For developing countries, counting both "primary" and "secondary" causes may be preferable until fatality rates decrease and the availability of safe locations increases.

In more developed countries, the dominant profile of lightning casualties is the young male. Risk-taking in recreation, workplaces, and organized sports tends to be dominated by males between about 15 and 30 years old. However, in less developed countries, the distribution is substantially more equal between females and males, and the ages are much more dispersed, both patterns reflecting more widespread exposure. A recent study of labor-intensive agriculture, mainly in India and Bangladesh, shows that 47% of the fatalities and injuries were females, as they work during the daytime when thunderstorms are most frequent [8]. The lack of lightning-safe dwellings, schools, and workplaces means that all ages and both genders are equally vulnerable at all times.

## **3.3. Damage to property and indirect impacts**

In developed countries, precautions against the effects of lightning on property, electronics, and utility lines are usually routine and subject to building codes using well-accepted practices for public buildings such as churches, hospitals, and schools. In developing countries, lightning damage to property can have not only direct effects on the structure but also indirect economic effects. The impacts include food spoilage when refrigeration fails after an electrical outage, electrical parts and repairs that are unaffordable or unavailable for days, hospitals left without power, and damage to databases and expensive, irreplaceable electronics. These adverse effects can occur in countries that are already struggling with other pressing issues such as drought, HIV, underemployment, or civil strife. Individual families can suffer not only when one or more of their members is injured or killed by lightning but also when their livestock, often the major measure of wealth, are killed en masse by lightning (**Figure 11**).

**Figure 11.** Cattle killed by a cloud-to-ground lightning strike in South Africa (©I. Jandrell).

## **4. Lightning injury mechanisms and safety**


12 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts

A widespread misconception exists that the direct strike is the most common lightning injury mechanism. A direct result of this misconception is that most lightning safety avoidance rules address the stroke coming straight down and striking a person on the head. This is, in fact, the least common mechanism. Instead, there are five primary mechanisms of lightning injuries (**Figure 12**).

**Figure 12.** Mechanisms of lightning injury and death.

The five mechanisms lead to a very different conclusion regarding lightning avoidance than the direct strike: everywhere outside is unsafe from lightning, and lowering oneself in height is not sufficient. In order of highest frequency:

**•** Ground current: Current from the strike point spreads horizontally through the ground and passes through people or animals standing nearby.

**•** Side flash: Current jumps from a struck object, such as a tree, to a person close to it.

**•** Contact injury: Occurs when a person is touching an object at the moment lightning strikes it.

**•** Upward leader: Occurs when an upward leader is induced from a person and rises to meet the downward-traveling stepped leader from cloud to the ground. The upward streamer is strong enough to cause injury even when the lightning channel is not completed through the person.

**•** Direct strike: The least common mechanism.

**•** Blunt trauma: Occurs with or separately from all of these mechanisms as a person is thrown or when too near where the lightning strikes [10].
Lightning safety myths abound. One cannot anticipate with certainty where lightning will strike the ground or what it will strike. At best, lightning can only be described statistically as more likely to hit certain types of objects: tall, isolated, and/or pointed. Safety messages that stress avoiding standing in a certain way, not holding specific objects, being a certain distance from tall objects, and variations on these concepts are not reliable. Safety messages that emphasize what is on the feet are irrelevant; after lightning rips apart several kilometers of air on its journey from cloud to ground, a thin rubber shoe is overwhelmed and immaterial. Part of the perpetuation of many false myths is due to the fact that around 90% of lightning casualties survive. What may seem to be a factor in an individual case, and go on to assume mythological quality, usually does not generalize to a population.

It is estimated that around 10% of lightning casualties are related to trees [11]. Perhaps a third of all cloud-to-ground lightning flashes around the globe attach to trees. Once lightning strikes a tree, the current comes down the trunk and spreads horizontally (ground current). It also produces side flashes to people or animals who are close by, such as those seeking to stay dry under the canopy (**Figure 11**) while close to the trunk. Some people may suffer contact injury if they are touching the tree at the moment lightning strikes it. Finally, blunt and penetrating trauma can occur when bark and tree limbs explode outward at high speed, up to tens of meters away.

Only two reliably safe locations from lightning exist. One is inside a substantial, well-constructed building with wiring, plumbing, and perhaps metal structural members. Such buildings, where people work or live, are able to provide a path for a lightning strike into the ground without causing harm. The other is inside a fully enclosed metal-topped vehicle. Both act similarly to a Faraday cage, in which the current flows around, rather than through, the people inside the structure or vehicle. Direct strikes to such buildings and vehicles can be frightening and sometimes have disconcerting impacts. However, such property damage is massively preferable to people being outside of such locations.

Unsafe structures include anywhere with the word shelter attached—beach, sun, shade, rain, or bus shelter. While they can be made safe, most people cannot tell if that is the case (**Figure 13**). One should always assume they are not lightning-safe because they will likely not surround a person inside with a certain path for lightning to follow. Similarly, any other structure is unsafe when made of mud, brick, or thatch without specifically designed metal-conducting paths for the current to follow.

**Figure 13.** Lightning-unsafe small structure. Note sign recommending a safe place elsewhere rather than staying at this location (©R. Holle).

Unsafe vehicles include motorcycles, convertibles, golf carts, tuk-tuks, boda-bodas, four-wheelers, and similar vehicles. A summary of motorcycle lightning events, often resulting in deaths, is given in [12]. A common misconception is that the rubber tires are of relevance. When lightning strikes a fully enclosed metal-topped vehicle, the current flows through the metal structure around the people inside, then exits through the ground. The tires are the shortest path to ground, so they may explode or flatten. The tires are damaged as an effect of the lightning strike, but they do not protect the people inside; safety is provided by the metal structure surrounding them.

## **5. Lightning injury**


## **5.1. Effects of lightning on people**

A myriad of injuries from lightning have been reported including damage to the ears, eyes, skin, heart, and brain [13]. The proximate cause of death is cardiac arrest and anoxic brain injury at the time of the strike, even if resuscitation delays the legal pronouncement for a few days.

Most people assume that lightning causes significant burn injuries but, in developed countries, burns tend to be superficial and insignificant; lightning more often causes neurological injury and blunt trauma. At the time of the strike, injured persons often suffer keraunoparalysis, a paralytic state lasting minutes to hours, with loss of sensation affecting the lower limbs more than the upper limbs. In developed countries, keraunoparalysis usually resolves without treatment, although some survivors may have permanent weakness. In developing countries, where mud brick, thatched roofs, and other insubstantial buildings are the norm, keraunoparalysis may prevent even the most robust person from escaping as burning thatch falls on them, resulting in reports by journalists of "charred bodies" [14].

Lightning-injured persons may suffer temporary or permanent neurological problems, including chronic pain syndromes and cognitive damage similar to those reported in post-concussive syndrome: inability to multitask, attention deficit, memory problems, learning difficulty, irritability, and inability to return to their previous level of employment [15, 16]. Disability may significantly affect a family's socioeconomic status if the survivor is unable to return to work or needs chronic care. A further setback to the victim's family, particularly in developing countries such as those in Africa, is a common belief that a family affected by lightning injury has been "cursed". This may force the family to leave their community, home, and employment to start over in a new community where their tragedy is unknown [17].

## **5.2. Treatment of lightning injury**

While the effects of lightning injury can be treated, currently there is no way to reverse or decrease the damage that is set in motion when the strike occurs. Lightning Strike and Electric Shock Survivors International (LSESSI) is a support group that has helped hundreds of survivors and their families [16]. Treatment is standard for pain syndromes, anoxic brain injury, and cognitive disability. Unfortunately, this type of care is expensive and seldom available in developing countries.

As in most injuries and illnesses, prevention is far better than caring for those injured by lightning, and, in developed countries, lightning injury prevention is simple and cost-effective [18]. However, decreasing lightning injuries in developing countries is a much more complex task than in countries where lightning-safe structures and vehicles are common and close by. A review of over 100 events in the past decade in which lightning affected pupils and staff at insubstantial schools in developing nations found 200 deaths and 700 injuries. These events occurred most often in primary and high schools, with many situations involving dozens of children per event. Partially in response to such events, the African Centre for Lightning and Electromagnetics Network (www.ACLENet.org) was established in 2013. The confluence of lightning frequency and personal vulnerability at work, school, and home in less developed countries in Africa, as well as in Southeast Asia [19], makes this a timely endeavor.

## **6. Conclusions**

The location, time, and frequency of cloud-to-ground lightning around the globe have become quite well known. About two-thirds of cloud-to-ground lightning occurs between noon and 1800 local time, and about two-thirds occurs in summer in the middle latitudes. Lightning is more frequent along coastlines and near large topographic features in the Tropics and subtropics. However, due to socioeconomic factors, the number of fatalities and injuries is not as closely related to lightning frequency as might be expected. In the United States and other developed countries, the fatality rate per capita is more than two orders of magnitude lower than it was a century ago. This decrease is attributable to a shift from rural to urban settings, where fewer people are involved in labor-intensive agriculture. Many lightning injuries and deaths in the developed world are from leisure activities, often in the vicinity of water bodies [2]. At least as important have been improvements in the quality of buildings, which are usually safe from lightning because grounding according to codes provides safety to the people inside. In addition, the widespread availability of fully enclosed metal-topped vehicles provides mobile lightning-safe locations almost everywhere. Additional factors include better medical treatment and improved understanding of thunderstorms and their associated lightning threat.

In contrast, many developing countries continue to have very high per-capita fatality and injury rates. A large portion of their populations is involved in labor-intensive agriculture during the daytime, when thunderstorms are more common. No lightning-safe locations are typically available to these people, often nearly equally male and female, while they work in the fields. In addition, the dwellings occupied outside of working hours often are not lightning-safe due to their construction of mud brick and thatch or sheet metal roofing. As a result, one estimate is that as many as 24,000 deaths and 240,000 injuries occur from lightning globally every year [7], almost entirely in the less-developed countries of the world. These figures remain uncertain due to sporadic data gathering in the most lightning-vulnerable locations, such as the equatorial countries of Africa and Asia, where data collection will be slow to improve in the near future. Injury prevention is far better than taking care of people after they are injured; both are much easier in developed countries. Unfortunately, the infrastructure and housing in developing countries preclude easy answers like "When Thunder Roars, Go Indoors", and lightning remains a substantial threat to entire villages, schools, and populations.
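The per-capita contrast drawn above can be made concrete with a quick calculation. In this sketch, the 24,000 global deaths per year is the estimate quoted from [7]; the population values and the modern U.S. annual death count are round illustrative assumptions, not figures from this chapter.

```python
# Per-capita lightning fatality rate: deaths per year per million people.
# The 24,000 global deaths/year is the estimate quoted in the text [7];
# the population figures and the ~30 U.S. deaths/year are round
# illustrative assumptions, not values from this chapter.
def deaths_per_million(deaths_per_year: float, population: int) -> float:
    return deaths_per_year / population * 1_000_000

global_rate = deaths_per_million(24_000, 7_300_000_000)  # roughly 3.3
us_rate = deaths_per_million(30, 320_000_000)            # roughly 0.1
```

The two rates differ by well over an order of magnitude, consistent with the gap described between the developing and developed worlds.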

## **Author details**


Ronald L. Holle1\* and Mary Ann Cooper2

\*Address all correspondence to: rholle@earthlink.net

1 Holle Meteorology & Photography, Oro Valley, Arizona, USA

2 Department of Emergency Medicine, University of Illinois at Chicago, Chicago, Illinois, USA

## **References**


[1] Roeder WP, Cummins BH, Cummins KL, Holle RL, Ashley WS. Lightning fatality risk map of the contiguous United States. Natural Hazards, 2015;79:1681-1692. DOI 10.1007/s11069-014-1254-9.

[2] National Oceanic and Atmospheric Administration, National Weather Service. Victims/Medical, Analysis of Recent Lightning Fatalities. www.lightningsafety.noaa.gov.

[3] Cherington M, Walker J, Boyson M, Glancy R, Hedegaard H, Clark S. Closing the gap on the actual numbers of lightning casualties and deaths. In: Preprints of the 11th Conference on Applied Climatology, 10–14 January 1999; Dallas, Texas, American Meteorological Society; 1999. pp. 379-380.

[4] Holle RL. The number of documented global lightning fatalities. In: Preprints of the 6th International Lightning Meteorology Conference, 18–21 April 2016; San Diego, California, Vaisala; 2016. 4 pp.

[5] Gomes R, Ab Kadir MZA. A theoretical approach to estimate the annual lightning hazards on human beings. Atmospheric Research, 2011;101:719-725. DOI 10.1016/j.atmosres.2011.04.020.

[6] Cardoso I, Pinto Jr. O, Pinto IRCA, Holle RL. A new approach to estimate the annual number of global lightning fatalities. In: Preprints of the 14th International Conference on Atmospheric Electricity (ICAE), 8–11 August 2011; Rio de Janeiro, Brazil; 2011. 4 pp.

[7] Holle RL, López RE. A comparison of current lightning death rates in the U.S. with other locations and times. In: Preprints of the International Conference on Lightning and Static Electricity, 16–19 September 2003; Blackpool, England, Royal Aeronautical Society, paper 103-34 KMS; 2003. 7 pp.

[8] Holle RL. Lightning-caused deaths and injuries related to agriculture. In: Preprints of the 6th International Lightning Meteorology Conference, 18–21 April 2016; San Diego, California, Vaisala; 2016.

[9] Holle RL. Lightning-caused casualties in and near dwellings and other buildings. In: Preprints of the International Lightning Meteorology Conference, 21–22 April 2010; Orlando, FL, Vaisala; 2010. 19 pp.

[10] Blumenthal R, West NJ. Investigating the risk of lightning's pressure blast wave. South African Journal of Science, 2015;111(3/4): 5; article 2014-0187.

[11] Holle RL. Lightning-caused deaths and injuries in the vicinity of trees. In: Preprints of the 6th Conference on the Meteorological Applications of Lightning Data, 6–10 January 2013; Austin, Texas, American Meteorological Society; 2013. 8 pp.

[12] Cooper MA, Holle RL. Casualties from lightning involving motorcycles. In: Preprints of the International Conference on Lightning and Static Electricity, 28–30 August 2007; Paris, France; 2007. Paper Ic07/PPRKM02.


## **Analysis of the Twitter Response to Superstorm Sandy: Public Perceptions, Misconceptions, and Reconceptions of an Extreme Atmospheric Hazard**

John A. Knox, Brendan Mazanec, Emily Sullivan, Spencer Hall and Jared A. Rackley

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/64019

## **Abstract**

Superstorm Sandy was the second-costliest hurricane in U.S. history, causing catastrophic flooding and prolonged power outages in New Jersey and New York. The public's response to this extreme event on the social network Twitter is examined using statistical analysis and manual inspection of 185,000 "tweets" relating to Sandy. Sentiment analysis of tweets from Manhattan Island reveals a statistically significant trend toward negative perceptions, especially on the southern half of the island, as Sandy made landfall. Inspection of all tweets uncovered scientific misconceptions regarding hurricanes, and a surprising and disquieting anthropomorphic *reconception* of Sandy. This reconception, divorced from factual information about the storm, dominated the "Twittersphere" compared to official scientific information. The implications of such reconceptions for social media communication during future extreme events, and the utility of the methodology employed for analysis of other events, are discussed.

**Keywords:** hurricane, social media, scientific misconceptions, Superstorm Sandy, Twitter

## **1. Introduction**

At the end of October 2012, Superstorm Sandy—a hurricane with the size of an extratropical cyclone—battered the mid-Atlantic coast of the United States. This chapter examines the response of the public on the social media venue Twitter [1] to the approach, landfall, and immediate aftermath of the storm.

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Twitter is one of the most popular social networks in the world, with approximately 180 million users at the time of Sandy and over 300 million users at the end of 2015 [2]. A novel feature of Twitter is the restriction of posts to 140 characters or fewer. This restriction encourages brevity and contractions among users, and limits in-depth discussion, unlike other social networks such as Facebook.

The social media reaction to Sandy has been examined from a variety of angles. Edwards et al. analyzed millions of geolocated "tweets" (i.e., posts on Twitter) and focused on the utility of Twitter to meet needs often provided heretofore by first responders and relief agencies [3]. Similarly, Chatfield et al., in a conference presentation, studied the ability of Twitter users to convey time-critical information during this disaster [4]. Lachlan et al. noted, however, that Twitter communications during the storm were used more for emotional release than for dissemination of information, and that messages from official organizations were largely absent [5]. In addition, a large automated effort to examine Twitter messages during Sandy is underway at the National Center for Atmospheric Research [6].

Our research presented in this chapter attempts to combine the best aspects of statistical analysis of a large dataset with qualitative insights gained by manual, not automated, examination of individual tweets. We have accomplished the latter via the creation of original software which makes visual inspection of Twitter messages easy and efficient. Our initial objective was to characterize misconceptions and their propagation in Twitter posts; however, our research revealed not only misconceptions but an intriguing, and disquieting, *reconception* of Sandy that threatened to drown out factual messages regarding the storm.
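The quantitative half of this two-track workflow (bulk scoring of many tweets, with humans reading the interesting ones) can be sketched in a few lines. This is a minimal illustration assuming a simple word-list sentiment scorer; the word lists and sample tweets are invented stand-ins, not the authors' actual software, lexicon, or data.

```python
# Sketch of a lexicon-based sentiment pass over timestamped tweets.
# The tiny word lists and sample tweets below are illustrative assumptions,
# not the chapter's actual lexicon or dataset.
NEGATIVE = {"scared", "flooded", "terrible", "dark", "outage", "damage"}
POSITIVE = {"safe", "fine", "calm", "relief", "grateful"}

def sentiment_score(tweet: str) -> int:
    """Crude polarity: +1 per positive word, -1 per negative word."""
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def mean_score_by_hour(tweets):
    """tweets: iterable of (hour, text) pairs; returns {hour: mean score}."""
    totals, counts = {}, {}
    for hour, text in tweets:
        totals[hour] = totals.get(hour, 0) + sentiment_score(text)
        counts[hour] = counts.get(hour, 0) + 1
    return {h: totals[h] / counts[h] for h in totals}

sample = [
    (18, "streets are calm and we feel safe"),
    (21, "power outage and water damage everywhere"),
    (22, "so scared the basement flooded"),
]
hourly = mean_score_by_hour(sample)  # mean sentiment per hour of day
```

A trend test (e.g., regression of hourly mean score against time) would then quantify the shift toward negative perceptions; flagged tweets would still be read by hand, as in the chapter's approach.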

Below, we provide an overview of Superstorm Sandy, discuss in detail the two methodologies used to study the dataset of tweets, examine the results from both methodologies, and provide conclusions based on our quantitative and qualitative results.

## **2. Physical science overview**

This overview of Sandy relies on the authoritative post-storm analysis of [7]. Sandy began as a tropical wave off the west coast of Africa on 11 October 2012. After traversing the tropical Atlantic and moving westward into the Caribbean without much growth, it reached tropical storm intensity (34 kt, 39 mph, or 17 m s−1) on 22 October south of Jamaica (**Figure 1**). At this point the storm received the name "Sandy" from the list created by the World Meteorological Organization for Atlantic hurricanes, which alternates between male and female names familiar to the cultures that border the tropical Atlantic. "Sandy," a female name, was given to the 18th named storm of the 2012 Atlantic hurricane season, following "Rafael."

Two days later, after executing a loop south of Jamaica, Sandy reached Category 1 hurricane strength (64 kt, 74 mph, or 33 m s−1) on the Saffir-Simpson Hurricane Intensity Scale at 1200 UTC 24 October just off the southeast coast of Jamaica. It then crossed Jamaica, moving northward, and became a major hurricane (100 kt, 115 mph, or 51 m s−1), i.e., a Category 3 on the Saffir-Simpson scale, just prior to landfall in Cuba. Jamaica, Cuba, Haiti, and the Dominican Republic were all impacted by Sandy, with 69 deaths and hundreds of thousands of homes destroyed, particularly in Cuba. Sandy expanded greatly in size after crossing Cuba, with the radius of tropical-storm-force winds doubling by the time it passed the Bahamas. However, its winds dropped in intensity, below the threshold for a hurricane.
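The wind figures quoted in this overview can be cross-checked with simple unit conversions. The sketch below assumes the operational Saffir-Simpson Hurricane Wind Scale category boundaries in knots, which the chapter itself does not state; treat it as background arithmetic rather than part of the post-storm analysis.

```python
# Knot conversions and Saffir-Simpson categories for the figures quoted above.
# The category thresholds (in knots) are the operational Saffir-Simpson
# Hurricane Wind Scale boundaries, supplied here as background assumptions.
KT_TO_MPH = 1.15078   # 1 knot in statute miles per hour
KT_TO_MS = 0.514444   # 1 knot in meters per second

def saffir_simpson_category(wind_kt: float):
    """Category for a sustained wind in knots, or a sub-hurricane label."""
    if wind_kt >= 137:
        return 5
    if wind_kt >= 113:
        return 4
    if wind_kt >= 96:
        return 3
    if wind_kt >= 83:
        return 2
    if wind_kt >= 64:
        return 1
    return "tropical storm" if wind_kt >= 34 else "tropical depression"

# Cross-check against the text: 34 kt ~ 39 mph ~ 17 m/s; 64 kt ~ 74 mph ~ 33 m/s;
# Sandy's 100-kt peak (~115 mph, ~51 m/s) is a Category 3 (major hurricane).
assert round(34 * KT_TO_MPH) == 39 and round(34 * KT_TO_MS) == 17
assert round(64 * KT_TO_MPH) == 74 and round(64 * KT_TO_MS) == 33
assert round(100 * KT_TO_MPH) == 115 and round(100 * KT_TO_MS) == 51
assert saffir_simpson_category(100) == 3
```

The rounded conversions reproduce every kt/mph/m s−1 triple given in the narrative.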


**Figure 1.** The track of Sandy from inception in the Caribbean Sea until its demise after extratropical transition (ET) over the northeastern United States. Saffir-Simpson categories are indicated by number and shading. Image adapted from https://coast.noaa.gov/hurricanes/?redirect=301ocm.

As it moved northward in the western Atlantic, Sandy regained strength and became a menace to the Atlantic coast of the United States. Due to interactions with an approaching trough over North America and the influence of warm Gulf Stream waters underneath it, Sandy intensified as it moved north and reached a secondary maximum of 85 kt (98 mph or 44 m s−1) at 1200 UTC on 29 October, about 220 nautical miles (405 km) southeast of Atlantic City, New Jersey. Under the influence of a strong high-pressure system to its north, Sandy executed a highly unusual left turn that brought the storm inland over New Jersey instead of heading eastward out to sea. At 2330 UTC on 29 October, Sandy made landfall just northeast of Atlantic City with estimated sustained winds of 70 kt (81 mph or 36 m s−1) and a minimum central pressure of 945 mb. It was one of the largest and most intense hurricanes ever observed in the mid-Atlantic, particularly for so late in hurricane season. Due to the increasingly extratropical nature of the storm as it approached the coast, hurricane experts reclassified Sandy as a "post-tropical" storm shortly before landfall. However, in terms of hazards such as wind, rain, and storm surge, Sandy was virtually indistinguishable from a hurricane. In this paper we adopt the descriptor "Superstorm" for Sandy in order to reflect its dual nature.

**Figure 2.** The Battery Park underpass at the southern tip of Manhattan Island immediately after Sandy (top), and what it normally looks like (bottom). Photo courtesy K.C. Wilsey/FEMA.

Sandy's impacts in the United States were unusually severe and widespread, owing to its nearly 1000-mile swath of gale-force winds near the time of landfall. It was the deadliest tropical cyclone to strike the United States outside of the South in 40 years, with 72 U.S. deaths directly attributed to Sandy. A majority of these deaths (41 out of 72, or 57%) were due to storm surge along the Atlantic coast, but 20 of the deaths (28%) were due to falling trees. In addition, 87 indirect deaths were caused by Sandy, mostly related to loss of power during cold weather. At least 650,000 homes were damaged or destroyed in the U.S., approximately half in the state of New York near or along the coastline. So many homes were lost because of the exceptional storm surge, aided by a full-moon tide, which reached a record 9.40 feet (2.87 m) above normal at the Battery on the southern tip of Manhattan Island (**Figure 2**). Similar high tides occurred just to the right of Sandy's eye at landfall; before it failed, the Sandy Hook tide gauge recorded a surge of 8.57 feet (2.61 m) above normal. Throughout the region, from New Jersey to Connecticut, barrier islands were inundated and in some cases breached; coastal areas were flooded to depths of several feet.

Superstorm Sandy caused approximately \$50 billion in damage and was the second-costliest hurricane in U.S. history, topped only by Hurricane Katrina in 2005. About 8.5 million people lost electrical power during the storm, most of them in the hard-hit regions of New Jersey and New York; some customers were without power for months. It was the worst disaster in the history of the New York subway system. The flooding and power outages due to Sandy also closed the New York Stock Exchange for 2 days, the NYSE's longest closure in 124 years. It is in the context of this extreme event that we now turn to the public's response on Twitter.

## **3. Data and methodology**


As a reminder, data for this study are Twitter posts, which are uncensored public utterances on a social media platform. Readers are advised of more-than-occasional strong language that is inevitably included in this narrative.

According to Pew Research, there were over 20 million tweets about Superstorm Sandy from October 27, 2012 through November 1, 2012 [8]. Analyzing that volume of data was beyond the scope of this project. Instead, we chose to isolate subsets of this immense trove of tweets and eventually created our own software to examine, both qualitatively and quantitatively, a sizable subset of the full trove. The intent was to strike an intermediate balance between breadth of tweets and depth of analysis, rather than either crunching statistics on a huge dataset or scrutinizing a small number of tweets in fine detail. Our approach merges statistical analysis with informed qualitative impressions based on the personal reading of thousands of tweets, made efficient by our software.

To this end, a third-party Twitter export service, GNIP, was used to acquire the Twitter data needed for this study. The service used the two keywords "sandy" and "superstorm" to sift through all of the tweets posted between October 25, 2012, and November 2, 2012, tag the tweets that contained one or both of those keywords, and export the tagged tweets to two JavaScript Object Notation (JSON) files. A JSON file is a plain-text representation of JavaScript objects. These files are referred to below as "the full dataset" or "the complete dataset," and contain approximately 185,000 individual tweets. To our knowledge, this wealth of data makes our research one of the more comprehensive analyses to date of social media during the Sandy event.
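The tagging step described above can be sketched as follows. This is an illustrative Python sketch, not the actual export pipeline; the JSON field name `text` is an assumption, since the real GNIP export schema may differ.

```python
import json

# Hypothetical sketch of the keyword-tagging step: read a line-delimited
# JSON export and keep tweets containing either keyword. The field name
# "text" is an assumption; the actual GNIP export schema may differ.
KEYWORDS = ("sandy", "superstorm")

def tag_tweets(lines):
    tagged = []
    for line in lines:
        tweet = json.loads(line)
        text = tweet.get("text", "").lower()
        if any(keyword in text for keyword in KEYWORDS):
            tagged.append(tweet)
    return tagged

sample = [
    '{"id": 1, "text": "Superstorm Sandy is coming"}',
    '{"id": 2, "text": "Nice weather today"}',
]
print([t["id"] for t in tag_tweets(sample)])  # [1]
```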

Two methodologies were pursued in the course of our research, and are discussed in detail below.

## **3.1. Sentiment analysis methodology**

## *3.1.1. Data*

Initial analysis of the dataset began with the open-source software OpinionFinder [9], which can identify subjectivity and positive or negative sentiment in phrases. This software is used widely in multiple disciplines [10]. Tweets are classified on a numeric scale, with positive tweets set greater than zero, negative tweets set less than zero, and neutral tweets set to zero. Problems inherent in the OpinionFinder classifying system are described in Ref. [11]. This study assumes that OpinionFinder correctly identifies tweets as positive, negative, or neutral, but also notes the tendency of OpinionFinder to over-classify tweets as neutral. Fortunately, a neutral zero does not skew the data. Numerous other studies have used OpinionFinder analysis of tweets to conduct research, including some work on Superstorm Sandy [12].

A small subset of the full dataset, tweets on Manhattan Island, was examined. The point of this component of the research was to track the evolution of sentiment over time; therefore, classified tweets were divided into equal time intervals. The first analysis uses 17 time intervals of 12 hours each, while the second compares a before-event interval to a during/after-event period. Twelve-hour intervals were chosen for simplicity while maintaining high enough time resolution to denote change. The two 12-hour periods from 1200 UTC 29 October to 1200 UTC 30 October also encompass an appropriate time period of the direct impacts from Sandy on Manhattan. The time of 1200 UTC 29 October was chosen as the mark between pre-event and event based on Manhattan weather observations.
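The binning step can be sketched as below. This is an illustrative Python sketch (not the study's actual code); each classified tweet is assumed to carry a UTC timestamp and an OpinionFinder-style score.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative sketch: sum classified sentiment scores into 12-hour
# bins. The start time and the (timestamp, score) tuples are invented
# for illustration.
START = datetime(2012, 10, 25, 0, 0)   # start of the study window
INTERVAL = timedelta(hours=12)

def bin_scores(classified):
    sums = defaultdict(int)
    for timestamp, score in classified:
        index = int((timestamp - START) / INTERVAL)  # 12-hour bin index
        sums[index] += score
    return dict(sums)

data = [
    (datetime(2012, 10, 25, 3, 0), 1),    # pre-event, positive
    (datetime(2012, 10, 29, 18, 0), -1),  # near landfall, negative
    (datetime(2012, 10, 29, 20, 0), -1),
]
print(bin_scores(data))  # {0: 1, 9: -2}
```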

Classified tweets for each time period were then aggregated using ArcMap to census tracts in Manhattan, with each tract taking the sum of the classified tweets (i.e., tracts with a higher proportion of positive tweets to negative tweets obtain a more positive rank). Aggregating the data points to census tracts made the data more manageable and helped to smooth out pockets of many tweets versus areas with fewer tweets. Census tracts also help to distribute the data by population, because tracts are roughly similar in population count. The census tracts in Manhattan are small enough in size that it can be assumed residents within each tract experienced similar effects from Sandy.
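The aggregation itself reduces to a per-tract sum, which can be sketched in a few lines of Python (the study used ArcMap; the spatial join assigning tweets to tracts is assumed already done, and the tract IDs below are hypothetical):

```python
from collections import defaultdict

# Minimal sketch of the aggregation step: each classified tweet carries
# a census-tract ID (from an assumed spatial join) and a score of +1,
# 0, or -1; each tract takes the sum of its tweets' scores.
def aggregate_by_tract(classified_tweets):
    totals = defaultdict(int)
    for tract_id, score in classified_tweets:
        totals[tract_id] += score
    return dict(totals)

# Hypothetical tract IDs for illustration only.
tweets = [
    ("36061000100", 1), ("36061000100", -1), ("36061000100", -1),
    ("36061000201", 1),
]
print(aggregate_by_tract(tweets))  # {'36061000100': -1, '36061000201': 1}
```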

Bias is inherent in using data such as tweets. Here, the database is obviously biased towards those with internet access via desktop computers (and especially access to smartphones, because power outages occurred), and is presumably also biased toward a younger demographic that utilizes Twitter. For this study, we assume that the opinions reflected in the tweets adequately reflect the opinions of others in the area who did not have access to Twitter.

## *3.1.2. Statistical analysis*


The first step in analyzing the evolution of the opinions over time was to determine whether a statistically significant change occurred. To accomplish this, an analysis of variance (ANOVA) model was run using the RStudio statistical IDE [13]. ANOVA can effectively determine whether one or more of the time intervals has a statistically different positive/negative opinion. In our work, the null hypothesis was that no change in opinion occurred over the 9 days.
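The one-way ANOVA F-statistic underlying this test can be illustrated in plain Python (the study itself used R; the toy sentiment data below are invented for illustration):

```python
# Illustrative one-way ANOVA F-statistic in plain Python. Each group
# holds the sentiment scores from one time interval; the data are toy
# values, not the study's results.
def anova_f(groups):
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

before = [1, 1, 0, 1]     # mostly positive interval
after = [-1, -1, 0, -1]   # mostly negative interval
print(anova_f([before, after]))  # 18.0
```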

To further assess an evolution in opinion, a second test was used to determine if there was a difference in opinion between two time periods: before Sandy's impacts were felt in Manhattan, and during/after the effects began to be felt. In order to determine which test is appropriate for measuring the difference between these time pairs, the data were tested for normality using the Kolmogorov-Smirnov (K-S) Goodness-of-Fit test. If the K-S test found the data to be non-normal, the Wilcoxon Matched Pairs Signed-Ranks test was used to assess the difference. (The Wilcoxon Matched Pairs Signed-Ranks test is appropriate for determining whether there are significant differences in a pair of non-parametric data such as this.) Again, the null hypothesis was that there was no change in opinion before and after Sandy's impacts. Finally, sums of the classified opinions for each of the 12-hour time intervals were plotted using R's barplot command. This allowed for visualization of the data and helped determine the directionality of the alternative hypothesis.
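The Wilcoxon statistic itself is straightforward to compute: rank the absolute paired differences (dropping zeros, averaging tied ranks) and take the smaller of the positive and negative rank sums. The following Python sketch is illustrative only; the study used R, and p-values would come from a statistics package.

```python
# Illustrative Wilcoxon matched-pairs signed-ranks statistic in plain
# Python; input data below are invented for demonstration.
def wilcoxon_statistic(before, after):
    # drop zero differences, as the test requires
    diffs = [a - b for b, a in zip(before, after) if a != b]
    by_abs = sorted(diffs, key=abs)
    # assign average ranks to tied absolute differences
    ranks = [0.0] * len(by_abs)
    i = 0
    while i < len(by_abs):
        j = i
        while j < len(by_abs) and abs(by_abs[j]) == abs(by_abs[i]):
            j += 1
        avg = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w_plus = sum(r for d, r in zip(by_abs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(by_abs, ranks) if d < 0)
    return min(w_plus, w_minus)

print(wilcoxon_statistic([10, 20, 30, 40], [12, 18, 33, 40]))  # 1.5
```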

## **3.2. TweetReviewer methodology**

## *3.2.1. Data*

After our initial OpinionFinder effort, we decided to explore the full dataset in more breadth and depth. To accomplish this more in-depth analysis, the JSON files were imported into a MySQL database using an import program written in PHP.
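The import step amounts to parsing each JSON record and inserting it as a row. A minimal Python sketch follows, using SQLite as a stand-in for MySQL (the actual import program was written in PHP; the field names are assumptions):

```python
import json
import sqlite3

# Sketch of the JSON-to-database import using SQLite as a stand-in for
# MySQL. Records and field names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tweets (id INTEGER PRIMARY KEY, body TEXT)")

records = [
    '{"id": 1, "text": "sandy is here"}',
    '{"id": 2, "text": "superstorm incoming"}',
]
for line in records:
    tweet = json.loads(line)
    conn.execute("INSERT INTO tweets VALUES (?, ?)",
                 (tweet["id"], tweet["text"]))
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM tweets").fetchone()[0])  # 2
```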

## *3.2.2. Quantitative and qualitative analysis*

Once the data were in the MySQL database, a tool was needed to aid in reviewing the tweets, sorting them as relevant (pertaining to Superstorm Sandy in some manner) or irrelevant (not referring to Superstorm Sandy) to the project, and bookmarking the tweets of interest. The program TweetReviewer was built by the second author on the .Net 4.0 Framework and written in C# for this purpose (**Figure 3**).

A set of filter words was created and plugged into the program to help determine the relevance of the numerous tweets. The filter words are listed in the Appendix. These filters were used to aid the researchers as they went through the tweet dataset by hand to determine the relevance of the tweets the filters did not tag. The tweets deemed relevant, either through the filter-tagging process or by hand, were sorted together. The non-relevant tweets were marked accordingly and separated from the relevant tweets. However, none of the non-relevant tweets were deleted; the entire database of tweets was archived and retained.
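The automated part of this pass can be sketched as a simple word-set match; tweets matching any filter word are tagged relevant and the rest are left for review by hand. The filter words below are placeholders, not the actual Appendix list.

```python
# Hedged sketch of the relevance filter; the words here are placeholder
# assumptions, not the study's actual filter list.
FILTER_WORDS = {"hurricane", "storm", "flood", "surge", "landfall"}

def is_relevant(text):
    words = text.lower().split()
    return any(w.strip('.,!?#@"') in FILTER_WORDS for w in words)

tweets = ["The storm surge is rising!", "Lunch was great today"]
relevant = [t for t in tweets if is_relevant(t)]
print(relevant)  # ['The storm surge is rising!']
```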

**Figure 3.** TweetReviewer software graphical user interface.

Once the relevant and non-relevant tweets were separated, new keywords were searched using MySQL's search capabilities to identify the number of times each keyword was used. This process helped provide a clearer picture of the perception of Superstorm Sandy through the lens of Twitter.
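A keyword-count query of this kind can be sketched as follows, again with SQLite standing in for MySQL and a few invented rows:

```python
import sqlite3

# Sketch of the keyword-count queries run against the tweet database
# (MySQL in the study; SQLite here as a stand-in, with invented rows).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tweets (body TEXT)")
conn.executemany("INSERT INTO tweets VALUES (?)", [
    ("only a cat 1, nothing to worry about",),
    ("is the storm over yet?",),
    ("storm surge looks bad",),
])

def keyword_count(term):
    query = "SELECT COUNT(*) FROM tweets WHERE body LIKE ?"
    return conn.execute(query, ("%" + term + "%",)).fetchone()[0]

print(keyword_count("storm"))  # 2
```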

The posts have been analyzed spatially as well as temporally, with word counts and word clouds as a function of day and time scrutinized as quantitative measures of public responses. We have also examined tweets individually by the thousands using the database, which has permitted the authors to develop qualitative insights into the public response that would be difficult or impossible without simple visualization and bookmarking tools.

## **4. Results**

## **4.1. Sentiment analysis**

Analysis using ANOVA allowed for a rejection of the null hypothesis that no change in opinion occurred over the 9 days examined. ANOVA showed that there was a significant difference between the opinions of the different 12-hour time intervals. As seen in **Table 1**, based on an *F*-value of 3.971 calculated using [14] and a *p*-value of 0.0464, a difference was found between the time intervals that is significant at the 95% confidence level.

The second set of tests included the Kolmogorov-Smirnov (K-S) Goodness-of-Fit test, and Wilcoxon Matched Pairs Signed-Ranks test. Results from the K-S tests, shown in **Table 2**, demonstrate that the data are not normally distributed. *p*-Values for both "before" and "after" data are vanishingly small (<< 0.001), indicating that the null hypothesis can be rejected and a non-parametric test statistic would be more appropriate, such as the Wilcoxon Matched Pairs Signed-Ranks test.

Analysis of the Twitter Response to Superstorm Sandy: Public Perceptions, Misconceptions, and Reconceptions of an Extreme Atmospheric Hazard http://dx.doi.org/10.5772/64019 29


**Table 1.** ANOVA results for Manhattan Island sentiment analysis. An asterisk indicates that the result is significant at the 95% confidence level.


**Table 2.** Lilliefors (Kolmogorov-Smirnov) Normality test results for the Manhattan Island sentiment analysis.

The results of the Wilcoxon Matched Pairs Signed-Ranks test give p << 0.001 (**Table 3**). A *p*-value this small indicated that, once again, the null hypothesis can be rejected. We can confidently say that opinions for the time period before Sandy's effects were felt in Manhattan differ from the period during and after effects were felt.


**Table 3.** Wilcoxon Matched Pairs Signed-Ranks test results for the Manhattan Island sentiment analysis.


**Figure 4.** Sentiment analysis of opinion change in tweets on Manhattan Island from before Sandy to during/after Sandy's impact on the area.

**Figure 4** demonstrates this change geographically across Manhattan Island. The most negative changes occurred south of Central Park, broadly consistent with the region that experienced a power blackout due to Sandy [15]. Lesser negative trends, and some positive trends, were found most often north of Central Park, further from the blackout and from the Atlantic coast.

In addition, sums of opinions were plotted over time using R's barplot command. The resulting diagram (**Figure 5**) allows a clear visualization of the evolution of opinions over time. The bar graph shows that overall opinions through 1200 UTC on 28 October tended to be positive, with the exception of slightly negative tweets from 1200 UTC 26 October to 1200 UTC 27 October. After 1200 UTC 28 October, tweets became much more negative, reaching their negative peak between 1200 UTC 29 October and 1200 UTC 30 October. Tweets remained negative in sentiment until the end of the period.

Overall, the results indicate that public opinion of Sandy did change in the Manhattan area. It appears that public opinion, expressed through tweets, was more lighthearted before effects from Sandy were felt in Manhattan. This correlates with observed skepticism from many in the path of the storm prior to actual impacts (see the following section). Interestingly, tweets became more negatively skewed 24 hours prior to actual impacts on the local Manhattan area. This could be related to non-meteorological impacts from Sandy that occurred prior to the storm's direct impacts. As stores, subways, and other services closed, public opinion may have begun to shift. These early negative tweets could also be the result of evacuation orders, or news of destruction as Sandy swept up the East Coast. Further tests would be needed to determine the impacts of these and many other variables. It is clear, however, that the ratio of positive to negative opinions became most negative at the time when Sandy's impacts were being felt in Manhattan (1200 UTC 29 October–1200 UTC 30 October).

**Figure 5.** Temporal analysis of sentiment (positive or negative) on Manhattan Island in 12-hour increments from 25 October 2012 through 2 November 2012.

## **4.2. TweetReviewer analysis**


The results uncovered by sentiment analysis failed to capture other insights that the researchers felt were important but were outside the scope of OpinionFinder. This led to the creation and use of TweetReviewer software to allow us to examine, both quantitatively and qualitatively, other aspects of the full database.

While analyzing the full database, it was determined there were two common categories of interest with regard to the public's perceptions of Superstorm Sandy: scientific misconceptions, and an anthropomorphic reconception of Sandy that, in the parlance, "went viral."

## *4.2.1. Misconceptions*

There were many misconceptions regarding Hurricane Sandy and hurricanes in general in the dataset. One quotidian issue was the common misspelling of the word "hurricane." In fact, the word "hurricane" was misspelled so often that the list of filtered words had to be adapted to accommodate the many misspellings (e.g., see #51 in the Appendix). More substantively, there was a general lack of understanding of what defines a hurricane. These misconceptions can be separated into four categories: *strength, category, size*, and *duration*.

The strength of a hurricane is most commonly associated with the Saffir-Simpson Hurricane Wind Scale categories. Overall, the Twitter users had a fairly decent grasp of the numerical scale itself, but understanding of what those numbers stood for was almost entirely absent. Throughout the dataset (**Table 4**) were tweets, many from Florida, that downplayed Hurricane Sandy, claiming there was little to worry about because it was "only cat 1." Instead of acknowledging the force of the hurricane, users dismissed the lower category numbers even though the Saffir-Simpson Scale is based on wind only. It does not take into account the damage from other impacts like storm surge and rainfall, and neither did the Twitter users. A few tweets mentioned Hurricane Sandy as having a storm surge more often associated with a Category 3 or Category 4 hurricane, such as this (tweet #76357 in the full database):


"Irene was a category 3. Sandy is a 1 but with the storm surge it's supposed to act as a 4 at most."

**Table 4.** Tweet word counts related to hurricane strength.


**Table 5.** Tweet word counts related to hurricane size.

| Date | Tornado | Touchdown | Landfall | How long |
|---|---|---|---|---|
| 25 Oct. 2012 | 13 | 0 | 10 | 0 |
| 26 Oct. 2012 | 21 | 0 | 11 | 1 |
| 27 Oct. 2012 | 33 | 1 | 10 | 1 |
| 28 Oct. 2012 | 126 | 7 | 22 | 2 |
| 29 Oct. 2012 | 213 | 4 | 142 | 13 |
| 30 Oct. 2012 | 170 | 2 | 43 | 10 |
| **Total** | **576** | **14** | **238** | **27** |

**Table 6.** Tweet word counts related to hurricane duration.

| Date | Over/over yet | Finish | Finally | Bring it | Passed/past |
|---|---|---|---|---|---|
| 25 Oct. 2012 | 28 | 1 | 0 | 2 | 2 |
| 26 Oct. 2012 | 69 | 2 | 3 | 13 | 3 |
| 27 Oct. 2012 | 76 | 2 | 6 | 35 | 6 |
| 28 Oct. 2012 | 244 | 17 | 17 | 112 | 22 |
| 29 Oct. 2012 | 784 | 38 | 60 | 191 | 60 |
| 30 Oct. 2012 | 785 | 30 | 67 | 23 | 82 |
| **Total** | **1986** | **90** | **153** | **376** | **175** |

The size and duration of Hurricane Sandy was also frequently misunderstood (**Tables 5** and **6**). Very few tweets implied that the users fully grasped the sheer size of any hurricane, let alone the immense size of Sandy. They appeared to assume that a hurricane was a small storm that would be bad, much like a supercell or "derecho-sandy thing" (tweet #4885) that would "touchdown" or "touch land" (tweet #100800), wreak havoc, close schools, and leave, all within a matter of hours. There were several tweets that asked if the storm was "over yet?" Hardly any of the Twitter users seemed to grasp the fact that a hurricane is actually a huge storm spanning hundreds of miles which can last for days. Some tweets even called Hurricane Sandy a tornado (tweets #38455 and #49032) or, as one CEO put it, a "tornadocaine" (tweet #5267).

Among many visual misconceptions propagated during Superstorm Sandy, one of the most prominent and widely shared was a Photoshopped image of a supercell thunderstorm over the Statue of Liberty in New York City (e.g., tweet #60774). The supercell thunderstorm was indicated to be Sandy—another sign of confusion regarding the vast differences in size, strength, and duration between hurricanes and tornadic thunderstorms. This misconception became so popular on social media that a story about it appeared in the Philadelphia media [16].

## *4.2.2. The reconception of Sandy*

As Superstorm Sandy approached the mid-Atlantic coast, a number of Twitter users began to do something peculiar, at least from the perspective of scientists or emergency managers. Instead of relaying factual information regarding the storm, accounts pretending to be the personification of Hurricane Sandy started appearing, as venues for posting jokes about the hurricane from a first-person perspective. With the help of these accounts, the Twitter community took the idea of personifying Sandy and expanded upon it. Without any visible evidence of premeditation regarding the nature of this anthropomorphic Sandy, the Twitter community banded together and simply accepted a persona of its own creation without question. Users also hurled insults at the hurricane based on its fabricated persona and its perceived status as a female.

Sandy was suddenly no longer an impersonal, inanimate hurricane; she was an "independent sassy black hurricane who don't need no man" (tweets #13365 and 29 subsequent tweets), who grew up in the ghetto, went to school with (Hurricane) Irene, cursed up a storm, voted Democratic, knew how to work the pole, had an avid and colorful sex life, and was the brunt of many cruel taunts related specifically to female anatomy. **Table 7** provides a sampling of word counts related to this persona in the days leading up to landfall.


**Table 7.** Tweet word counts related to the invented persona of Sandy.

The various Twitter accounts created to portray this user-crafted persona encouraged its incredibly fast proliferation and popularity across the website. They posted tweets which consisted mainly of rather dirty jokes, inappropriate suggestions, racial slurs, female-specific insults, and Republican-specific insults. One of the aforementioned tweets talked about the Sandy persona tossing a trailer at a woman in a minivan simply because she had a bumper sticker of a Republican presidential candidate on her car (tweet #21587).

It is possible that the timing of the hurricane's landfall, within a week of the 2012 presidential elections when tensions between the Democratic and Republican parties were already high, amplified the politically slanted comments. But the outright cruelty and twisted content went beyond simple political reasons. To make matters worse, these offensive tweets were shared hundreds of times by users who were apparently from a variety of ages, races, and political beliefs.

The tweets referring to the Hurricane Sandy persona consistently referred to her as a "bitch" and joked about her coming to "blow" the entire east coast and make everyone "wet," or simply told her to "fuck off." This was such an exceedingly common theme that the words "bitch," "blow," "wet," and "fuck" became keywords both for the filters during the process of going through the tweets and for searching their word counts. Of these four words, "wet" was used least, appearing 544 times. The word "blow" appeared 1129 times, "fuck" 3655 times, and "bitch" led all epithets in the full database, appearing 4335 times with regard to Hurricane Sandy.

These word counts are, of course, small compared to the total number of words used in the full database. To give a sense of how the persona of Sandy dominated the "Twittersphere," **Table 8** presents the fractional representation of scientific/hurricane-related terms in the full database versus the top three persona terms. As shown in the table, the persona was many times more popular on Twitter than were factual reports about Sandy.


| NWS | MPH | Category/cat | Landfall | Windspeed |
|---|---|---|---|---|
| 0.022 | 0.133 | 0.387 | 0.079 | 0.0001 |

**Table 8.** Ratio of word count of scientific terms to the word counts of the three most common Sandy-persona terms.
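The ratios in Table 8 are simple term-frequency ratios: the count of each scientific term divided by the combined count of the top three persona terms. A minimal sketch of that computation, using a hypothetical four-tweet mini-corpus in place of the full database (the term lists are taken from the text; the function name is ours):

```python
from collections import Counter
import re

# Hypothetical mini-corpus; the real study used a far larger set of geo-located tweets.
tweets = [
    "this bitch sandy about to blow away the east coast",
    "category 1? sandy is still gonna wreck the coast",
    "nws says landfall tonight, windspeed 90 mph",
    "fuck sandy, lost power already",
]

def term_counts(tweets, terms):
    """Count whole-word occurrences of each term across all tweets."""
    counts = Counter()
    for tweet in tweets:
        words = re.findall(r"[a-z0-9#@']+", tweet.lower())
        for term in terms:
            counts[term] += words.count(term)
    return counts

persona_terms = ["bitch", "fuck", "blow"]            # top three persona epithets
scientific_terms = ["nws", "category", "landfall", "windspeed"]

persona_total = sum(term_counts(tweets, persona_terms).values())
for term, n in term_counts(tweets, scientific_terms).items():
    print(f"{term}: {n / persona_total:.3f}")
```

On the real corpus, the small ratios in Table 8 fall out directly from the overwhelming persona counts reported above.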

## **5. Conclusions**

Our results shed light on public perceptions, misconceptions, and reconceptions of an extreme atmospheric hazard. Superstorm Sandy was a virtually unprecedented event along the Atlantic coast of the United States, in terms of intensity, size, and path. What did the public make of this event on Twitter?

From our analysis of Manhattan Island data, it is plausible that residents underrated the storm's ferocity until the last 24 hours before landfall. The largest sentiment swings occurred, quite naturally, in and near the regions most affected by the storm: the coastline and the south Manhattan neighborhoods blacked out due to a power plant failure.

Our more fine-grained analysis using the TweetReviewer software revealed additional aspects of the public's reaction to Sandy. Profound confusion exists regarding the size, strength, and duration of hurricanes. The public seems to confuse hurricanes with tornadoes; this confusion during Sandy extended to the Photoshopped image of a supercell thunderstorm over the Statue of Liberty. Expectations that Sandy would be as brief and intense as a tornado were not met; in particular, the unusually large extent of Sandy (the largest storm in terms of diameter of gale-force winds since records began in 1988; see [7]) was not well understood by those on Twitter. The more subtle point that Sandy could be "only a Category 1" and still do extensive damage due to storm surge was also not grasped.

What the "Twittersphere" did seem to eagerly grasp was a user-generated anthropomorphic "Sandy" who dealt out death and destruction like a villain in a superhero comic or movie. This personified "Sandy" was then made the object of race-based and gender-based slurs that were widely perceived to be amusing, rather than offensive if they had been made in public. This created an increased level of noise that tended to drown out the true signal: reports of important developments in Hurricane Sandy's category changes, the watches, advisories, and warnings, and tweets with legitimate scientific information that could have better informed the public. Our results thus align more with those of [5] than with the more positive, life-saving impacts of Twitter found by other researchers. Our analysis of geo-located tweets may bias our results somewhat in this respect, however (R. Morss, pers. comm., January 2016).

When much of New York City lost electricity, the theme of attacking the hurricane based on its manufactured persona quickly fell by the wayside in favor of complaining about the power loss and realizing Sandy was in fact a hurricane and not a fictitious creation. The persona continued to play a substantial role in the dataset, but the newest point of interest was the loss of power. In fact, the word "power" was found 8649 times in our database, making it more popular than "bitch."

We conclude, with some surprise, that until the hurricane interfered directly with people's personal lives, Twitter users seemed content to obsess over the invented persona of Hurricane Sandy. Rather than bemoaning this flight from reality, however, we encourage a more proactive response among emergency management personnel, meteorologists, and others who communicate directly with the public. Perhaps this behavior can be harnessed for the public's benefit. If the public could quickly create and propagate a persona for a hurricane, it stands to reason that official outlets could do the same: create a Twitter account for a new hurricane, craft their own persona for it, and use it to disseminate the important, relevant information to the public in a format that is more easily digestible than esoteric scientific criteria.

For example, a large swath of the U.S. population is familiar with superheroes on some level, whether DC, Marvel (or for even younger audiences, Pokémon). What if scientists began using popular superheroes to help describe the strength of a storm? As outlandish as it sounds, this is something the general public would grasp fairly easily, simply because they are already familiar with the characters. For instance, intense heat could be described as being on par with an attack by Marvel's Human Torch, a member of the Fantastic 4 superhero group. A powerful electrical storm could be compared to Marvel superhero Thor wielding Mjolnir in battle. The incredible storm surge of a hurricane such as Sandy could be compared to the DC supervillain Ocean Master or New Wave in combat. Similar translations of specific hazards into personae would be possible in Pokémon, reaching even younger audiences.

These examples may come across as childish, but the public would easily grasp a general idea of the intensity of the heat, the electrical storm, and the storm surge from them. More importantly, children would easily comprehend these examples. A parent can disregard weather alerts, but not if their children continually bother them. If this method of using pop-culture references to explain something as confusing as weather can bridge the communication gap to the next generation, then they could learn to listen to and obey weather alerts. This "if you can't beat 'em, join 'em" approach to anthropomorphizing atmospheric hazards could capture some of the social media energy that might otherwise propel a completely non-factual (and offensive) personification to prominence, as occurred with Sandy.

Finally, we advocate the use and/or development of software such as our TweetReviewer as a means for visually inspecting thousands of tweets easily and efficiently. As our research indicates, actually reading the tweets provides insights that are unlikely to be gained by mere statistical crunching on datasets. The filtering capabilities of TweetReviewer enable the user to focus on relevant tweets and screen out non-relevant tweets, significantly accelerating the process and permitting human analysis of relatively large datasets.
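The filters described above (and listed in the Appendix) are essentially plain substring tests, some of them conditional. A minimal sketch of how such a rule list can be applied to screen tweets, using an illustrative subset of the appendix rules (the function name is ours, not TweetReviewer's):

```python
def is_relevant(tweet: str) -> bool:
    """Apply a small, illustrative subset of the appendix filter rules."""
    t = tweet.lower()
    if "hurricane sandy" in t or "#hurricanesandy" in t:             # rules 3-4
        return True
    if "#sandy" in t and ("springs" not in t or "beach" not in t):   # rule 2
        return True
    if "obama" in t and "fema" in t:                                 # rule 45
        return True
    if "frankenstorm" in t or "jersey shore" in t:                   # rules 58, 5
        return True
    return False

print(is_relevant("Hurricane Sandy is about to make landfall"))  # True
print(is_relevant("having a great day, no storms here"))         # False
```

In practice a reviewer tool would run every rule in the list and flag a tweet as relevant if any rule fires, which is what makes visual inspection of only the surviving tweets tractable.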

## **Acknowledgements**

The first author gratefully acknowledges the support of an M.C. Michael Award from the University of Georgia, which funded much of the research presented here.

## **Appendix**

Filter words used in our TweetReviewer software to narrow the database to the most relevant posts are:


**15.** "#fyousandy"

alerts, but not if their children continually bother them. If this method of using pop-culture references to explain something as confusing as weather can bridge the communication gap to the next generation, then they could learn to listen to and obey weather alerts. This "if you can't beat 'em, join 'em" approach to anthropomorphizing atmospheric hazards could capture some of the social media energy that might otherwise propel a completely non-factual (and

Finally, we advocate the use and/or development of software such as our TweetReviewer as a means for visually inspecting thousands of tweets easily and efficiently. As our research indicates, actually reading the tweets provides insights that are unlikely to be gained by mere statistical crunching on datasets. The filtering capabilities of TweetReviewer enable the user to focus on relevant tweets and screen out non-relevant tweets, significantly accelerating the

The first author gratefully acknowledges the support of an M.C. Michael Award from the

Filter words used in our TweetReviewer software to narrow the database to the most relevant

offensive) personification to prominence, as occurred with Sandy.

36 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts

process and permitting human analysis of relatively large datasets.

University of Georgia, which funded much of the research presented here.

**2.** "#sandy" if either "springs" or "beach" is not present.

**3.** "#hurricanesandy" or "#hurricainesandy"

**Acknowledgements**

**1.** "red cross" or "redcross"

**7.** "hurricane death megatron" **8.** "sandy aftermathpocalypse"

**9.** "halloweenpocalypse"

**4.** "hurricane sandy"

**5.** "jersey shore"

**10.** "#postsandy" **11.** "#aftersandy" **12.** "#fucksandy"

**13.** "#fuckyousandy"

**14.** "#fusandy"

**6.** "#njsandy"

**Appendix**

posts are:


**45.** If the tweet contains both "Obama" and "Fema"

**46.** "fema"

**47.** "#damnsandy"

**48.** "#darnsandy"

**49.** "#ihatesandy"

**50.** If the tweet contains both "prayers" and "affected by sandy"

**51.** "@hurricannesandy"

**52.** "@ahurricanesandy"

**53.** "@sandydahurricane"

**54.** "@sandyshurricane"

**55.** "#survivingsandy" or "#survingsandy"

**56.** "#stormsandy"

**57.** "sandy storm"

**58.** "Frankenstorm"

**59.** "hurricane"

**60.** "bitch sandy"

**61.** "#pray"

**62.** "#hurricane"

**63.** If the entire tweet is "superstorm sandy xuo"

## **Author details**

John A. Knox1\*, Brendan Mazanec1, Emily Sullivan1, Spencer Hall1 and Jared A. Rackley2

\*Address all correspondence to: johnknox@uga.edu

1 University of Georgia, Athens, GA, USA

2 Oak Ridge National Laboratory, Oak Ridge, TN, USA

## **References**

[12] George Mason University. The GeoSocial Gauge [Internet]. 2016. Available from: http://geosocial.gmu.edu/projects/superstorm-sandy/ [Accessed: 2016-04-03]

[13] R Core Team. R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria, 2015. Available from: https://www.R-project.org/

[14] Free Statistics Calculators. Version 4.0 [Internet]. 2016. Available from: http://www.danielsoper.com/statcalc3/calc.aspx?id=4 [Accessed: 2016-04-03]

[15] New York Magazine. From The Editors [Internet]. 2012. Available from: http://nymag.com/nymag/letters/hurricane-sandy-editors-letter-2012-11/ [Accessed: 2016-04-03]

[16] Mathis J. That Awesome Hurricane Sandy Photo of the Statue of Liberty? Fake [Internet]. 2012. Available from: http://www.phillymag.com/news/2012/10/29/awesome-hurricane-sandy-photo-statue-liberty-fake/ [Accessed: 2016-04-03]

## **Communication and Preparedness Issues on Various Scales from Extreme Atmospheric Hazards**

Robert M. Schwartz

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/63447

## **Abstract**

Most federally declared disasters result from atmospheric hazards, including floods, tropical cyclones, tornadoes, and winter storms. Some of these hazards, such as tornadoes, are events with relatively short warning times, while others, such as tropical cyclones, allow sufficient warning. This research examined communication of weather information and personal preparedness following the Florida landfall of Tropical Storm Debby in 2012. Another case study examined emergency management issues such as preparedness and response after the 2011 tornado in Tuscaloosa, Alabama. The concentration was on emergency management agencies at the county, university, state, and federal levels. A sample of elderly residents of Pinellas and Pasco Counties in Florida completed a self-administered survey examining various means of receiving weather information along with hurricane preparedness actions. In-depth interviews were conducted with representatives of various agencies on different scales regarding preparedness and response following the Tuscaloosa tornado. The elderly used television as the primary means of receiving weather information, stressing the importance of utilizing both traditional and newer forms of communication to reach all citizens. One of the major issues at all levels following the Tuscaloosa tornado related to communications, such as resource allocations and response actions.

**Keywords:** disaster communications, tornado and hurricane preparedness, personal and community emergency management preparedness, methods of receiving weather information, emergency management response

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

## **1. Introduction**

Most federally declared major disasters are due to atmospheric hazards. Of the 3477 total declarations from 1953 to 2015, 2449, or 71%, were weather related [1]. There were 293 major disaster declarations from 2011 to 2015, and only 18 of these were nonweather related, such as earthquakes [2]. Meteorological disasters can stem from events such as, but not limited to, floods, tropical cyclones, tornadoes, or winter storms. Some of these hazards, such as tornadoes, are quick-fuse events with relatively short warning times, while others, such as tropical cyclones, provide sufficient warning. This research examined two different atmospheric hazards that were declared major disasters. The first studied means of communicating weather information and personal preparedness of elderly citizens following Tropical Storm Debby, which made landfall in Florida in 2012. An additional case study examined emergency management issues such as preparedness, response, recovery, and mitigation after the tornado in Tuscaloosa, Alabama, in 2011. The research covered emergency management organizations at the county, university, state, and federal levels.

## **2. Background**

## **2.1. Tropical cyclones**

Some of the most destructive storms are tropical cyclones. Based on the geographic location, tropical cyclones are known as typhoons over the western Pacific Ocean, cyclones over the Indian Ocean, and hurricanes over the Atlantic and eastern Pacific Oceans. Hurricane season in the North Atlantic basin (which impacts the United States) starts 1 June and ends 30 November, with the peak being around mid-September. The North Atlantic basin includes the Gulf of Mexico, Caribbean Sea, and Atlantic Ocean [3,4].

Tropical cyclones are classified by the organization of thunderstorm clusters, circulation patterns, and wind speeds. First there is a tropical wave, an unorganized cluster of thunderstorms with a weak surface circulation. A tropical disturbance is an organized cluster of thunderstorms, generally 150–350 miles (250–600 km) in diameter, that has no closed circulation and maintains an identity for 24 hours. A tropical depression has an identifiable pressure drop, a closed circulation, and wind speeds less than 39 mph (34 kts); the system is then assigned a number by the National Hurricane Center. If the storm continues to organize, with wind speeds from 39 mph (34 kts) up to 74 mph (64 kts), it is classified as a tropical storm and given a name. With continued development, the system becomes a hurricane when wind speeds equal or exceed 74 mph (64 kts) [3,4].

There are five major environmental factors that determine successful tropical cyclone development: sea surface temperature, a surface layer of warm water, weak vertical wind shear, sufficient moisture in the middle troposphere, and a location at least 5° north or south of the equator. Sea surface temperature must be greater than 80°F (26.5°C), as this supplies the heat and moisture released into the atmosphere. The layer of warm water is usually around 200 ft (60 m) deep, ensuring that enough warm water and energy remain as the ocean is mixed in a process known as upwelling. This depth keeps warmer water rising while cooler water stays in deeper regions and does not impede the energy source. Wind shear (winds from opposite directions or at too high a speed) must be relatively weak for the vortex to form; if shear is too strong, the vortex is torn apart and carried downstream. Dry air in the middle troposphere can weaken the storm by reducing latent heat release and increasing downdrafts through evaporative cooling; hence, a tropical cyclone requires enough moisture to ensure formation. Rotation is another important aspect of tropical cyclones and thunderstorm clusters. These clusters need to be at least 5° north or south of the equator so the Coriolis force is strong enough to help develop rotation; the Coriolis force is zero at the equator [4].
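As a brief aside, the latitude requirement follows directly from the standard expression for the Coriolis parameter,

$$f = 2\Omega \sin\varphi,$$

where $\Omega \approx 7.292 \times 10^{-5}\,\mathrm{s^{-1}}$ is Earth's rotation rate and $\varphi$ is latitude. At the equator ($\varphi = 0°$), $f = 0$, so no background rotation is available to organize a vortex; by $\varphi = 5°$, $f \approx 1.27 \times 10^{-5}\,\mathrm{s^{-1}}$, enough for rotation to develop.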

The Saffir-Simpson Scale classifies hurricanes based on wind speed on a scale of 1–5 with 5 the strongest. Before 2010, the scale predicted storm surge height and barometric pressure along with wind speeds. The wind scale was revised again in 2012 (**Table 1**) due to the rounding and conversions from mph to km/h; only category 4 and 5 storms were impacted with this update. Previous storms will not have their categories changed with the new scale. Expected damage generally increases with higher category levels. Category 3 and higher hurricanes are known as major hurricanes [4,5].


**Table 1.** Saffir-Simpson scale [5].
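The classification thresholds described above can be collected into a single lookup. A minimal sketch, assuming the post-2012 Saffir-Simpson Hurricane Wind Scale breakpoints (category 1: 74–95 mph; 2: 96–110; 3: 111–129; 4: 130–156; 5: 157 and above); the function name is ours:

```python
def classify(mph: float) -> str:
    """Classify a tropical system by maximum sustained wind speed (mph),
    using the naming thresholds from the text and the assumed post-2012
    Saffir-Simpson category breakpoints."""
    if mph < 39:
        return "tropical depression"      # closed circulation, numbered system
    if mph < 74:
        return "tropical storm"           # named at 39 mph (34 kts)
    # Hurricane categories: upper bound of categories 1-4 in mph.
    for category, top in enumerate([95, 110, 129, 156], start=1):
        if mph <= top:
            return f"category {category} hurricane"
    return "category 5 hurricane"         # 157 mph and above

print(classify(63))   # Tropical Storm Debby's peak winds -> "tropical storm"
```

Note that under this scheme category 3 and above are the "major hurricanes" mentioned in the text.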


Tropical cyclones are some of the most destructive storms. Their impacts include storm surge, heavy rain, inland flooding, high winds, and potential tornadoes. Storm surge refers to the rise in sea level as the hurricane makes landfall; it is caused by onshore winds and the barometric effect (a rise in sea level due to low pressure). Other factors are wave height, tides, and shoreline shape. Storm surge can cause extensive damage to the landscape and structures. If the tropical cyclone is slow moving or stalled, heavy rains can hit an area. Heavy rain can produce inland flooding, which causes the most fatalities along with property destruction. High winds can cause damage, especially to structures not built to withstand stronger tropical cyclone winds. There can also be hurricane-spawned tornadoes, which are often highly concentrated in the right, front quadrant (from the perspective of the hurricane approaching the shore). These tornadoes are usually in the EF0–EF2 range and can also cause property damage [4].

## **2.2. Tropical Storm Debby**

During the 2012 hurricane season, Tropical Storm Debby originated in the south-central Gulf of Mexico after a surface low developed near the Yucatan peninsula and propagated eastward toward an area where the prevailing subtropical ridge had weakened. The northern edge of a tropical wave in the Caribbean Sea merged with the disturbance near the Yucatan peninsula, eventually becoming Tropical Storm Debby on 22 June [6]. Hurricane Hunter aircraft determined that the circulation was well defined and the winds were of tropical storm strength on 23 June (**Figure 1**). Over the next 24 hours, Tropical Storm Debby moved slowly north to northeastward without a well-defined trajectory, forming a rain shield over the northeastern Gulf of Mexico on 24 June. On 25 June, Debby approached the Big Bend area of Florida (the area where the Panhandle and Florida peninsula curve on the Gulf of Mexico) and made landfall near Steinhatchee, Florida, on 26 June. Peak winds were estimated at 63 mph (55 kts) and minimum surface pressure was 990 mb [6].

**Figure 1.** Track positions of Tropical Storm Debby [6].

Winds were not the major impact from Debby; rather, the torrential rains caused major impacts, along with inland flooding in various parts of the Florida peninsula. A local observer in Wakulla County measured 29 in. (731 mm) during the event, and there were other reports greater than 20 in. (508 mm) in the same region (**Figure 2**). Wakulla County is northwest of the track line near the Big Bend, with the light blue shading. There were several totals greater than 10 in. (254 mm) over western and northeastern Florida (see **Figure 2**, south of the purple-shaded track line) [6].

Besides inland flooding, storm surges from 2 to 4.5 ft (0.6–1.4 m) occurred from the Florida Panhandle to the southwestern coast of Florida. This resulted in inundation 1–3 ft (0.3–0.9 m) above ground level, with the highest surge reported between Apalachicola and Cedar Key (see **Figure 2**, on and northwest of the purple-shaded track line) [6]. In addition to the heavy rain and flooding, the rain bands east of the center produced a number of tornadoes. NOAA's Storm Prediction Center (SPC) recorded 24 tornadoes, primarily rated at EF0, in central Florida on 23 June. On 24 June, tornadoes hit southern and central Florida, with several rated at EF1 and EF2 (**Figure 3**) [6].

**Figure 2.** Rainfall totals associated with Tropical Storm Debby [6].


**Figure 3.** Tornado tracks during Tropical Storm Debby [6].

Five direct fatalities and three indirect deaths were recorded with Debby. A mother was killed in Venus, Florida (west of Lake Okeechobee, the largest lake in Florida), after a tornado hit a mobile home. One person drowned in heavy surf in Pinellas County, Florida (west-central Florida on the Gulf of Mexico), and another in Orange Beach, Alabama. A canoe capsized near Lake Dorr, Florida (central Florida north of Orlando), and the occupant drowned; another man presumed to have drowned in the storm was found near Anclote Key, Florida (west-central Florida). The indirect fatalities were a man wading in floodwaters in Pinellas County, Florida, and two others in automobile crashes on roads left wet by Debby [6].

Most of the damage was from inland flooding caused by the heavy rains in parts of northern and central Florida. The Sopchoppy River in Wakulla County (northwest of the track line in **Figure 2**) crested at 36.8 ft (11.21 m) and affected 400 structures. There was flooding in Pasco County (south of the track line near the Gulf of Mexico in **Figure 2**) along the Anclote and Pithlachascotee Rivers, damaging 106 homes. The Suwannee River recorded its highest observations since Hurricane Dora in 1964. Roads such as US Highway 90 and Interstate 10 (north of the track line in **Figure 2**) were closed due to floodwaters: US 90 was closed for almost two weeks and Interstate 10 for two days [6]. In addition, several roads were closed in many counties, and there were sinkhole problems in Marion County (south of the track line in **Figure 2**) due to the heavy rains [7].

Coastal areas were affected by storm surge in the Panhandle, the Big Bend, and along US Highway 19 in Hudson, Florida, on the west-central coast of Florida. Some roads were submerged for days and others were washed out. There was extensive beach erosion from Pinellas County southward to Charlotte County, with the worst erosion on Treasure Island and Anna Maria Island [6].

Preliminary insured losses in Florida were \$105 million according to the Property Claims Services, with \$40 million in flood damage covered by the National Flood Insurance Program. Total damage could be around \$250 million, since insured values are typically doubled and flood losses added [6]. The Federal Emergency Management Agency (FEMA) approved 6758 Individual Assistance applications in the amount of \$27,800,267.48 and Public Assistance for communities of \$52,197,352.72. These figures do not include flood insurance claims [8].
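The \$250 million figure is consistent with doubling the insured wind loss and adding the separately insured flood claims; a quick check of that arithmetic (our reading of the estimate in [6]):

```python
insured = 105_000_000   # preliminary insured losses (Property Claims Services)
flood = 40_000_000      # NFIP flood damage, insured separately
# Total damage is commonly estimated as roughly twice the insured loss,
# plus flood losses:
total = 2 * insured + flood
print(f"${total:,}")    # -> $250,000,000
```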

## **2.3. US tornado background, activity, and measurement**

## *2.3.1. Tornado background*

Tornadoes are rapidly rotating columns of air extending from the cloud to the ground [4]. According to the American Meteorological Society [9], a tornado is "a violently rotating column of air, pendant from a cumuliform cloud or underneath a cumuliform cloud, and often (but not always) visible as a funnel cloud." By this definition, a tornado itself is invisible until debris is drawn into the funnel cloud [4].

All tornadoes come from thunderstorms, but not all thunderstorms form tornadoes [10]. Most tornadoes form in supercell thunderstorms but can also develop from hurricane thunderstorms, squall lines, and ordinary thunderstorms. Non-supercell types include landspouts, waterspouts, mesovortices, and gustnadoes [4]. Supercells are rotating thunderstorms that contain a mesocyclone (a circulation detectable on radar) and can spawn tornadoes along with hail, high winds, lightning, and heavy rain [10].

The typical life cycle of tornadogenesis begins with the dust-whirl stage, followed by the organizing stage; damage peaks in the mature stage, after which come the weakening stage and finally the rope stage. Environmental factors necessary for formation include vertical wind shear and horizontal rotation [4]. Uplift, together with temperature and pressure differentials, is also necessary.

Tornado widths are commonly 150 ft to 0.5 miles (50–800 m), with wind speeds ranging from 65 mph to greater than 200 mph (57 kts to over 174 kts) [4]; however, greater tornado diameters can occur. Most tornadoes are on the ground only briefly, often less than 10 minutes [10], but some persist for over an hour with damage paths over 30 miles (50 km) [4].
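The mixed units above (mph, knots, feet, meters) can be cross-checked with standard conversion factors. This is a hedged sketch of my own, not part of the chapter; small mismatches with the chapter's figures are rounding:

```python
# Standard conversion factors; the chapter's quoted values are rounded
# (e.g., 65 mph is ~56.5 kt, quoted as 57 kt; 150 ft is ~46 m, quoted as 50 m).
MPH_TO_KMH = 1.609344   # statute miles per hour -> kilometers per hour
MPH_TO_KT = 0.868976    # statute miles per hour -> knots
FT_TO_M = 0.3048        # feet -> meters

print(f"65 mph = {65 * MPH_TO_KT:.1f} kt, 200 mph = {200 * MPH_TO_KT:.1f} kt")
print(f"150 ft = {150 * FT_TO_M:.0f} m, 0.5 mi = {0.5 * 1609.344:.0f} m")
```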

A tornado can occur at any time of year in the United States, which averages around 1000 tornadoes annually. Peak season depends on location: it is early spring on the Gulf of Mexico coast, May and early June in the southern plains, and June and July in the northern plains and upper Midwest [10].

Other countries also experience tornadoes, but most are reported in the United States. Many of the regions associated with tornado activity are also major agricultural areas outside the tropics [4]. Countries with reported tornadoes include Canada, the United Kingdom, Bangladesh, Mexico, Argentina, Brazil, and Russia [10].

Five direct fatalities and three indirect deaths were recorded with Debby. A mother was killed in Venus, Florida (west of Lake Okeechobee, the largest lake in Florida), after a tornado hit a mobile home. One person drowned in heavy surf in Pinellas County, Florida (west-central Florida on the Gulf of Mexico), and another in Orange Beach, Alabama. A canoe capsized near Lake Dorr, Florida (central Florida north of Orlando), and its occupant drowned; another man presumed to have drowned in the storm was found near Anclote Key, Florida (west-central Florida). The indirect fatalities were a man wading in floodwaters in Pinellas County, Florida, and two others in automobile crashes on wet roads from Debby [6].

## *2.3.2. Tornado measurement and activity*

46 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts


Tornadoes are classified by the Enhanced Fujita (EF) Scale. Prior to 2007, the original Fujita Scale, developed by Dr. Theodore Fujita in 1971 as a forensic examination of structural damage, was used and was based on damage alone. Since 2007, the EF Scale has taken into account the type of building materials and construction, damage level, and estimated sustained wind speeds. EF categories range from EF0 to EF5 [4,10,11].

**Figure 4.** Percentage of all US tornadoes that occurred in each EF-scale category [4].

The majority of tornadoes in the United States are EF0 (25.6%) and EF1 (37.3%); EF4 and EF5 tornadoes are the rarest at 2.0% and 0.3% of all tornadoes, respectively (**Figure 4**) [4]. Wind speeds range from 65 to 85 mph (105–137 km/h) for EF0, 86 to 110 mph (138–177 km/h) for EF1, 111 to 135 mph (178–217 km/h) for EF2, 136 to 165 mph (218–266 km/h) for EF3, 166 to 200 mph (267–321 km/h) for EF4, and over 200 mph (over 322 km/h) for EF5 (**Table 2**). Tornado damage includes property damage to structures from high wind speeds as well as the heavy rain and hail from the parent thunderstorms, which can lead to flash flooding [4].
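The EF thresholds listed above can be written as a simple classifier. This is a sketch of my own, with the gust bounds taken from the chapter; the function and variable names are not from the source:

```python
def ef_category(gust_mph):
    """Map a 3-second gust (mph) to its EF number using the chapter's bounds."""
    if gust_mph > 200:          # EF5: over 200 mph
        return 5
    for ef, lower in ((4, 166), (3, 136), (2, 111), (1, 86), (0, 65)):
        if gust_mph >= lower:   # each category's lower bound in mph
            return ef
    return None                 # below EF0 (not tornado-strength wind)

print(ef_category(190))  # falls in the 166-200 mph EF4 range -> 4
```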


| EF number | 3-second gust (mph) | 3-second gust (km/h) |
|---|---|---|
| EF0 | 65–85 | 104.58–136.77 |
| EF1 | 86–110 | 138.38–176.99 |
| EF2 | 111–135 | 178.60–217.22 |
| EF3 | 136–165 | 218.83–265.49 |
| EF4 | 166–200 | 267.10–321.81 |
| EF5 | Over 200 | Over 321.81 |

**Table 2.** Enhanced Fujita Scale [11].

## **2.4. Tuscaloosa, Alabama, tornado**

A tornado outbreak consisting of 353 tornadoes in 21 states occurred between 25 April and 28 April 2011 [4]. This outbreak had more tornadoes than the 1974 Super Tornado Outbreak (148 tornadoes) and more fatalities than the Palm Sunday outbreak of 1965. There were also 2400 injuries and over \$4.2 billion in damages associated with this outbreak. In the southeastern United States, 122 tornadoes resulted in 313 fatalities in the afternoon and evening of 27 April; tornadoes that hit before dawn on the 27th added three more deaths for a total of 316. Areas affected on the 27th included central and northern Mississippi, central and northern Alabama, eastern Tennessee, northern Georgia, and southwestern Virginia. There were 15 violent tornadoes (EF4 or EF5), and eight had paths longer than 50 miles (about 80 km) (**Figure 5**). Two of these tornadoes in Alabama, one in the northern part of the state and the other striking Birmingham and Tuscaloosa, each caused more than 60 deaths [12].

The outbreak was forecast by the SPC five days before the event. Weather forecast offices (WFOs) in the area were also preparing for the threat of severe convective weather and tornadoes five days in advance. WFO outreach to emergency managers included discussions and tools such as "Hazardous Weather Outlooks, Web images, prerecorded multimedia briefings, and webinars that discussed the potential impacts" [12].

All of the tornadoes occurred within tornado watch and warning areas. Average lead time from watch to warning was 2.4 hours, while the time from watch issuance to the first significant tornado ranged from 3 to 6 hours in each area. Average warning lead time for the tornadoes was 22.1 minutes. All of the fatalities occurred inside watch and warning boxes [12].

Even with the forecasts from the SPC and WFOs, there were a high number of fatalities and injuries for several reasons. The tornadoes were violent and long-tracked and hit urban, suburban, and rural areas. The storms damaged warning sources, such as NOAA Weather Radio transmitters. Human behavior was also a major factor, as many individuals did not respond to warnings without additional confirmation or waited for visual confirmation before taking action. Furthermore, the storms moved at 45–70 mph (about 72–112 km/h), which left less time for those who waited to seek shelter, and for some, adequate shelter was not readily available [12].

**Figure 5.** Tornado tracks from 27 April 2011 outbreak [12].


**Figure 6.** Composite weather analysis 1200 UTC 27 April [12].

The active April severe weather event in the Southern Plains and southeastern United States peaked on 27 April. Conditions were ripe for atmospheric instability, with colder air (compared to previous systems) and an upper-level storm moving east out of the southern Rocky Mountains. The upper-level storm continued east while a strong low-pressure system formed in western Arkansas (**Figure 6**). "As this low formed in the morning, southerly winds increased dramatically in the lower portion of the atmosphere, from around 15 mph at the surface to 45 mph approximately 3000 ft above ground level. The change of wind direction and speed with height, known as vertical wind shear, helped create highly organized storms that could develop strong rotation in the lower and mid-levels. The approaching upper-level storm brought strong westerly winds at high altitudes, helping ensure that long-lived thunderstorms would occur" [12].

**Figure 7.** Radar images of early morning storms 27 April 2011 from the Jackson, Mississippi (left), and Birmingham, Alabama (right), WFOs [12].

A line of severe thunderstorms hit central and northern Mississippi, central and northern Alabama (including Tuscaloosa County), and southern middle Tennessee before dawn, producing over 24 tornadoes and causing three fatalities and over 40 injuries (**Figure 7**). Besides causing widespread power outages, the storms knocked several NOAA Weather Radio All-Hazards transmitters out of service; this would be a factor later in the day regarding warnings in some of these areas [12].

The early morning storms left behind a strong low-level jet and abundant moisture that contributed to further atmospheric instability. An outflow boundary in northern Mississippi and northern Alabama brought more severe storms later in the morning. Farther south, daytime solar heating warmed the low-level air, producing more destabilization. Vertical wind shear increased, from 20 mph (32 km/h) at the surface to 70 mph (112 km/h) at 3000 ft (0.9 km) to over 100 mph (160 km/h) near the tropopause (34,000 ft or 10 km). These conditions gave "an extraordinary high potential for strong low-level rotation in the storms." In addition, the upper-level wind speeds helped produce long-lived storms, an environment suggesting severe, long-lived supercells capable of producing violent tornadoes (**Figure 8**) [12]. There were 62 confirmed tornadoes in Alabama on 27 April, 29 of them in central Alabama [13].
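For scale, the low-level speed shear quoted above can be converted to SI units. This is my own back-of-the-envelope calculation, not the report's; it uses only the speed change and ignores the directional component of the shear:

```python
MPH_TO_MS = 0.44704   # mph -> m/s
FT_TO_M = 0.3048      # ft -> m

# 20 mph at the surface to 70 mph at 3000 ft, per the quoted profile
speed_change = (70 - 20) * MPH_TO_MS   # ~22.4 m/s
depth = 3000 * FT_TO_M                 # ~914 m
bulk_shear = speed_change / depth      # speed shear only, s^-1

print(f"{speed_change:.1f} m/s over {depth:.0f} m = {bulk_shear:.4f} s^-1")
```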

**Figure 8.** Composite weather analysis 0000 UTC 28 April [12].


*Storm Data* [14] named the Tuscaloosa-Birmingham EF4 tornado the "outstanding storm of the month." This tornado was neither the strongest of the outbreak (there were EF5s) nor the longest-tracked, but it had significant social impacts because of the population affected: there are 194,656 people in Tuscaloosa County and 90,468 in the city of Tuscaloosa according to the 2010 US Census [15]. The 27 April 2011 outbreak was well forecast by the SPC and local WFOs, with most severe weather parameters, such as instability and shear, on the higher ends of the scale.

The first afternoon tornado, an EF5, touched down at 3:05 pm CDT (2005Z) in Marion County, Alabama. The supercell that would produce the Tuscaloosa-Birmingham tornado started around 3:00 pm in Newton County, Mississippi (110 miles or 177 km southwest of Tuscaloosa), and took about an hour and 45 minutes to develop a tornado. That tornado dropped in Greene County, Alabama, and then moved into Tuscaloosa County and the populated city of Tuscaloosa. Continuing east-northeast, it moved into Jefferson County, Alabama, weakened north of downtown Birmingham, Alabama, and lifted about 4 miles farther northeast when the parent supercell merged with another supercell. This supercell would produce other tornadoes and was tracked into western North Carolina. The National Weather Service (NWS) damage survey concluded the tornado caused 64 direct fatalities and over 1000 injuries and impacted more than 36,000 people [14].

## **3. Methods and case studies**

These two atmospheric hazards were examined in separate case studies with different research questions and methods. Both studies examined preparedness along with communications at different levels and scales.

## **3.1. Tropical Storm Debby case study**

## *3.1.1. Research objectives*

This study examined various means of communicating weather information among the elderly. One of the major themes was comparing traditional media (i.e., television and newspapers) to newer forms of social media such as Facebook or Twitter. In addition, questions were asked regarding preparedness actions for a tropical storm or hurricane.

## *3.1.2. Methods*

Self-administered questionnaires were given on a voluntary basis to elderly citizens. Since the American Association of Retired Persons (AARP) accepts members from age 50 and older, this was the threshold for elderly. Surveys consisted of both closed-ended and open-ended questions. Questions dealt with methods of weather communication prior to and following Tropical Storm Debby, as well as other potential disaster situations, evacuations, and any changes in actions following this event. A few respondents were interested in elaborating in interviews, but the majority only completed the surveys. Subjects resided in Pinellas and Pasco Counties, part of the Tampa Bay area in Central West Florida. There were 30 participants in the 40-question survey. All of the responses were kept confidential with no identifying characteristics of the respondents.

## *3.1.3. Results*

The first question asked about the primary method of receiving weather information. The choices included local television, newspaper, radio, cable television news (i.e., CNN, MSNBC, and Fox), specialized cable news (Weather Channel), local cable news, the Internet, the National Weather Service, cell phone, and personal communication with a friend or relative. Local television was the overwhelming choice, with 60.2% selecting this option. The second choice was local cable news at 13.3% (there is a 24-hour local news channel with weather updates every 10 minutes), and third was the National Weather Service at 10.0%. All of the other choices totaled 16.5%.

Question two asked if respondents utilized a second method of receiving weather information. Of those responding, 83.3% were affirmative and 16.7% did not go to a secondary source. For those who went to another source, 23.3% used the Internet, 13.3% the National Weather Service, 13.3% Weather Channel, local television and cable news each 10%, newspaper and local cable news each 6.7%, and cell phones and personal communication 3.3% each.

In regard to technology owned, 90% owned a computer, 6.7% did not own a computer, and 3.3% did not respond. Almost as many respondents (86.7%) owned a cell phone, while 10% did not and 3.3% did not respond. Of those who owned a cell phone, 23.3% indicated it was a smartphone, 66.7% did not have a smartphone, and 10% did not respond.
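With 30 respondents, each person represents 1/30 of the sample, which is why the reported figures cluster at multiples of roughly 3.3%. A small sketch of my own (the counts are inferred back from the reported percentages, not taken from the raw survey data):

```python
N = 30  # survey participants

def pct(count):
    """Share of the 30-person sample, rounded to one decimal as in the chapter."""
    return round(100 * count / N, 1)

# 27, 26, and 7 respondents reproduce the reported ownership figures
print(pct(27), pct(26), pct(7))   # 90.0 86.7 23.3
```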


Besides ownership, the survey inquired how the technology was used. A large majority (90%) use e-mail, while 6.7% do not and 3.3% did not respond. Of e-mail users, 66.7% used only a computer, none used only a phone, 23.3% used both computer and phone, and 10% did not respond. A minority (36.7%) texted, while 60% did not and 3.3% did not respond. Similarly, only 26.7% of respondents engaged in social media, with an overwhelming majority (70%) not using services such as Facebook or Twitter; the nonresponse rate was 3.3%.

Those who used social media did so primarily for personal reasons, such as communicating with family; only 4.6% used social media to get information. Reasons respondents gave for not using social media included "don't like it, security issues, not interested in public exposure, privacy concerns, do not need it, don't feel comfortable with it, no time, and prefer to use the telephone."

Only five respondents indicated they had any damage from Tropical Storm Debby, and all five noted it was minor. No insurance claims or requests for disaster assistance were filed.

A plurality of respondents (36.7%) did not think it was applicable to prepare for Tropical Storm Debby; 33.3% did prepare, 26.7% did nothing, and 3.3% did not respond. Preparedness actions included closing hurricane shutters if threatened, having an evacuation plan, following instructions from local authorities (such as emergency management officials), having equipment and supplies on hand, securing property if threatened, taking no special actions, minimizing mulch around the back of the house so it would not dam water and cause flooding, alerting the Citizens Emergency Response Team (CERT), and keeping more water on hand (for drinking and other purposes).

Another question asked whether respondents owned a NOAA All-Hazards (Weather) Radio: 63.3% did, while 36.7% did not. A question regarding the perception of FEMA had 6.7% answering excellent, 3.3% good, and 90% stating not applicable.

Besides questions on weather communications and preparedness, demographic information was requested. The gender breakdown was almost equal, with 46.7% female, 50% male, and 3.3% not responding. A question about primary employment status indicated that the majority (76.7%) were retired, 13.3% were employed full time, 3.3% were employed part time, and 3.3% did not respond. Incomes ranged across categories: \$20,000–34,999 (16.7%), \$35,000–49,999 (13.3%), \$50,000–74,999 (16.7%), and \$75,000 or above (20%), while 33.3% preferred not to answer.

Ages were in categories with 50–54 (6.7%), 55–60 (13.3%), 60–64 (3.3%), 65–70 (16.7%), 71–74 (20%), 75–80 (16.7%), 81–85 (10%), and 86 or above (10%). Only 3.3% did not respond to the question.

## **3.2. Tuscaloosa, Alabama, tornado case study**

## *3.2.1. Research objectives*

This study examined fundamental emergency management issues at four levels: Tuscaloosa County, the University of Alabama at Tuscaloosa, the Alabama Emergency Management Agency, and FEMA. Besides the emergency management issues, lessons learned were noted at each level.

## *3.2.2. Methods*

In-depth interviews were conducted with various emergency management officials at each level, using an open-ended survey/interview instrument. Interviews were with personnel from the Tuscaloosa County Department of Emergency Management, the University of Alabama Department of Public Safety, the Alabama Emergency Management Agency, and FEMA. Field observations were also completed by the author. Topics included existing and revised preparedness plans, major components of planning, mutual aid, recovery progress, and lessons learned. Other issues included citizen and survivor experiences, mitigation, and interactions among the agencies at the four levels.

## *3.2.3. Results: county level*

Tuscaloosa County utilized various preparedness actions. Preplanning for events became more common in the post-Katrina era (since 2005). Emergency management personnel had plans for events such as tornadoes, hurricanes, flooding, and winter storms. Beyond planning, exercises were conducted with various community stakeholders, including one for mass casualties. Procedures were established for damage assessment, such as the type of tags to be issued based on damage and having architects and engineers ready to inspect structures and determine whether they were safe and habitable. Price-gouging laws were enacted to prevent profiteering, especially on necessary items such as gasoline and hotel rooms. The city of Tuscaloosa and Tuscaloosa County had budgeted financial reserves in case of disaster. Local media also partnered with the city and county to help disseminate severe storm and tornado awareness to citizens.

Response was challenging, as the tornado made a direct hit and damaged the Emergency Operations Center (EOC) for Tuscaloosa County; the EOC was relocated to the University of Alabama campus. Prior to the tornado, emergency management personnel were aware of the severe weather and tornado potential and had responded to the early morning tornadoes. There was a weather briefing with the Birmingham WFO at 2:00 pm, and emergency management officials used tools such as Weather Messenger and Emergency Management Weather Information Network (EMWIN) injects to stay apprised of current and future weather. Various responders (i.e., firefighters, law enforcement, and medical personnel) were also prepared and ready to respond if necessary. The tornado cut through the city, causing extensive structural damage in addition to the high number of fatalities and injuries. A lockdown and curfew were imposed in the city of Tuscaloosa. Some looting occurred, but the offenders were not from Tuscaloosa County.

Recovery included tasks such as clearing debris, restoring power, and working to bring the community back to normal conditions. Citizen needs such as food, water, clothing, housing, and recharging cell phones were attended to by various groups, some private or faith-based and others governmental organizations such as FEMA. FEMA opened four Disaster Resource Centers (DRCs) in Tuscaloosa to assist survivors. In addition, one Small Business Administration (SBA) Center was available to assist with low-interest loans; the SBA works with both individuals and businesses after disasters.

Tuscaloosa County employed several mitigation methods. One was establishing an active group of Skywarn volunteers, citizens who spot storms and report observations to the National Weather Service; Skywarn training is an annual event, with many citizens taking either new or refresher training. Community shelters were established, as many residents do not have storm shelters in their homes or apartments; some of the structures serving as shelters are schools and recreation centers. There have also been discussions on stricter building codes and recommendations to tie down objects such as air conditioning compressors and water heaters so they do not become projectiles in a tornado.

The loss of the EOC in Tuscaloosa County raised several issues that serve as lessons learned. First, when the building was damaged, the fire suppression systems went off because the water was still on. The backup generator was damaged and needed protection from the elements and water. When a generator is hooked up to a building, circuits on the generator need to be well marked to save time finding the live outlets. Spare coaxial (COAX) cable and antennas are necessary for setting up networks and communications. Internet access was backed up with a virtual private network (VPN) through the University of Alabama that allowed workers to remain on the county computer network. Many handheld radios were used, and there were problems getting a signal inside substantial buildings because of interference from construction materials; hence, multiple methods of communication are needed. Beyond the EOC issues, there was considerable interaction within the community involving both the private and nonprofit sectors. Some events could have gone more smoothly: the Governor's Summit (a meeting of emergency management and political officials) was held too soon after the event to cover response and recovery strategies, and the FEMA housing event (on rebuilding options for citizens impacted by the tornado) was not well marketed to the community and had low attendance.

## *3.2.4. Results: university level*

**3.2. Tuscaloosa, Alabama, tornado case study**

management issues, lessons learned were noted for each level.

54 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts

Research objectives of this study examined fundamental emergency management issues on four different levels. These levels are Tuscaloosa County, the University of Alabama at Tuscaloosa, the Alabama Emergency Management Agency, and FEMA. Besides the emergency

In-depth interviews were conducted with various emergency management officials at each level. An open-ended survey/interview was utilized with the respondents. Interviews were with personnel from the Tuscaloosa County Department of Emergency Management, Uni‐ versity of Alabama Department of Public Safety, Alabama Emergency Management Agency, and FEMA. Field observations were also completed by the author. Topics included existing and revised preparedness plans, major components of planning, lessons learned from plans, mutual aid, recovery progress, and lessons learned. Other issues included citizen and survivor experiences, mitigation, and interactions among the various agencies at the four levels.

Tuscaloosa County utilized various preparedness actions. Preplanning for events was more common in the post-Katrina era (since 2005). Emergency management personnel had plans for events such as tornadoes, hurricanes, flooding, and winter storms. Besides having planning, exercises were practiced with various community stakeholders. One of the plans and exercises was for mass casualties. Procedures were established for damage assessment such as the type of tags to be issued based on damages and having architects and engineers ready to inspect structures to determine safety or be habitable. Price gouging laws were enacted to keep those from profiteering, especially for necessary items such as gasoline and hotel rooms. The city of Tuscaloosa and Tuscaloosa County had budgeted for financial reserves in case of disaster. Local media was also a partner with the city and county to help disseminate severe storm and

Response was challenging as the tornado made a direct hit and damaged the Emergency Operations Center (EOC) for Tuscaloosa County. The EOC was relocated to the University of Alabama campus. Prior to the tornado, emergency management personnel were aware of the severe weather and tornado potential and responded to the early morning tornadoes. There was a weather briefing with the Birmingham WFO at 2:00 pm and emergency management officials used tools such as Weather Messenger and Emergency Management Weather Information Network (EMWIN) Injects to stay apprised of current and future weather. Various responders (i.e., firefighters, law enforcement, and medical personnel) were also prepared and ready to respond if necessary. The tornado cut through the city and there was a lot of structure damage in addition to the high number of fatalities and injuries. A lock down and curfew was

*3.2.1. Research objectives*

*3.2.3. Results: county level*

tornado awareness to the citizens.

*3.2.2. Methods*

Preparedness at the University of Alabama was evident in its established Emergency Operations Plan (EOP) (Tuscaloosa County had one as well), marked tornado shelters, an Emergency Call Center Plan for disseminating information, preestablished mutual aid agreements, and adherence to the Incident Command System (ICS). The university also had an Emergency Notification/Crisis Communication Plan; these systems used texts, e-mails, phones, signage, and public address systems to distribute warnings and information. Additionally, the university was designated StormReady by the National Weather Service, meaning it meets standard communication protocols for informing its community of present and future hazardous weather conditions. Other plans covered the use of resources such as campus buses, sheltering in dormitories, and continuity of dining, facilities, payroll, human resources, and public safety.

The tornado path came within 1200 ft (366 m) of the campus and impacted many students and faculty. There was a loss of power in Tuscaloosa, and cell phone communications were severely hampered due to tower damage, lack of power, and heavy usage. The university was able to offer mutual aid to the community in a number of ways during the response. First, the university EOC was used by Tuscaloosa County for two days, and university law enforcement officers assisted the city for a month. Many university vehicles and pieces of equipment, such as trucks, vans, gators (small all-terrain vehicles for transporting people and materials), and forklifts, were used in the community. The campus served as a staging area for several groups, including Urban Search and Rescue (USAR) teams. Some university personnel served as translators for the non-English-speaking population in shelters. Classes were canceled, and dorms housed responders such as utility workers, National Guard, Red Cross, and law enforcement personnel. Some students and employees also needed shelter, and meals were provided for responders and key personnel. A "Seek and Find" website was established to track missing people. University resources were used for power and communications, especially communication towers. With classes canceled, many students volunteered to help in the community, and the university offered food assistance to citizens in need. Other university resources included medical personnel working at the hospital, along with the Incident Command Center, which was open 24/7 for 17 days. A Joint Information Center (JIC) was established to communicate with citizens and the media; participants included the City of Tuscaloosa, Tuscaloosa County, the University of Alabama, and the State of Alabama.

Since there was no damage to the campus, recovery was more financial than physical. The university was reimbursed by FEMA for expenses incurred assisting the city and county. The university also worked with city recycling services to handle the large amount of debris. An "Acts of Kindness" program was established to assist students and employees who needed financial aid for recovery and to help manage donations. Another mitigation effort was a Hazard Mitigation Grant Request to FEMA for additional community shelters and generators on campus; the decision was not known at the time of the research visit, and the request was not listed as a funded grant by FEMA. A Damage Assessment Response Plan was established to coordinate personnel in assessing and restoring buildings.

The University of Alabama learned several emergency management lessons through this event. For example, the university needed to evaluate its generator fuel supplier, as the university was in direct competition with the City and, if conditions warranted, both could run low on fuel. Generators were found to be in short supply, and increased power capacity was needed. The incident reinforced the need for redundant internet and network pathways, which could be accomplished with multiple ingress/egress core routers. Volunteers are usually forthcoming in disasters, and the university needed to establish how to coordinate their use effectively. Once Incident Command was established, all requests for resources, including volunteers, should go through the Incident Commander to avoid miscommunication. Another lesson learned was to expect traditional communications to be overwhelmed or unavailable. Good working relationships (networking, training, and exercises) and mutual aid agreements with partners such as the city, county, and state are essential during a crisis. Finally, running large events (e.g., football games) with partners under Incident Command is valuable practice for a real disaster: although a non-crisis situation, large numbers of citizens in a concentrated setting give the community experience in dealing with multiple agencies and stakeholders, similar to a disaster.

## *3.2.5. Results: state level*


Preparedness for the Alabama Emergency Management Agency also involved planning, exercises, and training. The state agency has a strong relationship with Voluntary Organizations Active in Disasters (VOAD) and its member organizations; faith-based groups and civic organizations are among the members prepared to respond when needed at a disaster. The state agency also partners with the National Weather Service in educating the public on Severe Weather Awareness Day. The Alabama Emergency Management Agency was monitoring conditions closely and communicating with the National Weather Service before and during the tornado outbreak.

The state's response focused on ensuring that citizens and communities received necessary aid and resources. Mutual aid compacts worked in the majority of counties; there were no problems in Tuscaloosa County, although some issues arose in other counties. State subject matter experts coordinated and worked well with FEMA. Personnel from the state were working in the EOC at the University of Alabama within three hours of the tornado.

One of the major recovery objectives was to support the local level. The process for citizens who needed assistance, whether rebuilding or obtaining information, can be described as "bottom-up": citizens start the process and work up through the various agencies. The state made a concerted effort to keep individual and community cases from "falling through the cracks" or not being followed through, given the bureaucracy and regulations governing aid and resources to individuals and communities. Another major task was assisting with debris removal and reimbursements.

The State Emergency Management Agency was involved with several mitigation items. Grants were available to individual homeowners for tornado shelters; the agency received hundreds of calls and almost 4200 applications for a shelter or safe room grant. Hurricane straps and safe rooms were encouraged in the rebuilding process; however, these items were not required by the building code. Other mitigation issues related to historic structures: the state advocated slowing the process down and being practical about regulations dealing with historic preservation, while some individuals in Tuscaloosa, not fond of government regulations, wanted to proceed quickly with recovery rather than spend much time on older damaged structures that needed to be repaired or totally rebuilt.

A Tornado Recovery Action Council of Alabama (TRAC) was convened by the Governor and helped document lessons learned from the outbreak. The major sections of its report were a summary of the tornadoes and the prepare, warn, respond, recover, and forum reports [14]. Other findings included the importance of ICS training and of routing all resources and requests through the Incident Commander, which aids proper tracking; this is important for obtaining reimbursements. Like the other levels, the state found issues with communications, coordination, and managing resources.

## *3.2.6. Results: federal level*

In the era of Administrator Craig Fugate (in office from May 2009 through the time of writing in 2016), FEMA followed the "whole community" approach of collaboration for the entire disaster cycle, including preparedness. Personnel in the agency were aware of the severe weather potential and had strike teams ready to activate to work with partners at all levels of government along with the private and nonprofit sectors.

FEMA responded quickly, with Community Affairs workers on the scene within 24 hours. A Joint Field Office was established in Tuscaloosa along with Disaster Recovery Centers (DRCs). In addition, a Federal Coordinating Officer was dispatched to Tuscaloosa to aid with response and recovery functions. At the peak, 3000 FEMA employees were assisting in the disaster. Qualified personnel and a strong command and control structure were imperative for quickly assessing the situation and evaluating what resources would be needed to help the community.

Some of the recovery tasks involved debris removal and the wholesale rebuilding of structures. The Army Corps of Engineers also worked with homeowners on debris removal. A program called Tuscaloosa Forward was established, consisting of town hall meetings to inform residents of assistance and procedures in this phase. One of the first priorities for survivors was temporary housing assistance, while public assistance focused on infrastructure restoration. Around 30,000 applications were submitted to FEMA for assistance. Those with private insurance used their policies for recovery first, with FEMA assistance following if they were eligible.

Mitigation measures included education on current building codes and suggestions for strengthening structures even beyond code requirements, including tie-downs and safe rooms. Establishing community shelters was also a priority, especially given the number of apartments and manufactured housing units in the community.

Lessons learned by FEMA echoed those of the other levels of government, such as the success of using the ICS and the cooperation and process knowledge of community leaders. All levels reported a great working relationship among local, state, and federal government personnel. A Public Information Officer (PIO) proved useful for the different agencies and for communicating information to the media and public; however, there were communication and coordination glitches. Additionally, there was a housing shortage, because the tornado hit a populated area, with housing needed for residents, volunteers, and responders. Due to the number of damaged vehicles, transportation shortages existed. Finally, there were challenges with commercial sites and insurance policies in regard to debris management.

## **4. Summary**


## **4.1. Tropical Storm Debby summary**

Research indicated that the elderly use local television as their primary source of weather information. As with most individuals, they also want to verify their information with a secondary source; the majority used the National Weather Service and the Weather Channel for secondary information. The results also indicated that the elderly use technology in their lives: a large majority own computers and cell phones, but these are used for personal purposes such as e-mail and phone calls rather than for receiving weather information. Another technology used by the respondents was NOAA Weather Radio, possibly because some were members of CERT teams or had previous tropical storm experience. The majority of respondents do not text or use social media. These results indicate the importance of using both traditional means of communicating weather information and newer methods such as social media in order to reach all citizens. Only a third of respondents indicated taking preparedness actions for Tropical Storm Debby, compared with around a quarter who took no action; more than a third said preparedness was not applicable and did not feel the need with this tropical system.

## **4.2. Tuscaloosa, Alabama, tornado summary**

The research examined emergency management issues on different scales, from the county to the federal level. All of the respondents indicated the importance of using Incident Command in the response to a disaster. Other commonalities emphasized the usefulness of planning and exercises along with networking. Mutual aid agreements are also important for assistance and resources. Tornado-specific findings included the need for more tornado shelters, especially community shelters, and for tougher building codes. Common issues at all levels involved communications and obtaining and utilizing resources.

## **4.3. General themes**

These two case studies examined communication and emergency management issues, along with disaster preparedness, at different levels. Elderly citizens were surveyed about their preferences for receiving weather information and their preparedness actions before Tropical Storm Debby. Seniors are often vulnerable populations in disasters. They use more traditional methods of receiving weather information and will listen to authorities such as local media and the National Weather Service. Elderly preparedness actions follow practices established and communicated by the media and local emergency management agencies. Tropical systems such as tropical storms or hurricanes are usually well forecast and receive a lot of media attention. Most of the respondents experienced no impacts from Tropical Storm Debby, although some citizens were flooded or had tornado damage.

Tornadoes do not usually allow the lead time of a tropical storm or hurricane. The Tuscaloosa tornado did have lead time, but previous severe weather had damaged some communication methods such as NOAA Weather Radio. This outbreak was forecast days in advance, and the media communicated its progress live to listeners and viewers. Emergency management agencies were prepared and utilized existing networks for response and recovery actions. Cooperation and mutual aid were demonstrated at the various levels, from the local community to the federal government. While there were some issues and glitches, the objective of reducing social vulnerability was primarily achieved: most citizens were able to survive, even with the large amount of property damage and high number of fatalities and injuries.

These case studies demonstrate the necessity of efficient communications, from the personal level of receiving weather information to all levels of emergency management. In addition, preparedness actions and procedures need to be explicit and understandable by all, again from the personal level to working with agencies. Being prepared will not stop a tropical storm or hurricane, but it can help mitigate an event; if disaster strikes, preparation makes response, recovery, and resiliency better for those impacted and hopefully decreases fatalities, injuries, and property damage. Effective communications help achieve this goal. No matter the scale, it is important to remember that all disasters are local.

## **Acknowledgements**

The author would like to thank Dr. Jason Senkbeil from the Department of Geography at the University of Alabama for his assistance with arrangements and fieldwork.

## **Author details**

Robert M. Schwartz

Address all correspondence to: rms73@uakron.edu

Department of Disaster Science and Emergency Services, The University of Akron, Akron, Ohio, United States

## **References**


[1] Federal Emergency Management Agency. Data Visualization: Summary of Disaster Declarations and Grants. [Internet]. 2015. Available from: http://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants [Accessed: 2016-01-13].

[2] Federal Emergency Management Agency. Disaster Declarations by Year. [Internet]. 2016. Available from: http://www.fema.gov/disasters/grid/year [Accessed: 2016-01-13].

[3] National Hurricane Center. Tropical Cyclone Climatology. [Internet]. 2016. Available from: http://www.nhc.noaa.gov/climo/ [Accessed: 2016-01-16].


**Part II: Atmospheric Dynamics and Modeling Aspects of Atmospheric Hazards**

## **A Case Study of Atmospheric Dynamics and Thermodynamics in Derechos and the Societal Impacts**

Kevin Law

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/63319

## **Abstract**

The word "derecho" is used to differentiate a storm having straight‐line winds as opposed to rotational, tornadic winds. Although the term "derecho" is relatively old, derechos were not readily recognized by the general public until recent outbreaks caused significant widespread damage and associated fatalities. Most notably, the 2012 Mid‐Atlantic Derecho in the USA brought these types of storms to the public's attention as a variety of societal impacts, including infrastructural damage, power outages, and fatalities, occurred over an extensive area from outside of Chicago to Washington, DC. The associated damage can be more widespread than that of tornadoes, and the number of fatalities is comparable to those found in medium‐intensity tornadoes.

This study investigated the importance of the dynamics and thermodynamics in maintaining the intensity of derechos. Key meteorological parameters were measured over six stations where the 2012 Mid‐Atlantic Derecho passed. Low‐ to mid‐level wind shear, as well as the Convective Available Potential Energy (CAPE) and Most Unstable Convective Available Potential Energy (MUCAPE), was found to be significantly higher at the time of passage, which allowed the system to intensify and propagate downstream.

**Keywords:** derechos, dynamics, thermodynamics, convection, societal impacts

## **1. Introduction**

North America is known for its many types of atmospheric hazards, including tornadoes, thunderstorms, blizzards, and hurricanes. Physical geography plays a pivotal role, as two mountain ranges, the Rockies in the west and the Appalachians in the east, are both oriented north‐south. Since there is no physical barrier to prevent cold air masses to the north and warm air masses to the south from interacting with each other, severe and hazardous weather is commonly produced. The warm, moist air is easily lifted by the colder, drier air, producing mid‐latitude storms like no other place on Earth. In fact, North America generates almost 75% of all known tornadoes [1].

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

However, there is another type of atmospheric hazard that does not rotate violently but can nevertheless cause considerable damage and a variety of societal impacts: the *derecho*. The term "derecho" is not as commonly known as "tornado" but has been around for many years. It was first used by a University of Iowa physics professor, Dr. Gustavus Hinrichs, in 1888 to describe a thunderstorm that produced strong straight‐line winds as opposed to rotational winds [2]. Hinrichs believed another term was needed to differentiate violent straight‐line storms from tornadoes; therefore, he used "derecho," which is Spanish for "straight ahead." However, the word "derecho" was virtually absent from the meteorological lexicon for almost 100 years, until Robert Johns and William Hirt [3] reintroduced the term in the 1980s. Although not as widely known as tornadoes by the general public, derechos are gaining recognition for their destructive capability [4]. Even though derechos can form anywhere around the world, they predominantly form in the USA; in fact, very few studies have been published about derechos occurring outside North America [5, 6]. Therefore, the focus of this study will be on the USA and a particularly strong derecho in 2012.

## **2. Background**

Derechos have been specifically defined as "families of downburst clusters" that originate from a mesoscale convective system and affect an area whose major axis is at least 400 km [3, 7, 8]. They produce straight‐line, non‐tornadic damage and occur most frequently during the summer months, especially east of the Rocky Mountains in the USA [3, 9]. In addition, areas of convection produce "bow echoes" [10] that propagate downstream and can create severe downbursts.
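The 400‐km major‐axis criterion lends itself to a direct check against wind‐damage report coordinates. The sketch below is purely illustrative (the helper functions and sample points are assumptions of this sketch, not part of the cited studies): it takes the largest great‐circle separation between any two damage reports as the major axis of the swath.

```python
from itertools import combinations
from math import radians, sin, cos, asin, sqrt

def haversine_km(p1, p2):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*p1, *p2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def meets_derecho_length(reports, threshold_km=400.0):
    """True if the damage swath's major axis is at least `threshold_km` [3, 7, 8]."""
    major_axis = max(haversine_km(a, b) for a, b in combinations(reports, 2))
    return major_axis >= threshold_km

# Rough waypoints along the June 29, 2012 track (Chicago, IL to Washington, DC)
reports = [(41.9, -87.6), (41.1, -85.1), (39.4, -83.8), (38.4, -81.6), (38.9, -77.0)]
print(meets_derecho_length(reports))  # True — the swath spans well over 400 km
```

In practice, a damage‐swath length would be measured along the storm's actual track rather than between the two farthest reports, but the maximum pairwise distance is a simple lower bound on the major axis.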

## **2.1. Types of derechos**

Derechos are often divided into four types: progressive, serial, hybrid, and low‐dew point (dry). Regardless of type, derechos typically form from a mesoscale convective system in which the individual thunderstorms start to replicate as they propagate downstream.

*Progressive* derechos are often found along a stationary frontal boundary oriented west‐east, with mid‐level winds flowing parallel (west‐east) to the frontal boundary [3, 11]. With sufficient convection, a mesoscale convective system will develop along the boundary, evidenced by a bow echo typically up to 250 miles (about 400 km) long in its beginning phases. As the system travels downstream, a sharp downdraft of cold air intensifies near the center of the bow echo and is pushed farther ahead by the west‐east mid‐level wind flow. The derecho often grows to lengths greater than 250 miles. Progressives are more common during the summer, as they require more convection at the site of the frontal boundary.

*Serial* derechos similarly form along a west‐east stationary frontal boundary; however, the mid‐level winds are more southerly [3]. This results in multiple bow echoes that are smaller than their progressive counterparts but are often embedded in a larger mid‐latitude low‐pressure system. Since serial derechos are often associated with warm southerly flow from the low‐level jet aiding the convection, they can be observed even during the spring and fall.

There are circumstances where a derecho takes on properties of both types, making it a *hybrid*. These are often found when a low‐pressure system is present (as in serial derechos) but there is also a west‐east mid‐level flow parallel to the stationary frontal boundary, as in progressives. As a result, multiple derechos of both types can be found in the system [3].

*Dry* or *low‐dew point* derechos are found in environments of limited moisture, where dew points are typically below 60°F (16°C). Such environments are often found in the spring or fall in the Central Plains of the USA, or in the Rocky Mountain states throughout the year. These derechos take on characteristics similar to dry microbursts as cold, dry air rapidly accelerates toward the surface.

## **2.2. Climatology**

66 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts

Just as North America is the most favorable location for tornadoes due to the interaction of warm, moist air with cold, dry air, the same is true for derechos. They are typically found east of the Rocky Mountains, with favored areas centered on the Upper Mississippi and Ohio River Valleys [3]. Climatological averages range from one derecho per year in the Mississippi and Ohio River Valleys to about one every four years near the Rocky Mountains and the Atlantic Coast. Formation is most likely during the warm season, primarily in May, June, and July [3].

Bentley and Mote [9] found similar results for the timing of derechos, in that they primarily form during the warm season; however, the authors note that derechos are more likely to form farther south, in the southern Great Plains. In contrast to previous studies, they found the favored area centered near Oklahoma and extending north‐eastward, with a secondary maximum in the upper Ohio River Valley near Pennsylvania. They noted that the earlier study by Johns and Hirt used derecho events from a year with an unusually high number in the Upper Midwest. When the inflated year was factored out, the resulting maximum shifted southward into the southern Great Plains.

Other studies, including Bentley and Sparks [12], have shown that the conditions favorable for derechos fluctuate from year to year, causing the frequency maximum to shift along the Mississippi River Valley. Regardless, there appears to be a favorable axis oriented north‐south along the Mississippi River extending into the southern Great Plains. During the cold season, bow echoes are much more likely to form in the southeastern USA, especially when there is a strong, warm southwesterly flow [13]. The resulting southwesterly flow through all levels is primarily responsible for producing serial derechos.

## **2.3. Societal impacts**

The societal impacts from derechos are anticipated to grow as a result of population growth and increased urbanization [14]. Derecho impacts are thought to be as dangerous as those of tornadoes and hurricanes, since derechos cover large areas and occur relatively frequently. Injuries and fatalities are the most obvious impacts and are expected where the frequency is high. However, a study by Ashley and Mote [4] found that an unusually high percentage of fatalities occurred in areas where the frequency of derecho events is not necessarily the highest. It was theorized that these regions perhaps had poorer warning systems in place as well as less awareness among the general public, thereby increasing their vulnerability. Some regions, such as the upper Midwest and Great Lakes, have been suggested to be especially susceptible to intense, warm‐season progressive derechos.

Most derecho fatalities occurred in vehicle accidents, either from the vehicle overturning, a tree falling onto the vehicle, or the vehicle crashing into a fallen tree [4]. Fatalities were also common on open water in the form of drownings, since boats can easily be overturned by the strong winds. As for injuries, most were again vehicle related, but many others resulted from people being in poorly built structures, such as mobile homes, or being struck by flying debris.

When compared with other severe weather systems such as tornadoes and hurricanes, derecho fatalities are comparable [4]. In fact, derecho fatalities were found to exceed those caused by EF0 and EF1 tornadoes; only when EF2 tornado fatalities were added did the tornado totals surpass those from derechos. Hurricane fatalities were found to exceed those from derechos, however, largely because hurricane deaths arise from a variety of causes, including inland flooding, tornadoes, and storm surge.

In addition, analyses of insured losses from US derechos found that the estimated damage was often greater than \$100 million for each outbreak [4]. Brooks and Doswell [15] noted that this amount of estimated damage is very comparable to most tornado outbreaks and smaller land‐falling hurricanes. Due to uninsured property and the lack of derecho damage reports, the true amount of damage is apt to be even greater.

There have been several notable derecho events (averaging one per year) in the USA, notably during the 1998 warm season, which brought three derechos and extremely hot temperatures to many areas of the country. More recently, the summer of 2012, which also brought extremely hot temperatures, produced the great Mid‐Atlantic Derecho. The large 2012 derecho originated just outside of Chicago early in the morning and propagated swiftly toward Washington, DC, by later that evening. Over five million customers in the region lost power for more than a week, and 22 fatalities were reported [16]. **Figure 1** shows an image before (top) and after (bottom) the storm. Notice the reduced number of lights in the bottom image around major metropolitan areas such as Columbus, OH, Pittsburgh, PA, and Washington, DC, depicting the loss of power. Not only did the storm adversely affect people throughout the region, but people travelling through the region were impacted as well. The Mid‐Atlantic contains large metropolitan areas; therefore, air and land travel is extremely abundant. Airplanes had to be diverted around the storm entering and leaving major airports, while

A Case Study of Atmospheric Dynamics and Thermodynamics in Derechos and the Societal Impacts http://dx.doi.org/10.5772/63319 69

**Figure 1.** Before (top) and after (bottom) of the lights in the Mid‐Atlantic depicting the power outages (Courtesy of CIMSS, University of Wisconsin).

gasoline was scarce in certain areas due to the power outages. Even local National Weather Service offices had difficulty sending reports because of the widespread power outages. **Figure 2** shows the official storm reports from the June 29, 2012, outbreak; note the data "hole" in the middle of the plot of reports, which occurred because electricity went out at a local National Weather Service office and regional cellular communication towers were down, so reports could not be sent.

Coupled with the brutally hot temperatures (often near 100°F (38°C)), the negative societal impacts made the general public of the USA painfully aware of derechos. In fact, many people had never heard the term before this powerful event. After reporting the millions of dollars of damage, however, news media outlets made "derecho" the buzzword of 2012 by broadcasting segments entitled "What is a derecho?" [17, 18], bringing the term back into the popular lexicon.

**Figure 2.** Storm reports depicting the data "hole" in Central West Virginia (Courtesy of NWS/SPC).

## **3. Case study: 2012 Mid‐Atlantic Derecho**

The June 29, 2012, Mid‐Atlantic Derecho was one of the most destructive weather events of the year and will be remembered as one of the most intense storms in the region. The event was responsible for 22 deaths, widespread infrastructural damage estimated at over \$1 billion, and approximately 5 million people losing power [16]. Two general types of derechos exist: serial, which is produced by multiple bow echoes embedded within a larger squall line, and progressive, which originates as a small, single bow echo but develops into a large bow echo system hundreds of miles long. The dynamic and thermodynamic environmental conditions ultimately determine the type of derecho that develops. The 2012 Mid‐Atlantic Derecho exhibited the characteristics of a progressive derecho, as it originated near a quasi‐stationary boundary in Iowa. The derecho quickly organized into a small bow echo near Chicago (**Figure 3**) and then raced east, expanding in size as it reached the Mid‐Atlantic coast.

**Figure 3.** Radar Base Reflectivity overlay of the derecho progression June 29, 2012. Overlay courtesy of G. Carbin NWS/Storm Prediction Center.

Since the formation of derechos depends on the thunderstorms replicating in a downwind manner, atmospheric conditions must be examined to determine whether the individual thunderstorms will develop sequentially. However, progressive derechos remain difficult to forecast, largely because of the sub‐grid scale interactions between the individual thunderstorms and their environment [19]. It has been shown that progressive derechos often form during the warm season and develop in a variety of shear and instability conditions [20, 21]. Derechos typically develop when the mid‐level shear is weak while the low‐level shear is strong. In addition, derechos form in very unstable environments, indicated by high values of Convective Available Potential Energy (CAPE) greater than 1500 J kg-1. On the other hand, they have been shown to form in low‐CAPE environments (less than 500 J kg-1) if the synoptic forcing is strong [21, 22]. The objective of this study is to investigate the dynamic and thermodynamic factors responsible for the 2012 outbreak and to determine whether those parameters during the event were significantly different from the mean values. As a result, the parameters can be examined to see if they are outside the generally accepted thresholds for forecasting derechos.
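The CAPE thresholds above can be read as a rough screening rule. The sketch below encodes them for illustration only; the 15 m s-1 low‐level shear cutoff is an assumption of this sketch, not a value from the cited studies.

```python
def derecho_environment(cape, low_level_shear, strong_synoptic_forcing):
    """Rough screening of a derecho-favorable environment.

    cape: convective available potential energy (J/kg)
    low_level_shear: e.g., 0-3 km bulk shear (m/s); 15 m/s is an assumed cutoff
    strong_synoptic_forcing: bool, e.g., a jet-streak entrance region overhead
    """
    if cape > 1500 and low_level_shear >= 15:
        return "favorable (instability-driven)"
    if cape < 500 and strong_synoptic_forcing:
        return "favorable (forcing-driven)"
    return "marginal"

# Morning of June 29, 2012: low CAPE but jet-streak forcing over northern Iowa
print(derecho_environment(450, 12, True))    # favorable (forcing-driven)
# Afternoon: extreme instability ahead of the gust front
print(derecho_environment(4000, 18, False))  # favorable (instability-driven)
```

Real forecasting combines many more ingredients (see Table 2), but the two branches mirror the instability‐driven and forcing‐driven pathways described in the text.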

## **3.1. Methodology**

Data were collected from the North American Mesoscale (NAM) model for the 1200 UTC, 1800 UTC (June 29, 2012), and 0000 UTC (June 30, 2012) runs. Six stations were selected for closer investigation (Davenport, Iowa; Chicago, Illinois; Ft. Wayne, Indiana; Wilmington, Ohio; Charleston, West Virginia; and Washington, DC) due to their proximity to the passage of the storm system. The relative station locations are shown in **Figure 4**, and the time of derecho passage at each station is given in **Table 1**. Radar images at the time of the storm passage are shown for each station in **Figure 5**.

**Figure 4.** USA Map with the six stations under investigation: (a) Davenport, IA; (b) Chicago, IL; (c) Ft. Wayne, IN; (d) Wilmington, OH; (e) Charleston, WV; (f) Washington, DC.


**Table 1.** Six stations used in the study and the time of derecho passage.

**Figure 5.** Base Reflectivity Radar Imagery of the derecho as it was passing through the six stations investigated in the study (indicated by star): (a) Davenport, IA; (b) Chicago, IL; (c) Ft. Wayne, IN; (d) Wilmington, OH; (e) Charleston, WV; (f) Washington, DC, Dulles International Airport.


| Parameter | Abbreviation | Units |
| --- | --- | --- |
| Convective available potential energy | CAPE | J kg-1 |
| Most unstable convective available potential energy | MUCAPE | J kg-1 |
| Downdraft convective available potential energy | DCAPE | J kg-1 |
| Precipitable water | PW | cm |
| Convective temperature | CT | °C |
| 950 hPa–850 hPa lapse rate | Lapse rate | °C km-1 |
| Helicity | Helicity | m2 s-2 |
| Bulk Richardson number | BRN | Unitless |
| 1 km storm relative inflow | SR inflow | m s-1 |
| Wind shear (0–6 km) | Shr 0–6 | m s-1 |
| Wind shear (0–3 km) | Shr 0–3 | m s-1 |
| Wind shear (3–6 km) | Shr 3–6 | m s-1 |

**Table 2.** List of wind shear and instability parameters.

Twelve parameters were collected at hourly intervals for all six stations, beginning at 1200 UTC on June 29, 2012, until the time of the storm passage. The instability and wind shear parameters investigated are listed in **Table 2**. The instability parameters included the following: Convective Available Potential Energy (CAPE); the Most Unstable CAPE (MUCAPE), the potential energy within the lowest 300 hPa; the Downdraft CAPE (DCAPE), measuring the strength of the rain‐cooled downdraft; the 950–850 hPa lapse rate, describing lower‐level instability; and convective temperature (CT), the surface temperature that must be attained for convection to occur. Precipitable water (PW) is the vertically integrated amount of water through the column of air. Wind shear parameters included the Bulk Richardson Number (BRN), a dimensionless ratio of buoyant instability to vertical wind shear, and helicity, a measure of the helical or "corkscrew" flow of the air. Additionally, the 1 km storm‐relative inflow and three layers of bulk wind shear were included.
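Two of the simpler parameters can be computed directly from a model profile. The sketch below is illustrative only (the profile values are invented, not NAM output): it computes a bulk shear magnitude between two heights and a layer lapse rate such as the 950–850 hPa value.

```python
import numpy as np

def bulk_shear(u, v, z, z_bot, z_top):
    """Bulk wind shear (m/s): magnitude of the vector wind difference
    between two heights, interpolated from a model profile."""
    u_bot, u_top = np.interp([z_bot, z_top], z, u)
    v_bot, v_top = np.interp([z_bot, z_top], z, v)
    return float(np.hypot(u_top - u_bot, v_top - v_bot))

def lapse_rate(t_low_c, t_high_c, z_low_m, z_high_m):
    """Lapse rate (°C/km) between two levels, e.g., 950 and 850 hPa."""
    return (t_low_c - t_high_c) / ((z_high_m - z_low_m) / 1000.0)

# Invented profile: heights (m) and wind components (m/s)
z = np.array([0, 1000, 3000, 6000])
u = np.array([2.0, 8.0, 15.0, 22.0])
v = np.array([1.0, 4.0, 5.0, 6.0])
print(round(bulk_shear(u, v, z, 0, 6000), 1))  # 0-6 km shear, ≈ 20.6 m/s
print(round(lapse_rate(24.0, 17.0, 600.0, 1500.0), 1))  # ≈ 7.8 °C/km
```

A lapse rate near the dry adiabatic value (9.8 °C/km) in the 950–850 hPa layer would indicate the kind of low‐level instability the study tracks.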

The model data were analyzed in BUFKIT [23], a visualization software package designed for weather forecasting and analysis. The data were standardized by hour (…, t‐2, t‐1, t) until the time of storm passage. Downdraft Convective Available Potential Energy values were calculated by estimating the temperature of the downdraft parcel (*T*<sub>pd</sub>), which lies between the mid‐level wet bulb potential temperature and the updraft wet bulb potential temperature [24], as shown by the following equation:

$$\frac{1}{2}\, g \left( \frac{T_e - T_{pd}}{T_e} \right) \Delta z \tag{1}$$

where *T*<sub>e</sub> is the environmental surface temperature (K), *T*<sub>pd</sub> is the expected parcel downdraft surface temperature (K), and Δ*z* is the depth of the negatively buoyant air (m).
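Equation (1) is straightforward to evaluate numerically. A minimal sketch follows; the sample temperatures and layer depth are illustrative values, not taken from the study.

```python
G = 9.81  # gravitational acceleration (m/s^2)

def dcape(t_env, t_parcel_down, depth, g=G):
    """Evaluate Eq. (1): 0.5 * g * ((Te - Tpd) / Te) * dz.

    t_env: environmental surface temperature Te (K)
    t_parcel_down: expected downdraft parcel surface temperature Tpd (K)
    depth: depth of the negatively buoyant layer dz (m)
    Returns energy per unit mass (J/kg).
    """
    return 0.5 * g * ((t_env - t_parcel_down) / t_env) * depth

# Example: 305 K environment, 297 K downdraft parcel, 3000 m deep layer
print(round(dcape(305.0, 297.0, 3000.0), 1))  # ≈ 386.0 J/kg
```

A cooler downdraft parcel or a deeper negatively buoyant layer increases the value, consistent with DCAPE measuring the strength of the rain‐cooled downdraft.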

One‐sample, one‐tailed *t*‐tests were then conducted (for each station and each variable) at hourly intervals relative to the storm passage in order to determine whether parameter values were significantly larger than the mean daily values and to potentially develop critical forecasting thresholds. The number of samples varied by station, since parameters were measured hourly from the beginning of the day until the storm passage; therefore, more data were available for stations progressively farther east. Level 2 radar, which has higher resolution than conventional Level 3 radar and offers dual‐polarimetric data [25], was also analyzed in addition to surface and upper air plots.
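This kind of test can be sketched with SciPy (the `alternative` keyword of `ttest_1samp` requires SciPy ≥ 1.6); the hourly CAPE values below are invented for illustration and are not from the study's NAM data.

```python
from scipy import stats

def upper_tail_ttest(sample, reference_mean):
    """One-sample t-test, upper tail: H1 says the sample mean exceeds
    `reference_mean`. Returns (t statistic, one-tailed p-value)."""
    t_stat, p_one = stats.ttest_1samp(sample, reference_mean, alternative="greater")
    return float(t_stat), float(p_one)

# Invented hourly CAPE values (J/kg) ahead of storm passage
cape_hours = [1650, 1720, 1810, 1900, 1875, 1760, 1840, 1910, 1990, 2050]
t, p = upper_tail_ttest(cape_hours, 1500.0)  # H0: mean CAPE = 1500 J/kg
print(p < 0.05)  # True — the hourly values significantly exceed 1500 J/kg
```

Because sample sizes shrink for the westernmost stations, the critical *t* value grows there, which is worth keeping in mind when comparing significance across stations.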

## **4. Results**

A quasi‐stationary (QS) boundary was oriented west‐east from Iowa to the Mid‐Atlantic region during the morning hours of June 29, 2012 (**Figure 6**). Close inspection of the soundings and surface plots revealed warm temperatures and high dew points in the region at 1500 UTC (**Figures 7** and **8**). On the northern side of the boundary, temperatures were in the mid‐70s to low‐80s Fahrenheit (21–28°C), while dew points were around 60°F (16°C). On the southern side of the boundary, however, temperatures were in the mid‐80s (30°C) to around 90°F (32°C), while dew points were close to 70°F (21°C). The air was unusually warm and humid in the morning hours because of a decaying mesoscale convective system from the previous day. A small region of convection then formed over northern Iowa as a strong southerly flow from the nocturnal low‐level jet provided ample moisture to the system, indicated by the dew point equaling the temperature around 900 hPa. The sounding at Davenport (**Figure 9**) at 1200 UTC exhibited a classic "inverted‐v" shape, confining the potential energy close to the surface and preventing thunderstorms from developing too quickly.

**Figure 6.** Surface map at 1500 UTC for June 29, 2012.

**Figure 7.** Surface plot of temperatures (a) (previous page °F); (b) (above °C) at 1500 UTC June 29, 2012.

**Figure 8.** Surface plot of dew points (a) (top °F); (b) (bottom °C) at 1500 UTC for June 29, 2012.

**Figure 9.** NAM 1200 UTC run for sounding at Davenport, IA for 1500 UTC, June 29, 2012.

A small line of thunderstorms formed along the QS boundary and passed through Davenport, IA at approximately 1300 UTC. The 500 hPa chart did not show any short‐wave troughs but exhibited strong westerly flow over northern Iowa (**Figure 10**). However, the 250 hPa chart showed a jet streak with its right entrance region over northern Iowa, providing synoptic‐scale forcing (**Figure 11**). Despite CAPE values not being large (less than 500 J kg-1) in the early morning, the moderate amount of synoptic‐scale forcing was enough to develop the storms. By approximately 1500 UTC, the QS boundary was between Davenport and Chicago (**Figure 12**). The rear cold outflow combined with the southerly surface inflow appeared to "split" the system. In addition, the faster westerly mid‐ to upper‐level winds to the north of the boundary caused the eastern part of the QS boundary to turn clockwise into a north‐south orientation (**Figure 13**).

**Figure 10.** 500 hPa map at 1200 UTC, June 29, 2012.

**Figure 11.** 250 hPa map at 1200 UTC, June 29, 2012.

**Figure 12.** Base reflectivity radar at Davenport, IA at 1500 UTC, June 29, 2012.

**Figure 13.** Base reflectivity radar at Davenport, IA at 1530 UTC, June 29, 2012.

Once the storm aligned north‐south, the upper‐level winds steered the system to the east. As shown in the soundings and upper‐level charts, the winds were approximately 30–50 kt out of the west and exhibited very little shear. Therefore, as daytime heating occurred, convection became a more important forcing mechanism. Daytime high temperatures approached 100°F (approximately 38°C) over much of the region, with dew points near 70°F (21°C) (**Figure 14**).

Links have been made between elevated mixed layers (EMLs) and derecho formation [22]. Since EMLs exhibit very steep lapse rates in the mid‐levels, they help increase the instability. **Figure 15** illustrates the steep lapse rates that existed at Charleston, WV. Combined with the extremely high surface temperatures, the air easily became unstable as it attained the convective temperature of 36°C.

**Figure 14.** Surface map of temperature (a) (top) and dew point (b) (bottom) in °F for 2200 UTC, June 29, 2012.

A Case Study of Atmospheric Dynamics and Thermodynamics in Derechos and the Societal Impacts http://dx.doi.org/10.5772/63319 79

**Figure 15.** Sounding for Charleston, WV at 2300 UTC, June 29, 2012.


78 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts


The repetitive downwind propagation continued eastward as the rapid convection ahead of the gust front created large updrafts. The rear inflow became more pronounced near Charleston, WV as the "bow" pushed farther to the east (**Figures 15** and **16a**). Behind the gust front, the cold pool became more elongated, which can be seen in the base velocity image (**Figure 16b**). Volume scans show the slope of the updraft, the rear inflow jet, and the formation of the cold pool (**Figure 16c**, **d**).

**Figure 16.** Charleston, WV radar at 2300 UTC, June 29, 2012; (a) base reflectivity; (b) base velocity; (c) volume scan base reflectivity; (d) volume scan base velocity.

The mean convective temperature as the derecho passed each of the six stations was 38.5°C, while the mean 950–850 hPa lapse rate was 8.61°C km-1 (**Table 3**). Although the mean convective temperature and lapse rates were relatively high, the standard deviations were large because the values were relatively low during the early stages but progressively increased as the derecho matured. The mean CAPE and MUCAPE at *t* (time of derecho passage) were 2378 and 2796 J kg-1, respectively, and DCAPE was 789 J kg-1. Similarly, the mean instability variables were relatively high overall but were low during the early stage of development. On the other hand, shear variables exhibited high values throughout the movement toward the east. Mean values for 0–3 km shear, 0–6 km shear, and 1 km storm relative inflow were 21.66 m s-1, 31.5 m s-1, and 9 m s-1, respectively.


| Variable | *t* | Mean | SD | Test value |
|---|---|---|---|---|
| CT (°C) | 1.60 | 38.5 | 2.29 | 37 |
| Lapse rate (°C km-1) | 1.31 | 8.61 | 2.17 | 8.5 |
| CAPE (J kg-1) | 0.53 | 2378 | 1740 | 2000 |
| MUCAPE (J kg-1) | 0.87 | 2796 | 1679 | 2200 |
| DCAPE (J kg-1) | -1.42 | 789 | 363 | 1000 |
| Shr (0–3) (m s-1) | \*2.72 | 21.7 | 9.15 | 11.5 |
| Shr (0–6) (m s-1) | \*5.56 | 31.5 | 7.0 | 15.6 |
| Shr (3–6) (m s-1) | -2.67 | 9.66 | 4.88 | 15 |
| SR Inflow (m s-1) | \*6.71 | 9.0 | 1.09 | 6 |
| Helicity (m2 s-2) | -1.69 | 92.5 | 83.39 | 150 |
| PW (cm) | \*11.40 | 4.57 | 0.43 | 2.54 |
| BRN (unitless) | 1.13 | 97.66 | 103.1 | 50 |

\*Significant parameter (*p* < .05).

**Table 3.** One sample *t*‐test for the six stations at time of derecho passage (df = 5, *p* < .05).

As a result, the one‐sample *t*‐test did not show the mean (X̄) to be significantly larger than the test value for the convection and instability variables. However, for three shear variables (shr 0–3, shr 0–6, and storm relative (SR) inflow), the mean was significantly larger than the test values at the time of passage (denoted by the \* next to the value of *t*). In addition, the mean precipitable water was significantly larger than the test value of 2.54 cm.

At Davenport, IA, only three variables (shr 0–3, shr 0–6, and PW) exhibited means that were significantly larger than the critical test value. **Tables 4**–**6** show the one‐sample *t*‐test results for Chicago, Wilmington, and Charleston, in addition to the mean daily values prior to the passage of the derecho. In Chicago, the mean convective temperature (41.82°C) was significantly larger than the critical test value (37°C), as were the shear variables and PW found in Davenport (**Table 4**). However, farther downwind in Fort Wayne, Wilmington, Charleston, and Washington‐Dulles, the mean CAPE and MUCAPE values were significantly larger than the respective test values of 2000 and 2200 J kg-1. These significant results are largely due to the extremely high CAPE/MUCAPE model values on the order of 4500–5000 J kg-1 found in the region.

\*Significant parameter (*p* < .05): CT, shr 0–3, shr 0–6, SR Inflow, PW.

**Table 4.** One‐sample *t*‐test for the mean daily values prior to derecho passage at Chicago, IL (df = 4, *p* < .05).
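The one‐sample *t*‐tests in these tables can be reproduced in a few lines. The sketch below uses hypothetical station shear values (not the study's data); only the critical test value of 15.6 m s-1 for 0–6 km shear comes from the chapter:

```python
import numpy as np

# Hypothetical 0-6 km shear values (m/s) at six stations; only the test value
# (15.6 m/s) is taken from the chapter.
x = np.array([24.0, 28.5, 30.0, 33.5, 36.0, 37.0])
test_value = 15.6

n = x.size
mean = x.mean()
sd = x.std(ddof=1)                            # sample standard deviation
t = (mean - test_value) / (sd / np.sqrt(n))   # one-sample t statistic, df = n - 1

# One-sided test at p < .05 with df = 5: the critical t is about 2.015,
# so means significantly LARGER than the test value give t above that.
significant = t > 2.015
print(f"mean = {mean:.1f} m/s, t = {t:.2f}, significant = {significant}")
```

The same statistic could be obtained with `scipy.stats.ttest_1samp`; the explicit form makes the role of the sample standard deviation and df visible.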


\*Significant parameter (CAPE, MUCAPE, shr 0–3, shr 0–6, SR Inflow, PW, BRN).

**Table 5.** One‐sample *t*‐test for the mean daily values prior to derecho passage at Wilmington, OH (df = 9, *p* < .05).


| Variable | *t* | Mean | SD | Test value |
|---|---|---|---|---|
| CT (°C) | \*6.30 | 41.05 | 2.22 | 37 |
| Lapse rate (°C km-1) | 0.66 | 8.85 | 1.89 | 8.5 |
| CAPE (J kg-1) | \*2.36 | 2852.9 | 1253.9 | 2000 |
| MUCAPE (J kg-1) | \*6.07 | 3383.3 | 674.6 | 2200 |
| DCAPE (J kg-1) | -0.19 | 985.2 | 269.7 | 1000 |
| Shr (0–3) (m s-1) | \*4.57 | 23.3 | 8.97 | 11.5 |
| Shr (0–6) (m s-1) | \*8.77 | 38.1 | 8.87 | 15.6 |
| Shr (3–6) (m s-1) | -0.86 | 14.6 | 1.67 | 15 |
| SR Inflow (m s-1) | \*3.39 | 8.0 | 2.04 | 6 |
| Helicity (m2 s-2) | -10.16 | -3.41 | 52.29 | 150 |
| PW (cm) | \*12.20 | 4.06 | 0.43 | 2.54 |
| BRN (unitless) | 1.34 | 79.6 | 76.7 | 50 |

\*Significant parameter (CT, CAPE, MUCAPE, shr 0–3, shr 0–6, SR Inflow, PW).

**Table 6.** One‐sample *t*‐test for the mean daily values prior to derecho passage at Charleston, WV (df = 11, *p* < .05).

In addition to the large amount of instability, the storm relative inflow was notably high once the derecho passed Fort Wayne, IN. In Wilmington, Charleston, and Washington, DC, the mean storm relative inflow was significantly larger than the critical value of 6 m s-1 (**Tables 5** and **6**). The mean was at least 8 m s-1 in both Wilmington and Charleston and often approached 12 m s-1 in the late afternoon from the south. Overall, the mean downdraft CAPE was not significantly larger than the critical value because the low morning values offset the higher values in the late afternoon. However, the DCAPE values were extremely high in the late afternoon (approaching 1300 J kg-1), aiding the formation of the cold pool and strengthening the derecho.

## **5. Conclusions**

Derechos have been known to cause damage comparable to tornadoes in terms of monetary losses and fatalities. Even though their winds are straight‐line rather than rotational, the environmental conditions prior to and during derecho events are comparable as well. Investigating the intense 2012 Mid‐Atlantic Derecho highlights the importance of the key thermodynamics and dynamics.

Anomalously high wind shear in the low and mid‐levels was shown to be vital to the propagation of the storms. Low‐level mean wind shear in the 0–3 km layer (21.7 m s-1) was significantly larger than the critical threshold of 11.5 m s-1 at the time of passage at all six stations in the path. Likewise, the low‐ to mid‐level mean shear in the 0–6 km layer (31.5 m s-1) was significantly larger than the critical threshold of 15.6 m s-1 at the time of passage at all six stations.

Equally important, anomalously high temperatures combined with great atmospheric instability (measured by CAPE/MUCAPE) were present at certain times during the outbreak. During the early development, CAPE/MUCAPE values were low, suggesting they are not necessary as long as other synoptic forcing agents (i.e., jet streaks) are available. CAPE/MUCAPE values were not high in Illinois where the derecho formed; however, there was ample shear and synoptic forcing from the jet in northern Illinois to initiate development.

Even though synoptic uplift was absent owing to the lack of a jet streak in the Mid‐Atlantic, surface temperatures reached the convective temperature, triggering uplift into the unstable atmosphere. Afternoon temperatures in the area easily reached approximately 100°F (38°C). CAPE/MUCAPE model values were approximately 4500–5000 J kg-1, and model convective temperatures ranged between 38 and 40°C. On the other hand, the strength of the downdraft dynamics (DCAPE) was large but not significantly larger than the critical value.

In summary, CAPE/MUCAPE and the mean wind shear were significantly larger than their respective critical thresholds in the Mid‐Atlantic. The 2012 Mid‐Atlantic Derecho showed how the right combination of low‐ to mid‐level shear and jet stream dynamics caused it to develop initially, while the intense heat generated the convective instability necessary to sustain its intensity. The event demonstrated that winds do not have to rotate to cause widespread damage and societal impacts, and, as such, the term "derecho" was brought back to the forefront of the meteorological vocabulary.

## **Author details**

Kevin Law

Address all correspondence to: law14@marshall.edu

Marshall University, Huntington, West Virginia, USA

## **References**


[5] Gatzen, C., 2004: A derecho in Europe: Berlin, 10 July 2002. *Wea. Forecasting*, 19, 639–645.

[6] López, J. M., and J. Arús, 2004: A Mediterranean derecho: Catalonia (Spain), 17th August 2003. Preprints, *Third European Conf. on Severe Storms*, León, Spain, CD‐ROM, P53.

[7] Fujita, T. T., and R. M. Wakimoto, 1981: Five scales of airflow associated with a series of downbursts on 16 July 1980. *Mon. Wea. Rev.*, 109, 1438–1456.

[8] Zipser, E. J., 1982: Use of a conceptual model of the life cycle of mesoscale convective systems to improve very short range forecasts. *Nowcasting*, K. Browning, Ed., Academic Press, London, 191–204.

[9] Bentley, M. L., and T. L. Mote, 1998: A climatology of derecho producing mesoscale convective systems in the central and eastern United States, 1986–1995. Part I: Temporal and spatial distribution. *Bull. Amer. Meteor. Soc.*, 79, 2527–2540.

[10] Fujita, T. T., 1978: Manual of downburst identification for project NIMROD, SMRP Res. Paper 156, University of Chicago, NTIS Accession No. N78‐30771/7GI, 104 pp.

[11] Coniglio, M. C., and D. J. Stensrud, 2001: Simulation of a progressive derecho using composite initial conditions. *Mon. Wea. Rev.*, 129, 1593–1616.

[12] Bentley, M. L., and J. A. Sparks, 2003: A 15‐yr climatology of derecho‐producing mesoscale convective systems over the central and eastern United States. *Climate Res.*, 24, 129–139.

[13] Burke, P. C., and D. M. Schultz, 2004: A 4‐yr climatology of cold season bow echoes over the continental United States. *Wea. Forecasting*, 19, 1061–1074.

[14] Changnon, S. A., 2001: Damaging thunderstorm activity in the United States. *Bull. Amer. Meteor. Soc.*, 82, 597–608.

[15] Brooks, H. E., and C. A. Doswell III, 2001: Normalized damage from major tornadoes in the United States: 1890–1999. *Wea. Forecasting*, 16, 168–176.

[16] National Weather Service, 2013: The historic derecho of June 29, 2012. Service Assessment, U.S. Department of Commerce, National Oceanic and Atmospheric Administration, Silver Spring, Maryland, 61 pp.

[17] The Weather Channel: Derecho: The science behind widespread damaging winds. Retrieved from https://weather.com/news/news/derecho‐explainer‐20120612.

[18] AccuWeather: Intense storms called derechos slam 700 miles of the US. Retrieved from http://www.accuweather.com/en/weather‐news/deadly‐super‐derecho‐strikes‐m/67383.

[19] Coniglio, M. C., S. F. Corfidi, and J. S. Kain, 2011: Environment and early evolution of the 8 May 2009 derecho producing convective system. *Mon. Wea. Rev.*, 139, 1083–1102.


## **Extreme Precipitation Events over East Asia: Evaluating the CMIP5 Model**

Nicolas Freychet, Huang-Hsiung Hsu and Chi-Hua Wu

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/62996

### **Abstract**

Extreme hydrological events are a direct threat to society and the environment, and their study within the framework of global climate change remains crucial. However, forecasts present numerous uncertainties.

This study investigates the modification of precipitation characteristics over East Asia, a region densely populated and vulnerable to extreme rainfall. The performance of the models and the confidence in their projections are analyzed using data derived from an ensemble of models from Phase 5 of the Coupled Model Intercomparison Project (CMIP5). Different factors that can affect the confidence of the projection are considered, such as the resolution, the response to radiative forcing, and the modification of large-scale atmospheric circulation.

The resolution and response in radiative forcing do not exhibit a clear correlation with the change in precipitation. The moisture flux convergence (MFC), by contrast, has a clear impact on extreme events. Specifically, the change in the dynamical term of the vertical MFC exhibits a major disagreement among the models and could strongly affect the confidence of the ensemble projection. Extreme precipitation is likely to increase over East Asia and India.

**Keywords:** Extreme precipitation, CMIP5, East Asia monsoon, RCP8.5, variability

## **1. Introduction**

The Asian subcontinent is one of the most densely populated areas on the Earth and is also one of the most vulnerable regions to environmental conditions. The region is subject to strong seasonal variations in precipitation, with wet summers and drier winters. The dynamics of the Asian summer monsoon have been reviewed in many papers [e.g., 1–5]. The monsoon provides necessary water for societal needs, but it can also be associated with extreme precipitation. This type of rainfall represents a dual threat to society. First, the rain falls in excess amounts over a short period, resulting in high runoff rather than recharging the groundwater supply. Second, extreme precipitation often leads to floods and, eventually, to landslides, building collapses, and casualties. Extreme precipitation in East Asia is often associated with typhoons, but it can also be triggered by other atmospheric conditions, such as the extreme rainfall event that occurred over China in the late spring and early summer of 2015. Heavy rainfall (above 60 mm day−1) first affected the southern provinces before moving toward the central and northern parts of the country. Overall, several hundred thousand people were affected, tens of thousands of houses were destroyed, several people died, and crops and roads were severely damaged.<sup>1</sup> These effects showcase the need for anticipating and planning for extreme hydrological events, including answering the following question: How will global climate change affect the characteristics of extreme rainfall?

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Various studies have investigated the projections of precipitation in a warming climate [6–19] and the modification of monsoon systems under such conditions [20–27]. The global and regional patterns of extreme rainfall changes have also been subject to research [28–36]. The Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) presents a summary of these studies; for example, Chapters 10.3.6 [37] and 11.4 [38] detail projections for Asia. However, one critical problem remains: To what extent can we trust the models? Phase 5 of the Coupled Model Intercomparison Project (CMIP5) provides a wide range of models, each with its own characteristics (e.g., parametrization and resolution). The study of an ensemble extracted from the CMIP5 is a good approach to investigating the uncertainties associated with the projection and to quantifying its confidence. In this study, 23 models are used, all run under the representative concentration pathway 8.5 (RCP8.5), to answer the following questions: How will precipitation characteristics change over East Asia by the end of the twenty-first century? How reliable are these projections? What can explain the scattering of the ensemble? Moreover, what will be the social and economic impacts?

Section 2 details the data and methodology used. Section 3 introduces a review of the performance of the CMIP5 models as well as a comparison against observations, followed by an analysis of the estimated ensemble projection and its reliability. Section 4 presents a more detailed discussion of the change in atmospheric characteristics associated with the change in extreme rainfall. Section 5 provides a brief discussion of possible societal impacts, and Section 6 offers a conclusion.

<sup>1</sup> Statistics from: floodlist.com/tag/china

## **2. Data and methodology**

## **2.1. Data**

To examine the projection and reliability of the CMIP5 ensemble, 23 models (**Table 1**) are used (with a single member for each). The models were first forced by historical conditions (observed aerosols, greenhouse gases, and solar irradiance) until 2005 and then followed the RCP8.5 pathway [39], leading to an increase in radiative forcing of 8.5 W m−2 by the end of 2100. The projection of the ensemble is analyzed by comparing the last 30 years of the century (2071–2100, called "RCP" hereafter) with 30 years of the historical period (1976–2005, called "HIST" hereafter). Extreme events are computed from the daily outputs, and the results are then averaged over the 30 years of each period.



| Model name | Institute and country | Resolution |
|---|---|---|
| IPSL-CM5B-LR | IPSL, France | 96 × 96 |
| MIROC5 | Atmosphere and Ocean Research Institute (AORI; The University of Tokyo), National Institute for Environmental Studies (NIES), and Japan Agency for Marine-Earth Science and Technology (JAMSTEC), Japan | 128 × 256 |
| MIROC-ESM-CHEM | JAMSTEC, AORI, and NIES, Japan | 64 × 128 |
| MPI-ESM-LR | Max Planck Institute for Meteorology (MPI-M), Germany | 96 × 192 |
| MRI-CGCM3 | Meteorological Research Institute, Japan | 160 × 320 |
| NorESM1-M | Norwegian Climate Centre, Norway | 96 × 144 |

**Table 1.** CMIP5 models used for this study. The resolution is indicated in grid points (latitude × longitude).

Asia covers a large area with different regional climatologies (**Figure 1**). The mean precipitation during the extended summer (May–August) season is computed from the daily (1997–2007) Global Precipitation Climatology Project (GPCP) observations [40]. The total domain is subdivided into three regions based on the different phases of the Asian summer monsoon: East Asian region (EAR: 22°–45°N, 105°–145°E), India region (IR: 5°–28°N, 70°–105°E), and the western North-Pacific region (WNPR: 5°–22°N, 105°–160°E). The results are averaged over each of the three subregions to compare the regional characteristics of the rainfall.
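Subsetting and averaging a gridded field over one of these boxes can be sketched as follows; the grid and precipitation values are synthetic, only the EAR bounds come from the text, and a simple unweighted mean is used (ignoring latitude weighting):

```python
import numpy as np

# Synthetic 1.5-degree grid spanning the full Asia domain (5-45N, 65-160E).
lat = np.arange(5.0, 45.1, 1.5)
lon = np.arange(65.0, 160.1, 1.5)
precip = np.random.default_rng(0).uniform(0.0, 15.0, size=(lat.size, lon.size))

# East Asian region (EAR) bounds from the text: 22-45N, 105-145E.
lat_mask = (lat >= 22.0) & (lat <= 45.0)
lon_mask = (lon >= 105.0) & (lon <= 145.0)
ear_mean = precip[np.ix_(lat_mask, lon_mask)].mean()  # unweighted box mean
```

A production analysis would weight grid cells by the cosine of latitude before averaging; the box selection logic is the same.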

**Figure 1.** Mean May–August precipitation (mm day−1) from GPCP [40] daily data (1997–2007), and an illustration of the three main domains used for this study: EAR (22°–45°N, 105°–145°E), IR (5°–28°N, 70°–105°E), and WNPR (5°–22°N, 105°–160°E).

## **2.2. Methodology**


The overall methodology of this study follows that used in [41]. To study the characteristics of precipitation, the probability density function (pdf) is computed to separate and highlight the types of rainfall. Both intensity and frequency are considered and are defined as follows.

## *2.2.1. Intensity*

Intensity is computed by separating the precipitation into percentiles at each grid point. Values are computed by steps of 10 between 1 and 90 and by steps of 1 between 90 and 99. Percentiles are computed either for the entire year (using all 30 years) or for each month (using only the precipitation from identical months over 30 years). The intensity of a percentile *X* is denoted as *pctX* and is expressed in mm day−1.
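The percentile ladder described above can be sketched as follows, using a synthetic 30-year daily rainfall series for a single grid point (the gamma distribution is only a stand-in for rainfall-like data):

```python
import numpy as np

# Synthetic 30-year daily rainfall series (mm/day) for one grid point.
rng = np.random.default_rng(0)
precip = rng.gamma(shape=0.6, scale=8.0, size=30 * 365)

# Steps of 10 up to the 90th percentile, then steps of 1 up to the 99th.
levels = np.concatenate([np.arange(10, 91, 10), np.arange(91, 100)])
intensity = {f"pct{p}": float(np.percentile(precip, p)) for p in levels}
```

Computing the percentiles per month, as the chapter also does, only changes which days are fed to `np.percentile`.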

## *2.2.2. Frequency*

Frequency indicates the occurrence of a certain range of precipitation. It is computed for each grid point by using a threshold based on the HIST intensity value of a yearly percentile (this threshold is therefore independent from the month) and averaged over a 10° × 10° box around this point. At each grid point, each day with rainfall higher than this threshold is counted as an event (for the given percentile). The HIST and RCP use the same threshold value for a percentile, and thus the same definition for light and heavy rainfall is used for both periods, which facilitates comparing the change in each percentile. The frequency is determined by averaging the count of events from the same months over 30 years. The frequency of a percentile *X* is denoted as *fqpctX* and is expressed as a number of days.
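A minimal sketch of this frequency count, with synthetic data and without the 10° × 10° box averaging: the threshold is fixed from the HIST period and reused for the RCP period, as described:

```python
import numpy as np

# Synthetic daily rainfall (mm/day) for two 30-year periods at one grid point.
rng = np.random.default_rng(0)
hist = rng.gamma(shape=0.6, scale=8.0, size=(30, 365))
rcp = hist * 1.15  # toy "future" period with uniformly intensified rainfall

# Threshold fixed from the HIST yearly 99th percentile and reused for RCP,
# so "heavy rainfall" means the same thing in both periods.
threshold = np.percentile(hist, 99)
fqpct99_hist = (hist > threshold).sum() / 30.0  # mean heavy days per year (HIST)
fqpct99_rcp = (rcp > threshold).sum() / 30.0    # mean heavy days per year (RCP)
```

Because the threshold is not recomputed for the future period, any intensification shows up directly as an increased count of exceedance days.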

The projection is expressed either as a difference between the two periods (RCP – HIST) or as a relative difference ([(RCP – HIST)/HIST] × 100). The former has a unit depending on the variable, whereas the relative difference is always expressed as a percentage. Each value was computed independently for every model on the respective grids. To plot the spatial distributions, the results were interpolated on a common grid (1.5° × 1.5°) to compute the ensemble mean and the associated ensemble standard deviation.
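The two forms of the projection can be sketched as follows, with hypothetical regional values:

```python
import numpy as np

# Hypothetical regional-mean 99th-percentile intensities (mm/day) for the
# EAR, IR, and WNPR boxes; not values from the study.
hist_pct99 = np.array([40.0, 55.0, 62.0])
rcp_pct99 = np.array([46.0, 63.8, 68.2])

diff = rcp_pct99 - hist_pct99                           # RCP - HIST (mm/day)
rel_diff = (rcp_pct99 - hist_pct99) / hist_pct99 * 100  # percent change
```

The relative form makes regions with very different climatological rainfall directly comparable, which is why both are reported.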

A change in intensity translates how each type of rainfall (light, medium, and heavy) could become more or less intense, while the change in frequency indicates how each type of rainfall could become more or less frequent.

## **3. Confidence in the distribution of precipitation and its projection**

## **3.1. A brief review of the historical CMIP5 ensemble performances**

First, the capability of the ensemble to simulate the precipitation and monsoon circulation over Asia is reviewed briefly. **Figure 2** shows the annual signal of precipitation in Asia for the 1976–2005 period (HIST) for the ensemble mean and the GPCP daily data (averaged over 1997–2007). The 99th percentile is also computed for each grid point over the entire period and then averaged for the Asia domain (5°–45°N, 65°–160°E). Another observational dataset is used for the 99th percentile (PERSIANN, [42]), which has a 0.25° resolution.

**Figure 2.** Annual signal of precipitation (mm day−1) in the Asia region (5°–45°N, 65°–160°E), EAR (22°–45°N, 105°–145°E), IR (5°–28°N, 70°–105°E), and WNPR (5°–22°N, 105°–160°E). The solid black line represents the GPCP observations, and the dashed red line indicates the CMIP5 ensemble mean. For the Asia region (left), on the top panel, the solid black and dashed red lines indicate the 99th percentile for the GPCP and CMIP5, respectively, and the orange triangles (circles) correspond to the mean value of the 10 models with the highest (lowest) resolution. The PERSIANN observational dataset is also plotted for the 99th percentile (solid green line). The signal is averaged over 1976–2005 for the CMIP5 ensemble and 1997–2007 for the observations.

The mean signal of the ensemble is relatively consistent with observations, and it can detect the seasonal signal for each region, although it tends to overestimate (underestimate) the precipitation over WNPR (EAR) during the summer. A significant delay in the onset of the monsoon over IR is also visible. The signal of extreme precipitation (99th percentile) is also adequately represented in a comparison between the CMIP5 mean and the GPCP. However, the signal intensity is stronger with a higher-resolution dataset (PERSIANN). This impact of the resolution could also affect the individual model results. Thus, for the 99th percentile, we separated the 10 models with the highest resolution (triangles) and the 10 models with the lowest resolution (circles). The effect of the resolution is clearly apparent: the highest-resolution models estimate a larger 99th-percentile intensity during the Asian summer monsoon (their triangles are closer to the observation line).

Regarding the spatial distribution (**Figure 3**) for the mean precipitation and wind circulation, biases are more readily apparent. In this figure, two periods are separated to highlight the monsoon evolution: May–June (MJ) and July–August (JA). Overall, the mean circulation is adequately represented by the CMIP5 ensemble, as are the rainfall patterns, although the intensity of precipitation exhibits a low bias most clearly over the Mei-Yu front in the EAR (during MJ) and a negative (positive) bias over the northern (southern) part of the IR, particularly during JA. The underestimated precipitation results from unrealistic topography in the models, as indicated in [43]. A positive bias is observed over the WNPR and convective area. Thus, even if certain biases exist, the CMIP5 ensemble can reproduce the mean monsoon signal.

**Figure 3.** Climatology of the precipitation (shading) from the GPCP observations and the wind from the NCEP reanalysis (Obs); the CMIP5 ensemble mean (CMIP5) and the difference between the CMIP5 and the Obs. Two periods are separated: May–June (MJ) and July–August (JA). Intensity is in mm day−1 and contours are plotted every 4 mm day−1 for the Obs and CMIP5 and every 2 mm day−1 for the CMIP5-Obs.

92 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts


Finally, the distribution of extreme precipitation over Asia is shown in **Figure 4**. As shown in this figure, the occurrence of extreme events during the MJJA (top panel) is computed using the same threshold (pct99) based on the GPCP observations. This means that the models and observations were subject to the same criteria for identifying what is considered an extreme occurrence. The number of extreme events was then computed for each model (on its own grid), and the results were averaged for two groups: high- (CMIP HR, projected on a 1.5° × 1.5° grid) and low-resolution models (CMIP LR, projected on a 3° × 3° grid).
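A minimal sketch of the "same criterion" counting step, in which models and observations are both measured against the observed pct99 threshold; the function name and the toy gamma rainfall are assumptions for illustration.

```python
import numpy as np

def extreme_days_per_month(precip, obs_pct99, days_per_month=30.4):
    """Occurrence of extremes (days per month): days whose rainfall exceeds
    the *observed* 99th-percentile threshold, so that models and observations
    are judged by the same criterion.
    precip: (time, lat, lon); obs_pct99: (lat, lon)."""
    exceed = (precip > obs_pct99).sum(axis=0)          # counts, broadcast over time
    return exceed * days_per_month / precip.shape[0]   # (lat, lon), day/month

rng = np.random.default_rng(1)
obs = rng.gamma(shape=0.5, scale=8.0, size=(3660, 6, 6))    # "observations"
model = rng.gamma(shape=0.5, scale=9.5, size=(3660, 6, 6))  # a wetter toy "model"
threshold = np.percentile(obs, 99, axis=0)
occurrence = extreme_days_per_month(model, threshold)
```

By construction, the observations exceed their own threshold on roughly 1% of days (~0.3 day per month); the wetter toy model exceeds it more often, which is exactly the kind of signal the top panel of Figure 4 displays.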

The location of extreme events is captured more accurately by the CMIP HR group, particularly where the topography is important (Himalayan plateau and Southeast Asia), but a stronger positive bias also occurs over the equatorial Pacific for this group. The CMIP LR can also represent the position of the main signals (west of India and the Bay of Bengal), but with less precision. This figure shows that the impact of resolution is critical for an accurate representation of extreme rainfall.

The tail end of the precipitation distribution is shown on the bottom panel for the individual models and the GPCP. The results are displayed for the three subregions. The resolution clearly affects the capacity of the model to produce more intense precipitation. Certain lower-resolution models (in blue) can also produce heavy rainfall; thus, the resolution is not always a limitation. Moreover, the ensemble is visibly more scattered over the WNPR and more consistent over the EAR. The confidence of the model ensemble can thus vary by region, which is a critical consideration when assessing the potential societal impacts in future climate projections.

**Figure 4.** Climatology of extreme precipitation occurrence during the MJJA (top panel, in day month−1) for the GPCP (Obs); 10 models each with the highest and lowest resolution (CMIP HR and CMIP LR, respectively). The bottom panel shows the tail end of the precipitation distribution computed from the GPCP (black) and from each model over each of the three subregions. Colors correspond to different resolutions, as indicated on the right-hand side of the panel.

These biases of the CMIP5 ensemble were examined in [44] and could significantly affect the reliability of the projection. Therefore, because of the systematic bias of the ensemble, the results of this study must be considered with a margin of error. If the bias is assumed constant between the RCP and the HIST, a study of the change between the two periods should not be strongly affected; the difference between the periods should cancel out the bias.

## **3.2. Projected change in the distribution of the precipitation and its confidence**

The ensemble projection of the precipitation characteristics for the end of the twenty-first century is investigated for each of the three subregions. The pdf of the precipitation is computed using a percentile method, and its projected change is plotted in **Figure 5**. The percentiles range from 1 (light precipitation) to 99 (extreme precipitation) and are grouped by tens between 1 and 90 and plotted individually between 90 and 99 so that the heaviest rainfall is highlighted in the figure. Both intensity and frequency are considered and plotted on separate panels, and each value is expressed as a relative change between the RCP and the HIST (in percentages). To investigate possible seasonal differences, the winter (December–March) and summer (May–August) periods are separated by color (blue for winter, and orange or red for summer). Finally, the signal of the dry days (defined as days with rainfall lower than the first percentile) is shown on the frequency plots before the 0 value.

**Figure 5.** Probability density function (pdf) of the relative change (in percentages) in precipitation (left) intensity and (right) frequency. Precipitation is divided into 10 bins (from 1 to 100), and the last 10% is also divided into 10 bins of 1% each. Results are categorized between the wet season (blue charts) and dry season (orange charts) of each region. The colored boxes show the 25th–75th ensemble quantile, and the bars indicate the 10th–90th ensemble quantile. The horizontal black bars inside the boxes show the 50th ensemble percentile. In the frequency plots, the first bin (before 0) represents the dry days. The color scale indicates the values that exceeded a certain confidence level (90 or 95%) as computed from a Student's *t* test.
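The percentile binning behind Figure 5 can be sketched as follows; this is a simplified stand-in (bins by tens up to 90, then singly to 99) with synthetic HIST and RCP samples, not the authors' code.

```python
import numpy as np

def pdf_relative_change(hist, rcp):
    """Relative change (%) in precipitation intensity per percentile bin,
    grouped by tens up to 90 and singly from 91 to 99 so that the heaviest
    rainfall is resolved."""
    pcts = sorted(set(range(10, 100, 10)) | set(range(91, 100)))
    h = np.percentile(hist, pcts)
    r = np.percentile(rcp, pcts)
    return dict(zip(pcts, 100.0 * (r - h) / h))

rng = np.random.default_rng(2)
hist = rng.gamma(shape=0.6, scale=7.0, size=50000)   # HIST daily intensities
rcp = rng.gamma(shape=0.6, scale=7.7, size=50000)    # RCP: ~10% scale increase
change = pdf_relative_change(hist, rcp)
```

With a pure 10% rescaling of the distribution, every bin shifts by roughly +10%; deviations from such a flat profile in Figure 5 therefore signal a genuine change in the shape of the distribution, not just its scale.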



One focus of this study is the reliability of the ensemble projection. Thus, for each bin, the 25th–75th ensemble quantile interval is indicated by a colored box, and the 10th–90th ensemble quantile interval is indicated by black bars. The black horizontal bar inside each box represents the 50th ensemble percentile. To highlight the most significant results, the values that are significant at the 90% or 95% confidence level based on a Student's *t*-test are plotted with darker colors.
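A hypothetical sketch of this per-bin ensemble summary: the box-and-bar percentiles plus a one-sample Student's *t*-test of the ensemble-mean change against zero. The 28-model Gaussian toy ensemble is an assumption for illustration.

```python
import numpy as np
from scipy import stats

def ensemble_box(changes, level=0.95):
    """Summary for one bin of Figure 5: the 10/25/50/75/90 ensemble
    percentiles, plus whether the ensemble-mean change differs from zero
    according to a one-sample Student's t-test at the given level."""
    box = np.percentile(changes, [10, 25, 50, 75, 90])
    _, p_value = stats.ttest_1samp(changes, popmean=0.0)
    return box, p_value < (1.0 - level)

rng = np.random.default_rng(3)
per_model_change = rng.normal(loc=15.0, scale=10.0, size=28)  # toy 28-model ensemble (%)
box, significant = ensemble_box(per_model_change)
```

With a +15% mean and 10% inter-model spread, the 10th ensemble percentile sits near zero while the test flags the mean change as significant, mirroring the "almost all models agree on the sign" situation described for the 99th percentile.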

The results for the winter season, which is also the driest, are overall below the 90% confidence level. In the EAR, a dry tendency is clearly visible, with a confident decrease in frequency and intensity for the medium precipitation. This behavior is also visible for the intensity in the IR, but with a lower confidence level.

During the wet season (summer), when extreme rainfall occurs, all ranges of precipitation increase in all subregions, particularly the heaviest rainfall (above the 90th percentile). For this type of precipitation, the confidence is higher (compared with medium to light precipitation), particularly for the EAR results. When examining the most extreme precipitation (the 99th percentile), almost all models agree on the tendency to increase (the 10th ensemble percentile being above or near 0), although the magnitude of the increase varies (from slightly more than 0 to 100%).

These results clearly indicate the threat that global warming could pose to the East Asian region. The most confident changes indicate an increase in extreme precipitation everywhere during the summer and a possible decrease in medium precipitation during the winter. The EAR region has the highest confidence levels (for both summer and winter), and thus the region will undeniably, at a minimum, be challenged by severe hydrological threats. At present, it is also clear that major uncertainties remain. The ensemble projections are scattered, particularly over the WNPR and IR regions, and uncertainties associated with ensemble projections over these regions were previously examined [45]. Many factors could affect ensemble scattering, such as parametrization schemes, the resolution, the response in radiative forcing, or the response in terms of atmospheric circulation. Most of these possibilities are explored in subsequent sections.

## **4. Intensification of the extremes during the summer monsoon: a change in atmosphere dynamics?**

In this section, potential factors that could explain the scattering of the ensemble projection are investigated.

## **4.1. Impact of the resolution and radiative forcing**

A part of the uncertainty could be due to the response of the surface temperature to the increase in radiative forcing in the models. Certain models revealed only a weak increase in temperature, which could explain the lower increase in (extreme) precipitation. To investigate this hypothesis, the changes in extreme precipitation characteristics versus the changes in surface temperature (averaged over the globe) in each model during the wet season are plotted in **Figure 6** (right-hand side). As mentioned in [41], the results do not exhibit a significant correlation between the response in surface temperature in the models and the change in extreme precipitation (intensity or frequency), although the strongest frequency increase is reached mostly by the models with a stronger increase in temperature.

Another factor that may affect the projection is the model resolution. Section 3.1 presented a significant relationship between the horizontal resolution and the capacity of the models to simulate heavy rainfall. When considering the change (left-hand side of **Figure 6**), however, the relationship between the resolution and the change in extreme events is nonsignificant. This result is also supported when examining the spatial distribution of the change in extreme occurrences for each model (**Figure 7**). Models of the same category (high or low resolution) can clearly produce conflicting signals. However, high-resolution models tend to have a stronger response, particularly over the equatorial oceanic region. A higher sensitivity in high-resolution models (particularly in convection) to changes in the environment at a small spatial scale may explain this stronger signal.

Extreme Precipitation Events over East Asia: Evaluating the CMIP5 Model http://dx.doi.org/10.5772/62996 97

**Figure 6.** Scatterplots of the relative change during the wet season (for each model) in extreme precipitation (y axis, in percent) intensity (top) and frequency (bottom) versus the change in surface temperature (right, °C) and the mean resolution of each model (left). Black crosses, green squares, and red triangles, respectively, indicate the EAR, IR, and WNPR.

**Figure 7.** Change in the occurrence of extreme rainfall (day month−1) for each model during the MJJA.

Based on these results, we conclude that the resolution and the response to radiative forcing are not the main factors affecting the scattering of the ensemble projection and thus its reliability. The following subsection emphasizes the change in the circulation characteristics and its impact on the rainfall projection.

## **4.2. Change in mean atmospheric circulation**

## *4.2.1. Monthly mean circulation*

The Asian summer monsoon is characterized by many active phases and short breaks, which can affect the precipitation characteristics. The results are summarized for two averaged periods: May–June (late spring) and July–August (summer). The late spring period corresponds with the onset of the global monsoon system and is also the main active phase for the Mei-Yu front system that develops over the EAR region. By contrast, the summer season corresponds with the main active phase of the monsoon over the IR and WNPR regions before decreasing in late summer.

The change in mean circulation in the low-level atmosphere (850 hPa) is displayed in **Figure 8** for the late spring and summer periods. The change in wind is indicated by the vectors, and the change in moisture flux (MF) is shown by the color shading. For the MF, only the values that exceeded a 90% confidence level with a Student's *t* test are shaded. During both phases, the MF and wind strengthen north of the IR region and along the east coast of Asia. The wind intensity also decreases south of the IR region, but the change in MF is nonsignificant in this area. The MF results are not statistically significant over oceanic areas, particularly for the WNPR region. Despite good agreement between the models regarding an increase in atmospheric moisture content (data not shown), the change in MF for these areas is unclear (less than the 90% confidence level), meaning that the main uncertainty in the MF is caused by wind variations.

**Figure 8.** Change in 850 hPa moisture flux (shading, g kg−1 m s−1) and winds (vectors, m s−1), averaged over MJ and JA. For the winds, the vectors are plotted only for a change larger than 0.5 m s−1, and red (blue) indicates positive (negative) changes. For the moisture flux, only values that exceeded a 90% confidence level with a two-tailed Student's *t* test are shaded.

The changes observed for low-level circulation appear to favor the triggering of heavy rainfall by increasing the moisture transport along South and East Asia. However, extreme events are typically associated with deep convection, affecting the entire troposphere. Thus, to gain an improved view of the change in atmospheric circulation, the change in MF convergence is computed for each layer of the atmosphere. Its correlation with the change in extreme rainfall is examined in the next section.

## *4.2.2. Daily horizontal moisture flux convergence*


The horizontal MF convergence (HMFC) is expressed here as the sum of the contributions of a convergence term, the HMFCt (Eq. 1), and an advection term, the HMFCd (Eq. 2). The total MF convergence is obtained by summing the first two equations (Eq. 3). In Equations 1–3, *q* is the specific humidity, and *u* and *v* represent the zonal and meridional components of the wind, respectively. The specific contribution of the surface flux is not examined here, but it is implicitly included in the atmospheric humidity content. The changes in the HMFC profile and its two terms during the summer are shown separately in **Figure 9** (averaged over Asia). The correlation between the change in each term and that in the extreme precipitation characteristics is also computed over each subregion (**Figure 10** and **Table 2**). The correlations are based on individual model results, and the changes in the HMFC and its two components are averaged over the 200–850 hPa atmospheric levels. Results with values exceeding the 99% confidence level are indicated in bold.

$$\text{HMFCt} = -q\left(\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y}\right) \tag{1}$$

$$\text{HMFCd} = -u\,\frac{\partial q}{\partial x} - v\,\frac{\partial q}{\partial y} \tag{2}$$

$$\text{HMFC} = \text{HMFCt} + \text{HMFCd} \tag{3}$$
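Eqs. (1)–(3) can be sketched numerically with centred finite differences; the regular-grid assumption, the field shapes, and the names below are illustrative rather than the study's actual diagnostics.

```python
import numpy as np

def hmfc_terms(q, u, v, dx, dy):
    """Eqs. (1)-(3) with centred finite differences on a regular grid.
    q: specific humidity (kg/kg), u, v: winds (m/s), all shaped (lat, lon);
    dx, dy: grid spacing in metres."""
    du_dx = np.gradient(u, dx, axis=1)
    dv_dy = np.gradient(v, dy, axis=0)
    dq_dx = np.gradient(q, dx, axis=1)
    dq_dy = np.gradient(q, dy, axis=0)
    hmfct = -q * (du_dx + dv_dy)              # convergence term, Eq. (1)
    hmfcd = -u * dq_dx - v * dq_dy            # advection term, Eq. (2)
    return hmfct, hmfcd, hmfct + hmfcd        # total, Eq. (3)

# toy field: uniform humidity in a convergent zonal flow
ny, nx, dx, dy = 5, 5, 1.0e5, 1.0e5
q = np.full((ny, nx), 0.01)
u = np.tile(-np.linspace(0.0, 4.0, nx), (ny, 1))   # du/dx = -1e-5 s^-1
v = np.zeros((ny, nx))
hmfct, hmfcd, hmfc = hmfc_terms(q, u, v, dx, dy)
```

A convergent flow (du/dx < 0) with uniform humidity yields a positive HMFCt (moisture convergence) and a zero advection term, which separates the two roles the decomposition is designed to distinguish.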

The change in the HMFC is not statistically significant (**Figure 9**). The model ensemble scattering is large in each region and for each HMFC component, so determining a clear tendency is difficult even when averaging the ensemble (red lines). Moreover, **Figure 10** and the correlation coefficients in **Table 2** show that no clear relationship exists between the change in the HMFC and either the intensity or frequency of extreme events. A significant correlation is found only for the change in the frequency of extremes over the WNPR (0.58 for the HMFC and HMFCd). Thus, because the scattering of the change in MF convergence over this region is large, it may also significantly affect the scattering of the change in extreme precipitation and thus the confidence of the projection. This does not apply to the other subregions.

**Figure 9.** Relative change (in percentages) of (top) the HMFC profile and its (middle) convergence and (bottom) advection parts. Each change is normalized according to its mean historical value of the total atmospheric column (850–100 hPa). Black lines indicate separate models, and the red line represents the ensemble mean. The vertical axis indicates the pressure levels (in hPa). The scale is the same for each panel.

**Figure 10.** Similar to Figure 6, except for the change in the HMFC, HMFCt, and HMFCd versus that in the intensity and frequency of extreme precipitation.


Based on these results, the changes in horizontal circulation and convergence have little effect on the extreme projections, except over the WNPR. The next section focuses on the change in vertical convergence.

**Table 2.** Linear correlation coefficients between the relative change in extreme precipitation (intensity and frequency) and in the HMFC (200–850 hPa) during the wet season (MJJA), separated into its convergence (HMFCt) and advection (HMFCd) parts. Bold text indicates values exceeding the 99% confidence level.
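The Table 2 entries are plain inter-model linear correlations; the following sketch shows how such a coefficient and its 99%-confidence ("bold") flag could be computed. The 28-model synthetic changes are fabricated for illustration only.

```python
import numpy as np
from scipy import stats

def intermodel_correlation(delta_circulation, delta_extreme, level=0.99):
    """Linear correlation across models (one value per model), as in Table 2,
    plus whether it clears the stated confidence level (the bold entries)."""
    r, p_value = stats.pearsonr(delta_circulation, delta_extreme)
    return r, p_value < (1.0 - level)

# toy 28-model ensemble with a built-in positive relationship
rng = np.random.default_rng(4)
d_hmfc = rng.normal(0.0, 1.0, size=28)
d_freq = 0.6 * d_hmfc + rng.normal(0.0, 0.5, size=28)
r, is_bold = intermodel_correlation(d_hmfc, d_freq)
```

With 28 models, a correlation near 0.6 (comparable to the 0.58 reported for the WNPR frequency) comfortably clears the 99% level, whereas weak correlations do not.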

## **4.3. Change in vertical circulation**


This section has the same structure as Section 4.2.2 but discusses the vertical convergence. The vertical moisture flux convergence (VMFC) is defined in Eq. 4, with ω representing the vertical velocity, ∂q/∂p the vertical gradient of the specific humidity, and the braces indicating vertical integration. This is the dominant term of the column-integrated moisture budget, particularly for heavy precipitation. The change in the VMFC is separated between the contributions of the thermodynamic term (*q*) and the dynamic term (ω).

$$\text{VMFC} = \left\{ \omega\,\frac{\partial q}{\partial p} \right\} \tag{4}$$
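A numerical stand-in for Eq. (4), assuming the braces denote a mass-weighted (1/g) pressure integral; the three-level toy profile and function name are illustrative assumptions.

```python
import numpy as np

def vmfc_column(omega, q, p_levels):
    """Column value of Eq. (4): omega times the vertical gradient of the
    specific humidity, integrated over pressure with a 1/g mass weighting
    (a simple stand-in for the braces notation).
    omega: Pa/s, q: kg/kg, p_levels: Pa, all 1-D from column top to surface."""
    g = 9.81                                     # m s^-2
    dq_dp = np.gradient(q, p_levels)             # (kg/kg) per Pa
    integrand = omega * dq_dp
    # trapezoidal integration over pressure
    return 0.5 * float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(p_levels))) / g

# toy profile: uniform ascent through a column moistening toward the surface
p = np.array([20000.0, 50000.0, 85000.0])        # Pa
q = np.array([0.0005, 0.004, 0.012])             # kg/kg, increasing toward the surface
omega = np.full(3, -0.1)                         # Pa/s (negative = ascent)
value = vmfc_column(omega, q, p)
```

Perturbing `q` (with `omega` held fixed) versus `omega` (with `q` held fixed) in such a sketch is the essence of the thermodynamic/dynamic split into VMFCt and VMFCd discussed below.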

The change in the VMFC profile and its two terms during the summer are shown in **Figure 11** (averaged over Asia). The correlation between the change in the VMFC and that in extreme events is displayed in **Figure 12**; the coefficients are listed in **Table 3**.

In contrast with the HMFC, the change in the VMFC is more apparent (**Figure 11**), although the scattering of the ensemble remains large, particularly for the IR and WNPR. The mean VMFC increases in each region (20–40%), particularly in the mid-troposphere. The VMFCt exhibits the same behavior, with a high confidence level (because of the expected increase in moisture in a warmer climate) and a higher magnitude in the low levels (40%). By contrast, the VMFCd is less certain, with a larger scattering and a mean closer to 0%. The change in the VMFCd tends to be negative and to oppose the positive change in the VMFCt. In summary, vertical convergence is expected to increase over each region, mostly because of the change in atmospheric moisture content, whereas the change in dynamics tends to increase the scattering of the ensemble (i.e., the uncertainties).

**Figure 11.** Similar to Figure 9 but for the VMFC, VMFCt, and VMFCd.

**Figure 12.** Similar to Figure 10 but for the VMFC, VMFCt, and VMFCd.

The correlations are also more apparent in **Figure 12** and **Table 3**, particularly for the total change in the VMFC. The changes in extreme precipitation characteristics are strongly correlated with the change in the VMFC (0.50–0.89). When considered separately, the correlations between the individual terms (VMFCt and VMFCd) and the changes in extreme precipitation become less clear. This is particularly applicable to the EAR, where significant correlations are found only for the total VMFC. The VMFCt significantly affects the WNPR in both frequency (0.75) and intensity (0.61), and the VMFCd is correlated with the frequency of extreme precipitation over the IR (0.63). These results clearly reveal that the change in extreme precipitation, in both frequency and intensity, is strongly affected by the change in the VMFC, mostly when the thermodynamic and dynamic terms are considered together.


**Table 3.** Identical to **Table 2**, except for the vertical moisture flux convergence (VMFC) and its thermodynamical (VMFCt) and dynamical (VMFCd) components. Bold text indicates values exceeding the 99% confidence level.

As mentioned in [41], these results may at first seem contrary to expectations: one might anticipate the increase in atmospheric humidity content to be the dominant factor explaining the change in extreme precipitation. The present research shows that the dynamical part of the VMFC can also significantly influence the intensity and frequency of these events. Furthermore, because its change is less apparent (**Figure 11**), it reduces confidence in projecting extremes (by increasing the scattering of the ensemble). By contrast, the change in the VMFCt is clearer and tends to increase confidence in projecting extremes.

## **4.4. Summary**


102 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts


In a warming climate, the specific humidity of the atmosphere is typically expected to increase. The variation in monsoon areas and precipitation has been linked to a rise in moisture convergence, offset by the change in circulation [15]. The earlier sections showed that the VMFC exerts a potentially high level of influence on the trends of extreme events, whereas the correlation between the change in the mean circulation and the HMFC is less clear. However, the influence of the VMFC varies significantly between regions, which is consistent with findings in [46], showing that the dynamic contribution induces spatial variations in the changes. The confidence of ensemble-projected change in extreme precipitation may clearly be influenced by the low level of agreement in the model dynamics, particularly in the IR and WNPR. A direct correlation was not found between the resolution of the models or their response to radiative forcing and the change in extreme events, at least with the ensembles used for this study.

## **5. Extreme precipitation projections and societal impacts**

These findings highlight the importance of regional studies when examining long-term projections of precipitation characteristics. Because of the different responses of the models to radiative forcing (in atmospheric circulation), the change in precipitation can be uncertain. However, the clearest and strongest tendencies are observed for heavy and extreme precipitation, which is also the most critical as a threat to society (although not for water resources, which is another subject of concern). Moreover, the regions in which the changes in these events are most significant are also the most densely populated: East Asia and South Asia. Over these two regions, heavy rainfall is expected to increase in frequency and intensity with good confidence.

Considering past events (such as the floods of 2011, 2013, or 2015 that affected South Asia and China) and their cost, extreme hydrological events represent a severe threat to the populations living in these regions [47, 48]. It is most likely that the loss associated with these events (in terms of economy, infrastructure, land, and casualties) will increase by the end of the century. However, even if an increase in heavy rainfall cannot be avoided, its impact can be limited [49]. For instance, because deforestation is a major cause of landslides under heavy rainfall [50], improved management over forests could locally limit the impact of heavy rainfall [51]. In urban areas, drainage systems and infiltration surfaces can be improved to convey more surface water, as indicated in [52], which compared measurements conducted in various cities. Thus, if measures are considered and implemented sufficiently early, it is possible to limit the impact of heavy rainfall and its associated costs.

## **6. Discussion and concluding remarks**

The change in extreme precipitation under global warming conditions will remain a major subject of research for the next few decades. Extreme precipitation events can drastically affect society, entailing severe economic, infrastructure, and human loss. The use of a model ensemble is a beneficial approach to highlighting regions where a change in these events is clear or, alternatively, regions where the level of agreement is lower. The frequency of extreme precipitation and the confidence level of the projected changes are critical considerations for decision makers in developing prevention and risk-management measures.

This study used 23 models from the CMIP5. Certain projections of precipitation characteristics can be clearly identified as confident, whereas other results remain uncertain (**Figure 5**). The selected models revealed a high confidence level in indicating that the East Asian region will be highly vulnerable to global warming, exhibiting more intense and more frequent extreme precipitation. The Indian and South Asian regions exhibit the same tendency but with less confidence. The results in the western North Pacific region are less certain, particularly regarding the intensity of extremes. These findings are in agreement with those presented in [41], even if the ensemble used was different. Changes in the precipitation characteristics are likely to affect all Asian regions, but not with the same magnitude.

Different factors that can affect the scattering of the ensemble projection were studied through correlation analysis. The resolution of the models does not display a clear impact, although we found that it can significantly affect the capacity of the models to produce heavy rainfall (**Figure 4**). The response of the models to radiative forcing did not exhibit a clear relationship with the change of extremes (**Figure 6**). The most significant findings were obtained by considering daily moisture convergence over the entire atmosphere (Figures 9–12 and Tables 2 and 3). The vertically integrated moisture convergence can strongly influence the projection of extreme events. Specifically, the models did not agree on the change in its dynamical term (VMFCd), which may explain the uncertainties associated with the projection. Thus, even if the increase in atmospheric moisture content is expected to strengthen the precipitation, local variations in the circulation will strongly affect changes in the characteristics of extreme events.

The systematic bias of the CMIP5 models mentioned in Section 3, their limited ability to simulate tropical cyclones correctly (with a resolution coarser than 1°), and the definition of the MF adopted herein could also lead to a biased estimation of the correlations computed for this study. Thus, the results should be considered indicators of what can affect the projection of extreme precipitation and ensemble uncertainties. However, other factors could also influence the response of the models and should be investigated in future studies. Nevertheless, most of the results indicate a confident increase in extreme precipitation over East Asia. The societal and economic impacts are expected to be massive, and anticipation will be critical to limiting the damage of such changes in hydrological events.

## **Acknowledgements**


This work was supported by the Consortium for Climate Change Study (CCliCS) - Ministry of Science and Technology (MOST), Taiwan, under Grant MOST 100-2119-M-001-029-MY5. We thank the reviewers and editor for comments that greatly improved the manuscript.

## **Author details**

Nicolas Freychet\*, Huang-Hsiung Hsu and Chi-Hua Wu

\*Address all correspondence to: nfreychet@gate.sinica.edu.tw

Research Center for Environmental Changes, Taipei, Taiwan

## **References**


[1] Ramage CS. Monsoon Meteorology. International Geophysical Series, Vol. 15. San Diego: Academic Press; 1971. 296 p.

[2] Ding Y. Monsoon over China. Dordrecht: Kluwer Academic; 1994.

[3] Chang C-P, Wang B, Lau N-CG, editors. The Global Monsoon System: Research and Forecast. WMO/TD No. 1266 (TMRP Rep. 70); 2005. 542 p.

[4] Wang B. The Asian Monsoon. Chichester, UK: Praxis Publishing; 2006. 787 p.

[5] Ding Y. The variability of the Asian summer monsoon. J Meteor Soc Japan. 2007;85B:21-54. DOI: 10.2151/jmsj.85B.21

[6] Chou C, Neelin JD. Mechanisms of global warming impacts on regional tropical precipitation. J Climate. 2004;17:2688-2701. DOI: 10.1175/1520-0442(2004)017<2688:MOGWIO>2.0.CO;2

[7] Min S-K, Legutke S, Hense A, Cubasch U, Kwon W-T, Oh J-H, Schlese U. East Asian climate change in the 21st century as simulated by the coupled climate model ECHO-G under IPCC SRES scenarios. J Meteor Soc Japan. 2006;84:1-26. DOI: 10.2151/jmsj.84.1

[8] Kripalani RH, Oh J-H, Chaudhari HS. Response of the East Asian summer monsoon to doubled atmospheric CO2: coupled climate model simulations and projections under IPCC AR4. Theor Appl Climatol. 2007;87:1-28. DOI: 10.1007/s00704-006-0238-4

[9] Stephens GL, Ellis TD. Controls of global-mean precipitation increases in global warming GCM experiments. J Climate. 2008;21:6141-6155. DOI: 10.1175/2008JCLI2144.1

[10] Chou C, Neelin JD, Chen C-A, Tu J-Y. Evaluating the "rich-get-richer" mechanism in tropical precipitation change under global warming. J Climate. 2009;22:1982-2005. DOI: 10.1175/2008JCLI2471.1

[11] Wu C-H, Kau W-S, Chou M-D. Summer monsoon onset in the subtropical western North Pacific. Geophys Res Lett. 2009;36:L18810. DOI: 10.1029/2009GL040168

[12] Seager R, Naik N, Vecchi GA. Thermodynamic and dynamic mechanisms for large-scale changes in the hydrological cycle in response to global warming. J Climate. 2010;23:4651-4668. DOI: 10.1175/2010JCLI3655.1

[13] Chou C, Chen C-A, Tan P-H, Chen K-T. Mechanisms for global warming impacts on precipitation frequency and intensity. J Climate. 2012;25:3291-3306. DOI: 10.1175/JCLI-D-11-00239.1

[14] Kusunoki S, Arakawa O. Change in the precipitation intensity of the East Asian summer monsoon projected by CMIP3 models. Climate Dyn. 2012;38:2055-2072. DOI: 10.1007/s00382-011-1234-7

[28] Trenberth KE, Dai A, Rasmussen R, Parsons D. The changing character of precipitation. Bull Am Meteor Soc. 2003;84:1205-1217. DOI: 10.1175/BAMS-84-9-1205

[29] Kharin VV, Zwiers FW. Estimating extremes in transient climate change simulations. J Climate. 2005;18:1156-1173. DOI: 10.1175/JCLI3320.1

[30] Meehl GA, Arblaster JM, Tebaldi C. Understanding future patterns of increased precipitation intensity in climate model simulations. Geophys Res Lett. 2005;32:L18719. DOI: 10.1029/2005GL023680

[31] Räisänen J. Impact of increasing CO2 on monthly-to-annual precipitation extremes: analysis of the CMIP2 experiments. Climate Dyn. 2005;24:309-323. DOI: 10.1007/s00382-004-0510-1

[32] Barnett DN, Brown SJ, Murphy JM, Sexton DMH, Webb MJ. Quantifying uncertainty in changes in extreme event frequency in response to doubled CO2 using a large ensemble of GCM simulations. Climate Dyn. 2006;26:489-511. DOI: 10.1007/s00382-005-0097-1

[33] Tebaldi C, Hayhoe K, Arblaster JM, Meehl GA. Going to the extremes. Climatic Change. 2006;79:185-211. DOI: 10.1007/s10584-006-9051-4

[34] Giorgi F, Im E-S, Coppola E, Diffenbaugh NS, Gao XJ, Mariotti L, Shi Y. Higher hydroclimatic intensity with global warming. J Climate. 2011;24:5309-5324. DOI: 10.1175/2011JCLI3979.1

[35] Shiu C-J, Liu SC, Fu C, Dai A, Sun Y. How much do precipitation extremes change in a warming climate? Geophys Res Lett. 2012;39:L17707. DOI: 10.1029/2012GL052762

[36] Scoccimarro E, Gualdi S, Bellucci A, Zampieri M, Navarra A. Heavy precipitation events in a warmer climate: results from CMIP5 models. J Climate. 2013;26:7902-7911. DOI: 10.1175/JCLI-D-12-00850.1

[37] Meehl GA, et al. Global climate projections. In: Solomon S, et al., editors. Climate Change 2007: The Physical Science Basis. Cambridge, UK: Cambridge University Press; 2007. pp. 747-845.

[38] Christensen JH, et al. Regional climate projections. In: Solomon S, et al., editors. Climate Change 2007: The Physical Science Basis. Cambridge, UK: Cambridge University Press; 2007. pp. 847-940.

[39] Riahi K, Gruebler A, Nakicenovic N. Scenarios of long-term socio-economic and environmental development under climate stabilization. Technol Forecast Soc Change. 2007;74:887-935. DOI: 10.1016/j.techfore.2006.05.026

[40] Huffman GJ, Adler RF, Morrissey MM, Bolvin DT, Curtis S, Joyce R, McGavock B, Susskind J. Global precipitation at one-degree daily resolution from multisatellite observations. J Hydrometeor. 2001;2:36-50. DOI: 10.1175/1525-7541(2001)002<0036:GPAODD>2.0.CO;2

[41] Freychet N, Hsu H-H, Chou C, Wu C-H. Asian summer monsoon in the CMIP5 projections: a link between the change in extreme precipitation and monsoon dynamics. J Climate. 2015;28:1477-1493.


## **Computational Tools for the Simulation of Atmospheric Pollution Transport During a Severe Wind Event in Argentina**

César Augusto Aguirre and Armando Benito Brizuela

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/63552

## **Abstract**

This chapter details the theoretical aspects of numerical methods for the simulation of atmospheric phenomena, such as severe thunderstorms and the turbulent transport of dangerous gases and solid particles in the atmospheric boundary layer. Numerical methods are included in computational algorithms that resolve the large turbulent scales using large eddy simulation (LES) techniques to obtain acceptable results for turbulent flows; microphysical processes involving evaporation, condensation and precipitation, however, are parameterized. These atmospheric processes are simulated using the Advanced Regional Prediction System (ARPS) code. The atmospheric transport of pollutants, in contrast, is simulated using the ARPS code coupled with a Lagrangian stochastic one-particle method. The theoretical details of this coupling are presented. We then compare the simulation tool against laboratory experiments of plume dispersion from gaseous sources, obtaining good agreement for the gas concentrations on the stream-wise vertical plane and over the ground. Finally, we present a simulation of a pollution event of copper solid particles at San Miguel de Tucumán city, Argentina. The geographical distributions of copper particle concentrations are in good agreement with the measurements carried out experimentally.

**Keywords:** large eddy simulation, solid particle dispersion, smelter, severe wind events, Argentina

## **1. Introduction**

In the last few years, diverse governmental and civic organizations have expressed the need to evaluate, reduce and legislate atmospheric emanations from factories, industries and motor

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

vehicles so as to mitigate respiratory illnesses in residents living in large cities close to industrial zones. One of the tools in use to evaluate the adverse effects of industrial installations at the local level is the simulation of air pollution episodes under low, average and extreme operating conditions. Advances in computing hardware and software have made it possible to implement pollution simulation codes that solve the balance equations of fluid mechanics in conjunction with chemical reaction models. These codes aim to estimate the concentration values and spatial distribution of chemical species dispersed in the atmosphere under different weather conditions. However, computer power still limits the large number of calculations needed to resolve turbulent flows at the high Reynolds numbers (ℜe) normally present in the atmospheric boundary layer. By using large eddy simulation (LES), the instantaneous evolution of large turbulent structures can be computed with the balance equations of fluid mechanics. This technique of discretization in space and time of turbulent transport phenomena has been developed from the early work of Deardorff [1, 2] and Schumann [3].

The large-scale resolution by LES provides a three-dimensional description of the wind field with a spatial resolution limited by the size of the cells into which the computational domain is subdivided. While no major complications arise from this limitation in many cases, certain meso- and macroscale applications require a small grid-cell size for computing molecular diffusion and chemical reactions at a high ℜe number, such as near pollution emission sources. One option that avoids an excessive reduction of the grid-cell size is to consider fluid particles that carry the concentrations of chemical species (e.g. CO2, CO, N2O, NO, O3). This approach requires a model that obtains the trajectories of these particles by computing, at each time step of the simulation, their position and velocity. Such models are known as *Lagrangian models* (LM). When the fluid dynamics are described by LES, the particles are driven by the motion of the large scales of turbulence. Some authors add to the movement induced by LES a random motion component that simulates the behaviour of the smaller scales according to *Brownian* motion [4–12]. Models based on this technique are called *Lagrangian stochastic models* (STOs).
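A minimal one-particle Lagrangian stochastic step of the kind cited above can be sketched as an Ornstein-Uhlenbeck update for the sub-grid velocity, added to the LES-resolved motion. This 1-D sketch is not the coupled scheme used with ARPS; the Lagrangian time scale `t_l` and the sub-grid velocity variance `sigma2` are assumed inputs:

```python
import numpy as np

def langevin_step(x, u_sgs, u_les, dt, t_l, sigma2, rng):
    """Advance particle positions one time step: resolved LES velocity
    plus an Ornstein-Uhlenbeck sub-grid fluctuation (Brownian forcing).
    du' = -(u'/T_L) dt + sqrt(2*sigma2/T_L) dW, then dx = (U + u') dt."""
    dW = rng.normal(0.0, np.sqrt(dt), size=np.shape(u_sgs))
    u_sgs = u_sgs + (-u_sgs / t_l) * dt + np.sqrt(2.0 * sigma2 / t_l) * dW
    x = x + (u_les + u_sgs) * dt
    return x, u_sgs
```

With `sigma2 = 0` the random forcing vanishes and the particles simply follow the resolved LES velocity, which is a convenient sanity check on the scheme.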

This chapter describes the assumptions and theoretical considerations of the coupling between these STO models and LES. Next, simulation results are compared with the CO2 measurements of Gong [13] for gas emitted from an upwind source. Finally, model results are compared with the copper concentration measurements made by Fernández-Turiel [14] in the town of Lastenia, Tucumán province, Argentina, for pollution from a smelter during a severe wind event.

## **2. Numerical methods**

## **2.1. The large eddy simulation (LES)**

In a turbulent fluid flow, many eddies of different sizes are induced. A technique for obtaining the time-space evolution of the most important eddies, based on the energy they carry, is known as large eddy simulation (LES). The LES technique is an important tool for the simulation of wind turbulence in the atmosphere because it yields a three-dimensional description of the wind field and its temporal evolution. The code used for LES is ARPS (version 5.2.12), a mesoscale model of the non-hydrostatic and fully compressible type, developed by the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma (USA). The model numerically integrates the time-dependent equations of mass, momentum and energy balance for the largest turbulent scales. It not only simulates the wind field but also has sub-models of heat and vapour flow, cloud formation and rainfall. For this, the orography and land cover are considered, as well as the initial conditions of the ground and the atmospheric boundary layer.


The continuity, momentum and energy equations are solved using a centred finite-difference scheme on an *Arakawa C-grid*. A fully three-dimensional curvilinear coordinate system is used in which the horizontal grid size is constant, but the vertical coordinate follows the terrain elevation, and a stretching in size is applied to obtain more accuracy near the ground. The atmospheric model takes into account the compressibility of the flow. The numerical scheme used to solve the differential equations is a fourth-order centred explicit scheme, while the equations for pressure and for the vertical component of the air velocity are integrated with an implicit *Crank-Nicolson* scheme. The prognostic variables of the model are the Cartesian (0xyz) wind components *u, v, w* and the scalars potential temperature *θ*, pressure *p*, air density *ρ*, mixing ratio of water vapour *qv*, cloud water *qc*, ice *qi*, rainwater *qr*, snow *qs* and hail *qh*. Initially, the states of these variables are split according to the Reynolds decomposition (1) into a base state *ā* and a perturbation *a'*.
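The vertical grid stretching just described can be sketched with a simple power-law mapping (an illustration only; ARPS provides its own stretching functions, and the exponent here is an arbitrary choice):

```python
import numpy as np

def stretched_levels(nz, ztop, p=1.8):
    """Vertical level heights clustered near the surface: a uniform
    computational coordinate s in [0, 1] is mapped to z = ztop * s**p,
    so the spacing grows monotonically with height when p > 1."""
    s = np.linspace(0.0, 1.0, nz + 1)
    return ztop * s**p
```

For `nz = 40` levels up to 20 km, the first layer is a few metres thick while the topmost layers span more than a kilometre, concentrating resolution where the boundary-layer gradients are strongest.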

$$\begin{cases} u(x,y,z,t) = \overline{u}(z) + u'(x,y,z,t) \\ v(x,y,z,t) = \overline{v}(z) + v'(x,y,z,t) \\ w(x,y,z,t) = w'(x,y,z,t) \\ \theta(x,y,z,t) = \overline{\theta}(z) + \theta'(x,y,z,t) \\ p(x,y,z,t) = \overline{p}(z) + p'(x,y,z,t) \\ \rho(x,y,z,t) = \overline{\rho}(z) + \rho'(x,y,z,t) \\ q(x,y,z,t) = \overline{q}(z) + q'(x,y,z,t) \end{cases} \tag{1}$$

In this decomposition, the base state is assumed to be horizontally homogeneous; for this reason, the vertical component of the base-state wind velocity is zero. Further, the base state is time invariant and hydrostatically balanced, so the perturbation is integrated numerically at every time step from the filtered continuity equation, the filtered momentum equation for the wind velocity and the filtered equation for the scalars *ψ*. The filtering operation is carried out to obtain the large scales of the turbulent flow. It involves the application of a convolution spatial filter *G*(Δ*xi*), a low-pass filter with ∫*G*(Δ*xi*)*dxi* = 1, where Δ*xi* is the size of the grid elements of the computational spatial domain. Applying this filter to the wind velocity field gives the large-scale part of this variable:

$$u_i^{\oplus} = \int_D u_i\, G(\Delta x_i)\, dx_i \tag{2}$$

where *D* is the entire spatial domain. The non-resolved scales are defined as $u_i^{-} = u_i - u_i^{\oplus}$.
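In discrete form, the filter G of Eq. (2) is often taken as a top-hat kernel a few grid cells wide. A 1-D sketch with a normalized box kernel (so the discrete analogue of ∫G(Δxi)dxi = 1 holds), using zero padding at the domain edges:

```python
import numpy as np

def box_filter(u, n):
    """Top-hat filter of width n cells: discrete analogue of Eq. (2)
    with G = 1/n inside the stencil and zero outside (1-D, zero-padded)."""
    kernel = np.ones(n) / n              # normalized: kernel.sum() == 1
    return np.convolve(u, kernel, mode="same")

# the non-resolved (sub-filter) part is then u - box_filter(u, n)
```

Away from the zero-padded edges, a constant field passes through the filter unchanged, and the sub-filter residual vanishes, as the decomposition requires.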

Filtered continuity (3), filtered momentum of fluid velocity (4) and filtered momentum of scalars (5) are described as

$$\frac{\partial \tilde{u}_i^{\oplus}}{\partial x_i} = 0 \tag{3}$$

$$\frac{\partial \tilde{u}_i^{\oplus}}{\partial t} + \frac{\partial \left(\tilde{u}_i^{\oplus} u_j^{\oplus}\right)}{\partial x_j} = \overline{\rho}\, g_i B^{\oplus} - \frac{\partial p^{\oplus}}{\partial x_i} - \frac{\partial \tilde{\tau}_{ij}}{\partial x_j} + 2\nu \frac{\partial \tilde{S}_{ij}^{a\oplus}}{\partial x_j} \tag{4}$$

$$\frac{\partial \tilde{\psi}^{\oplus}}{\partial t} + \frac{\partial \left(\tilde{u}_i^{\oplus} \psi^{\oplus}\right)}{\partial x_i} = \Phi_{\psi} - \frac{\partial \tilde{\tau}_{i\psi}}{\partial x_i} \tag{5}$$

The *Leonard identity* [15] is applied to the nonlinear terms of Eqs. (4) and (5) as follows:

$$\begin{aligned} \left(\tilde{u}_i u_j\right)^{\oplus} &= \left(\tilde{u}_i^{\oplus} u_j^{\oplus}\right)^{\oplus} + \left(\tilde{u}_i^{\oplus} u_j^{-} + \tilde{u}_i^{-} u_j^{\oplus}\right)^{\oplus} + \left(\tilde{u}_i^{-} u_j^{-}\right)^{\oplus} \\ &= \left(\tilde{u}_i^{\oplus} u_j^{\oplus}\right) + \tilde{L}_{ij} + \tilde{C}_{ij} + \tilde{R}_{ij} \\ &= \left(\tilde{u}_i^{\oplus} u_j^{\oplus}\right) + \tilde{\tau}_{ij} \end{aligned} \tag{6}$$

$$\begin{aligned} \left(\tilde{u}_i \psi\right)^{\oplus} &= \left(\tilde{u}_i^{\oplus} \psi^{\oplus}\right)^{\oplus} + \left(\tilde{u}_i^{\oplus} \psi^{-} + \tilde{u}_i^{-} \psi^{\oplus}\right)^{\oplus} + \left(\tilde{u}_i^{-} \psi^{-}\right)^{\oplus} \\ &= \left(\tilde{u}_i^{\oplus} \psi^{\oplus}\right) + \tilde{\tau}_{i\psi} \end{aligned} \tag{7}$$

In Eq. (4), *B* is a buoyancy force, $S_{ij}^{a}$ is the anisotropic deformation tensor and *ν* is the molecular viscosity. In Eq. (5), $\Phi_{\psi}$ is the sink and source term of the scalar variable *ψ*. Variables with a tilde have been weighted by the base-state density, $\tilde{u}_i^{\oplus} = \overline{\rho}(z)\, u_i^{\oplus}$. The pressure equation is obtained by taking the material derivative of the state equation for moist air and replacing the time derivative of density by the velocity divergence using the continuity equation. The correlation terms containing unsolved scales, $\tilde{\tau}_{ij}$ and $\tilde{\tau}_{i\psi}$, are modelled using the *Smagorinsky formulation* [16, 17] with a dynamic scheme [18].

## **2.2. The dynamic Smagorinsky model**

The unsolved scales can be viewed as a sub-grid scale viscosity due to the effect of small vortices whose size is less than Δ. The effect of the small scales is to transfer kinetic energy from the large scales towards dissipation, following the theoretical energy cascade of *Kolmogorov*; this behaviour can therefore be modelled as a sub-grid stress (SGS) tensor $\tilde{\tau}\_{ij}$, $\tilde{\tau}\_{i\psi}$. Furthermore, while large-scale motions are strongly dependent on the external flow conditions, small-scale motions are expected to behave more universally. Hence, the intention is that the numerical modelling remains feasible and/or requires few adjustments when applied to various flows [19].

Eddy viscosity models parameterize the SGS stress tensor by relating it to the resolved scales through the deformation tensor $S\_{ij}^{\oplus}$:

$$
\tau\_{ij}^{a} = -2\nu\_T S\_{ij}^{\oplus} \tag{8}
$$

where $\tau\_{ij}^{a} = \tau\_{ij} - \frac{1}{3}\delta\_{ij}\tau\_{kk}$ is the anisotropic part of the SGS stress tensor, and $\nu\_T$ denotes the SGS viscosity due to the small scales.

The *Smagorinsky model* [16] assumes an equilibrium state in which the small scales dissipate entirely and instantaneously the kinetic energy they receive from the large scales [20]. Following this assumption,

$$\nu\_T = C\Delta^2 \left| S^{\oplus} \right| \tag{9}$$

with $|S^{\oplus}| = \left(2 S\_{ij}^{\oplus} S\_{ij}^{\oplus}\right)^{1/2}$ and $C\_s = \sqrt{C}$ known as the *Smagorinsky coefficient*. The $C\_s$ coefficient takes values between 0.18 and 0.23 depending on the kinetic energy, the Reynolds number, the proximity of solid boundaries and the time-space decay of the energy.
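As an illustration of Eq. (9), the sketch below computes $\nu\_T$ from a resolved velocity field on a uniform grid. It is a minimal example, not the ARPS implementation: the function name, the centred-difference gradients and the choice $C\_s = 0.18$ are assumptions made here.

```python
import numpy as np

def smagorinsky_viscosity(u, v, w, dx, cs=0.18):
    """Sub-grid eddy viscosity nu_T = (Cs*Delta)**2 * |S| (Eq. 9).

    u, v, w : 3-D arrays of resolved velocity components on a uniform grid.
    dx      : grid spacing, taken equal to the filter size Delta.
    """
    # Velocity gradients d(u_i)/d(x_j) by centred finite differences.
    grads = np.empty((3, 3) + u.shape)
    for i, comp in enumerate((u, v, w)):
        for j in range(3):
            grads[i, j] = np.gradient(comp, dx, axis=j)
    # Strain-rate tensor S_ij = 0.5 * (du_i/dx_j + du_j/dx_i).
    s = 0.5 * (grads + grads.transpose(1, 0, 2, 3, 4))
    # |S| = sqrt(2 * S_ij S_ij), contracting both tensor indices.
    s_norm = np.sqrt(2.0 * np.einsum('ij...,ij...->...', s, s))
    return (cs * dx) ** 2 * s_norm
```

For a uniform shear flow $u = \gamma z$, $|S^{\oplus}| = \gamma$, so the returned field is constant and equal to $(C\_s \Delta)^2 \gamma$.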

More sophisticated models allow the *Smagorinsky coefficient* to be estimated progressively in time and space. The *dynamic Smagorinsky model* introduces a test filter whose size is taken as $\Delta^{T} = \alpha\Delta$, typically with *α* = 2 [17, 18]. The model coefficient is calculated by applying the *Germano identity*:

$$L\_{ij}^{T} = T\_{ij} - \tau\_{ij}^{T} = \left(u\_i^{\oplus} u\_j^{\oplus}\right)^{T} - u\_i^{\oplus T} u\_j^{\oplus T} \tag{10}$$

where $L\_{ij}^{T}$ denotes the Leonard test-filter stress, and $T\_{ij}$ denotes the SGS stress at the test-filter scale. The latter can be modelled following the eddy viscosity model (Eq. (8)):

$$T\_{ij} - \frac{1}{3}\delta\_{ij} T\_{kk} = -2 C\_T \left(\Delta^{T}\right)^2 \left| S^{\oplus T} \right| S\_{ij}^{\oplus T} \tag{11}$$

Based on Eqs. (8) and (11), the relation in Eq. (10) can be estimated as

$$\begin{split} L\_{ij}^{T} &= -2 C\_T \Delta^2 \underbrace{\left\{ \alpha^2 \left| S^{\oplus} \right|^{T} S\_{ij}^{\oplus T} - \left( \left| S^{\oplus} \right| S\_{ij}^{\oplus} \right)^{T} \right\}}\_{-M\_{ij}} \\ &= 2 C\_T \Delta^2 M\_{ij} \end{split} \tag{12}$$

where the scale invariance of the coefficients ($C\_T = C$) has been assumed. $C\_T$ is found by least-squares minimization of the error function $\mathrm{err} = \left(L\_{ij}^{T} - 2 C\_T \Delta^2 M\_{ij}\right)^2$, setting $\partial\,\mathrm{err} / \partial C\_T = 0$ [17]:

$$C\_T = \frac{1}{2\Delta^2} \frac{\left\langle L\_{ij}^{T} M\_{ij} \right\rangle\_H}{\left\langle M\_{ij} M\_{ij} \right\rangle\_H} \tag{13}$$

The brackets $\langle\,\cdot\,\rangle\_H$ in Eq. (13) denote the average over homogeneous directions. They must be introduced to guarantee the numerical stability of the procedure, which is lost when $C\_T < 0$. This operation creates an important limitation, restricting the simulations to flat terrain.

In order to avoid this constraint, a *modified dynamic Smagorinsky model* is proposed, using another test filter size $\Delta^{T} = 3\Delta$ (*α* = 3) and computing the test-filter coefficient [8] as

$$\begin{cases} \text{if } \dfrac{1}{2\Delta^2} \dfrac{L\_{ij}^{T} M\_{ij}}{M\_{ij} M\_{ij}} > 0 & \text{then} \quad C\_T = \dfrac{1}{2\Delta^2} \dfrac{L\_{ij}^{T} M\_{ij}}{M\_{ij} M\_{ij}} \\\\ \text{if } \dfrac{1}{2\Delta^2} \dfrac{L\_{ij}^{T} M\_{ij}}{M\_{ij} M\_{ij}} \le 0 & \text{then} \quad C\_T = 0 \end{cases} \tag{14}$$
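The clipping rule of Eq. (14) can be sketched pointwise as follows; this is an illustration under the assumption that the tensors $L\_{ij}^{T}$ and $M\_{ij}$ have already been computed from the resolved and test-filtered fields (the function name is hypothetical).

```python
import numpy as np

def dynamic_coefficient(L, M, delta):
    """Dynamic Smagorinsky coefficient with clipping, Eq. (14).

    L, M  : arrays of shape (..., 3, 3), Leonard stress L_ij^T and M_ij.
    delta : grid filter size Delta.
    Returns C_T >= 0 (negative values are clipped to zero).
    """
    num = np.einsum('...ij,...ij->...', L, M)   # L_ij M_ij
    den = np.einsum('...ij,...ij->...', M, M)   # M_ij M_ij
    c_t = num / (2.0 * delta ** 2 * den)
    # Clipping avoids the numerical instability that appears when C_T < 0.
    return np.where(c_t > 0.0, c_t, 0.0)
```

Unlike the averaging over homogeneous directions in Eq. (13), this pointwise clipping makes no flat-terrain assumption.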

Another problem of this model is the inaccuracy of its results near a solid wall (e.g. near the terrain): in this region the viscous layer is present, and the sub-grid stress is a significant fraction of the total stress. These models are incapable of computing the small scales present there [21], which leads to an underestimation of the coefficient $C\_T$ and, in turn, to an overestimation of the flow velocity near the wall [22]. This has been addressed by various *ad hoc* corrections such as *Van Driest damping* [23] and intermittency functions [24]. The *Van Driest damping* function applied to the first row of Eq. (14) yields

Computational Tools for the Simulation of Atmospheric Pollution Transport During a Severe Wind Event in Argentina http://dx.doi.org/10.5772/63552 117

$$C\_T = \frac{1}{2\Delta^2} \frac{L\_{ij}^{T} M\_{ij}}{M\_{ij} M\_{ij}} \underbrace{\left[1 - e^{-z^{+}/25}\right]^2}\_{\text{Van Driest damping}} \quad \text{when} \quad C\_T > 0 \tag{15}$$

where $z^{+} = z u\_{*}/\nu$ denotes the vertical coordinate in wall units and $u\_{*}$ the friction velocity of the flow.

The SGS stress for scalar variables is computed in a similar way:


$$\tau\_{i\psi} = -\frac{\nu\_T}{\text{Pr}} \frac{\partial \psi^{\oplus}}{\partial x\_i} = -\frac{C\Delta^2}{\text{Pr}} \left| S^{\oplus} \right| \frac{\partial \psi^{\oplus}}{\partial x\_i} \tag{16}$$

where Pr is the *Prandtl number*, a variable computed dynamically. The *Germano identity* applied to the scalar *ψ* gives:

$$Q\_i^{T} = \mathfrak{T}\_{i\psi} - \tau\_{i\psi}^{T} = \left(u\_i^{\oplus} \psi^{\oplus}\right)^{T} - u\_i^{\oplus T} \psi^{\oplus T} \tag{17}$$

where $\mathfrak{T}\_{i\psi}$ denotes the SGS scalar flux at the test-filter scale. This tensor can be modelled as

$$\mathfrak{T}\_{i\psi} = -\frac{\nu\_T^{T}}{\text{Pr}} \frac{\partial \psi^{\oplus T}}{\partial x\_i} = -\frac{C\_T \left(\alpha\Delta\right)^2}{\text{Pr}} \left| S^{\oplus} \right|^{T} \frac{\partial \psi^{\oplus T}}{\partial x\_i} \tag{18}$$

The relationship in Eq. (17) can be estimated by Eqs. (16) and (18):

$$\begin{split} Q\_i^{T} &= -\frac{C\_T \Delta^2}{\text{Pr}} \underbrace{\left\{ \alpha^2 \left| S^{\oplus} \right|^{T} \frac{\partial \psi^{\oplus T}}{\partial x\_i} - \left( \left| S^{\oplus} \right| \frac{\partial \psi^{\oplus}}{\partial x\_i} \right)^{T} \right\}}\_{-N\_i} \\ &= \frac{C\_T \Delta^2}{\text{Pr}} N\_i \end{split} \tag{19}$$

The least-squares minimization of the error function $\mathrm{err} = \left(Q\_i^{T} - \frac{C\_T \Delta^2}{\text{Pr}} N\_i\right)^2$ yields

$$\frac{1}{\text{Pr}} = \frac{1}{C\_T \Delta^2} \frac{Q\_i^{T} N\_i}{N\_i N\_i} \tag{20}$$
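The dynamic inverse Prandtl number of Eq. (20) can be sketched pointwise in the same way as the dynamic coefficient; the inputs $Q\_i^{T}$ and $N\_i$ are assumed precomputed, and the function name is illustrative.

```python
import numpy as np

def inverse_prandtl(Q, N, c_t, delta):
    """Dynamic inverse Prandtl number 1/Pr, Eq. (20).

    Q, N  : arrays of shape (..., 3), Germano residual Q_i^T and N_i.
    c_t   : dynamic Smagorinsky coefficient C_T (Eq. 14).
    delta : grid filter size Delta.
    """
    num = np.einsum('...i,...i->...', Q, N)   # Q_i N_i
    den = np.einsum('...i,...i->...', N, N)   # N_i N_i
    return num / (c_t * delta ** 2 * den)
```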

The computation of the SGS viscosity $\nu\_T$ in the eddy viscosity model (Eq. (8)) and of the *Prandtl number* Pr in the turbulent diffusivity model (Eq. (16)) has been introduced into the ARPS code by Aguirre [8].

### **2.3. The parametrization of microphysical processes**

The ARPS code was designed specifically to describe thunderstorms, so submodels of heat flux, water vapour, cloud formation and precipitation are included. Ground relief, vegetation types, soil types and the initial conditions of the atmospheric state are taken into account in the simulation of these phenomena. The parameterization model is called *warm-rain microphysics* and is based on the descriptions of Klemp [25] and Soong [26]. This model considers three water categories, water vapour (*qv*), cloud water (*qc*) and rainwater (*qr*), the latter two being characterized by droplet size. At the beginning of the process, cloud water droplets form when the air becomes saturated and condensation occurs. Then, if the water mixing ratio exceeds a critical threshold in the cloud interior, raindrops form and the collision-coalescence process begins. If, after crossing the base of the cloud, the raindrops meet air below the saturation point, evaporation gradually diminishes their size. The rates of evaporation and condensation depend on certain parameters and on the water mixing ratio in the air. As all these processes involve transfers of sensible and latent heat, the potential temperature of the air must be adjusted accordingly. The theoretical formulation is not described here because it is not the aim of this chapter; for more details on the microphysical processes in the ARPS code, see Xue [27].

### **2.4. Stochastic Lagrangian one-fluid particle model (STO)**

The SGS models simulate the effect of the unsolved scales as an energy sink for the scales resolved by LES. However, atmospheric dispersion phenomena are well simulated only when the trajectories associated with the small, unsolved scales are also computed. To obtain a more realistic description of the trajectories of the fluid particles that transport chemical species in a turbulent regime, the small scales unsolved by LES must be simulated. A coupled model between LES and a *stochastic Lagrangian model* (STO) for the small scales is proposed. The fluid particle *Lagrangian* velocity *Ui* is computed by solving the LES-STO coupled model using a *Langevin* equation:

$$\frac{dU\_i}{dt} = h\_{ij}(U\_j, t) + q\_{ij}(U\_j, t)\,\eta\_j(t) \tag{21}$$

where $h\_{ij}$ is the dynamic deterministic coefficient and $q\_{ij}$ is the dynamic random coefficient (in analogy with Brownian motion), linked to the statistical properties of the turbulence. $\eta\_j$ denotes a random variable with zero mean and covariance:

$$
\langle \eta\_i(t')\eta\_j(t'')\rangle = \delta\_{ij}\delta(t'-t'')\tag{22}
$$

This property suggests that *η<sup>j</sup>* is correlated neither in space nor in time.
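In a discrete-time implementation, Eq. (22) is commonly realized by drawing an independent Gaussian sample of variance $1/\Delta t$ at each time step; the sketch below (names are illustrative, not from the ARPS code) verifies the zero-mean, uncorrelated-in-time property empirically.

```python
import numpy as np

def white_noise(n_steps, n_particles, dt, seed=0):
    """Discrete realization of eta_j in Eqs. (21)-(22): independent
    Gaussian samples with zero mean and variance 1/dt, so that the
    discrete covariance <eta(t_n) eta(t_m)> approximates the Dirac
    delta in time after discretization."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 1.0 / np.sqrt(dt), size=(n_steps, n_particles))
```

Averaging over many samples, the mean is close to zero, the variance close to $1/\Delta t$, and the product of samples at two distinct time steps averages to approximately zero, as Eq. (22) requires.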


To obtain the *Langevin* equation terms, the following hypotheses are proposed: (a) the small scales behave isotropically; (b) the small scales are far from the inertial range of energy transfer; and (c) a relationship exists between the results of the LES and the coefficients of the *Langevin* model in Eq. (21).

The first hypothesis is not completely sustainable close to solid walls or to the ground; for this reason, an anisotropic model will be used. The last hypothesis means that these coefficients can be calculated dynamically in each cell and at each time step from the LES results. Consequently, the fluid particle moves inside the cell of the computational grid by following the evolution of the large scales resolved by LES, to which a fluctuation is added that simulates the behaviour of the small scales of motion produced in its interior. The fluid particle velocity decomposition can be expressed as

$$U\_i(x\_i, t) = u\_i^{\oplus}(x\_i, t) + u\_i^{-}(x\_i, t) \tag{23}$$

where $u\_i^{-}$ is the *Lagrangian* fluctuation velocity of the fluid particle due to the sub-grid scale (SGS) turbulence. To obtain the coefficients of Eq. (21), the transport equation of the probability density function (PDF) of the velocity field, called the *Fokker-Planck* equation, can be used [4]:

$$\frac{\partial P\_L}{\partial t} = -\frac{\partial}{\partial v\_i} \left[ h\_{ij}(U\_j, t) P\_L \right] + \frac{1}{2} \frac{\partial^2}{\partial v\_i \partial v\_j} \left[ q\_{ik}(U\_i, t)\, q\_{jk}(U\_j, t) P\_L \right] \tag{24}$$

The decomposition in Eq. (23) is substituted into the PDF transport Eq. (24):

$$\begin{split} \frac{\partial P\_L}{\partial t} &= -\frac{\partial}{\partial v\_i} \left[ h\_{ij}^{\oplus}(u\_j^{\oplus}, t) P\_L \right] - \frac{\partial}{\partial v\_i} \left[ h\_{ij}^{-}(u\_j^{-}, t) P\_L \right] \\ &\quad + \frac{1}{2} \frac{\partial^2}{\partial v\_i \partial v\_j} \left[ q\_{ik}^{\oplus}(u\_i^{\oplus}, t)\, q\_{jk}^{\oplus}(u\_j^{\oplus}, t) P\_L \right] + \frac{1}{2} \frac{\partial^2}{\partial v\_i \partial v\_j} \left[ q\_{ik}^{-}(u\_i^{-}, t)\, q\_{jk}^{-}(u\_j^{-}, t) P\_L \right] \end{split} \tag{25}$$

Also the same decomposition Eq. (23) is used in the *Langevin* Eq. (21):

$$\frac{du\_i^{\oplus}}{dt} + \frac{du\_i^{-}}{dt} = h\_{ij}^{\oplus}(u\_j^{\oplus}, t) + h\_{ij}^{-}(u\_j^{-}, t) + \left[ q\_{ij}^{\oplus}(u\_j^{\oplus}, t) + q\_{ij}^{-}(u\_j^{-}, t) \right] \eta\_j(t) \tag{26}$$

In this analysis, some simplifications of Eq. (26) can be performed with reference to the hypotheses mentioned above. In particular, hypothesis (b) proposes that the small scales are far from the inertial-range energy transfer, so it can be assumed that the random term $q\_{ij}$ is completely defined by the $u\_j^{-}$ small scales (sub-grid scales); in other words, $q\_{ij}^{\oplus}(u\_j^{\oplus}, t) = 0$. Therefore, separating the large scales and sub-grid scales of Eq. (26), the following equivalences can be assigned:

$$\begin{cases} \dfrac{du\_i^{\oplus}}{dt} = h\_{ij}^{\oplus}(u\_j^{\oplus}, t) \\\\ \dfrac{du\_i^{-}}{dt} = h\_{ij}^{-}(u\_j^{-}, t) + q\_{ij}^{-}(u\_j^{-}, t)\,\eta\_j(t) \end{cases} \tag{27}$$

The first row of Eq. (27) is the material derivative of large scales. It is computed according to Eq. (4):

$$\frac{du\_i^{\oplus}}{dt} = g\_i B^{\oplus} - \frac{1}{\rho} \frac{\partial p'^{\oplus}}{\partial x\_i} - \frac{\partial \tau\_{ij}}{\partial x\_j} + 2\nu \frac{\partial S\_{ij}^{a\oplus}}{\partial x\_j} \tag{28}$$

Therefore, the first term of the second member of Eq. (25) is obtained from LES:

$$-\frac{\partial}{\partial v\_i} \left[ h\_{ij}^{\oplus}(u\_j^{\oplus}, t) P\_L \right] = -g\_i B^{\oplus} \frac{\partial P\_L}{\partial v\_i} + \frac{1}{\rho} \frac{\partial p'^{\oplus}}{\partial x\_i} \frac{\partial P\_L}{\partial v\_i} + \frac{\partial \tau\_{ij}}{\partial x\_j} \frac{\partial P\_L}{\partial v\_i} - 2\nu \frac{\partial S\_{ij}^{a\oplus}}{\partial x\_j} \frac{\partial P\_L}{\partial v\_i} \tag{29}$$

For the other terms of Eq. (25), a turbulence model is proposed because they depend on the unsolved scales. For the deterministic term that accounts for the unsolved scales, Gicquel [7] suggested a value proportional to the small-scale velocity and to a tensor representing the turbulent kinetic energy of the sub-grid scales. The idea behind this proposal is that the greater the sub-grid turbulent kinetic energy, the greater the number of small vortices and hence the importance of this term. The remaining terms express the sub-grid random component. They can be modelled by taking into account assumption (a), i.e. that the small scales behave isotropically. Pope [6] proposed an expression for these terms that also takes into account the dissipation rate *ε* of homogeneous, isotropic sub-grid turbulence.

With these considerations, the transport equation for the probability density function of velocity field in Eq. (25) can be summarized as

$$\frac{\partial P\_L}{\partial t} = -\frac{\partial}{\partial v\_i} \left[ \frac{du\_i^{\oplus}}{dt} P\_L \right] - \frac{\partial}{\partial v\_i} \left[ \alpha\_{ij} u\_j^{-} P\_L \right] + \frac{1}{2} C\_0 \varepsilon \frac{\partial^2 P\_L}{\partial v\_i \partial v\_j} \tag{30}$$

where $C\_0 = 2.1$ denotes the *Kolmogorov coefficient*.

With these hypotheses and using the PDF transport Eq. (30), the dynamic deterministic coefficient and dynamic random coefficient of Eq. (21) can be expressed as

$$\begin{cases} h\_{ij}(U\_j, t) = \dfrac{du\_i^{\oplus}}{dt} + \alpha\_{ij} u\_j^{-} \\\\ q\_{ij}(U\_j, t) = \sqrt{C\_0 \varepsilon}\; \delta\_{ij} \end{cases} \tag{31}$$

Since the material derivative of the large scales is obtained from LES via Eq. (28), and the dissipation rate of turbulent kinetic energy *ε* is computed with a gradient model [28] by the ARPS code, it remains to propose an expression for the deterministic tensor $\alpha\_{ij}$. This tensor is related to the statistical properties of the sub-grid turbulence [29]. The most complex case is an unsteady, inhomogeneous and anisotropic flow, such as develops in the atmospheric boundary layer over heterogeneous rough terrain near the soil surface:

$$\alpha\_{ij} = \frac{1}{2K^{-}} \frac{dK^{-}}{dt} \delta\_{ij} - \left(\frac{3}{4} C\_0\right) \frac{\varepsilon}{K^{-}} \delta\_{ij} + \left(\frac{R\_{ij}}{2K^{-}} - \frac{\delta\_{ij}}{3}\right) \frac{\varepsilon}{K^{-}} \tag{32}$$

where $K^{-} = \frac{1}{2}\left(u\_i^{-} u\_j^{-}\right)^{\oplus}\delta\_{ij}$ denotes the sub-grid turbulent kinetic energy, obtained with the transport Eq. (28), and $R\_{ij} = \left(u\_i^{-} u\_j^{-}\right)^{\oplus}$ denotes the Reynolds tensor of the sub-grid scales, obtained using the turbulent diffusivity models (Eq. (8)). More details of this resolution can be found in [9, 10] and [29].
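A pointwise sketch of Eq. (32) under the stated definitions (the function name and argument layout are assumptions for the example): for isotropic sub-grid turbulence, $R\_{ij} = \frac{2}{3}K^{-}\delta\_{ij}$, the last term vanishes and $\alpha\_{ij}$ reduces to a diagonal relaxation tensor.

```python
import numpy as np

def alpha_tensor(K, dKdt, eps, R, c0=2.1):
    """Deterministic drift tensor alpha_ij of Eq. (32).

    K    : sub-grid turbulent kinetic energy K^-
    dKdt : material derivative dK^-/dt
    eps  : dissipation rate of sub-grid kinetic energy
    R    : 3x3 sub-grid Reynolds tensor R_ij
    """
    delta = np.eye(3)
    return (0.5 * dKdt / K * delta                 # energy-evolution term
            - 0.75 * c0 * eps / K * delta          # isotropic relaxation
            + (R / (2.0 * K) - delta / 3.0) * eps / K)  # anisotropy correction
```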

## **2.5. Solid particle simulation model**

The solid particles addressed in this work are on the order of tens of microns in size. Therefore, the following assumptions may be raised: (a) the density of the solid particles is much greater than that of the air, so that only the drag and gravity forces act on the particle; (b) the drag coefficient *CD* depends on the diameter of the solid particle and on the fluid viscosity.

The motion equation of a solid particle immersed in a fluid flow can be written as follows [30]:

$$\frac{dV\_i}{dt} = F\_i \tag{33}$$

where *Vi* is the solid particle velocity, and *Fi* denotes the forces per unit mass.

Assumption (a) has been proven valid when the density of the solid particles is more than 1000 times that of the air that transports them. Then, the right-hand side of Eq. (33) can be summarized as

$$F\_i = \frac{U\_i - V\_i}{\tau\_s} - \mathbf{g}\_i \delta\_{i3} \tag{34}$$

in which the first term on the right-hand side denotes the drag force, while the second term is the force of gravity acting in the vertical direction indicated by the index 3, both per unit mass of solid particle. $\tau\_s$ denotes a time scale of particle acceleration; it depends on the ratio of densities between the solid particle and the air, the solid particle diameter $d\_s$ and a drag coefficient $C\_D$:

$$\tau\_s = \frac{4}{3} \frac{d\_s}{C\_D} \frac{\rho\_s}{\rho} \frac{1}{\left| U\_i - V\_i \right|} \tag{35}$$

Hypothesis (b) suggests that the drag coefficient $C\_D$ depends on the diameter of the solid particle $d\_s$ and on the fluid viscosity. This condition is usually assessed by calculating the *Reynolds number of the solid particle* $\Re e\_s$:

$$\mathfrak{Re}\_s = d\_s \frac{|U\_i - V\_i|}{\nu} \tag{36}$$

where *ν* denotes the kinematic viscosity of the fluid.

In laminar flow (or *Stokes flow*), ℜ*e*s ≤ 1 and the drag coefficient can be calculated as

$$C\_D = \frac{24}{\Re \mathcal{e}\_s} \tag{37}$$

In turbulent flows, many different expressions have been proposed; Sommerfeld [31] uses the following:

$$C\_D = \begin{cases} \dfrac{24}{\Re e\_s} \left( 1 + 0.15\, \Re e\_s^{0.687} \right) & \text{if} \quad 1 < \Re e\_s < 1000 \\\\ 0.44 & \text{if} \quad \Re e\_s \ge 1000 \end{cases} \tag{38}$$
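Eqs. (35)-(38) combine into a short computation of the particle relaxation time; the sketch below is an illustration (function names assumed, not from the ARPS code). A useful check: in the Stokes regime, substituting Eq. (37) into Eq. (35) recovers the classical result $\tau\_s = \rho\_s d\_s^2 / (18 \rho \nu)$.

```python
def drag_coefficient(re_s):
    """Drag coefficient C_D, Eqs. (37)-(38) (Sommerfeld [31])."""
    if re_s <= 1.0:                       # Stokes flow, Eq. (37)
        return 24.0 / re_s
    if re_s < 1000.0:                     # transitional regime
        return 24.0 / re_s * (1.0 + 0.15 * re_s ** 0.687)
    return 0.44                           # fully turbulent regime

def relaxation_time(d_s, rho_s, rho, u_rel, nu):
    """Particle acceleration time scale tau_s, Eq. (35).

    d_s   : particle diameter (m)
    rho_s : particle density (kg/m^3); rho: air density (kg/m^3)
    u_rel : magnitude of the relative velocity |U_i - V_i| (m/s)
    nu    : kinematic viscosity of air (m^2/s)
    """
    re_s = d_s * u_rel / nu               # particle Reynolds number, Eq. (36)
    c_d = drag_coefficient(re_s)
    return (4.0 / 3.0) * (d_s / c_d) * (rho_s / rho) / u_rel
```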

In Eqs. (34), (35) and (36), $U\_i$ is the velocity of the fluid particle at the position of the solid one. Aguirre [32, 33] denotes it as 'the fluid particle velocity seen by the solid particle', and it is estimated with a stochastic equation similar to Eq. (21) but using a *Reynolds decomposition* (RANS). These authors corrected the trajectories of the fluid particles seen by the solid particle using a *weighted characteristic time* scale that takes into account the density and diameter of the solid particles.

In the decomposition into large and small scales, $U\_i$ is computed as in Eq. (23), and $u\_i^{-}$ can be obtained by simulating fluid particles emitted from the same sources as the solid particles and dispersed in the atmospheric boundary layer following the equations of motion described in the previous section. However, after a few time steps it is very unlikely that the pairs of particles (fluid-solid) launched from the same source are still in matching positions. One alternative is to 'detect' the fluid particle closest to the solid particle after each time step in order to assign the velocity $U\_i$. This requires, firstly, that both sets of particles be computed in the simulation and, secondly, a much larger number of fluid particles than solid particles, because their trajectories differ and each new solid particle position must find a nearby fluid particle; the source of fluid particles should thus be much larger than that of the solid ones. Vinkovic [11] followed this method. An alternative that uses fewer computational resources is to calculate $U\_i$ at the exact position of the solid particle starting from the sub-grid turbulent kinetic energy $K^{-}$, using the stochastic Eq. (21). In this way it is not necessary to compute the trajectories of fluid particles or to use proximity-search algorithms around the solid particle. This method is used in this chapter and is detailed below.

## **2.6. Numerical method to compute the solid particle trajectory**


The discretized equations for the position and velocity of the solid particles can be written as

$$\begin{cases} V_{i\,(n+1)} = V_{i\,(n)} + \dfrac{\Delta t}{\tau_s}\left(U_{i\,(n)} - V_{i\,(n)}\right) - g\,\Delta t\,\delta_{i3} \\[1.5ex] X_{i\,(n+1)} = X_{i\,(n)} + \dfrac{V_{i\,(n+1)} + V_{i\,(n)}}{2}\,\Delta t \end{cases} \tag{39}$$

where the subscript in parentheses denotes the number of time instant simulation, and *Δt* =*t*(*n*+1)−*t*(*n*) is the time step.
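A minimal sketch of this explicit update in Python follows. The variable names and the numerical values in the usage example are illustrative assumptions, not the authors' code; the position is advanced here with the trapezoidal rule on the velocity:

```python
def advance_particle(x, v, u_fluid, tau_s, g, dt):
    """One explicit time step for a solid particle (Eq. (39)):
    drag relaxation toward the fluid velocity seen by the particle,
    plus gravity on the vertical component (index 2 in 0-based Python)."""
    v_new = [v[i] + (dt / tau_s) * (u_fluid[i] - v[i]) - (g * dt if i == 2 else 0.0)
             for i in range(3)]
    # Trapezoidal rule on velocity for the new position
    x_new = [x[i] + 0.5 * (v_new[i] + v[i]) * dt for i in range(3)]
    return x_new, v_new

# One step with illustrative values: particle released at rest at 45 m height
x, v = [0.0, 0.0, 45.0], [0.0, 0.0, 0.0]
x1, v1 = advance_particle(x, v, u_fluid=[2.5, 0.0, 0.0], tau_s=0.05, g=9.81, dt=0.02)
```

Note that the explicit drag term is only stable when Δ*t* < *τs*, which is why the time step is constrained by the particle acceleration time.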

The velocity of the fluid particle seen by the solid particle is computed following Eq. (23):

$$U_{i\,(n)} = \overline{u}_{i\,(n)} + u'_{i\,(n)} \tag{40}$$

The large-scale velocity at the position of the solid particle is obtained by weighting the values at the four grid nodes closest to it. Then, the sub-grid component of the fluid particle velocity at the solid particle position is computed using the second row of Eq. (27) in discretized form:

$$u'_{i\,(n)} = u'_{i\,(n-1)} + \alpha_{ij\,(n)}\,u'_{j\,(n-1)}\,\Delta t + \sqrt{C_0\,\overline{\varepsilon}\,\Delta t}\;\chi_{(n)} \tag{41}$$

where *χ*(*n*) denotes an independent random variable with zero mean and unit variance at time *t*(*n*), and *αij* (*n*) denotes the tensor computed with Eq. (32) at the solid particle position. To estimate the sub-grid scale fluid particle velocity at the previous time step at the solid particle position, isotropic turbulence is assumed (Pope, 1994):

$$u'_{i\,(n-1)} = \chi_{(n)}\,\sqrt{\frac{2}{3}\,\overline{K}_{(n)}} \tag{42}$$
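The Langevin-type update of Eqs. (41) and (42) can be sketched as follows. For brevity, the sketch is reduced to a single scalar component with a scalar drift coefficient in place of the full *αij* tensor of Eq. (32); all parameter values are illustrative assumptions:

```python
import math
import random

def subgrid_step(u_prime, alpha, C0, eps, dt, rng=random):
    """One step of a scalar Langevin-type sub-grid velocity update:
    drift term alpha * u' * dt plus random forcing sqrt(C0 * eps * dt) * chi,
    with chi ~ N(0, 1) (cf. Eq. (41), here with a scalar drift coefficient)."""
    chi = rng.gauss(0.0, 1.0)
    return u_prime + alpha * u_prime * dt + math.sqrt(C0 * eps * dt) * chi

def init_subgrid(K_sgs, rng=random):
    """Isotropic initialization (Pope, 1994): u' = chi * sqrt(2K/3), cf. Eq. (42)."""
    return rng.gauss(0.0, 1.0) * math.sqrt(2.0 * K_sgs / 3.0)

random.seed(0)
u0 = init_subgrid(K_sgs=0.3)                       # from sub-grid kinetic energy K
u1 = subgrid_step(u0, alpha=-2.0, C0=2.1, eps=0.05, dt=0.02)
```

The negative drift coefficient relaxes the sub-grid velocity toward zero, while the random forcing sustains the sub-grid fluctuations.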

## **3. Comparison with experimental laboratory measurements**

#### **3.1. Description of Gong's experiment**

Gong [34] carried out measurements of mean velocity and air flow fluctuation in the neutral turbulent layer produced in the wind tunnel of the Department of Agriculture of the University of Reading (UK), using the turbulent flow generation methodology of Counihan [35]. The authors installed a rubber sheet on the floor of the tunnel to simulate a rough floor and a hill with a gentle slope. Details of the geometry of the tunnel and the simulated hill can be found in Gong [34]. Gong [13] measured concentrations of a passive gas (carbon dioxide) released from a point source located close upwind of a two-dimensional symmetrical hill placed transversally to the air flow direction, as shown in **Figure 1**. The characteristic data of the boundary layer generated in the laboratory, and the diameter, height and position of the gas emission source are as follows:


The measurements of gas concentration were made on the plane of axial symmetry containing the emission source, obtaining mean value profiles in five positions:

Computational Tools for the Simulation of Atmospheric Pollution Transport During a Severe Wind Event in Argentina http://dx.doi.org/10.5772/63552 125

**Figure 1.** Scheme of Gong's experiments [13, 34].

*i n i n ij n i n* ( ) ( 1) ( ) ( 1) 0 ( ) *<sup>n</sup> u u u t Ct* a

124 Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts

*t*(*n*),

and *αij* (*n*)

as follows:

is proposed isotropic turbulence (Pope, 1994):

**3.1. Description of Gong's experiment**







the emission source, obtaining mean value profiles in five positions:


The measurements of gas concentration were made on the plane of axial symmetry containing


where *χ*(*n*) denotes an independent random variable with zero mean and unit variance at time

the sub-grid scale of fluid particle velocity in the previous time step at solid particle position,

c

( 1) () () 2 <sup>3</sup> *i n n n u K*

Gong [34] carried out measurements of mean velocity and air flow fluctuation in the neutral turbulent layer produced in the wind tunnel of the Department of Agriculture of the University of Reading (UK) using the methodology of generation of turbulent flow of Counihan [35]. The authors installed a rubber sheet on the floor of tunnel to simulate a rough floor and a hill with a slight slope. Details of the geometry of the tunnel and the simulated hill can be found in Gong [34]. Gong [13] measured concentrations of a passive gas (carbon dioxide) incorporating a nearer point source upwind of a bi-dimensional symmetrical hill placed transversally to the air flow direction as shown in **Figure 1**. The characteristic data of the boundary layer generated in the laboratory, the diameter and height of the gas emission source as well as its position are


**3. Comparison with experimental laboratory measurements**

e c-- - = + D+ D - - (41)


denotes the tensor computed with Eq. (32) at solid particle position and to estimate

$$\mathbf{x} - \mathbf{x}\_s = \begin{cases} \text{ } \textbf{150.00 mm: foot, upwind} \\ \text{ } \textbf{250.00 mm: halfway uphill, upwind} \\ \text{ } \textbf{350.00 mm: crest} \\ \text{ } \textbf{450.00 mm: halfway uphill, downwind} \\ \text{ } \textbf{550.00 mm: foot, downwind} \end{cases}$$

Below is the comparison between the experimental measurements and the results of the simulation using the coupled LES-STO model, considering the hypothesis of statistically inhomogeneous and anisotropic turbulence for the *αij* tensor (Eq. (32)). In addition, other numerical simulations were carried out without the stochastic sub-grid turbulence model; that is, in this case, the particles only followed the trajectories imposed by the LES.

#### **3.2. Numerical simulation of Gong's experiment**

For statistical calculations, results were collected after the first 4 seconds of the simulation, so that the particle population within the computational domain had reached a statistically steady state. The total physical time simulated was 100 seconds. The injection rate was fixed at 50,000 particles per second. The simulation time step was set to 0.02 seconds to ensure the numerical stability of the computation. The temporal evolution of the number of fluid particles in the computational domain shows that close to 130,000 are present in the steady state. The total number of particles injected over the entire simulation time was 10,125,151. The author of the experiment presented the results as normalized gas concentrations:


$$C_n = \frac{C\,D\,U_e^2}{z_s\,u_s^2\,C_0} \tag{43}$$

**Figure 2.** Profiles of normalized concentrations upwind hill. Left-hand side: *x – xs* = 150 mm (foot of the hill). Righthand side: *x – xs* = 250 mm (halfway on the hill).

**Figure 3.** Profiles of mean concentration at the crest of the hill at *x – xs* = 350 mm.

where *C* is the gas concentration, and *u*<sup>s</sup> is the air velocity at the height of the emission source. **Figures 2**–**4** show the profiles of the mean concentration using LES and LES-STO. **Figure 5** shows the concentration at ground level (left) and the standard deviation of the height of the centre of the gas plume on the axial plane that contains the emission source (right). **Figure 6** shows the mean concentration levels of the gas plume on the axial plane containing the source (left) and the same at ground level (right), both of which correspond to the case of non-stationary, inhomogeneous and anisotropic sub-grid turbulence (LES-STO), which presents better results in comparison with the experimental measurements.


**Figure 4.** Profiles of normalized concentration downwind hill. Left-hand side: *x – xs* = 450 mm (halfway on the hill). Right-hand side: *x – xs* = 550 mm (foot of the hill).

**Figure 5.** Normalized mean concentration at ground level (left) and standard deviation from the height of the gas plume centre (right).

**Figure 6.** Mean gas concentration values simulated with LES-STO. Left-hand side: on the axial plane containing the source. Right-hand side: at ground level.

## **4. Atmospheric dispersion of solid particles during severe wind**

## **4.1. Description of study case**

A metal foundry operated for 24 years in Lastenia, a town northwest of San Miguel de Tucumán city. It is located in a flat residential area (430 m above sea level) and operated until its closure in the mid-1990s. This plant supplied machinery to large sugar refineries, since sugar cane is the most important crop in the area. The smelter has two chimneys, 45 m in height and 3 m in diameter. The outlet temperature of the gases was *T*g = 220°C. In addition to these combustion gases, the smelter emitted particles of different metals that are highly toxic when inhaled directly or ingested indirectly through local crops (e.g. citrus, leafy vegetables). According to Fernández-Turiel [14], the Lastenia region had dangerous concentrations of metals such as silver (Ag), cadmium (Cd), copper (Cu), nickel (Ni), lead (Pb), tin (Sn) and zinc (Zn), among others. These measurements were performed in the laboratory with very specific equipment on samples taken in situ from square areas of 10,000 m² in both soil and plants, for a sector near the smelter. The location of Lastenia and the study area are shown in **Figure 7**.

**Figure 7.** Location of study area and sample measurements.

The study area is rectangular, extending 3.5 km north-south and 2.4 km east-west. The prevailing winds in the area blow from the north and the southwest. However, the wind direction most damaging to the residential area is from the eastern sector.

### **4.2. Simulation details**

The calculation grid used for the simulation consists of regular prismatic cells of varying height. The cells have a horizontal dimension of 100 × 100 m (equal to the in situ sampling areas used by Fernández-Turiel [14]) and heights ranging from 3 m for the cells adjacent to the ground up to 42 m for those at the top of the computational domain. The cell height follows a hyperbolic tangent law. Thus, the grid consists of 37 cells in the east-west direction, 54 cells in the north-south direction and 52 cells in the vertical direction. **Figure 8** shows the horizontal (a) and vertical (b) arrangement of a sector of the grid for the LES computation. The location of the smelter chimneys can also be seen.
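A hyperbolic tangent stretching of this kind can be sketched as below. The chapter does not give the exact stretching law, so the function here is a plausible assumption that reproduces the stated limits (3 m cells at the ground, 42 m at the top, 52 vertical cells); the stretching factor `s` is likewise an assumed value:

```python
import math

def tanh_cell_heights(n_cells=52, h_min=3.0, h_max=42.0, s=2.5):
    """Cell heights growing from h_min at the ground to h_max at the domain
    top, following a hyperbolic tangent law (illustrative reconstruction)."""
    heights = []
    for k in range(n_cells):
        t = k / (n_cells - 1)                  # 0 at the ground, 1 at the top
        w = math.tanh(s * t) / math.tanh(s)    # tanh-shaped growth from 0 to 1
        heights.append(h_min + (h_max - h_min) * w)
    return heights

dz = tanh_cell_heights()
```

Such stretching concentrates vertical resolution near the ground, where velocity and concentration gradients are strongest.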


**Figure 8.** Horizontal (a) and vertical (b) configuration of the grid for LES computation and the location of the two chimneys of the smelter.

In this case, the dispersion of copper particles with density *ρ* = 8900 kg/m³ and diameter *ds* = 46.5 μm has been simulated. A time step Δ*t* = 0.05 seconds was used. This value is imposed by the acceleration time of the copper particles (*τ*s = 0.0503 seconds), calculated using their terminal free-fall speed; the condition Δ*t* < *τ*s ensures numerical stability.
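As a rough cross-check, the Stokes estimate of the particle relaxation time, *τ* = *ρs ds*²/(18 *μ*), gives the same order of magnitude as the value quoted above. The air viscosity below is an assumed standard value; the chapter's *τ*s = 0.0503 s additionally accounts for the nonlinear drag at the terminal fall speed, so the two numbers need not coincide exactly:

```python
# Stokes relaxation time of a copper particle (illustrative cross-check)
rho_s = 8900.0      # copper density [kg/m^3] (from the chapter)
d_s = 46.5e-6       # particle diameter [m] (from the chapter)
mu_air = 1.8e-5     # dynamic viscosity of air [Pa s] (assumed standard value)

tau_stokes = rho_s * d_s ** 2 / (18.0 * mu_air)   # ~0.06 s
dt = 0.05                                          # simulation time step [s]
stable = dt < tau_stokes                           # explicit drag stability check
```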

The initial velocity of the solid particles at the chimney outlet is conditioned by the gas temperature *T*g. This outlet velocity *w*<sup>s</sup> (0) can be estimated with

$$w_{s(0)}^2 = \frac{U_0^2\left(T_a / T_g\right) + 2\,g\,h\left(T_g / T_a - 1\right)}{1 + 4\,f\left(h / D\right)} \tag{44}$$

where *U*0 is the air velocity at the height of the chimney *h*, *T*a denotes the air temperature, *f* is the friction coefficient of the chimney wall and *D* is the chimney diameter.

This formula is derived by applying the *Bernoulli* theorem along the vertical and horizontal directions to account for the thermal draft and the depression generated by the wind at the chimney outlet, as shown in **Figure 9**.
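A sketch of Eq. (44) in code follows. The wind speed *U*0, the air temperature *T*a and the friction coefficient *f* in the usage example are illustrative assumptions, not values given in the chapter; only *h*, *D* and *T*g come from the study case:

```python
import math

def outlet_velocity(U0, Ta, Tg, h, D, f, g=9.81):
    """Initial vertical velocity of solid particles at the chimney outlet
    (Eq. (44)): thermal draft plus wind-induced depression, reduced by
    friction losses along the chimney wall."""
    num = U0 ** 2 * (Ta / Tg) + 2.0 * g * h * (Tg / Ta - 1.0)
    den = 1.0 + 4.0 * f * (h / D)
    return math.sqrt(num / den)

# Chimney of the study case: h = 45 m, D = 3 m, Tg = 220 degC; other values assumed
w_s0 = outlet_velocity(U0=2.5, Ta=293.15, Tg=493.15, h=45.0, D=3.0, f=0.02)
```

The hot gases (220°C) produce a strong thermal draft, so the outlet velocity is dominated by the buoyancy term rather than by the ambient wind.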

A case of severe eastern wind is simulated. The inflow boundary condition of the LES is forced with wind velocities drawn from a two-parameter *Weibull* probability density function:

$$p(u) = \frac{k}{c} \left(\frac{u}{c}\right)^{k-1} e^{-\left(\frac{u}{c}\right)^k} \tag{45}$$

where *k* is a shape parameter and *c* denotes the scale parameter. These are computed considering the average wind velocity [36]:

$$\begin{cases} k = 0.94\,\sqrt{\overline{u}} \\[1.5ex] c = \dfrac{\overline{u}}{\Gamma\left(1 + \dfrac{1}{k}\right)} \end{cases} \tag{46}$$

In this case, the average wind velocity has been obtained from the records of meteorological stations during a severe eastern wind event. This value at 10 m above ground is *u*¯ <sup>10</sup> = 2.5 m/s. The Weibull parameters obtained with Eq. (46) are *k* = 1.486 and *c* = 2.766.
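Eq. (46) can be checked directly; with ū = 2.5 m/s it reproduces the quoted parameters:

```python
import math

def weibull_params(u_mean):
    """Weibull shape k and scale c from the mean wind speed (Justus [36])."""
    k = 0.94 * math.sqrt(u_mean)
    c = u_mean / math.gamma(1.0 + 1.0 / k)
    return k, c

k, c = weibull_params(2.5)   # k ~ 1.486, c ~ 2.766
```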

The logarithmic law has been used to estimate the average wind velocity below 100 m height, with a friction velocity *u\** = 0.276 m/s, a roughness parameter *z*<sup>0</sup> = 0.228 m and a *von Kármán* coefficient *kv* = 0.4. Above 100 m height, the power law wind profile is used with coefficient *n* = 4.6.
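The resulting inflow profile can be sketched as a piecewise log/power law. Two details here are our assumptions: the power law is taken as *u* ∝ *z*^(1/*n*) with *n* = 4.6, and the power-law branch is matched to the log law at 100 m to keep the profile continuous:

```python
import math

U_STAR, Z0, KV = 0.276, 0.228, 0.4   # friction velocity, roughness, von Karman
N_EXP, Z_MATCH = 4.6, 100.0          # power-law coefficient and matching height

def mean_wind(z):
    """Mean wind speed: log law below 100 m, power law above (matched at 100 m)."""
    if z <= Z_MATCH:
        return (U_STAR / KV) * math.log(z / Z0)
    u_match = (U_STAR / KV) * math.log(Z_MATCH / Z0)
    return u_match * (z / Z_MATCH) ** (1.0 / N_EXP)

u10 = mean_wind(10.0)   # ~2.6 m/s, close to the measured 2.5 m/s at 10 m
```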

**Figure 9.** Calculation scheme of the initial vertical velocity of solid particles.

#### **4.3. Results of the comparison between the measurements and the numerical simulation**

The copper concentrations obtained after 4000 seconds of simulation are qualitatively compared with those presented by Fernández-Turiel [14]. In **Figure 10**, the copper concentrations at ground level (in number of particles per square metre) computed using grids of 100 × 100 m and the concentrations (in mgCu/kgsoil) published by Fernández-Turiel [14] are shown. **Figure 11** compares copper concentrations at ground level (in number of particles per square metre), but computed using grids of 10 × 10 m. **Figure 12** shows the same concentrations as in **Figure 11** on a satellite image with the purpose of observing the affected sites.


**Figure 10.** Copper concentration at ground level. Solid colour: simulated with 100 × 100 m grid mesh (part/m²). Lines: measurements (mgCu/kgsoil) with 100 × 100 m by Fernández-Turiel [14].

**Figure 11.** Copper concentration at ground level. Solid colour: simulated with 10 × 10 m grid mesh (part/m²). Lines: measurements (mgCu/kgsoil) with 100 × 100 m by Fernández-Turiel [14].

**Figure 12.** Copper concentration at ground level simulated with 10 × 10 m grid mesh on a satellite image.

## **4.4. Discussion**

In **Figure 10**, note that the concentrations simulated by the model are calculated by counting the number of copper particles falling in grids of 100 × 100 m, similar to the sampling areas used by Fernández-Turiel [14]. The concentration contours indicate the soil contamination produced by the smelter over its whole period of operation, whereas the simulation only considers pollution during eastern wind. Taking this into account, there is good agreement between the maximum value indicated by the model and the closed 45 mgCu/kgsoil concentration contour. Other major concentration peaks are studied by Fernández-Turiel [14], but they are associated with other wind directions. However, when the concentrations simulated by the model are calculated using grids of 10 × 10 m, as shown in **Figure 11**, the plumes of each chimney can be distinguished.

The model shows that the concentrations of copper particles have little lateral dispersion. This implies that if concentrations are calculated using smaller grid cells, they will be much higher in the area where the particles fall. This is why the legends of the two figures differ by a factor of 10. If the particle size were smaller, lateral dispersion would likely increase.

The concentrations shown by Fernández-Turiel [14] to the west of the smelter location still indicate the presence of copper particles beyond those shown in the simulation. If copper particles of different diameters were simulated, the smaller particles would likely travel a greater distance before reaching the ground.

## **5. Conclusions**

In this chapter, we have emphasized the use of computer simulation tools for the atmospheric dispersion of gases and solid particles.

The first part of the chapter presented the large eddy simulation (LES) approach for numerically solving high Reynolds number turbulent flows such as those present in the atmospheric boundary layer. Next, the coupling between the large eddy simulation (LES) and the Lagrangian stochastic one-particle model (STO) was presented, detailing how to dynamically calculate the model coefficients based on the turbulent kinetic energy of the fluid flow.

The validation of these tools has relied on experimental measurements in a wind tunnel and in situ measurements. The stochastic Lagrangian one-particle method enables simulations with modest computational cost and good results. The results of the numerical simulation using the ARPS code (LES-STO) for the concentrations of CO2 emitted by a chimney upwind of a hill are in good agreement with experimental measurements performed in a wind tunnel [13].

Fernández-Turiel [14] found high trace levels of copper in soil and plants in the vicinity of the smelter. As a consequence, it is important to determine the extent of the contaminated area and the concentration of these elements, which are potential hazards due to inhalation and ingestion. The model has been used to simulate the atmospheric dispersion of these particles emitted from the smelter chimneys during a severe eastern wind event. The results closely matched the field measurements made by these authors.

Finally, this approach has been able to predict the dispersion and concentration levels of pollutants (gases and/or solid particles) in the atmosphere with the aim of predicting the impact on populations and preventing environmental problems. This information can be used to determine the affected areas close to industrial plants and the emission control protocols that need to be applied during specific weather conditions. By understanding local pollutant concentrations and their movement, future industrial sites could be placed in locations that minimize impacts on the surrounding population.

## **Author details**


César Augusto Aguirre1,2\* and Armando Benito Brizuela2

\*Address all correspondence to: cesaraguirredalotto@gmail.com

1 Center of Scientific Research and Technology Transfer Production - National Scientific and Technical Research Council - Argentina (CICyTTP - CONICET), Diamante, Entre Rios, Argentina

2 School of Agricultural Sciences - National University of Entre Ríos (FCA - UNER), Oro Verde, Entre Rios, Argentina

## **References**


[1] Deardorff J. W. *A numerical study of three-dimensional turbulent channel flow*. Journal of Fluid Mechanics. 1970;41:453–480.

[2] Deardorff J. W. *The use of sub-grid transport equations in a three-dimensional model of atmospheric turbulence*. Journal of Fluid Engineering. 1973;4:429–438.

[3] Schumann U. *Sub-grid scale model for finite difference simulations of turbulent flow in plane channels and annuli*. Journal of Computational Physics. 1975;18:376–404.

[4] Pope S. B. *PDF methods for turbulent reactive flows*. Energy Combustion Science. 1985;11:119–192.

[5] Haworth D. C. and Pope S. B. *A generalized Langevin model for turbulent flow*. Physics of Fluids. 1986;29(2):378–405.

[6] Pope S. B. *Lagrangian PDF methods for turbulent flows*. Annual Review of Fluid Mechanics. 1994;26:23–36.

[7] Gicquel L. Y. M., Givi P., Jaberi F. A. and Pope S. B. *Velocity filtered density function for large-eddy simulation of turbulent flow*. Physics of Fluids. 2002;14(3):1196–1213.

[8] Aguirre C. A. *Dispersión et Mélange Atmosphérique Euléro-lagrangien de Particules Fluides Réactives. Application à des cas simples et complexes* [thesis]. Université Claude Bernard, Lyon, France: Ecole Doctorale MEGA; 2005. 323 p. Available from: http://venus.ceride.gov.ar/twiki/pub/Cimec/RepositorioDeInformesTesis/aguirre-phd-thesis.pdf

[9] Aguirre C. A., Brizuela A. B., Vinkovic I. and Simoëns S. *A sub-grid Lagrangian stochastic model for turbulent passive and reactive scalar dispersion*. International Journal of Heat and Fluid Flow. 2006;27(4):627–635.

[10] Aguirre C. A., Brizuela A. B., Vinkovic I. and Simoëns S. *Eulero-Lagrangian coupled model for the simulation of atmospheric dispersion of chemically reactive species into the boundary layer*. Serie Mecánica Computacional. 2006;25(2):185–205.

[11] Vinkovic I., Aguirre C. A., Ayrault M. and Simoëns S. *Large-eddy simulation of the dispersion of solid particles in a turbulent boundary layer*. Boundary-Layer Meteorology. 2006;121:283–311. DOI: 10.1007/s10546-006-9072-6

[12] Vinkovic I., Aguirre C. A., Simoëns S. and Gorokhovski M. *Large-eddy simulation of droplet dispersion for inhomogeneous turbulent wall flow*. International Journal of Multiphase Flow. 2006;32(3):344–364.

[13] Gong W. *A wind tunnel study of turbulent dispersion over two- and three-dimensional gentle hills from upwind point sources in neutral flow*. Boundary-Layer Meteorology. 1991;54:211–230.

[14] Fernández-Turiel J. L., Aceñolaza P., Medina M. D., Llorens J. F. and Sardi F. *Assessment of a smelter impact area using surface soils and plants*. Journal of Environmental Geochemistry and Health. 2001;23:65–78.

[29] Aguirre C. A. and Brizuela A. B. *Numerical simulation of atmospheric dispersion of a passive gas on a hill using a coupled model*. Serie Mecánica Computacional. 2008;27(4):217–237.

[30] Kosinski P., Kosinska A. and Hoffmann A. C. *Simulation of solid particles behaviour in a driven cavity flow*. Powder Technology. 2009;191:327–339.

[31] Sommerfeld M. *Analysis of collision effects for turbulent gas–particle flow in a horizontal channel: Part I. Particle transport*. International Journal of Multiphase Flow. 2003;29:675–699.

[32] Aguirre C. A., Guo Y. and Ayrault M. *Dispersion of solid particles in saltation movement in a turbulent flow*. Comptes Rendus Mécanique. 2004;332:627–632.

[33] Aguirre C. A., Simoëns S. and Ayrault M. *Dispersion of solid heavy particles in a homogeneous turbulence*. In: Brebbia C. A. and Martin Duque J. F., editors. *Air Pollution*. 10th ed. Southampton, UK: Wessex Institute of Technology Press; 2002. p. 6.

[34] Gong W. and Ibbetson A. *A wind tunnel study of turbulent flow over model hills*. Boundary-Layer Meteorology. 1989;49:113–148.

[35] Counihan J. *An improved method of simulating an atmospheric boundary layer in a wind tunnel*. Atmospheric Environment. 1969;3:197–214.

[36] Justus C. G. *Wind and system performance*. Philadelphia, Penn, USA: The Franklin Institute Press; 1978. 120 p.

## *Edited by Jill S. M. Coleman*

Natural and environmental hazards research comprises a diverse set of subjects and methodologies and this book is no exception - offering the reader only a small glimpse into the physical and social processes that threaten human interests. Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts explores atmospheric-based hazards through focused investigations ranging from a local to global perspective. Within this short compendium, the major scales of atmospheric motion are well represented with topics on microscale turbulent transport of pollutants, mesoscale events stemming from thunderstorm complexes, and synoptic scale extreme precipitation episodes. Chapters include discussions on modeling aspects for investigating hazards (pollution, regional climate models) and the forecasting and structure of high wind events (derechos), whereas others delve into hazard communication, preparedness, and social vulnerability issues (tornadoes, hurricanes, and lightning). Although the chapters are quite disparate upon first inspection, the topics are united through their interweaving of both the physical and societal mechanisms that create the atmospheric hazard and eventual disaster.

Photo by Anna\_Omelchenko /CanStock

Atmospheric Hazards - Case Studies in Modeling, Communication, and Societal Impacts


*Edited by Jill S. M. Coleman*