**3. Building 3-D cyberspace in the 2000s**

As the new millennium arrived, we had accumulated a large enough sample of 3-D cyberspace experiences to develop a preference for specialized ones: experiences designed for subsets of professional roles that could benefit from participating in visual environments that facilitated their work and built community around it. We had spent enough hours in 3-D cyberspace, mostly in its desktop VR form, that our minds lingered there every day, as if in parallel lives. We concluded that creating a general Metaverse, as described in the book *Snow Crash*, required a specialty we were not comfortable acquiring.

As a result, we formed deeper bonds with other academic departments on campus. I turned my attention to designing other useful 3-D cyberspace experiences for oceanographers. I chose oceanographers because they studied one of the two large 3-D volumes in which life thrived on the planet. The atmospheric scientists who studied the spaces where birds soared seemed to be further ahead technologically, thanks to popular human interest in weather and weather forecasts. The 3-D world of life inside human anatomy also seemed to be ahead technologically, thanks to the support of medical practices.

As we were contemplating 3-D virtual experiences to spur insight into the nature of tsunamis and the underwater earthquakes and volcanoes that can produce them, Hurricane Katrina devastated the Gulf Coast of the United States [11]. A representative of the Federal Emergency Management Agency came to our lab to share stories of how the emergency response effort in New Orleans had been deemed suboptimal. Dr. Furness and I spent hours listening and then suggesting ways that VR could improve future hurricane response efforts. We began to attend emergency operations meetings at the campus, city, county, and regional levels and concluded that the typical resident in a community crisis lacked situation awareness even more than the emergency responders did, despite being physically based in that community.

We attended one particular meeting, of many, that drove that point home. A red-faced emergency operations center director with over 30 years of EOC experience railed against the audacity of his county's residents in response to a severe wind event that had blown through the county just south of Seattle's. Overnight, over 250,000 households had lost power through thousands of breach points on the power grid. Linemen were working 16-hour days to restore power, and yet hundreds of residents were calling in to the emergency operations center to complain that power wasn't being restored fast enough. He yelled to anyone listening that two-week outages should have been expected, given the number of qualified workers available to repair the electric grid and the number of homes in the dark.

Our personal experiences in 3-D cyberspace strongly suggested that emergency preparedness was ripe to benefit from a thriving VR experience that residents could explore and engage in with others, including EOC personnel when available. We put together three hypotheses to test our instincts and then worked with three other advisors to whittle our enthusiasm down to a reasonable scope of experiments to perform.

We published the results in four papers [12–15] and a dissertation [16], comparing the results of a physical hospital evacuation scenario drill with a virtual one. In the virtual drill, emergency responders explored the physical environment in which the evacuation was to take place by way of a software application. They could study the hospital's physical layout, transfer points to transportation, the county's road network, and the 23 other hospital locations that had agreed, in an official memorandum of understanding, to accept patients in an emergency.

Those in the study had access to a simulator where they could watch evacuation paths and timings for any patient in any room, based on time and motion studies done with physical evacuation efforts (**Figure 3**). They could also watch simulated transport vehicles carrying patients to destination hospitals.
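To make the time-and-motion framing concrete, here is a minimal sketch, with assumed parameters and names that are illustrative rather than taken from the study, of how an evacuation timing for a single patient might be estimated from per-segment times:

```python
# Minimal sketch (not the dissertation's code): estimating an evacuation
# timeline for one patient from time-and-motion parameters. All names and
# numbers below are illustrative assumptions, not values from the study.

from dataclasses import dataclass

@dataclass
class Patient:
    room: str
    ambulatory: bool          # can the patient walk with assistance?
    needs_life_support: bool  # requires powered equipment during transport

# Hypothetical per-segment times, in the spirit of a time-and-motion study:
# preparation, horizontal movement per meter, vertical movement per floor,
# and hand-off at the transfer point (all in seconds).
PREP = {"ambulatory": 120, "wheelchair": 300, "bed_with_support": 900}
PER_METER = {"ambulatory": 1.2, "wheelchair": 1.8, "bed_with_support": 2.5}
PER_FLOOR = {"ambulatory": 45, "wheelchair": 180, "bed_with_support": 300}
HANDOFF = 180

def transport_mode(p: Patient) -> str:
    if p.needs_life_support:
        return "bed_with_support"
    return "ambulatory" if p.ambulatory else "wheelchair"

def evacuation_seconds(p: Patient, corridor_meters: float, floors_down: int) -> float:
    """Estimate seconds from 'go' until the patient reaches the transfer point."""
    mode = transport_mode(p)
    return (PREP[mode]
            + corridor_meters * PER_METER[mode]
            + floors_down * PER_FLOOR[mode]
            + HANDOFF)

# Example: a non-ambulatory patient on the 4th floor, 60 m from the elevator.
patient = Patient(room="4-412", ambulatory=False, needs_life_support=False)
print(evacuation_seconds(patient, corridor_meters=60, floors_down=3))
```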

A server computer communicated with the simulation clients that provided the graphical user interface for participants. The server kept track of 86 key variables describing simulated patients, simulated emergency responders, transport vehicles, and events that required dynamic diversions to evacuation plans (e.g., traffic congestion). The hospital emergency response coordinator challenged the behavior of those variables as she became familiar with the client application I iterated upon in preparation for the experiments.
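As a rough illustration of that client/server split, the sketch below assumes a small server object that owns the simulation state and serializes snapshots for GUI clients, delaying a transport vehicle when a disruptive event such as traffic congestion is reported; the 86 actual variables and their behaviors are not reproduced here.

```python
# A minimal sketch (assumed, not the study's code) of the server-side loop:
# the server owns the simulation state and produces snapshots for GUI
# clients, holding transport vehicles back when a road segment is reported
# as congested.

import json
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vehicle_id: str
    route: list           # ordered list of road segment names
    position: int = 0     # index of the segment the vehicle is currently on
    delayed: bool = False

@dataclass
class SimulationServer:
    vehicles: dict = field(default_factory=dict)
    closed_segments: set = field(default_factory=set)

    def report_congestion(self, segment: str) -> None:
        """A dynamic event: mark a road segment as impassable."""
        self.closed_segments.add(segment)

    def step(self) -> None:
        """Advance every vehicle one segment, holding it back at closures."""
        for v in self.vehicles.values():
            nxt = v.position + 1
            if nxt >= len(v.route):
                continue                 # already at the destination hospital
            if v.route[nxt] in self.closed_segments:
                v.delayed = True         # placeholder for a real reroute
            else:
                v.position, v.delayed = nxt, False

    def snapshot(self) -> str:
        """Serialize state for GUI clients (a stand-in for the real protocol)."""
        return json.dumps({vid: {"segment": v.route[v.position], "delayed": v.delayed}
                           for vid, v in self.vehicles.items()})

server = SimulationServer()
server.vehicles["ambulance-1"] = Vehicle("ambulance-1", ["I-5 N", "exit 164", "surface st"])
server.report_congestion("exit 164")
server.step()
print(server.snapshot())   # ambulance-1 held on "I-5 N", flagged as delayed
```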

Adapting the server to feed a 3-D cyberspace version of the training and participation interface would have been trivial and would have added an immaterial computational demand, since every object in the simulation was tied by gravity to a ground plane. Porting the participant experience itself to 3-D cyberspace, by comparison, would have required significant effort.
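The server-side argument can be seen in a few lines: because every object sits on a ground plane, producing a 3-D coordinate amounts to a height lookup over the existing 2-D position. The elevation function below is an assumed stand-in for whatever terrain or floor data a real port would query.

```python
# Sketch of the server-side change only: lift an existing 2-D simulation
# position onto the ground plane. ground_height() is an assumed placeholder
# for real terrain or floor-elevation data.

def ground_height(x: float, y: float) -> float:
    """Assumed elevation model; a real port would query terrain or floor data."""
    return 0.0

def to_3d(x: float, y: float) -> tuple:
    """Produce a 3-D coordinate for an object tied by gravity to the ground."""
    return (x, y, ground_height(x, y))

print(to_3d(47.61, -122.33))   # (47.61, -122.33, 0.0)
```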

**Figure 3.**

*The software support services for an emergency response hospital evacuation simulation application (top). Participants preferred 2-D artifacts to explore and interact with the situation as it unfolded (bottom). The 2-D hospital floor layout was one already used often in day-to-day operations, with the characteristics of active patients encoded in color from the data sheets used in those same operations.*

Earthquake, tornado, and fire events have been included in the SimCity game since the 1990s. Players can watch the effects of those community crises and gain insight into how urban planning matters for resilience and recovery. The third-person view of a comprehensive emergency response effort is useful. The first-person view, by contrast, exposes how limited an individual's situation awareness is, highlighting the importance of communication among all those limited views in order to assemble a third-person view from first-person experiences.

Dr. Furness and I came to the conclusion that one of the roles in an emergency operations center should be a dedicated situation awareness specialist who spends much of his time in 3-D cyberspace, piecing together situation awareness from the first-person reports of emergency responders coordinated with 3-D graphical assets. Modeling and computational support would be provided by a coupled computing environment containing heuristic-based and forecasting equations. The immersed VR role would be available to the command-and-control leadership in the EOC to answer questions about the feasibility and appropriateness of actions and to evoke insight that command and control might not otherwise consider.

I came to the conclusion that the 3-D cyberspace developed for EOC personnel and emergency responders should also be made available to the general public by way of a public-facing 3-D cyberspace experience. Immersive VR might build compassion for the cognitive demands, under urgent conditions, of the first response effort. It would let the public experience the delegation of roles involved, preparing them with reasonable expectations for interacting with those roles in an emergency. Overall, the immersive VR perspective would build up experiential knowledge to go with whatever more abstract knowledge the public has of their community.
