It was only last month that I wrote about hurricane-resistant design in my report on the Alabama AIA Annual Convention, held in Orange Beach, Alabama from July 31 to August 2. During my drive to the convention center along the coast of Alabama, I had witnessed first-hand the enormous devastation caused by Hurricane Ivan last year, which was still very much in evidence, and I described some of the sessions at the conference that were related to hurricanes, disaster assessment, reviewing building damage, and so on. Little did anyone present at that convention know that a far more deadly hurricane would soon hit and wash out practically the entire city of New Orleans, and that it would be immediately followed by another powerful hurricane which would worsen the damage and devastate other areas of the Gulf Coast. While the immediate needs of the regions affected by hurricanes Katrina and Rita are monetary contributions and volunteers to assist in the rescue, relief, and rehabilitation efforts, it is going to be a tremendous challenge over the long run to rebuild New Orleans and other affected areas along the Gulf Coast.
While the technological resources we currently have at our disposal were evidently not sophisticated enough to prevent the disastrous consequences of the recent hurricanes, can technology play a more critical role in the future in allowing us to design cities and buildings that don't suffer the same fate New Orleans did? This issue of the "Building the Future" series looks at how technology is being used right now in the relief, rehabilitation, and reconstruction efforts along the Gulf Coast, and what technologies we might need in the future for cities prone to natural disasters to combat them more effectively.
Let us first look at a brief history of hurricanes along the Gulf Coast and New Orleans in particular.
Prior to Katrina and Rita, thirty-four major hurricanes had crossed the Gulf Coast since 1900, from Texas to the Florida Panhandle. Of these, only two were deadlier than Katrina: the 1900 hurricane in Galveston, Texas, which killed over 8000 people and leveled large portions of the city; and the 1928 hurricane at Lake Okeechobee, Florida, which killed close to 2000 people. Katrina's death toll has already crossed 1000, making it the deadliest hurricane in recent times. The two costliest hurricanes prior to Katrina were hurricane Andrew in 1992, which struck Florida and Louisiana and caused an estimated $26.5 billion in damage, and hurricane Charley, which struck Florida last year and caused $15 billion in damage. The damage caused by Katrina dwarfs these numbers many times over, with estimates running as high as $200 billion.
By now, we all know about New Orleans' unusual geography that made it so vulnerable to devastation by a strong hurricane: it is located below sea level; it has Lake Pontchartrain to its north; the Mississippi river runs through the middle of town; and it is bounded on the south by the Gulf of Mexico, where many of the huge storms originate. What is not so well known is that chilling predictions of just how bad the devastation would be, predictions that have now come true with Katrina, had been made by experts for several years. See the article "The Lost City of New Orleans?" published in December 2000, and the five-part series "Washing Away" published in June 2002 in The Times-Picayune. In particular, Part 2 of the latter series, entitled "The Big One," predicted that a major hurricane could decimate the region, and that flooding from even a moderate storm could kill thousands; it was just a matter of time. The writers cited the example of hurricane Georges in 1998, a Category 2 storm that only grazed New Orleans, but still caused a lot of damage and pushed waves to within a foot of the top of the levees that protected the city. Any stronger storm on a slightly different course could realize the worst-case scenario: hundreds of billions of gallons of lake water pouring over the levees into an area averaging 5 feet below sea level with no natural means of drainage. And this is exactly what happened with Katrina.
Despite the many technological advances that society has made as a whole, it seems that we are still very much at the mercy of nature. Technology could not really help to fortify the city of New Orleans and its buildings against the destruction caused by hurricane Katrina, even though a calamity of this nature had been predicted for many years. So what is lacking in our technological repertoire? What tools are needed to design buildings and cities that can withstand natural disasters like hurricanes, earthquakes, and so on? And until we have these, can technology at least help in post-disaster rescue and relief operations, and in planning, directing, and monitoring evacuations better, so that we don't have people helplessly stranded as they were in New Orleans, or 100-mile backups like the one Houston had in anticipation of hurricane Rita? The next section provides an overview of some existing technologies that are helping, followed by a concluding section suggesting future technologies that we need to avoid a repeat of a Katrina-like disaster.
In the immediate wake of the hurricanes, various technology tools were used, first and foremost, in search and rescue operations, many of them developed specifically for these tasks. These include pint-size robots that can move through crevices in a collapsed building to bring water, light, and two-way communications to trapped survivors; miniature robot planes and helicopters that can survey the scene from above and send wireless video back to the team in the field; sensors that can detect signs of life from 3 feet away, based on thermal imaging or even the smell of a survivor's faint breathing; devices and software that can turn walkie-talkies into Internet grids when the phones are out; and sensor systems that can sniff out public health threats in the storm's aftermath. (Some of these technologies were also deployed after the tsunami that hit South Asia at the beginning of the year.) Software tools such as a meta-search engine for survivor lists and interactive maps that match the needy with what's needed are also being developed. For more information on these technologies, see these two MSNBC articles: "Scientists bring gadgets to post-Katrina disaster scene" and "Hardware and software makes life easier for rescuers and rescued."
A critical need in disaster response is enabling and restoring communications, which is where technologies developed by networking companies come in. In the hurricane-affected areas, briefcase-sized mobile communication kits from Cisco, for example, are being deployed. These contain a packaged set of technologies designed to be easily transportable and to provide mobile Internet Protocol (IP)-based wired or wireless data and voice connectivity for areas that have lost or do not have a communications infrastructure. This allows communications to be set up within minutes of arrival at disaster or remote locations.
Another critical technology is GIS (Geographic Information Systems), which is used to view, analyze, and manage geographic knowledge that is represented using a series of information sets. The information sets include interactive maps and globes; databases of geographic data such as features, networks, topologies, terrains, surveys, and attributes; data models capturing the schema, behavior, and integrity rules of geographic data; and collections of procedures for manipulating GIS data for analysis and to automate common tasks. GIS links location to information (such as people to addresses, buildings to parcels, or streets within a network) and layers that information to provide a better understanding of how it all interrelates. One of the ways in which this technology was used in the post-hurricane search and recovery efforts was to provide "geo-addressing"-supplementing street addresses provided by stranded individuals with longitudinal and latitudinal coordinates which could then be used by emergency responders to locate them more easily. GIS technology is also being used to provide mapping support for a variety of governmental agencies, such as up-to-date maps of the New Orleans levee system and geo-coded addresses for water pumps located in the city for the U.S. Army Corps of Engineers. Maps and spatial data are updated daily and continue to be delivered to various task forces from different state and federal agencies to aid in recovery activities. (See this article for more information on how GIS is being used.)
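The geo-addressing idea described above can be illustrated with a minimal Python sketch. The address table, coordinates, and responder positions below are hypothetical stand-ins for what a real GIS would pull from parcel and street-centerline databases; only the great-circle distance formula is standard:

```python
import math

# Hypothetical geo-addressing table: in a real GIS, this lookup would be
# driven by parcel or street-centerline data, not a hard-coded dictionary.
GEOCODE_TABLE = {
    "1234 Canal St, New Orleans, LA": (29.9530, -90.0770),
    "500 Poydras St, New Orleans, LA": (29.9490, -90.0715),
}

def geo_address(street_address):
    """Supplement a street address with (latitude, longitude), or None."""
    return GEOCODE_TABLE.get(street_address)

def haversine_miles(p1, p2):
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3956 * 2 * math.asin(math.sqrt(a))  # mean Earth radius ~3956 mi

def nearest_responder(address, responder_positions):
    """Match a caller's geocoded position to the closest rescue unit."""
    target = geo_address(address)
    return min(responder_positions,
               key=lambda unit: haversine_miles(target, responder_positions[unit]))
```

Dispatching then reduces to geocoding the caller's address and picking the unit with the smallest great-circle distance, which is essentially the matching that geo-addressing makes possible for emergency responders.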
While traditional GIS and mapping software vendors such as ESRI, Autodesk, Intergraph, and MapInfo have been providing data, maps, images, software, and other resources for disaster response, a recently introduced technology from Google called Google Earth is being increasingly used to capture and share geospatial data related to the hurricanes. The Google Earth technology combines satellite imagery, the Google Maps technology, and the power of Google Search to provide access to geographic information globally. You can type an address and zoom right into the location, search for points of interest, get driving directions, as well as tilt and rotate the view to see 3D terrain and buildings. While the basic service is free, there are additional add-ons that can be purchased, including Google Earth Plus, which adds GPS device support, the ability to import spreadsheets, drawing tools, and better printing; and Google Earth Pro for professional and commercial users needing location information in various industries, including AEC and real estate. Google Earth has a dedicated hurricane page where imagery of the impact of hurricane Katrina is being constantly added, supplied by agencies such as NOAA (National Oceanic and Atmospheric Administration), NGA (National Geospatial-Intelligence Agency), Space Imaging, and Digital Globe, as well as by individual users. This imagery can be viewed as "image overlays" in Google Earth, i.e., it loads on top of the pre-Katrina New Orleans base map (see Figure 1). It is proving to be an effective way to integrate data about the impact of the hurricanes from various sources as well as disseminate it to anyone who needs it.
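The image-overlay mechanism itself is driven by small KML files, which Google Earth reads to drape an image over its base map. Below is a hedged Python sketch that generates a minimal KML GroundOverlay; the image URL and bounding-box coordinates are placeholders for illustration, not actual NOAA or NGA products:

```python
# Generate a minimal KML file containing a single GroundOverlay: an image
# draped over the base map between the given latitude/longitude bounds.
# The overlay name, image URL, and bounds passed in are illustrative only.
def ground_overlay_kml(name, image_href, north, south, east, west):
    """Return a KML string for one GroundOverlay covering the given bounds."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

# Example: a (hypothetical) post-storm image positioned over New Orleans.
kml = ground_overlay_kml("Post-Katrina imagery (example)",
                         "http://example.com/overlay.png",
                         north=30.10, south=29.80, east=-89.90, west=-90.20)
```

Saving the string to a `.kml` file and opening it in Google Earth would display the image on top of the pre-existing base map, which is essentially how agency-supplied Katrina imagery was being shared.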
One of the major problems facing the reconstruction effort along the Gulf Coast is understanding the actual existing conditions of the area in which work will be done. Without reliable "as-is" documentation, teams will have to manually measure the existing dimensions in the field, which will be time-consuming, expensive, error-prone, and potentially unsafe due to inherent hazards. This is where technologies to document as-is conditions using laser scanning, developed by vendors such as Quantapoint, will be useful. Quantapoint specializes in the offshore, power, process, and architectural markets, and has already committed to providing its services to restore operations to the significantly damaged process industries in the Gulf Coast region. Its as-built laser documentation provides more accurate and complete information for design, fabrication, and construction decisions, including accurate dimensional fit-up, pre-fabrication, and clash detection. Figure 2 shows a 3D model of an offshore platform created by Quantapoint's laser scanning technology in its Prism 3D interface, an application for managing, sharing, and extracting dimensional and other information from a 3D laser scan.
While technologies for predicting hurricanes and other weather-related phenomena are already sophisticated and are routinely used by meteorologists, we also need better simulation tools that can present accurate, real-time 3D simulations of various disaster scenarios such as flooding, forest fires, toxic gas spreading, oil spills, dams breaking, etc., in neighborhoods and cities. I came across one promising technology that can begin to address this need: the 3D NGRAIN technology, which has been licensed by Aero Geometrics Ltd., a Vancouver-based mapping company. With this technology, a 3D image of an entire city, complete with buildings and topographic information, can be captured in a file size of approximately 10 MB (small enough to be posted online) and various simulations can be run. For example, the upper image in Figure 3 shows a 3D image of the city of Miami as it currently is, untouched by disaster, while the lower image shows what areas would be flooded if the water level rose to 10 feet. The level of flooding can be manipulated interactively in the application, allowing city officials using it to predict the progress of the disaster should a flood actually occur, see which buildings were the most vulnerable, and determine which areas to evacuate first. Had a technology like this been deployed in New Orleans prior to Katrina, the evacuation warnings could have been given ahead of time and in a much more systematic manner.
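The core of such a flood scenario can be sketched in a few lines of Python. This toy model reduces each building to a single ground elevation in feet relative to sea level; the building names and elevations are invented for illustration, and a real tool would of course operate on full 3D city geometry rather than a lookup table:

```python
# Toy flood-scenario model: given each building's ground elevation (feet
# relative to sea level) and a simulated water level, report which
# buildings are flooded and how deeply, deepest (most vulnerable) first.
def flooded_buildings(buildings, water_level_ft):
    """Return [(building, flood_depth_ft), ...] sorted by depth, deepest first."""
    depths = {name: water_level_ft - elev
              for name, elev in buildings.items()
              if elev < water_level_ft}
    return sorted(depths.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical city data; elevations in feet relative to sea level.
city = {"Hospital": -5.0, "School": 2.0, "City Hall": 12.0}

# Simulate the water level rising to 10 feet, as in the Miami example.
report = flooded_buildings(city, water_level_ft=10.0)
```

Varying `water_level_ft` interactively is what lets officials watch the flood's progress, see which buildings are most vulnerable, and prioritize evacuation zones.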
With regard to the construction of individual buildings and structures such as levees, flood walls, bridges, and so on, there is no dearth of academic research and professional knowledge on how to make them better capable of withstanding hurricanes. For example, the PATH (Partnership for Advancing Technology in Housing) website lists a number of building techniques to improve hurricane resistance; there is a proposal to use "smart concrete" to strengthen levees and monitor their reliability; computer models are being developed to predict hurricane damage in buildings, which in turn can be used to guide the development of hurricane-withstanding structures; and so on. (Links to several such resources can be found on the ASCE website and the iCivilEngineer website.) What is missing is an integration of this know-how with the tools architects and engineers are using to design buildings and other structures, and this is where AEC technology comes in. As I suggested in the last "Building the Future" article, BIM's ability to support analysis and evaluation of buildings is going to yield much more significant and far-reaching benefits in the long term than its short-term benefits of producing a better coordinated and more accurate drawing set more speedily and efficiently. We can capture various hurricane-resistant design principles (or earthquake-resistant design principles in areas prone to earthquakes) in analysis tools and run our BIM models through them to receive feedback on how well the design meets the selected criteria, as well as suggestions for improvement. How soon this scenario can be realized is hard to predict, but it can only happen once intelligent, semantically rich representations of buildings become the norm rather than the exception. So it would certainly help if the building industry could transition to BIM as soon as possible.
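As a thought experiment, the rule-checking workflow described above might look like the following Python sketch. The rules and the model fields are hypothetical stand-ins for the kinds of hurricane-resistance criteria catalogued on the PATH website; no real BIM API or actual code requirement is used here:

```python
# Sketch of running a building model through hurricane-resistance checks.
# Each rule pairs a predicate over the model with a suggested improvement.
# Rule thresholds and model fields are invented for illustration only.
RULES = [
    ("roof uplift",
     lambda m: m["roof_strap_spacing_in"] <= 48,
     "Install hurricane straps at closer spacing to resist roof uplift"),
    ("opening protection",
     lambda m: m["impact_rated_windows"],
     "Specify impact-rated glazing or storm shutters"),
    ("elevation",
     lambda m: m["first_floor_elev_ft"] >= m["base_flood_elev_ft"],
     "Raise the first floor to or above the base flood elevation"),
]

def check_model(model):
    """Return (rule_name, suggestion) for every failed check."""
    return [(name, fix) for name, test, fix in RULES if not test(model)]

# A hypothetical model extracted from a BIM design; a real workflow would
# read these attributes from the semantically rich building model itself.
design = {"roof_strap_spacing_in": 72, "impact_rated_windows": True,
          "first_floor_elev_ft": 1.0, "base_flood_elev_ft": 3.0}

issues = check_model(design)
```

The point of the sketch is the shape of the workflow: an intelligent building representation supplies the attributes, a rule library encodes the design principles, and the output is exactly the kind of feedback-plus-suggestions loop the article envisions.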
What we also need is an extension of the BIM concept to the level of neighborhoods and cities, perhaps in the form of a "city information model" (CIM) which can capture all the critical data about a city's geographical location, topology, major roads, bridges, buildings, and so on in an intelligent format. In time, we could also find a smart way of integrating the BIM models of individual buildings within the city's CIM, so that we have a highly accurate and detailed digital replica of a city which can be subjected to sophisticated analysis and simulations. We could then predict the impact of a hurricane, earthquake, tsunami, gas leak, bioterrorist hazard, or any other kind of conceivable disaster not only on the city as a whole but on individual buildings and neighborhoods within the city as well. Just as BIM technology can help to better integrate different aspects of a building such as space, structure, mechanical systems, and so on, CIM technology could eventually help to better integrate the different structures and services within a city, allowing it to operate in a more holistic manner and deal with a disaster more effectively.
Lachmi Khemlani is founder and editor of AECbytes. She has a Ph.D. in Architecture from UC Berkeley, specializing in intelligent building modeling, and consults and writes on AEC technology. She can be reached at email@example.com.
AECbytes content should not be reproduced on any other website, blog, print publication, or newsletter without permission.