21 November 2017

Flood mapping for large areas (e.g. from catchments up to a national level) is relatively new and challenging; associating mapped flood-prone areas with the annual chance of occurrence of a flood event (expressed as a probability, say 1%) involves many kinds of uncertainty and is even harder.

Flood maps that indicate various hazard levels over a large area are often in short supply for insurance, emergency management, planning and other applications. This creates a perplexing situation in which some users resort to open and accessible flood maps, such as FEMA’s National Flood Hazard Layer, for all kinds of applications and interpretations, while ignoring the lineage and caveats attached to the underlying dataset. No single flood map is a panacea!

There is a real need to create new, insightful information products that can be used to proactively investigate flood-prone areas across a whole range of scales – from sites to river basins to a national level. Elevation is the single most important variable in determining flood hazard levels, and we focus on it by developing three flood analysis tools. Each can be implemented at scale and is a significant undertaking. We hope these tools provide more geographic context and shed light on some of the key hydrodynamic processes behind flooding.

1. Address-level Location Profile Report: The Importance of Elevation for Flood Mapping

The USGS has been developing and constantly updating the critical National Elevation Dataset (NED) over the past few decades, including DEMs at ~60m, ~30m, ~10m, ~3m and ~1m resolutions (the resolutions are approximate because the original NED is delivered in geographic coordinates, in units of decimal degrees). The two nationwide DEMs at ~30m and ~10m resolutions appear to be widely used for large-area flood mapping by various vendors. As elevation is the most critical input for flood mapping, it is useful to keep the following two major issues in mind:

  • Parts of the DEM at a given resolution may have been produced in earlier years with different production methods and lower quality levels. Figure 1 shows one aspect of this: the currency of the DEMs by year. Full information can be found in the current and historical release notes on the USGS 3D Elevation Program (3DEP) websites.

  • The vertical accuracy of the NED. For example, the vertical accuracy of the ~10m NED is 3.04m at the 95% confidence level, according to a USGS assessment report published in 2014 (see the sketch after this list). It is reasonable to expect improved vertical accuracies in more recent versions of the NED.
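As a quick aside, the 95%-confidence figure above follows the NSSDA convention, under which vertical accuracy at the 95% confidence level is obtained by scaling the RMSE of the elevation errors (assumed normally distributed) by 1.96. A minimal sketch of the conversion (the RMSE value is back-calculated here for illustration, not quoted from the USGS report):

```python
# NSSDA convention: vertical accuracy at the 95% confidence level equals
# 1.9600 x RMSE_z, assuming normally distributed elevation errors.
def nssda_vertical_accuracy(rmse_z_m: float) -> float:
    return 1.9600 * rmse_z_m

# Example: an RMSE of ~1.55 m maps to ~3.04 m at 95% confidence,
# consistent with the figure quoted above for the ~10m NED.
print(nssda_vertical_accuracy(1.55))  # ~3.04
```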

A recent, highly commendable showcase (hosted by the Argo Group) comparing U.S. flood models reported that, for the same set of exposure locations, a large dispersion of underlying elevation values was observed among four independent vendors. This suggests that the DEMs, albeit all sourced from the USGS, differ in some way. Indeed, underlying elevation values can be modified by projections, spatial interpolation, data-type conversion and so on. It is common to observe elevation differences of up to a couple of metres for the same site after some post-processing. All of this has serious implications for flood mapping and potential loss estimation. A rethink is warranted on how to reconcile such large elevation differences at the very first stage of flood modelling.
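To see how easily such differences arise, the same coordinate can be sampled from two differently post-processed copies of a USGS-sourced DEM. A minimal sketch using the open-source rasterio library (the file names are illustrative placeholders, not actual products):

```python
import rasterio
from rasterio.warp import transform

# Illustrative file names: two differently post-processed copies of the same
# USGS-sourced DEM (e.g. reprojected or resampled by different vendors).
DEM_PATHS = ["ned_10m_original.tif", "ned_10m_reprojected.tif"]

# A single exposure location (longitude, latitude in WGS84).
lon, lat = -95.3698, 29.7604  # Houston, TX

for path in DEM_PATHS:
    with rasterio.open(path) as src:
        # Re-express the WGS84 coordinate in the raster's own CRS.
        xs, ys = transform("EPSG:4326", src.crs, [lon], [lat])
        # Sample the elevation of the cell containing the point (band 1).
        value = next(src.sample([(xs[0], ys[0])]))[0]
        print(f"{path}: {value:.2f} m")
```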

Figure 1: The currency of the USGS National Elevation Dataset (August 2017 release). Source: USGS.

In Australia, national elevation data is provided by Geoscience Australia. About 75% of populated areas are covered by LiDAR-derived DTMs, typically at 5m resolution.

Using the above publicly-available national elevation datasets, along with other data sources, we have developed cloud-based analytics platforms (PropertyLocation360.com and PropertyLocation.com.au) to rapidly generate address-level location profile reports. In each report, elevation is examined at a granular level and from multiple perspectives, including 3D views, slopes and water flow directions. All of these are closely related to flood modelling. In addition, the various visualisation methods being explored are useful for assessing the quality of the underlying elevation data, e.g. for identifying detailed ground features (such as levees and roads) and any artefacts or abnormal terrain patterns.
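For readers interested in how such terrain metrics are derived, slope and a D8-style (steepest-descent) flow direction can be approximated directly from an elevation grid. The sketch below is a simplified, generic illustration rather than the implementation used on the platforms:

```python
import numpy as np

def slope_degrees(dem: np.ndarray, cellsize: float) -> np.ndarray:
    """Slope in degrees from central-difference elevation gradients."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def d8_flow_direction(dem: np.ndarray) -> np.ndarray:
    """Index (0-7) of the steepest-descent neighbour of each cell."""
    # Offsets of the 8 neighbours: E, SE, S, SW, W, NW, N, NE.
    offsets = [(0, 1), (1, 1), (1, 0), (1, -1),
               (0, -1), (-1, -1), (-1, 0), (-1, 1)]
    drops = np.full(dem.shape + (8,), -np.inf)
    for k, (di, dj) in enumerate(offsets):
        shifted = np.roll(np.roll(dem, -di, axis=0), -dj, axis=1)
        # Elevation drop per unit distance towards neighbour k.
        drops[..., k] = (dem - shifted) / np.hypot(di, dj)
    # Edge cells use wrapped neighbours via np.roll; mask borders in practice.
    return np.argmax(drops, axis=-1)

# Example on a tiny synthetic DEM (values in metres, 10 m cells).
dem = np.array([[12.0, 11.0, 10.0],
                [11.0, 10.0,  9.0],
                [10.0,  9.0,  8.0]])
print(slope_degrees(dem, cellsize=10.0))
print(d8_flow_direction(dem))
```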

Figure 2 shows an example of various elevation metrics for a flood-prone address in Houston, TX. Many sample reports for the contiguous United States (CONUS) and Australia are available at the above websites.

Figure 2: Flood-related elevation metrics included in a typical exposure location profile report. (The full report was prepared before Hurricane Harvey hit the region in 2017.)

2. Flood Simulation by Elevation

Flood simulation by elevation is commonly referred to as the bathtub or bucket-fill approach. It is often criticised for its simplicity and inability to incorporate complex hydrodynamic processes. But for many applications involving large areas or of an aggregate nature (e.g. exposure assessment), this easy-to-understand approach remains a handy exploratory tool. Whether for coastal inundation or inland flooding, low-lying or flat floodplains can be delineated efficiently (e.g. Figure 3).

Our main contribution here is a cloud-based, automated approach to such simulations. As with the first tool, a flood simulation by elevation can be generated and delivered within seconds for any location in the contiguous U.S. and Australia, using the two cloud platforms above.
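The core of the bathtub approach is simple: flag every cell at or below a chosen water level, then keep only the flagged cells that are hydraulically connected to a seed cell (e.g. on the coastline or a river). A minimal, generic sketch (not the platforms' implementation) using numpy and scipy:

```python
import numpy as np
from scipy import ndimage

def bathtub_inundation(dem: np.ndarray, water_level: float,
                       seed: tuple[int, int]) -> np.ndarray:
    """Boolean mask of cells at/below water_level and connected to the seed."""
    wet = dem <= water_level
    # Label 4-connected groups of wet cells and keep only the seed's group,
    # so isolated low-lying depressions are not flagged as flooded.
    labels, _ = ndimage.label(wet)
    return labels == labels[seed] if wet[seed] else np.zeros_like(wet)

# Example: flood extent for a 2 m water level, seeded at the river cell (2, 0).
dem = np.array([[3.0, 3.5, 4.0],
                [1.5, 1.8, 3.9],
                [0.5, 1.0, 3.8]])
print(bathtub_inundation(dem, water_level=2.0, seed=(2, 0)))
```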

Figure 3: Flood simulation by elevation for an address in Houston, TX. Related elevation metrics are shown in Figure 2.

3. Flood Mapping by Inundation Depth

We have developed new geospatial processing routines for this by capturing two key flood attributes – catchment size (the two-dimensional component) and inundation depth (the vertical component). We develop large-area flood mapping with aggregate analysis in mind, so it differs from classic event-based flood models. External factors, such as mitigation measures (e.g. levees, unless clearly reflected in the underlying elevation data) and drainage capacities, are not considered.

Figure 4 illustrates this approach: the flood extent expands as the inundation depth increases. The approach is adaptive, in that the mapped flood extent is governed by the minimum catchment size chosen and by the maximum inundation depth derived from historical records or statistical extrapolation. These two parameters determine the flood mapping across scales: a small minimum catchment size combined with a large inundation depth delineates flood-prone areas conservatively, which suits risk-averse applications. Potential riverine and coastal flooding is covered, while flash flooding is excluded. We apply this new approach to large-area flood mapping in the contiguous U.S. and Australia.
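Conceptually, mapping by inundation depth can be sketched in a way similar to the Height Above Nearest Drainage (HAND) idea: a cell is flagged as flood-prone if its elevation above the nearest drainage cell is within the chosen depth. The simplified illustration below (straight-line rather than flow-path distance to drainage, and no catchment-size filtering) is not our production routine, but it captures how the extent grows with depth:

```python
import numpy as np
from scipy import ndimage

def depth_based_flood_extent(dem: np.ndarray, streams: np.ndarray,
                             depth_m: float, cellsize: float) -> np.ndarray:
    """Boolean flood mask: cells whose height above the nearest stream cell
    (straight-line nearest, not flow-path nearest) is <= depth_m."""
    # For every cell, find the indices of the nearest stream (drainage) cell.
    _, (ii, jj) = ndimage.distance_transform_edt(
        ~streams, sampling=cellsize, return_indices=True)
    height_above_drainage = dem - dem[ii, jj]
    return height_above_drainage <= depth_m

# Example: the mapped extent expands as the inundation depth increases.
dem = np.array([[5.0, 3.0, 2.0, 3.0, 6.0],
                [5.0, 2.5, 1.0, 2.5, 6.0],
                [5.0, 3.0, 2.0, 3.0, 6.0]])
streams = dem <= 1.0          # treat the channel cell(s) as drainage
for depth in (0.5, 1.5, 2.5):
    n_cells = depth_based_flood_extent(dem, streams, depth, 30.0).sum()
    print(f"depth {depth} m -> {n_cells} flooded cells")
```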

Figure 4: An example of flood mapping by inundation depth.

3.1 The Contiguous U.S. 

Figure 5 shows an example of modelled flood extent that is largely in line with FEMA’s flood map, suggesting that the new flood mapping may be used to fill some of the coverage gaps present in the current FEMA dataset. We can provide many separate validations for interested users in this regard. For large, flat floodplains, our current approach tends to be conservative, overestimating flood-prone areas under some large depth thresholds.

Figures 6 and 7 show mapped flood-prone areas at the state and national levels in the contiguous U.S. These results are particularly useful for evaluating exposure concentration at a broad level, which might affect a company’s bottom line when catastrophic events occur.

Figure 5: Comparison of flood mapping results. Left: red polygons show FEMA’s National Flood Hazard Layer (1% Annual Exceedance Probability); right: blue areas show modelled flood-prone areas by inundation depth. Minimum catchment size for mapping ~7 sq km.

Figure 6: Modelled major flood-prone areas in parts of South Carolina and North Carolina. Minimum catchment size for mapping ~50 sq km.

Figure 7: Modelled major flood-prone areas in the contiguous U.S. Boundaries of 18 USGS HUC-2 regions (Hydrologic Unit Code, 2-digit) are superimposed. Minimum catchment size for mapping ~450 sq km.

3.2 Australia 

We extend the same approach to the Australian continent, analysing the 30m-resolution DEM-H dataset from Geoscience Australia. Figure 8 demonstrates how the flood extent grows as the inundation depth increases for the area near the confluence of the Brisbane and Bremer Rivers, one of the most flood-prone regions in Australia.

Figure 9 shows modelled major flood-prone areas at the national level. The high-risk areas are concentrated in the two eastern coastal drainage divisions (broadly, east of the Great Dividing Range). To our knowledge, this is the first time that such flood mapping (with depth information) has been done for the whole country. We will publish a separate post demonstrating more results for Australia.

Figure 8: Illustration of changing flood extent in response to various inundation depths. Minimum mapping unit for catchment size ~50 sq km.

Figure 9: Modelled major flood-prone areas in Australia. Boundaries of 12 drainage divisions are superimposed. Minimum catchment size for mapping ~450 sq km.

4. Return Periods?

When discussing the frequency of flood events, we advocate using the annual chance of occurrence expressed as a probability (say 1%), or more precisely the Annual Exceedance Probability (AEP), in order to avoid the potential misinterpretation of Return Periods or Average Recurrence Intervals by non-professionals. Note that all of these are merely pragmatic measures that quantify flood risk in the ballpark; in theory, no such rigid numbers exist for a natural system that is nonstationary in nature. A case in point: global warming and climate change have been altering the frequency and magnitude of floods in space and time.
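For reference, AEP and Average Recurrence Interval (ARI) are related but not interchangeable. Under the common assumption that event occurrences follow a Poisson process, AEP = 1 − exp(−1/ARI), so a 100-year ARI corresponds to an AEP of roughly (but not exactly) 1%:

```python
import math

def aep_from_ari(ari_years: float) -> float:
    """Annual Exceedance Probability from an Average Recurrence Interval,
    assuming event occurrences follow a Poisson process."""
    return 1.0 - math.exp(-1.0 / ari_years)

for ari in (5, 20, 100):
    print(f"ARI {ari:>3} years -> AEP {aep_from_ari(ari):.3%}")
# ARI 100 years -> AEP ~0.995%, close to (but not exactly) 1%
```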

As flooding is a compound phenomenon, its frequency should be tied more closely to water levels than to regional rainfall or precipitation. We treat the estimation of flood frequency as a separate task in addition to the mapping of flood-prone areas above. Historical water-level records from monitoring gauges may be used for such estimation. (Of course, flood water levels can also be affected by a wide range of dynamic factors, such as land-cover change over time, the construction of upstream dams and flood defence systems.) In short, to derive a reasonable range of flood frequencies, a data-driven, statistical approach is not enough; one also needs to consider contextual changes in the environment.
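As one illustration of the data-driven part of such an estimate, annual-maximum water levels from a gauge can be fitted with an extreme-value distribution and the 1% AEP level read off from the fitted quantiles. The sketch below uses synthetic data and is no substitute for the contextual considerations above:

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum water levels (metres) standing in for a gauge record.
rng = np.random.default_rng(42)
annual_max_levels = stats.genextreme.rvs(c=-0.1, loc=4.0, scale=0.8,
                                         size=60, random_state=rng)

# Fit a Generalised Extreme Value distribution to the annual maxima.
shape, loc, scale = stats.genextreme.fit(annual_max_levels)

# Water level with a 1% Annual Exceedance Probability (the 99th percentile).
level_1pct_aep = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"Estimated 1% AEP water level: {level_1pct_aep:.2f} m")
```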

5. Future Work

The three new tools introduced above are complementary to the flood data provided by local, state and federal government agencies. Any flood study should sufficiently examine key aspects of the underlying elevation data; otherwise, the results of subsequent flood modelling and loss estimation will be significantly compromised. We encourage developers and end-users to take a critical view and work together towards realistic yet innovative solutions.

We will continue R&D on large-area flood mapping, by focusing on the following directions:

  • Enhancing the current methods and tools to digest DEMs at increasingly finer resolutions, e.g. from 30m to 10m to 1m.

  • Expanding coverage to other countries, e.g. Japan, South Korea and China. As the tools and cloud platforms are largely generic, new studies can be carried out efficiently.

  • Besides the effort on content creation, we plan to explore new ways to share the datasets and analytics on the cloud through APIs [Update 06/2018: Initial release of related Web APIs].

 

Related Blogs:

– Two Additional Tools to Advance Flood Risk Analytics at Scale in Australia (link, 05/2019)

– Advancing Flood Risk Analytics with Location Profile APIs (link, 09/2018)