September 21, 2009
Fire Risk Assessment – Mapping Wildland Fires
Please note that contributed articles, blog entries, and comments posted on GIScafe.com are the views and opinions of the author and do not necessarily represent the views and opinions of the management and staff of Internet Business Systems and its subsidiary websites.
Welcome to GISWeekly!
GISWeekly examines select top news each week, picks out worthwhile reading from around the web, and highlights special-interest items you might not find elsewhere. This issue will feature Industry News, Top News of the Week, Acquisitions/Agreements/Alliances, Announcements, New Products, Around the Web and Events Calendar.
GISWeekly welcomes letters and feedback from readers, so let us know what you think. Send your comments to me at
Susan Smith, Managing Editor
Fire Risk Assessment – Mapping Wildland Fires
By Susan Smith
Every summer, wildland fires sweep across sections of the U.S. and threaten human lives, property, livestock, watersheds and endangered species. The methods to fight these fires have evolved to meet the changing fire environment: fuels build up as a result of the lack of controlled burns and clearing of federal and state lands, coupled with the effects of climate change and what is known as the “wildland urban interface,” as more people move out to the edge of the forest.
Sanborn of Colorado Springs, Colorado is a company well situated to address these concerns. With technology centers across the United States and abroad, the company offers a full spectrum of photogrammetric mapping and GIS services. In an interview with Jim Schriever, VP of solutions for Sanborn, GISWeekly discussed Sanborn’s fire risk assessment model and how it is put together.
Jim Schriever: We started doing fire risk assessment with Camp Lejeune in North Carolina. Military bases tend to produce a number of fires, so Camp Lejeune was very interested in having some fire modeling capability, not only to understand where the risk was, but also, with real-time fire behavior analysis, to know how they would contain a fire if one did start.
Then a few years later, we did a fire risk assessment for the state of Florida. We built some base datasets, fuels datasets as well as topography. We developed weather influence zones, so there’s a meteorological component to it. We put all those core datasets together and then developed a risk model that looks at the susceptibility of specific areas to fire. From that we can get things like communities at risk. After our Florida success, we did this for the 13 southern states. We’ve completed that project, are actually in the maintenance stage of that program, and are going to be updating the entire region.
Part of what was driving us was that all the fire money tends to go to the west. The south wanted to bring attention to the fact that they have a problem too. The states banded together to demonstrate what the fire problem is in the south, and hopefully get some of those dollars and create awareness of what the problems are.
Most recently we’ve signed a contract with the Oregon Department of Forestry (ODF) to develop a fire risk assessment for 17 western states. Part of that contract includes some of the Pacific island territories such as Guam, the Marshalls, the Mariana Islands, and Samoa.
Ultimately, we’ll have over 30 states licensed to our fire risk assessment software product.
GISWeekly: How is the Stimulus money affecting fire risk assessment?
Jim Schriever: Right now there are a lot of stimulus dollars to actually modify the fuels and reduce the risk. Instead of just reporting on acres treated, which is what they’ve done in the past, we’re trying to look at ROI, so people can really quantify the social, environmental, or economic impacts and effects they’ve been able to reduce. For instance, there is a dollar exposure in Santa Fe: if a fire comes to Santa Fe, what’s the dollar exposure for communities within Santa Fe? Based on a certain level of susceptibility, that dollar exposure may be $300 million. If we modify the fuels and rerun the program, maybe that exposure is reduced to $150 million, so you’ve had a net reduction in exposure of $150 million.
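The dollar-exposure arithmetic Schriever describes can be sketched in a few lines. This is a hypothetical illustration only: the community value, susceptibility figures, and the simple value-times-susceptibility formula are invented for this example, not Sanborn’s actual model.

```python
# Hypothetical sketch of the exposure calculation described above.
# Values and the formula itself are illustrative assumptions.

def dollar_exposure(value_at_risk: float, susceptibility: float) -> float:
    """Exposure = value at risk scaled by a 0..1 susceptibility rating."""
    return value_at_risk * susceptibility

# Before fuel treatment: high susceptibility around the community.
before = dollar_exposure(value_at_risk=600e6, susceptibility=0.50)  # $300M

# Rerun after modifying the fuels: susceptibility cut in half.
after = dollar_exposure(value_at_risk=600e6, susceptibility=0.25)   # $150M

net_reduction = before - after
print(f"Net reduction in exposure: ${net_reduction / 1e6:.0f}M")
```

The point of the sketch is the reporting shift: instead of counting acres treated, the before/after rerun puts a dollar figure on what the treatment bought.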
The Federal government has a program called LandFire, a joint effort of the Department of the Interior and the USDA, that is mapping all the fuels across the U.S. Having that data available significantly reduces the cost, so it makes the work a lot more accessible. We map fuels ourselves but also leverage any publicly available datasets, and we actually do some of the mapping for the USGS and USDA, so we are involved in the LandFire program. One of the first things we develop are the core fuels datasets; next we develop a fire occurrence dataset, which takes into account where fires occurred over the past so many years. We try to go back as far as we can, so if we can get 20 years of data for all the different states, that’s good. Federal fire reporting is very good and well organized; at the state level it’s a hodgepodge. The locals want to go out, put the fire out, and go home to bed. They don’t really want to do a bunch of reporting.
We need to know where the fire occurred, when it occurred, and when it was contained, so we can begin to develop rate of spread, final fire size, etc. in one database. Another database is compiled from subject matter experts on the Sanborn team. We have a meteorologist who helps us develop weather influence zones, so when a fire does occur, the rate of spread, the final fire size, and what the fuel conditions are going to be like are all data we can compartmentalize. A fire in South Dakota is quite a bit different from a fire in California, obviously, so we break up the states into “influence zones.” We also develop “other risk input datasets,” and those could be a variety of different factors, like soils. The reason soils are important in the south is that they have muck or peat soils, which prevent firefighters from fighting the fires. In the west soils are important because after fires, particularly in the Los Angeles region, slides follow. The soil and topography are core datasets developed as part of it, and we have to do some calibration on these datasets.
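The fire occurrence database described above (where, when started, when contained, final size) lends itself to a simple derivation of historical rate of spread per influence zone. The sketch below is not Sanborn’s schema; the record fields, zone names, and fire figures are invented for illustration.

```python
# Illustrative sketch: deriving mean rate of spread per weather influence
# zone from a fire occurrence database. Schema and data are assumptions.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FireRecord:
    zone: str            # weather influence zone the fire fell in
    started: datetime    # when the fire occurred
    contained: datetime  # when it was contained
    final_acres: float   # final fire size

def spread_stats(records):
    """Mean acres-per-hour rate of spread, grouped by influence zone."""
    by_zone = defaultdict(list)
    for r in records:
        hours = (r.contained - r.started).total_seconds() / 3600
        if hours > 0:
            by_zone[r.zone].append(r.final_acres / hours)
    return {zone: sum(rates) / len(rates) for zone, rates in by_zone.items()}

fires = [
    FireRecord("SD-plains", datetime(2005, 7, 1, 12), datetime(2005, 7, 1, 18), 120.0),
    FireRecord("SD-plains", datetime(2006, 8, 3, 9), datetime(2006, 8, 3, 21), 300.0),
    FireRecord("CA-chaparral", datetime(2007, 10, 2, 8), datetime(2007, 10, 4, 8), 9600.0),
]
print(spread_stats(fires))
```

Grouping by zone is what makes the South Dakota vs. California distinction quantitative: each influence zone ends up with its own spread and final-size statistics.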
Once we’ve got those core risk input datasets, the next task is the actual analysis and reporting to summarize all of that. Next we move into the wildfire risk model development and assessment component. For that we bring in all our fire occurrence data, summarized by weather influence zone, with percentile historical weather observations coupled with historical fire locations; we add the fire behavior datasets, which include surface fuels, canopy closure, topography, etc.; and finally fire suppression effectiveness, which includes rate of spread vs. final fire size as well as historical protection response. With those three core pieces we’re able to develop what we call “wildfire threat,” which is the susceptibility of an area to burn. At this point, the burn may be a good thing. Just because an area is susceptible doesn’t mean it’s a real problem for it to burn.
We run it through our model and come up with a wildfire susceptibility index. Those come out in 30 meter pixel cells. Most of this is based on Landsat data at 30 meters, although as we move into things like community wildfire protection plans we do have higher resolution datasets, but the concepts are still the same. For every cell we have a wildfire susceptibility index, and from there, once we determine an area is susceptible, we can move into the fire effects piece.
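A per-cell susceptibility index over a 30 m grid is, at heart, a weighted raster overlay. The minimal sketch below assumes normalized input layers and invented weights; the layer names and the simple linear combination are placeholders, not Sanborn’s proprietary model.

```python
# Minimal weighted-overlay sketch of a per-cell wildfire susceptibility
# index on a 30 m grid. Layers, weights, and scaling are assumptions.
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 4)  # tiny stand-in for a statewide 30 m raster

# Normalized (0..1) input rasters: fuels hazard, slope, historical occurrence.
fuels = rng.random(shape)
slope = rng.random(shape)
occurrence = rng.random(shape)

weights = {"fuels": 0.5, "slope": 0.2, "occurrence": 0.3}
wsi = (weights["fuels"] * fuels
       + weights["slope"] * slope
       + weights["occurrence"] * occurrence)

# Rescale to a 1..9 integer index, one value per 30 m cell.
index = np.clip(
    np.round(1 + 8 * (wsi - wsi.min()) / (wsi.max() - wsi.min())), 1, 9
).astype(int)
print(index)
```

Every cell gets one index value, which is what lets later stages ask "is this particular 30 m cell susceptible?" before moving on to fire effects.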
In the fire effects you start looking at values impacted. We’ll develop quite a few different layers, and look at production forests for the western states, for example. If federal land burns, there’s an issue there, but if state lands or private lands burn, there’s more of an economic impact, and the insurance folks are concerned. Things to take into consideration include homes, the wildland urban interface, transportation and critical infrastructure, as well as protected species and critical watersheds. We reach a consensus with the customer on how to weight those different factors, and come up with different weightings to get the values impact rating, which we combine with suppression difficulty ratings such as slope.
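The consensus-weighting step can be sketched as a weighted sum per cell, combined with a suppression-difficulty rating. Everything below is a hedged illustration: the value categories, the weights, and the multiplicative combination are hypothetical stand-ins for whatever a customer actually agrees to.

```python
# Hedged sketch: consensus-weighted values impact rating for one cell,
# combined with a suppression-difficulty rating. All figures are invented.

def values_impact_rating(layers: dict, weights: dict) -> float:
    """Weighted sum of normalized (0..1) value layers for a single cell."""
    return sum(weights[name] * layers[name] for name in weights)

cell_layers = {
    "homes_wui": 0.9,          # homes / wildland urban interface density
    "infrastructure": 0.4,     # transportation and critical infrastructure
    "protected_species": 0.2,
    "watershed": 0.6,          # critical watershed
}
weights = {"homes_wui": 0.4, "infrastructure": 0.25,
           "protected_species": 0.15, "watershed": 0.2}

impact = values_impact_rating(cell_layers, weights)

# Combine with suppression difficulty (e.g. a slope-driven 0..1 rating).
suppression_difficulty = 0.7
fire_effects = impact * suppression_difficulty
print(round(impact, 3), round(fire_effects, 3))
```

The design choice worth noting is that the weights are negotiated with the customer rather than fixed: a state worried about watersheds and a state worried about the wildland urban interface would run the same machinery with different weight vectors.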
-- Susan Smith, GISCafe.com Managing Editor.