Data Evaluation or Data Analysis for Fire Modeling
A lengthening of the fire season, coupled with higher temperatures, increases the probability of fires throughout much of western North America. Although regional variation in the frequency of fires is well established, attempts to predict the occurrence of fire at a spatial resolution <10 km² have generally been unsuccessful. We hypothesized that predictions of fires might be improved if depletion of soil water reserves were coupled more directly to maximum leaf area index (LAImax) and stomatal behavior. In an earlier publication, we used LAImax and a process-based forest growth model to derive and map the maximum available soil water storage capacity (ASWmax) of forested lands in western North America at 1 km resolution. To map large fires, we used data products acquired from NASA’s Moderate Resolution Imaging Spectroradiometers (MODIS) over the period 2000–2009. To establish general relationships that incorporate the major biophysical processes that control evaporation and transpiration, as well as the flammability of live and dead trees, we constructed a decision tree model (DT). We analyzed seasonal variation in the relative availability of soil water (fASW) for the years 2001, 2004, and 2007, representing, respectively, low, moderate, and high rankings of area burned. For these selected years, the DT predicted where forest fires >1 km² occurred and did not occur at ~100,000 randomly located pixels with an average accuracy of 69%. Extended over the decade, the area predicted burned varied by as much as 50%. The DT identified four critical seasonal combinations, most of which included exhaustion of ASW during the summer; two combinations involving antecedent conditions the previous spring or fall accounted for 86% of the predicted fires. The approach introduced in this paper can help identify forested areas where management efforts to reduce fire hazards might prove most beneficial.
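To make the classification step concrete, the sketch below trains a minimal decision tree on seasonal soil-water features. Everything here is illustrative: the synthetic fASW values, the toy burn rule (summer exhaustion plus a dry antecedent spring or fall, loosely echoing the seasonal combinations described above), and the coarse candidate thresholds are assumptions, not the authors' data or model.

```python
# Minimal decision-tree sketch on hypothetical seasonal fASW features.
# Pure Python; data and burn rule are synthetic assumptions for illustration.
import random

random.seed(0)

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 1.0 - p * p - (1.0 - p) ** 2

def best_split(X, y):
    """Return (impurity, feature, threshold) of the lowest-impurity split."""
    best = None
    for f in range(len(X[0])):
        for t in (0.2, 0.4, 0.6, 0.8):  # coarse candidate thresholds (assumed)
            left = [y[i] for i in range(len(X)) if X[i][f] < t]
            right = [y[i] for i in range(len(X)) if X[i][f] >= t]
            imp = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or imp < best[0]:
                best = (imp, f, t)
    return best

def build(X, y, depth=0, max_depth=3):
    """Recursively grow a tree; leaves hold the majority class (0 or 1)."""
    if depth == max_depth or len(set(y)) == 1:
        return round(sum(y) / len(y))
    _, f, t = best_split(X, y)
    li = [i for i in range(len(X)) if X[i][f] < t]
    ri = [i for i in range(len(X)) if X[i][f] >= t]
    if not li or not ri:
        return round(sum(y) / len(y))
    return (f, t,
            build([X[i] for i in li], [y[i] for i in li], depth + 1, max_depth),
            build([X[i] for i in ri], [y[i] for i in ri], depth + 1, max_depth))

def predict(node, x):
    """Walk the tree from the root to a leaf for one feature vector."""
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if x[f] < t else right
    return node

# Synthetic fASW (0-1) for spring, summer, fall at each "pixel"
X = [[random.random() for _ in range(3)] for _ in range(2000)]
# Toy burn rule: summer exhaustion plus a dry antecedent spring or fall
y = [int(x[1] < 0.2 and (x[0] < 0.4 or x[2] < 0.4)) for x in X]

tree = build(X, y)
accuracy = sum(predict(tree, x) == yi for x, yi in zip(X, y)) / len(y)
```

Because the toy rule is exactly representable by threshold splits on the three seasonal features, a shallow tree recovers it almost perfectly here; real burned/unburned pixel data would of course be far noisier, which is consistent with the ~69% accuracy reported in the abstract.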