Southern Forest Resource Assessment: Draft Report



5.2 Effects of Prescribed Burning

5.2.1 Soil

Many factors, including fire intensity, ambient temperature, vegetation type, and soil moisture, influence the effects of fire on the soil (Wells and others 1979). Low-intensity prescribed fires have few, if any, adverse effects on soil properties; in some cases such fires may improve soil properties (McKee 1982). Repeated burning over a long period may increase available phosphorus, exchangeable calcium, and organic matter content of mineral soil. Fire volatilizes nitrogen from the forest floor, but the losses are often offset by increased activity of nitrogen-fixing soil microorganisms after the fire. Calcium and phosphorus may be lost from the forest floor but are partially retained in lower mineral soil horizons. Low-intensity burns have little, if any, adverse effect on soil erosion, even on relatively steep slopes (Goebel and others 1967, Brender and Cooper 1968, Cushwa and others 1971).


Prescribed burns conducted when the soil and fuel are too dry can cause severe damage. Broadcast burns conducted under these conditions can remove the entire forest floor and accelerate erosion in steep terrain. High-intensity prescribed fires have a temporary negative effect on site nutrient status resulting from volatilization of nitrogen and sulfur, plus some cation loss due to ash convection. Such effects are short-lived after low-intensity fires, but recovery is not as rapid after severe fires.


Site-preparation burns of high intensity with high fuel loads and low soil moisture may damage soil by overheating. When burning is done with soil moisture near field capacity, however, little heating damage will occur (DeBano and others 1977). Fires that burn completely to mineral soil may accelerate soil erosion in steep terrain. Soil loss after severe burns can exceed 200 tons per acre per year in the Piedmont (Van Lear and Kapeluck 1989). Infiltration decreased, and runoff and sediment yield increased, after severe burns in the Southern Appalachians (Robichaud and Waldrop 1994). Such losses have not been documented after less severe burns in the South, but they appear to be negligible (Van Lear and Danielovich 1988).


5.2.2 Vegetation

Plants in fire-prone ecosystems have adapted to fire in various ways, including thickened bark, the ability to resprout from below the soil surface, and seed dispersal. Some trees have thick insulating bark, which protects them from the scorching heat of surface fires (Hare 1965). Mature longleaf pine is well known for its resistance to fire damage because of its thick bark. Slash, loblolly, and shortleaf pines also generally survive bole scorch when they reach sapling size or larger (Komarek 1974). Virginia pine and white pine tend to have thinner bark and are more susceptible to fire damage. However, when pine trees are young, crown scorch rather than damage to the bole is the principal cause of mortality (Cooper and Altobellis 1969, Storey and Merkel 1960).


Southern pines have the ability to leaf out soon after defoliation from crown scorch (Komarek 1974). Trees are most susceptible to crown scorch during the spring, when leaders are succulent. Suppressed trees are more susceptible to fire-induced mortality than are dominant and codominant trees (Waldrop and Van Lear 1984). Diameter growth apparently is not significantly affected when crown scorch and root damage are minimal (Wade and Johansen 1986).


Aboveground portions of hardwood species are not as resistant to fire damage as those of conifers, primarily because of thinner bark. Bark thickness is less critical to survival in Appalachian hardwood stands, because most fires there burn in light fuels and are of low intensity (Komarek 1974). There are exceptions, however, such as when understories of mountain laurel produce high-intensity fires in hardwood stands. Some hardwoods develop exceptionally thick bark at maturity. Yellow-poplar is one of the most fire-resistant species in the East when its bark thickness exceeds 0.5 inch (Nelson and others 1933). On the Coastal Plain, many hardwood stems over 6 inches in diameter at breast height survived 30 years of low-intensity annual and biennial burning with little or no damage to their boles (Waldrop and others 1992).


Hardwoods sprout, generally from the base of the stem or from root suckers, when their tops are killed. Suppressed buds at or below ground level often survive the heat of a surface fire and sprout in response to the loss of apical dominance (Augspurger and others 1987, Waldrop and others 1985). Although many sprouts may develop from a stump, they thin over time to one or a few stems per stump.


Many species have adapted to a high-frequency fire regime by developing light seeds, which can be disseminated over large areas by wind and gravity. These light-seeded species often pioneer on burned seedbeds. Some species, such as yellow-poplar, produce seeds that remain viable for years in the duff. Yellow-poplar seeds stored in the lower duff germinate rapidly after low-intensity prescribed fires (Shearin and others 1972).


5.2.3 Water Quality

Effects of prescribed fire on water quality vary, depending on fire intensity, type and amount of vegetation present, ambient temperature, terrain, and other factors. The major problems associated with prescribed fire and water quality are potential increases in sedimentation and, to a lesser degree, increases in dissolved salts in streamflow (Tiedemann and others 1979). However, most studies in the South indicate that effects of prescribed fire on water quality are minor and of short duration when compared with effects of other forest practices (Brender and Cooper 1968). When prescribed fires are conducted properly, nutrient loss and stream sedimentation are likely to be minor compared with those resulting from mechanical methods of site preparation (Douglass and Goodwin 1980, Douglass and Van Lear 1982, Ursic 1970). Even intense broadcast burns may disturb the root mat very little, leaving its soil-holding properties intact. Furthermore, slash tends to be randomly distributed over logged areas and is seldom completely removed by broadcast burning. Therefore, the root mat, residual forest floor materials, and incompletely consumed slash form debris dams that trap much of the sediment moving downslope (Dissmeyer and Foster 1980). Finally, rapid regrowth in the South soon protects burned sites.


Only a few studies in the South have documented the effects of prescribed fire on nutrient concentrations in streams or ground water. Low-intensity prescribed fire had no major impact on stormflow or soil-solution nutrient levels (Douglass and Van Lear 1982, Richter and others 1982). Severe wildfire in heavy fuels in mountainous terrain had no adverse effects on water quality (Neary and Currier 1982). Research from Western States documented several cases where slash burning increased nitrate-N levels in streamflow. In no case, however, did burning cause nitrate-N levels to exceed the recommended Environmental Protection Agency standard of 10 parts per million for drinking water. Phosphorus and major cations often increased in streamflow and the soil solution, but the effects were of short duration and of a magnitude not considered damaging to surface water or site productivity (Tiedemann and others 1979).

