Time again for “Not Quite In Time: the NY Times again steps on a bar of soap when looking westward.” Today’s installment concerns an article on fire in the forests of the West, and in particular California. This would be a fine article…were it written around 1975. Reading it today feels like, well, hearing from a cousin that there is now this great amusement park in Orlando called Disneyworld….
OK, so what is the beef? First, this is a retread of an argument that has gone on for at least 40 years over fire suppression in western forests. Prior to the great Yellowstone fires of 1988, the Park Service in particular had decided that fire suppression was bad, and the Forest Service was leaning in that direction. But when blazes on the margin of Yellowstone blew up and some blamed the Park’s “let it burn” policy, that policy was quickly dumped. Nothing had changed on the science side; this was entirely a change driven by public perception. Heaven only knows how many stories in High Country News have covered the twin goals of forest health and protection of communities with more depth and insight than this article musters.
But here’s the thing. In the 40 or 50 years since the science made it pretty clear that fire suppression was a problem, a second recognition has emerged, one this Times article utterly missed:
Scientists are still trying to figure out how regularly forests burned in what is now the United States in the centuries before European settlement, but reams of evidence suggest the acreage that burned was more than is allowed to burn today — possibly 20 million or 30 million acres in a typical year. Today, closer to four million or five million acres burn every year.
Scientists say that returning forests to a more natural condition would require allowing 10 million or 15 million acres to burn every year, at least.
“More natural condition”? The thing we know really well at this point is that fire before European settlement was in fact frequently managed by Native Americans, who used it as a tool to control their landscape. That the reporter goes to the Sierra Nevada, where this practice is very well documented, and utterly overlooks this aspect of the problem is troubling. The fires natives set were not the massive conflagrations we are seeing now; they were more like the management fires set within, say, Sequoia or Yosemite national parks to try to reduce the fuel load without a catastrophic fire. So when the reporter claims, in essence, that big, huge fires were both natural and the pre-Columbian norm, he is creating a fantasy.
This makes it seem as though the biologists arguing for these big fires are themselves ignorant of this history, which is unlikely to be a fair evaluation of their knowledge (though you do wonder a bit). Hopefully that impression is wrong; if it is not, the biological community needs some education on this point.
Here’s the deal: “pre-Columbian” or “before European settlement” is NOT the same as “natural”. Arguably we know little or nothing about fire in a human-free landscape, as no ecosystem in the U.S. has been free of humans since the end of the last Ice Age. There was some work in the lodgepole forests of Yellowstone suggesting that big fires have been the norm there for many centuries, and there is a decent argument to be made that this does not reflect human activity. But in the Sierra, there is evidence of Indian-created fires from the foothills to treeline.
You’d like to think we could at least advance arguments about land management to somewhere near the current science. This article treats a problem that was widely recognized roughly 40 years ago as though the recognition had only just occurred. Can we please move the setting on the time machine to 2017?