Archive | policy and science

Unnatural Degrees of Disaster

An op-ed-ish piece at CNN takes the devastation of Hurricane Michael and argues it should be labelled something other than a ‘natural disaster’. The main argument is that human emissions have led to warmer ocean waters, a warmer atmosphere and higher sea level, all of which allow for stronger and more impactful hurricanes. This is not news in the climate community, which has been striving in recent years to be able to say something about the effect of global warming on major storms, heat waves and droughts. But, of course, this is not the only way that humanity makes disasters worse.

A seismological aphorism is “earthquakes don’t kill people, buildings kill people.” Although an approximation (tsunamis are pretty capable of dealing death, as are quake-triggered landslides and avalanches), this does highlight the other way that humanity makes nature even more powerful. As a result, geoscientists often walk around shaking their heads and muttering under their breath “Why’d they do that?” Adobe buildings in earthquake-prone areas. Beach houses on barrier islands. Developments at the base of landslide-prone mountainsides–or on active landslides themselves. Cities in floodplains. Insurance designed to force the reconstruction of things in the same hazardous places. Frankly, it is so bloody obvious that these are stupid things that you want to throw your hands up in the air and embrace the inevitable extinction of such an incompetent species.

Of course these are all things that make natural disasters worse for people, but they don’t actually make the trigger itself worse, right? Um, true, but we already do plenty more than just supercharge hurricanes. Injection of waste water into deep wells has produced quite the swarm of earthquakes in Oklahoma. Paving over wetlands made floods in Houston that much worse than they would otherwise have been. Human-caused fires set the stage for catastrophic landslides and mudflows that might not have happened without the fires. Subdivisions have been crushed and roads destroyed because bulldozers removed the toe of stable landslides that then failed. Excessive watering and water from septic systems are likely the cause of the Portuguese Bend landslide in Southern California, as the old slip planes got lubricated and the soils above increased in weight.

In sum, we’ve been at this business of making our own “natural disasters” for some time. All we’ve done with global warming is to carry our local disaster mania on the road. Arguably we’ve reached the point where a truly natural disaster is a rarity.

Ratings Failure

So FiveThirtyEight has a story about how inadequate hurricane intensity numbers (Saffir-Simpson scale categories) are.  Basically the destructive potential of a hurricane is poorly linked to that number. But the funny thing in reading the piece is that you could substitute Richter magnitude for Saffir-Simpson scale and make almost no other changes and the article would sound about right. Richter magnitudes (as popularly understood; the numbers reported for events are usually moment magnitudes these days) tell you almost nothing about the destructive potential of an earthquake.
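Part of the problem is how compressed a single magnitude number is. A back-of-the-envelope sketch using the standard Gutenberg–Richter relation between magnitude and radiated seismic energy (log₁₀E = 1.5M + 4.8, E in joules) shows how steeply energy grows with each step on the scale:

```python
def radiated_energy_joules(magnitude: float) -> float:
    """Standard Gutenberg-Richter energy relation:
    log10(E) = 1.5*M + 4.8, with E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

# one whole magnitude step is a factor of 10**1.5 (~31.6x) in radiated energy,
# yet neither number alone says anything about damage at a given site
ratio = radiated_energy_joules(7.0) / radiated_energy_joules(6.0)
```

So a M7 releases about 31.6 times the energy of a M6, and about 1000 times that of a M5–but as the examples below show, even that says nothing about damage without knowing where and how the energy was released.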

Just as with hurricanes, where an earthquake strikes is critical in determining its damage. Magnitude 8 earthquakes 600 km under western Brazil are barely even noticed, while a M5.9 in northern Haiti kills over a dozen. The details can be amazingly important: a M6.3 earthquake in Christchurch devastated the city center and killed nearly 200 but the earlier M7.0 earthquake only a few miles away produced little damage and no fatalities.

Just as with hurricanes, the details of the earthquake will affect its ability to do damage. When an earthquake ruptures in one direction, damage will be greater in that direction than 180 degrees away. Another New Zealand quake, the 2016 M7.8 Kaikoura earthquake, ruptured from south to north, sparing areas closer to the epicenter but causing enough shaking in Wellington, across Cook Strait from the event, that several buildings had to be torn down. Toss in intrinsic variations in frequencies due to variations in stress drop and it is clear that a magnitude by itself doesn’t carry the whole story.

A popular pastime in southern California is guessing the magnitude of an earthquake solely from what was felt. GG recalls a radio news program from years ago, broadcast after an earthquake near GG, who felt the quake before the radio broadcaster did. Callers speculated on where and how large the event was: “I’m in San Bernardino and it was a slow rolling event so probably on the San Andreas to the north”; “It was a sharp event that must have been a magnitude 6”; and so on. (In fact, when you are close you tend to get a very sharp movement from the P-waves, but farther away it is the surface wave train that produces a more rolling motion.)

The Richter magnitude is about forty years older than the Saffir-Simpson scale, and as a result seismologists have had that much more time to try to clarify all the things that go into earthquake damage. Look at the USGS page for a recent large event and you see far more than the magnitude. Its PAGER page tries to estimate damage and deaths almost immediately after an event to help gauge the need for emergency assistance. Stories about the “Big One” that dominated California media for decades are being replaced with more nuanced stories highlighting the risk from faults through urban areas like the Malibu Coast/Hollywood Hills fault system or the Hayward Fault. And the interaction with the engineering community is far more sophisticated than 40 or 50 years ago, with power spectra and 50-year exceedance criteria being passed on from the seismological community.
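Those exceedance criteria have a simple probabilistic core. Assuming a Poisson (memoryless) model of event occurrence, an exceedance probability p over a time window t implies a return period T via p = 1 − exp(−t/T); the familiar design criterion of “10% probability of exceedance in 50 years” works out to a return period of roughly 475 years:

```python
import math

def return_period_years(p_exceed: float, window_years: float) -> float:
    """Return period T implied by exceedance probability p over a window of
    t years, assuming Poisson occurrence: p = 1 - exp(-t/T)."""
    return -window_years / math.log(1.0 - p_exceed)

# the common "10% in 50 years" criterion implies a ~475-year return period
T475 = return_period_years(0.10, 50.0)
```

The stricter “2% in 50 years” criterion used for some structures works out the same way to a roughly 2475-year return period.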

And yet we get stories about the earthquake-proof house that can withstand “an earthquake registering up to 9.0 on the Richter scale”. Well, GG’s house survived a M9 earthquake–sure, it was across the globe, but the point is that distance and environment matter. Would these buildings make it if sitting right on a 20 m fault rupture? Doubtful. So surviving a M9, by itself, means nothing. Surviving some threshold of ground motion? That might be useful, but the public probably wouldn’t find a maximum acceleration of 2g a meaningful number.

So good luck, meteorologists. Your best hope might be in scaling total kinetic energy in a hurricane to a level from 1 to 5, where you could add decimals. Oh wait, they’ve done that. So why isn’t this on TV and the web now?
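The sort of continuous scale hinted at here is easy to sketch: log-scale a storm’s total (integrated) kinetic energy between a floor and a ceiling and clamp the result to 1–5, with decimals coming for free. To be clear, the thresholds below are purely hypothetical placeholders for illustration, not the values published for any actual integrated-kinetic-energy scale:

```python
import math

def energy_index(ike_terajoules: float,
                 lo_tj: float = 1.0,     # hypothetical floor mapping to index 1
                 hi_tj: float = 300.0    # hypothetical ceiling mapping to index 5
                 ) -> float:
    """Map a storm's integrated kinetic energy onto a continuous 1-5 index
    via a logarithmic scale, clamped at both ends."""
    span = math.log10(hi_tj) - math.log10(lo_tj)
    x = 1.0 + 4.0 * (math.log10(ike_terajoules) - math.log10(lo_tj)) / span
    return max(1.0, min(5.0, x))
```

A continuous index like this would let forecasters say a storm is a “3.4” rather than forcing it into one of five bins, much as magnitudes already do for earthquakes.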

Reap or Sow?

Well, election season is upon us and as sometimes happens, frustration with legislative gridlock has produced initiatives that probably are less than ideal solutions to problems.

Here in Colorado the issue is oil and gas development. Generous forced-pooling rules, state control of drilling permits, and limitations on the ability of surface rights holders to prevent drilling have frustrated suburban dwellers who object to the various nuisances and impacts of a booming oil and gas industry. Attempts to limit development through town ordinances have been struck down by courts, and attempts to pass legislation have also failed. Compounding the insult was the explosion of a house in Firestone because of a feeder line from a well that wasn’t properly shut off. So opponents have gone to the ballot box with an initiative that would force new oil and gas wells to be a half mile from any occupied dwelling or from vulnerable areas like reservoirs, rivers and streams. There is no opportunity for a surface rights owner to waive the setback, but this would be a statutory change and not a constitutional one.

Read More…

Transparently DisHONEST

As we’ve noted a few times before, certain members of Congress and this administration have attempted to use transparency as a subterfuge to torpedo science they don’t like. Sometimes discussing this can seem like arguing with these folks for the sake of arguing–what could be wrong with transparency in science? Maybe the general arguments are just too vague.

So it happens that a very specific case with very specific outcomes and with a lot of strong indications of industry pressure to get to the “right” result might illustrate the problem better.  As reported by E&E News and reposted by Science, a group at Columbia made very detailed studies of a few hundred low-income women and their children who were exposed to the pesticide chlorpyrifos during pregnancy before the pesticide was banned for indoor use. The results of the study and its follow-ups were damning enough that the EPA moved to ban the pesticide outright.

Chlorpyrifos being a popular pesticide, its producers lobbied to overturn the finding. Some of those lobbyists now work within the administration, and now the EPA wants to delay a ban for at least 5 years, arguing that the original data needed to be available for all to play with. But when a federal appeals court demanded that the EPA follow through on the ban, the EPA tried to claim that the study’s data were “inaccessible.”

The problem is that the study was so detailed and from such a small area that identification of the members of the community that participated would be possible were the entire dataset made public. While Columbia offered a few alternatives that would preserve the privacy of the participants while permitting more detailed reanalysis, industry and the EPA were apparently uninterested, which seems to confirm the notion that their purpose was simply to reject the findings by claiming the data were insufficiently transparent.

This is the rather obvious playbook that those who pushed the HONEST Act are effectively trying to implement and apply to many other outcomes they dislike. If there were real honesty on the part of these people, their effort would include real money to help anonymize the data as much as possible and would require that all industry-sponsored studies (not only the ones they want to promote) similarly share all their data.

Don’t hold your breath waiting…unless a plane just flew over dusting with chlorpyrifos…

Science is for Suckers

Or so it would seem in the current administration. The latest (?) salvo, re-reported by High Country News from an original story in the Center for Investigative Reporting’s Reveal, is a revision to policy for national parks that removes the use of science to anticipate damage to park resources. The previous policy, which had been developed over years based on work from the advisory board that resigned earlier this year, basically instructed park managers to consider things like climate change and the impacts of recreational use on biological resources, as examined with science, when setting the rules for various uses of the parks. This was clearly viewed by those involved as carrying forward the realignment of Park Service priorities inspired by the Leopold Report in the 1960s.

That report had shown that the recreation-heavy and visitation-first policies of the mid-twentieth century Park Service, in promoting what Edward Abbey called Industrial Tourism, were having negative impacts on the biological resources of the parks. This led to the recognition that park managers needed to be sensitive to their ecological charges; over time this led to management changes such as the removal and rerouting of buildings, infrastructure, trails and roads in giant sequoia groves (Yosemite only recently completed a multiyear rebuild of access into the Mariposa Grove of Big Trees, one that followed Sequoia’s decades-long removal of lodging from Giant Forest).

The 2012 report, titled Revisiting Leopold, was in a sense addressing the deficiencies of the 1963 report, which was oblivious to the role of Native Americans on landscapes and biota and which was written well before invasive plants and climate change became clear threats to the integrity of the parks. In essence, the new report advocated future-proofing the Park Service by embedding science within it rather than waiting around to see if anybody working outside the parks would provide useful guidance. Thus the advisory board recommended that these threats were best addressed by science done within the Park Service:

To implement the resource management goals and policies described in this report, the NPS [National Park Service] will need to significantly expand the role of science in the agency. The committee has several recommendations. The NPS must materially invest in scientific capacity building by hiring a new and diverse cohort of scientists, adequately supporting their research, and applying the results. The NPS should train, equip, retain, and support the career advancement of these research scientists and scholars. They should be stationed in parks to provide place-based expertise and knowledge, long-term institutional memory, and technical support for resource management. NPS scientists (and the agency) would greatly benefit from strengthened and supportive supervision, increased opportunities to interact with the scientific community, including professional associations, and specific responsibility and opportunity for publishing their work in the scientific literature. Both NPS managers and scientists require training and requisite skills in communication, critical thinking, analysis, science, technology, and mathematics. The NPS should integrate scientific achievement into its evaluation and performance reward systems, providing incentives for scientists and managers who contribute to the advancement of science and stewardship within their park or region.

This report led to a process of trying to implement changes in the Park Service that were finally released in 2016, representing a considerable effort over years. It was tossed aside in the first months of the Trump Administration without a single meeting between the new Secretary of the Interior and the advisory board that had launched the effort, and without any clear indication of why the policy had to be revoked. It is unclear whether (but unlikely that) there was any input from Park Service staff. The NPS was directed not to publicize the change.

The clear message?  In this administration, science is for suckers.

Ludicrous Certainty

One of the fixtures of modern life seems to be the hearty embrace of uninformed certainty. People who just know that certain things are an unqualified bad and will go to any lengths to fight those things seem to make up the vast majority of social media contributors. Although there are many fine examples of this on the political right, let’s complain about some on the political left.

Two such issues are centerpieces of complaints here in Boulder. One is the presence of genetically modified organisms (GMOs) in crops and the other is the practice of fracking. Neither warrants the blanket condemnation it receives.

Most opponents of GMOs know little about how we’ve ended up with the food crops we have now, though occasionally you get clues, as when you stumble on wild strawberries and wonder why they are so tiny. Our food crops are the products of generations of hit-or-miss efforts at artificial selection (picking the outcomes you like best) and the crossing of different plants to get useful hybrids. The genetic tools now available remove a lot of the hit-or-miss part of the effort, allowing scientists to directly target the aspects of a plant that are causing trouble.

When you say that all GMOs are bad, you might as well say all spot welds in a car are bad and you only want a car assembled with no welds. The use of genetic tools is a technique and not an end per se. A spot weld might make a tougher car, but it will not make a better computer.  It is what you do with the tool that matters.

Does this mean all GMOs are good? Hardly, if for no other reason than the law of unintended consequences. For instance, there was a desire to have a variety of common golf-green grass be resistant to Roundup; as High Country News tells the story, the new variety was successful–but when it escaped from where it was being grown, it became a troublesome weed along irrigation ditches in eastern Oregon. Human endeavors are filled with such mistakes, many having nothing to do with GMOs (think of all the times an exotic species was introduced and found to be a pest, and then the effort to use the pest’s natural enemy simply created another problem). Just as we recognize that bringing exotic species into a new place requires some forethought, the development of GMOs needs to face similar scrutiny.

Fracking is a slightly different issue, though it shares the same blanket opposition that has little to do with what it is and does. Most of the concerns with fracking have nearly nothing to do with the actual process of fracturing rock deep in the earth to release hydrocarbons. Instead, when you hear the actual harms people complain about, they are the industrial noise and associated air pollution of the drilling and fracking operations, the greater density of drill pads often needed for the current “non-traditional” horizontal drilling, surface water pollution from spills, aquifer contamination from improperly sealed wells, earthquakes from injection wells disposing of accessory fluids from production, or even the antiquated forced-pooling laws that greatly limit the options for those holding both surface and mineral rights. When people talk of banning fracking, it is like a city banning a car company from using welds–it is not the welding that is the problem, it is the noise and impacts of the car factory that are being opposed. Fracking is really being used as a proxy for resurgent oil and gas development.

Is fracking then an unalloyed good? Well, no. There are some very positive aspects to it: by increasing the recovery of hydrocarbons from an existing field, it can slow the desire to expand production into virgin areas. Its recent application in association with horizontal drilling has opened up a lot of natural gas, which as a result has been replacing dirtier coal in electricity generation. But there are some instances where fracking is indeed a direct evil. In a few places, it has indeed caused larger earthquakes (though far, far fewer than injection wells have). There is an indication that fracking in some shallow rocks immediately below an aquifer in Wyoming has directly contaminated fresh water. And no doubt a few fracking operations have spilled fracking fluids into surface waters. And, of course, the application of the technique has opened up areas that previously were uneconomic (which is a mixed bag depending on where you are and what the land use looks like).

Most folks would probably like the world to be black and white, good or bad. But there is gray all over the place, and GG earns his nom de plume when encountering absolutism. This desire to polarize to the extremes removes all sensible middle ground. We would all win if GMOs were not so misrepresented but also if the regulation of their development made more sense. We would all win if oil and gas development were throttled back by a more driven effort to move on to renewable energy sources. Recognizing the strengths and weaknesses of things like GMOs and fracking could focus our attention on the specific instances that are most troublesome. But when you just paint the whole thing one color, you lose the ability to separate the dangerous from the innocuous.

Did Science Help Start Big Lies?

Certainly one of the most striking things about modern American political discourse is the magnitude of outright lying going on.  While misdirection and obfuscation were not uncommon in political speech, outright provable lying wasn’t.  And yet now we have a President who Politifact says has made statements that are either false or “pants on fire” 47% of the time and who has inspired the Washington Post fact checker to keep a running count of lies. This follows years of internet chain emails and conspiracy theorists that have made Snopes expand rapidly to capture and review all the questionable stuff circulating on the internet. Needless to say, this tends to encourage others to play equally fast and loose with truth. For a scientist, this is a distressing trend–but it isn’t really that new.

Now to be clear, big lies have made the circuit before, being a staple of the Nazi government, for instance; the related game of “whataboutism” was a favorite of the old Soviet state. Some might point to McCarthyism in the US as a domestic episode, though the Red Scare involved less questioning of objective truth and more vilification by insinuation. Here GG refers to outright misrepresentations of what is going on. And since science’s goal is to discern the nature and rules of the reality we inhabit, it has a habit of landing in the crosshairs of those whose interests conflict with reality.

Read More…