As we’ve noted a few times before, certain members of Congress and this administration have been attempting to use transparency as a subterfuge to torpedo science they don’t like. Sometimes discussing this can seem like arguing with these folks for the sake of arguing–what could be wrong with transparency in science? Maybe the general arguments are just too vague.
So it happens that a very specific case with very specific outcomes and with a lot of strong indications of industry pressure to get to the “right” result might illustrate the problem better. As reported by E&E News and reposted by Science, a group at Columbia made very detailed studies of a few hundred low-income women and their children who were exposed to the pesticide chlorpyrifos during pregnancy before the pesticide was banned for indoor use. The results of the study and its follow-ups were damning enough that the EPA moved to ban the pesticide outright.
Because the pesticide was popular, producers lobbied to overturn the finding. Some of those lobbyists now work within the administration, and the EPA now wants to delay a ban for at least five years, arguing that the original data needed to be available for all to play with. But when a federal appeals court demanded that the EPA follow through on the ban, the EPA tried to claim that the study’s data were “inaccessible.”
The problem is that the study was so detailed and from such a small area that identification of the members of the community that participated would be possible were the entire dataset made public. While Columbia offered a few alternatives that would preserve the privacy of the participants while permitting more detailed reanalysis, industry and the EPA were apparently uninterested, which seems to confirm the notion that their purpose was simply to reject the findings by claiming the data were insufficiently transparent.
This is the rather obvious playbook that those who pushed the HONEST Act are effectively trying to implement and apply to many other outcomes they dislike. If there were real honesty on the part of these people, their effort would include real money to help anonymize the data as much as possible, and it would require that all industry-sponsored studies (not only the ones they want to promote) similarly share all their data.
Don’t hold your breath waiting…unless a plane just flew over dusting with chlorpyrifos…
Long, long ago, computers were big expensive machines lodged in climate-controlled rooms behind lock and key, access controlled by the masters of campus IT. Users paid by the kilobyte, by the second of connect time, by the millisecond of compute time. The gods of IT raked in money like casinos.
Then came the PC. Within a few years, the IT department at MIT, for example, had collapsed from its previous lofty heights, discontinuing mainframes and reducing support staff to posting flyers around campus, offering services users were delighted to ignore. The totalitarian system was dead! Long live democracy!
Well, slowly but surely we’ve encouraged a new generation to take up the crown and beat us with the scepter of access until we bow down in homage to our noble masters. “The Cloud” is, in fact, on most campuses just the same mainframe. Better OS, much better iron, but as campus IT has decided that mere users must be protected from the world beyond, they have leveraged the need for security from the broader internet into security for the denizens of the IT department. This despite the fact that, all too frequently, their own staff are the source of the serious break-ins (in GG’s building, the two serious security lapses were both caused by mistakes made by IT professionals).
And yet it is even more insidious. Instructors are increasingly told to place their courses within course management systems, web-based monstrosities like Blackboard, Canvas, and Desire2Learn. These three (GG has had experience with all of them) are essentially interchangeable even as each is painful in its own way; their main advantage over just regular web pages is that intraclass materials are private, so protected information like grades and copyrighted materials can be freely placed online. Yet, practically like clockwork, campus IT decides it is time to shift from one to the next. Why? Usually some relatively trivial capability is trundled out to justify the move (Now on smartphones! Now with free-form answer quizzes! Now looks snazzier!)–despite the likelihood that the previous provider will match that new wrinkle within a year or two. So faculty and teaching staff and students are forced to learn yet another way of doing the same damn thing, which means…time for our boys (and a few girls) in IT to collect paychecks running workshops on how to do things and building web pages on how things are different, and, of course, spending months if not years first installing and then troubleshooting the new software and then migrating content over, all while supporting the old system for a year or two longer than originally planned. By which point it is time to begin investigating the latest iterations of such software, which inevitably leads to…moving to a new system!
Something similar goes on with email support, internet video conferencing, personnel management software and other computer-related interfaces. Non-IT administrators who in theory are riding herd on this are so divorced from both users and the technology that they lack the backbone to say “no, what we have will suffice.” It remains unclear if the disruption to instructors and students plays any role in the calculations made to justify these changes (it seems certain to be underestimated).
Of course campus IT is at increasing risk of being outsourced to companies like Microsoft and Google (indeed many functions already have). It isn’t hard to predict that there will be a major scandal when a university’s “private” information somehow wanders off campus. Watching all this can make a grumpy geophysicist who remembers the early days of the internet and the last gasps of the old IT mainframes dwell fondly on the memories of hope…
Gov. Jerry Brown surveyed the devastation Saturday in Ventura — the area hardest hit by firestorms that have displaced nearly 90,000 people in Southern California — calling it “the new normal.”-Los Angeles Times, Dec. 10 2017
OK, so GG is late to the parade of folks deriding the term “the new normal”. But it is a source of some grumpiness, and so while struggling to catch up to the existing bandwagon (and being pursued by the revisionist anti-bandwagon), here’s the gripe: when put in sentences like that above, the “new normal” sounds like we are there. Climate change has happened, this is what it looks like, get used to it.
Now the defense of the term is that the new normal is change, and not for the better. If this is how people are reacting to this term, then fine. But that isn’t the way it sounds. Articles on heavy rainfall, rapidly intensifying hurricanes, “bomb” lows, and flash droughts often put it as “this is that future you’ve heard about. It is here. Too bad.” The problem is that that future isn’t here yet–there is more to come, from the spread of tropical diseases to water shortages so intense that depopulating some areas will be the only response, creating a refugee crisis that makes the one from Syria look like tourist travel. So any terminology that seems to imply that we are over the hump makes it seem like that awful future we heard about is not so terrible after all. Annoying, maybe, and deadly for a few, sure, but then when haven’t there been weather-related deaths?
Basically, these are now the good old days. It isn’t hard to imagine folks 50 or 100 years from now saying “I remember when there were still forests left to burn–now it is just all the brush that burns.”
GG has been piddling along through the Sierra (ostensibly to give a campfire talk in Mineral King) and in doing so stared a bit longer at a recent paper on the age of a pediment in the Sierran foothills by Sousa et al. in Geosphere in 2017. In a way this is a callback to concepts from far back in the geologic literature, namely the significance of an “Eocene erosion surface.”
Here, to be brief, low-temperature thermochronology from a low-elevation pediment in the western foothills of the Sierra yields very old ages–in fact, overlapping with the emplacement of plutons in the Sierran crest [this was not a unique observation; Cecil et al., 2006, had a pretty old point in their collection]. Sousa and coauthors model these data and get a cooling to surface conditions by about 40 Ma. Because these pediments abut noticeable topography, this means there was at least that much local relief in the ancient Sierra. While the pediments had been noticed by others, many suspected a far more recent age.
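To get a rough sense of why “very old ages” constrain how close these rocks were to the surface, here is a back-of-the-envelope sketch. The closure temperature, surface temperature, and geothermal gradient below are generic textbook-style assumptions for illustration, not values taken from Sousa et al.:

```python
# Illustrative arithmetic: a low-temperature thermochronometer records the
# time since a rock cooled below its closure temperature. With an assumed
# steady, linear geotherm (a simplification), that temperature maps to a depth.

def max_burial_depth_km(closure_temp_c, surface_temp_c=10.0, geotherm_c_per_km=25.0):
    """Depth (km) at which rock sits at the closure temperature,
    assuming a linear geothermal gradient."""
    return (closure_temp_c - surface_temp_c) / geotherm_c_per_km

# Assuming apatite (U-Th)/He closure near ~70 deg C, a ca. 40 Ma age implies
# the sample has been within roughly this depth of the surface ever since:
depth = max_burial_depth_km(70.0)
print(f"~{depth:.1f} km")
```

So an age overlapping pluton emplacement means very little rock has been stripped off these pediments since then, which is why nearby high-standing topography implies long-lived local relief.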
In some ways, this is old news. The Eocene sediments in the northern Sierra have long made clear the presence of significant local relief, and many workers had inferred that such relief was probably higher in the southern Sierra (e.g., Wakabayashi and Sawyer, 2001). But the southern Sierra lacked the Eocene sediments necessary to know what the Eocene landscape might have looked like, so this paper opens up a new window for us.
Where does this lead us? Kind of down a rabbit hole, only to come up with no strong and useful statement–though perhaps future work could nail things down. This is more a personal attempt to try to grasp what is going on, so profound errors might exist and insights are few. So, proceed at your own risk….