A curious op-ed in the New York Times on Yosemite. Curious because it points in one direction for a long time before suddenly screeching to a stop and pointing in another. Leveraging the controversy over Confederate monuments and the renaming of some park facilities made necessary by a court battle, Daniel Duane recounts the sad history of Native Americans in California in general and in the valley in particular. Readers can anticipate the point: we should abandon the Euro-Americanisms in the park and revert to the names the Ahwahneechee used. And indeed he reaches this point, only to then ask the descendants and relatives of these people what should be done. Their recommendation: get federal recognition for the tribe and cut back on visitation. “Renaming, [Bill Leonard, a descendant of Tenaya] said, ‘is not going to make us feel any better or more important — the reality is, most of us could care less what they call things.'” You get the feeling Duane was asked by some reader or editor to seek out these people’s views (much as interviews with descendants of slaves and of Confederate generals have appeared) and was given an answer somewhat at odds with the thrust of the piece, which he dutifully tacked on.
Anyway, the summary of injustices is fair (Duane fortunately relies on a couple of quite appropriate references) and something more Americans should be aware of. But he rather lets the Park Service off the hook, hiding its role behind the more generic labels of “park officials” and the “federal government.” Pre-1906 management of the valley by the state allowed the Ahwahneechee to stay in the valley, and while demands for inappropriate “Indian” shows and their menial position in Yosemite Park contrast with what should have been their place as owners and proprietors of the valley, they were at least considered legitimate residents of the place. Federal management systematically marginalized and removed Native Americans; that management was, after 1916, the Park Service. It is disturbing for most Americans to realize that one of the most highly regarded groups of public servants did in fact behave in such a manner. And it is distressing for many who call the national parks “America’s Greatest Idea” to recognize that it was predicated on the exclusion of the peoples who had been there first.
Duane also takes a hesitant slap at John Muir, and here GG asks a bit of forgiveness for delving a bit deeper.
Geologists have for a long, long time been telling people not to build things in certain places. Barrier islands? They move and evolve, which means property comes and goes. Not good. Floodplains? They, um, get flooded. Landslides? Only if you want a mobile home with a mobile yard. Sometimes we get heard, but usually we don’t. And the more subtle stuff, like recognizing how paving large areas can make floods worse? Lots of luck there. Doesn’t matter if the communities are rich or poor, building in bad places seems a national habit.
Maybe that is changing.
Even as the national media seems only now to be noticing that flood insurance encourages building in vulnerable spots, Politico has a big story on Louisiana’s program to consider how some communities will be forced to move and how to prepare to absorb that exodus as it occurs. For the Grumpy Geophysicist, this is a moment of actual hope, a ray of sunshine in the currently clouded-over world of using science to guide public policy. [If you want more darkness, consider that politicians are rewarded for disaster relief, not disaster preparedness.]
The basic point is that people don’t like getting hammered by really bad weather (you know, like floods). And so they leave–and this isn’t typically a slow migration but a real wave of refugees from hurricanes or floods or other such unpleasantries. They don’t often go very far, so neighboring communities are suddenly flooded with people. There are two main prongs to preparing for this: one is to get the vulnerable communities thinking about how they will evolve in the face of the next storm; the other is for those neighboring communities to prepare for the eventual migration of their neighbors. The state is actively trying to do this kind of work.
While there are uncertainties in our future, there are a few things that will happen. There will be sea level rise. There will be bigger rainfall events. These are both so clearly tied to the basic physics of increasing CO2 in the atmosphere that there really is no avoiding them; the best we can do now on that side of the ledger is to try and keep the magnitudes lower than they might otherwise be (and some areas also see land subsidence, which is unrelated to global warming but also causes problems). So we need to prepare, which means surrendering land we cannot defend and defending land we dare not surrender.
That Louisiana is starting to consider this landscape triage may just mean we’ve moved off the “we will rebuild it” mantra of the past century. As the article makes clear, this won’t be easy–but it should be much better than letting the chaos of the next disaster drive change.
A comment on an earlier post got GG reflecting on just what counts as the professional literature. Some 20-30 years ago, things were pretty clear. Professional literature was what was published in journals and certain professional books (like AGU monographs and GSA special papers). These were reasonably well indexed and accessible to academics. Then there was the gray literature: stuff that was sort of out there. This included theses, field trip guides, meeting publications, and reports of various flavors; books outside those professional series were, to some degree, a little less than ideal as well. Finally there was proprietary material, things like industry-acquired reflection profiles and analyses that sometimes were allowed to see the light of day in some compromised form (e.g., location undisclosed). Although these are earth science materials, there are comparable things in other fields.
How is this holding up?
Hot on the heels of the Nature paper complaining about reliance on bibliometric measures of success, we have an Inside Higher Ed piece similarly bemoaning how simple metrics corrupt scientific endeavor.
And so what else showed up recently? Why, two new bibliometric measures! One, the Impact Quotient, frankly does nothing but replace one useless measure (the Impact Factor) with a highly correlated one. The other is the s-index, a measure of how often a worker cites his or her own work.
We are going from trying to figure out something new about how the world works to making sure that everybody knows that we found out something new about how the world works, with the potential that the “something” has become increasingly trivial….
To nobody’s great shock, Adobe recently announced the end of the Flash plug-in for web browsers in 2020. Given the number of iDevices that don’t support Flash and the growth of tools that keep Flash from running, the writing has been on the wall for some time.
Now supposedly this does not mean the end of ActionScript and .swf files and the like, but it feels like an issue is being overlooked. Interactive PDFs would seem to be potential victims of the death of Flash: at present you have to use .swf materials within PDFs (that is, there is no way to include HTML5 in a PDF), and there are indications that displaying these within Acrobat and its kin might require the Flash plugin to be present. Are the .swf format and its capabilities likely to be maintained if Adobe’s Flash-creation tool Animate CC is increasingly used to generate HTML5?
Why bring this up?
An interesting article in The Guardian on the rise of the profit-oriented part of scientific publishing. One part of the article describes how companies like Elsevier and Pergamon make so much money: “It is as if the New Yorker or the Economist demanded that journalists write and edit each other’s work for free, and asked the government to foot the bill.” How much money? Try revenue of $24 billion. Elsevier’s profit margin: 36%.
Now some scientists have argued that journals are outdated and provide no added value; GG has argued this isn’t true. But with the existence of non-profit publishers, does it make sense to feed these very profitable monsters?
Well, no. Worse, many scientists don’t seem to understand that their science is no longer theirs once it is in one of those journals.
Some of us have sworn off Elsevier journals, neither reviewing for them nor publishing in them (though we sometimes get dragged in by colleagues). That means walking away from a lot of poor journals and a few really good ones. In the days of paper journals, this was a clear choice. More recently, Elsevier’s tactics on open access have driven still more away. But close examination of what societies are doing suggests that avoiding vendors many view as unscrupulous is getting harder and harder.
…all the world is a nail. And the currently popular hammers are things like Twitter and Instagram and Tinder. While some have long advocated the first two as important tools for scientists, the last has been used as a model for scanning through preprints. Lots and lots of preprints. The Science story on this says “A web application inspired by the dating app Tinder lets you make snap judgments about preprints—papers published online before peer review—simply by swiping left, right, up, or down.”
Nothing says “science” like “snap judgment”.
While GG has lambasted an effort to cast social media-ish solutions as a means of post-publication peer review, how about tools to let you find what cutting-edge science is appearing? The Science report on social media linked above says that is what social media is good for. Um, really?
GG studies the Sierra Nevada. Try going to Twitter and searching on #SierraNevada. Bet you didn’t think there were that many people so fascinated with taking pictures of beer bottles. Add, say, #science. Chaff winnowed some, but very little wheat. Add #tectonics. Crickets.
The idea of this new app (Papr) is that if only you were able to see lots and lots of stuff quickly, you’d find some gems to explore. Really? Students complain bitterly about a firehose approach in the classroom, and the solution here is, um, a firehose? (To be fair, it appears the app developers are not necessarily expecting great things here).
Forget that. What we want and need are tools that winnow out the chaff, not ones that spray it at us faster.
What we need is something akin to Amazon’s suggestions tool. Imagine visiting the preprint store to get a couple of papers you know you want. One maybe is on a topic you care about–say, the Sierra Nevada. Another maybe deals with a technique, say full waveform tomography. A third uses some unusual statistical tests. You download these, and the preprint store suggests a few other preprints based on the full-text content of the papers you got. Why that instead of keywords? Keywords have a way of being too picky. You might call work “tectonics” and GG might call it “geodynamics,” and so the keyword searches would pass each other by. But if the text is still talking about changes in elevation and changes in lithospheric structure, those are less likely to be overlooked. If this tool is smart enough to recognize quasi-synonyms and phrases, all the better.
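The core of such a full-text recommender is old, simple technology. A minimal sketch of the idea (toy corpus and function names invented here for illustration; a real system would add stemming, stop-word removal, and TF-IDF weighting): score each candidate preprint by the cosine similarity of its word-count vector against the papers you just downloaded, so overlapping vocabulary matters rather than matching keyword labels.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words vector: simple lowercase word counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(downloaded, candidates, top_n=2):
    """Rank candidate preprints by their best similarity to any downloaded paper."""
    scored = []
    for title, text in candidates.items():
        cvec = vectorize(text)
        best = max(cosine(vectorize(d), cvec) for d in downloaded)
        scored.append((best, title))
    return [title for _, title in sorted(scored, reverse=True)[:top_n]]

# Toy example: the "geodynamics" paper never uses the word "tectonics",
# but shared full-text vocabulary still surfaces it.
downloaded = ["elevation change and lithospheric structure of the Sierra Nevada"]
candidates = {
    "geodynamics paper": "lithospheric structure controls elevation change in orogens",
    "brewing paper": "hop aroma chemistry in pale ale fermentation",
}
print(recommend(downloaded, candidates, top_n=1))  # → ['geodynamics paper']
```

The point of the toy run: keyword labels never match, yet the full-text overlap (elevation, lithospheric, structure) pulls the right paper to the top while the beer-bottle literature sinks.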
Such a tool grows more powerful the more you work with it. On that first try you will also get recommendations on papers overlapping in non-interesting ways (say, applications of the techniques in paper 1, the geographic area under study in paper 2, and the measurement types in paper 3), but the more you interact with it, the better it gets.
Here’s the sad thing: the tools to make something like this have been around for decades. The best spam filters (like SpamSieve) use a form of Bayesian filtering based on message content in addition to black- and whitelists. Earth science got much of its literature into a single “preprint store” long ago in GeoScienceWorld. And yet here we are, swiping left again and again and again….
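To make the spam-filter analogy concrete, here is a toy naive Bayes classifier over word counts (not SpamSieve's actual implementation, which is its own; all names and the tiny training corpus are invented for illustration). Train it on papers you kept versus papers you skipped, and it scores new abstracts by log-odds of belonging with the keepers:

```python
import math
from collections import Counter

class PreprintFilter:
    """Toy naive Bayes text filter: the same machinery spam filters use,
    pointed at preprints instead of email."""

    def __init__(self):
        self.counts = {"keep": Counter(), "skip": Counter()}
        self.totals = {"keep": 0, "skip": 0}

    def train(self, label, text):
        words = text.lower().split()
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def score(self, text):
        """Log-odds that the text belongs in the 'keep' pile
        (Laplace smoothing; uniform priors assumed for simplicity)."""
        vocab = len(set(self.counts["keep"]) | set(self.counts["skip"]))
        logodds = 0.0
        for w in text.lower().split():
            p_keep = (self.counts["keep"][w] + 1) / (self.totals["keep"] + vocab)
            p_skip = (self.counts["skip"][w] + 1) / (self.totals["skip"] + vocab)
            logodds += math.log(p_keep / p_skip)
        return logodds

f = PreprintFilter()
f.train("keep", "lithospheric structure sierra nevada tomography")
f.train("skip", "pale ale hop aroma beer photography")
print(f.score("sierra nevada tomography") > 0)  # True: reads like a keeper
print(f.score("pale ale beer") > 0)             # False: reads like the skipped pile
```

Two training sentences are obviously a caricature, but the design point stands: every swipe is training data, so a content-based filter improves with use, which the left-right-up-down firehose never will.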