Living in the West, GG finds sad stories of families losing their houses–or their lives–in wildfires all too common. And as a geophysicist, GG is familiar with the old geoscience adage that earthquakes don’t kill people; built structures failing in earthquakes kill people. It would seem we should adapt that adage to the first situation: wildfires don’t kill people; flaming buildings kill people.
Have you noticed how often the pictures of destroyed houses include green trees nearby? How often the description of a burned-down house includes the oddly unburned things nearby? Although there are certainly fires so intense they take everything in their path, all too often it seems like houses burn when little else does. And this applies even to the devastation in Paradise, California, this week. The Los Angeles Times has a piece noting those still-standing trees and finds that the devastation in Paradise occurred because the fire became an urban fire: houses were igniting other houses.
Basically the issue is that western houses catch burning embers with things like debris-filled gutters, exposed eaves and ventilation grills, and wooden porches. Once lit, houses tend to go up all at once. This is not news–anybody with a house in the forest hears about it from their insurer and local fire officials. Yet new houses continue to be built with the same weaknesses, even in fire-prone areas. (At least most areas ban wood-shake roofs.) Clearly more thought should be given to eliminating the ways houses catch burning embers.
Does this mean we’re off the hook on forest health? Well, probably not, though exactly what that means looks to be up for grabs more than ever. What seems certainly true is that frequent, low-intensity fires reduce the risk of intense, damaging fires: the experience in both Yosemite and Sequoia National Parks is that major fires lie down (lay down?) when they hit areas previously burned in a controlled manner.
But controlled burning isn’t a great option within the rural subdivisions now present in many forests. Thus many advocate for other kinds of treatment, ranging from wholesale clearcutting to selective logging to mechanical thinning of understory (to…raking???). One study recently highlighted in a CNN op-ed concluded rather strongly that more heavily managed forests are forests that burn more intensely–the opposite of what is usually claimed. And GG can attest to how mundane it can be to encounter a natural wildfire in the unmanaged backcountry, having hiked through or right next to such fires on at least three occasions. But there are confounding factors in play, some of which are noted in the study. Certainly one is ignition: wilderness areas usually see fires starting from lightning strikes. Such fires occur under conditions less apt to drive monster fires: the forest has often been wetted, and long periods of strong, dry winds are less likely. In contrast, managed forests are in more heavily used areas where neglected campfires, power lines, driving over dry grass, and sparks from machinery or gunfire are capable of starting a fire when strong, dry winds are present.
[As an aside, the above is all in reference to forests; Southern California chaparral is nearly immune to controlled burning, and unlike the forests, any such burning would leave the ground bare and hydrophobic. Chaparral is a whole different ballgame.]
What remains disturbing to GG in the forestry studies he’s perused is the persistent assumption that “pre-settlement” (apparently the currently favored term for c. 1840s western U.S.) equals “natural”. In some places this will prove true, but in the Sierra foothills it is almost certainly a false equivalence. Pretending that Native American management was “natural” is likely to lead to poor decision making. It would be better if land managers simply sought to restore pre-settlement fire frequency and intensity rather than assuming it was natural. The reality is that many of the places most at risk in the Sierra foothills were occupied by people who had many generations of experience in burning the landscape. We might just want to recognize that: in some instances their management goals might not match ours, but when they do, odds are pretty good that their management schemes would be a good place to start.
This is pretty unusual, but at 9:45 am MST on Nov 14, this is what the USGS earthquake map showed:
Two things stand out. Most amazing, the entire southern hemisphere lacks earthquakes sizable enough to make the map. The second is that the very largest earthquake in the last 24 hours is just a M5.0 in Japan. Given that on any given day you expect to see 4-5 earthquakes larger than that 5.0, and that 7/7 earthquakes above M4 are all in the northern hemisphere, this is highly unusual.
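For the curious, here is a rough back-of-the-envelope sketch of just how unusual that map is. The assumptions are GG-adjacent but not from the post itself: that roughly 1,500 M5+ earthquakes occur globally per year (a commonly quoted long-term average), and that each event is equally likely to fall in either hemisphere and independent of the others (not strictly true, given where the plate boundaries are, but good enough for a sanity check):

```python
# Back-of-the-envelope check on the Nov 14 earthquake map.
# Assumptions (not from the post): ~1500 M5+ quakes globally per year,
# hemispheres equally likely per event, events independent.

m5_per_year = 1500                    # assumed global long-term average
m5_per_day = m5_per_year / 365
print(f"Expected M5+ per day: {m5_per_day:.1f}")

# Chance that all 7 M4+ events land in the northern hemisphere
# if each is a fair coin flip between hemispheres:
p_all_north = 0.5 ** 7
print(f"P(7/7 in the north): {p_all_north:.4f}")
```

Under those assumptions you'd expect about four M5+ events on a typical day, and a 7-for-7 northern-hemisphere sweep less than 1% of the time–consistent with the post's "highly unusual."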
Enjoy it while it lasts….
Once upon a time, having a “subscription” meant that things would come to you until either the term of the subscription ran out or you cancelled the subscription. The stuff that had already come, whether issues of Teen Vogue, the record of the month or volumes of an encyclopedia, were yours to keep. But in the world of the academic library, that model is vanishing, and with it potentially are large parts of the academic literature.
In the paper past, an academic library’s subscription to a professional journal meant that the library got paper copies of the journal that they could then place on shelves and allow people to read. As budgets might tighten or interests wane, libraries would cancel subscriptions–but those journals they had purchased remained on the shelves unless purged to make room for other material. This model is essentially dead.
Instead, publishers have shifted to the software definition of “subscription”–which isn’t really a subscription at all. Just as using Adobe’s Creative Cloud software requires an active subscription, so does getting access to all the issues of Science that you subscribed to over the years. And if the journal decides to go to predatory pricing? Your options are nil. The money you poured into the journal all those years means nothing. In general, libraries are not allowed to make local copies of all the content they subscribe to.
Arguably this is one of the best facets of a true open access policy: the freedom to copy materials means that there can be multiple archives. University archives can legally maintain and share copies of work produced at their institutions. Research groups can maintain thematic collections of articles relevant to their focus. (Note that current open access policies do not necessarily allow this: much as you can view some movies online so long as you watch the ads, some open access materials could require you to access the original portal and, perhaps, see advertisements there). In a sense, this can return libraries to their original function: instead of mere portals for providers, they return to being actual repositories of knowledge. So while we may have permanently lost the meaning of “subscription,” we can recover the true meaning of “library.”
Rather inadvertently, GG has recognized a pattern in some recent grumpiness; oddly enough, it took an article about self-driving cars to really crystallize it. Now of course the specific article GG saw has vanished, but this article covers the same ground. Basically, when something becomes easy, we don’t pay as much attention, which means the ability to do a task atrophies. For cars, we look over our shoulders less if the car is watching our blind spots–which means a driver of a car equipped with such technology won’t look when renting a car lacking that tech.
Earlier GG complained about hikers who don’t take maps and scientists who can’t use library tools–and these seem examples of the same issue. Basically, humans are slackers. Find the easy path and take it. This has GG wondering about the way we teach.
First, students will always complain about doing things the hard way. Why did I have to work through that problem when I could just look up the answer? So courses that train students by making them work are always at risk of earning negative reviews, which can lead to administrators deciding the course should change somehow. Allowing current students to set a curriculum is a disaster in the making.
But what of new learning approaches? The “guide on the side” and the flipped classroom? A blanket condemnation would be unwise–student engagement in solving problems should indeed be helpful. GG has not flipped a classroom but has spoken with those who have, and the word back is mixed. In some classes, many students find that they can skip the pre-class prep, walk in cold, and get by, either through assistance from classmates or by simply dragging the instructor into going over material they should have already examined. Those students would get a punishing homework grade in a traditional classroom but don’t in this environment.
There is a similar bar-lowering going on with content. Courses using group work, in-class exercises, and flipped classrooms simply cannot cover as much material. For advocates of these systems this is good news, as content retention in traditional classes can be awful. But what consistently gets downplayed is that less material is covered. Now, for a survey course for non-majors this is hardly a calamity, but for major courses it can mean serious trouble. As universities demand more core activities, time in major courses only stays level at best. Material gets dropped from the major. Employers will start to notice (that new guy didn’t know about XYZ! Can you believe it?). Universities are not vo-tech schools, but a certain core capability is necessary for employers to build on.
Another article GG can’t find at the moment noted that research into popular learning styles shows such styles of learning are fantasy. This business of catering to visual learning or aural learning or what not is, in the absence of real disability, total BS. Catering to such perceived variability only kills time and keeps a student from developing a more robust ability to absorb information.
Here’s the deal. Learning is hard, failing can be good. You do a total face plant in class, you will work hard to avoid it in the future. Struggle is part of learning. The trick will be to get students to buy into that without hitting stratospheric levels of stress. It could be the dreaded firehose beats a tepid trickle.
Many of you no doubt have heard of the lack of reproducibility of studies in some scientific fields. This has led to condemnation of publications that have rejected or discouraged papers attempting to reproduce some observation or effect.
Now this is not such a big deal in solid earth science (and probably not even climate science, where things are so contentious politically that redoing things is viewed in a positive way). Basically, for most geological observations we have the Earth, which remains pretty accessible to pretty nearly all of us. Raw observations are increasingly stored in open databases (seismology has been at this for decades, for instance). Cultural biases that color some psychological or anthropological works don’t apply much in solid earth, and the tweaky issues of precise use of reagents and detailed and inaccessible lab procedures that have caused heartburn in biological sciences are less prominent in earth science (but not absent! See discussions on how fission track ages are affected by etching procedures, or look at the failure of the USGS lab to use standards properly). We kind of have one experiment–Earth–and we aren’t capable of reproducing it (Hitchhiker’s Guide to the Galaxy notwithstanding, there is no Earth 2.0).
No, the problem isn’t failing to publish reproductions. It is failing to recognize when we are reproducing older work. And it is going to get worse.
As GG has noted before, citations to primary literature are becoming more and more scarce despite tools that make access to that literature easier and easier. This suggests that less and less background work is being done before studies move forward: in essence, it is easier to do a study than to prepare for it. The end result is pretty apparent: new studies will fail to uncover the old studies that essentially did the same thing.
Reexamining an area or data point is fine so long as you recognize that is what you are doing, but inadvertently conducting a replication experiment is not so great. Combine this with the already-sloppier-than-desired citation habits we are forming, and we risk running in circles, rediscovering what was already discovered without gaining any insight.
Just taking a quick minute to de-grumpify. Lots of phone calls of late, clearly many from polling firms. Which is annoying. Years ago you’d be flattered to get to be in a poll; now you’d be flattered if they lost your phone number.
And yet you go to, say, FiveThirtyEight, and they are constantly saying “we need more polls”. Why? So we know how the horserace is shaping up! And of course the candidates are running polls (most to learn about the sentiment of the electorate, but some of course are push polls designed to push you away from a candidate). FiveThirtyEight regularly asks “good use of polling or bad use of polling” in some of their podcasts.
Here’s the answer. All of this polling is a waste of polling. Why? Consider this sentence from a piece in the New York Times discussing the massive disconnect between what congressional staffers think the public wants and the actual opinions of their bosses’ constituents:
Since most congressional offices cannot regularly field public opinion surveys of their constituents, staff members depend heavily on meetings and relationships with interest groups to piece together a picture of what their constituents want.
This is precisely when you would want to have good polling. Every Congressman and Senator should be asking: what do my constituents want on issue X?* Why not save that polling money for a really good use (or even run these polls during election season) and forget the horse-race polls? Because, you know, we have a really solid, reliable poll–and it reports back to us on Tuesday.
* Not that the politician has to do what they want–you might use the information to recognize the need to educate your constituents.
Are college administrators making graduates dumber?
GG is late to this party but got pointed this way by an Ars Technica review of Tom Nichols’s book on the death of expertise. But this dimension of his argument is pretty well articulated in an article he wrote last year for The Chronicle of Higher Education. It is an interesting and well-written article; the point GG is moving from is Nichols’s indictment of colleges and universities treating students as customers instead of as students, which leads to intellectual laziness. There is much to ponder there, but let’s look at why GG is indicting administrators.
Here is what matters to administrators: butts in seats, dollars in endowments (and dollars in research accounts). Period. Why be this crude and casual? Because we see on a near-daily basis the advice we are given from on high and the actions accompanying it. One of the reasons college has gotten expensive is the accommodations; here at CU, there seems to be a constant renovation of dorms to make them more attractive. Nice dining facilities, all kinds of recreational activities, a big new rec center–and CU is probably trailing the pack at that. Update the chemistry building? Don’t have the bond space right now, because we’re building a better football stadium for, you know, the alums who we hope will donate money.