We’ve discussed isostasy a few times here, but today let’s stand back and ask: how do we determine what has led to the creation of isostatically supported topography? For today we’ll put aside dynamic topography and concern ourselves only with isostatically supported topography, which seems likely to describe much of the US Cordillera. For this post, we’ll also focus on the crustal part of the problem, leaving the mantle for another day.
OK, first up is that isostasy means that the integral of density from the surface to some depth of compensation (usually somewhere in the asthenosphere) is constant. So how do we get at density at such great depths? At first blush you might think “gravity,” as that is the geophysical observable produced by mass. The problem is that gravity is non-unique: you can recreate any gravity field with a thin surface layer of varying density. Gravity gradients tell you the maximum depth at which an anomaly can lie, and the integral over a broad region tells you the total mass surplus or deficit relative to some reference. Those integrals support isostasy, but the gradients are tough to work with because isostasy is only thought to work well at wavelengths long enough that the strength of the lithosphere becomes irrelevant. So in essence you need to smooth gravity out to appropriate wavelengths–and once you do that, the depth limits in the raw gravity are pretty much gone.
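To make the column-balance bookkeeping concrete, here is a minimal Python sketch of an Airy-style balance between two crustal columns. The densities and the reference crustal thickness are illustrative assumptions, not values from any particular study.

```python
# Airy-style isostatic balance: equal pressure at the compensation depth.
# All densities (kg/m^3) and thicknesses here are illustrative assumptions.
RHO_CRUST = 2800.0
RHO_MANTLE = 3300.0

def airy_elevation(crust_thickness_m, ref_crust_m=35_000.0):
    """Elevation relative to a reference column (ref_crust_m of crust
    sitting at sea level), assuming the extra crust is split between
    surface uplift and a root displacing denser mantle."""
    extra = crust_thickness_m - ref_crust_m
    # A fraction (rho_m - rho_c)/rho_m of the extra crust sticks up;
    # the remainder forms the compensating root.
    return extra * (RHO_MANTLE - RHO_CRUST) / RHO_MANTLE

# 50 km of crust vs. a 35 km reference: roughly 2.3 km of elevation
print(f"{airy_elevation(50_000.0):.0f} m")
```

The point of the exercise: differences in the density–thickness integrals between columns, not the gravity field itself, are what set relative elevation.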
So with gravity being relatively useless, where do we go? Keep in mind that we’ll want to compare two columns so we can discern what happened at one relative to the other to produce a difference in elevation.
One of the chestnuts of earth science for more than 100 years has been the concept of isostasy, which more or less holds that the outer part of the earth (the crust and mantle lithosphere) floats on the far less viscous asthenosphere. Although the concept was floated in 1855 by Pratt and Airy in seeking to explain geodetic discrepancies in the great survey of India caused by the Himalaya, the term was coined by USGS scientist Clarence Dutton in 1889, and the implications were brilliantly described (along with the basics of plate flexure) by G. K. Gilbert in 1890, much of the geological (and especially the geophysical) community had no use for isostasy until 1914-5, when Joseph Barrell’s work advocating for an asthenosphere from multiple lines of evidence was published (in nine parts: quite the serial publication). So while we could be celebrating the centenary of physically-based isostasy, in some ways we appear to be ready to bury it. Why? Consider this plot from Moucha (2008):
This shows the predicted dynamic topography of the western U.S.; the scale is in kilometers, so about 2 km of variation of topography in the U.S. would seem to be from dynamic effects (the paper’s title cites ‘mantle convection’ as the cause). Given that average elevations in the west are only occasionally up around 3 km, this would seem to make isostasy moot; the mantle convection alluded to is shown to be extending to the outer core, far below the asthenosphere. The paper’s abstract makes it clear that the authors are attributing much of the modern elevation of the Colorado Plateau (outlined above) to mantle convection and dynamic topography:
Herein we compute the viscous flow beneath North America that is driven by density anomalies inferred from joint seismic-geodynamic modeling. We find that the Colorado Plateau overlies a strong mantle upwelling that is coupled to the sinking Farallon slab, currently beneath the eastern United States. Consequently, the Colorado Plateau is currently a focused dynamic topography high within the western U.S. Cordillera.
So we’ve had a century of misdirection, right? Isostasy is dead, no?
On the heels of the dispiriting notion of advocacy journals, GG would like to ask just who is reading these professional journals, anyways? It seems from what funders and some journal owners are doing that they expect Joe Sixpack to be picking up the Journal of Winter Nighttime Reading and perusing its contents. Does this make any sense?
First off, why does GG think this? Well, several journals are now publishing “plain language” versions of the abstract (that appear to be required). Given that the background for some concepts some papers explore could well take a textbook chapter, this is often mildly amusing. Second, the government is pushing that all federally funded research be made fully accessible to any member of the public when published (yes, GG knows that isn’t the way this is usually phrased, but as members of the public can in fact access paywalled journals by visiting a library with a subscription, it’s not like this stuff has been locked in a file cabinet somewhere). Why demand this unless you expect people outside the field to read this stuff?
OK, is this a good idea? Let’s try a few analogies and see what we think. A mechanic gets an update from Honda on the proper way to replace a certain engine gasket for cars with a particular kind of fuel injector. Unless you are big on engine repair, do you understand what is different? Why it matters? Or your system administrator gets an advisory from Microsoft that the .dll file in your Windows server allows unencrypted access to the trusted machines file whenever a superuser command is issued and should be edited, although this might result in the loss of ability to remotely access other servers. Do you have any idea what this means? Is this important? Should you do it?
Look, this isn’t to argue that the science literature should be kept under lock and key; it is to say that it is for people trained to use that literature. GG does not want scientific papers to be explaining the assumptions of isostasy every time they use the phrase, or defining a P-wave, or describing how a seismometer works each and every time any of these are brought up. What’s more, at the cutting edge, a lot of published material is wrong. An old saw is to go in front of a class with a textbook and a journal and say that 10% of the textbook is wrong, but only 10% of the journal’s science will prove to be right. Assembling that textbook out of the wilderness of the professional literature is a demonstration of skill and knowledge. It is easy, through a combination of misrepresentation and misunderstanding, to claim the professional literature says something it really doesn’t (the young earth/intelligent design literature is full of examples of this). No amount of plain language abstracts and immediate access will improve the situation.
Only a relatively small fraction of the science we do really translates to something the public might find immediately interesting. Press releases are made for those cases, and if skillfully crafted, they work quite well at conveying the science to a broader public. (It used to be you would get an assist from the science reporter at a paper, but that task is now generally handled by a beat reporter with no special skill). For those few who are really into the science and are willing to work, in essence, to train themselves, the language in the abstract isn’t going to be much of a barrier, and for most, access to a library is pretty important in getting up to speed anyways.
So who should be the target of the professional literature? The public at large? No. Those setting policy? No. Professional journals fill a quite specialized niche, as well they should, in communicating amongst professionals. The Journal of Winter Nighttime Reading shouldn’t be a junior Scientific American wannabe.
A discussion with colleagues brought up an interesting question: What examples do we have from the geological record of a broad elevated ramp like we have today in the High Plains of the United States? The answer is somewhat unclear as it depends on the cause of the uplift.
Some proposals mean that the lithosphere is permanently changed. In this case, assuming isostasy over the long haul, areas about 1500 m above sea level will eventually end up near sea level–but because erosion is isostatically compensated by uplift, to do that you have to strip off about 7 times that 1500 m, meaning that about 10 km of the upper crust erodes off. That would take away pretty much all the sedimentary rock in the region and eat into the crystalline rock underneath. In the end, the area might resemble the Canadian Shield, a vast expanse of middle crust sitting at the surface. So could places with exposures of such previously deep rocks be the products of whatever created the High Plains?
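A quick back-of-the-envelope check on that factor of ~7, assuming Airy isostasy; the densities below are assumed typical values chosen for illustration, not numbers from the post.

```python
# Check the "erode ~7x the elevation" figure, assuming Airy isostasy.
# Densities (kg/m^3) are assumed typical values, not from the post.
RHO_CRUST = 2830.0   # upper/mid crustal rock (assumed)
RHO_MANTLE = 3300.0  # asthenospheric mantle (assumed)

# Erosion is compensated by uplift, so permanently lowering the surface
# by 1 m requires eroding rho_m / (rho_m - rho_c) meters of rock.
factor = RHO_MANTLE / (RHO_MANTLE - RHO_CRUST)
erosion_km = 1.5 * factor   # bring ~1500 m of elevation down to sea level

print(f"amplification factor: {factor:.1f}")  # ~7.0
print(f"total erosion: {erosion_km:.1f} km")  # ~10.5 km
```

With slightly different density choices the factor ranges from roughly 5 to 8, so the ~10 km of erosion is order-of-magnitude robust.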
Alternatively, the modern topography is ephemeral, perhaps a product of dynamic topography or a thermal rejuvenation of the continental interior. In this case, it is a bit of a race between erosion and subsidence. The faster the area subsides, the greater the record that is preserved for the distant future. Our modern uplift would be an unconformity between the material still remaining and the overlying sediments. One wonders if some of the unconformities out there might reflect a similarly broad and extensive uplift.
Maybe High Plains-type uplifts are somewhat more common in geologic history than we would guess. It could be one of those things that you have to believe before you can see it…
One of the most popular explanations for the High Plains is that they were dragged upward by a buoyant body, probably in the upper mantle under the Rio Grande Rift. This is arguably the only late Miocene to Pliocene event one could plausibly associate with post-Ogallala Formation tilting. GG has tended to be dismissive of this but hasn’t been through the math. Now there must be a simple analysis somewhere in the literature, but GG isn’t seeing it, so let’s make a simple model and see what it takes to make it work. We’ll assume a north-south trending horizontal cylinder with some density contrast under an elastic plate represents the source of uplift (although many folks like a “broken” plate, the physics of such a boundary are inappropriate here). We’ll place the cylinder at a depth z and calculate the uplift and the gravity anomaly from this body. We’ll tweak these until we can fit the observations.
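Here is a rough sketch of what that forward model might look like: the gravity anomaly of an infinite horizontal cylinder plus the flexural uplift of a continuous elastic plate under the equivalent buoyant line load. Every parameter value below (elastic thickness, cylinder radius, depth, density contrast) is a placeholder to be tweaked against the observations, not a fitted value.

```python
import numpy as np

# --- Illustrative parameters (assumptions, to be tuned to observations) ---
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
g = 9.8                # gravitational acceleration, m/s^2
E, nu = 7e10, 0.25     # Young's modulus (Pa) and Poisson's ratio
Te = 30e3              # effective elastic thickness, m (assumed)
D = E * Te**3 / (12 * (1 - nu**2))   # flexural rigidity, N m

rho_m = 3300.0         # mantle density, kg/m^3
drho = -50.0           # cylinder density contrast (buoyant), kg/m^3
a = 30e3               # cylinder radius, m (assumed)
z = 60e3               # cylinder axis depth, m (assumed)

x = np.linspace(-400e3, 400e3, 801)   # profile distance from cylinder axis

# Gravity anomaly of an infinite horizontal cylinder:
#   g_z(x) = 2 G lambda z / (x^2 + z^2), lambda = mass per unit length
lam = np.pi * a**2 * drho
grav_mgal = 2 * G * lam * z / (x**2 + z**2) * 1e5   # m/s^2 -> mGal

# Flexural deflection of an infinite plate under a line load P (N/m):
#   w(x) = (P alpha^3 / 8D) exp(-|x|/alpha) (cos|x|/alpha + sin|x|/alpha)
P = -lam * g                            # upward buoyant force per meter
alpha = (4 * D / (rho_m * g))**0.25     # flexural parameter
xa = np.abs(x) / alpha
w = (P * alpha**3 / (8 * D)) * np.exp(-xa) * (np.cos(xa) + np.sin(xa))

print(f"flexural parameter: {alpha/1e3:.0f} km")
print(f"peak uplift: {w.max():.0f} m, peak anomaly: {grav_mgal.min():.1f} mGal")
```

The exercise is then to vary drho, a, and z until both the uplift profile east of the Rift and the gravity field are matched; note the trade-off between a buoyant, shallow body (big gravity low) and a weak, deep one (subdued anomaly).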
Now we have a little difficulty in that the modern topography is due to more than just the Rift: the sub-Ogallala unconformity reveals rather clearly that there were east-flowing streams when deposition began, meaning that topography back then was tilted to the east, though that potentially was very close in time to deposition. So that topography was presumably compensated by some mechanism that might be well distributed (e.g., variation in crustal thickness). Since the free-air anomaly across the Plains is near 0, the Bouguer anomaly for locally compensated topography should change by about -0.112 mGal per meter of elevation. We’ll just add that to our theoretical models as needed.
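That 0.112 mGal/m figure is just the Bouguer slab effect for the conventional reduction density; a one-line check (the 2670 kg/m³ density is the standard assumption, not something special to the Plains):

```python
import math

# Bouguer slab effect per meter of topography, using the conventional
# reduction density of 2670 kg/m^3: slab = 2 * pi * G * rho.
G = 6.674e-11                     # gravitational constant, SI units
rho = 2670.0                      # kg/m^3
slab = 2 * math.pi * G * rho      # acceleration per meter of elevation
print(f"{slab * 1e5:.3f} mGal/m")  # -> 0.112 mGal/m
```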
The problem is that we don’t know how much topography we want to ascribe to the late Cenozoic Rift: one extreme view (seemingly that of Eaton, 1986, 1987, 2008) is that things were pretty flat prior to the Rift on an east-west profile, with major rivers going more or less directly to the coast to the south-southeast; another is that there was some gradient, though much lower than today (e.g., McMillan et al., 2002). Let’s tackle both and see what we get. In both cases we will focus on the topography east from about 105°W and we’ll place the cylinder at 106°W, under the axis of the Rift.
Recently NSF’s EarthScope program office put out a media announcement with the top ten discoveries they attributed to the soon-to-end program. (EarthScope, for those unfamiliar with the program, originally had three main legs: the Transportable Array (TA) + Flex Array collection of seismometers, the Plate Boundary Observatory (PBO) network of GPS stations, and the San Andreas Fault Observatory at Depth (SAFOD), a drill hole through the fault). What struck GG about this collection was just how little we learned about tectonics, which was a selling point of sorts for the program prior to its start.
Now some of the “discoveries” are not discoveries at all–one listed is that there is a lot of open data. Folks, that was a *design*, not a discovery. A couple are so vague as to be pointless–North America is “under pressure” and there are “ups and downs” in drought–stuff we knew well before EarthScope, so these bullets give little insight into what refinements arose from EarthScope. And then the use of LIDAR to look at displacements of the El Mayor-Cucapah earthquake was hardly a core EarthScope tool or goal, even if the program might have contributed funds. So the more substantive stuff might amount to 5 or 6 points.
Arguably PBO has more than delivered and SAFOD disappointed, but GG would like to consider the TA’s accomplishments–or non-accomplishments. TA-related “discoveries” in this list are actually a single imaging result and two technique developments (ambient noise tomography, which emerged largely by happy coincidence, and source back projection for earthquake slip, which is largely a continued growth of preexisting techniques). So in terms of learning about the earth, we are really looking at one result worthy of inclusion.
How should one read a scientific paper? As presenting conclusions one should take as our best estimate of truth? Or as information one can use to test competing hypotheses? You might think it must be one or the other, but that is rarely the case.
Consider the just-published paper by Bahadori, Holt and Rasbury entitled “Reconstruction modeling of crustal thickness and paleotopography of western North America since 36 Ma”. From the abstract you might be tempted to say that this paper is solving a problem, in this case the Late Cenozoic paleoelevation history of the western U.S.:
Our final integrated topography model shows a Nevadaplano of ∼3.95 ± 0.3 km average elevation in central, eastern, and southern Nevada, western Utah, and parts of easternmost California. A belt of high topography also trends through northwestern, central, and southeastern Arizona at 36 Ma (Mogollon Highlands). Our model shows little to no elevation change for the Colorado Plateau and the northern Sierra Nevada (north of 36°N) since at least 36 Ma, and that between 36 and 5 Ma, the Sierra Nevada was located at the Pacific Ocean margin, with a shoreline on the eastern edge of the present-day Great Valley.
There is one key word in that paragraph that should make you careful in accepting the results: “model”. What is the model, and how reliable is it?
Three years ago the Grumpy Geophysicist made his debut, enticing 447 visitors over the remainder of 2014 into this odd collection of rants. That was about 4 visitors per post (yes, things are better now). As noted in the “About” page, though, this has never been about getting lots of likes, it is rather a combination of therapy and writing practice. Nevertheless, on occasion GG has accidentally stumbled into something others found interesting (well, a few, not like anything here has gone viral), and so was curious just what those interesting posts were. So without further ado, a few of the most viewed posts from the first three years of the Grumpy Geophysicist (giving many of you a chance to see what you missed…which, perhaps, will confirm why you weren’t looking here earlier). (Small posts don’t get counted so thoroughly).
One of the questions from the staff at UC Press about GG’s upcoming book was, could this be used in a class? GG’s first response was, well no, it wasn’t written that way. But thinking on it, maybe there is a role there. This is more a reference post to consider the possibility…
Drop in to a bookstore and browse their American history area, or maybe biography or possibly even science or nature and look for books about geologists. Odds are you can find some biographies on John Wesley Powell and Clarence King. Wander anywhere near Yellowstone and you are sure to encounter the Hayden survey, run by Ferdinand V. Hayden, and you are likely to see Clarence Dutton appearing in tomes on the Grand Canyon. And these were names that were prominent at the time, too.
But professional geologists see it differently. Consider this: works authored by John Wesley Powell were cited 310 times since the start of 2000. Ferdinand Hayden? 44 times. Clarence Dutton? 140 times. Really, not bad for guys whose work is mostly or entirely well over 100 years old.
There’s this other fellow, Grove Karl Gilbert, who died in 1918; his last publication, drawn from his notes, was published in 1928. He never led a survey of his own or ran the USGS. There is one real biography available through Amazon (and probably not your local bookseller). Yet his work has been cited 1,544 times since the beginning of 2000, according to Science Citation Index. That is more than three times the sum of those other guys. Even established international geologists fade against Gilbert’s record: for instance, Louis Agassiz, who developed the theory of ice ages, merited only 393 citations since 2000 despite the rapid growth of interest in climate and paleoclimate.
Now part of this, you could argue, is that Gilbert wrote a lot–except he wasn’t prone to writing a large number of short papers, as we are today. He wrote some hefty tomes–Monograph #1 of the USGS might easily outweigh the complete production of many modern geologists (arguably both in volume and impact). Because he did stay in geology and didn’t, for instance, wander into public policy and ethnology as did Powell, there was a greater focus in his work. But Gilbert also lost a lot of time to administrative work in the USGS; he was also very generous with his time.
What makes Gilbert so widely cited was the degree to which he was outside his time. In 1883, he offered the first earthquake forecast at the very same time he was asserting that earthquakes were caused by slip on faults that could in many instances be seen to slip at the surface. In many ways, he defined geomorphology as it is understood today; he provided the basic observational and experimental work to understand transport of sediment. He made one of the first clear demonstrations of isostasy, carrying it even farther to make clear the concept, if not the terminology, of elastic plates. He described one major form of deltas so well that they are called Gilbert deltas. He was an early advocate of the use of multiple working hypotheses (again anticipating several others in this). By seeking to understand the basic physics or process underlying phenomena, he made contributions that can continue to be applied to phenomena today; many of his contemporaries, though, were too mired in specifics or tangled in trying to make a preferred hypothesis fit observations.
So, if any historians of science are out there looking for somebody worth studying, have at G.K. Gilbert. It seems like his profile in bookstores should be elevated some….