A recent, lightly peer-reviewed paper in the Proceedings of the Royal Society
explains the distinction between analytical physics models of the climate and purely numerical models — i.e. the field known as computational fluid dynamics (CFD). The lack of an intense review cycle makes for a very readable paper, with a refreshingly conversational writing style that the editors apparently allowed. The gist of the piece is that analytically based Geophysical Fluid Dynamics (GFD) models of the atmosphere and ocean are essential to making sure that the CFD models are on the right track.
From the start, Vallis points to the importance of simplified models for the geosciences:
This article discusses the role of geophysical fluid dynamics (GFD) in understanding the natural environment, and in particular the dynamics of atmospheres and oceans on Earth and elsewhere. GFD, as usually understood, is a branch of the geosciences that deals with fluid dynamics and that, by tradition, seeks to extract the bare essence of a phenomenon, omitting detail where possible. The geosciences in general deal with complex interacting systems and in some ways resemble condensed matter physics or aspects of biology, where we seek explanations of phenomena at a higher level than simply directly calculating the interactions of all the constituent parts. That is, we try to develop theories or make simple models of the behaviour of the system as a whole. However, these days in many geophysical systems of interest, we can also obtain information for how the system behaves by almost direct numerical simulation from the governing equations. The numerical model itself then explicitly predicts the emergent phenomena—the Gulf Stream, for example—something that is still usually impossible in biology or condensed matter physics. Such simulations, as manifested, for example, in complicated general circulation models, have in some ways been extremely successful and one may reasonably now ask whether understanding a complex geophysical system is necessary for predicting it. In what follows we discuss such issues and the roles that GFD has played in the past and will play in the future.
From the passages that I highlighted in bold, you can see that two themes are important: first, that simplified models are accepted in other scientific disciplines, and second, that more concise models are essential to fostering an understanding of the fundamental behaviors. This is not exactly groundbreaking advice; scientists such as David Mumford have long elaborated on this approach, and I was well aware of its importance, having been professionally involved in an extended project (DOI) to incorporate simplified models of the environment into an organizational framework. As for Vallis's mention of condensed matter physics, I understand well how vital that is, as I cut my teeth on that thesis topic.
To see the converse, consider this recent paper discussing the possibility of complex simulations and artificial-intelligence networks becoming permanently inscrutable, thereby potentially producing misleading results that are discovered only after being applied in the field. The linked article describes incidents that have occurred in the medical field.
As the Vallis paper is very quotable, I will provide a running commentary on some passages.
"The moniker ‘GFD’ has also come to imply a methodology in which one makes the maximum possible simplifications to a problem, perhaps a seemingly very complex problem, seeking to reduce it to some bare essence. It suggests an austere approach, devoid of extraneous detail or superfluous description, so providing the fundamental principles and language for understanding geophysical flows without being overwhelmed by any inessentials that may surround the core problem. In this sense, GFD describes a method as well as an object of study."
I have to ask: who is actually practicing such an "austere approach" in the atmospheric sciences? Certainly not Lindzen, who (before he retired) wrote some of the most arcane and inscrutable tracts one can imagine.
"Although one might think that such a method is, of course, entirely appropriate in all scientific areas, in some branches of science there is a tendency to embrace the complexity of reality by using complicated models, to which we add processes whenever possible rather than taking them away."
Note the portion I bolded. A few months back, I promised at the end of this blog post to write a piece called "Needless Complexity", and in fact I did. But even though the post was clearly labelled as a diversion, I got hammered by one commenter who thought I was way off-base. So I moved it from a blog post to a linkable page (which unfortunately deleted the original comments). In any case, here is that piece:
That page goes on with admittedly subjective commentary, but one point I want to emphasize is the claim that no one truly understands GCMs. That comes from a recent PhD thesis by J-P Michael at FSU, a hotbed of ENSO research.
Also telling is Vallis's description of emergent phenomena.
"The simplifications sought by GFD are not quite like that—they are more akin to those sought by biologists, or condensed matter physicists, or anyone dealing with a complex subject that contains emergent phenomena. An emergent phenomenon is one that emerges from the collective behaviour of the constituents of a system, and is not a property of its individual components—its equivalent atoms or its primitive building blocks; emergence is a manifestation of a group behaviour. Perhaps the most familiar example is temperature, which is a collective property of the molecules of a system and is proportional to the mean kinetic energy of molecules in a gas; the phenomenon of phase transitions is another example in physics."
So that leads to the following kind of assertion made by Geller et al, which I dare to question in the link below.
More than once does Vallis invoke comparisons to physical phase transitions:
"Some of the main goals (and past triumphs) of GFD lie in explaining ‘fluid-dynamical emergent phenomena’, for example the Gulf Stream in the Atlantic, or a hurricane, for these are not properties of a fluid parcel. However, compared with the complexity of biological systems, or even in some ways phase transitions, these phenomena occupy something of a half-way house: we can seek high-level explanations (theories) of the phenomena, but we can also simulate some of these phenomena quite well using the basic laws of physics, as expressed by the Navier–Stokes equations and associated thermodynamical and radiative equations. These days we do a far better job of describing ocean currents using a numerical simulation than using any theory, analytical or otherwise, that seeks to directly predict them using some more holistic method. Similarly, the climate contains a turbulent fluid (the atmosphere) but our most accurate descriptions of the future climate are made by attempting to simulate the individual eddies over the course of decades and centuries, somewhat akin to following molecules in a simulation of a gas, rather than trying to construct a macroscopic theory of climate. Given all this, is there any need to seek a high-level explanation? In other words, do we still need GFD? The answer, it turns out, is yes, but at the same time GFD needs to continue to evolve and to draw from and give to those large numerical simulations, else it will become irrelevant."
I can relate to this passage, especially the bolded part. Try to explain the phase transition between the alpha and beta phases of tin: been there, and it is still a puzzle. Vallis calls atmospheric physics a "halfway house", which rings true for those of us who have seen real complexity.
"I should also emphasize that GFD is not, or should not be, a purely analytical–theoretical endeavour. Rather, and without seeking a definition, a GFD approach means seeking the most fundamental explanation of a phenomenon, specifically in the geosciences and often of complex phenomena. Using an idealized numerical model with simple equations (but perhaps complex output) certainly falls under the rubric of GFD and modern GFD relies as much on such simulations as it does on conventional ‘paper and pencil’ theory."
That's what I am trying to do in modeling the QBO: a combined "paper and pencil" theory followed by a computational evaluation. The idea was to follow up on Laplace's original formulation, of which Vallis gives a nice historical accounting:
"It may have been Laplace who, in about 1776 (English translation is in ), was the first to use the fluid equations in a GFD context—he wrote down the linear shallow water equations on a sphere, in the rotating frame of reference (and thus with the Coriolis terms) and forced by an external potential. His goal was to understand the tides and he gave some partial solutions."
For the QBO work, the key to a simplified Laplace formulation was to evaluate the Coriolis contribution at the equator, where the Coriolis forces cancel, leaving the other first-order forces (i.e. lunisolar) to predominate.
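To make the equatorial cancellation concrete, here is a minimal Python sketch of my own (not from the Vallis paper) using the standard definition of the Coriolis parameter, f = 2Ω sin(φ); the latitudes chosen are arbitrary illustration points:

```python
import math

OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def coriolis_parameter(lat_deg):
    """Standard Coriolis parameter f = 2 * Omega * sin(latitude)."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

# f shrinks toward zero as the equator is approached, where it vanishes
# exactly, leaving the weaker lunisolar forcing as the first-order term.
for lat in (45.0, 10.0, 1.0, 0.0):
    print(f"lat {lat:4.1f} deg -> f = {coriolis_parameter(lat):.3e} 1/s")
```

At φ = 0 the parameter is identically zero, which is why a model pinned to the equator can drop the Coriolis terms entirely.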
"Contrary to its reputation as a difficult subject, GFD makes things easier".
The emphasis is his, and it is one I agree with.
"The unresolved breaking of gravity waves is a third example, important in both atmosphere and ocean: in the former, the breaking both provides a drag on the flow and produces the quasi-biennial oscillation (QBO)"
Is the breaking of the QBO simply related to the sin(A·sin(ωt)) amplitude folding that falls out of the solution to the Laplace tidal equations? The inner modulation is due to the lunisolar forcing, while the outer sine effectively places a cap on the amplitude of the modulation. If the inner modulation grows large enough, the amplitude folds back on itself. When a gravity wave breaks on open water, it has run up against a physical limit in some characteristic, but is the breaking of wind fundamentally different? The QBO wind speed never exceeds a certain level. So do the CFD/GCM models adequately explain this, or is it left to a simplified GFD model such as the one I am proposing for the QBO?
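The folding behavior is easy to demonstrate numerically. Here is a minimal sketch of my own; the amplitude A and frequency ω are arbitrary stand-ins for the forcing strength and rate, not fitted values from any QBO model:

```python
import math

def folded_wave(t, A=4.0, w=1.0):
    """sin(A*sin(w*t)): the inner sine is the modulation, and the outer
    sine caps the output at +/-1 no matter how large A becomes."""
    return math.sin(A * math.sin(w * t))

# Sample one full cycle of the inner modulation.
ts = [i * 2.0 * math.pi / 1000.0 for i in range(1000)]
peak = max(abs(folded_wave(t)) for t in ts)

# With A > pi/2 the inner argument sweeps through pi/2, so the output
# touches ~1 mid-cycle but folds back down to sin(A) at the crest.
at_crest = folded_wave(math.pi / 2.0)  # inner argument equals A here
print(peak, at_crest)
```

The peak never exceeds 1 regardless of A, which is the "cap" on the amplitude, while the dip back toward sin(A) at the crest is the fold.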
So read what Vallis has to say:
"The understanding that GFD and simpler models provides is of key practical importance, for suppose we put a lot of effort into producing a large numerical model, and then that numerical model produces the wrong answer, or it produces an answer that differs from another model. What then? We should try to improve the subgridscale representation, but in a complex system like the atmosphere with many feedbacks this is extremely difficult, and virtually impossible unless we have some level of intuitive understanding of the system as a whole, as discussed by Held . If a GCM does not produce a good QBO, we expect to be able to fix it by increasing the resolution and lowering the diffusivity, and by ensuring that tropical convection produces gravity waves of the correct magnitude. But we only know this because we have an understanding of the nature of the QBO, and we cannot apply the fix if we do not understand tropical convection and gravity waves."
My emphasis bolded. So the question is: Do the majority of the numerical climate modelers truly understand the causal factors behind QBO, or are they suggesting that the cyclic behavior is simply "emergent" and thus potentially inscrutable when all is said and done? Or do they fall back to Lindzen's interpretation and leave it at that?
"Scientists will always have personal preferences and differing expertise,but combining analytical ideas with simple numerical models can be a very powerful tool in both research and education, and modern tools can be used to enable this at an early stage in the classroom. A numerical model transparently coded in 100 lines and run on a laptop can then play a similar role to that of a rotating tank in illustrating phenomena and explaining what equations mean, and the rift between theory, models and phenomena then never opens".
Vallis's emphasis bolded. In other words, if everything is in sync, many of the ensuing complications never arise. So it's entirely possible that the endless tweaking of GCMs stems from inadequacies in some of the foundational theories. For a three-dimensional Navier-Stokes model, if the boundary conditions are not set properly, you can end up on a wild goose chase caused by unjustified preconditions. For the QBO, if the boundary conditions are in fact set by the lunisolar tide interacting with a strong seasonal signal, yet the numerical models do not assume this, then everything that comes after is suspect. That is where the rift Vallis describes opens.
For me, it is no skin off my back if this is the case; I have nothing invested in years of climate models. But for a scientist such as Lindzen, his foundational theory of the QBO collapses to some extent. And it's not me saying this, but Vallis:
"I hope that it is by now clear that GFD has played an enormous role in the development of our understanding of the natural world. With the emergence of complicated models that role is more important than ever: it may be hidden, like the foundations of a building, but without that foundation the edifice will come tumbling down. I will continue to do GFD because it is interesting, important and fundamental."
My emphasis bolded. Edifice or artifice? The word was probably carefully chosen.
Is the detail shown above accurate? How can we know without being able to nail down the more fundamental behaviors?
"Other people’s motivation may differ but whatever the future holds GFD, and the approach it brings, has (or should have) an expansive role to play."
Amen to that. Vallis is guarding his opinion here, as he can never truly guess someone's motivation or whether it is promoting some other agenda. For someone like Lindzen, I wouldn't put it past him to choose artifice over edifice. His original model of the QBO is just that complicated!
I recommend reading Vallis's entire paper, both for its frankness and for his nifty writing style. It could have been mistaken for a blog post; it's that good! Θ