Raising the Bar on ENSO Model Validation

I have been using the Azimuth Project Forum as a sounding board for the ENSO Model [1,2,3,4,5,6,7,8]. The audience there is very science-savvy, so they are not easily convinced of the worth of any particular finding (or of whether it is correct in the first place). They also tend to prefer pure math, because it can be kept sufficiently detached from the muddy world of applied physics that one avoids being labeled as "right" or "wrong". With math one can always come up with a formulation that exists on its own terms, separate from any practical application.

So trying to convince those folks of the validity of the ENSO model is difficult at best.

Recently the advice has been to do statistical validation on the model. One participant recommended I try an experimental approach:

"I still don't have the spare cycles to address this fully, but given that one of the two terms of an AIC or BIC is the log likelihood and there is not a closed form representation of the likelihood in this case, I'd probably explore either the empirical likelihood work of Art Owen and his students, for one thing as packaged in the emplik R package, or possibly Approximate Bayesian Computation (ABC; see also and here)."

I am not going to go to the trouble of "exploring" some unaccepted statistical validation procedure, when I am having enough of a challenge defending the ENSO model physics. What am I supposed to do -- defend someone else's empirical statistical research in addition to defending my own work? No thanks.

It always seems to be about #RaisingTheBar to see what someone will do to defend their results.

Fair enough.

So in this post I will show an overwhelming piece of evidence that the modeling work is on the right track.


LOD as a sloshing forcing?

Is it possible that the LOD (length-of-day) correlation to multidecadal global temperature variations is just an example of a forcing response to an ocean basin's sloshing behavior?

Changes in LOD are changes in angular momentum, and those get directly translated into forcing inputs to the precariously stable thermocline. The ~4 year lag between forcing and response may be explained by the thermal mass in the system.


The QBOM Part 2

I previously wrote about the Quasi-Biennial Oscillation (QBO) and its periodic behavior, and in particular how it interacts with ENSO -- tentatively as a forcing. The ability of QBO to recreate the details of the ENSO behavior is remarkable. The possibility exists, however, that the forcing connection may be more intimately related to tidal torques that force QBO and ENSO simultaneously. Over a year ago, I first showed how the lunar tidal periods can be pulled from the QBO time series. See Figure 1 here, where the synodic month of 29.53 days is found precisely.

An interesting hypothesis is that the draconic lunar month of duration 27.2122 days may also be a common underlying significant driver. Unfortunately, the QBO and ENSO data are sampled at only a monthly rate, so we can’t do much to pull out the signal intact from our data ... Or can we?

What’s intriguing is that the driving force isn't at this monthly level anyway, but is likely the result of a beat between the monthly tidal signal and the yearly signal. Strong tidal forces are expected to interact with seasonal behavior in such a situation, so we should be able to see the effects of the oscillating tidal signal where it constructively interferes during specific times of the year. For example, a strong tidal force during the hottest part of the year, or an interaction of the lunar signal with the solar tide (a period of precisely 6 months), can pull out a constructively interfering signal.

To analyze the effect, we need to find the tidal frequency and un-alias the signal by multiples of 2π, so that the draconic frequency of 2π/(27.2122/365.25) = 84.33 rads/year becomes 2.65 rads/year after removing 13 × 2π worth of folded signal. This then has an apparent period of 2.368 years.
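
As a quick check on this arithmetic, here is a minimal Mathematica sketch of the un-aliasing step (the 13 × 2π multiple is the only assumption beyond the lunar and calendar constants):

(* un-aliasing the draconic lunar frequency against monthly sampling *)
draconicYears = 27.2122/365.25;   (* draconic month expressed in years *)
wTrue = 2 Pi/draconicYears        (* ~84.33 rads/year *)
wAlias = wTrue - 13*2 Pi          (* ~2.65 rads/year after removing 13 folded cycles *)
2 Pi/wAlias                       (* apparent period, ~2.37 years *)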

This post will go into more detail and show how a combination of the synodic tide and draconic tide cycles are the primary forcers for QBO.


Baby D Model

In the last post on characterizing ENSO, I discussed the trend I observe in the modeling process -- the more I have worked on it, the simpler the model has become. This has culminated in what I call a "baby" differential equation model, a model reduced to the bare minimum.

This has partly been inspired by Professor John Carlos Baez's advice over at the Azimuth Project:

"I think it's important to take the simplest model that does a pretty good fit, and offer extensive evidence that the fit is too good to be due to chance. This means keeping the number of adjustable parameters to a bare minimum, not putting in all the bells and whistles, and doing extensive statistical tests"

Well, I am now working with the bare minimal model. All I incorporate is a QBO forcing that matches the available data from 1953 to the present. Then I apply it as the forcing to a 2nd-order differential equation with a characteristic resonant frequency \omega_0 :

 f''(t) + \omega_0^2 f(t) = Forcing(t) = qbo(t)

I adjust the initial conditions and then slide the numerically computed Mathematica output to align with the data.
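
For concreteness, a minimal sketch of that integration in Mathematica might look like the following; the forcing stand-in, the frequency value, and the initial conditions are all placeholders rather than the fitted values:

(* sketch: integrate f''(t) + w0^2 f(t) == qbo(t) and plot the response *)
w0 = 2 Pi/4.25;                  (* hypothetical characteristic frequency *)
qbo[t_] := Sin[2 Pi t/2.33];     (* stand-in for the fitted QBO forcing *)
sol = NDSolve[{f''[t] + w0^2 f[t] == qbo[t],
    f[0] == 0, f'[0] == 0.1}, f, {t, 0, 100}];
Plot[Evaluate[f[t] /. sol], {t, 0, 100}]

Sliding the computed output in time to align with the SOI data is then a one-parameter adjustment.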

The figure below is the empirical fit to the QBO, configured as a Fourier series so I can extrapolate backwards to 1880.
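
A sketch of how such a Fourier-series fit can be set up, with qboData a hypothetical list of {year, value} pairs and arbitrary starting guesses for the frequencies:

(* sketch: fit a small Fourier series to the QBO record for back-extrapolation *)
basis = Sum[a[k] Sin[w[k] t + p[k]], {k, 1, 4}];
params = Flatten[Table[{a[k], {w[k], 2 Pi (0.3 + 0.05 k)}, p[k]}, {k, 1, 4}], 1];
fit = NonlinearModelFit[qboData, basis, params, t];
Plot[fit[t], {t, 1880, 2015}]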

Figure 2 is a fit using the first 400 months of ENSO SOI data as a training interval, validated against the last 800 months (100 years total).

The next fit uses the second 400 months of data as a training interval, which is then validated forward and backward by 400 months.

The final fit uses the last 400 months of data and validates against the first 800 months.

In each case, the overall correlation coefficient is above 0.7, which is as much as can be expected given the correlation coefficient of the empirical model fit to the raw QBO data.

At this point we can show a useful transformation.

 f''(t) + \omega_0^2 f(t) = qbo(t)

If we take the Fourier transform of both sides, then:

 (-\omega^2 + \omega_0^2) F(\omega) = QBO(\omega)

Rearranging,

 F(\omega) = QBO(\omega) / (\omega_0^2 - \omega^2)

This implies that the power spectra of SOI and QBO should contain many of the same spectral components, but with amplitudes scaled by the factor 1/(\omega_0^2 - \omega^2). That is why the SOI, i.e. f(t) or F(\omega), contains stronger long time-period components, while the QBO shows the higher-frequency components, e.g. the principal 2.33 year period, more strongly.
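
A quick numerical illustration of that scaling, with a hypothetical resonant period of ~4.25 years standing in for the fitted value:

(* sketch: relative amplitude gain of forcing components through the resonator *)
transfer[w_, w0_] := 1/(w0^2 - w^2);
w0 = 2 Pi/4.25;
Abs[transfer[2 Pi/2.33, w0]]  (* ~0.2 : the fast QBO term is attenuated *)
Abs[transfer[2 Pi/6.5, w0]]   (* ~0.8 : a slow wobble-scale term passes ~4x stronger *)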

One may ask, what happened to the Chandler wobble component that has been discussed in previous posts?

It is still there, but now the wobble forcing is absorbed into the QBO. Using the recently introduced and experimental FindFormula machine-learning function in Mathematica, the 6+ year Chandler wobble beat period shows up in the top 3 cyclic components of QBO! See the following figure with the yellow highlight. It is not nearly as strong as the 2.33 year period, but because of the spectral scaling discussed in the previous paragraph, it impacts the ENSO time profile as strongly as the higher-frequency components.



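A minimal sketch of that kind of FindFormula search, again with qboData as a hypothetical {year, value} list:

(* sketch: ask FindFormula for a few candidate symbolic fits to the QBO series *)
candidates = FindFormula[qboData, t, 3];
candidates // TableForm
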
The caveat in this analysis is that I didn't include data after 1980, which is explained by the non-stationary disturbance of the climate shift that began to be felt after 1980. This will require an additional 400-month analysis interval that I am still working on.

"Can you tell me in words roughly what these 20 lines do, or point me to something you've written that explains this latest version?"

It is actually now about half that, if the fitting to the QBO is removed. The part related to the DiffEq integration itself is just a couple of lines of Mathematica code. The rest is for handling the data and graphing.

The fit to the QBO can be made as accurate as desired, but I stopped when it reached the correlation coefficient shown in the first figure above. The idea is that the final modeling result will have a correlation coefficient of about that value as well (since the residual propagates as an error signal), which it appears to have.

The geophysics story is very simple to explain: it is still sloshing, but the forcing is now reduced to one observable component. If I want to go this Baby D Model route, I can shorten my ENSO sloshing paper from its current 3 pages to perhaps a page and a half.

The statistical test proposed is to compare against an alternate model and then see which one wins on an information criterion such as the Akaike. I have only a few adjustable parameters (2 initial conditions, a characteristic frequency, and a time shift from QBO to ENSO), so this will probably beat any other model proposed. It certainly will beat a GCM, which contains hundreds or thousands of parameters.
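
For a least-squares fit, that comparison reduces to a few lines; this sketch uses the standard AIC form for Gaussian residuals, and babyModelResiduals is a hypothetical list of fit residuals:

(* sketch: Akaike information criterion for a least-squares model;
   AIC = n Log[RSS/n] + 2 k, smaller is better *)
aic[residuals_List, k_Integer] := Module[{n = Length[residuals], rss},
  rss = Total[residuals^2];
  n Log[rss/n] + 2 k]

aic[babyModelResiduals, 4]  (* 2 initial conditions + frequency + time shift *)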


ENSO redux

I've been getting push-back on the ENSO sloshing model that I have devised over the last year. The push-back revolves mainly around my reluctance to use it for projection, as in immediately. I know all the pitfalls of forecasting -- the main one being that if you initially make a wrong prediction, even with the usual caveats, you essentially don't get a second chance. The other problem with forecasting is that it is not timely; in other words, one has to wait around for years to prove the validity of a model. Who has time for that? :)

Yet, there are ways around forecasting into the future, one of which involves using prior data as a training interval and then using other data in the timeline (out-of-band data) as a check.

I will give an example of using SOI training data from 1880 to 1913 (400 months of data points) to predict the SOI profile up to 1980 (800 months of data points). We know, and other researchers [1] have confirmed, that ENSO undergoes a transition around 1980, which obviously can't be forecast. Other than that, this is a very aggressive training set, as it relies on ancient historical data that some consider not the highest quality. The results are encouraging, to say the least.
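
The bookkeeping for that kind of out-of-band check is straightforward; in this sketch, soiData is a hypothetical {year, value} list and ensoModel stands in for the model fitted on the training interval:

(* sketch: fit on 1880-1913, then score on the held-out 1913-1980 span *)
train = Select[soiData, 1880 <= First[#] < 1913 &];
validate = Select[soiData, 1913 <= First[#] < 1980 &];
(* ... fit ensoModel to train ... *)
Correlation[ensoModel /@ validate[[All, 1]], validate[[All, 2]]]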


ENSO Transformation

After playing around with the best way to map a model to ENSO, I am coming to the conclusion that the wave equation transformation is a good approach.

Fig. 1:  Upper is the wave transformation fit to SOI, lower is the difference error signal, very uniform and flat.

What the transformation amounts to is a plot of the LHS vs the RHS of the SOI differential wave equation, where the Mathieu (or Hill) factor k(t) is estimated to correspond to a characteristic frequency of ~1/(4 years),

 SOI''(t)+k(t) SOI(t) = F(t)

and where F(t) is the forcing function, composed of the main suspects: QBO, Chandler wobble, TSI, and the long-term tidal beat frequencies.
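
One way to sketch the transformation is to differentiate an interpolation of the SOI record directly and overlay the two sides; soiData and forcing here are hypothetical stand-ins:

(* sketch: evaluate the LHS of SOI''(t) + k(t) SOI(t) from the data itself *)
soi = Interpolation[soiData];                (* {year, value} pairs *)
k[t_] := (2 Pi/4.0)^2;                       (* constant stand-in for the Hill factor *)
lhs[t_] := soi''[t] + k[t] soi[t];
Plot[{lhs[t], forcing[t]}, {t, 1880, 2013}]  (* compare against the candidate forcing *)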

I added a forcing break at 1980, likely due to TSI, to get the full 1880 to 2013 fit.

The wavelet scalogram agreement is very impressive, indicating that all frequency scales are being accounted for.

Fig 2: Wavelet scalogram, scale in months since 1880.

This is essentially just a different way of looking at the model described in the arXiv ENSO sloshing paper. What one can do is extend the forcing and then integrate the differential equation forward to project the ENSO waveform into the future.



The Hidden Harmony of ENSO

With this analysis, I wanted to demonstrate the underlying order of the most concise SOI Model. This model characterizes the salient fit parameters:

  1. Two slightly offset forcing sinusoids which match the average QBO forcing cycle
  2. A forcing sinusoid that maps to the frequency of the Chandler wobble beat
  3. A Mathieu modulation perturbing the 2nd-order DiffEq with a periodicity of about 8 years

This set of four parameters was used to model both the modern-day records corresponding to the atmospheric pressure data behind the Southern Oscillation Index and proxy records of historical coral data. The parameters seem to match closely over widely separated time intervals (see Figure 5 in the latter link).
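
A sketch of how such a concise model can be assembled; every numeric value below is a placeholder rather than a fitted parameter:

(* sketch: two offset QBO-scale sinusoids plus a Chandler-wobble-beat sinusoid,
   driving a 2nd-order DiffEq with an ~8 year Mathieu modulation *)
force[t_] := Sin[2 Pi t/2.33] + Sin[2 Pi t/2.50] + 0.5 Sin[2 Pi t/6.4];
k[t_] := (2 Pi/4.0)^2 (1 + 0.2 Cos[2 Pi t/8]);
sol = NDSolve[{f''[t] + k[t] f[t] == force[t],
    f[0] == 0, f'[0] == 0}, f, {t, 0, 130}];
Plot[Evaluate[f[t] /. sol], {t, 0, 130}]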

Figure 1 is the modern-day SOI record, suitably filtered to show the multi-year excursions.

Fig 1: SOI Data. The waveform is erratic, to say the least.

It is amazing that so erratic a waveform can be modeled by a limited set of parameters that actually make some physical sense, but that is nature for you, and it is the idea behind "sloppy modeling" -- models that use just a few parameters to accurately describe a behavior. The simplified model strongly suggests that there is a hidden harmony acting to drive ENSO.


Lithium Battery Characterization

In the menu under Stochastic Analysis, I have a white paper called "Characterization of Charge and Discharge Regimes in Lithium-Ion Batteries".

This is a breakthrough in modeling the fat-tail behavior of Lithium-Ion batteries, and something that has a lot of practical analysis benefits considering the push toward commonplace use of Li+ technology (see Tesla's Powerwall and review).

From the introduction:

"Modeling with uncertainty quantification has application to such phenomena as oxidation, corrosion, thermal response, and particulate growth. These fall into the classes of phenomena governed substantially by diffusional processes. At its most fundamental, diffusion is a model of a random walk. Without a strong convection or advection term to guide the process (e.g. provided by an electric or gravitational field), the kinetic mechanism of a particle generates a random trajectory that is well understood based on statistical physics principles. The standard physics approach is to solve a master diffusion equation under transient conditions. This turns into a kernel solution that we can apply to an arbitrary forcing function, such as provided by an input material flux or thermal impulse. In the case of a rechargeable battery, such as Li+, the flux is charged ions under the influence of an electric field."

Alas, when I tried to submit the paper to arXiv as a preprint, it got rejected. The first time it was rejected due to a mix-up in the citation numbering. The second time they said it was removed from the publication queue without exactly saying why, suggesting it be submitted to a "conventional journal" instead.

I do not need that kind of hassle. I can just as easily DIY.