Definition of 'Autocorrelation' - Willis Eschenbach

By steven d keeler | Apr 26, 2014

A mathematical representation of the degree of similarity between a given time series and a lagged version of itself over successive time intervals. It is the same as calculating the correlation between two different time series, except that the same time series is used twice: once in its original form and once lagged by one or more time periods.
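To make that definition concrete, here is a minimal Python sketch (the function name `autocorr` is my own, not from any particular library): it correlates a series against a lagged copy of itself, exactly as described, and compares white noise to an AR(1) "red noise" series.

```python
import numpy as np

def autocorr(x, lag=1):
    """Pearson correlation between a series and a copy of itself
    shifted by `lag` time steps."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# White noise has essentially no autocorrelation; AR(1) "red noise"
# (each value is mostly the previous value plus a small shock) has a lot.
rng = np.random.default_rng(0)
white = rng.standard_normal(1000)
red = np.zeros(1000)
for t in range(1, 1000):
    red[t] = 0.9 * red[t - 1] + white[t]

print(autocorr(white))  # close to 0
print(autocorr(red))    # close to 0.9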

If you take a random window on a highly autocorrelated “red noise” dataset, the extreme values (minimums and maximums) are indeed more likely, in fact twice as likely, to be at the start and the end of your window rather than anywhere in the middle.
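The claim is easy to check by simulation. The sketch below is my own illustration (the AR(1) coefficient of 0.95 and the 10% slice widths are arbitrary choices): it generates many red-noise windows, records where the minimum and maximum fall, and compares the per-slice rate of extremes at the ends of the window against a same-width slice in the middle.

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials, phi = 200, 5000, 0.95
end_hits = mid_hits = 0

for _ in range(trials):
    # AR(1) red noise: x[t] = phi * x[t-1] + white noise
    e = rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    for pos in (np.argmin(x), np.argmax(x)):
        frac = pos / (n - 1)
        if frac < 0.10 or frac > 0.90:    # first or last 10% of the window
            end_hits += 1
        elif 0.45 <= frac <= 0.55:        # a middle 10% slice, for comparison
            mid_hits += 1

# For white noise the end slices and the middle slice collect extremes
# at roughly the same per-slice rate; for strongly red noise the
# per-slice rate at the ends is several times higher.
print("per-slice ratio, ends vs middle:", end_hits / 2 / mid_hits)
```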

I’m sure you can see where this is going … you know all of those claims about how eight out of the last ten years have been extremely warm? And about how we’re having extreme numbers of storms and extreme weather of all kinds? If you say “we are living today in extreme, unprecedented times”, mathematically you are likely to be right, even if there is no trend at all, purely because the data is autocorrelated and “today” is at one end of our time window!

We are indeed living in extreme times, and we have the data to prove it!

Of course, this feeds right into the AGW alarmism, particularly because any extreme event counts as evidence of how we are living in parlous, out-of-the-ordinary times, whether hot or cold, wet or dry, flood or drought.

On a more serious level, it seems to me that this is a very important observation. Typically, we assume the odds of being in extreme times are equal across the time window. That’s not true. As a result, we incorrectly take the occurrence of recent extremes as evidence that the bounds of natural variation have recently been overstepped (e.g. “eight of the ten hottest years”, etc.). This finding shows that we need to raise the threshold for what we consider to be “recent extreme weather”, because even if there are no trends at all we are living in extreme times, so we should expect extreme weather.

Of course, this applies to all kinds of datasets. For example, we are currently at a low extreme in hurricanes … but is that low number actually anomalous when the math says that we live in extreme times, so extremes shouldn’t be a surprise?

I propose that we call this the “End Times Effect”: the tendency of extremes to cluster in recent times simply because the data is autocorrelated and “today” is at one end of our time window … and the corresponding tendency for people to look at those recent extremes and incorrectly assume that we are living in the end times when we are all doomed.

Draw a parabola. Now pick a random interval on the x-axis. No matter what interval you pick, at least one endpoint of that interval will be an extreme (if the vertex is not in your interval, then both endpoints will be extremes). Realize that any functional relationship that goes up, down, or both will have subsets that are somewhat parabolic in shape.
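A quick numerical check of the parabola point (purely my own illustration): sample y = x^2 on randomly chosen intervals and ask where the extremes land.

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(5):
    a, b = np.sort(rng.uniform(-10, 10, size=2))
    xs = np.linspace(a, b, 1001)   # sample the random interval [a, b]
    ys = xs ** 2                   # the parabola y = x^2
    # The maximum of x^2 on [a, b] always sits at an endpoint; the
    # minimum leaves the endpoints only if the vertex x = 0 is inside.
    print(f"[{a:6.2f}, {b:6.2f}]  "
          f"max at endpoint: {np.argmax(ys) in (0, 1000)}  "
          f"min at endpoint: {np.argmin(ys) in (0, 1000)}")
```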

Blogger "Seattle" says :

For a Wiener process (a random walk comprising infinitesimally small random steps), the “Arcsine laws” apply.

The arcsine law says that the time at which the maximum occurs on an interval, say [0, 1], has the distribution function (2 / pi) * arcsin(sqrt(x)).

Differentiating that expression yields the probability density:

1 / (pi * sqrt(x) * sqrt(1 - x))

Note that this density diverges at x = 0 and x = 1: the maximum is far more likely to occur near the ends of the interval than anywhere in the middle, which is exactly the clustering of extremes described above.

 

https://www.wolframalpha.com/input/?i=plot+1%2F%28pi*sqrt%28x%29*sqrt%281-x%29%29
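The differentiation is easy to verify symbolically as well; here is a sketch using SymPy (my choice of tool, equivalent to the Wolfram Alpha check above):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
F = 2 / sp.pi * sp.asin(sp.sqrt(x))   # arcsine distribution function
f = sp.simplify(sp.diff(F, x))        # its density

print(f)                           # equivalent to 1/(pi*sqrt(x)*sqrt(1 - x))
print(sp.integrate(f, (x, 0, 1)))  # 1, as a probability density should be
```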
