Bubble Markets, Burst Markets

Wall Street forecasters are notoriously bad at predicting what the markets are going to do.  In fact the forecasts for 2001, 2002, and 2008 were actually worse than guessing.  Granted, predicting the future is a hard job, but when it comes to stock markets, there are some things you can count on.  Disclaimer:  This is a look at the numbers; it is not investment advice.

Let’s take the Standard & Poor’s 500.  It is an index of the stock values of 500 large US companies, much broader than the Dow Jones’s 30-company average.  It isn’t a stock, and you can’t buy shares in it.  But it is a convenient tool for tracking the overall condition of the stock market.  It may also reflect on the state of the economy, which we’ll look at in a bit.  Below are the monthly closing values of the S&P 500 since 1950.  Its value was about 17 points in January of 1950 and it closed around 2100 points here in June of 2015.  It’s bounced around plenty in between.

Closing values of the S&P 500 stock index.

One of the questions to ask is whether the markets are overvalued or undervalued.  Forecasters hope to predict crashes, but also to look for good buying opportunities.  Short term fluctuations in the markets have proven to be very unpredictable.  But longer term trends are a different story, and looking at them can give huge insights into what’s currently going on.

But first we have to look at the numbers in a different way.  The raw data plot above makes things more difficult than they really need to be because it fails to let you clearly see the trend in how the index grows.  Stocks have a tendency to grow exponentially in time.  This is no secret, and most of the common online stock performance charts give you a log view option.  Exponential growth is why advisors recommend that most working people get into investments early and ride them out for the long haul.

The exponential growth in the S&P is easy to see in the plot below, where I plotted the logarithm of the index value.  For convenience I also plotted the straight line fit to these data — this is its exponential trend.  Note that these data span six and a half decades, so we have some bull and bear markets in there — and whatever came in between them.  And what you see is that no matter what short term silliness was going on, the index value always came back down to the red line.  It didn’t necessarily stay there very long, but the line represents the stability position.  It is a kind of first order term in a perturbation theory model, if you will.  The line shows the value that the short term fluctuations move around.

Here I’ve taken the logarithm (base 10) of the index values to show the exponential growth trend. The grey area represents the confidence intervals.
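If you want to reproduce this kind of fit yourself, a minimal sketch in Python might look like the following.  The `years` and `close` arrays here are placeholders standing in for the monthly closing data; any source of the S&P 500 monthly closes will do.

```python
import numpy as np

# Placeholder data: fractional years and matching monthly closing values.
# In practice these would be the ~780 monthly closes from 1950 onward.
years = np.array([1950.0, 1950.0833, 1950.1667])   # illustrative values only
close = np.array([17.0, 17.2, 17.3])               # illustrative values only

# Fit a straight line to log10(close) vs. time; a straight line in log space
# is an exponential trend in linear space.
slope, intercept = np.polyfit(years, np.log10(close), 1)

def trend(t):
    """Model (stable-position) value of the index at time t, in fractional years."""
    return 10 ** (intercept + slope * t)
```

With the full dataset, `trend()` evaluated at any date gives the stable-position value that the fitted line represents.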

This return to the line is a little bit clearer if we plot the difference between the index and the trend.  This would seem to be a reasonable way to spot overvalued or undervalued markets.  Meaning that in 2000, when the S&P was some 800 points over its long-term model value, the rapid drop back down to the line should have caught no one by surprise.

Differences between the S&P 500 index value and the exponential trend model value.

But this look at the numbers is a bit disingenuous.  That’s because the value of the index has changed by huge amounts since 1950, so small point swings that we don’t care about at all today were a much bigger deal then.  This makes more recent fluctuations appear to be a bigger deal than they really are.  So what we want to see is the percentage change, not the absolute change.

And on top of this, let’s mark recession years (from the Federal Reserve Economic Database) in red.  From this view we can see the bubble markets develop and the panics that result when they burst (hello 2008).  We can also see that every recession brought a drop in the index (some bigger than others), but not every index drop represented a recession.  In the tech bubble of the late 1990s the market was 110% overvalued at its peak.  The crash of 2008 dropped it to about 45% below the model value, which is considerably undervalued.  All that in 8 years.  I think it’s safe to call that a panic.  I know it made me panic.

Deviations in the S&P 500 index value from the exponential model are shown as a percentage of the index values. And recession years (from FRED) are shown in light red.
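Continuing the earlier sketch (and reusing its `years`, `close`, `slope`, and `intercept`), the percent deviation comes from a couple of lines.  I take the deviation relative to the model value here, which is an assumption on my part (dividing by the index value is the other natural choice); the thresholds below are purely illustrative, not part of the analysis.

```python
# Model value at each month, from the fitted exponential trend.
model = 10 ** (intercept + slope * years)

# Deviation as a percentage of the model value:
# positive = overvalued relative to the trend, negative = undervalued.
pct_dev = 100 * (close - model) / model

# Illustrative flags only; the article doesn't define formal thresholds.
bubble = pct_dev > 50     # e.g. the late-1990s tech bubble
panic = pct_dev < -30     # e.g. the 2008-2009 trough
```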

What we see is that the exponential model does a good job at calculating the baseline (stable position) values.  If it didn’t, the recession-related drops in the index wouldn’t line up with the FRED data, and things like the 1990s bubble and the 2008 financial meltdown wouldn’t match the timeline.  But they do.  Quite well, actually.  So this is a useful analysis tool.

It is also enlightening to take the same looks at the NASDAQ index since it represents a different sector of the stock market.  NASDAQ started in 1971 and is a more technology-focused index.  The NASDAQ composite index is created from all of the stocks listed on the NASDAQ exchange, which is more than 3000 stocks.  So more companies in the index means this is a broader look, but one focused on tech stocks.

So, as with the S&P above, here are the raw data.  It looks similar to the S&P, and the size of the tech bubble is clearer.  The initial monthly close of the index was 101 points, and it is over 5000 today.

Closing values of the NASDAQ stock index.

Not surprisingly, this index also grows with an exponential trend.  The NASDAQ was absolutely on fire in the late 1990s.  I wonder if this is what Prince meant when he wanted to party like it was 1999.  Maybe he knew that would be the time to cash out?

Here I’ve taken the logarithm (base 10) of the index values to show the exponential growth trend. The grey area represents the confidence intervals.

The size of the dot-com bubble is clearer if we look at the deviation from the model, as we did with the S&P.  At the height of the tech bubble, the NASDAQ was about 3500 points overvalued.  Considering that the model puts its expected value at about 1300 points in 2000, I have to ask myself, what were they thinking?

Differences between the NASDAQ index value and the exponential trend model value.

The percent deviation plot shows this very clearly.  At the height of the tech bubble, the NASDAQ was some 275% overvalued, almost three times the S&P 500’s overvaluation.  Before the late 1990s the NASDAQ had never strayed more than about 50% from the model value.  Warren Buffett has said that the rear view mirror is always clearer than the windshield, but maybe Stevie Wonder shouldn’t be the one doing the driving.

Deviations in the NASDAQ index value from the exponential model are shown as a percentage of the index values. And recession years (from FRED) are shown in light red.

From this perspective, the NASDAQ today actually looks a few percentage points undervalued, so tech still seems to be a slightly better buy than the broader market (this is not investment advice).

Not only that, but the growth model of the NASDAQ, based on its 45 years of data, shows that it grows considerably faster than the broader market.  If you go back and look at the raw data for either of the two indices, you’ll notice something special about the nature of exponential growth.  The time it takes to double (or triple, etc.) is a constant.  Since these are big numbers, and because it is convenient, let’s look at the time it takes to grow by a factor of ten (decuple).  The S&P 500 index decuples every 33.3 or so years.  The NASDAQ composite, on the other hand, decuples every ~24 years (about 23 years and 11 months, give or take).  This has huge implications for growth.  That’s nine fewer years to grow by the same factor of 10.
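The decupling time falls straight out of the fitted slope: the trend gains `slope` units of log10 per year, so gaining one full unit (a factor of ten) takes 1/slope years.  A tiny sketch, using an illustrative slope of 0.03 log10-units per year, which corresponds to roughly the S&P’s 33-year decupling:

```python
import math

def decuple_time(slope):
    """Years for the trend to grow tenfold, given the slope in log10 units per year."""
    return 1.0 / slope

def doubling_time(slope):
    """Years for the trend to double, from the same slope."""
    return math.log10(2) / slope

print(decuple_time(0.03))   # ~33.3 years (roughly the S&P 500's historical rate)
print(doubling_time(0.03))  # ~10 years
```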

Now comes the dangerous part.  Let’s take both of these indices and forecast their model values out thirty years.  Both of the datasets contain more than thirty years’ worth of data, so forecasting this far out is a bit of a stretch, but not without some reasonable basis.  Still, this is an exercise in “what if,” not promises, and certainly not investment advice.

Since we started with the S&P, let’s look at that first.  If the historic growth trends continue, the model forecasts that the S&P 500 (currently around 2000 points) should be bouncing around the 10,000 point mark some time in the middle of 2038.

S&P 500 data, along with its exponential model fit, extended out thirty years. The grey area represents the confidence intervals.

The NASDAQ, on the other hand, which is currently around 5000 points, should average around 10,000 in late 2021, and 100,000 near the end of 2045.  (Note: the S&P should be around 16,000 points at that time).  Today the ratio of the NASDAQ to the S&P is about 2.4.  But in 2045 it could reasonably be expected to be more than 6.  Depending on the number of zeroes in your investment portfolio (before the decimal point…), that could be significant.

NASDAQ data, along with its exponential model fit, extended out thirty years. The grey area represents the confidence intervals.
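The forecast dates quoted above are just the fitted line solved for a target value.  A small sketch, reusing `slope` and `intercept` from the earlier S&P fit; the `nasdaq_slope` and `nasdaq_intercept` names are hypothetical stand-ins for a second fit done the same way on NASDAQ data:

```python
import math

def year_when(target, slope, intercept):
    """Year at which the trend model 10**(intercept + slope * year) reaches `target` points."""
    return (math.log10(target) - intercept) / slope

# e.g. year_when(10_000, slope, intercept) for the S&P 500 fit (~mid-2038 per the text),
# or year_when(100_000, nasdaq_slope, nasdaq_intercept) for the NASDAQ fit (~late 2045).
```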

This forecasting method will not predict market crashes.  But that’s OK, because the professionals who try to forecast them can’t do that either.  (Now if only Goldman Sachs would hire me.)  What it can do is give us a very clear idea of whether the market is over- or undervalued.  By forecasting the stable position trend, we can easily spot bubbles, identify their size, and perhaps make wise decisions as a result.

The Adjacent Possible and the Law of Accelerating Returns

A concept that inventor and futurist Ray Kurzweil drives home in his books is what he calls the Law of Accelerating Returns.  That is, the observation that technology growth (among other things) follows an exponential curve.  He shows this over no small number of pages for a variety of technologies and concepts.  Most famous is Moore’s Law, in which Gordon Moore (one of the founders of Intel Corporation) observed that the number of transistors on a die doubled in a fixed amount of time (about every two years).  Kurzweil argues that this exponential growth pattern applies to both technological and biological evolution.  In other words, that progress grows exponentially in time.  It should be clear that this is an observation rather than something derived from fundamental scientific theories.

What makes this backward-looking observation particularly interesting is that, even though it has held generally true over vast periods of time, humans are very linear thinkers and have a difficult time envisioning exponential growth rates forward in time.  Kurzweil is a notable exception to that rule.  Because of exponential growth, the technological progress we make in the next 50 years will not be the same as what we have realized in the last 50 years.  It will be very much larger.  Almost unbelievably larger: the equivalent of the progress made in the last ~600 years.  This is the nature of exponential growth (and why some people find Kurzweil’s predictions difficult to swallow).

Interestingly, when Derek Price surveyed the scientific literature in 1961, he readily observed exponential growth in scientific publications, but dismissed it as unsustainable; to Price, it was obvious that such growth could not continue.  The survey was revisited in 2010 (citing the original work), with the exponential growth still being observed 39 years later.  So this linear forecasting is a handicap that seems to exist even when we have the data to the contrary staring us in the face.

On the other hand we have biologist Stuart Kauffman.  He introduced the concept of the Adjacent Possible, which was made more widely known in Steven Johnson’s excellent book, Where Good Ideas Come From.  The Adjacent Possible concept is another backwards-looking observation that describes how biological complexity has progressed through the combining of whatever nature had on hand at the time.  At first glance this sounds sort of bland and altogether obvious.  But it is a hugely powerful statement when you dig a little deeper.  This is a way of defining what change is possible: combining things that already exist is how things of greater complexity are formed.  Said slightly differently, what is actual today defines what is possible tomorrow.  And what becomes possible will then influence what can become actual.  And so on.  So while dramatic changes can happen, only certain changes are possible based on what is here now.  And thus the set of actual/possible combinations expands in time, increasing the complexity of what’s in the toolbox.

Johnson describes it in this way:

“Four billion years ago, if you were a carbon atom, there were a few hundred molecular configurations you could stumble into.  Today that same carbon atom, whose atomic properties haven’t changed one single nanogram, can help build a sperm whale or a giant redwood or an H1N1 virus, along with a near infinite list of other carbon-based life forms that were not part of the adjacent possible of prebiotic earth.  Add to that an equally expansive list of human concoctions that rely on carbon—every single object on the planet made of plastic, for instance—and you can see how far the kingdom of the adjacent possible has expanded since those fatty acids self-assembled into the first membrane.” — Steven Johnson, Where Good Ideas Come From

Kauffman’s complexity theory is really an ingenious observation.  Perhaps what is most shocking is that, given how obvious it is in hindsight, no one managed to put it into words before.  I should note that Charles Darwin’s contemporaries expressed the same sentiment about his theory.

What is next most shocking is that Kauffman’s observation is basically the same as Kurzweil’s.  We have to do a little bit of math to show this is true.  I promise, it isn’t too painful.

The Adjacent Possible is all about combinations.  So first let’s assume we have some set of n objects.  We want to take k of them at a time and determine how many unique k-sized combinations there are.  This is popularly known in mathematics as “n choose k.”  In other words, if I have three objects, how many different ways are there to combine them two at a time?  That’s what we’re working out.  There is a shortcut in math notation that says if we are going to multiply a number by all of the positive integers less than it, we can write the number with an exclamation mark.  So 3x2x1 would simply be written as 3!, and the exclamation mark is pronounced “factorial” when you read it.  This turns out to be very helpful in counting combinations.  Our n choose k counting problem can then be written as:

(n choose k) = n! / (k! (n − k)!)

You can try this out for relatively small numbers for n and k and see that this is true.
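For instance, here is a quick way to check it with Python’s standard library, where math.comb computes n choose k directly:

```python
from math import comb, factorial

n, k = 3, 2
# n choose k from the factorial formula...
by_formula = factorial(n) // (factorial(k) * factorial(n - k))
# ...matches the built-in combination count: three ways to pick two cards out of three.
assert by_formula == comb(n, k) == 3
```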

The pertinent question, however, is what the total number of combinations is across all possible values of k.  That is, if I have n objects, how many unique ways can I combine them if I take them one at a time, two at a time, three at a time, etc., all the way up to the whole set?  To find this out you evaluate the above equation for all values of k from 0 all the way to n and sum them all up.  When you do this you find that the answer is 2^n.  Or written more mathematically:

(n choose 0) + (n choose 1) + … + (n choose n) = 2^n

So as an example, let us take 3 objects (n=3), let’s call them playing cards, and count all of the possible combinations of these three cards, as shown in the table below.  Note that there are exactly 2^3=8 distinct combinations.  Here a 1 in the row indicates a card’s inclusion in that combination.  We have no cards, all combinations of one card, all combinations of two cards, and then all three cards, for a total of 8 unique combinations.

Card 3 Card 2 Card 1
0 0 0
0 0 1
0 1 0
0 1 1
1 0 0
1 0 1
1 1 0
1 1 1

You can repeat this for any size set and you’ll find that the total number of unique combinations of any size for a set of size n will always be 2^n.  If you are familiar with base 2 math, you might have recognized that already.  So for n=3 objects we have the 2^3 (8) combinations that we just saw.  And for n=4 we get 2^4 (16) combinations, for n=5 we have 2^5 (32) combinations, and so on.
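You can also let the computer do the counting.  A short sketch that enumerates every combination of every size for the three cards above and confirms the 2^n total:

```python
from itertools import combinations

cards = ["Card 1", "Card 2", "Card 3"]

# Every combination of every size, from the empty hand up to all three cards.
all_combos = [c for k in range(len(cards) + 1)
                for c in combinations(cards, k)]

print(len(all_combos))                    # 8
assert len(all_combos) == 2 ** len(cards)
```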

So in other words, the number of possible combinations grows exponentially with the number of objects in the set.  But this exponential growth is exactly what Kurzweil observes in his Law of Accelerating Returns.  Kurzweil simply pays attention to how n grows with time, while Kauffman pays attention to the growth of (bio)diversity without being concerned about the time aspect.

Kauffman uses this model to describe the growth in complexity of biological systems.  That simple structures first evolved, and that combinations of those simple things made structures that were more complex, and that combinations of these more complex structures went on to create even more complex structures.  A simple look at any living thing shows a mind-boggling amount of complexity, but sometimes it is obvious how the component systems evolved.  Amino acids lead to proteins.  Proteins lead to membranes.  Membranes lead to cells.  Cells combine and specialize.  Entire biological systems develop.  Each of these steps relies on components of lower complexity as bits of their construction.

Kurzweil’s observation is one of technological progress.  The limits of ideas are pushed through paradigm after paradigm, but still it is the combination of ideas that enables us to come up with the designs, the processes, and the materials that get more transistors on a die year after year.  That is to say, semiconductor engineers 30 years ago had no clue how they would get around the challenges they faced in reaching today’s level of sophistication.  But adding new ways of thinking about the problems led to entirely new types of solutions (paradigms) and the growth curve kept its pace.

Linking combinatorial complexity to progress gives us the modern definition of innovation: innovation is really the exploring and exploiting of the Adjacent Possible.  It is easy to look back in time and see the exponential growth of innovation that has brought us to the quality of life we have today.  It is much easier to dismiss the idea that it will continue, because we are faced with problems that we don’t currently have good ideas about how to solve.  What we see from Kurzweil’s and Kauffman’s observations is that the likelihood of coming up with good ideas, better ideas, life-changing ideas, increases exponentially in time, and happily, we have no good reason to expect this behavior to change.