Recently, a couple of finance professors were talking about how most of their students seem quite uninterested in using models to ask the kinds of questions that might lead them to a deep understanding of the issues at hand. It’s not that the students have an aversion to the models. Quite the opposite. They want to learn the key inputs and how to get them, so that they can use the models to generate answers, to plug in the numbers and play the game.
As we say in my house, “There’s a lot of that going around.”
Models can’t capture reality, but they can provide a framework for examining it and for making investment decisions — yet a framework is not a formula, and you don’t need to kneel at the altar of a model to find something of value in its use.
What models don’t do is give easy answers, yet that’s what’s apparently expected of them most of the time. A couple of examples I’ve witnessed of late:
A representative of a large pension plan explained a series of shifts into so-called alternative assets by stating that all of the moves “reduced risk in the portfolio.” No assumptions were given and no broader examination of the meaning of “risk” was offered, just that definitive statement, apparently because the changes led to a statistic in a model being lower than it had been before.
Another institutional investor said that a recent review had found her organization’s investment strategy to be “just about on the efficient frontier,” as if that were a place where you could set out a chaise longue and order a margarita. Tweak one little assumption and she would appear to be in another place altogether.
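To see how fragile a spot “on the efficient frontier” can be, here is a minimal sketch. All of the numbers are made up for illustration (nothing here comes from the review in question); the closed-form maximum-Sharpe weights, proportional to the inverse covariance matrix times the excess-return vector, are textbook mean-variance math. Nudge one expected-return assumption down by half a percentage point and watch the “efficient” allocation move.

```python
import numpy as np

def tangency_weights(mu, cov, rf=0.02):
    """Maximum-Sharpe (tangency) portfolio: w proportional to
    inv(cov) @ (mu - rf), normalized so the weights sum to 1."""
    raw = np.linalg.solve(cov, mu - rf)
    return raw / raw.sum()

# Two correlated equity-like assets: 15% and 16% vol, 0.85 correlation.
# These inputs are invented solely to illustrate the sensitivity.
cov = np.array([[0.0225, 0.0204],
                [0.0204, 0.0256]])
mu_base    = np.array([0.070, 0.065])
mu_tweaked = np.array([0.065, 0.065])  # first asset's return cut by 50 bps

w_base  = tangency_weights(mu_base, cov)
w_tweak = tangency_weights(mu_tweaked, cov)
print("base   :", np.round(w_base, 3))
print("tweaked:", np.round(w_tweak, 3))
```

With correlated assets, a 50-basis-point change in one return assumption swings the first asset’s weight by roughly thirty percentage points. “Just about on the frontier” is a statement about the inputs, not the portfolio.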
These tendencies are not limited to asset owners, of course, or any other branch of the investment tree. Research analysts trumpet conclusions without context, be they based on the simplest arithmetic or the most complex DCF calculations. Portfolio managers do too.
That is to be expected — if we reported and debated every assumption in every interaction, we’d never get anything done. But the answer to that reality is not to avoid a discussion of the caveats, but to figure out how to properly communicate them, even though doing so can wreck “the illusion of certainty.” [the research puzzle | The posting with that title has turned out to be very popular, so it must have struck a chord.] And to deal up front with the really big assumptions that are at the core of a critical analysis.
It is especially important — and especially difficult — to do so when working with individual investors. In a recent posting, Tadas Viskanta talked about the virtues of “sticking to a plan in the face of emotional volatility.” [Abnormal Returns | In addition to his own postings, Viskanta curates the best-known daily linkfest in the finance blogosphere.] That’s very true — and a great financial advisor is an expert at providing the behavioral bumpers that can help a client stay on plan. But what if the plan is based upon a questionable model or unwise assumptions? That is the case all too often.
A financial plan that uses historical inputs without a buffer of safety invites danger and increases the chances of behavioral freakouts. Yet so many plans are based upon those inputs, without questions being asked about what the return assumptions ought to be, [the research puzzle | I dealt with that in the dark days of 2008 by talking about “the famous nine percent.”] how to handle the reality of migrating correlations, [research puzzle pix | This chart shows the fluctuation in correlations between stocks and bonds, commodities, and the dollar.] or whether standard deviation is the embodiment of risk. A mean-variance model that uses those inputs is not a bad tool; it is just wielded inappropriately much of the time.
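The migrating-correlations point is easy to make concrete. Below is a minimal sketch, with invented stock/bond vols and a plain 60/40 mix, showing how the same portfolio’s standard deviation changes as the stock-bond correlation wanders across the kind of range it has historically occupied:

```python
import numpy as np

def portfolio_vol(weights, vols, corr):
    """Portfolio standard deviation from asset vols and a correlation matrix."""
    cov = np.outer(vols, vols) * corr
    return float(np.sqrt(weights @ cov @ weights))

w = np.array([0.6, 0.4])           # a generic stock/bond mix
vols = np.array([0.15, 0.06])      # illustrative annual volatilities
for rho in (-0.3, 0.0, 0.3, 0.6):  # correlation regimes, not forecasts
    corr = np.array([[1.0, rho], [rho, 1.0]])
    print(f"rho={rho:+.1f}  portfolio vol={portfolio_vol(w, vols, corr):.2%}")
```

The “risk number” for the identical portfolio moves from under 9% to over 10.5% depending solely on which historical correlation you feed in — a reminder that the statistic is an output of assumptions, not a property of the portfolio.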
In defined contribution plans, the explosion of assets in target-date funds has been supported by the “comforting metaphor” [AllAboutAlpha.com | That phrase comes from a summary of EDHEC’s research on controlling “short-term loss aversion and longer-horizon risk aversion” in target-date funds.] of a “glide path” into a well-funded retirement. If only it were all as easy as we are led to believe. Not only do most participants not see the complexities and pitfalls of the model — neither do many of the plan sponsors.
If we are to plug and play, let it be in an exploratory fashion, so that we can see the strengths and weaknesses of our methodologies. Not one set of numbers into one model to get one answer, but multiple (wide-ranging) scenarios, seen through different lenses, to help us picture the undulations in the terrain that we must traverse — and to help us ask the key questions along the way.
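What exploratory plugging and playing might look like, in miniature: run the same simple withdrawal plan through several wide-ranging assumption sets rather than one. Everything below is an assumption for illustration — normally distributed annual returns, a 4%-of-initial-balance withdrawal, a 30-year horizon — a deliberately crude lens, not a planning tool:

```python
import numpy as np

def success_rate(mu, sigma, years=30, spend=0.04, n=10_000, seed=0):
    """Fraction of simulated paths on which a portfolio survives `years`
    of withdrawals at `spend` of the initial balance. Returns are drawn
    i.i.d. normal each year — a simplifying assumption, not a forecast."""
    rng = np.random.default_rng(seed)
    wealth = np.ones(n)
    alive = np.ones(n, dtype=bool)
    for _ in range(years):
        r = rng.normal(mu, sigma, n)
        wealth = wealth * (1 + r) - spend
        alive &= wealth > 0          # once broke, a path stays broke
        wealth = np.maximum(wealth, 0.0)
    return float(alive.mean())

# The same plan, seen through several assumption sets instead of one
# (9% is a nod to the old default; the others are deliberately less kind):
for mu, sigma in [(0.09, 0.12), (0.07, 0.14), (0.05, 0.16), (0.03, 0.18)]:
    print(f"mu={mu:.0%} sigma={sigma:.0%} -> success {success_rate(mu, sigma):.0%}")
```

One model, four lenses, four quite different answers — which is precisely the point. The spread between the scenarios, not any single number, is what should shape the conversation.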