Step 5

Men in Black II, Ocean’s Twelve, and The Hangover Part II... all of these movies have one thing in common: they were abysmal sequels to blockbuster movies. We long to recreate scenarios in which everything comes together perfectly and the stars align, but that kind of success is rarely duplicated. In the world of money managers, success means blockbuster performance… every year! Fund managers who succeed in the short term are hailed as the financial heroes of the moment, despite the fact that every reputable study of mutual fund performance over the past 30 years has found there is no reliable way to know whether managers with recent winning performance will win in the future. This is why some variation of the disclaimer “past performance is no guarantee of future results” must appear in all mutual fund advertisements and prospectuses. Even so, unwitting investors chase recent performance, and the dangerous practice of manager picking ensues.

Sometimes managers can duplicate their success a few years in a row, but it just doesn’t last. As hard as it is to duplicate success in the film world, it is even more difficult for these all-star money managers to duplicate their past success.

“Most investors follow the crowd down the path to comfortable mediocrity,” says David Swensen in Pioneering Portfolio Management.1 Anxious to capture the gains that come with a winning mutual fund manager, manager pickers blindly chase a hot-performing fund manager’s track record, failing to realize that their odds of future success have vastly diminished.

Figure 5-1 shows the results of a study using Morningstar data on the performance of active fund managers for the years 2004 through 2017. The chart depicts that, on average, only about 9 of each year’s top 100 funds remained in the top 100 the following year.

Figure 5-1

Pick Your Manager

In 2008 and 2009, not one of the previous year’s top 100 funds repeated its top-100 performance.

Variations in manager performance are a function of luck and the random rotation of a fund’s investment style in and out of market favor. When a particular manager’s investment style is rewarded by the market, that manager is often credited with skill. As market conditions change, however, so does the performance of fund managers. Figures 5-2 and 5-3 track the rankings of the top 10 mutual fund managers in a given year and subsequent time periods. These charts reveal how quickly a “top” fund manager can slide to the bottom. For example, Figure 5-2 shows that the ProFunds Biotechnology UltraSector Inv had the highest performance out of 6,896 mutual funds in 2013. In 2014, however, the fund slipped to seventh place; then to 939th in 2015, 7,719th in 2016, and finally 393rd in 2017. The data contained in these two figures reveal many other examples of fund performance that sharply declined in subsequent years.


Figure 5-2

Figure 5-3

Figure 5-4

Top-performing funds have failed to maintain their position throughout a meaningful subsequent period. As Bob Dylan famously sang, “the first one now will later be last, for the times they are a-changin’.”2

An analysis of the Morningstar database of 246 mutual funds with 10 years of returns is shown in Figure 5-4. The top graph shows the performance rankings of these 246 funds from best to worst (left to right) for the first 5-year period, 2008 to 2012. The same order of fund rankings is maintained in the bottom graph in order to see whether fund performance was repeated in the second 5-year period, 2013 to 2017. In light of the above studies, it should come as no surprise that many of the managers who outperformed their peers in the first 5-year period did not do so in the second, and vice versa.

Another source of confusion is the reporting of mutual fund returns, which are often inflated when compared to actual long-term returns. The discrepancy arises from neglecting funds that have closed or merged, so that only the higher average returns of surviving funds enter the calculations. When funds go under, their records are stricken from databases, creating a survivorship bias. This bias inflates the remaining funds’ average returns by 21%, according to CRSP data cited by John Bogle.3 The 2017 year-end SPIVA study states that 58% of actively managed domestic equity funds, 55% of actively managed global equity funds, and 48% of actively managed fixed income funds were either merged or liquidated during the previous 15 years.4
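The mechanics of survivorship bias can be seen with a small sketch. The fund names and returns below are invented purely for illustration; they are not drawn from the CRSP or SPIVA data cited above. The point is only that averaging over survivors alone overstates the average experienced across all funds that existed:

```python
# Hypothetical illustration of survivorship bias. All fund names and
# returns are invented for this sketch, not taken from any real database.
annual_returns = {
    "Fund A": 0.09,   # survives
    "Fund B": 0.07,   # survives
    "Fund C": 0.08,   # survives
    "Fund D": -0.02,  # liquidated -> record dropped from the database
    "Fund E": 0.01,   # merged away -> record dropped from the database
}
survivors = {"Fund A", "Fund B", "Fund C"}

# Average across every fund that existed during the period
all_avg = sum(annual_returns.values()) / len(annual_returns)

# Average across surviving funds only -- what a biased database reports
survivor_avg = sum(annual_returns[f] for f in survivors) / len(survivors)

print(f"Average of all funds:      {all_avg:.2%}")       # 4.60%
print(f"Average of survivors only: {survivor_avg:.2%}")  # 8.00%
```

Because the weakest funds are exactly the ones most likely to be merged or liquidated, the survivor-only average is biased upward relative to the return of the full fund universe.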

Even large institutions and pension plans chase performance, much to their detriment. A study conducted by Amit Goyal of Emory University and Sunil Wahal of Arizona State University found that manager hiring and firing decisions made by consultants, board members and trustees were a waste of time and money.

The study, “The Selection and Termination of Investment Management Firms by Plan Sponsors,”5 reveals the negative impact of manager picking. Goyal and Wahal analyzed hiring and firing decisions made by approximately 3,700 plan sponsors, representing public and corporate pension plans, unions, foundations and endowments. Figure 5-5 shows the results of hiring 8,755 managers over a 10-year period from 1994 through 2003. Note that investment manager performance is measured by average annualized excess returns over a benchmark. The chart illustrates that managers that were hired had outperformed their benchmarks by 2.91% over the three years before being hired. However, over the following three years the managers on average underperformed their benchmarks by 0.47% per year when adjusted for management fees and transition costs. Plan sponsors often proceeded to fire managers who had underperformed in favor of other recent top performers, only to repeat the cycle again. The study concluded, “In light of such large transaction costs and positive opportunity costs, our results suggest that the termination and selection of investment managers is an exercise that is costly to plan beneficiaries.”

Figure 5-5

Using data from the same study by Goyal and Wahal, Figure 5-6 conveys the tendency for investment committees or plan sponsors to hire investment managers with a history of above-benchmark returns and fire managers with lower performance. The chart shows that after managers were hired, their post-hiring excess returns were indistinguishable from zero, and the managers that were fired performed better than the hired managers. The plan sponsors should have just bought index funds and forgotten about manager picking in the first place.

In the 2009 edition of Pioneering Portfolio Management: An Unconventional Approach to Institutional Investment,6 Yale Endowment Chief Investment Officer David Swensen states, “Active management strategies, whether in public markets or private, generally fail to meet investor expectations... In spite of the daunting obstacles to active management success, the overwhelming majority of market participants choose to play the loser’s game.”

An investigative journalist for the St. Petersburg Times approached Index Fund Advisors (IFA) and a handful of other investment experts seeking in-depth analysis of the risks and returns of the Florida State Pension Plan over various periods of time relative to various index portfolio strategies. The research results were revealed in a July 2011 article titled “Easy investments beat state’s expert pension planners,”7 which concluded that a simple index portfolio would have outperformed the Florida state pension plan’s investment performance over the prior ten years.

“The professionally managed SBA [State Board of Administration] performed worse — by more than a percentage point — than seven index-fund portfolios for the decade ending 2010,” the article reports. “On average, a $100 investment in an index portfolio grew to $184, while Florida’s pension delivered just $157,” the reporter concluded.
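The dollar figures above translate directly into annualized (geometric average) returns. A minimal sketch of that arithmetic, assuming a 10-year holding period as stated in the article:

```python
# Implied annualized returns behind the article's dollar figures:
# $100 growing to $184 (index portfolios) vs. $157 (Florida pension)
# over the 10 years ending 2010.
def annualized(start: float, end: float, years: int) -> float:
    """Geometric average annual return over the period."""
    return (end / start) ** (1 / years) - 1

index_cagr = annualized(100, 184, 10)
pension_cagr = annualized(100, 157, 10)

print(f"Index portfolios: {index_cagr:.2%} per year")   # ~6.29%
print(f"Florida pension:  {pension_cagr:.2%} per year")  # ~4.61%
```

A gap of roughly 1.7 percentage points per year, compounded over a decade, is what separates $157 from $184 on the same $100 investment.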

The findings prompted me to dig deeper. If Florida’s $124 billion pension plan fared so poorly against the index portfolios, what about the other states? IFA attempted to analyze the employee retirement systems in all 50 states. Data on more than 40 state pension plans was obtained, yielding similar results with varying degrees of underperformance relative to the index portfolios.

Figures 5-7 through 5-10 show the annual risk and return of various state pension plans, net of fees, compared to passively managed index portfolios comprised of a blend of diversified asset allocations. A best effort was made to estimate fees in states that report returns before fees are deducted. States were analyzed for both 13-year and 26-year periods and were charted based on either a June 30th or December 31st year-end date. The data shows that in the 26-year study, only 2 states (South Dakota and Delaware) matched the index portfolios, and no state outperformed them in any of the time periods analyzed. For data sources, go to pension-gate.com.

Directors of these pension plans have access to so-called “top” money managers, which would lead one to believe that these plans took their best shot at earning above-benchmark returns, only to fall short. This analysis reveals that the widely implemented and costly process of hiring and firing investment managers for state pension plans has delivered a negative payout relative to a risk-appropriate set of index benchmarks.

As discussed in Step 3, a method to determine manager skill is to identify whether there are enough years of performance data to be statistically significant by measuring a manager’s t-stat. If the t-stat is 2 or greater, the investor has at least a 97.5% confidence level that the manager’s above-benchmark returns were due to skill, leaving no more than a 2.5% chance that they were due to luck and that the manager’s true alpha is zero.
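The t-stat described above is simply the average yearly alpha divided by its standard error. A minimal sketch of the calculation, using hypothetical yearly alphas (fund return minus benchmark return) rather than data for any fund in the figures:

```python
import statistics
from math import sqrt

# Hypothetical yearly alphas (fund return minus benchmark return).
# These numbers are invented for illustration only.
yearly_alpha = [0.012, -0.008, 0.021, 0.004, -0.015,
                0.018, 0.002, -0.006, 0.010, 0.005]

n = len(yearly_alpha)
mean_alpha = statistics.mean(yearly_alpha)
std_alpha = statistics.stdev(yearly_alpha)  # sample standard deviation

# t-stat: average alpha divided by its standard error
t_stat = mean_alpha / (std_alpha / sqrt(n))

print(f"average alpha = {mean_alpha:.3%}, t-stat = {t_stat:.2f}")
```

In this sketch the average alpha is positive, but the t-stat falls well short of 2, so the record is too noisy to distinguish skill from luck with 97.5% confidence; this is the same situation the “Manager of the Year” funds below find themselves in.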

In this book’s 2015 printing, we analyzed the year-by-year difference between the fund return and the benchmark return of four funds that were named Morningstar’s 2013 “Manager of the Year” and had ten or more years of data. At that time, each of the funds had a positive average alpha, but none of them had a t-stat of 2, meaning their alpha was not consistent enough to support a 97.5% confidence level of manager skill. How did those managers fare in the subsequent five years? Figures 5-11 through 5-14 reveal that all of the managers still had a positive average alpha, but none of them were able to attain a t-stat of 2 over the entire period ending December 31, 2017.

When managers are subjected to the scrutiny of a simple t-test, the idea that manager skill produces consistent alpha quickly becomes relegated to the realm of fantasy, taking its rightful place alongside unicorns, Bigfoot, and the Loch Ness Monster, as depicted in The Alpha Myth painting on the following page.

8State Retirement Systems Data from public information; includes states that provided 11 and 24 years of returns for fiscal years ending 6/30, and are net of fees. Index Portfolios are net of fund fees and a 0.05% advisory fee. See www.pension-gate.com/states for additional disclosures.

10State Retirement Systems Data from public information; includes states that provided 12 and 25 years of returns for fiscal years ending 12/31, and are net of fees. Index Portfolios are net of fund fees and a 0.05% advisory fee. See www.pension-gate.com/ for additional disclosures.

The data provided in all charts referring to IFA Index Portfolios is hypothetical backtested performance and is not actual client performance. Only data for the IFA Index Portfolios is shown net of IFA's highest advisory fee and the underlying mutual fund expenses. All other data, including the IFA Indexes, does not reflect a deduction of advisory fees. None of the data reflects trading costs or taxes, which would have further lowered performance. See more important disclosures at ifabt.com.