In the Canadian province of Quebec there was once a relatively average pharmaceutical company. But then some financial wizards of the McKinsey school of magic joined its top management, and they wanted to be big and powerful. Incidentally, at the same time, the great magician Ben B. from Washington began to spread large amounts of cheap money among the people. The ambitious managers therefore borrowed a lot of this cheap money and bought one drug company after another. Profits surged, and shareholders and Wall Street analysts were in awe of the management.

This previously insignificant company grew steadily until it reached a market capitalisation of US$ 90 billion. But then a couple of independent analysts took a closer look at the company and made a shocking claim: its sales were artificially inflated. R&D expenses, on the other hand, had been slashed almost entirely. In addition, customers were being ripped off with excessive price increases. The company's net debt had reached an unsustainable US$ 30 billion. And, above all, the reported profits had never actually been earned.

As a result, the share price crashed, losing 90% of its value, and the CEO had to go. The company postponed the publication of its annual report indefinitely and admitted that it was close to bankruptcy.

A fairy tale? Unfortunately not. It's the true story of the rise and fall of Valeant, one of Wall Street's big stock market darlings of recent years.

In recent years a new category of investment products has emerged. Companies offering these products claim that they are able to systematically outperform a market index in the long run – a promise that has rarely been fulfilled within the financial industry to date. Marketed as "smart beta" funds, they combine active and passive fund management, allegedly delivering the benefits of both approaches. They are based "on active strategies which are not implemented by portfolio managers, but systematically implemented according to clearly defined rules". Such rules can relate either to the criteria for selecting securities or to the methods for weighting them. Examples include equal weighting, minimum variance, small cap, high dividends, low P/E ratio, CROCI, etc.
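To make the idea of a rule-based weighting scheme concrete, here is a minimal sketch contrasting classic market-cap weighting with the equal-weighting rule mentioned above. The tickers and market-cap figures are purely hypothetical, and real smart-beta products naturally apply far more elaborate rules:

```python
# Hypothetical market caps (in US$ bn) for a four-stock universe.
market_caps = {"AAA": 500.0, "BBB": 100.0, "CCC": 50.0, "DDD": 10.0}

def cap_weights(caps):
    """Classic index weighting: each stock in proportion to its market cap."""
    total = sum(caps.values())
    return {ticker: cap / total for ticker, cap in caps.items()}

def equal_weights(caps):
    """A simple 'smart beta' rule: every stock receives the same weight."""
    n = len(caps)
    return {ticker: 1.0 / n for ticker in caps}

cw = cap_weights(market_caps)
ew = equal_weights(market_caps)
print(round(cw["AAA"], 3))  # cap-weighted index is dominated by the largest stock
print(ew["AAA"])            # equal weighting caps it at 0.25
```

Under cap weighting the largest stock absorbs roughly three quarters of the portfolio; the equal-weighting rule mechanically tilts the portfolio towards the smaller names, which is precisely why it behaves like a small-cap factor bet.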

Many of these strategies, however, are not new at all. "Smart beta" is ultimately a relabelled version of so-called "factor investing". In the past, this was the collective term for quantitative strategies based on numerically measurable success factors in the stock markets, which analysts identified and tested empirically, and which were then used systematically to put together securities portfolios.

It is common knowledge that Warren Buffett’s Berkshire Hathaway has outperformed the equity markets by a wide margin. The degree of outperformance appears especially impressive if it is adjusted for inflation. Since 1964, the value of Berkshire Hathaway increased in real terms at an annual rate of 16.8%, while the S&P 500 Total Return managed only 5.5% p.a.
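The gap between those two rates compounds dramatically over time. A quick back-of-the-envelope calculation, using the real rates quoted above over the roughly fifty years from 1964 (the exact end year is an assumption here), illustrates the point:

```python
def real_growth(rate, years, start=1.0):
    """Compound a starting value at a constant annual real rate of return."""
    return start * (1.0 + rate) ** years

years = 50  # roughly 1964 onwards
berkshire = real_growth(0.168, years)  # 16.8% p.a. in real terms
sp500 = real_growth(0.055, years)      # 5.5% p.a. in real terms

print(round(berkshire))  # each real dollar grows by a factor in the thousands
print(round(sp500))      # versus a factor of under twenty for the index
```

An eleven-point difference in the annual rate thus translates, over half a century, into a terminal value more than a hundred times larger.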

There have been frequent calls in the press for a new approach to economics. For this purpose, George Soros founded the "Institute for New Economic Thinking" (INET). I personally do not see this as a solution. Nobody knows what "new thinking in economics" should look like. In addition, "old thinking in economics" has also yielded a lot of good results. If you go to the INET website, you will come across a familiar problem. It is dominated by a small circle of researchers from elite English-speaking universities, and its texts recycle topics that have been treated time and time again. George Soros' new thinking is in fact old thinking in new packaging. And this is no solution to the problems.

Edward Fullbrook, Director of the World Economics Association and a prominent critic of established economics, has long urged the economic sciences to apply diverse methods, as modern physics does. Physics does not expect its theories to deliver views of the world, but to explain very concrete problems. At the same time, it is accepted that a given approach may not be appropriate for other matters. At first glance, Fullbrook's suggestion seems a logical step, yet it is tricky. Orientation towards the natural sciences could be a double-edged sword. Natural scientists also battle for recognition, striving to publish articles in top journals, but with different consequences. They do not recycle the same content in different packaging. Instead, a growing number of scandals have come to light involving rigged or unverifiable research results. It could well be that methodological diversity strongly favours abuse in the field of research.

In August last year, Raghuram Rajan, the current Head of the Indian central bank and previously Professor of Finance in Chicago, wrote a widely noted article about the "paranoid style" of economic debates. As an example, he cited the hurtful personal comments that the Nobel Prize laureate Paul Krugman had made about Ken Rogoff and Carmen Reinhart, who had been criticized for serious statistical errors in an article of theirs that had previously been seen as ground-breaking.

Raghuram Rajan is right. Economists who receive extensive public attention, such as Paul Krugman, in particular increasingly come across as paranoid egocentrics. This also applies to his rival Rogoff, who likes to wallow in self-pity and portray himself to the general public as the victim of a conspiracy. This may be due to the mechanisms of the media industry, which pays attention only to those who scream "scandal" the loudest; this will be explained in more detail in the following section. When economists resort to making a racket to attract publicity, they ultimately damage the reputation of their science and its representatives. This is a particular shame in the case of Paul Krugman, the mind behind numerous important and innovative insights.

The methodological approach of mainstream economists largely involves seeing an economy as either in a state of equilibrium or in a state of striving to reach an equilibrium. Economists analyse factors and chains of effect that enable an economy to adjust, moving from one equilibrium to the next. Their forecasts are predominantly based on the assumption that the chains of effect they have supposedly identified will continue in the same way.

This methodology is rooted in a scientific tradition that can be called the "modern view of the world" in the spirit of Max Weber. "This is typified by rationalization, which means … the knowledge or belief that if one but wished one could learn [a rational explanation] at any time [and] … one can, in principle, master all things by calculation" (Weber 1948:139) *). In principle, it is the assumption that there is a rational explanation for everything that was previously unclear and, furthermore, that the previously unclear is calculable. If economic realities do not match the results of a model, this is taken to mean that the model has not yet succeeded in capturing the complicated economic interdependencies in their entirety. The logical consequence is that the model's parameters merely have to be redefined to capture the economy more effectively. The fundamental validity of the models themselves, however, is not questioned.

This way of thinking, termed "deterministic", is borrowed from classical physics, which assumed that natural processes are governed by clear physical laws. Provided there is complete information about the state of a closed physical system at a given point in time, its state at any future point is therefore predictable. According to this view, the quality of statements about the future depends largely on how precisely we comprehend the present.