Okay... anyone who was born before Ronald Reagan gutted the Department of Education by handing block grants to the states knows that people never do things for nothing, so the only true model would strictly be demand and the ability to meet that demand. Economists tend to complicate things; I have a BA in that art. But I will say that if you can model demand, then you can use controls to adjust, and supply will follow. Businessmen as JOB CREATORS is absurd: demand creates jobs. From there you can add on. The model has to be dictated by fiscal policy. There are too many large speculators in the US and too few confident retirees.

You give the best description of the economic-growth paradigm: that economic change looks like Malthusian diminishing returns, and I agree. However, Mr. Roubini's point is that social disruption reverses itself through diminishing demand. Set aside elements like Ponzi schemes and their benefactors, which are socially caused deficiencies or defects to growth. Corruption under capitalism and dependency under socialism have both caused failures in economic and social development.

Perhaps we should focus on the circuitry of accumulated wealth and consumable wealth that runs the economy. Both capitalism and socialism ran short and were proven wrong as economic or social models that became self-destructive; eventually the economy runs from diminishing demand to diminishing returns, or vice versa. So suppose we use the standard of living as the equilibrium position on the supply line of the circuitry of wealth, balanced by both diminishing returns and diminishing demand.
Call my paradigm the wealth circuitry of economic and social growth: it supports and balances both accumulated wealth and consumable wealth, and it creates a "Z"-shaped development path running on both diminishing demand and diminishing returns. It rests on the assumption that the route above the standard of living, equal in length to the one below it, agrees with that standard of living to sustain viable growth. It contains:

• The base line, diminishing returns, where society keeps peace with its populace through consumable wealth that causes economic displacement, with negative or no growth; it provides entitlements or social programs for non-productive citizens, for example 27% of the population on welfare, with add-on subsidies to sustain a standard of living.

• The top line, diminishing demand, which ends with accumulated wealth concentrated in the hands of individuals and ends up profitless: 1% holds 27% of global or national wealth, and the extra wealth that is not put into production yields no growth.

• And the diagonal line connecting both ends, which supports price and value; in its middle sits the standard of living, containing most of the productive individuals who move up and down the ladder of growth.

If more wealth is accumulated than consumed, it causes a saturation of wealth. Diminishing demand, under the standard-of-living agreement, leaves demand idle because of the shortage of consumption. In the process, the standard of living falls to meet demand after deflationary measures make wealth consumable again. In reverse, if the wealth consumed exceeds the wealth accumulated, being less profitable, it triggers inflationary measures to aggregate demand and accumulate more wealth in the diminishing-returns mode; eventually the system balances itself again at an agreed standard of living with viable growth.

It is not supply and demand. It is rather the circuitry of wealth under the spell of a lower standard of living, in which diminishing demand becomes part of the deflationary measure. If accumulated wealth becomes saturated, it means the lower standard of living has made demand finite, like the weaker demand for dollar loans at the ECB.

I am certain I am not just being introspective; I may twist the theory a little, but the evidence of falling living standards in Europe makes it plausible.

Any economic model has to recognize some core facts that are basic in human behaviors. The key driver and the fuel that has propelled the growth model for several hundred years has been the steady 1% plus increase in population. Western countries, the US in particular, built economies based on a steady increase in population poured onto an almost unlimited expanse of resources. Combined with the inherently human need to improve one's personal circumstances, at worst manifesting itself as greed, growth was inevitable.

This perpetual-growth model is now coming to an end, with difficult consequences. Think Japan. What economic modellers should now focus on are economic and political systems that can provide equitable income and resource distribution under conditions of declining population.
This will be the real challenge of this century.

The writer should read Bernanke's paper on the financial accelerator, dating back to 1999: http://faculty.wcas.northwestern.edu/~lchrist/course/Czech/BGG%201999%20...
Incorporating the banking sector in DSGE/Business cycle models is actually old news.
The problems with these models lie not only in adding however many sectors you consider relevant but in the methods used to pick the so-called ideal models. To my humble knowledge, the gold standard is still comparing DSGE models against comparable structural VAR models. Anyone who has ever run a VAR would probably agree on its fickleness, and its structural version can be very ideologically driven despite VAR's claim to being a "pure" statistical approach.
As the article rightfully points out, macro is an empirical subject, so the statistical arm of the analysis should not be compromised by economic assumptions. Often economic knowledge helps to build models, but it may also distract.

All those models really are ill-conceived.
Mainstream economists keep thinking that they can model a whole economy.

An economy is NOT a physical system. There are no constants, just millions upon millions of ever-changing variables, variables that can move one way or another depending on how a single other variable moves.

Who on this board would admit that he or she is so simple that his or her behaviour can be entered into and forecast by a computer?
Even taking the entire population in aggregate doesn't change the fact that cultures, education, rationality, expectations and reactions to events all shift over time.

Using econometrics and correlations is silly.
So if there is a 0.6 correlation between two variables, what have we proved? (apart from the fact that the way variable X moves does not necessarily imply that variable Y will move the way we expect, and that X might depend on many other variables as well?)
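Since the commenter puts a number on it, the 0.6 case is easy to check with a quick simulation (the sample size and variable names here are mine, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw 100,000 pairs from a bivariate normal with correlation 0.6.
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

# How often do the two variables move in opposite directions?
opposite = np.mean(np.sign(x) != np.sign(y))
print(f"sample correlation : {np.corrcoef(x, y)[0, 1]:.2f}")
print(f"opposite-sign moves: {opposite:.1%}")
# Even at rho = 0.6 the pair disagrees roughly 30% of the time
# (the exact figure is 1/2 - arcsin(rho)/pi, about 0.295).
```

So a 0.6 correlation still leaves X and Y moving in opposite directions in nearly a third of observations, which is the commenter's point in miniature.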
The best way we can model anything is through very rough approximations and probabilities.
This might lead to very dangerous policy and decision-making.

We have also all seen the "effectiveness" of banks' heavily mathematical trading and risk-management models, of course "validated" by regulators.
One also just has to look at all the economic forecasts produced by central banks and other economic institutes these last few years. All of them have been proved wrong.

At some point, common sense will have to come back at the centre of the debate...

Unfortunately, common sense is not so common, and it differs endlessly from person to person. No one denies bringing reason to policymaking, but that reasoning has to rest on some statistical support in order to bring commonality.
What the article says is that economists are trying to make economic models better encapsulate financial institutions.
If not model-based, then what should policy decisions be based on?

And you believe that models will actually become "better"?...
And what do you do with your statistics? What are you supposed to do when there is a 63.47% probability that the outcome of your policy will be great, and a 36.53% probability that it will be disastrous? How do you even know that your statistics and correlations actually reflect reality?
.
I think you're taking too many things for granted.
Humans (and some in particular) think too highly of themselves and what they can achieve.
.
Instead of asking the question what should policy decisions be based on, perhaps you should ask the question whether or not there should even be so many policy decisions in the first place.
My guess is that the economy would do much better if left free of intervening governments and their "model-based policy decisions"...

you just contradicted yourself...
Deregulation of the financial industry caused the Great Depression and has now caused the Great Recession. How? By allowing financial institutions to take on more and more debt. There is a reason the current crisis and the one of the 1930s stand out: huge debt. Read Keynes, read Fisher, read Galbraith, and even Krugman. The legal status of a corporation (including banks) allows for limited liability: if the company goes bankrupt and 300,000 people lose their jobs, you don't suffer the financial consequences. So, if I were CEO, hell yeah I would risk as much as I could for crazy returns... unless the government doesn't allow me to and I could go to jail!!!!!

If only you knew what you were talking about...
.
I challenge you to give me a list of the so-called "deregulations" that occurred in the last 30 years. And perhaps we can then compare it with all new regulations implemented at the same time?
.
I would also strongly advise you to read some other economists as well.

So the repeal of the Glass-Steagall Act had nothing to do with our current predicament?

The refusal to regulate derivatives, letting them multiply endlessly to support debt pyramid schemes?

The federal and state acts throughout the 80's and 90's that removed government restrictions on mortgage lending criteria?

You sound like someone who has just read an introductory text on Austrian economics or classical liberalism and is suddenly possessed by the idea that the simple answer to every complex question is to "get government out of the way", whereupon the private sector's banking tycoons will sit by peacefully and let a self-regulating, anarchistic free-market economy devoid of corruption develop. The sooner you disabuse yourself of that naive delusion, the better.

The repeal of Glass-Steagall had pretty much nothing to do with the crisis.
Most of the banks that failed were purely retail banks. I didn't know Northern Rock and HBOS were complex trading banks... Moreover, some banks were even saved by their investment-banking arms (RBS).
Ever heard of Basel regulations and the incentives they create to lend to housing and sovereigns? Ever heard of the US administration telling Fannie Mae and Freddie Mac (and banks) to facilitate access to housing to subprime families?
At least try to get your facts right.
.
"The sooner you disabuse yourself of that naive delusion, the better."
While anarcho-capitalism clearly isn't my opinion, the sooner you stop believing in all-powerful, all-insightful and never-misled governments and regulations, the better.

I think the most crucial aspect would be the quality of the investments. Here are some parameters that could be incorporated to forecast growth or recessions.
1. The impact of technological innovations
2. The impact of the spread of technology
3. Deviation from Education
4. Excessive Debt
5. Oil and energy prices; higher prices imply a higher chance of a downturn
6. The impact of natural disasters
7. The quality of the investments
8. Land prices and rentals; the higher the land prices, the higher the chance of a downturn
9. Mismatches between Supply & Demand
10. Long queues and waits, implying supply is not keeping up with demand
11. Piling up of stocks, implying oversupply
12. Deviation from academic qualifications and education in the labour market, and the rise of unqualified personnel holding top positions
13. Deviation from modern & appropriate equipment at work places
14. The impacts of infrastructure shortages & new infrastructure
15. Last but not least, shortages of entrepreneurship & management skills

Hyun Song Shin of Princeton University has shown that banks' internal risk models make them take more and more risk as asset prices rise.

Considering all of the regulations that the government imposes on banks, could they not require that, whatever the risk model the bank wishes to use, it must at least show increasing risk with increasing asset prices? Either overall asset prices or just sector (e.g. housing) asset prices. That would at least reduce the positive feedback problem in one area.
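The mechanism Shin describes can be sketched in a few lines. This is not his model; it is a toy value-at-risk limit of my own construction, with invented parameters, showing why trailing-window risk measures permit the largest positions exactly when prices have been rising smoothly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stylised daily returns: a calm, steadily rising market,
# followed by a turbulent bust (all numbers illustrative).
boom = rng.normal(0.001, 0.005, 250)
bust = rng.normal(-0.002, 0.03, 250)
returns = np.concatenate([boom, bust])

def permitted_position(history, window=60, capital=1_000_000, var_budget=0.02):
    """Largest position a fixed 99% one-day VaR budget allows when
    risk is estimated from trailing-window volatility (common practice)."""
    sigma = history[-window:].std()
    daily_var = 2.33 * sigma  # 99% one-day VaR per unit of exposure
    return capital * var_budget / daily_var

pos_boom = permitted_position(returns[:250])  # measured at the top of the boom
pos_bust = permitted_position(returns)        # measured after the bust
print(f"permitted position after calm boom: {pos_boom:,.0f}")
print(f"permitted position after bust:      {pos_bust:,.0f}")
```

The model "sees" the least risk, and so sanctions the most exposure, at the peak; a regulatory floor that forces measured risk upward with the preceding price run-up, as the commenter suggests, would cap exactly this feedback.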

Kant's point that presuming free will is a necessary condition for justifying the employment of our faculty of reason may disqualify prediction as the sole benchmark for evaluating social-scientific theory (no free will, no reason, because everything would be determined). Usefulness (for problem-solving, understanding relationships, and so on) may therefore be just as important to social-scientific theory as prediction.

Finally, a good article about economics. Congratulations to the editors. In Brazil there is a famous economist, among the most important of those behind the Real stabilisation plan, who recently published a book about some secret laws of economics. In it, he refers to some Brazilian economists who worked at the IMF from the post-war period up to the '80s, and to what they achieved at a time when processing data was extremely difficult and slow. Today we have the advantage of computational devices and software, carried along by Moore's Law, which helps developments such as those mentioned in the article. But what should be noticed is that today's technology-supported models are still not sufficient to answer the macroeconomic questions, because most of the variables are the same ones in use since Minsky, or even since the greatest economists ever built their models. I would like to pose a question: how would Keynes or Minsky have worked if they had had a notebook connected to the internet with access to any available data centre? And even more important, what improvements to the structure of data banks would they have suggested? A classmate in my master's programme (around 2006) tried in vain to research long-term bank loans; there were no data available to show the term structure of those loans. So maybe it is also time to debate which variables should be generated, and how they should be structured. Rationales, more than technology, which is already a given.

While there may be intrinsic modelling issues, as the commentators highlight, there is also the burning issue (thanks to Taleb) of the turkey problem: a highly unlikely, low-probability event absent from the current data that could appear in due course. Its inverse is the current presence of unlikely, low-probability events that could vanish later. Both call for constant re-normalisation of the data, which leads to eventual modelling errors when assumptions are drawn from analysing it.

The other point the article highlights, which is becoming increasingly prominent, is the role of banks as intermediation agencies. If the availability of credit is taken as a measure of their intermediation, we have seen both extremes: credit moved into subprime territory while the boom lasted, and it contracted to the point of almost stalling the economy when the bust happened. Such extremes, and their facilitation by monetary policy with the tacit connivance of a number of constituencies, are beyond the realm of modelling. Regulation, which should have supplied the necessary check, left much to be desired, so this failure of execution cannot be pinned on the models either.

In my opinion, this isn't a flaw in the model. For instance, the normal distribution is used in most statistical model-building exercises, and it clearly spells out how it treats extreme events, or "outliers"; booms and busts are exactly that: extreme events. Hence we ought to channel our efforts into developing a new model that overcomes these flaws.

The thing is, booms and busts occur far too frequently to be considered outliers. I don't believe that the probability distribution of economic output (if such a thing exists) is normal.

The reason is that the errors in a normal distribution are assumed to be random. If you assume the errors are not random, you get very different results. Nassim Taleb and the late, great Benoit Mandelbrot did much work on this. There are two books by Mandelbrot, Fractals and Scaling in Finance and The Misbehavior of Markets; I'd highly recommend them.

Also, the distributions that occur in the real world very often scale to power laws where there is a winner-take-all effect. This is especially true in financial markets and in finance/economics in general.
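A small simulation makes the disagreement concrete. Here I stand in for "fat tails" with a Student's t distribution (3 degrees of freedom), one standard heavy-tailed choice rather than Mandelbrot's own stable distributions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# One million unit-variance "daily returns" from each model.
gauss = rng.standard_normal(n)
t3 = rng.standard_t(3, n)
t3 = t3 / t3.std()  # rescale so both samples have standard deviation 1

threshold = 5.0  # a "5-sigma" move
gauss_hits = int(np.sum(np.abs(gauss) > threshold))
t3_hits = int(np.sum(np.abs(t3) > threshold))
print(f"5-sigma days under the normal model : {gauss_hits}")  # ~0 or 1
print(f"5-sigma days under the fat-tailed t : {t3_hits}")     # thousands
```

Events the Gaussian calls once-in-a-million show up by the thousand once the tails are allowed to be heavy, which is why "outlier" is the wrong word for booms and busts.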

I've read Mandelbrot and Taleb, so let me recommend a book in the same vein, Why Stock Markets Crash by Didier Sornette. He discusses, in great detail, the transition to power law behavior when a market is building toward a crash.
*
The unfortunately favoured Gaussian distribution, chosen for its tractability (as economists are wont to do), has no place in financial economics, because it requires both independence and randomness of events, a circumstance inconceivable in a financial market.

I guess people should start seriously reading the works of the late Benoit Mandelbrot, who demonstrated long ago that most models used by economists are fundamentally flawed and can in no way represent reality, let alone predict future trends and crises.
If we could learn from some bright minds instead of feeling the need to reinvent the wheel to prove our worth, we would save a lot of time, and most probably save humanity a lot of unforeseen once-in-a-lifetime financial/economic crises (in my own lifetime, this is already the third).

The late Mandelbrot didn't like Gaussian white noise coz the spectral density is flat and the tails are too thin for black swans to hide underneath. Mandelbrot wanted to use brownish-colored or fractal noises with thicker tails in the SDE's (stochastic differential equations), but I don't think he understood a lot about the real economy. Writing equations for modeling the economy is a sinchy piece of microcake. Any math guy can do that, but to predict human behavior you need astrology or ideology or sociology or some kind of bullogy or muppetology!

Not all models, improved or otherwise, will work all the time, especially economic models. A new model needs time to prove its real worth. Learn from the East: its modus operandi is functional and tangible. (btt1943, vzc1943)

Weather forecasting (and hence weather modelling) has improved MASSIVELY over the last 20 years - and there's no fundamental reason why economics shouldn't do the same. My best guess as to why it isn't happening is that the people who can do it are quietly creaming the market rather than troubling themselves with the tiresome politics that I am guessing that improving national economic models would entail.

DSGE models, and all models which impose top-down structural equilibrium assumptions, are inherently flawed in their ability to measure what is truly occurring in an economy, and as such, become a bit dangerous. These models reek of a seeming desire by academic economists to cast (or force) their beloved and elegant theories into a mission-critical measurement and predictive apparatus, where they may not properly belong. Classical economic theories are wonderful for teaching economic principles, but they fall shy of the mark in descriptive and predictive modeling. I recall having such reactions to the state-of-the-art macroeconomic models of several decades ago upon examining them closely. DSGE does not fix their fatal flaw. The agent-based approach is inherently ground-up rather than top-down, and as such, has the potential for far greater accuracy, both descriptively and predictively.

Consider the simple problem, and its common solution, of a company which sells capital equipment forecasting its next several quarters of new equipment sales. How is this approached by those whose career success may turn on the accuracy of their forecasts? Generally not by using any form of model imposing an equilibrium assumption. Here is how it is done in common practice using a ground-up approach:

1. poll each salesperson for a list of equipment sales they identify as possibly occurring, asking for their best estimate of (i) the size of the sale, (ii) the date (say month) of the sale, and importantly, (iii) their subjective probability assessment of the percent likelihood of the sale;

2. for each month, apply each probability given to the size of the hoped-for sale, to calculate its expected value, and add up all such estimates from all salespersons for that month to produce a predicted sales level for that month; add months to get quarterly figures, etc.

Refinements to the above may include:

3. observing, based on the past accuracy of each salesperson, who tends to err on timing and probability and in what directions, and adjusting each salesperson's subjective estimates accordingly;

4. estimating, based on past experience, what amounts of opportunity not identified by salespersons will appear and when, and the factors which influence the appearance and magnitudes of that non-identified opportunity.
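Steps 1-4 above boil down to a probability-weighted sum, which can be sketched directly (all names, figures and calibration factors here are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Opportunity:
    salesperson: str
    size: float         # dollar size of the hoped-for sale
    month: str          # expected month of closing
    probability: float  # salesperson's subjective chance of closing

# Step 1: the polled pipeline.
pipeline = [
    Opportunity("Ana", 120_000, "M1", 0.50),
    Opportunity("Ana",  40_000, "M2", 0.90),
    Opportunity("Bo",  300_000, "M1", 0.20),
    Opportunity("Bo",   75_000, "M3", 0.60),
]

# Step 3: calibration factors learned from each salesperson's past
# accuracy -- here Bo has historically been 20% too optimistic.
calibration = {"Ana": 1.00, "Bo": 0.80}

# Step 2: expected value of each opportunity, summed per month.
def forecast(pipeline, calibration):
    totals = {}
    for opp in pipeline:
        ev = opp.size * opp.probability * calibration.get(opp.salesperson, 1.0)
        totals[opp.month] = totals.get(opp.month, 0.0) + ev
    return totals

print(forecast(pipeline, calibration))
# M1 alone: 120,000*0.50 + 300,000*0.20*0.80 = 108,000
```

Step 4's non-identified opportunity would enter as an extra per-month term estimated from history, and shock effects can be modelled by perturbing the sizes, dates and probabilities before re-summing.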

The above approach can also be used to model the impacts of shock effects, such as perturbations to financing costs, or unanticipated maneuvers of competitors, or unanticipated other shocks to customer demand, or one's supply chain. Impacts will touch sales timing and sales magnitudes in particular, and may impact price, and can roll in the firm's reactions to those impacts.

This type of bottom-up approach can in fact be applied to an entire economy. It is not a trivial modeling exercise, nor is it a trivial computational exercise. But it is worth undertaking. Starting with individual agents and their decision making histories -- people, families, businesses, banks, governments, and its active organs, is a better way to go. It is not so difficult as it may sound to build one's models properly and successfully from the ground up -- not the top down.

I'm going to have to argue that the obsession with bottom-up modelling is wrongheaded. Predictive dynamic models in other fields, from meteorology to engineering, are built top-down. DSGE models are supposed to be bottom-up models, since they are considered microfounded: the underlying logic is to start from the level of the individual and aggregate up. It is true that in practice the whole economy is modelled as a single representative agent, but that's not because they are top-down models; that's just how you get equilibrium out of a "bottom-up" model. The problem is not the top-down approach but the imposition of equilibrium on a model that is supposed to represent a chaotic dynamic system like the economy.

Modelling the whole economy from the level of the individual is like modelling a bridge from the level of the atom: even if it were computable, the predictive window would be, for all practical purposes, nonexistent. In addition, there is a key paper by Philip Anderson exposing the limits of a reductionist approach.

I agree that it is the imposition of an equilibrium assumption which creates the problem. An economy is in fact the aggregate behavior of individuals, firms and other entities. It is best to recognize the reality of what one is dealing with when trying to explain, predict and control its outcomes.

The only reason to have top-down models is our present inability to collect and process the data necessary for an "atomic" analysis. If we could get around this, however, the "predictive window" would in fact be as near-perfect as anything...

Not really. One property of nonlinear systems is their sensitivity to initial conditions; long-term prediction is impossible.

However, I think economists also assume too readily that if money circulates, it by definition circulates system-wide. They ignore the idea that large chunks of "money" can circulate off in little eddy pools.

The last problem is what Soros called reflexivity. Models break down as fast as players can predict how the models work and game the system accordingly.

Exactly. An infinite predictive window is theoretically impossible, and it's in fact pretty naïve to believe it could be achieved through a reductionist approach. There's a tradeoff between the complexity of a model and its robustness, and the robustness of an "atomic" model would be nil regardless of how much computing power you threw at it.

Economists are also trying to apply math to a social construct, "money", treating it like calories or joules. That dog don't hunt. Usury, and the incentive and ability to hoard money, define a system that naturally tends to volatility.

I am not sure people here actually understand DSGE. It is all about being out of equilibrium. And since it is a dynamic model, any talk of equilibrium is beside the point: these models talk about steady states.

Also, DSGE is all about market imperfections, whereas the original RBC model is admittedly classical.

Nothing stops you from shocking a system continuously. In fact, if you ever run a DSGE model, it outputs the results of continuously shocking the system so you can see where the economy ends up. The point is not where the equilibrium is but how the system reacts to shocks.

First, DSGE is not dynamic; it's a case of economists coining their own idiosyncratic definition of dynamics, but calling a dog a bird won't make it fly. It is still comparative statics even if you call it dynamic. Moreover, it is about calculating an equilibrium (what you call a steady state), and this equilibrium can only be disturbed by shocks. Since in reality the economy is never in equilibrium, to make the model fit the data, the economy is modelled as being hit by shocks of arbitrarily chosen magnitude and then converging to equilibrium again. If the market were "perfect", the economy would quickly return to equilibrium. Since that's not good enough to fit the data, you have to add market "imperfections", which are in fact arbitrarily adjusted parameters like sticky wages and prices (as if anything should adjust instantly in a "perfect" world). This really reminds me of Ptolemy's model: start with a counterfactual premise (that the Earth is the centre of the universe), arbitrarily adjust some parameters (epicycles) until the model fits the data, and then claim your model is predictive because it fits the data. A truly dynamic model doesn't need shocks to be pushed out of equilibrium; the chaotic patterns emerge from the interaction of interdependent variables influencing each other through time. For a good example of how such models can be predictive, look at what meteorologists do: they kick-started and embraced the developments in chaos theory decades ago, and now they have sophisticated predictive models.
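The point that a genuinely dynamic system needs no external shocks to fluctuate can be shown with the smallest chaotic model there is, the logistic map (a textbook toy, not an economic model):

```python
# x' = r*x*(1-x): one deterministic equation, no shocks, no randomness.
def logistic(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories whose starting points differ by one part in a billion.
a = logistic(0.400000000)
b = logistic(0.400000001)

for t in (0, 10, 30, 50):
    print(f"t={t:2d}  a={a[t]:.6f}  b={b[t]:.6f}  gap={abs(a[t] - b[t]):.2e}")
```

The series never settles into a steady state, and the two trajectories, indistinguishable at first, have diverged completely by around step 30: endogenous fluctuation plus sensitivity to initial conditions, with no shock process bolted on.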

Hi,
Economics after the crisis. There are a few things not worth running for: a bus, a woman, or a new economic model. The DSGE model, the anatomy of a decision model, is the one most commonly used by banks, which could account for the mess they are in. If you follow financial politics and listen to the different ways politicians do things, much of what they do is doomed. For example, the British Parliament makes decisions mainly on inflation and unemployment; things like spending power and global transitions such as outsourcing, or supply and demand, are only marginally considered. The Chancellor gives a tax cut that takes years to have an effect: the economics of putting coal on a fire to keep it burning until the fire engine arrives. Newer methods, for example the Smets-Wouters model, are based on the euro mess. Newer models should concentrate on the flow of money, not in a historical way but in a controlling way: the difference between attending an accident and building a house. The currency-independent way people spend money should take more prominence.

Our modern civilization is not that difficult to model. We use technologies to take (or steal) free resources and turn them into something we can sell to each other. Inputs are "resources", outputs are "goods". As long as the resources are free or under-valued, we create wealth out of thin air. As we run out of free resources, we become poor. For now we haven't run out, but eventually we will. Our science is stuck in a rut: the internal-combustion engine hasn't fundamentally changed in 150 years, and oil and gas are still the dominant sources of energy (try building a windmill without using a single drop of oil).
Economists, unfortunately, have no clue about the technology/science part. In fact, many people go into economics because they cannot cope with math and science. Which is why the models created by economists are bogus.