Economic Logic, Too

About Me

I discuss recent research in Economics and various events from an economic perspective, as the name of the blog indicates. I plan on adding posts approximately every workday, with some exceptions, for example when I travel.

Monday, September 30, 2013

Households with low incomes save little. In one way, this should not surprise us: their propensity to consume is high because the marginal utility of consumption is high. If their income is only temporarily low, this is not a problem at all. However, households with persistently low income absolutely need to accumulate some savings to supplement any retirement pension income. This does not seem to be happening, and a frightening share of the population is hitting retirement age with little in the bank. Even worse, lotteries and other gambling operations seem to be rather popular among lower incomes. While one can rationalize playing an actuarially unfair lottery under some circumstances (see here), it is generally considered a poor use of scarce income. Now, could one use the temptation of lotteries to get low-income households to save more?

Kadir Atalay, Fayzan Bakhtiar, Stephen Cheung and Robert Slonim show one can effectively combine savings accounts with lottery jackpots. In an experiment conducted online in the US, they had participants allocate funds under various savings schemes. While this is not quite like the real world, it still reveals some interesting findings. Introducing a jackpot lottery does indeed increase savings, and significantly so (the authors find a 12 percentage point boost). These savings come both from delayed consumption and from reduced lottery participation outside of the savings account. And all these effects are stronger among those with lower incomes. Can we believe these results? After all, the jackpots in savings lotteries are supposed to be "life changing," something such an experiment cannot simulate. But it is encouraging to still see a strong impact.

Friday, September 27, 2013

Early in the 20th century, the United States took over from the United Kingdom the role of preeminent economic and political power. Since the last turn of the century, some people have been seeing hints that the United States may be losing that role (though it is not clear who would take it over), often drawing parallels with the fall of the Roman Empire. We do not know, however, why the Roman Empire fell. There are several explanations, and several may be needed to explain the fall. But there is no smoking gun that the United States should particularly look out for.

John Hartwig has an interesting suggestion: the Roman Empire fell because it lost the steady revenue from seigniorage. Of course, it is difficult to get detailed data from this period, so Hartwig proceeds by formulating a four-sector (C, I, G, gold mining) model in which the government draws a substantial fraction of its revenue from minting gold and silver coins and issuing them at a value substantially above cost. Around 165 AD, gold deposits were exhausted and Rome had to tax significantly more to sustain itself, thus switching from one type of tax to the other. Forced to mint debased coins, Rome saw inflation set in, but this did not bring sufficient revenue. In addition, the center of the Empire, used to importing goods from the periphery, suffered from a lack of productive capacity once there was no new gold to pay for imports. All in all, the Roman Empire lost the benefits of holding the keys to the world currency. This is something the US has definitely been benefiting from, through seigniorage and through low interest rates, and one motivation for the creation of the Euro was to capture this rent. But it is so far hard to find evidence that the dollar is losing its status. The US thus seems safe on that front.

Thursday, September 26, 2013

The Lucas Critique of 1976 has been a major motivation behind the building of RBC models, the follow-up DSGE models, as well as the structural estimation of these models. The idea was that reduced-form elasticities are not invariant to policy changes, yet those elasticities were being estimated precisely to determine the impact of policy. The resulting bias reduced the trust in the Phillips curve. The structural models, however, had and still have at their core supposedly invariant parameters that describe some fundamentals of the economy. But it turns out that some of those are not invariant over time. I recently discussed the case of the labor income share (here). And misspecification can be problematic in the estimation of such structural models, with possibly important consequences.

Samuel Hurtado tries to sort that out by including parameter shifts in the estimation of a standard DSGE model, but misspecifying it in such a way that it ignores this shift. Using data from the 1970s, he then shows that the policy responses from his model look surprisingly close to those of a reduced-form Phillips curve. In other words, it seems the DSGE model without the parameter shift is just as misspecified as the old Phillips curve. What this means is that one either has to include ad hoc parameter shifts or needs to go even deeper into the fundamentals to understand why and how these parameter shifts occur. The latter gives an even stronger meaning to the Lucas Critique.

Wednesday, September 25, 2013

"Big data" is the latest buzzword describing the next technological revolution, wherein enormous amounts of data can be collected about our daily lives and used to improve our choices and better understand what is going on in all sorts of dimensions. That includes very detailed information about transactions, locations, and even online behavior. Who has not noticed ads suddenly turning to what one searched for a few days ago, if not emails about it? Whether big data will keep its promise will depend in part on what happens with privacy protection. Europe has already taken steps, for example requiring that users consent to web cookies. In the US, people have so far been very tolerant of companies (but not the government) spying on them, but the tide could turn. But what are really the promises of big data?

Liran Einav and Jonathan Levin focus on economic policy and research. Quite obviously, we complain when data is not available when we want to measure something. Will big data make that possible? While I do not think the (mostly) random collection of big data will allow us to get exactly what we need, the authors think that with new statistical techniques and computer algorithms being developed specifically for big data, there should be something useful for economists. They hope to achieve better statistical power from massively larger and finer data. The opening of large administrative data sets also has a lot of potential, especially, I would add, if researchers are allowed to link them to each other. Denmark has shown how great data allows for better research and policy, and also makes researchers flock to you. But again, this all depends on how privacy laws will evolve.

Tuesday, September 24, 2013

The Arab Spring is the result of growing inequalities and iniquities and has been a wake-up call for the leaders of other countries where a privileged few dominate the masses. That could be a short summary of the commentary coming out of the mainstream media about what happened in Tunisia, Libya, Syria and in particular Egypt. And it is all wrong.

Indeed, Vladimir Hlasny and Paolo Verme point out that there is really nothing special about the income distribution in Egypt; if anything, it has become more egalitarian during this millennium. However they turn the data, the result is the same. Yet it appears from the World Values Survey that tolerance for inequality has sharply declined. This change must therefore be coming from factors other than the income distribution.

Monday, September 23, 2013

The academic job market is characterized by much uncertainty about the job candidates, at least in Economics, where students who have (in most cases) yet to publish anything and have not even completed their studies are hired. The fact that they are supposed to be at the research frontier and that very few people, if any, can evaluate their potential makes it no surprise that recruiting committees stick to signals: who the dissertation adviser is, where the degree is from, and the always glowing recommendation letters. When a recruiting department has managed to identify a particularly good candidate, it does not want to let others benefit from this discovery. To prevent the candidate from continuing to shop around, the typical strategy is to make an exploding offer: the offer letter is valid for, say, a week, and thereafter becomes void. This is quite frustrating for a candidate who may still be waiting for a preferred department to make its move, but it is a well-proven strategy for recruiting departments.

Mark Armstrong and Jidong Zhou show that this does not necessarily have to be so. Other options are to let candidates make a down payment to keep a job offer alive or to offer a bonus if they sign quickly (I am reinterpreting the paper's results for my example). Yet I do not think I have ever seen this happen, not even a signing bonus. The model, which is actually about a seller who may offer a buy-now discount, ask for a deposit or make an exploding offer, highlights that the uncertainty about the outside options of the buyer (or the job candidate) is crucial. The seller wants to deter the buyer from looking elsewhere. How much the uncertainty affects the buyer determines which strategy is best. In the case of the academic market, I guess this means that job candidates are very risk averse, thus the exploding offer strategy is optimal for recruiters.

Friday, September 20, 2013

The typical way we model uncertainty is by assuming economic agents know the stochastic process they are facing, and we call this uncertainty. That is wrong. It should be called risk, as the probabilities are known. Uncertainty is when those probabilities are unknown. That does not mean the agent is not rational; it is simply that the information set is smaller than what we typically assume.

Nabil Al-Najjar and Jonathan Weinstein point out that an uncertain agent trying to smooth consumption may look excessively precautionary to someone who thinks the agent knows the probabilities. They frame this within a Bayesian framework, where beliefs, including subjective probabilities, are updated with incoming information. This makes it very difficult to do any empirical work, including measuring time preference or risk aversion.

I am surprised, though, that Al-Najjar and Weinstein misunderstand rational expectations. They claim an uncertain agent does not have rational expectations if beliefs over probabilities do not coincide with observed frequencies. This need not be the case if the econometrician has information the agent did not have at the time of the decision. If the agent uses all the available information, then it is still rational expectations. There may be so little information that he cannot determine probabilities precisely, which is quite different from the perfect foresight over probabilities that Al-Najjar and Weinstein seem to imply. In any case, this is more about semantics than results.

Thursday, September 19, 2013

How important are the preschool years for adult outcomes? Empirical evidence from rich countries mostly shows that treatments during the preschool years persist into adulthood. James Heckman, for example, has pushed this result very hard. How robust is it once you look at more extreme cases?

Todd Schoellman does this by looking at refugees from Indochina who arrived in the US. He finds no difference in adult wages, education or anything else he can throw at the data between refugees who arrived in the United States at different preschool ages (before 5). One would have expected a huge effect, as they were exposed to very dire environments in their home countries or in refugee camps. This also runs counter to other empirical evidence and standard models. So what is going on here? Schoellman argues that what really matters in early childhood is the parents, much less the environment. That result changes somewhat once the arrival date falls into school age.

Wednesday, September 18, 2013

Large Scale Asset Purchases (LSAPs) of mortgage-backed securities have been a major component of recent monetary policy. They have not been without critics, as this is a policy targeted at a specific sector of the economy, the real estate sector. This is in principle a big no-no, as a central bank should only care about the overall economy, not specific sectors or firms. In a similar fashion, the ECB has been criticized for buying the debt of specific countries under conditions that differed by country, instead of applying one rule to all, or even not buying country debt at all. But if all sectors benefit equally or if that was the most efficient way to conduct policy, that is all good.

In the case of LSAPs, Meixing Dai, Frédéric Dufourt and Qiao Zhang find it was certainly not the most efficient way to deal with a confidence shock in the banking sector, and it obviously privileged the real estate industry. One could have done better by buying corporate bonds, but in a uniform manner across sectors. This works better than mortgage-backed securities because corporate bonds are less leveraged and thus free up more bank capital. This is all the more true if financial markets are segmented, that is, if bankers cannot freely reallocate resources between sectors. What the paper does not say is how this would have compared to conventional policy, buying government bonds.

Tuesday, September 17, 2013

One thing we have learned from the last recession is that the financial sector is quite important, that its dysfunction can have important consequences, and that this can happen even in the most financially elaborate economy. Some thus call for the health of the financial sector to become a component of every policy maker's dashboard. From a dashboard, it is only a small step to including the financial sector in a policy formula such as the Taylor Rule.

Leonardo Gambacorta and Federico Signoretti take that step by deriving from a DSGE model a Taylor Rule that includes asset prices and the amount of credit. So far so good, but why not also include the exchange rate? And more indicators? This is not the purpose of the Taylor Rule. It was devised to be a simple guide to policy, from which you want to deviate when circumstances call for it, for example with unconventional policies that cannot be captured by a Taylor Rule, simple or not. The best example is when the Taylor Rule calls for negative nominal interest rates. Would anybody blindly follow this? Of course not, and this is why we should stop thinking in terms of a single equation, especially when one has several policy goals. You need at least as many instruments as goals. The policy interest rate cannot do everything.
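
To see how a mechanical rule can produce absurd prescriptions, here is a minimal sketch of a Taylor rule augmented with a credit indicator. The functional form and all coefficients are illustrative assumptions of mine, not taken from Gambacorta and Signoretti.

```python
# A stylized Taylor rule augmented with a financial indicator (credit growth).
# All coefficients are illustrative assumptions, not estimates from the paper.

def augmented_taylor_rule(inflation, output_gap, credit_growth,
                          r_star=2.0, pi_star=2.0,
                          a=0.5, b=0.5, c=0.2):
    """Nominal policy rate (in percent) implied by the rule."""
    return (r_star + inflation
            + a * (inflation - pi_star)
            + b * output_gap
            + c * credit_growth)

# Normal times: inflation on target, closed output gap, moderate credit growth.
print(augmented_taylor_rule(2.0, 0.0, 3.0))    # roughly 4.6 percent

# Deep recession: the formula happily prescribes a negative nominal rate,
# which no central bank would blindly implement.
print(augmented_taylor_rule(0.5, -6.0, -2.0))  # negative
```

The point of the sketch is not the particular numbers but that any single equation, however many indicators it includes, will eventually prescribe something a policy maker must override.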

Monday, September 16, 2013

It is no secret that political connections help your business. The more regulated or corrupt your economy, the more likely this is to be true. How much this helps is difficult to quantify. For one, political connections cannot be measured on some sort of scale, and gathering such data would be very difficult, as people usually try to hide such connections from the public. And second, how would you measure the impact of these connections?

Yen-Teik Lee, Bang Dang Nguyen and Quoc-Anh Do find a way by first looking at the university networks linking CEOs and candidates in US state governor races, and then looking at the stock valuation of connected firms around close gubernatorial contests. Firms connected to the winning candidate gain 1.36% right after the election, and the bump persists. Imagine how large this can become in countries where such patronage is less scrutinized and where regulation is more prevalent.

Friday, September 13, 2013

A lot of classes in business schools teach rather fluffy material, especially MBA classes. It is all about entertaining students who pay dearly for their education and expect a diploma. The signaling value of the diploma comes from admission to the school, not from any selection during classes. And quite a few classes are all about making students believe they are learning important skills that will make them CEOs. Nowhere is that more true than with "entrepreneurship" classes, whose teachers are often adored by students who think they will turn into the next Bill Gates.

Michael Stuetzer, Martin Obschonka, Per Davidsson and Eva Schmitt-Rodermund do empirical research into what it takes to be an entrepreneur, and I presume their sample includes only successful ones. It turns out education has no bearing at all. It is all about having varied work experience. Thus, working a long time in the same job will not make you a successful entrepreneur once you quit. And taking entrepreneurship classes or getting an MBA will not help you either.

Thursday, September 12, 2013

When I think about Scotland and the average diet of its residents, I do not think about fresh fruit. Indeed, obesity rates there are among the highest in the world, thanks to a combination of greasy food, high alcohol consumption and a general lack of exercise. Fresh fruit does not seem to be in high demand, yet there is a paper that studies the price elasticity of different types of fruit in Scotland.

That paper is by Cesar Revoredo-Giha and Wojciech Florkowski, and unfortunately it does not mention any numbers about the level of demand, in particular compared to other regions. The paper, like many papers in agricultural economics, has a very narrow focus, and it is not clear at all why it would be of interest to anybody outside of Scotland (or even in Scotland, visibly). Is there any lesson to be learned for the rest of us? Anything that could generalize? Some policy implication to get people to eat more healthily? The paper was prepared for a conference in Poland. Why would it be of interest there?

Wednesday, September 11, 2013

The introduction of containers has dramatically simplified international trade, but it was resisted in some ports, with even more dramatic consequences. Before containers, unloading a ship was a process that would drag on for days, if not weeks, and involved a lot of manual labor. With containers, a ship can be unloaded or loaded in hours. Unions resisted the containerization of port facilities, and where they were successful, ports rapidly became obsolete and underused. The prime example is Liverpool, which is still reeling from this major negative shock to its most important economic sector. The ports that quickly adopted containers thrived, though, and mostly still do. While there are undoubtedly distributional consequences, what has been the overall impact of the introduction of containers?

Daniel Bernhofen, Zouheir El-Sahli and Richard Kneller exploit the different timing of the adoption of containers across ports to tease out how much more international trade containers have brought. For trade among developed economies, they find that containers multiplied exchanges by about eight over a twenty-year period. That is huge. The much celebrated impact of the GATT is about half that, still huge though. This shows that any study of the growth of trade around 1960-90 needs to take containerization into account.
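
As a quick sanity check on the magnitude, a factor of eight over twenty years corresponds to roughly 11% average annual growth, which a one-liner confirms:

```python
# Back-of-the-envelope: trade multiplied by 8 over 20 years implies an
# average annual growth rate of 8**(1/20) - 1.
factor, years = 8.0, 20
annual_growth = factor ** (1 / years) - 1
print(f"{annual_growth:.1%} per year")  # about 11% per year
```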

Tuesday, September 10, 2013

I am increasingly annoyed by political campaigns, as they are less and less about political issues and more and more about personal ones. Negative campaigning is particularly irritating to me, as it usually brings nothing of substance and is based on untrue or exaggerated facts. Can't we get back to discussions about platforms? In any case, political strategists seem to think negative campaigning is effective in getting votes, and that is all they care about. But is that true?

Indeed, Vincenzo Galasso and Tommaso Nannicini had the incredible luck to be able to experiment on an important mayoral race in Milan, Italy. They had the privilege of contacting specific voter groups in varied ways with various content, all approved by a mayoral candidate who actually participated in putting together this material. Effectively, they designed a complete electoral campaign for these potential voters. Galasso and Nannicini then asked them how they would vote before and after treatments at various points in the campaign. From this, they get incredibly rich data, from which they determine the following (among other things). Negative campaigning gets men to the voting booth, and they tend to vote for the candidate running the negative campaign. Women are no more likely to go vote, but vote for the other candidate. In the end, the impact of negative campaigning is negligible, and it irritates everyone. So stop it.

Monday, September 9, 2013

The border effect describes a striking feature of the data on trade volumes. Volumes typically decrease with distance traveled, with a jump down when a border has to be crossed. The size of this effect is mostly estimated with distances taken from straight lines between trading areas, often simply measured from the centers of those regions. With considerable work, one can do better.

Henrik Braconier and Mauro Pisu determine the distance along roads as well as travel time for within-Europe trade, and this for almost 50,000 city pairs. While this neglects cargo train traffic, which has a substantial share of international traffic in Europe, this is as precise as it can get, I suppose. The interesting bit is that once a border is involved, travel distance and time are about 10% longer for town pairs that are equidistant when measured as a straight line. That means the literature has over-estimated the border effect by about as much. One could, however, also argue that the border effect stems precisely, at least in part, from the fact that crossing a border leads to travel time losses, and that there is therefore no over-estimation. It depends on what you really mean by the border effect.
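
The over-estimation argument can be made concrete with a back-of-the-envelope calculation: in a log-linear gravity regression, understating cross-border distances by 10% means the border dummy absorbs the omitted distance penalty. The distance elasticity used here is a common ballpark value of my choosing, not a number from the paper.

```python
import math

# In a gravity regression log(trade) = a - b*log(distance) - c*border,
# straight-line measurement understates cross-border distances by about 10%.
# The border dummy then soaks up b*ln(1.1), inflating the estimated
# border effect by roughly that amount.
b = 1.0  # distance elasticity, an illustrative ballpark (assumption)
mismeasurement_bias = b * math.log(1.10)
print(f"border dummy inflated by about {mismeasurement_bias:.3f} log points")
```

With a distance elasticity near one, the bias is about 0.1 log points, in line with the roughly 10% over-estimation the post mentions.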

Friday, September 6, 2013

Smoking is not only bad for your health, it also lowers your wage. Of course, this may not come from the mere fact that you smoke, but from various characteristics that are typically associated with smokers. Once you have taken the latter into account, what is the true drop in wages that you suffer from smoking? And does the number of cigarettes a day matter?

Julie Hotchkiss and Melinda Pitts answer these two questions by looking at data from the Current Population Survey. They find that about 16 percentage points of the 24% wage loss of smokers come from their common characteristics, such as lower education. The rest comes from smoking itself, and the number of cigarettes does not matter: the first one is enough. One more reason not to take up smoking!
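
The decomposition is simple arithmetic, spelled out here for clarity:

```python
# Of the 24% raw wage gap between smokers and non-smokers, about
# 16 percentage points reflect smokers' characteristics (e.g. lower
# education); the remainder is attributable to smoking itself.
raw_gap = 24.0          # percent
characteristics = 16.0  # percentage points explained by characteristics
smoking_penalty = raw_gap - characteristics
print(f"penalty from smoking itself: about {smoking_penalty:.0f} percentage points")
```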

Thursday, September 5, 2013

The US has complained for decades that Japan makes it difficult for American companies to export cars there. The complaints were about regulation, prices, and subsidies. Japan has had a rather easy time dismissing these complaints with the mere fact that US car companies are very reluctant to build right-hand-drive cars, which are required for driving on the left side of the road in Japan. The latest US complaint is about subsidies for clean cars, which again are supposed to favor Japanese cars. I would answer that the US could maybe build cleaner cars, but let us have another look at the issue.

Taiju Kitano studies the current Japanese subsidies and the American proposal for how the subsidies should be structured. The subsidy is for scrapping old cars and replacing them with fuel-efficient ones. At issue is the method for determining which fuel-efficient cars qualify. Japan has its own method for rating fuel efficiency, but it is not calculated for cars with low production or import volumes, like US models. That disqualifies them from the subsidy. Foreign ratings have since been accepted for qualification, but the US complained that its city-driving standard is used, while a city/highway combination would be more appropriate. Japan claims its standard is close to the US city standard.

The issue is not solely about the calculation of fuel-efficiency standards, it is also about potential market shares. Kitano thus estimates an oligopolistic model to determine demand in each market and thus the quantities demanded under different policies. If the goal is to improve overall fuel efficiency, both policies score equivalently. The US one would, however, be much cheaper, because newly eligible cars substitute for cars that command larger subsidies. The US is thus mainly helping Japan reduce its expenses, while having no impact on pollution, or even a positive one on the profits of Japanese car makers.

Wednesday, September 4, 2013

Many ecosystems feature thresholds beyond which they tip over, in some cases irremediably. The dynamics are, however, so complex that it is sometimes anybody's guess where this threshold is. But this can be dealt with in an infinite-horizon model, and an optimal schedule of, say, pollution can be determined.

Thomas Michielsen shows, however, that this breaks down once you consider successive generations, as in some sense they disagree on the discount rate. Each generation is willing to allow a little more pollution and have future generations bear either the cost of reducing pollution or the risk of ecological catastrophe. And this time inconsistency increases the likelihood that the tipping point is reached earlier. How can you prevent this? Michielsen suggests keeping the stock of pollution at its current level, as it is known to be safe. That seems drastic. I would prefer to keep pollution at the dynastic-model outcome, but it is indeed difficult to see how generations could commit to such levels.

PS: FEEM needs to stop immediately with these multicolored abstracts. They are absolutely horrible.

Tuesday, September 3, 2013

The game of heads or tails with a coin toss is universally recognized as a game where the odds of each outcome are exactly 50%. In monetary terms and in expectation, nothing can be gained from this gamble, and in utility terms (again in expectation) one can only lose. Yet people keep playing it for gain.
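
The utility claim is easy to verify for any concave utility function. Here is a small check with logarithmic utility; the wealth and stake values are purely illustrative.

```python
import math

# A fair coin toss for a stake s: the expected monetary gain is zero, but
# under a risk-averse (concave, here logarithmic) utility the gamble is a
# sure loss in expected-utility terms. Wealth and stake are illustrative.
wealth, stake = 100.0, 10.0

expected_gain = 0.5 * stake + 0.5 * (-stake)
expected_utility = (0.5 * math.log(wealth + stake)
                    + 0.5 * math.log(wealth - stake))

print(expected_gain)                        # 0.0
print(expected_utility < math.log(wealth))  # True: the gamble lowers utility
```

By Jensen's inequality this holds for any concave utility and any fair stake, which is exactly why a rational risk-averse agent should decline.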

Silvia Bou, Jordi Brandts, Magda Cayón and Pablo Guillén devise a laboratory experiment where, after an initial phase of five coin-toss guesses, some students are asked to bet on who will get the most guesses right in a second round of tosses. The subtlety of the experiment is that, by default, the students are assigned the worst guesser of the first phase, and switching to another one is expensive. Yet almost all switched. This means they thought a lucky streak of right guesses in the first phase would continue in the second. And these were finance students, who should really know better.