December 31, 2014
I characterize the constrained efficient (or planner’s) allocation in a directed (competitive) search model with private information. There are sellers with private information on one side of the market and homogeneous buyers on the other side. They match bilaterally in different submarkets and trade. In each submarket, there are search frictions. In the market economy, homogeneous buyers enter different submarkets (i.e., post different contracts) and sellers with private information direct their search toward their preferred submarket. I define a planner whose objective is to maximize social welfare subject to the information and matching frictions of the environment. The planner can impose taxes and subsidies on agents that vary across submarkets while being subject to an overall budget-balance condition. I show that the planner generally achieves strictly higher welfare than the market economy. I also derive conditions under which the planner achieves the complete information allocation. I present examples in the context of financial and labor markets, explicitly solve for the efficient tax and transfer schemes and compare the planner’s allocation with the equilibrium allocation.
This is a huge paper. Literally, at 64 pages, it covers a lot of ground, but also in terms of its message. The abstract, while very precise, does not highlight the potential impact of this paper. We have learned from the Welfare Theorems that a planner can replicate a decentralized economy under perfect conditions. As socialist planned economies have demonstrated, these perfect conditions do not exist, one main reason being that the planner has less information than market participants. Here we have an economy where some market participants have more information than others and the planner has as much information as those that have the least of it. Yet the planner is capable of improving on the decentralized outcome. It is not simple, as the planner needs to use subsidies and taxes, and to open and close markets. But it is possible.
December 29, 2014
By William Branch, Nicolas Petrosky-Nadeau and Guillaume Rocheteau
We develop a two-sector search-matching model of the labor market with imperfect mobility of workers, augmented to incorporate a housing market and a frictional goods market. Homeowners use home equity as collateral to finance idiosyncratic consumption opportunities. A financial innovation that raises the acceptability of homes as collateral raises house prices and reduces unemployment. It also triggers a reallocation of workers, with the direction of the change depending on firms’ market power in the goods market. A calibrated version of the model under adaptive learning can account for house prices, sectoral labor flows, and unemployment rate changes over 1996-2010.
A great paper that captures some of the essential frictions in the macroeconomy. A great next step would be to bring in some geography. A major friction, I believe, is that when a recession hits a region, the unemployed cannot move to greener pastures because their house cannot be sold. This cannot be captured in the model above because everyone lives in the same economy. The improved model would also be great to capture the cost of this moving friction, which should be taken into account when thinking about homeownership subsidies.
December 22, 2014
By Fernando Leibovici and Michael Waugh
This paper studies the dynamics of international trade flows at business cycle frequencies. We show that introducing dynamic considerations into an otherwise standard model of trade can account for several puzzling features of trade flows at business cycle frequencies. Our insight is that because international trade is time-intensive, variation in the rate at which agents are willing to substitute across time affects how trade volumes respond to changes in output and prices. We formalize this idea and calibrate our model to match key features of U.S. data. We find that, in contrast to standard static models of international trade, our model is quantitatively consistent with salient features of U.S. cyclical import fluctuations. We also find that our model accounts for two-thirds of the peak-to-trough decline in imports during the 2008-2009 recession.
There were lots of exciting papers to choose from in this week’s NEP-DGE report. I chose this one because it shows that international trade theory is finally awakening from its slumber to realize that dynamics and expectations matter. I think that is pretty big in itself.
December 19, 2014
Before the CFPs, let me mention that you can now receive announcements of new NEP-DGE papers through Twitter: @RePEc_NEP_DGE. The list of all NEP email, RSS and Twitter feeds can be found at NEP.
Tsinghua Workshop in Macroeconomics, Beijing, 20-22 May 2015.
Society for Computational Economics, Taipei, 20-22 June 2015.
Society for Economic Dynamics, Warsaw, 25-27 June 2015.
SED sessions at ASSA 2015 in Boston.
December 15, 2014
By Soojin Kim
Two key determinants of optimal tax policies in open economies are the mobility of factors of production, capital and labor; and strategic interaction between governments in setting their policies. This paper develops a two-country, open-economy model with labor mobility and a global financial market to study optimal taxation. Governments engage in tax competition in which they choose a labor income tax code and a capital income tax rate. A quantitative application of the model to the United Kingdom (UK) and Continental European countries (CE) shows that factor mobility and competition between governments are indeed crucial in the design of optimal policies. Incorporating labor mobility leads to a divergence in the optimal tax system: Unlike in an economy with only capital mobility, where both countries use similar capital income tax rates, the optimal capital income tax rate in the UK is lower than that in the CE when both capital and labor are mobile. This is due to the differences in productivity between the two countries. In the calibrated economy, the UK, whose productivity is higher than that of the CE, attracts more labor through migration. Thus, the welfare-maximizing level of capital in the relatively small CE is lower than that in the UK. Moreover, I find that capital income tax rates are higher with competition. With competition, both governments lower capital income tax rates, rendering the marginal benefit of a lower tax rate to decrease. The steady-state welfare gain from implementing the Nash equilibrium policies is about 11 percent of consumption of the status quo economy.
The optimal taxation literature largely assumes that the studied country lives in autarky. This is definitely not true, as tax competition is always on the mind of policy makers. The important message of this paper is that even with mobile factors and tax competition, there is room for tax rates to differ across countries, still giving each country some wiggle room to set its priorities.
December 9, 2014
By Simeon Alder, David Lagakos and Lee Ohanian
No region of the United States fared worse over the postwar period than the “Rust Belt,” the heavy manufacturing region bordering the Great Lakes. This paper hypothesizes that the Rust Belt declined in large part due to a lack of competitive pressure in its labor and output markets. We formalize this thesis in a two-region dynamic general equilibrium model, in which productivity growth and regional employment shares are determined by the extent of competition. Quantitatively, the model accounts for much of the large secular decline in the Rust Belt’s employment share before the 1980s, and the relative stabilization of the Rust Belt since then, as competitive pressure increased.
As a recent resident of the Rust Belt, I find this piece fascinating. Beyond the compelling examples of the lack of competition in the region (collusion among producers, high union power, and the lack of technological progress), the paper has a nice model that shows how such inefficiencies could drag down the region. A lesson that could apply to other countries as well.
December 8, 2014
By Gonzalo Llosa, Lee Ohanian, Andrea Raffo and Richard Rogerson
We document large differences across OECD countries in fluctuations of the intensive and extensive margin of labor supply over the business cycle. Countries with larger fluctuations in employment relative to hours per worker tend to display larger fluctuations in total hours worked. These facts appear to be related to policies that impede the dismissal of workers. We then present a quantitative framework that features both margins of labor supply as well as costs to the adjustment of employment. Cross-country differences in dismissal costs can account for a large fraction of the patterns observed in the data.
Interesting analysis on a question I looked at without success in the late 1990s, although then I focused on unionization. Nice dataset, too: consistent hours data are difficult to obtain. The paper could, maybe, also address one puzzle I have had for a long time: why is output volatility so low in France? Given the huge labor market frictions there, they must be part of the story.
December 4, 2014
By Britta Kohlbrecher, Christian Merkl and Daniela Nordmeier
This paper shows analytically and numerically that there are two ways of generating an observationally equivalent comovement between matches, unemployment, and vacancies in dynamic labor market models: either by assuming a standard Cobb-Douglas contact function or by combining a degenerate contact function with idiosyncratic productivity shocks for new jobs. Despite this observational equivalence, we provide several reasons for why it is important to understand what happens inside the black box of job creation. We calibrate a combined model with both mechanisms to administrative German wage and labor market flow data. In contrast to the model without idiosyncratic shocks, the combined model is able to replicate the observed negative time trend in estimated matching functions. In addition, the full nonlinear combined model generates highly asymmetric business cycle responses to large aggregate shocks.
Matching functions are used a little bit blindly and indiscriminately, so it is useful to be reminded that they are really black boxes. If you use a matching function, you should understand what it assumes and implies. This paper shows nicely how we can think of matching functions and where their specification matters or does not matter.
December 1, 2014
By Natalie Tiernan and Pedro Gete
This paper is a quantitative study of two frictions that generate banks’ underinvestment in screening borrowers and, thus, overlending: 1) Limited liability, and 2) Banks failing to internalize that their credit decisions alter the pool of borrowers faced by other banks. The resulting lax lending standards overexpose banks to negative economic shocks and amplify the effects of economic fluctuations. They generate excessive volatility in credit, banks’ capital and output. We study a calibrated model whose predictions concerning the quantity and quality of credit are in line with recent U.S. business cycles. Quantitatively, limited liability is the friction that generates laxer lending standards. It induces 27% excess volatility in output relative to 8% from the other friction. Then we study three policy tools: capital requirements and taxes on banks’ lending and borrowings. The three tools encourage banks to screen more and should be state-contingent because the frictions vary with macroeconomic conditions. In quantitative terms, we find that taxes are better tools than capital requirements because they do not reduce credit going to the more productive agents.
Nice paper that shows that taxes, when used judiciously, can have a beneficial impact in unexpected ways. While it is quite obvious that imposing a tax on something reduces its level, it is not clear that it influences its volatility. It appears to be so efficient at this that it even outweighs the loss of loans on average.