May 28, 2014
By Daniel Borowczyk-Martins, Jake Bradley and Linas Tarasonis
In the US labor market the average black worker is exposed to a lower employment rate and earns a lower wage compared to his white counterpart. Lang and Lehmann (2012) argue that these mean differences mask substantial heterogeneity along the distribution of workers’ skill. In particular, they argue that black-white wage and employment gaps are smaller for high-skill workers. In this paper we show that a model of employer taste-based discrimination in a labor market characterized by search frictions and skill complementarities in production can replicate these regularities. We estimate the model with US data using methods of indirect inference. Our quantitative results portray the degree of employer prejudice in the US labor market as being strong and widespread, and provide evidence of an important skill gap between black and white workers. We use the model to undertake a structural decomposition and conclude that discrimination resulting from employer prejudice is quantitatively more important than skill differences to explain wage and employment gaps. In the final section of the paper we conduct a number of counterfactual experiments to assess the effectiveness of different policy approaches aimed at reducing racial differences in labor market outcomes.
I selected this paper this week because it is a rather unusual application of dynamic general equilibrium, and it highlights the potential of using standard macro theory to address non-macro questions. Indeed, my understanding of the discrimination literature is that its empirical applications are largely devoid of theory, or at least are not structural estimations. Neither is this paper, strictly speaking, but it takes theory very seriously, asks, in a very macro manner, how far theory can take us in explaining what we observe, and then uses the theory to determine the extent of discrimination. The approach (and the results) should encourage others to follow up on this research.
May 24, 2014
A few important conferences have issued calls for papers recently, some with close deadlines:
May 23, 2014
By Mariacristina De Nardi and Fang Yang
Households hold vastly heterogeneous amounts of wealth when they reach retirement, and differences in lifetime earnings explain only part of this variation. This paper studies the role of intergenerational transmission of ability, voluntary bequest motives, and the recipiency of accidental and intended bequests (both in terms of timing and size), in generating wealth dispersion at retirement, in the context of a rich quantitative model. Modeling voluntary bequests, and realistically calibrating them, not only generates more wealth dispersion at retirement and reduces the correlation between retirement wealth and lifetime income, but also generates a skewed bequest distribution that is close to the one in the observed data.
My take-aways from this paper are: 1) Understanding the heterogeneity of wealth accumulation is very difficult and involves multiple dimensions; 2) Among those, voluntary bequests play an important role; 3) We need to learn much more about voluntary bequest motives. The last point is crucial: otherwise the bequest motive becomes a free parameter that can explain anything, as preference shocks too often are.
May 16, 2014
By Paolo Gelain and Marco Guerrazzi
In this paper, we implement Bayesian econometric techniques to analyze a theoretical framework built along the lines of Farmer’s micro-foundation of the General Theory. Specifically, we test the ability of a demand-driven search model with self-fulfilling expectations to match the behaviour of the US economy over the last thirty years. The main findings of our empirical investigation are the following. First, over the whole period, our model fits the data very well. Second, demand shocks are the most relevant in explaining the variability of the variables concerned. In addition, our estimates reveal that a large negative demand shock caused the Great Recession via a sudden drop of confidence. Overall, those results are consistent with the main features of the New ‘Farmerian’ Economics as well as with the latest demand-side explanations of the finance-induced recession.
Roger Farmer’s recent work has been causing quite a stir, especially as it seems to validate some of the things that happened during the recent crisis. This paper provides an empirical test of Farmer’s theory and shows that he is indeed onto something.
May 11, 2014
By Hans Holter, Dirk Krueger and Serhiy Stepanchuk
The recent public debt crisis in most developed economies implies an urgent need for increasing tax revenues or cutting government spending. In this paper we study the importance of household heterogeneity and the progressivity of the labor income tax schedule for the ability of the government to generate tax revenues. We develop an overlapping generations model with uninsurable idiosyncratic risk, endogenous human capital accumulation as well as labor supply decisions along the intensive and extensive margins. We calibrate the model to macro, micro and tax data from the US as well as a number of European countries, and then for each country characterize the labor income tax Laffer curve under the current country-specific choice of the progressivity of the labor income tax code. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 7%. We also find that modeling household heterogeneity is important for the shape of the Laffer curve.
I am not sure finding the tax scheme that maximizes tax revenue is the most interesting question. After all, what distinguishes a public entity is that it is supposed to maximize welfare instead of profit or revenue. Nonetheless, this paper highlights how tax rate progressivity matters for revenue, and that one really needs to quantify the model to get to an answer. While the flat tax is shown to raise the most revenue, why stop there? What about regressive taxation, or even a head tax?
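To fix ideas about why revenue peaks at an interior tax rate, here is a deliberately minimal one-worker sketch, my own illustration and not the paper's model (which features heterogeneity, human capital, and progressive taxes): with isoelastic labor supply and a flat tax, hours fall as the rate rises, so revenue is hump-shaped in the rate.

```python
# Stylized Laffer curve: one worker, flat tax t, wage w.
# Quasi-linear utility u(c, h) = c - h**(1 + 1/eps) / (1 + 1/eps),
# so optimal hours solve (1 - t) * w = h**(1/eps),
# i.e. h(t) = ((1 - t) * w)**eps, where eps is the labor supply elasticity.
# Revenue is R(t) = t * w * h(t).

def revenue(t, w=1.0, eps=0.5):
    """Tax revenue under flat rate t with labor supply elasticity eps."""
    hours = ((1.0 - t) * w) ** eps
    return t * w * hours

# Locate the peak of the Laffer curve on a fine grid of tax rates.
grid = [i / 1000.0 for i in range(1000)]
t_star = max(grid, key=revenue)

# With this utility the peak is at t* = 1 / (1 + eps);
# eps = 0.5 gives t* = 2/3: rates above that LOWER revenue.
```

The sketch also makes the paper's heterogeneity point easy to see: aggregate revenue is a sum of such curves across workers with different elasticities and wages, so the shape of the aggregate Laffer curve depends on the whole distribution, not just on a representative worker.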
May 8, 2014
By Karthik Athreya, Andrew Owens and Felipe Schwartzman
The aftermath of the recent recession has seen numerous calls to use transfers to poorer households as a means to enhance aggregate activity. We show that the key to understanding the direction and size of such interventions lies in labor supply decisions. We study the aggregate impact of short-term redistributive economic policy in a standard incomplete-markets model. We characterize analytically conditions under which redistribution leads to an increase or decrease in effective hours worked, and hence, output. We then show that under the parameterization that matches the wealth distribution in the U.S. economy (Castaneda et al., 2003), wealth redistribution leads to a boom in consumption, but not in output.
While all the discussion about Thomas Piketty’s new book on redistribution of wealth focuses on the long term, this one is about redistribution of wealth in the short term. The result that one can increase consumption while output decreases is interesting. It highlights that the obsession with GDP is misplaced. But to some extent so is the focus on consumption. What really matters is how this redistribution improves (or does not improve) well-being, a measure the model used in this paper can and should provide for each quintile. The paper shows what proportion of households is better off, yet something like consumption-equivalent welfare would be more convincing.
May 6, 2014
By Jesper Bagger and Rasmus Lentz
This paper studies wage dispersion in an equilibrium on-the-job-search model with endogenous search intensity. Workers differ in their permanent skill level and firms differ with respect to productivity. Positive (negative) sorting results if the match production function is supermodular (submodular). The model is estimated on Danish matched employer-employee data. We find evidence of positive assortative matching. In the estimated equilibrium match distribution, the correlation between worker skill and firm productivity is 0.12. The assortative matching has a substantial impact on wage dispersion. We decompose wage variation into four sources: Worker heterogeneity, firm heterogeneity, frictions, and sorting. Worker heterogeneity contributes 51% of the variation, firm heterogeneity contributes 11%, frictions 23%, and finally sorting contributes 15%. We measure the output loss due to mismatch by asking how much greater output would be if the estimated population of matches were perfectly positively assorted. In this case, output would increase by 7.7%.
Denmark is blessed with absolutely fantastic datasets, and this paper exploits this kind of data to reach some very interesting conclusions. In particular, it manages to quantify the sources of wage dispersion and the output gain from frictionless matching. With such a large gain, the next step would be to exploit the data to actually step in and arrange better matches. That would be the ultimate policy experiment. Will that happen?
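The four-way split quoted in the abstract can be written compactly as an additive decomposition of wage variance (a stylized rendering using the reported shares; in decompositions of this kind the sorting term is typically a covariance between the worker and firm components):

```latex
\operatorname{Var}(w)
  = \underbrace{V_{\text{worker}}}_{51\%}
  + \underbrace{V_{\text{firm}}}_{11\%}
  + \underbrace{V_{\text{frictions}}}_{23\%}
  + \underbrace{V_{\text{sorting}}}_{15\%}
```

The four shares sum to 100%, so each reported percentage can be read directly as that component's contribution to total wage dispersion.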