Money and Memory: Implicit Agents in Search Theories of Money

January 31, 2011

By Heiner Ganßmann

http://d.repec.org/n?u=RePEc:kas:poabec:2010-9&r=dge

Recent search theoretical models of monetary economies promise micro-foundations and thus a decisive improvement in the theory of money compared to the traditional mainstream approach that starts from a Walrasian general equilibrium framework to introduce money exogenously at the macro level. The promise of micro-foundations is not fulfilled, however. It can be shown that search models implicitly refer to central, most likely collective, agents doing essential work to sustain the monetary economy.

The goal of money search theory is to provide microfoundations for the use of money in macro models and thus avoid ad hoc assumptions. This paper claims the theory still relies on ad hoc assumptions, specifically about the acceptance of money. I think there are two main points. The first is that one needs to know whether an agent has accepted money in the past in order to decide whether to transact with him, the reason being that you want to exclude those who do not accept money. But why would you refuse money from someone if you know you could use it to buy something from someone else?
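To fix ideas, here is a stylized version of that acceptance decision, in the spirit of a first-generation Kiyotaki-Wright model. The functional form and parameter values are mine, purely for illustration, and are not taken from the paper.

```python
# Stylized acceptance decision (illustrative parameters, not from the paper):
# accept money if the expected gain from spending it later beats refusing (payoff 0).
beta, u, c = 0.95, 1.0, 0.1   # discount factor, utility of consumption, production cost
meet = 0.5                    # probability of meeting a seller next period

def gain_from_accepting(pi):
    """Expected discounted gain from holding money when a fraction pi of sellers accept it."""
    return beta * meet * pi * (u - c)

def best_response(pi):
    """Accept (1.0) if holding money is worth more than refusing it (payoff 0)."""
    return 1.0 if gain_from_accepting(pi) > 0 else 0.0

# Acceptance is self-fulfilling: if anyone else accepts, accepting is a best response,
# which is why refusing money from a past 'non-accepter' seems hard to rationalize.
print(best_response(1.0))   # 1.0: everyone accepts, so accept
print(best_response(0.0))   # 0.0: nobody accepts, so money is worthless
```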

The second point has more traction, I think. A substantial part of the money used in real life is not physical and requires somebody to keep track of it. Thus trust comes into play. Is this check covered? Is there a sufficient balance behind this debit card? This requires new institutions that are not modeled in money search models.


Evaluating the strength of identification in DSGE models. An a priori approach

January 23, 2011

By Nikolay Iskrev

http://d.repec.org/n?u=RePEc:ptu:wpaper:w201032&r=dge

This paper presents a new approach to parameter identification analysis in DSGE models wherein the strength of identification is treated as a property of the underlying model and studied prior to estimation. The strength of identification reflects the empirical importance of the economic features represented by the parameters. Identification problems arise when some parameters are either nearly irrelevant or nearly redundant with respect to the aspects of reality the model is designed to explain. The strength of identification therefore is not only crucial for the estimation of models, but also has important implications for model development. The proposed measure of identification strength is based on the Fisher information matrix of DSGE models and depends on three factors: the parameter values, the set of observed variables and the sample size. By applying the proposed methodology, researchers can determine the effect of each factor on the strength of identification of individual parameters, and study how it is related to structural and statistical characteristics of the economic model. The methodology is illustrated using the medium-scale DSGE model estimated in Smets and Wouters (2007).

Every good simulation exercise includes a sensitivity analysis on the calibrated parameter values. This paper presents an interesting way to determine which parameters are of little importance and which are crucial, and thus should be calibrated or estimated with extra care.
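To see the flavor of such a measure, here is a toy computation on an AR(1), built along Fisher-information lines; the construction (identity weighting, moment choice, the Cramér-Rao-style ratio) is my own simplification, not Iskrev's exact formulas.

```python
import numpy as np

# Toy identification-strength check for an AR(1): y_t = rho*y_{t-1} + sigma*eps_t.
# Strength is measured as the parameter value over a Cramer-Rao-style standard error
# built from the Jacobian of model-implied moments (a simplification for illustration).

def implied_moments(theta):
    """Variance and first autocovariance implied by the AR(1) parameters."""
    rho, sigma = theta
    gamma0 = sigma**2 / (1.0 - rho**2)
    return np.array([gamma0, rho * gamma0])

def jacobian(theta, h=1e-6):
    """Finite-difference Jacobian of the implied moments with respect to the parameters."""
    base = implied_moments(theta)
    J = np.zeros((base.size, len(theta)))
    for i in range(len(theta)):
        bumped = np.array(theta, dtype=float)
        bumped[i] += h
        J[:, i] = (implied_moments(bumped) - base) / h
    return J

def identification_strength(theta, T):
    """Parameter value divided by its lower-bound standard error."""
    J = jacobian(theta)
    info = T * J.T @ J                 # stand-in for the Fisher information matrix
    crlb = np.linalg.inv(info)         # lower bound on the parameter covariance
    return np.abs(theta) / np.sqrt(np.diag(crlb))

theta = np.array([0.9, 1.0])           # persistence and shock volatility
for T in (100, 1000):
    print(T, identification_strength(theta, T))
# Strength grows with the sample size and depends on the parameter values and on
# which moments (the 'observables') enter the Jacobian -- the three factors above.
```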


Competitive equilibrium with search frictions: Arrow-Debreu meets Diamond-Mortensen-Pissarides

January 19, 2011

By Belén Jerez

http://d.repec.org/n?u=RePEc:cte:werepe:we1039&r=dge

When the trading process is characterized by search frictions, traders may be rationed so markets need not clear. We argue that rationing can be part of general equilibrium, even if it is outside its normal interpretation. We build a general equilibrium model where the uncertainty arising from rationing is incorporated in the definition of a commodity, in the spirit of the Arrow-Debreu theory. Prices of commodities then depend not only on their physical characteristics, but also on the probability that their trade is rationed. The standard definition of a competitive equilibrium is extended by replacing market clearing with a matching condition. This condition relates the traders’ rationing probabilities to the measures of buyers and sellers in the market via an exogenous matching function, as in the search models of Diamond (1982a, 1982b), Mortensen (1982a, 1982b) and Pissarides (1984, 1985). When search frictions vanish (so matching is frictionless) our model is equivalent to the competitive assignment model of Gretsky, Ostroy and Zame (1992, 1999). We adopt their linear programming approach to derive the welfare and existence theorems in our environment.

I consider this paper an important step towards understanding coarseness in markets: not everybody trades all contingent claims every day. There are trading costs, and as a result some portfolios are out of balance, some people hold no portfolio at all, and some goods are not traded. Trading histories are, for most people, very sparse. All this has important implications for pricing, which this paper takes a shot at.
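The key mechanical ingredient is the matching condition that replaces market clearing. Here is a minimal sketch, using a Cobb-Douglas matching function purely for illustration; the paper only assumes an exogenous matching function, not this particular form.

```python
# Matching condition replacing market clearing (Cobb-Douglas form for illustration only).
def trading_probabilities(buyers, sellers, A=1.0, alpha=0.5):
    """Probability that a buyer trades and that a seller trades, given participation."""
    matches = min(A * buyers**alpha * sellers**(1 - alpha), buyers, sellers)
    return matches / buyers, matches / sellers

# Prices are then indexed by these rationing probabilities: a side of the market that
# trades with probability below one must be compensated through the price.
print(trading_probabilities(buyers=2.0, sellers=1.0))   # buyers are rationed
print(trading_probabilities(buyers=1.0, sellers=2.0))   # sellers are rationed
```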


Labor-market Volatility in a Matching Model with Worker Heterogeneity and Endogenous Separations

January 10, 2011

By Andri Chassamboulli

http://d.repec.org/n?u=RePEc:ucy:cypeua:13-2010&r=dge

Recessions are times when the quality of the unemployment pool is lower, because entry into unemployment is biased in favor of low-productivity workers. I develop a search and matching model with worker heterogeneity and endogenous separations that has this feature. I show that in a recession a compositional shift in unemployment towards low-productivity workers, due to an increase in job separations, lowers the matching effectiveness of searching firms, thereby causing their average recruiting cost to rise. This acts to further depress vacancy creation in a recession. In contrast to most models that allow for endogenous separations, this model generates a realistic Beveridge curve correlation.

I would have thought that the quality of the pool of unemployed workers would improve in a recession, because the marginal fired worker must be better than in a boom. But the evidence seems to be that unskilled workers bear, in relative terms, even more of the unemployment in a recession. The consequences for search costs and match quality are interesting, and they lead to a good fit of the Beveridge curve.
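A back-of-the-envelope version of the composition channel, with illustrative numbers rather than the paper's calibration: when the unemployment pool shifts toward low-productivity workers, the expected payoff of a firm's contact with a searcher falls, which under free entry can only be consistent with a higher vacancy-filling rate, that is, fewer vacancies per unemployed worker.

```python
# Composition channel with made-up numbers (not the paper's calibration).
def expected_surplus(share_low, J_low=0.5, J_high=2.0):
    """Average value to a firm of meeting an unemployed worker, given pool composition."""
    return share_low * J_low + (1 - share_low) * J_high

vacancy_cost = 1.0
for share_low in (0.3, 0.6):                    # boom pool vs. recession pool
    surplus = expected_surplus(share_low)
    # Free entry: vacancy_cost = q * expected_surplus, so a lower expected surplus
    # requires a higher vacancy-filling rate q, i.e. lower tightness and fewer vacancies.
    required_q = vacancy_cost / surplus
    print(share_low, round(surplus, 3), round(required_q, 3))
```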


Confronting Model Misspecification in Macroeconomics

January 6, 2011

By Daniel Waggoner and Tao Zha

http://d.repec.org/n?u=RePEc:emo:wp2003:1012&r=dge

We confront model misspecification in macroeconomics by proposing an analytic framework for merging multiple models. This framework allows us to 1) address uncertainty about models and parameters simultaneously and 2) trace out the historical periods in which one model dominates other models. We apply the framework to a richly parameterized DSGE model and a corresponding BVAR model. The merged model, fitting the data better than both individual models, substantially alters economic inferences about the DSGE parameters and about the implied impulse responses.

We all know models are abstractions, and some models therefore perform better in some situations. Here, models are combined so that the data can tell when one model is more appropriate than another. This improves the fit over using either model alone. But why would this be better than using a single model with less abstraction? And how can this be useful for policy experiments, given that one is uncertain which model is currently valid?
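For intuition, here is a much-simplified cousin of the merging idea: a plain Bayesian-model-averaging recursion of my own, not the authors' framework, in which model weights are updated each period from one-step-ahead predictive likelihoods so that the data trace out which model dominates when.

```python
import numpy as np

# Simplified illustration (not the authors' framework): recursive model weights
# updated from each model's one-step-ahead predictive likelihood.
def update_weights(weights, pred_likelihoods):
    """Bayesian update of model probabilities given the predictive densities."""
    posterior = weights * pred_likelihoods
    return posterior / posterior.sum()

rng = np.random.default_rng(0)
weights = np.array([0.5, 0.5])          # start agnostic: e.g. DSGE vs. BVAR
history = []
for t in range(20):
    # Stand-in predictive densities; in practice they come from each estimated model.
    pred = np.array([rng.uniform(0.2, 1.0), rng.uniform(0.2, 1.0)])
    weights = update_weights(weights, pred)
    history.append(weights.copy())
print(history[-1])   # the weight path shows which model the data favor, period by period
```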