Hi, welcome to my website!
My research is concerned with forecast uncertainty, the dynamics of survey expectations, and informational frictions. Most of the time, I end up solving signal extraction problems.
Discussion of Bassetti, Casarin, and Del Negro: "A Bayesian Approach for Inference on Probabilistic Surveys" (@NBB, 2023)
Shadow-rate VARs (working paper)
with Andrea Carriero (Queen Mary University of London and University of Bologna), Todd E. Clark (Federal Reserve Bank of Cleveland), and Massimiliano Marcellino (Bocconi, IGIER, and CEPR)
Earlier versions of this paper circulated under the title “Forecasting with Shadow-Rate VARs.”
Precision-based sampling for state space models that have no measurement error (2023, JEDC)
earlier draft: pdf (revised June 2023)
Bundesbank DP at IDEAS
online appendix: pdf
replication code: https://github.com/elmarmertens/ABCprecisionsampler
Abstract: This article presents a computationally efficient approach to sampling from Gaussian state space models. The method is an instance of precision-based sampling, which operates on the inverse variance-covariance matrix of the states (also known as the precision matrix). The novelty is to handle cases where the observables are modeled as a linear combination of the states without measurement error. In that case, the posterior variance of the states is singular and the precision is ill-defined. As in other instances of precision-based sampling, the computational gains are considerable. Relevant applications include trend-cycle decompositions, (mixed-frequency) VARs with missing variables, and DSGE models.
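To fix ideas, here is a minimal sketch of the standard precision-based sampling logic that the abstract builds on, for a local-level model *with* measurement error (the paper's contribution, handling the singular no-measurement-error case, is not covered here). All parameter values and variable names are illustrative, and the dense Cholesky factorization stands in for the banded/sparse routines that deliver the computational gains in practice:

```python
import numpy as np

# Local-level model: y_t = tau_t + e_t,  tau_t = tau_{t-1} + w_t.
# The posterior precision of the trend path tau is tridiagonal, so one
# Cholesky factorization yields both the posterior mean and an exact
# draw of the entire state path at once.
rng = np.random.default_rng(0)
T = 200
sig_e, sig_w = 1.0, 0.2
tau_true = np.cumsum(rng.normal(0.0, sig_w, T))
y = tau_true + rng.normal(0.0, sig_e, T)

# Prior precision of tau implied by the random walk (first differences)
D = np.eye(T) - np.eye(T, k=-1)
P_prior = (D.T @ D) / sig_w**2
# The measurement equation adds precision on the diagonal
P_post = P_prior + np.eye(T) / sig_e**2
b = y / sig_e**2

# Posterior mean solves P_post @ mu = b; a draw from N(mu, P_post^{-1})
# perturbs it by C^{-T} z, with C the lower Cholesky factor of P_post.
C = np.linalg.cholesky(P_post)
mu = np.linalg.solve(P_post, b)
draw = mu + np.linalg.solve(C.T, rng.standard_normal(T))
```

In applications the banded structure of `P_post` would be exploited with sparse or banded solvers rather than the dense calls used above.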
Stochastic Volatility in Bayesian Vector Autoregressions (2023, Oxford Research Encyclopedia)
with Todd E. Clark (Federal Reserve Bank of Cleveland)
Abstract: Vector autoregressions with stochastic volatility are widely used in macroeconomic forecasting and structural inference. The stochastic volatility component of the model conveniently allows for time variation in the variance-covariance matrix of the model's forecast errors. In turn, that feature of the model generates time variation in predictive densities. The models are most commonly estimated with Bayesian methods, typically Markov chain Monte Carlo methods such as Gibbs sampling. Recently developed equation-by-equation methods enable the estimation of models with large variable sets at much lower computational cost than the standard approach of estimating the model as a system of equations. The Bayesian framework also facilitates the accommodation of mixed-frequency data, non-Gaussian error distributions, and non-parametric specifications. With recent advances, researchers are also addressing some of the framework's outstanding challenges, particularly the dependence of estimates on the ordering of variables in the model and reliable estimation of the marginal likelihood, which is the fundamental measure of model fit in Bayesian methods.
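As a small illustration of the mechanism the abstract describes (not an estimator, and not the paper's method), the following simulates a single equation whose shocks carry stochastic volatility: the log variance follows an AR(1), so the forecast-error variance, and hence the predictive density, drifts over time. All names and parameter values are hypothetical:

```python
import numpy as np

# One equation with stochastic volatility: the log variance h_t follows
# an AR(1), so the conditional shock variance exp(h_t) moves over time.
rng = np.random.default_rng(1)
T = 500
rho_h, sig_h = 0.95, 0.2          # persistence and innovation size of log variance
h = np.zeros(T)
for t in range(1, T):
    h[t] = rho_h * h[t - 1] + sig_h * rng.standard_normal()
eps = np.exp(h / 2.0) * rng.standard_normal(T)   # heteroskedastic shocks

# A simple AR(1) observable driven by the SV shocks
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + eps[t]
```

A Gibbs sampler for such a model alternates between drawing the regression coefficients conditional on the volatility path `h` and drawing `h` conditional on the residuals, which is the kind of scheme the survey reviews.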
NONE of the material posted on this personal website necessarily represents the views of
the Deutsche Bundesbank, the Eurosystem, the Bank for International Settlements,
the Board of Governors of the Federal Reserve System or the Federal Open Market Committee.