Hi, welcome to my website!

I am an applied macroeconomist and time-series econometrician at the Research Centre of the Deutsche Bundesbank.

My research focuses on forecast uncertainty, the dynamics of survey expectations, and informational frictions.
Most of the time, I end up solving signal extraction problems.

My work is also posted at IDEAS, SSRN, Google Scholar, ResearcherID, ORCID, and Deutsche Bundesbank.
Code is available on GitHub. The publications and working papers pages on this site provide links to individual projects.


Shadow-rate VARs (working paper, revised November 2023)

with Andrea Carriero (Queen Mary University of London, U Bologna), Todd Clark (Federal Reserve Bank of Cleveland), Massimiliano Marcellino (Bocconi, IGIER and CEPR)

Abstract: VARs are a popular tool for forecasting and structural analysis, but they are ill-suited to handle occasionally binding constraints, like the effective lower bound (ELB) on nominal interest rates. We extend the VAR framework by modeling interest rates as censored observations of a latent shadow-rate process, and propose an efficient sampler for Bayesian estimation of such "shadow-rate VARs." We analyze specifications where actual and shadow rates serve as explanatory variables and find benefits of including both. Compared to a standard VAR, shadow-rate VARs generate superior predictions for short- and long-term interest rates, and deliver some gains for macroeconomic variables in US data. Our structural analysis estimates economic responses to shocks in financial conditions, showing strong differences in the reaction of interest rates depending on whether the ELB binds. After an adverse shock, our shadow-rate VAR sees a stronger decline in economic activity at the ELB than away from it.

Earlier versions of this paper circulated under the title “Forecasting with Shadow-Rate VARs.”
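The censoring mechanism described in the abstract can be illustrated with a minimal simulation sketch. This is not the paper's code or estimation procedure; the AR(1) law of motion, the parameter values, and the function name are illustrative assumptions.

```python
import numpy as np

def simulate_shadow_rate(T=200, rho=0.95, sigma=0.3, elb=0.0, seed=0):
    """Illustrative only: simulate an AR(1) shadow rate; the observed
    policy rate is the shadow rate censored at the ELB."""
    rng = np.random.default_rng(seed)
    shadow = np.empty(T)
    shadow[0] = 2.0  # start above the ELB (arbitrary choice)
    for t in range(1, T):
        shadow[t] = rho * shadow[t - 1] + sigma * rng.standard_normal()
    observed = np.maximum(shadow, elb)  # censoring: i_t = max(s_t, ELB)
    return shadow, observed
```

Whenever the shadow rate dips below the ELB, the observed rate sits flat at the bound while the latent process keeps moving, which is exactly the signal extraction problem the sampler in the paper addresses.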

Survey expectations and forecast uncertainty (draft)

with Todd E. Clark (Federal Reserve Bank of Cleveland)

Abstract: In recent decades, the collection of survey expectations for macroeconomic variables has gained considerable attention. A burgeoning literature has developed that studies not only the predictive content of survey data, but also their usefulness in testing economic theories about the behavior of forward-looking decision makers under uncertainty. Leading examples of economic survey data include surveys of experts (including professional forecasters and financial market participants), households, and firms. Point forecasts of professional forecasters have, on balance, emerged as competitive (albeit not fully optimal) predictors of future outcomes. Meanwhile, ex-ante measures of uncertainty derived from probabilistic surveys have been found to be systematically at variance with the distribution of ex-post outcomes. Nevertheless, puzzles remain, and the analysis and design of expectation surveys is an active field of research.

Precision-based sampling for state space models that have no measurement error (2023, JEDC)

Abstract: This article presents a computationally efficient approach to sample from Gaussian state space models. The method is an instance of precision-based sampling methods that operate on the inverse variance-covariance matrix of the states (also known as precision). The novelty is to handle cases where the observables are modeled as a linear combination of the states without measurement error. In this case, the posterior variance of the states is singular and precision is ill-defined. As in other instances of precision-based sampling, computational gains are considerable. Relevant applications include trend-cycle decompositions, (mixed-frequency) VARs with missing variables and DSGE models.
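The core idea of precision-based sampling, as described in the abstract, can be sketched in a few lines: draw from a Gaussian whose precision (inverse covariance) matrix is given, using a Cholesky factor of the precision rather than of the covariance. This is a generic textbook sketch of the non-singular case, not the paper's algorithm for the measurement-error-free (singular) setting; all names and test values below are illustrative assumptions.

```python
import numpy as np

def precision_sample(P, b, rng):
    """Illustrative only: draw x ~ N(P^{-1} b, P^{-1}) given precision P.
    Works directly with the precision matrix, never forming P^{-1}."""
    L = np.linalg.cholesky(P)                          # P = L L'
    mu = np.linalg.solve(L.T, np.linalg.solve(L, b))   # mean: P^{-1} b
    z = rng.standard_normal(len(b))
    return mu + np.linalg.solve(L.T, z)                # adds N(0, P^{-1}) noise
```

In state space applications P is banded, so the Cholesky factorization and triangular solves are cheap; that sparsity is the source of the computational gains the abstract mentions (a sparse solver would be used in practice).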

Stochastic Volatility in Bayesian Vector Autoregressions (2023, Oxford Research Encyclopedia)

with Todd E. Clark (Federal Reserve Bank of Cleveland)

Abstract: Vector autoregressions with stochastic volatility are widely used in macroeconomic forecasting and structural inference. The stochastic volatility component of the model conveniently allows for time variation in the variance-covariance matrix of the model's forecast errors. In turn, that feature of the model generates time variation in predictive densities. The models are most commonly estimated with Bayesian methods, most typically Markov chain Monte Carlo methods such as Gibbs sampling. Recently developed equation-by-equation methods enable the estimation of models with large variable sets at much lower computational cost than the standard approach of estimating the model as a system of equations. The Bayesian framework also facilitates the accommodation of mixed frequency data, non-Gaussian error distributions, and non-parametric specifications. With recent advances, researchers are also addressing some of the framework's outstanding challenges, particularly the dependence of estimates on the ordering of variables in the model and reliable estimation of the marginal likelihood, which is the fundamental measure of model fit in Bayesian methods.
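The time variation in forecast-error variances that the abstract describes can be illustrated with a minimal univariate simulation: the log variance follows a stationary AR(1), so the error's scale drifts over time. This is an illustrative sketch, not the article's estimation methodology; the function name and parameter values are assumptions.

```python
import numpy as np

def simulate_sv(T=500, rho=0.98, sigma_h=0.2, seed=1):
    """Illustrative only: simulate y_t = exp(h_t / 2) * eps_t,
    where the log variance h_t follows a stationary AR(1)."""
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    h[0] = 0.0  # start log variance at its unconditional mean
    for t in range(1, T):
        h[t] = rho * h[t - 1] + sigma_h * rng.standard_normal()
    y = np.exp(h / 2) * rng.standard_normal(T)  # heteroskedastic errors
    return y, h
```

Because the conditional variance exp(h_t) evolves stochastically, predictive densities built from such a model widen and narrow over time, which is the feature exploited in density forecasting with these VARs.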

None of the material posted on this personal website necessarily represents the views of the Deutsche Bundesbank, the Eurosystem, the Bank for International Settlements, the Board of Governors of the Federal Reserve System, or the Federal Open Market Committee.