Robust forecast superiority testing with an application to assessing pools of expert forecasters

Valentina Corradi, Sainan Jin, Norman R. Swanson

Research output: Contribution to journal › Article › peer-review

Abstract

We develop forecast superiority tests that are robust to the choice of loss function, following Jin, Corradi, and Swanson (JCS, 2017) and relying on a mapping between forecast evaluation under generic loss functions and stochastic dominance principles. However, unlike the JCS tests, which are not uniformly valid and are correctly sized only under the least favorable case, our tests are uniformly asymptotically valid and non-conservative. To show this, we establish uniform convergence of HAC variance estimators. Monte Carlo experiments indicate good finite sample performance of our tests, and an empirical illustration suggests that prior forecast accuracy matters in the Survey of Professional Forecasters.
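To fix ideas, the following is a minimal sketch, not the paper's actual procedure, of how a loss-function-robust comparison might be cast as many moment inequalities: under an assumed null that forecaster 1 weakly dominates forecaster 2, P(|e1| > x) ≤ P(|e2| > x) at every threshold x on a grid, and each inequality is studentized with a Newey-West (HAC) long-run variance. The names `hac_variance` and `superiority_stat`, and the specific dominance formulation, are illustrative assumptions rather than the authors' definitions.

```python
import numpy as np

def hac_variance(z, lags):
    """Newey-West (Bartlett-kernel) long-run variance of a scalar series."""
    z = z - z.mean()
    T = len(z)
    v = z @ z / T
    for k in range(1, lags + 1):
        w = 1.0 - k / (lags + 1)          # Bartlett weight
        v += 2.0 * w * (z[:-k] @ z[k:]) / T
    return v

def superiority_stat(e1, e2, grid, lags):
    """Max studentized moment-inequality statistic over a threshold grid.

    Hypothetical null: forecaster 1 weakly dominates forecaster 2, i.e.
    P(|e1| > x) <= P(|e2| > x) for every x in `grid`. Large positive
    values of the statistic are evidence against dominance.
    """
    T = len(e1)
    stats = []
    for x in grid:
        # Sample moment for one inequality: difference of exceedance indicators
        d = (np.abs(e1) > x).astype(float) - (np.abs(e2) > x).astype(float)
        se = np.sqrt(hac_variance(d, lags) / T)
        stats.append(np.sqrt(T) * d.mean() / max(se, 1e-12))
    return max(stats)
```

In practice, critical values for such a maximum statistic over many inequalities would come from a bootstrap tailored to many moment inequalities, which is where the paper's uniform validity results enter; this sketch stops at computing the statistic.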

Original language: English (US)
Pages (from-to): 596-622
Number of pages: 27
Journal: Journal of Applied Econometrics
Volume: 38
Issue number: 4
DOIs
State: Published - Jun 1 2023

Keywords

  • bootstrap
  • combination forecasts
  • estimation error
  • many moment inequalities
  • robust forecast evaluation
  • Survey of Professional Forecasters

ASJC Scopus subject areas

  • Social Sciences (miscellaneous)
  • Economics and Econometrics
