A Bootstrap-Assisted Self-Normalization Approach to Inference in Cointegrating Regressions

4 Apr 2022 · Karsten Reichold, Carsten Jentsch

Traditional inference in cointegrating regressions requires tuning-parameter choices to estimate a long-run variance parameter. Even when these choices are "optimal", the resulting tests can be severely size distorted. We propose a novel self-normalization approach that leads to a nuisance-parameter-free limiting distribution without estimating the long-run variance parameter directly. This makes our self-normalized test tuning-parameter free and considerably less prone to size distortions, at the cost of only small power losses. In combination with an asymptotically justified vector autoregressive sieve bootstrap to construct critical values, the self-normalization approach yields further improvements in small to medium samples when error serial correlation or regressor endogeneity is strong. We illustrate the usefulness of the bootstrap-assisted self-normalized test in empirical applications by analyzing the validity of the Fisher effect in Germany and the United States.
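To illustrate the general idea, the following is a minimal sketch of Shao-style self-normalization in a simple cointegrating regression: instead of estimating the long-run variance, the full-sample OLS estimate is normalized by a functional of recursive subsample estimates. This is a generic illustration only; the authors' exact statistic, scaling, and treatment of regressor endogeneity in the paper differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate a simple cointegrating regression y_t = beta * x_t + u_t,
# with x_t a random walk and serially correlated (AR(1)) errors u_t.
beta = 1.0
x = np.cumsum(rng.normal(size=n))
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal()
y = beta * x + u


def ols_slope(y, x):
    """OLS slope through the origin (no intercept, for simplicity)."""
    return np.dot(x, y) / np.dot(x, x)


# Full-sample OLS estimate of beta.
b_full = ols_slope(y, x)

# Self-normalizer built from recursive OLS estimates on growing
# subsamples -- no long-run variance estimate (and hence no
# bandwidth/kernel tuning parameter) is required.
m0 = 20  # discard very short subsamples where the estimate is noisy
recursive = np.array([ols_slope(y[:t], x[:t]) for t in range(m0, n + 1)])
weights = np.arange(m0, n + 1) ** 2
V = np.sum(weights * (recursive - b_full) ** 2) / n**2

# Self-normalized statistic for H0: beta = 1; its limiting distribution
# is free of the long-run variance nuisance parameter.
T = n * (b_full - beta) ** 2 / V
print(f"self-normalized statistic: {T:.3f}")
```

Critical values for such statistics are non-standard; in the paper they are obtained via a vector autoregressive sieve bootstrap rather than from tabulated limits.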
