Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model

Complexity 2022:1-13 (2022)

Abstract

In this paper, we propose a model averaging estimation method for the multiplicative error model and construct the corresponding weight-choosing criterion based on the Kullback–Leibler divergence, with a hyperparameter that guards against overfitting. The resulting model averaging estimator is proved to be asymptotically optimal. We show that the Kullback–Leibler model averaging (KLMA) estimator asymptotically minimizes the in-sample Kullback–Leibler divergence and improves out-of-sample forecast accuracy even under different loss functions. In simulations, the KLMA estimator compares favorably with the smoothed-AIC, smoothed-BIC, and Mallows model averaging estimators, especially when nonlinear noise is added to the data-generating process. Empirical applications to the daily range of the S&P 500 and the price durations of IBM show that the out-of-sample forecasting capacity of the KLMA estimator is better than that of the other methods.
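To make the weighting scheme concrete, the following is a minimal Python sketch of how a Kullback–Leibler weight-choosing criterion could be optimized over candidate multiplicative error models. This is an illustration under stated assumptions, not the paper's exact criterion: the function name klma_weights, the exponential-QMLE form of the KL loss, and the complexity penalty lam * (k @ w) are all hypothetical choices; the code assumes the M candidate models have already been fitted and their in-sample conditional means collected column-wise in mu_hat.

```python
import numpy as np
from scipy.optimize import minimize

def klma_weights(x, mu_hat, k, lam=2.0):
    """Choose simplex weights for candidate multiplicative error models.

    x      : (T,) positive series (e.g. daily range or price duration)
    mu_hat : (T, M) fitted conditional means, one column per candidate model
    k      : (M,) parameter counts of the candidate models
    lam    : hyperparameter penalising complexity (assumed penalty form)
    """
    M = mu_hat.shape[1]
    k = np.asarray(k, dtype=float)

    def criterion(w):
        mu_w = mu_hat @ w                      # model-averaged conditional mean
        # Exponential-QMLE Kullback-Leibler loss for a positive series x_t
        # with conditional mean mu_t: sum_t [ log mu_t + x_t / mu_t ].
        kl = np.sum(np.log(mu_w) + x / mu_w)
        return kl + lam * (k @ w)              # penalty against overfitting

    cons = ({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},)  # weights sum to one
    bounds = [(0.0, 1.0)] * M                                    # non-negative weights
    w0 = np.full(M, 1.0 / M)                                     # start from equal weights
    res = minimize(criterion, w0, method='SLSQP', bounds=bounds, constraints=cons)
    return res.x
```

With mu_hat produced by, say, several fitted MEM(1,1)-type specifications, the averaged one-step forecast is simply the weighted combination of the candidates' forecasts under the returned weights. The unit-simplex constraint mirrors the standard model averaging setup; only the calibration of lam would need to follow the paper's actual hyperparameter rule.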

Similar books and articles

Probabilities and epistemic pluralism. Eric Christian Barnes - 1998 - British Journal for the Philosophy of Science 49 (1): 31-47.
Estimation and Model Selection in Dirichlet Regression. Julio Michael Stern - 2012 - AIP Conference Proceedings 1443: 206-213.
