
## KIM-STOFFER 2006 MIXTURE OF GAUSSIAN INSTEAD OF LOG-CHISQUARE IN SV MODELS

Kim-Stoffer 2006 (kim-stoffer-2006-svmm) solves the problem of using the EM algorithm to estimate the parameters of an AR(1) stochastic volatility (SV) model by approximating the log-chi-square error distribution with a mixture of Gaussians. In our case this technique needs to be adapted for our six volatility prediction models.
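To see why the mixture approximation is needed: for returns $y_t = \epsilon_t e^{h_t/2}$ with $\epsilon_t \sim N(0,1)$, squaring and taking logs gives $\log y_t^2 = h_t + \log \epsilon_t^2$, and the noise term $\log \epsilon_t^2$ follows a skewed log-chi-square(1) distribution rather than a Gaussian. A minimal sketch of the idea (my own illustration, not the paper's fixed mixture): sample $\log \epsilon_t^2$ directly and fit a small Gaussian mixture to it.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Samples of log(eps^2) with eps ~ N(0,1): a log-chi-square(1) variable.
eps = rng.standard_normal(20000)
z = np.log(eps ** 2).reshape(-1, 1)

# True moments of log-chi-square(1): mean ~ -1.2704, variance = pi^2 / 2.
print(np.round(z.mean(), 2), np.round(z.var(), 2))

# Fit a Gaussian mixture to the samples; more components give a closer
# fit. (The SV literature typically fixes the mixture in advance rather
# than fitting it to data, but the approximation target is the same.)
gm = GaussianMixture(n_components=3, random_state=0).fit(z)
print(np.round(gm.weights_, 2))
print(np.round(gm.means_.ravel(), 2))
```

Once the log-chi-square term is replaced by a Gaussian mixture, each mixture component yields a conditionally linear-Gaussian state space model, which is what makes the Kalman-filter-based E-step tractable.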

As a warm-up for the EM algorithm, we begin with the following exercise on a mixture of Gaussians.

```
import numpy as np
from sklearn.mixture import GaussianMixture

np.random.seed(1)
g = GaussianMixture(n_components=2)

# Generate random observations with two modes centered on 0
# and 10 to use for training.
obs = np.concatenate((np.random.randn(100, 1), 10 + np.random.randn(300, 1)))
g.fit(obs)

print(np.round(g.weights_, 2))
print(np.round(g.means_, 2))
print(np.round(g.covariances_, 2))
print(g.predict([[0], [2], [9], [10]]))
print(np.round(g.score_samples([[0], [2], [9], [10]]), 2))

# Refit the model on new data, this time with an even split
# between the two modes.
g.fit(np.array(20 * [[0]] + 20 * [[10]]))
print(np.round(g.weights_, 2))
```
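To make the EM mechanics explicit before moving to the SV setting, the same two-component fit can be written by hand. This is a sketch of textbook EM for a 1-D Gaussian mixture (scalar variances, my own variable names), not scikit-learn's implementation: the E-step computes each component's responsibility for each observation, and the M-step re-estimates weights, means, and variances from those responsibilities.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.standard_normal(100), 10 + rng.standard_normal(300)])

# Initial guesses for the weights, means, and variances of the two components.
w = np.array([0.5, 0.5])
mu = np.array([1.0, 8.0])
var = np.array([1.0, 1.0])

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: responsibility of each component for each observation.
    dens = w * normal_pdf(x[:, None], mu, var)       # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(w, 2), np.round(mu, 2))
```

With the two modes this well separated, the weights converge near (0.25, 0.75) and the means near 0 and 10, matching the scikit-learn fit above.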