
## Why univariate econometric models of financial time series are not science

Natural sciences have some consensus regarding what constitutes the basic object of study: for example, anomalous diffusion and other areas where fractional calculus has emerged since 1974 into a technology, with an encyclopedic treatise by Samko, Kilbas and Marichev and exposition and research by many others (see Machado, Kiryakova and Mainardi for a historical review). Excellent scholars such as Andrew Lo, John Campbell, and Craig MacKinlay have worked on univariate statistical time series models for financial data. Univariate models of various types are fitted to volatility time series (the so-called stochastic volatility is simply $\log(r_t^2)$, where $r_t = \log(p_t/p_{t-1})$ is the log return computed from empirical prices $p_t$). Univariate models can match the stylized features of these series and are therefore useful, but they exist in a vacuum, and that vacuum is deceptive. It is due to a missing consensus among scientists of finance regarding the fundamental observable of finance and its actual determinants.
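The definitions above reduce to two lines of code. As a minimal sketch (the price series here is hypothetical, purely for illustration):

```python
import numpy as np

# Hypothetical daily closing prices p_t (any positive price series works).
prices = np.array([100.0, 101.5, 100.8, 102.3, 101.1, 103.0])

# Log returns: r_t = log(p_t / p_{t-1})
r = np.diff(np.log(prices))

# The stochastic-volatility proxy discussed above: log(r_t^2)
log_vol = np.log(r ** 2)

print(r)
print(log_vol)
```

Note that $\log(r_t^2)$ is large and negative on quiet days (small $|r_t|$) and rises toward zero on volatile days, which is why it serves as a volatility proxy.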

The most significant problem hampering a proper science of finance is that the quants all work for financial interests and therefore absorb the terminology and imperatives of finance developed in the tradition of Harry Markowitz, William Sharpe, and others, for whom a theory of investment under uncertainty is the most important problem. In fact, a true science of finance would consider volatility a poisonous storm that destroys lives and livelihoods in a prescientific age of finance. This cannot be corrected completely without a purely non-profit and non-ideological enterprise of pure science, based on volatility as the primary observable and concerned with the control of turbulence in volatility.

Against in-vacuum econometric models stands the following result, which is not difficult to verify. Take 1900 historical daily series for American stocks and construct the 1900×1900 correlation matrix. Consider the off-diagonal elements $C_{ij}$, let $\sigma$ be the standard deviation of these values, and soft-threshold the lower-triangular values of the correlation matrix by $3\sigma$. How many of the correlations should survive, intuitively? One might expect no more than 45% of the values to survive the soft-threshold, but in fact more than 75% survive. This tells us that for 1900 stocks, the graph with 1900 nodes whose edges are determined by high volatility correlation is quite dense. The density of this graph exposes the error of considering econometric models of univariate time series from finance, for it is impossible to understand volatility as a proper object of scientific study while ignoring its extremely high correlation with the rest of the market, defined purely 'intrinsically' in terms of volatility correlation with other traded instruments. In particular, a pure scientific interest attaches to the phenomena that drive global financial volatility, considered from Laplace's Essay on Probabilities onward as determined by fear and exuberance, the lingo used by Alan Greenspan and refined since by many people, including Shiller and Akerlof in Animal Spirits.
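The experiment above is easy to reproduce in outline. A sketch follows, with one loud caveat: it uses synthetic returns driven by a common market factor as a stand-in for the actual 1900 historical daily series (which the reader must supply), so the surviving fraction it prints illustrates the mechanism, not the empirical 75% figure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for real data: n_stocks return series sharing a common market
# factor, which makes pairwise correlations densely positive. The real
# experiment uses ~1900 historical daily series for American stocks.
n_stocks, n_days = 200, 500
market = rng.normal(size=n_days)
returns = 0.5 * market + rng.normal(size=(n_stocks, n_days))

# Full correlation matrix (rows of `returns` are the variables).
C = np.corrcoef(returns)

# Off-diagonal, lower-triangular correlations C_ij.
i, j = np.tril_indices(n_stocks, k=-1)
c = C[i, j]

# Soft-threshold at 3*sigma; an entry "survives" if it remains nonzero,
# i.e. if |C_ij| exceeds the threshold.
sigma = c.std()
surviving = np.abs(c) > 3 * sigma
print(f"fraction surviving: {surviving.mean():.2f}")
```

With a strong common factor, most correlations sit well above $3\sigma$ and survive; without the factor (pure noise), essentially none do. That contrast is exactly the point the paragraph makes about the density of the volatility-correlation graph.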

The quantitative mechanisms of volatility can come from careful examination of the exact models that fit empirical data. We have introduced several new ideas on where the cutting edge of quantitative models lies for pure empirical data (rather than 'relationships' obtained by regression analysis against time series that are not pure volatility), ideas which will perhaps prove valuable once a thorough examination is made of the exact laws governing global financial volatility as an evolving high-dimensional object. Such an examination should improve upon continuous-time random walk models, which have some good qualities, such as explaining long memory and power laws in the renewal-theory sense, but which do not allow the possibility of a phase transition to turbulent dynamics, for which evidence has been published in Nature and other reputable science journals since the early 1990s.
