
## Natural human empathy?

During 2009 I reacted emotionally to the people killed in Gaza. This time around, I reacted with a detached sympathy. I realized that empathy for human suffering is not natural at all; it is a conditioned response, programmed by ideology and external factors. The various tribal mythologies of ‘us’ versus ‘them’ that allow us to shut off human sympathy all depend on precisely this manipulation of empathy. From a naturalistic point of view, it is strange to me that there is such a discrepancy between the dominant ideologies of the planet today and the instillation of universal human sympathy. There are many stories about who the bad guys are, but this is a serious question that has not really been answered seriously. I was thinking about the genocide of the Native Americans and the various ways in which people have dealt with it. It is quite possible that ‘might makes right’ is the true mantra of nature, but I would like to believe that there is more to the story. I just don’t believe that the logical impossibility of a creator God has anything to do with any serious answer.

## What is volatility, reprise

The physicists Challet and Zhang produced a statistical-physics-style model, the ‘Minority Game’, in which individual agents choose between two strategies based on past performance.  With their model they could explain long-range correlations in volatility.  Here is a paper: MarsiliMinorityGameModel.  In my opinion, this is a good direction.  The vague analogy is that volatility is a sort of temperature, and these spin-glass-type models, which introduce micro-interactions and consider thermodynamic limits, are consistent with that analogy.  A quantitative analogy comes from the fact that fluctuations in solids and long memory in stochastic volatility are modeled with the same tool: Mandelbrot’s fractional Brownian motion.
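To make the long-memory claim concrete, here is a small base-R illustration using the standard autocovariance formula for fractional Gaussian noise (the increments of fractional Brownian motion) with Hurst exponent H and unit variance; the specific lags and H values are mine, chosen just to show the power-law decay.

```r
# Autocovariance of fractional Gaussian noise (increments of fBm) at lag k,
# unit variance. For H > 1/2 it decays like k^(2H - 2), so the correlations
# are not summable: this is "long memory".
fgn_acov <- function(k, H) {
  0.5 * (abs(k + 1)^(2 * H) - 2 * abs(k)^(2 * H) + abs(k - 1)^(2 * H))
}

lags <- c(1, 10, 100, 1000)
fgn_acov(lags, H = 0.8)   # slow power-law decay, still positive at lag 1000
fgn_acov(lags, H = 0.5)   # → exactly 0 for every lag k >= 1: white noise
```

For H = 1/2 the formula telescopes to zero at every positive lag, which is the ordinary short-memory case; any H above 1/2 gives the slowly decaying correlations seen in volatility series.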

So the Minority Game is not a full explanation of what volatility is; however, it is an excellent toy model that gives us some insight into how fractional noises could occur.  With this one explanation in mind, we can focus on the quantitative aspects of using long memory correctly: here it is extremely nice that the Donoho-Johnstone wavelet thresholding technique produces minimax guarantees for the recovery of Besov functions, which allows us to filter stochastic volatility observed with IID noise.
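The thresholding rule itself is simple enough to sketch in a few lines of base R. This is not the wavethresh implementation, just the textbook Donoho-Johnstone recipe: soft-threshold each wavelet coefficient with the universal threshold sigma·sqrt(2 log n); the function names and the toy coefficients are mine.

```r
# Soft thresholding: kill coefficients below t, shrink the rest toward zero by t.
soft_thresh <- function(w, t) sign(w) * pmax(abs(w) - t, 0)

# Donoho-Johnstone universal threshold for n coefficients with noise level sigma.
universal_thresh <- function(n, sigma) sigma * sqrt(2 * log(n))

w <- c(5, 0.4, -3, 0.1, 0.9)                    # toy wavelet coefficients
t <- universal_thresh(length(w), sigma = 0.5)   # about 0.9 here
soft_thresh(w, t)   # small coefficients vanish, large ones survive, shrunken
```

Applied to the wavelet coefficients of noisy log-volatility observations, this is the operation that gives the minimax recovery guarantee over Besov balls.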

I am not sure that the Minority Game is necessarily a correct model that fully captures long memory in stochastic volatility, since it makes the restrictive assumption of a choice between two strategies, but I agree strongly with the idea that the quantitative behavior of stochastic volatility is identical to that of fluctuations in solids or river levels.
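For the curious, the basic Challet-Zhang setup can be simulated in a few dozen lines of base R. This is only a minimal sketch: each agent holds two fixed random strategies mapping the last m winning sides to an action in {-1, +1}, plays whichever strategy has the better virtual score, and the minority side wins each round. The parameter choices (N = 101 agents, memory m = 3) are mine, not from the paper.

```r
# Minimal Minority Game sketch (Challet-Zhang style), base R.
set.seed(1)
N <- 101; m <- 3; S <- 2; steps <- 500
P <- 2^m                                    # number of distinct histories
# strat[agent, strategy, history] is an action in {-1, +1}
strat <- array(sample(c(-1L, 1L), N * S * P, replace = TRUE), dim = c(N, S, P))
score <- matrix(0, N, S)                    # virtual scores of the strategies
hist_idx <- sample(P, 1)                    # current history, encoded in 1..P
attendance <- numeric(steps)

for (t in seq_len(steps)) {
  best <- max.col(score, ties.method = "first")  # each agent's best strategy
  act <- vapply(seq_len(N), function(i) strat[i, best[i], hist_idx], integer(1))
  A <- sum(act)                             # attendance; N odd, so never zero
  attendance[t] <- A
  winner <- -sign(A)                        # the minority side wins
  # reward every strategy that would have predicted the minority side
  for (s in seq_len(S)) score[, s] <- score[, s] + strat[, s, hist_idx] * winner
  # shift the new winning side into the history (bit encoding)
  hist_idx <- ((hist_idx - 1) * 2 + as.integer(winner == 1)) %% P + 1
}
var(attendance) / N   # the standard "volatility" observable, sigma^2 / N
```

The time series of attendance A(t) is the quantity whose fluctuations play the role of volatility in this model; the interesting physics appears when the memory m is varied against the number of agents N.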

## Steps toward long memory arbitrage

The main technical issue for long-memory stochastic volatility models that I have resolved theoretically is that the Donoho-Johnstone wavelet thresholding method can be used for optimal filtering, in a Besov space, of the noisy observations of log stochastic volatility.  I have implemented this using the R package wavethresh.  The functions ‘wd’, ‘threshold.wd’ and ‘wr’ are the decomposition, thresholding and reconstruction functions, with various thresholding options.  We filter using this method, fit an ARFIMA model, predict T days ahead with the ARFIMA fit, and then use the fact that the option price is the Black-Scholes price with the average of the predicted future volatilities.  The code is this:

library(wavethresh)   # wd, threshold.wd, wr
library(arfima)

# Donoho-Johnstone filter as described above: decompose, threshold, reconstruct
# (the series length must be a power of two for wd)
djthresh <- function(x) wr(threshold.wd(wd(x)))

function( histr, P0, K, T ){
	# log squared returns = log-volatility plus log-chi-squared noise;
	# the small constant guards against log(0)
	histlogr2 <- log(histr^2 + 0.000001)
	histlogr2[is.nan(histlogr2)] <- log(0.000001)
	n <- length(histr)
	# center (+1.27 offsets the mean of the noise), rescale, and filter
	z <- djthresh((as.numeric(histlogr2) + 1.27)/1.5)
	#z <- as.numeric(histlogr2) + 1.27   # unfiltered alternative
	fit <- arfima(z*1.5, order = c(1, 0, 1))