
Archive for April, 2016

Normal variation of a hypersurface M_0 of S^4(1/h) with \kappa = h^2 can be described by a function f: M_0 \rightarrow \mathbf{R} via the exponential map along the unit normal vector N, i.e. x \rightarrow \exp_x(N f(x)).  We are interested in deformations of M_0 to M_t = \exp_x(N t f(x)).  Let w_1, w_2, w_3 be a tangential coframe; it is known that their images under this deformation are Jacobi vector fields:

w_i \rightarrow T^i(f,t)\, w_i(t), where the w_i(t) are parallel translates of w_i along the normal geodesic and

T^i(f,t) = c^i \frac{1}{\sqrt{\kappa}}\sin\left(\frac{tf}{\sqrt{\kappa}}\right) + d^i \cos\left(\frac{tf}{\sqrt{\kappa}}\right)
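As a consistency check, both coefficient functions satisfy the same second-order Jacobi-type equation along the normal geodesic (a direct calculation from the sine/cosine form above):

```latex
% Each T^i(f,t) solves a constant-coefficient Jacobi-type equation in t,
% with the variation speed f entering through the frequency f/\sqrt{\kappa}:
\frac{d^2}{dt^2}\, T^i(f,t) + \frac{f^2}{\kappa}\, T^i(f,t) = 0,
\qquad T^i(f,0) = d^i,
\qquad \frac{d}{dt}\Big|_{t=0} T^i(f,t) = \frac{c^i f}{\kappa}.
```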

Now the gravitational Lagrangian for hypersurfaces of S^4(1/h) is the integral of the scalar curvature, which is known explicitly in terms of the second fundamental form:

L(g) = vol(M)\cdot Scal(S^4(1/h)) + \int_M (9 H^2 - S)\, dv_g

where H is the mean curvature and S is the squared norm of the second fundamental form.  In analogy with the derivation of the Einstein field equations, we consider normal variations by arbitrary functions f(x) and then look for critical points of L(g(t)) under these variations, i.e. \frac{d}{dt} L(g(t)) = 0.  I claim this will produce the correct gravitational field equations, because we already know that the universe is a compact scaled four-sphere.
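Schematically, the criticality condition reads as follows (here \Phi is a placeholder for the Euler-Lagrange density produced by 9H^2 - S; its precise form is not computed in this post):

```latex
% Criticality under all normal variations f, schematically:
\frac{d}{dt}\Big|_{t=0} L(g(t))
  = \int_M f(x)\,\Phi\big(A,\nabla A\big)\, dv_g = 0
  \quad \text{for all } f
  \;\Longrightarrow\; \Phi \equiv 0 \text{ on } M.
```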

The implications of this picture are extensive: it points to four-dimensional matter that is completely unaccounted for by the established three-dimensional theories of quantum mechanics and general relativity.  Quantum mechanics is likely a linear approximation of the correct physics.

 

Read Full Post »

The long chain that led to general relativity clarified that the scalar curvature should be the gravitational Lagrangian (see weyl-purely-infinitesimal-geometry).  We know now that the universe is a compact sphere of fixed radius, so the gravitational action can be specialized to hypersurfaces of S^4(1/h).

The Einstein-Hilbert functional, whose Euler-Lagrange equations produce the gravitational field equations, in this case takes the form

L(g) = \int_M Scal(g)\, dv_g = vol(M)\, Scal(S^4(1/h)) + 2 \int_M (9H^2 - S)\, dv_g

where H is the mean curvature and S is the squared norm of the second fundamental form.  The formula for the scalar curvature of a hypersurface of a sphere is well known to geometric analysts (see 2.4 of ScalarCurvatureWithConstantMeanCurvature2009).
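For context, the hypersurface scalar curvature formula comes from the contracted Gauss equation. In the convention where Scal(S^n(r)) = n(n-1)/r^2, for M^3 \subset S^4(1/h) with shape operator A (so tr A = 3H and |A|^2 = S):

```latex
% Contracted Gauss equation for a hypersurface of a space form:
\mathrm{Scal}_M
  = \overline{\mathrm{Scal}} - 2\,\overline{\mathrm{Ric}}(N,N)
    + (\mathrm{tr}\,A)^2 - |A|^2
  = 12h^2 - 6h^2 + 9H^2 - S
  = 6h^2 + 9H^2 - S,
% using \overline{Scal}(S^4(1/h)) = 12h^2 and \overline{Ric}(N,N) = 3h^2.
```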

Regardless of the correct field equations, here we have a unification of electromagnetism and gravity so long as the second fundamental form terms describe the electromagnetic potential.

Formally, one could follow the established derivation of the field equations as in Klainerman’s notes from 2009 (KlainermanGR2009):

klainerman-derivation-fieldequations

Doing so would take \dot{g}_{\mu\nu} to be normal to the hypersurface (the physical universe), but the novelty in my approach is that matter fields can be treated quantitatively via the second fundamental form and its derivatives.  This gives a view of all matter as curvature of physical space in an ambient four-sphere universe.

 

Read Full Post »

 

zulfikar.ahmed@gmail.com <zulfikar.ahmed@gmail.com>

Attachments · 3:13 AM (0 minutes ago)

to jfrenkel, jhansen, jharrington, jharris, jhp, jhricko_4, jhs, jhutasoi, jianjunp, jinha, jjbrehm, jlind, jlondon, jlw, jmateo, jmerseth, jmetcalf, jmg, jmogel, joel, joelms, john.aldrich, john.beatty, johncrawford53, jose.oliveira
 

​Ladies and Gentlemen,

The Black-Scholes-Merton 1973 model has been known to be problematic from the beginning.  In the late 1990s and early 2000s, long memory stochastic volatility models were considered; the Zulf model is an exact specification of long memory stochastic volatility in which the price follows the familiar BSM SDE with variable volatility:

dS_t/S_t = mu dt + sqrt(V) dw^1(t)

and the volatility follows a TIME-FRACTIONAL stochastic differential equation with a square-root process, just as in the Heston model:

D_t^alpha V(t) = kappa*(vT-V(t)) + sigma*sqrt(V(t)) dw^2(t)/dt

with rho*dt = <dw^1, dw^2>.  Note that stochastic differential equations driven by fractional Brownian motion had been considered before.  My formulation via a time-fractional SDE allows a simple modification of the Heston closed-form solution in which Heston’s D(t) (see Heston’s paper for details) is replaced by D'(t) = psi(t) D(t), where psi(t) is the waiting-time density of a fractional Poisson process, i.e.,

psi(t) = kappa*t^(alpha-1)*mlf(-kappa*t^alpha,alpha,alpha)
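The mlf call above is not part of the standard scientific Python stack; a minimal sketch of it via the Mittag-Leffler series (a hypothetical helper, series-truncated and only suitable for moderate arguments, not the author's attached implementation) is:

```python
from math import gamma, exp

def mlf(z, alpha, beta, terms=60):
    # Truncated Mittag-Leffler series E_{alpha,beta}(z) = sum_k z^k / Gamma(alpha*k + beta);
    # adequate for moderate |z|, not a production implementation.
    return sum(z**k / gamma(alpha * k + beta) for k in range(terms))

def psi(t, kappa, alpha):
    # Waiting-time density of a fractional Poisson process:
    # psi(t) = kappa * t^(alpha-1) * E_{alpha,alpha}(-kappa * t^alpha)
    return kappa * t**(alpha - 1) * mlf(-kappa * t**alpha, alpha, alpha)

# Sanity check: for alpha = 1 the density reduces to the exponential kappa*exp(-kappa*t),
# since E_{1,1}(z) = exp(z).
print(abs(psi(1.0, 0.5, 1.0) - 0.5 * exp(-0.5)) < 1e-12)  # True
```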

So all this is very nice, but we needed to show that it actually produces tighter fits to actual volatility surfaces COMPARED to the Heston model.  That long memory has an effect has, in generic terms, been known since the early 2000s; my model gives you closed-form option prices (inheriting Heston’s solution), and here you have quantitative evidence of almost universal improvement of fit on randomly selected XLF volatility surfaces.

>>> import pandas as pd
>>> x=pd.read_csv('hestonVsZulfXLF.txt')
>>> from scipy.stats import ttest_rel
>>> ttest_rel(x['HestonErr'],x['ZulfErr'])
(array(8.220630625948539), 2.3136313865144033e-10)
>>> import numpy as np
>>> np.mean(x['HestonErr'])
0.040236243251665478
>>> np.mean(x['ZulfErr'])
0.035349632026785982
>>> sum(x['HestonErr']>x['ZulfErr'])
41
>>> len(x)
44
>>> 41./44
0.9318181818181818
>>>

This result is likely to be universal over all financial assets with liquid options, but time will tell.  A better fit than the Heston model on 93% of the sampled surfaces is a significant result in finance, since here I am providing a parsimonious model with a single extra parameter.  I conjecture that this is THE long memory correction in finance, resolving the long memory issues that have been on the table since Benoit Mandelbrot brought them to attention in the late 1960s.

Historical options data is not easily available, so I am attaching XLF historical data so you can check the R code yourself.  The option pricing code is in a Python .pyx file: you Cython it to produce C code, then gcc it to produce a shared library that is used by tmlf.py, which is called by hestonVzulf2.R.  The output of the code is in xlf.txt, which includes the Heston and LMHeston objective function values per volatility surface.  So now we have a model that fits volatility surfaces almost universally better than the Heston model, which by now has quite a bit of theory showing how smiles are explained; we also know from Comte-Renault’s work that the term structure of the Heston model had problems that can be resolved by long memory in stochastic volatility.  My model is a parsimonious closed-form answer with a proven record versus Heston.
7 Attachments

Preview attachment hestonVzulf2.R

Preview attachment cmlf.pyx

Preview attachment Chronopoulou-Viens-LMSV.pdf

Preview attachment Heston-original.pdf

Preview attachment fractional-stochastic-diff-eq-sakthivel-rethavi-ren-2013.pdf

Preview attachment tmlf.py

Read Full Post »

 

zulfikar.ahmed@gmail.com <zulfikar.ahmed@gmail.com>

Attachments · Apr 14 (6 days ago)

to jfrenkel, jhansen, jharrington, jharris, jhp, jhricko_4, jhs, jhutasoi, jianjunp, jinha, jjbrehm, jlind, jlondon, jlw, jmateo, jmerseth, jmetcalf, jmg, jmogel, joel, joelms, john.aldrich, john.beatty, johncrawford53, jose.oliveira
Ladies and Gentlemen,

Attached find results comparing the ZulfII model with the Bates model on some DJIA stocks.   The results confirm that the ZulfII model is the best model in the world.

The main new ingredient of the ZulfII model is the addition of long memory in stochastic volatility to the Bates model (the current top model, which includes stochastic volatility and jumps).  You can look at the attached code for details.

Now long memory in economics is a murky issue that has not been settled at all; see http://mit.econ.au.dk/vip_htm/plildholdt/Master%20thesis.pdf for the confusion.  Granger produced a model in 1980 in which aggregation of AR(1) processes could produce long memory.  Enrico Scalas and his colleagues have considered long memory in TICK data; see for example http://arxiv.org/pdf/physics/0505210v1.pdf

I have considered waiting times of jumps in log(return^2) for stocks and found that even at lower frequencies the waiting times of volatility jumps follow a Mittag-Leffler distribution.  The source of long memory in finance is therefore the timing of the human activity that informs trading.  We have arguably produced the most accurate option pricing model in the world, following the finding that Mittag-Leffler distributions fit waiting times of volatility jumps better than ARFIMA models.

Zulfikar Moinuddin Ahmed

3 Attachments

Read Full Post »

The Stein-Stein model (stein-stein-1991) came before Heston, in 1991, and had the problem (from the scientific point of view) that there was no correlation between the Brownian motions driving price and volatility; this was fixed by Schobel-Zhu in 1999 (Schobel-Zhu-1999).  I included long memory in the Heston model relatively easily; in the case of Stein-Stein-Schobel-Zhu it is more complicated.  Essentially, Stein-Stein-Schobel-Zhu boils down to solving the three ordinary differential equations

D_t = -\sigma^2 D^2 + 2k D + 2s_1

B_t = (k-\sigma^2 D) B - k\theta D + s_2

C_t = -\frac{1}{2} \sigma^2 B^2 - k\theta B - \frac{1}{2} \sigma^2 D

These can be solved explicitly.

stein-stein-schobel-zhu.png
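Any explicit solution of this kind is easy to get wrong, so one can sanity-check it by integrating the system numerically. A sketch (the parameter values sigma, k, theta, s_1, s_2 below are my own illustrative real choices; in the actual pricing transform s_1, s_2 are complex):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values (my own choices, not calibrated ones).
sigma, k, theta, s1, s2 = 0.3, 1.5, 0.2, 0.1, 0.1

def rhs(t, y):
    # The three ODEs from the post, integrated jointly from zero initial data.
    D, B, C = y
    dD = -sigma**2 * D**2 + 2 * k * D + 2 * s1
    dB = (k - sigma**2 * D) * B - k * theta * D + s2
    dC = -0.5 * sigma**2 * B**2 - k * theta * B - 0.5 * sigma**2 * D
    return [dD, dB, dC]

sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0, 0.0], rtol=1e-10, atol=1e-12)
D1, B1, C1 = sol.y[:, -1]  # values at t = 1, to compare against the closed form
```

Comparing (D1, B1, C1) against the closed-form expressions at the same t and parameters is then a one-line check.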

In order to add long memory with Mittag-Leffler waiting times, we replace

\sigma \rightarrow \sigma S_\alpha(t), and likewise for \rho, where

S_\alpha(t) = k t^{\alpha-1} E_{\alpha,\alpha}(-kt^\alpha).

Read Full Post »

 



zulfikar.ahmed@gmail.com <zulfikar.ahmed@gmail.com>

Attachments · 11:38 AM (1 hour ago)

to jfrenkel, jhansen, jharrington, jharris, jhp, jhricko_4, jhs, jhutasoi, jianjunp, jinha, jjbrehm, jlind, jlondon, jlw, jmateo, jmerseth, jmetcalf, jmg, jmogel, joel, joelms, john.aldrich, john.beatty, johncrawford53, jose.oliveira
 

​Ladies and Gentlemen,

The Zulf option pricing model is a time-fractional Heston model: the Heston model extended to address long memory in volatility.  The Heston model is one of the most accurate option pricing models in the world.  Our method for extending it consists of inserting into the Heston model what is essentially the waiting-time density of a fractional Poisson process (there are other ways of viewing the extension).  We introduce into the Heston model

psi(t) = kappa*delta^(-alpha)*mlf(-kappa*delta^alpha, alpha, alpha)

as a deflator to Heston’s D(t).  The attached R code is the one I used to compare the fit of the Heston model versus the Zulf model on a liquid option, the oil stock CVX.  The summary of results is a statistically significant improvement in fit over N=49 samples:

>>> from scipy.stats import ttest_rel
>>> ttest_rel(x.ix[:,2],x.ix[:,3])
(array(5.192283183607352), 4.1753688416565105e-06)

>>> sum(x.ix[:,2]>x.ix[:,3])
39
>>> len(x)
49
>>> 39./49.
0.7959183673469388

The full table of comparison is below.  Now the important thing here is that, at least at this stage of establishing what I am sure is the world’s best option pricing model, it is best to stick to the Nelder-Mead algorithm in R, since there is a delicate issue of when the Black-Scholes implied volatility computation fails.  For our results it was also important to set the INITIAL PARAMETERS so that the convergence of the fits was relatively smooth.  So for the waiting time

psi(t, alpha, delta) = kappa*delta^(-alpha) * mlf(-kappa*t^alpha, alpha, alpha)

you want initial optimization parameters alpha=2.0 and delta=1.0.  In fact, I suspect that delta could be eliminated altogether, so that a single-parameter extension of the Heston model could handle long memory.
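The calibration in the attached R code uses optim's Nelder-Mead; the same derivative-free pattern in Python looks as follows. The objective here is a stand-in (the real objective, the volatility-surface fitting error, lives in the attached code and is not reproduced), with its minimum placed at the suggested starting point purely for illustration:

```python
from scipy.optimize import minimize

def objective(params):
    # Stand-in for the surface-fitting error; its minimum sits at
    # (alpha=2.0, delta=1.0) only to illustrate the optimizer call.
    alpha, delta = params
    return (alpha - 2.0)**2 + 2.0 * (delta - 1.0)**2

# Derivative-free Nelder-Mead, started away from the optimum; tight
# tolerances mirror sticking with a simplex method when the objective
# can be non-smooth (e.g. implied-vol inversion failures).
res = minimize(objective, x0=[1.5, 0.7], method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8})
```

With the real objective one would start the simplex at alpha=2.0, delta=1.0 as recommended above.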

2 Attachments

Preview attachment hestonVzulf.R

Read Full Post »

Older Posts »