The issue here is how to estimate the noise level of a signal like this one: a volatility series adapted to the Laplace eigenvectors of a graph.

First, let’s look at how denoising works. Donoho spent years developing optimal denoising of one-dimensional signals using wavelet analysis. Wavelets are wondrous things that grew out of the basic signal processing methods of Fourier analysis. Recall that Fourier analysis is just about functions on an interval; take the interval [0, 2*pi], so that you can think of it as the circle of radius 1 cut at a point and flattened out. Joseph Fourier’s idea was that functions on a circle should be expressed in terms of trigonometric functions, which are actually best thought of as exponential functions of purely imaginary numbers. Oh yes, ‘purely imaginary’ in mathematics does not mean purely imaginary in the way that an artist considers the phrase. It means numbers of the form i*x where i = sqrt(-1). This is really bad terminology in mathematics but it stuck. So you look at, say, sqrt(-1)*(pi/3), and the relation is

exp(sqrt(-1)*theta) = cos(theta) + sqrt(-1)*sin(theta)

This is very standard elementary stuff in mathematics. These are the classic periodic wavelike functions. Fourier’s idea was that every function on the circle can be written in terms of frequencies by decomposing it in the Fourier basis, compactly written e^{sqrt(-1)*n*t} for n = 0, ±1, ±2, … When this decomposition can be done is easy to answer when the function’s square has a finite integral: orthogonality makes everything work in the Hilbert space of L^2 functions. For pointwise rather than L^2 convergence, the issues are far subtler. Anyway, wavelets entered the fray because Fourier series have very good frequency localization but obviously no space localization: a single sinusoid of high frequency fills up the entire interval. Wavelets were designed to balance space and frequency localization, and they also carry various orthogonality properties of interest.
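As a small illustration in R (my own toy example, not from the original) of frequency versus space localization: the Fourier coefficients of a pure sinusoid concentrate at a single frequency, even though the sinusoid itself is nonzero across the whole interval.

```r
n <- 256
t <- 2 * pi * (0:(n - 1)) / n
f <- sin(5 * t)             # one frequency, but spread over the whole circle
coef <- Mod(fft(f)) / n     # magnitudes of the discrete Fourier coefficients
which(coef > 0.1)           # mass only at frequency 5 and its mirror image
```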

The great discovery of Donoho and Johnstone was that a signal on an interval obscured by white noise can be extracted, with guarantees of minimax optimality, by taking a wavelet decomposition of the noisy signal and then applying a simple thresholding, which is essentially a uniform shrinkage of the coefficients toward zero by the noise level. The exact soft thresholding function is

T(x, s) = sign(x) * (|x| - s)_+

where s is the threshold.
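As a sketch, this rule is a one-liner in R (`soft.threshold` is my own name for it, not part of any package):

```r
# Soft thresholding: shrink |x| by s, clip at zero, keep the sign.
soft.threshold <- function(x, s) sign(x) * pmax(abs(x) - s, 0)

soft.threshold(c(-3, -0.5, 0.2, 2), s = 1)   # -2  0  0  1
```

Coefficients below the threshold die entirely; the survivors are all pulled toward zero by s.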

Anyway, the annoying thing about doing this in actual code is that the R package ‘wavethresh’ assumes the signal’s length is already a power of 2. So the function

dyadic.size <- function(x) {
  # pad out to a dyadic length comfortably larger than length(x)
  q <- 2^(ceiling(log(length(x), 2)) + 1)
  y <- numeric(q)        # zero vector of dyadic length
  y[1:length(x)] <- x
  y
}

just places the vector x at the start of a longer zero vector whose length is a power of 2. Now you can wavelet threshold any signal using this function:

library(wavethresh)

donoho.johnstone <- function(z) {
  x <- dyadic.size(z)    # zero-pad to dyadic length
  wx <- wd(x)            # discrete wavelet transform
  wxt <- threshold(wx)   # this does soft thresholding
  xrec <- wr(wxt)        # inverse wavelet transform
  xrec[1:length(z)]      # drop the padding
}
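A quick usage sketch (the test signal and noise level are my own invention, just to see the function run):

```r
set.seed(1)
t <- seq(0, 2 * pi, length.out = 300)
noisy <- sin(3 * t) + rnorm(300, sd = 0.3)   # smooth signal plus white noise
clean <- donoho.johnstone(noisy)
plot(t, noisy, type = "l", col = "grey")
lines(t, clean, lwd = 2)
```

Note that 300 is not a power of 2; dyadic.size quietly handles the padding and donoho.johnstone trims it off at the end.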

Donoho and Johnstone’s paper is very good. It’s attached.

So what does the denoised signal look like? This will get an answer in half an hour. You are the best soulmate muse, honey. I’m not joking. This is pretty nice work here.
