7.1. Lower bound for μ in the case σ = 0
Fix a positive integer $j$. Let $\psi^{(j)} \in C^\infty(\mathbb{R})$ be some function with support in $[0,1]$ satisfying $\|\psi^{(j)}\|_{L^2} = 1$, $\int \psi^{(j)}(x)\, e^{-2^{-j}x}\,dx = 0$ and $\int_{\mathbb{R}} |\mathcal{F}\psi^{(j)}(u)|^2\, u^{-2}\,du < \infty$. Certainly, there are infinitely many functions $\psi^{(j)}$ fulfilling these requirements; the last property follows for instance if $\psi^{(j)}$ is the second derivative of an $L^2$-function. Introduce the wavelet-like notation
\[
\psi_{jk}(x) := 2^{j/2}\,\psi^{(j)}(2^j x - k), \qquad j > 0,\ k = 0,\dots,2^j-1.
\]
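Two elementary consequences of this rescaling are worth recording (a brief check; the function $g$ below is only one illustrative choice, not imposed by the construction). Each $\psi_{jk}$ is supported in $[k2^{-j},(k+1)2^{-j}]$ and, by the substitution $y = 2^j x - k$, satisfies $\|\psi_{jk}\|_{L^2} = \|\psi^{(j)}\|_{L^2} = 1$; since the supports for different $k$ overlap at most in endpoints, the $\psi_{jk}$ form an orthonormal family in $L^2(\mathbb{R})$. Moreover, if $\psi^{(j)} = g''$ for some smooth $g$ supported in $[0,1]$, then $\mathcal{F}\psi^{(j)}(u) = -u^2\,\mathcal{F}g(u)$, whence
\[
\int_{\mathbb{R}} \big|\mathcal{F}\psi^{(j)}(u)\big|^2 u^{-2}\,du
= \int_{|u|\le 1} u^2\,\big|\mathcal{F}g(u)\big|^2\,du
+ \int_{|u|>1} \big|\mathcal{F}\psi^{(j)}(u)\big|^2 u^{-2}\,du < \infty,
\]
because $\mathcal{F}g$ is bounded and $\psi^{(j)} \in L^2(\mathbb{R})$.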
Consider for any $r = (r_k) \in \{-1,+1\}^{2^j}$ and some $\beta > 0$ the perturbed Lévy triplets $\mathcal{T}_r = (0, \gamma_0, \mu_r)$ with
\[
\mu_r(x) = \mu_0(x) + \beta\, 2^{-j(s+1/2)} \sum_{k=1}^{2^j} r_k\, \psi_{jk}(x), \qquad x \in \mathbb{R}.
\]
We note that due to $\mathcal{F}\psi_{jk}(0) = 0$ and $\int e^{-x}\psi_{jk}(x)\,dx = 0$ the triplet $\mathcal{T}_r$ satisfies the martingale condition such that $\mathcal{T}_r \in \mathcal{G}_s(R, 0)$ holds for a sufficiently small choice of the constant $\beta > 0$.
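For completeness, here is a short verification of these two relations, using only the properties imposed on $\psi^{(j)}$ above: by the substitution $y = 2^j x - k$,
\[
\int e^{-x}\psi_{jk}(x)\,dx = 2^{j/2}\int e^{-x}\,\psi^{(j)}(2^j x - k)\,dx
= 2^{-j/2}\, e^{-k2^{-j}} \int e^{-2^{-j}y}\,\psi^{(j)}(y)\,dy = 0,
\]
and $\mathcal{F}\psi_{jk}(0) = 2^{-j/2}\,\mathcal{F}\psi^{(j)}(0) = 0$, where $\mathcal{F}\psi^{(j)}(0) = 0$ holds because $\mathcal{F}\psi^{(j)}$ is continuous and $\int |\mathcal{F}\psi^{(j)}(u)|^2 u^{-2}\,du$ is finite.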
The Gaussian likelihood ratio of the observations under the probabilities corresponding to $\mathcal{T}_{r^0}$ and $\mathcal{T}_r$, evaluated under the law of $\mathcal{T}_r$, is for some $r, r^0$ with $r_k = r^0_k$ for all $k$ except one index $k_0$ given by
\[
\Lambda(r^0, r) = \exp\Big( \int_{-\infty}^{\infty} (O_{r^0} - O_r)(x)\,\varepsilon^{-1}\,dW(x)
\;-\; \frac{1}{2}\int_{-\infty}^{\infty} \big|(O_{r^0} - O_r)(x)\big|^2\,\varepsilon^{-2}\,dx \Big).
\]
Hence, the Kullback-Leibler divergence (relative entropy) between the two
observation models equals
\[
\mathrm{KL}(\mathcal{T}_{r^0}\,|\,\mathcal{T}_r) = \frac{1}{2}\int_{-\infty}^{\infty} \big|(O_{r^0} - O_r)(x)\big|^2\,\varepsilon^{-2}\,dx.
\]
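This identity can be read off from the likelihood ratio (a short sketch, assuming the idealised white noise observation $dY(x) = O_r(x)\,dx + \varepsilon\,dW(x)$ under $\mathcal{T}_r$ that underlies the formula for $\Lambda$; the notation $Y$ and $\widetilde W$ is introduced only here): taking expectations of $\log\Lambda(r^0,r)$ under the law of $\mathcal{T}_{r^0}$, where $\varepsilon\,dW(x) = (O_{r^0}-O_r)(x)\,dx + \varepsilon\,d\widetilde W(x)$ with a Brownian motion $\widetilde W$, gives
\[
\mathrm{KL}(\mathcal{T}_{r^0}\,|\,\mathcal{T}_r) = \mathbf{E}_{r^0}\big[\log\Lambda(r^0,r)\big]
= \varepsilon^{-2}\int_{-\infty}^{\infty}\big|(O_{r^0}-O_r)(x)\big|^2\,dx
- \frac{1}{2}\,\varepsilon^{-2}\int_{-\infty}^{\infty}\big|(O_{r^0}-O_r)(x)\big|^2\,dx,
\]
which is the stated expression.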
The standard Assouad Lemma (Korostelev and Tsybakov 1993, Thm. 2.6.4) now yields the lower bound for the risk of any estimator $\hat\mu$ of $\mu$
\[
\inf_{\hat\mu}\ \sup_{\mathcal{T}=(0,\gamma,\mu)\in\mathcal{G}_s(R,0)} \mathbf{E}_{\mathcal{T}}\Big[ \int \big|\hat\mu(x) - \mu(x)\big|^2\,dx \Big]
\;\gtrsim\; 2^j\, \big\|\mu_r - \mu_{r^0}\big\|_{L^2}^2 \;\sim\; 2^{-2js},
\]
provided the Kullback-Leibler divergence $\mathrm{KL}(\mathcal{T}_{r^0}\,|\,\mathcal{T}_r)$ stays uniformly bounded by a small constant. It remains to determine a minimal rate for $2^j \to \infty$ such that this holds when the noise level $\varepsilon$ tends to zero.
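The order of the right-hand side follows from the fact that $r$ and $r^0$ differ in exactly one coordinate $k_0$ and from the normalisation $\|\psi_{jk_0}\|_{L^2} = 1$:
\[
\big\|\mu_r - \mu_{r^0}\big\|_{L^2}^2 = \beta^2\, 2^{-j(2s+1)}\,\big|r_{k_0} - r^0_{k_0}\big|^2 = 4\beta^2\, 2^{-j(2s+1)},
\qquad\text{hence}\qquad
2^j\,\big\|\mu_r - \mu_{r^0}\big\|_{L^2}^2 = 4\beta^2\, 2^{-2js}.
\]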
Arguing in the spectral domain and using the general estimate $|e^z - 1| \le 2|z|$ for $|z| \le \delta$ and some small $\delta > 0$, together with $\|\varphi_{T,r^0}/\varphi_{T,r}\|_\infty \to 1$ for $2^j \to \infty$, we obtain for all sufficiently large $j$
\[
\mathrm{KL}(\mathcal{T}_{r^0}\,|\,\mathcal{T}_r)
= \frac{1}{4\pi\varepsilon^2}\int_{-\infty}^{\infty} \big|\mathcal{F}(O_{r^0} - O_r)(u)\big|^2\,du
\;\le\; \varepsilon^{-2}\int_{-\infty}^{\infty} \Big|\frac{\varphi_{T,r}(u-i) - \varphi_{T,r^0}(u-i)}{u(u-i)}\Big|^2\,du.
\]
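A brief remark on the two steps in this display (assuming the Fourier convention $\mathcal{F}f(u) = \int f(x)e^{iux}\,dx$, under which Plancherel's identity reads $\int |f(x)|^2\,dx = (2\pi)^{-1}\int |\mathcal{F}f(u)|^2\,du$): the equality is Plancherel applied to the formula for $\mathrm{KL}(\mathcal{T}_{r^0}\,|\,\mathcal{T}_r)$ above, while the inequality inserts the Fourier-domain representation of the option function, which here gives
\[
\mathcal{F}(O_{r^0} - O_r)(u) = \frac{\varphi_{T,r}(u-i) - \varphi_{T,r^0}(u-i)}{u(u-i)},
\]
and then simply bounds the prefactor $(4\pi)^{-1}$ by $1$; the difference of the option functions is thus controlled by the difference of the characteristic functions of the two perturbed triplets.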