


4.4 Tilted normal alternative

Let h(x) be a positive-valued, scalar function satisfying h(0) = 1 and dh(0)/dx = 1, and consider the tilted normal density

$$
f(y;\beta,\sigma,\kappa,\lambda) = \frac{1}{\sigma\, q(\kappa,\lambda)}\,\varphi(u)\, h\!\left(\frac{\kappa}{6}\left(u^{3}-3u\right) + \frac{\lambda}{24}\left(u^{4}-6u^{2}+3\right)\right),
$$

where u = (Y − β)/σ and

$$
q(\kappa,\lambda) = \mathrm{E}_{\Phi}\!\left[ h\!\left(\frac{\kappa}{6}\left(u^{3}-3u\right) + \frac{\lambda}{24}\left(u^{4}-6u^{2}+3\right)\right)\right],
$$

which is assumed to exist. Taking h(x) = |x + 1| yields a density comparable to the first two terms of an Edgeworth expansion. Let F_{κ,λ} be the distribution corresponding to f(y; 0, 1, κ, λ). Then, as κ, λ → 0, the first four moments of F_{κ,λ} are (see Appendix C.3)

$$
\begin{aligned}
\mathrm{E}_{F_{\kappa,\lambda}}(Y) &= 0 + o(\kappa,\lambda),\\
\mathrm{E}_{F_{\kappa,\lambda}}(Y^{2}) &= 1 + o(\kappa,\lambda), \qquad (15)\\
\mathrm{E}_{F_{\kappa,\lambda}}(Y^{3}) &= \kappa + o(\kappa,\lambda),\\
\mathrm{E}_{F_{\kappa,\lambda}}(Y^{4}) &= 3 + \lambda + o(\kappa,\lambda),
\end{aligned}
$$

from which κ and λ have an interpretation as skewness and (excess-)kurtosis parameters.
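As a rough numerical check of (15), the following sketch (our own illustration, not part of the paper; `tilt` and `tilted_normal_pdf` are hypothetical helper names) evaluates f(y; 0, 1, κ, λ) with h(x) = |x + 1| by quadrature and compares its first four moments with 0, 1, κ and 3 + λ for small κ and λ:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def tilt(u, kappa, lam):
    # Argument of h(.): (kappa/6)(u^3 - 3u) + (lam/24)(u^4 - 6u^2 + 3)
    return kappa / 6.0 * (u**3 - 3.0 * u) + lam / 24.0 * (u**4 - 6.0 * u**2 + 3.0)

def tilted_normal_pdf(kappa, lam):
    """Return f(y; 0, 1, kappa, lam) with h(x) = |x + 1| as a callable density."""
    # q(kappa, lam) = E_Phi[h(.)], computed once by quadrature
    q, _ = quad(lambda u: norm.pdf(u) * abs(1.0 + tilt(u, kappa, lam)), -np.inf, np.inf)
    return lambda y: norm.pdf(y) * abs(1.0 + tilt(y, kappa, lam)) / q

kappa, lam = 0.1, 0.2                      # small skewness / excess-kurtosis values
pdf = tilted_normal_pdf(kappa, lam)
for r, approx in [(1, 0.0), (2, 1.0), (3, kappa), (4, 3.0 + lam)]:
    m, _ = quad(lambda y: y**r * pdf(y), -np.inf, np.inf)
    print(f"E(Y^{r}) = {m:.4f}   (15) predicts ~ {approx:.4f}")
```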
Our interest in this distribution lies in the fact that the score test for κ = λ = 0 is in fact the Jarque-Bera test. Thus, under a sequence of local alternatives

$$
H_{n}: Y \sim F_{n} = F_{\kappa,\lambda},
$$

with κ = k/√n and λ = l/√n, the IM test with ML estimator is optimal.
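For concreteness, the Jarque-Bera statistic referred to above can be written in its familiar form n(S²/6 + (K − 3)²/24), with S and K the sample skewness and kurtosis. The sketch below (our own illustration, using ML-type standardisation rather than the paper's robust estimators) computes it for a simulated sample:

```python
import numpy as np
from scipy.stats import chi2

def jarque_bera(y):
    """Jarque-Bera statistic, i.e. the score test of kappa = lambda = 0."""
    y = np.asarray(y, dtype=float)
    n = y.size
    u = (y - y.mean()) / y.std()        # standardised observations (ML scale)
    S = np.mean(u**3)                    # sample skewness
    K = np.mean(u**4)                    # sample kurtosis
    return n * (S**2 / 6.0 + (K - 3.0)**2 / 24.0)

rng = np.random.default_rng(0)
y = rng.standard_normal(500)             # data generated under the null
jb = jarque_bera(y)
print(f"JB = {jb:.3f}, 5% critical value chi2(2) = {chi2.ppf(0.95, 2):.3f}")
```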

We show in Appendix C.3 that, under H_n,

$$
T \rightarrow_{d} \chi^{2}_{q}(\delta),
$$

with non-centrality parameter

$$
\delta = \frac{k^{2}}{6} + \frac{l^{2}}{24},
$$

for all M-estimators of scale.
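To see what this non-centrality implies for local power, the following sketch (our own; the degrees of freedom q are set to 2 purely for illustration and should be taken from the definition of T in the paper) evaluates the asymptotic power of a 5%-level test under the non-central chi-square limit for a few values of k and l:

```python
from scipy.stats import chi2, ncx2

q = 2                        # illustrative degrees of freedom for T (assumption)
crit = chi2.ppf(0.95, q)     # 5% critical value under the null

for k, l in [(1.0, 0.0), (0.0, 2.0), (1.0, 2.0), (2.0, 3.0)]:
    delta = k**2 / 6.0 + l**2 / 24.0      # non-centrality under H_n
    power = ncx2.sf(crit, q, delta)        # asymptotic local power
    print(f"k = {k:.1f}, l = {l:.1f}: delta = {delta:.3f}, power = {power:.3f}")
```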
