Indexed in:
Abstract:
A novel metric, called kernel-based conditional mean dependence (KCMD), is proposed to measure and test the departure from conditional mean independence between a response variable Y and a predictor variable X, based on the reproducing kernel embedding and the Hilbert-Schmidt norm of a tensor operator. The KCMD has several appealing merits. It equals zero if and only if the conditional mean of Y given X does not depend on X, i.e. E(Y | X) = E(Y) almost surely, provided that the employed kernel is characteristic; it can detect all kinds of conditional mean dependence with an appropriate choice of kernel; and it has a simple expectation form that admits an unbiased empirical estimator. A class of test statistics based on the estimated KCMD is constructed, and a wild bootstrap test procedure for conditional mean independence is presented. The limit distributions of the test statistics and the bootstrapped statistics are derived under the null hypothesis, fixed alternative hypotheses and local alternative hypotheses, respectively, and a data-driven procedure for choosing a suitable kernel is suggested. Simulation studies indicate that the tests based on the KCMD have power comparable to tests based on martingale difference divergence under monotone dependence, but excel when the relationship is nonlinear or when the moment restriction on X is violated. Two real data examples illustrate the proposed method. (C) 2021 Elsevier B.V. All rights reserved.
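Illustrative sketch (not part of the indexed record): the abstract does not reproduce the KCMD formula or the test algorithm, so the Python code below is only a hedged sketch of the general idea, in the spirit of a kernel analogue of martingale difference divergence. The Gaussian kernel, the fixed bandwidth, the plug-in V-statistic form, the Rademacher wild-bootstrap weights, and all function names (gaussian_kernel, kcmd_vstat, wild_bootstrap_pvalue) are assumptions made for illustration; they are not the paper's exact KCMD, its unbiased estimator, or its data-driven kernel choice.

import numpy as np

def gaussian_kernel(X, bandwidth=1.0):
    # Gaussian kernel Gram matrix on the predictor sample X of shape (n, p).
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * bandwidth**2))

def kcmd_vstat(Y, X, bandwidth=1.0):
    # Plug-in (V-statistic) estimate of a KCMD-type quantity:
    # average of <Y_i - Ybar, Y_j - Ybar> * k(X_i, X_j) over all pairs.
    # Illustrative form only; the paper's unbiased estimator may differ.
    Y2 = np.atleast_2d(np.asarray(Y).T).T          # ensure shape (n, q)
    Yc = Y2 - Y2.mean(axis=0, keepdims=True)       # centered responses
    K = gaussian_kernel(X, bandwidth)
    return np.mean((Yc @ Yc.T) * K)

def wild_bootstrap_pvalue(Y, X, bandwidth=1.0, B=499, rng=None):
    # Wild bootstrap calibration (illustrative): multiply centered responses
    # by Rademacher weights and recompute the scaled statistic.
    rng = np.random.default_rng(rng)
    n = len(X)
    stat = n * kcmd_vstat(Y, X, bandwidth)
    Y2 = np.atleast_2d(np.asarray(Y).T).T
    Yc = Y2 - Y2.mean(axis=0, keepdims=True)
    K = gaussian_kernel(X, bandwidth)
    boot = np.empty(B)
    for b in range(B):
        w = rng.choice([-1.0, 1.0], size=n)
        Yw = Yc * w[:, None]
        boot[b] = n * np.mean((Yw @ Yw.T) * K)
    return (1 + np.sum(boot >= stat)) / (B + 1)

# Usage example: Y depends on X through its conditional mean (nonlinearly),
# so a small p-value is expected.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=200)
print(wild_bootstrap_pvalue(Y, X, bandwidth=1.0, B=199, rng=1))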
Keywords:
Corresponding author:
Email address:
Source:
COMPUTATIONAL STATISTICS & DATA ANALYSIS
ISSN: 0167-9473
Year: 2021
Volume: 160
Impact Factor: 1.800 (JCR@2022)
ESI Field: MATHEMATICS
ESI Highly Cited Threshold: 31
JCR Quartile: Q2
Affiliated department: