Paper title
Are deviations in a gradually varying mean relevant? A testing approach based on sup-norm estimators
Paper authors
Paper abstract
Classical change point analysis aims at (1) detecting abrupt changes in the mean of a possibly non-stationary time series and at (2) identifying regions where the mean exhibits a piecewise constant behavior. In many applications, however, it is more reasonable to assume that the mean changes gradually in a smooth way. Those gradual changes may either be non-relevant (i.e., small) or relevant for a specific problem at hand, and the present paper presents statistical methodology to detect the latter. More precisely, we consider the common nonparametric regression model $X_{i} = \mu(i/n) + \varepsilon_{i}$ with possibly non-stationary errors and propose a test for the null hypothesis that the maximum absolute deviation of the regression function $\mu$ from a functional $g(\mu)$ (such as the value $\mu(0)$ or the integral $\int_{0}^{1} \mu(t)\, dt$) is smaller than a given threshold on a given interval $[x_{0},x_{1}] \subseteq [0,1]$. A test for hypotheses of this type is developed using an appropriate estimator, say $\hat d_{\infty, n}$, for the maximum deviation $d_{\infty} = \sup_{t \in [x_{0},x_{1}]} |\mu(t) - g(\mu)|$. We derive the limiting distribution of an appropriately standardized version of $\hat d_{\infty,n}$, where the standardization depends on the Lebesgue measure of the set of extremal points of the function $\mu(\cdot) - g(\mu)$. A refined procedure based on an estimate of this set is developed and its consistency is proved. The results are illustrated by means of a simulation study and a data example.
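To make the quantities in the abstract concrete, the following is a minimal sketch of a plug-in sup-norm estimator $\hat d_{\infty, n}$ in a simulated instance of the model. It assumes a Nadaraya–Watson smoother for $\mu$, i.i.d. Gaussian errors, and $g(\mu) = \int_{0}^{1} \mu(t)\, dt$; the mean function, bandwidth, and interval $[x_0, x_1]$ are hypothetical choices, and the paper's actual estimator and error structure may differ.

```python
import numpy as np

def nw_smoother(grid, t, x, h):
    """Nadaraya-Watson estimate of mu on `grid` (Gaussian kernel, illustrative)."""
    w = np.exp(-0.5 * ((t[None, :] - grid[:, None]) / h) ** 2)
    return (w @ x) / w.sum(axis=1)

rng = np.random.default_rng(0)
n = 500
t = np.arange(1, n + 1) / n                     # design points i/n
mu = lambda s: 0.3 * np.sin(2 * np.pi * s)      # hypothetical gradually varying mean
x = mu(t) + 0.1 * rng.standard_normal(n)        # X_i = mu(i/n) + eps_i (i.i.d. errors here)

# Interval [x0, x1] = [0.1, 0.9] (hypothetical)
grid = np.linspace(0.1, 0.9, 200)
mu_hat = nw_smoother(grid, t, x, h=0.05)
g_hat = x.mean()                                # plug-in estimate of the integral of mu
d_hat = np.max(np.abs(mu_hat - g_hat))          # sup-norm estimator d-hat_{infty,n}

# The relevant-deviation null H0: d_infty <= Delta would be rejected
# for d_hat exceeding a calibrated quantile; here Delta is a user threshold.
print(round(d_hat, 3))
```

For this mean function the true value is $d_{\infty} = 0.3$ (the sine attains its maximum inside the interval and integrates to zero), so the printed estimate should be close to 0.3; the paper's contribution is the distribution theory needed to turn such an estimate into a valid test.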