A general white noise test based on kernel lag-window estimates of the spectral density operator
We propose a general white noise test for functional time series based on estimating a distance between the spectral density operator of a weakly stationary time series and the constant spectral density operator of an uncorrelated time series. The estimator that we propose is based on a kernel lag-window-type estimator of the spectral density operator. When the observed time series is a strong white noise in a real separable Hilbert space, we show that the asymptotic distribution of the test statistic is standard normal, and we further show that the test statistic diverges for general serially correlated time series. These results recover, as a special case, those of Hong (1996) in the setting of scalar time series. In order to implement the test, we propose and study a number of kernel and bandwidth choices, including a new data-adaptive bandwidth for such estimators. A simulation study demonstrates that the proposed method has good size and improved power compared with other methods available in the literature, while also imposing only a light computational burden.
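As a rough illustration of the type of statistic involved, the following sketch forms a Hong (1996)-style standardized sum of kernel-weighted squared Hilbert–Schmidt norms of empirical autocovariance operators for a functional time series observed on a grid. The choice of the Bartlett lag window, the normalization by the lag-zero norm, and the centering and scaling constants are illustrative assumptions modeled on the scalar case, not the paper's exact construction.

```python
import numpy as np

def bartlett_kernel(u):
    """Bartlett (triangular) lag window: k(u) = 1 - |u| for |u| <= 1, else 0."""
    return np.where(np.abs(u) <= 1.0, 1.0 - np.abs(u), 0.0)

def autocov_hs_norms(X, max_lag):
    """Squared Hilbert-Schmidt norms of empirical autocovariance operators.

    X : (n, d) array holding a functional time series evaluated on d grid
        points (a discretized approximation of curves in L^2[0, 1]).
    Returns ||gamma_hat_j||_HS^2 for j = 0, ..., max_lag.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)                      # centre the series
    norms = np.empty(max_lag + 1)
    for j in range(max_lag + 1):
        # empirical lag-j autocovariance kernel gamma_hat_j(s, t) on the grid
        gamma_j = Xc[j:].T @ Xc[:n - j] / n
        norms[j] = np.sum(gamma_j ** 2) / d**2   # Riemann approximation of HS norm
    return norms

def white_noise_statistic(X, bandwidth, kernel=bartlett_kernel):
    """Illustrative standardized white noise statistic for functional data.

    The kernel-weighted sum of squared autocovariance norms estimates a
    distance between the spectral density operator and that of an
    uncorrelated series; the normalization and the centering/scaling terms
    below mirror the scalar Hong (1996) form and are placeholders for the
    paper's exact constants.
    """
    n = X.shape[0]
    norms = autocov_hs_norms(X, max_lag=n - 1)
    j = np.arange(1, n)
    w2 = kernel(j / bandwidth) ** 2
    # weighted distance estimate, normalized by the lag-zero norm (scalar analogue)
    S = n * np.sum(w2 * norms[1:]) / norms[0]
    # Hong-style centering and scaling constants (scalar-case form, used as a sketch)
    C = np.sum((1 - j / n) * w2)
    D = np.sum((1 - j[:-1] / n) * (1 - (j[:-1] + 1) / n)
               * kernel(j[:-1] / bandwidth) ** 4)
    return (S - C) / np.sqrt(2 * D)

# Under a strong white noise the statistic should be approximately N(0, 1):
rng = np.random.default_rng(0)
Z = rng.standard_normal((200, 25))               # 200 iid "curves" on a 25-point grid
print(white_noise_statistic(Z, bandwidth=200 ** 0.2))
```

The bandwidth here is a fixed illustrative choice growing slowly with the sample size; the data-adaptive bandwidth proposed in the paper would replace it in practice.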