Structural-Factor Modeling of High-Dimensional Time Series: Another Look at Approximate Factor Models with Diverging Eigenvalues

08/23/2018
by   Zhaoxing Gao, et al.

This article proposes a new approach to modeling high-dimensional time series data by providing a simple and natural way to understand the mechanism of factor models. We treat a p-dimensional time series as a nonsingular linear transformation of certain common factors and structured idiosyncratic components. Unlike approximate factor models, we allow the largest eigenvalues of the covariance matrix of the idiosyncratic components to diverge as the dimension p increases, which is reasonable in the high-dimensional setting. A white noise testing procedure for high-dimensional random vectors is proposed to determine the number of common factors under the assumption that the idiosyncratic term is a vector white noise. We also introduce a projected Principal Component Analysis (PCA) to eliminate the diverging effect of the noise. Asymptotic properties of the proposed method are established for both fixed p and diverging p as the sample size n tends to infinity. Both simulated and real examples are used to assess the performance of the proposed method. We also compare our method with two commonly used methods in the literature and find that the proposed approach not only provides interpretable results, but also performs well in out-of-sample forecasting.
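To make the factor-model setup concrete, the minimal sketch below extracts common factors from a p-dimensional series by ordinary PCA on the sample covariance matrix. This is an illustrative assumption-based stand-in, not the paper's projected PCA or its white noise test for the number of factors; the function `estimate_factors`, the simulated data, and the chosen dimensions are hypothetical.

```python
# Illustrative sketch only: generic "factors = leading principal directions"
# idea, assuming the standard model X_t = A f_t + e_t with r common factors.
import numpy as np

def estimate_factors(X, r):
    """X: (n, p) array of observations; r: assumed number of common factors.

    Returns (loadings, factors) with X approximately factors @ loadings.T + noise.
    """
    Xc = X - X.mean(axis=0, keepdims=True)        # center each series
    S = (Xc.T @ Xc) / Xc.shape[0]                 # sample covariance (p x p)
    eigvals, eigvecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    loadings = eigvecs[:, -r:][:, ::-1]           # r leading eigenvectors (descending)
    factors = Xc @ loadings                       # projected factor estimates
    return loadings, factors

# Usage: a toy 2-factor model with white-noise idiosyncratic errors.
rng = np.random.default_rng(0)
n, p, r = 500, 20, 2
F = 0.1 * rng.standard_normal((n, r)).cumsum(axis=0)   # persistent common factors
A = rng.standard_normal((p, r))                          # loading matrix
X = F @ A.T + rng.standard_normal((n, p))                # observed p-dimensional series
loadings, factors = estimate_factors(X, r)
print(factors.shape)   # (500, 2)
```

In the paper's setting the number of factors r would itself be estimated, e.g. via the proposed white noise test on the residual term, rather than fixed in advance as here.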

