Cholesky-based multivariate Gaussian regression

02/26/2021
by Thomas Muschinski, et al.

Multivariate Gaussian regression is embedded into a general distributional regression framework using flexible additive predictors that determine all distributional parameters. While this is relatively straightforward for the means of the multivariate dependent variable, it is more challenging for the full covariance matrix Σ due to two main difficulties: (i) ensuring the positive-definiteness of Σ and (ii) regularizing the high model complexity. Both challenges are addressed by adopting a parameterization of Σ based on its basic or modified Cholesky decomposition, respectively. Unlike a decomposition into variances and a correlation matrix, the Cholesky decomposition guarantees positive-definiteness for any predictor values, regardless of the distributional dimension. All distributional parameters can therefore be linked to flexible predictors without joint constraints that would substantially complicate other parameterizations. Moreover, this approach enables regularization of the flexible additive predictors through penalized maximum likelihood or Bayesian estimation, as for other distributional regression models. Finally, the Cholesky decomposition allows the number of parameters to be reduced when the components of the multivariate dependent variable have a natural order (typically time) and a maximum lag can be assumed for the dependencies among the components.
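To illustrate why the Cholesky parameterization avoids joint constraints, the following is a minimal sketch (not the authors' implementation): any real-valued parameter vector is mapped to a lower-triangular factor L, with the diagonal made strictly positive via an exponential link, so that Σ = LLᵀ is guaranteed to be positive-definite. The function name `sigma_from_unconstrained` and the exponential link are illustrative assumptions.

```python
import numpy as np

def sigma_from_unconstrained(theta, d):
    """Map an unconstrained parameter vector theta (length d*(d+1)/2)
    to a positive-definite covariance matrix via a Cholesky factor.
    Diagonal entries of L use exp() to ensure strict positivity;
    off-diagonal entries are left unconstrained (illustrative choice)."""
    L = np.zeros((d, d))
    idx = 0
    for i in range(d):
        for j in range(i + 1):
            if i == j:
                L[i, j] = np.exp(theta[idx])  # positive diagonal
            else:
                L[i, j] = theta[idx]          # unconstrained off-diagonal
            idx += 1
    return L @ L.T  # positive-definite by construction

# Any real-valued theta yields a valid covariance matrix,
# so predictors for these parameters need no joint constraints.
theta = np.array([0.3, -0.5, 0.1, 0.8, -0.2, 0.4])  # d = 3 needs 6 parameters
Sigma = sigma_from_unconstrained(theta, 3)
```

Because the map is unconstrained, each entry of θ can itself be modeled by a flexible additive predictor, which is the key point the abstract makes.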
