On the Computation and Applications of Large Dense Partial Correlation Networks

03/17/2019
by   Keith Dillon, et al.

While sparse inverse covariance matrices are very popular for modeling network connectivity, the value of the dense solution is often overlooked. In fact, the L2-regularized solution has deep connections to a number of important applications in spectral graph theory, dimensionality reduction, and uncertainty quantification. We derive an approach to directly compute the partial correlations based on concepts from inverse problem theory. This approach also leads to new insights on open problems such as model selection and data preprocessing, as well as new approaches that relate these application areas.
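For context, a minimal sketch of the standard relation the abstract builds on: partial correlations can be read off an inverse covariance (precision) matrix, here stabilized with an L2 (ridge) term. This is not the paper's algorithm; the regularization weight `lam` and the ridge form are illustrative assumptions.

```python
import numpy as np

def partial_correlations(X, lam=0.1):
    """Partial correlations from a ridge-regularized inverse covariance.

    X   : (n_samples, n_variables) data matrix
    lam : L2 regularization weight (illustrative assumption, not from the paper)
    """
    Xc = X - X.mean(axis=0)                                # center the data
    S = Xc.T @ Xc / Xc.shape[0]                            # sample covariance
    Theta = np.linalg.inv(S + lam * np.eye(S.shape[0]))    # regularized precision matrix
    d = np.sqrt(np.diag(Theta))
    P = -Theta / np.outer(d, d)                            # rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj)
    np.fill_diagonal(P, 1.0)                               # convention: unit diagonal
    return P
```

With `lam > 0` the precision matrix stays dense, which matches the paper's emphasis on the dense (rather than sparsified) partial correlation network.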


Related research

11/04/2022 · Multi-output Gaussian processes for inverse uncertainty quantification in neutron noise analysis
In a fissile material, the inherent multiplicity of neutrons born throug...

05/22/2023 · Cycle Consistency-based Uncertainty Quantification of Neural Networks in Inverse Imaging Problems
Uncertainty estimation is critical for numerous applications of deep neu...

11/26/2021 · A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
Hierarchical models with gamma hyperpriors provide a flexible, sparse-pr...

06/20/2020 · Estimating Model Uncertainty of Neural Networks in Sparse Information Form
We present a sparse representation of model uncertainty for Deep Neural ...

10/05/2019 · Covariance-free Partial Least Squares: An Incremental Dimensionality Reduction Method
Dimensionality reduction plays an important role in computer vision prob...

04/20/2019 · Partial Correlations in Compositional Data Analysis
Partial correlations quantify linear association between two variables a...
