Statistical Translation, Heat Kernels and Expected Distances

06/20/2012
by Joshua Dillon et al.

High-dimensional structured data such as text and images is often poorly understood and misrepresented in statistical modeling. The standard histogram representation suffers from high variance and performs poorly in general. We explore novel connections between statistical translation, heat kernels on manifolds and graphs, and expected distances. These connections provide a new framework for unsupervised metric learning for text documents. Experiments indicate that the resulting distances are generally superior to their more standard counterparts.
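To make the heat-kernel ingredient concrete, here is a minimal sketch (not the paper's method, and all names are illustrative assumptions): the heat kernel on a small graph is the matrix exponential of the graph Laplacian, and its rows induce a diffusion distance between nodes.

```python
# Minimal sketch, assuming a combinatorial graph Laplacian; not the
# authors' construction, just a standard graph heat kernel.
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of a 4-node path graph: 0 - 1 - 2 - 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A   # graph Laplacian L = D - A
t = 0.5                          # diffusion time
K = expm(-t * L)                 # heat kernel K_t = exp(-t L)

def diffusion_distance(i, j):
    """Euclidean distance between the heat-kernel rows of nodes i and j."""
    return np.linalg.norm(K[i] - K[j])

# Nodes that are close on the graph end up close in diffusion distance.
print(diffusion_distance(0, 1) < diffusion_distance(0, 3))
```

The kernel `K` is symmetric and positive, and the distance shrinks as diffusion time `t` grows, which is the smoothing behavior that motivates heat-kernel distances over raw histogram comparisons.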
