While language models are powerful and versatile, they often fail to add...
Contrastive learning usually compares one positive anchor sample with lo...
Semi-supervised learning has achieved notable success by leveraging very...
We abstract the features (i.e. learned representations) of multi-modal d...
Contrastive learning is a powerful self-supervised learning method, but ...
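The two abstracts above mention contrastive learning only in passing, and both are truncated, so their specific methods are unknown. As generic background (not the approach of either paper), the standard InfoNCE-style objective they build on can be sketched for a single anchor in plain NumPy; all names here (`info_nce_loss`, the temperature value) are illustrative assumptions:

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss for one anchor embedding.

    anchor, positive: 1-D unit-norm embeddings.
    negatives: 2-D array with one unit-norm embedding per row.
    """
    pos_sim = anchor @ positive / temperature
    neg_sims = negatives @ anchor / temperature
    logits = np.concatenate([[pos_sim], neg_sims])
    # Cross-entropy of identifying the positive among all candidates:
    # -log( exp(pos) / sum_j exp(logit_j) )
    return -pos_sim + np.log(np.exp(logits).sum())

def normalize(v):
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
a = normalize(rng.normal(size=8))
p = normalize(a + 0.1 * rng.normal(size=8))                        # positive: near the anchor
n = np.stack([normalize(rng.normal(size=8)) for _ in range(16)])   # random negatives
print(f"loss = {info_nce_loss(a, p, n):.3f}")
```

Because the log-sum-exp over all candidates always dominates the positive logit, the loss is non-negative, and it shrinks as the positive pair becomes more similar than the negatives.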
Can machines think? Since Alan Turing asked this question in 1950, nobod...
Foundation models like ChatGPT have demonstrated remarkable performance ...
With infinitely many high-quality data points, infinite computational po...
For many interdisciplinary fields, ML interpretations need to be consist...
Deep neural networks are known to be vulnerable to unseen data: they may...
Generalization is one of the critical issues in machine learning. Howeve...
It is challenging to deal with censored data, where we only have access ...
In the classical multi-party computation setting, multiple parties joint...
Due to its strong interpretability, linear regression is widely used in ...
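The abstract above is cut off, so its particular setting is unknown. As a generic illustration of the interpretability it refers to, an ordinary least-squares fit recovers one coefficient per feature, each readable as that feature's estimated effect; the data below is synthetic and purely illustrative:

```python
import numpy as np

# Synthetic regression problem: 3 features, known true coefficients.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

# Solve the least-squares problem min_w ||Xw - y||^2.
# Each entry of w_hat is the estimated effect of one feature,
# holding the others fixed -- the source of the interpretability.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w_hat, 2))
```

With this low noise level, the recovered coefficients land very close to the true values `[2, -1, 0.5]`.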
In the big data era, many organizations face the dilemma of data sharing...
We introduce a "learning-based" algorithm for the low-rank decomposition...
This concept paper outlines some recent efforts toward the design and de...
Strong theoretical guarantees of robustness can be given for ensembles o...
Despite the non-convex nature of their loss functions, deep neural netwo...
Evaluating generative adversarial networks (GANs) is inherently challeng...
Stochastic gradient descent (SGD) is widely used in machine learning. Al...
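The SGD abstract above is truncated, so the specific variant it analyzes is unknown. As background only, a minimal sketch of plain SGD on a least-squares objective, sampling one example per step, looks like this (learning rate and step count are arbitrary illustrative choices):

```python
import numpy as np

# Plain SGD on f(w) = (1/2n) * ||Xw - y||^2, one sampled example per step.
# A generic sketch, not the specific method of the abstract above.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ w_true                         # noiseless targets for clarity

w = np.zeros(4)
lr = 0.05
for step in range(5000):
    i = rng.integers(len(X))           # sample one example uniformly
    grad = (X[i] @ w - y[i]) * X[i]    # gradient of the single-example loss
    w -= lr * grad

print(np.round(w, 2))
```

In this noiseless, interpolating setting the iterates converge to the true coefficients; with label noise, a constant step size would instead leave a noise floor around the optimum.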
We give a simple, fast algorithm for hyperparameter optimization inspire...
The amount of data available in the world is growing faster than our abi...
Accelerated coordinate descent is widely used in optimization due to its...
Many classical algorithms are found until several years later to outlive...
We analyze stochastic gradient descent for optimizing non-convex functio...