Text-to-image generation (TTI) refers to the use of models that could...
Continual learning (CL) trains NN models incrementally from a continuous...
ChatGPT-like models have revolutionized various applications in artifici...
Federated Learning (FL) is a distributed learning paradigm that empowers...
Recent advances in deep learning models come at the price of formidable...
Large-scale transformer models have become the de-facto architectures fo...
The past several years have witnessed the success of transformer-based m...
How to efficiently serve ever-larger trained natural language models in...
Extreme compression, particularly ultra-low bit precision (binary/ternar...
DNN models across many domains continue to grow in size, resulting in hi...
Deep Learning (DL) models have achieved superior performance. Meanwhile,...
Nearest Neighbor Search (NNS) has recently drawn a rapid increase of int...
In recent years, large pre-trained Transformer-based language models hav...
As the training of giant dense models hits the boundary on the availabil...
Deep Learning (DL) models have achieved superior performance in many app...
Continual Learning (CL) is an emerging machine learning paradigm that ai...
Recent works have demonstrated great success in training high-capacity a...
Graph-based algorithms have shown great empirical potential for the appr...
Large-scale model training has been a playing ground for a limited few r...
Recently, Transformer-based language models have demonstrated remarkable...
The effectiveness of LSTM neural networks for popular tasks such as Auto...
Software-managed heterogeneous memory (HM) provides a promising solution...
With the advancement of machine learning and deep learning, vector searc...
Neural language models (NLMs) have recently gained a renewed interest by...