Dataset distillation methods aim to compress a large dataset into a smal...
Pretrained large language models (LLMs) are strong in-context learners t...
Dataset condensation is an emerging technique that aims to learn a ...
Many optimizers have been proposed for training deep neural networks, an...
There has been increasing interest in the field in developing automatic mac...
In this paper, we propose a general framework for data poisoning attack...
We study the robustness verification problem for tree-based models, incl...
Neural Ordinary Differential Equation (Neural ODE) has been proposed as ...
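For context on the Neural ODE formulation: the defining idea is to let a hidden state evolve according to dh/dt = f(h, t; θ), where f is a neural network, and to obtain the output by integrating the dynamics. The sketch below is illustrative only: it uses a tiny random-weight MLP as f and a fixed-step Euler solver, whereas practical Neural ODEs use adaptive solvers and the adjoint method for gradients.

```python
import numpy as np

def f(h, t, params):
    """Dynamics network f(h, t; theta): a tiny one-layer MLP with
    random weights, purely for illustration."""
    W, b = params
    return np.tanh(h @ W + b)

def odeint_euler(h0, params, t0=0.0, t1=1.0, steps=100):
    """Integrate dh/dt = f(h, t; theta) from t0 to t1 with fixed-step
    Euler. This only illustrates the forward pass of a Neural ODE."""
    h, t = h0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h, t, params)
        t += dt
    return h

rng = np.random.default_rng(0)
dim = 4
params = (0.1 * rng.standard_normal((dim, dim)), np.zeros(dim))
h0 = rng.standard_normal(dim)
h1 = odeint_euler(h0, params)  # hidden state at t = 1
```

Because the solver replaces a fixed stack of layers, depth becomes a continuous quantity controlled by the integration interval and step size.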
Graph convolutional network (GCN) has been successfully applied to many ...
The concept of conditional computation for deep nets has been proposed p...
Neural language models have been widely used in various NLP tasks, inclu...
Existing attention mechanisms are mostly item-based in that a model is ...
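As background on what "item-based" attention means: the standard formulation scores each individual item against a query, normalizes the scores with a softmax, and returns a weighted sum of the item vectors. A minimal numpy sketch of this standard scheme (not the proposal of the abstract above):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def item_attention(query, items):
    """Item-based (scaled dot-product) attention: score every item
    against the query, softmax-normalize, and return the weighted
    sum of item vectors."""
    scores = items @ query / np.sqrt(query.shape[-1])  # one score per item
    weights = softmax(scores)
    return weights @ items, weights

rng = np.random.default_rng(0)
items = rng.standard_normal((5, 8))   # 5 items of dimension 8
query = rng.standard_normal(8)
context, w = item_attention(query, items)
```

The key property is that every attention weight is attached to a single item, which is what alternatives such as area- or span-based attention relax.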
Model compression is essential for serving large deep neural nets on dev...
Use of nonlinear feature maps via kernel approximation has led to succes...
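One widely used kernel-approximation scheme that yields such nonlinear feature maps is random Fourier features, which approximates the RBF kernel with an explicit randomized map so that inner products of features approximate kernel evaluations. The sketch below illustrates that general idea, not necessarily the specific approximation studied above; the function name and parameters are our own.

```python
import numpy as np

def rff_features(X, n_features=2000, gamma=1.0, seed=0):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2), so that z(x)^T z(y) ~ k(x, y)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the RBF kernel
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))
Z = rff_features(X)
K_approx = Z @ Z.T                                       # feature inner products
K_exact = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1)) # true RBF kernel
err = np.abs(K_approx - K_exact).max()
```

Once the explicit map is computed, any linear method (e.g. linear regression or a linear SVM) applied to Z behaves approximately like its kernelized counterpart, at a cost linear in the number of features.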
In this paper, we present a novel massively parallel algorithm for accel...
In this paper, we investigate a divide and conquer approach to Kernel Ri...
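A common divide-and-conquer scheme for kernel ridge regression partitions the training set into blocks, solves an independent KRR problem on each block, and averages the resulting predictors, reducing the O(n^3) solve to several much smaller ones. The sketch below illustrates that generic scheme under our own choices of kernel and hyperparameters, not the exact algorithm investigated above.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    return np.exp(-gamma * ((A[:, None] - B[None]) ** 2).sum(-1))

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    """Standard KRR on one block: alpha = (K + lam * n * I)^{-1} y."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)
    return X, alpha

def krr_predict(model, Xtest, gamma=1.0):
    X, alpha = model
    return rbf_kernel(Xtest, X, gamma) @ alpha

def dc_krr(X, y, n_blocks=4, lam=1e-3, gamma=1.0):
    """Divide and conquer: fit independent KRR estimators on random
    partitions of the data and average their predictions."""
    idx = np.random.default_rng(0).permutation(len(X))
    models = [krr_fit(X[part], y[part], lam, gamma)
              for part in np.array_split(idx, n_blocks)]
    return lambda Xt: np.mean([krr_predict(m, Xt, gamma) for m in models],
                              axis=0)

# Toy regression problem: y = sin(3x) + noise
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (400, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(400)
predict = dc_krr(X, y)
Xt = np.linspace(-1, 1, 50)[:, None]
mse = np.mean((predict(Xt) - np.sin(3 * Xt[:, 0])) ** 2)
```

Each block's linear solve costs O((n/m)^3) for m blocks, and averaging keeps the statistical error comparable to full KRR under suitable regularization.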