Adiabatic Quantum-Flux-Parametron (AQFP) is a superconducting logic with...
As data become increasingly vital for deep learning, a company would be ...
The conventional lottery ticket hypothesis (LTH) claims that there exist...
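The snippet above only states the LTH claim, so a toy sketch of the usual train-prune-rewind procedure may help. Everything here (the function name `winning_ticket`, the stand-in "trained" weights, the keep ratio) is illustrative and not taken from any particular paper's method.

```python
# Toy sketch of the lottery-ticket procedure: magnitude-prune the trained
# weights, then rewind the surviving weights to their initial values.
# The "trained" vector below is a stand-in; no real training is performed.

def winning_ticket(init_w, trained_w, keep_ratio):
    """Return a binary mask and the rewound subnetwork ("winning ticket")."""
    n_keep = max(1, int(round(keep_ratio * len(trained_w))))
    # Keep the indices with the largest trained magnitudes.
    keep = set(sorted(range(len(trained_w)),
                      key=lambda i: -abs(trained_w[i]))[:n_keep])
    mask = [1 if i in keep else 0 for i in range(len(trained_w))]
    # Surviving weights are reset to initialization, the rest are zeroed.
    ticket = [w0 * m for w0, m in zip(init_w, mask)]
    return mask, ticket

init = [0.1, -0.2, 0.3, 0.05]       # weights at initialization
trained = [0.9, -0.01, 0.7, 0.02]   # stand-in weights after training
mask, ticket = winning_ticket(init, trained, keep_ratio=0.5)
```

Retraining `ticket` with `mask` fixed is then expected, per the hypothesis, to roughly match the dense network's accuracy.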
Data compression has been widely adopted to release mobile devices from ...
Recently, sparse training has emerged as a promising paradigm for effici...
Neural architecture search (NAS) and network pruning are widely studied ...
Vision Transformers (ViT) have shown rapid progress in computer vision t...
There have been long-standing controversies and inconsistencies over the...
Recent research demonstrated the promise of using resistive random acces...
Recent works demonstrated the promise of using resistive random access m...
Generative Adversarial Networks (GANs) have achieved huge success in gen...
In deep model compression, the recent finding "Lottery Ticket Hypothesis...
Structured weight pruning is a representative model compression techniqu...
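Since the snippet above names structured weight pruning without detail, here is a minimal sketch of one common variant: filter-wise pruning by L2 norm, where whole rows of a weight matrix are zeroed rather than individual elements. The helper `prune_rows` and the sample matrix are hypothetical, not any specific paper's algorithm.

```python
# Minimal structured pruning sketch: treat each row of the weight matrix
# as one filter, rank rows by L2 norm, and zero out whole low-norm rows.

def prune_rows(weights, keep_ratio):
    """Zero entire rows with the smallest L2 norms (structured pruning)."""
    norms = [sum(w * w for w in row) ** 0.5 for row in weights]
    n_keep = max(1, int(round(keep_ratio * len(weights))))
    # Rows with the largest norms survive; the rest are removed wholesale.
    keep = set(sorted(range(len(weights)), key=lambda i: -norms[i])[:n_keep])
    return [row if i in keep else [0.0] * len(row)
            for i, row in enumerate(weights)]

W = [[0.9, -0.8], [0.01, 0.02], [0.5, 0.4], [-0.03, 0.0]]
pruned = prune_rows(W, keep_ratio=0.5)   # keeps the two highest-norm rows
```

Because whole rows (filters) disappear, the pruned layer maps to a smaller dense computation, which is why structured sparsity is hardware-friendly compared with element-wise sparsity.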
The computing wall and data movement challenges of deep neural networks ...
The high computation and memory storage of large deep neural networks (D...
The state-of-the-art DNN structures involve intensive computation and high m...
Large deep neural network (DNN) models pose the key challenge to energy ...
Weight quantization is one of the most important techniques of Deep Neur...
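To make the quantization snippet above concrete, here is a small sketch of uniform symmetric weight quantization, a common baseline scheme; the function name `quantize` and the 3-bit example are illustrative assumptions, not the method of any particular paper.

```python
# Uniform symmetric quantization sketch: map float weights onto the
# integer grid [-qmax, qmax] with a single per-tensor scale factor.

def quantize(weights, bits):
    """Quantize a list of floats to signed integers of the given bit width."""
    qmax = 2 ** (bits - 1) - 1                   # e.g. 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax or 1.0   # avoid scale == 0
    q = [max(-qmax, min(qmax, round(w / scale))) for w in weights]
    deq = [v * scale for v in q]                 # dequantized approximation
    return q, deq, scale

weights = [0.5, -1.0, 0.25, 0.0]
q, deq, scale = quantize(weights, bits=3)        # 3-bit levels in [-3, 3]
```

Storing only the small integers in `q` plus one `scale` per tensor is what yields the memory and compute savings; `deq` shows the rounding error the network must tolerate.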
The state-of-the-art DNN structures involve high computation and great deman...
Both industry and academia have extensively investigated hardware accele...
Hardware accelerations of deep learning systems have been extensively in...
With the recent trend of wearable devices and the Internet of Things (IoT), it ...
Large-scale deep neural networks (DNNs) are both compute and memory inte...