Q-CapsNets: A Specialized Framework for Quantizing Capsule Networks

by Alberto Marchisio, et al.

Capsule Networks (CapsNets), recently proposed by the Google Brain team, offer superior learning capabilities for machine learning tasks such as image classification, compared to traditional CNNs. However, CapsNets are computationally intensive and difficult to deploy in their original form on resource-constrained edge devices. This paper makes the first attempt to quantize CapsNet models, enabling efficient edge implementations through a specialized quantization framework for CapsNets. We evaluate our framework on several benchmarks. On a deep CapsNet model for the CIFAR10 dataset, the framework reduces the memory footprint by 6.2x, with only 0.15% accuracy loss. The framework is available at https://git.io/JvDIF.
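The memory reduction comes from storing network parameters at reduced bit-widths instead of 32-bit floats. As a minimal sketch of the general idea (uniform fixed-point weight quantization; the function names and the 8-bit width are illustrative choices, not the paper's actual API or rounding policy):

```python
import numpy as np

def quantize_uniform(weights, num_bits=8):
    """Uniformly quantize a float tensor to signed integers of num_bits bits.
    The scale maps the largest absolute weight onto the top quantization level."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.max(np.abs(weights)) / qmax
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from integer codes and a scale."""
    return q.astype(np.float32) * scale

# Example: quantize a random weight matrix and measure the rounding error.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, s = quantize_uniform(w, num_bits=8)
w_hat = dequantize(q, s)
# Going from 32-bit floats to 8-bit integers alone gives a 4x raw memory
# reduction; the paper's 6.2x figure reflects its full per-layer bit-width
# selection, which this sketch does not reproduce.
```

With this scheme the worst-case per-weight error is half a quantization step (`scale / 2`), which is why accuracy degrades only slightly when the bit-width is chosen carefully per layer.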




