Machine Learning with Clos Networks

01/18/2019
by Timothy Whithing, et al.

We present a new methodology for improving the accuracy of small neural networks by applying the concept of a Clos network to maximize expressivity in a smaller network. We explore the design space to show that, for a fixed number of parameters, more layers are beneficial. We also present findings on how the ReLU nonlinearity affects accuracy in separable networks. We present early results on the CIFAR-10 dataset.
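To make the analogy concrete, below is a minimal, hypothetical PyTorch sketch of a Clos-style layer: three stages of small block-diagonal "crossbar" linear maps joined by fixed permutations, mirroring the ingress, middle, and egress switch stages of a Clos network. The class name ClosBlock, the stage sizes r, n, and m, and the placement of the ReLU are illustrative assumptions on our part, not the paper's architecture (the paper's experiments use convolutional networks on CIFAR-10).

```python
import torch
import torch.nn as nn

class ClosBlock(nn.Module):
    """Hypothetical sketch of a Clos-style layer. Three stages of small
    block-diagonal ("crossbar") linear maps are joined by fixed
    permutations, analogous to the ingress/middle/egress stages of a
    Clos network. Parameters scale as 2*r*n*m + m*r*r rather than the
    (r*n)^2 of a dense layer of the same width."""

    def __init__(self, r, n, m, use_relu=True):
        super().__init__()
        self.r, self.n, self.m = r, n, m
        # Stage 1: r independent n->m "ingress" crossbars.
        self.ingress = nn.ModuleList([nn.Linear(n, m) for _ in range(r)])
        # Stage 2: m independent r->r "middle" crossbars.
        self.middle = nn.ModuleList([nn.Linear(r, r) for _ in range(m)])
        # Stage 3: r independent m->n "egress" crossbars.
        self.egress = nn.ModuleList([nn.Linear(m, n) for _ in range(r)])
        self.act = nn.ReLU() if use_relu else nn.Identity()

    def forward(self, x):
        # x: (batch, r*n) -> split into r groups of width n.
        groups = x.view(-1, self.r, self.n)
        h = torch.stack(
            [self.act(f(groups[:, i])) for i, f in enumerate(self.ingress)],
            dim=1)                       # (batch, r, m)
        # Permutation: middle crossbar j sees output j of every ingress crossbar.
        h = h.transpose(1, 2)            # (batch, m, r)
        h = torch.stack(
            [self.act(f(h[:, j])) for j, f in enumerate(self.middle)],
            dim=1)                       # (batch, m, r)
        # Permute back: egress crossbar i collects output i of every middle crossbar.
        h = h.transpose(1, 2)            # (batch, r, m)
        out = torch.stack(
            [f(h[:, i]) for i, f in enumerate(self.egress)],
            dim=1)                       # (batch, r, n)
        return out.reshape(-1, self.r * self.n)
```

As a usage example, `ClosBlock(r=4, n=8, m=8)` maps a width-32 input to a width-32 output with 640 weights versus 1,024 for a dense 32x32 layer, which is the kind of parameter-for-depth trade the abstract describes.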


research · 06/10/2020
Better Together: Resnet-50 accuracy with 13x fewer parameters and at 3x speed
Recent research on compressing deep neural networks has focused on reduc...

research · 11/17/2015
Learning Neural Network Architectures using Backpropagation
Deep neural networks with millions of parameters are at the heart of man...

research · 01/28/2019
Squeezed Very Deep Convolutional Neural Networks for Text Classification
Most of the research in convolutional neural networks has focused on inc...

research · 03/30/2020
Designing Network Design Spaces
In this work, we present a new network design paradigm. Our goal is to h...

research · 11/07/2020
Depthwise Multiception Convolution for Reducing Network Parameters without Sacrificing Accuracy
Deep convolutional neural networks have been proven successful in multip...

research · 02/02/2023
Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data
We study the interpolation, or memorization, power of deep ReLU neural n...

research · 04/19/2019
Shallow Neural Network can Perfectly Classify an Object following Separable Probability Distribution
Guiding the design of neural networks is of great importance to save eno...
