Recursion, Probability, Convolution and Classification for Computations

07/22/2019
by Mircea Namolaru, et al.

The main motivation of this work was practical: to offer computationally and theoretically scalable ways of structuring large classes of computations. It started from attempts to optimize R code for machine learning/artificial intelligence algorithms on huge data sets that, due to their size, must be handled in an incremental (online) fashion. Our targets are large classes of relational (attribute-based), mathematical (index-based), or graph computations. We wanted to use powerful computation representations that emerged in AI (artificial intelligence) and ML (machine learning), such as BNs (Bayesian networks) and CNNs (convolutional neural networks). For the classes of computation we address, and for our HPC (high-performance computing) needs, the current solutions for translating computations into such representations need to be extended. Our results show that the classes of computation we target can be tree-structured, with a probability distribution (defining a DBN, i.e., a Dynamic Bayesian Network) associated with this structure. Moreover, this DBN may be viewed as a recursive CNN (convolutional neural network). Within this tree-like structure, classification into classes of bounded size (the bound being a parameter) can be performed. These results are at the core of very powerful, yet highly practical, algorithms for restructuring and parallelizing computations. The mathematical background required for an in-depth presentation, exposing the full generality of our approach, is the subject of a subsequent paper. In this paper we work in a limited (but important) framework that can be understood with rudiments of linear algebra and graph theory. The focus is on applicability: most of the paper discusses the usefulness of our approach for solving hard compilation problems related to automatic parallelism.
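
To make the tree-structuring and bounded-size classification ideas more concrete, here is a minimal illustrative sketch, not the paper's actual algorithm: it represents a computation as a small DAG of operations, extracts a spanning tree from it, and greedily partitions the tree nodes into classes whose size is bounded by a parameter k. The function names (spanning_tree, bounded_classes), the example graph, and the greedy grouping strategy are assumptions introduced here purely for illustration.

```python
# Illustrative sketch only (hypothetical helpers, not the paper's method):
# a computation is modeled as a DAG, a spanning tree is extracted, and the
# tree nodes are grouped into classes of size at most k.
from collections import defaultdict

def spanning_tree(dag_edges, root):
    """Return a parent map for a spanning tree of the DAG, rooted at `root`."""
    children = defaultdict(list)
    for u, v in dag_edges:
        children[u].append(v)
    parent, stack, seen = {root: None}, [root], {root}
    while stack:
        u = stack.pop()
        for v in children[u]:
            if v not in seen:
                seen.add(v)
                parent[v] = u
                stack.append(v)
    return parent

def bounded_classes(parent, k):
    """Greedily partition the tree nodes into classes of size at most k."""
    classes, current = [], []
    for node in parent:          # insertion order approximates a top-down walk
        current.append(node)
        if len(current) == k:
            classes.append(current)
            current = []
    if current:
        classes.append(current)
    return classes

# Hypothetical example: a tiny computation DAG (nodes are operations,
# edges are data dependences).
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("c", "e")]
tree = spanning_tree(edges, root="a")
print(bounded_classes(tree, k=2))   # -> [['a', 'b'], ['c', 'd'], ['e']]
```

The bounded-size classes obtained this way could then, under the same assumptions, serve as units of work for parallel scheduling, which is the use case the abstract points to.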

