Bandit-Based Random Mutation Hill-Climbing

06/20/2016
by Jialin Liu, et al.

The Random Mutation Hill-Climbing algorithm is a direct search technique mostly used in discrete domains. It repeatedly selects a random neighbour of the best-so-far solution and accepts that neighbour if its fitness is at least as good. In this work, we propose a novel method for selecting the neighbour solution, using a set of independent multi-armed bandit-style selection units, which results in a bandit-based Random Mutation Hill-Climbing algorithm. The new algorithm significantly outperforms Random Mutation Hill-Climbing on both OneMax (in noise-free and noisy cases) and Royal Road problems (in the noise-free case). The algorithm shows particular promise for discrete optimisation problems where each fitness evaluation is expensive.
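
As a rough illustration of the idea, the sketch below shows a bandit-based RMHC on OneMax in Python. It assumes one UCB1-style selection unit per bit position and a reward of 1 for a flip that improves the best-so-far fitness; the exact selection rule, reward scheme, and parameters used in the paper may differ.

```python
import math
import random


def onemax(bits):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(bits)


def bandit_rmhc(n_bits=30, budget=2000, c=math.sqrt(2), seed=0):
    """Sketch of a bandit-based RMHC on OneMax.

    Each bit position is treated as an arm of a multi-armed bandit.
    A UCB1-style score decides which bit to flip next; an arm is
    rewarded when its flip improves the best-so-far fitness.
    (Illustrative only: the paper's selection units may differ.)
    """
    rng = random.Random(seed)
    bits = [rng.randint(0, 1) for _ in range(n_bits)]
    best_fit = onemax(bits)
    counts = [0] * n_bits      # how often each position was flipped
    rewards = [0.0] * n_bits   # accumulated reward per position

    for t in range(1, budget + 1):
        # UCB1 selection: try each position once, then balance
        # exploitation (mean reward) and exploration (visit count).
        untried = [i for i in range(n_bits) if counts[i] == 0]
        if untried:
            i = rng.choice(untried)
        else:
            i = max(range(n_bits),
                    key=lambda j: rewards[j] / counts[j]
                                  + c * math.sqrt(math.log(t) / counts[j]))

        bits[i] ^= 1                # flip the chosen bit
        fit = onemax(bits)
        counts[i] += 1
        if fit >= best_fit:         # RMHC rule: accept if not worse
            rewards[i] += 1.0 if fit > best_fit else 0.0
            best_fit = fit
        else:                       # revert the flip
            bits[i] ^= 1

    return bits, best_fit


if __name__ == "__main__":
    solution, fitness = bandit_rmhc()
    print(f"Best fitness: {fitness} / {len(solution)}")
```

Replacing the UCB1 rule with a uniformly random choice of position recovers plain RMHC, which makes the two variants easy to compare under the same evaluation budget.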

Related research

The N-Tuple Bandit Evolutionary Algorithm for Game Agent Optimisation (02/16/2018)
This paper describes the N-Tuple Bandit Evolutionary Algorithm (NTBEA), ...

Optimal resampling for the noisy OneMax problem (07/22/2016)
The OneMax problem is a standard benchmark optimisation problem for a bi...

Enhancing Evolutionary Optimization in Uncertain Environments by Allocating Evaluations via Multi-armed Bandit Algorithms (03/10/2018)
Optimization problems with uncertain fitness functions are common in the...

Let's Get Ready to Rumble: Crossover Versus Mutation Head to Head (05/18/2004)
This paper analyzes the relative advantages between crossover and mutati...

SHAPFUZZ: Efficient Fuzzing via Shapley-Guided Byte Selection (08/18/2023)
Mutation-based fuzzing is popular and effective in discovering unseen co...

SLOPT: Bandit Optimization Framework for Mutation-Based Fuzzing (11/07/2022)
Mutation-based fuzzing has become one of the most common vulnerability d...

BandMaxSAT: A Local Search MaxSAT Solver with Multi-armed Bandit (01/14/2022)
We address Partial MaxSAT (PMS) and Weighted PMS (WPMS), two practical g...
