A Pragmatic Approach for Hyper-Parameter Tuning in Search-based Test Case Generation

01/14/2021
by Shayan Zamani, et al.

Search-based test case generation, which applies meta-heuristic search to generate test cases, has been studied extensively in the literature. Since, in theory, the performance of meta-heuristic search methods depends heavily on their hyper-parameters, hyper-parameter tuning deserves study in this domain. In this paper, we propose a new metric ("Tuning Gain") that estimates how cost-effective it is to tune a particular class. We then predict the "Tuning Gain" of a class using static features of its source code. Finally, we prioritize classes for tuning based on their estimated "Tuning Gains" and spend the tuning budget only on the highly ranked classes. To evaluate our approach, we exhaustively analyze 1,200 hyper-parameter configurations of a well-known search-based test generation tool (EvoSuite) for 250 classes from 19 projects drawn from benchmarks such as SF110 and the SBST 2018 tool competition. We use a tuning approach called Meta-GA and compare the tuning results with and without the proposed class prioritization. The results show that, for a low tuning budget, prioritizing classes outperforms the alternatives in terms of extra covered branches (10 times more than a traditional global tuning). In addition, we report how different aspects of our approach, such as the search space size, the tuning budget, the tuning algorithm, and the number of classes to tune, affect the final results.
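
To make the pipeline concrete, the following minimal sketch shows how classes could be ranked by a predicted "Tuning Gain" so that only the top-ranked ones receive a share of the tuning budget. It is an illustration under assumptions, not the paper's implementation: feature_fn, tune_fn, and the random-forest regressor are placeholders standing in for the actual feature extraction, Meta-GA tuning step, and prediction model.

```python
# Hypothetical sketch of the class-prioritization idea: predict "Tuning Gain"
# from static class features, rank the classes, and spend the tuning budget
# only on the top-ranked ones. feature_fn and tune_fn are caller-supplied
# placeholders; a random forest stands in for whichever regressor is used.
from sklearn.ensemble import RandomForestRegressor

def prioritize_and_tune(train_classes, train_gains, candidates,
                        budget, feature_fn, tune_fn):
    # Learn a mapping from static source-code features to observed Tuning Gain.
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit([feature_fn(c) for c in train_classes], train_gains)

    # Predict Tuning Gain for each candidate class and rank in descending order.
    scores = model.predict([feature_fn(c) for c in candidates])
    ranked = sorted(zip(candidates, scores), key=lambda p: p[1], reverse=True)

    # Tune only the top `budget` classes (e.g. with a Meta-GA over the test
    # generator's settings); the rest keep default hyper-parameters.
    return {cls: tune_fn(cls) for cls, _ in ranked[:budget]}
```

The key design point the sketch captures is that tuning effort is spent per class rather than globally: classes with a low predicted gain are simply run with the tool's defaults.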


Related research

06/05/2019
Revisiting Hyper-Parameter Tuning for Search-based Test Data Generation
Search-based software testing (SBST) has been studied a lot in the liter...

02/01/2019
Hyper-parameter Tuning under a Budget Constraint
We study a budgeted hyper-parameter tuning problem, where we optimize th...

06/13/2020
Online Hyper-parameter Tuning in Off-policy Learning via Evolutionary Strategies
Off-policy learning algorithms have been known to be sensitive to the ch...

12/23/2019
AutoML: Exploration v.s. Exploitation
Building a machine learning (ML) pipeline in an automated way is a cruci...

04/26/2019
AlphaClean: Automatic Generation of Data Cleaning Pipelines
The analyst effort in data cleaning is gradually shifting away from the ...

09/11/2018
Tuning metaheuristics by sequential optimization of regression models
Tuning parameters is an important step for the application of metaheuris...

06/13/2019
Meta-heuristic for non-homogeneous peak density spaces and implementation on 2 real-world parameter learning/tuning applications
Observer effect in physics (/psychology) regards bias in measurement (/p...
