Title Prioritizing Versions for Performance Regression Testing: The Pharo Case
Authors Juan Pablo Sandoval Alcocer, Alexandre Bergel, Marco Tulio Valente
Publication date June 2020
Abstract Context
Software performance may suffer regressions caused by source code changes.
Measuring performance at each new software version is useful for early
detection of performance regressions. However, systematically running
benchmarks on every version is often impractical (e.g., because of
long-running executions, or because functional correctness is prioritized
over non-functional properties).
Objective
In this article, we propose Horizontal Profiling, a sampling technique to
predict when a new revision may cause a regression by analyzing the source
code and using run-time information of a previous version. The goal of
Horizontal Profiling is to reduce the performance testing overhead by
benchmarking just software versions that contain costly source code changes.
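The selection idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual algorithm or API): it assumes a profile of a previous version mapping methods to measured run time, and flags a new version for benchmarking only when one of its changed methods was costly in that profile. The function name, the `Class>>selector` method names, and the cost threshold are all illustrative assumptions.

```python
# Hypothetical sketch of the version-selection idea behind Horizontal
# Profiling: benchmark a new version only if it changes a method that
# was expensive in the profile of a previous version. All names and
# the threshold below are illustrative, not taken from the paper.

def should_benchmark(changed_methods, previous_profile, cost_threshold=0.05):
    """Return True when any changed method accounted for more than
    `cost_threshold` of the total run time in the previous profile."""
    total = sum(previous_profile.values()) or 1.0
    return any(
        previous_profile.get(method, 0.0) / total > cost_threshold
        for method in changed_methods
    )

# Example: profile of version N, mapping method names to measured seconds.
profile_v1 = {"Parser>>parse": 4.0, "Lexer>>next": 0.5, "Log>>write": 0.1}

print(should_benchmark({"Log>>write"}, profile_v1))     # cheap change  -> False
print(should_benchmark({"Parser>>parse"}, profile_v1))  # costly change -> True
```

A version whose changes touch only cheap methods is skipped, which is how the technique trades a small miss rate for a large reduction in benchmarking effort.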
Method
We present an evaluation in which we apply Horizontal Profiling to identify
performance regressions of 17 software projects written in the Pharo
programming language, totaling 1,288 software versions.
Results
Horizontal Profiling detects more than 80% of the regressions by
benchmarking less than 20% of the versions. In addition, our experiments
show that Horizontal Profiling has better precision and executes the
benchmarks on fewer versions than state-of-the-art tools, under our
benchmarks.
Conclusions
We conclude that by adequately characterizing the run-time information of a
previous version, it is possible to determine whether a new version is likely
to introduce a performance regression. As a consequence, a significant
fraction of the performance regressions can be identified by benchmarking
only a small fraction of the software versions.
Article number 102415
Volume 191
Journal name Science of Computer Programming
Publisher Elsevier Science (Amsterdam, The Netherlands)