Original scientific paper
https://doi.org/10.17535/crorr.2022.0006
A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
Kadir Kiran
Department of Airframe and Powerplant Maintenance
Abstract
This paper presents a computational performance benchmark of the steepest descent method and three well-known conjugate gradient methods (Fletcher-Reeves, Polak-Ribière, and Hestenes-Stiefel) combined with six step length calculation techniques/conditions, namely backtracking, Armijo-backtracking, Goldstein, weak Wolfe, strong Wolfe, and the exact local minimizer, in unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using combinations of those optimization methods and line search conditions. During these experiments, the number of function evaluations at every iteration is monitored and recorded for all the optimization method-line search condition combinations. The total number of function evaluations is then used as the performance measure whenever the combination in question converges to a function's minimum within the given convergence tolerance. From these data, performance and data profiles are created for all the optimization method-line search condition combinations with the purpose of a reliable and efficient benchmark. For this test function set, the steepest descent-Goldstein combination is the fastest one, whereas the steepest descent-exact local minimizer combination is the most robust one with high convergence accuracy. By trading off convergence speed against robustness, the steepest descent-weak Wolfe combination is identified as the optimal choice for this test function set.
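To make the benchmarked setup concrete, the following is a minimal sketch of one method-line search combination of the kind studied: steepest descent with a backtracking line search enforcing the Armijo sufficient-decrease condition, while counting function evaluations as the performance measure. All parameter values (alpha0, rho, c1, tol) are illustrative defaults, not the paper's settings, and the paper's exact backtracking variants may differ.

```python
import numpy as np

def backtracking(f, x, g, p, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink the step until the Armijo sufficient-decrease condition
    f(x + alpha*p) <= f(x) + c1*alpha*g.p holds; count f-evaluations."""
    fx, evals = f(x), 1
    alpha = alpha0
    while f(x + alpha * p) > fx + c1 * alpha * g.dot(p):
        evals += 1          # a rejected trial step
        alpha *= rho
    evals += 1              # the trial that satisfied the condition
    return alpha, evals

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Steepest descent with search direction p = -grad(x). Returns the
    minimizer estimate and the total function-evaluation count, i.e. the
    kind of performance measure used in the benchmark."""
    x, total_evals = np.asarray(x0, dtype=float), 0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # convergence tolerance on the gradient
            break
        p = -g                        # steepest descent direction
        alpha, evals = backtracking(f, x, g, p)
        total_evals += evals
        x = x + alpha * p
    return x, total_evals

# Example on the Rosenbrock function, a standard unconstrained test problem.
f = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0]**2)])
x_star, n_evals = steepest_descent(f, grad, [-1.2, 1.0])
```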
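The performance profiles mentioned in the abstract follow the Dolan-Moré idea: for each solver and problem, compare the solver's cost (here, function evaluations) to the best cost achieved on that problem. The sketch below is a hedged illustration under the assumption that `costs` is a problems-by-solvers array with `np.inf` marking a failed run; it is not the paper's implementation.

```python
import numpy as np

def performance_profile(costs, taus):
    """For each tau in `taus`, the fraction of problems each solver solves
    within a factor tau of the best solver on that problem."""
    best = costs.min(axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                     # performance ratios r[p, s]
    # rho_s(tau): share of problems with ratio <= tau; failures (inf) never count
    return np.array([[np.mean(ratios[:, s] <= tau)
                      for s in range(costs.shape[1])]
                     for tau in taus])
```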
Keywords
conjugate gradient; line search; step length; steepest descent; optimization
Hrčak ID:
280265
Publication date:
12.7.2022.