2 Rosenbrock function
To assess the performance of the different algorithms, we used the Rosenbrock function:

f(x, y) = (a - x)^2 + b(y - x^2)^2

This function was chosen because it poses a challenging optimization problem. It has a global minimum at (x, y) = (a, a^2), where the function value is zero, but that minimum sits inside a long, narrow, curved valley with steep sides. This shape makes it difficult for optimization algorithms to locate the global minimum.
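As a minimal sketch of the function as written above (the function name and default coefficients here are illustrative and not taken from the codebase), the definition and a check of the zero value at (a, a^2) look like this:

```python
# Minimal sketch of the Rosenbrock function; name and defaults are illustrative.
def rosenbrock(x, y, a=1.0, b=100.0):
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# The global minimum lies at (a, a^2), where the function value is zero.
a, b = 1.0, 100.0
print(rosenbrock(a, a ** 2, a, b))  # -> 0.0
```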
The shape and steepness of the Rosenbrock function are controlled by the coefficients a and b. The parameter a sets the position of the minimum along the x-axis: increasing a shifts the minimum to the right. The parameter b controls the steepness of the valley: larger values of b make the valley steeper and narrower, posing a greater challenge for optimization algorithms.
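A small numerical illustration of the effect of b, reusing the rosenbrock sketch above (the chosen point and b values are assumptions for demonstration only): the same displacement off the valley floor y = x^2 is penalized far more heavily when b is large.

```python
# A point 0.1 above the valley floor y = x^2 at x = 0.5.
x, y = 0.5, 0.5 ** 2 + 0.1
print(rosenbrock(x, y, a=1.0, b=1.0))    # 0.25 + 1   * 0.01 = 0.26
print(rosenbrock(x, y, a=1.0, b=100.0))  # 0.25 + 100 * 0.01 = 1.25
```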
In a previous study, researchers examined the behavior of several algorithms (conjugate gradient, Newton-Raphson, steepest descent and its variations) for two critical values of b (1 and 100). In this work, we extend their idea by investigating the algorithms' behavior over a range of b values sampled on a logarithmic scale.
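As a sketch of how such a sweep could be set up (the exact range and number of samples are assumptions, not the values used in the study), the b values can be generated on a logarithmic scale with NumPy:

```python
import numpy as np

# Illustrative sweep: b sampled on a logarithmic scale between 1 and 100.
b_values = np.logspace(0, 2, num=10)  # 10 values from 10^0 to 10^2
print(b_values)
```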