Find the global minimum of the Rastrigin function
In mathematical optimization, the Rastrigin function is a non-convex function used as a performance test problem for optimization algorithms. It's a typical example of a non-linear multimodal function. It was first proposed by Rastrigin (Systems of Extremal Control, Mir, Moscow, 1974) as a two-dimensional function and was generalized by Mühlenbein, Schomisch and Born (The Parallel Genetic Algorithm as Function Optimizer, Parallel Computing 17, 1991).
Finding the minimum of this function is a fairly difficult problem due to its large search space and its large number of local minima.
On an \(n\)-dimensional domain it's defined by:

\[f(\mathbf{x}) = A n + \sum_{i=1}^{n} \left[x_i^2 - A\cos(2\pi x_i)\right]\]

where \(A = 10\). The function is usually evaluated on the hypercube \(x_i \in [-5.12, 5.12]\).
As the plot shows, Rastrigin's function has many local minima. However, there is just one global minimum, which occurs at \(x = 0\), where the value of the function is \(0\). At any local minimum other than \(0\), the value of Rastrigin's function is greater than \(0\). The farther a local minimum is from the origin, the larger the value of the function at that point.
Rastrigin's function is often used for testing, because its many local minima make it difficult for standard, gradient-based methods to find the global minimum.
We will use Differential Evolution to find out the global minimum of the five-dimensional Rastrigin function.
The function can be written as:
double neg_rastrigin(const ultra::de::individual &x)
{
  using std::numbers::pi;
  constexpr double A = 10.0;

  const double rastrigin =
    A * x.parameters()
    + std::accumulate(x.begin(), x.end(), 0.0,
                      [=](double sum, double xi)
                      {
                        return sum + xi*xi - A*std::cos(2*pi*xi);
                      });

  return -rastrigin;
}
Observe that we have defined the negation of the Rastrigin function. Ultra always tries to maximize a function; if you want the minimum of a function, you must consider its negation (maximizing \(-f\) is equivalent to minimizing \(f\)).
We specify the search space:
const unsigned dimensions(5); // 5D - Rastrigin function
de::problem prob(dimensions, {-5.12, 5.12});
and some of the Differential Evolution parameters:
prob.params.population.individuals = 50;
prob.params.evolution.generations = 1000;
The de::search class constructor links the search space to the objective function, and the de::search::run() member function starts the search:
de::search search(prob, neg_rastrigin);
const auto res(search.run());
The default is to perform a single evolutionary run. Since EAs are stochastic processes, there is no guarantee they will discover the global optimum. In general it's a good idea to perform multiple runs (de::search::run(n)).
The remaining code just prints the candidate solution:
const auto solution(res.best_individual);
const auto value(res.best_measurements.fitness);
std::cout << "Minimum found: " << *value << '\n';
std::cout << "Coordinates: [ ";
for (auto xi : solution)
  std::cout << xi << ' ';
std::cout << "]\n";
Differential Evolution is a general purpose, "black box" optimization algorithm so you can easily adapt this example to different functions.
The complete source code of the example (also available here):
#include <cmath>
#include <iostream>
#include <numbers>
#include <numeric>

#include "kernel/ultra.h"

double neg_rastrigin(const ultra::de::individual &x)
{
  using std::numbers::pi;
  constexpr double A = 10.0;

  const double rastrigin =
    A * x.parameters()
    + std::accumulate(x.begin(), x.end(), 0.0,
                      [=](double sum, double xi)
                      {
                        return sum + xi*xi - A*std::cos(2*pi*xi);
                      });

  return -rastrigin;
}

int main()
{
  using namespace ultra;

  log::reporting_level = log::lINFO;

  const unsigned dimensions(5);  // 5D - Rastrigin function
  de::problem prob(dimensions, {-5.12, 5.12});

  prob.params.population.individuals = 50;
  prob.params.evolution.generations = 1000;

  de::search search(prob, neg_rastrigin);
  const auto res(search.run());

  const auto solution(res.best_individual);
  const auto value(res.best_measurements.fitness);

  std::cout << "Minimum found: " << *value << '\n';
  std::cout << "Coordinates: [ ";
  for (auto xi : solution)
    std::cout << xi << ' ';
  std::cout << "]\n";
}