Symbolic regression
Symbolic regression is a type of regression analysis that searches the space of mathematical expressions to find the model that best fits a given dataset.
No particular model is provided as a starting point to the algorithm. Instead, initial expressions are formed by randomly combining mathematical building blocks such as operators, analytic functions, constants and state variables.
Genetic programming
While genetic programming (GP) can be used to perform a wide variety of tasks, symbolic regression is probably one of its most frequent areas of application (the term "symbolic regression" stems from early work on GP by John Koza).
GP builds a population of simple random formulae that represent relationships among independent variables, in order to predict new data. Successive generations of formulae (aka individuals / programs) evolve from the previous one: the fittest individuals are selected from the population to undergo genetic operations.
The fitness function that drives the evolution can take into account not only error metrics (to ensure the models accurately predict the data), but also complexity measures, ensuring that the resulting models reveal the data's underlying structure in a way that is understandable from a human perspective. This facilitates reasoning and improves the odds of gaining insight into the data-generating system.
The code
#include <iostream>
#include <sstream>

#include "kernel/ultra.h"

int main()
{
  using namespace ultra;

  // DATA SAMPLE (output, input)
  // (the target function is `x + sin(x)`)
  std::istringstream training(R"(
-9.456,-10.0
-8.989, -8.0
-5.721, -6.0
-3.243, -4.0
-2.909, -2.0
0.000, 0.0
2.909, 2.0
3.243, 4.0
5.721, 6.0
8.989, 8.0
)");

  // READING INPUT DATA
  src::problem prob(training);

  // SETTING UP SYMBOLS
  prob.insert<real::sin>();
  prob.insert<real::cos>();
  prob.insert<real::add>();
  prob.insert<real::sub>();
  prob.insert<real::div>();
  prob.insert<real::mul>();

  // SEARCHING
  src::search s(prob);
  const auto result(s.run());

  std::cout << "\nCANDIDATE SOLUTION\n"
            << out::c_language << result.best_individual
            << "\n\nFITNESS\n" << *result.best_measurements.fitness << '\n';
}
All the classes and functions are placed in the ultra namespace.
Line-by-line description
#include "kernel/ultra.h"
ultra.h is the only framework-specific header you have to include: it's enough for genetic programming (both symbolic regression and classification), genetic algorithms and differential evolution.
Data points, in the f(X), X format, are stored in the training input stream (in general they come from a CSV file):
std::istringstream training(R"(
-9.456,-10.0
-8.989, -8.0
...
)");
Points come from an unknown target function f. The discovery of this function is our goal.
On a graph: [plot of the sample points]
src::problem prob(training);
The src::problem object contains everything needed for the evolution: parameters, dataset... The constructor sets sensible defaults for almost all parameters and loads the training set.
...almost all: something remains to be done. Input variables are automatically inserted while reading the input data, but the remaining building blocks for individuals / formulae / programs have to be specified explicitly:
prob.insert<real::sin>();
prob.insert<real::cos>();
prob.insert<real::add>();
prob.insert<real::sub>();
prob.insert<real::div>();
prob.insert<real::mul>();
Ultra comes with batteries included: real::sin, real::cos, real::add and the others are part of a predefined primitive set.
Now all that's left is to start the search:
src::search s(prob);
const auto result(s.run());
and print the results:
std::cout << "\nCANDIDATE SOLUTION\n"
<< out::c_language << result.best_individual
<< "\n\nFITNESS\n" << *result.best_measurements.fitness << '\n';
out::c_language is a manipulator that makes it possible to control the output format. python_language and cpp_language are other possibilities (see individual.h for the full list).
What you get is something like:
[INFO] Reading dataset from input stream...
[INFO] Setting up terminals...
[INFO] ...terminals ready. Variables: `X1`
[INFO] ...dataset read. Examples: 10, categories: 1, features: 1, classes: 0
[INFO] Number of layers set to 1
[INFO] Population size set to 72
0: -0.00553475 ( 0.173)
[INFO] Evolution completed at generation: 102. Elapsed time: 1.179
CANDIDATE SOLUTION
sin(X1)+X1
FITNESS
-0.00553475
Graphically, this is what happens to the population:
(if you're curious about the animation take a look at examples/symbolic_regression/symbolic_regression02.cc and examples/symbolic_regression02.py)