Biased Initialization

One way of customizing an algorithm is to provide a biased initial population. This can be very helpful if expert knowledge already exists or known solutions should be improved further. In the following, two different ways of initialization are shown: a) providing only the design variable values and b) providing a Population object in which the objectives and constraints are already set and do not need to be calculated again.

NOTE: This works with all population-based algorithms in pymoo, i.e., technically speaking, all algorithms which inherit from GeneticAlgorithm. For local-search-based algorithms, the initial solution can be provided by setting x0 instead of sampling.
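
For a local-search algorithm, this could look as sketched below. The exact import path of PatternSearch is an assumption and may differ between pymoo versions, and the starting point x0 is only a random placeholder for an expert-provided solution.

import numpy as np

# import path may vary between pymoo versions
from pymoo.algorithms.soo.nonconvex.pattern_search import PatternSearch
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("sphere")

# placeholder for an expert-provided starting solution
x0 = np.random.random(problem.n_var)

# local search seeded with x0 instead of a sampling strategy
algorithm = PatternSearch(x0=x0)

minimize(problem,
         algorithm,
         seed=1,
         verbose=False)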

By Array

[1]:
import numpy as np

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("zdt2")

# randomly sample 300 points in the design space to bias the initial population
X = np.random.random((300, problem.n_var))

algorithm = NSGA2(pop_size=100, sampling=X)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
============================================================
n_gen |  n_eval |     igd      |      gd      |      hv
============================================================
    1 |     300 |  3.424124395 |  3.793712721 |  0.00000E+00
    2 |     400 |  3.424124395 |  3.777539511 |  0.00000E+00
    3 |     500 |  3.105771779 |  3.612825206 |  0.00000E+00
    4 |     600 |  3.019487666 |  3.836984034 |  0.00000E+00
    5 |     700 |  2.903584663 |  3.477562886 |  0.00000E+00
    6 |     800 |  2.600814724 |  2.889631289 |  0.00000E+00
    7 |     900 |  2.467609282 |  2.747636609 |  0.00000E+00
    8 |    1000 |  2.290068222 |  2.465545391 |  0.00000E+00
    9 |    1100 |  2.280968182 |  2.088307264 |  0.00000E+00
   10 |    1200 |  2.045836901 |  1.980975586 |  0.00000E+00
[1]:
<pymoo.core.result.Result at 0x7fb2ed941160>

By Population (pre-evaluated)

[2]:
import numpy as np

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.optimize import minimize

problem = get_problem("zdt2")

# create the initial data and set it to the population object
X = np.random.random((300, problem.n_var))
pop = Population.new("X", X)

# evaluate the population so that objectives (and constraints) are already available
Evaluator().eval(problem, pop)

algorithm = NSGA2(pop_size=100, sampling=pop)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
============================================================
n_gen |  n_eval |     igd      |      gd      |      hv
============================================================
    1 |       0 |  3.631168773 |  3.734414601 |  0.00000E+00
    2 |     100 |  3.631168773 |  3.657227975 |  0.00000E+00
    3 |     200 |  3.474726369 |  3.831564316 |  0.00000E+00
    4 |     300 |  3.026528580 |  3.159552956 |  0.00000E+00
    5 |     400 |  2.808529938 |  2.882119563 |  0.00000E+00
    6 |     500 |  2.804618833 |  2.578455548 |  0.00000E+00
    7 |     600 |  2.647365537 |  2.545304737 |  0.00000E+00
    8 |     700 |  2.468328424 |  2.279051140 |  0.00000E+00
    9 |     800 |  2.280912143 |  2.328351333 |  0.00000E+00
   10 |     900 |  1.932929122 |  2.106731201 |  0.00000E+00
[2]:
<pymoo.core.result.Result at 0x7fb2fcc1ae50>
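
Because the population passed to the algorithm is already evaluated, the first generation requires no additional function evaluations (n_eval starts at 0), whereas in the array-based example all 300 initial solutions had to be evaluated first (n_eval starts at 300).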