Checkpoints¶
Sometimes it can be useful to store checkpoints while executing an algorithm, especially if a run is very time-consuming. pymoo offers to resume a run by serializing the algorithm object and loading it again. Resuming runs from checkpoints is possible

- the functional way, by calling the minimize method,
- the object-oriented way, by repeatedly calling the next() method, or
- from a text file (Biased Initialization from a Population).
Functional¶
[1]:
import numpy as np

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("zdt1", n_var=5)

algorithm = NSGA2(pop_size=100)

res = minimize(problem,
               algorithm,
               ('n_gen', 5),
               seed=1,
               copy_algorithm=False,
               verbose=True)

np.save("checkpoint", algorithm)

checkpoint, = np.load("checkpoint.npy", allow_pickle=True).flatten()
print("Loaded Checkpoint:", checkpoint)
# only necessary if the termination criterion has already been met for the checkpoint
checkpoint.has_terminated = False

res = minimize(problem,
               checkpoint,
               ('n_gen', 20),
               seed=1,
               copy_algorithm=False,
               verbose=True)
============================================================
n_gen | n_eval | igd | gd | hv
============================================================
1 | 100 | 0.591406724 | 2.857718076 | 0.084181986
2 | 200 | 0.439419904 | 2.804026496 | 0.155903707
3 | 300 | 0.439419904 | 1.872462802 | 0.155903707
4 | 400 | 0.406628182 | 1.691762017 | 0.171264712
5 | 500 | 0.346478785 | 1.554097169 | 0.232891285
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x7fa1cdd58e20>
6 | 600 | 0.260455007 | 1.253989046 | 0.265093318
7 | 700 | 0.222821654 | 0.717126159 | 0.347501941
8 | 800 | 0.203582202 | 0.391439629 | 0.375532991
9 | 900 | 0.186117820 | 0.369671935 | 0.403788724
10 | 1000 | 0.151651385 | 0.224374761 | 0.428602488
11 | 1100 | 0.138821144 | 0.222776811 | 0.452520064
12 | 1200 | 0.110953674 | 0.192121107 | 0.487726964
13 | 1300 | 0.091142642 | 0.138930760 | 0.521068931
14 | 1400 | 0.076385894 | 0.101588370 | 0.538341984
15 | 1500 | 0.065724984 | 0.085599120 | 0.556340204
16 | 1600 | 0.052312985 | 0.073323910 | 0.580538446
17 | 1700 | 0.040157888 | 0.057590782 | 0.598526672
18 | 1800 | 0.033648449 | 0.046445845 | 0.609433690
19 | 1900 | 0.025987575 | 0.042402381 | 0.621694736
20 | 2000 | 0.023786958 | 0.037166445 | 0.625944818
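Under the hood, np.save with allow_pickle=True simply pickles the algorithm object, so any pickle-based mechanism works the same way. The save/restore round trip can be sketched with the standard library alone; the FakeAlgorithm class and the checkpoint.pkl filename below are placeholders for illustration, not pymoo API:

```python
import pickle


class FakeAlgorithm:
    """Placeholder standing in for a pymoo algorithm object."""

    def __init__(self):
        self.n_gen = 5
        self.has_terminated = True


algorithm = FakeAlgorithm()

# serialize the whole object state to disk
with open("checkpoint.pkl", "wb") as f:
    pickle.dump(algorithm, f)

# restore it later - attributes such as n_gen survive the round trip
with open("checkpoint.pkl", "rb") as f:
    checkpoint = pickle.load(f)

# reset the termination flag before resuming, as in the example above
checkpoint.has_terminated = False

print(checkpoint.n_gen)
```

Because the entire object state is serialized, the resumed run continues exactly where the saved one stopped, provided the surrounding code (here, the class definition) is importable when loading.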
Object Oriented¶
[2]:
import numpy as np

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.factory import get_termination
from pymoo.optimize import minimize
from pymoo.visualization.scatter import Scatter

problem = get_problem("zdt1", n_var=5)

algorithm = NSGA2(pop_size=100)
algorithm.setup(problem, seed=1, termination=('n_gen', 20))

for k in range(5):
    algorithm.next()
    print(algorithm.n_gen)

np.save("checkpoint", algorithm)

checkpoint, = np.load("checkpoint.npy", allow_pickle=True).flatten()
print("Loaded Checkpoint:", checkpoint)

while checkpoint.has_next():
    checkpoint.next()
    print(checkpoint.n_gen)
1
2
3
4
5
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x7fa1dd0253a0>
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
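With the loop-based variant it is common to write a checkpoint every k generations rather than only once. The pattern can be sketched generically with the standard library; the SimpleCounter class below is a stand-in for an object with next()/has_next() semantics, not a pymoo algorithm:

```python
import pickle


class SimpleCounter:
    """Stand-in for an algorithm exposing next() and has_next()."""

    def __init__(self, n_max):
        self.n_gen = 0
        self.n_max = n_max

    def has_next(self):
        return self.n_gen < self.n_max

    def next(self):
        self.n_gen += 1


algorithm = SimpleCounter(n_max=20)
saved_at = []

while algorithm.has_next():
    algorithm.next()
    # persist a checkpoint every 5 generations
    if algorithm.n_gen % 5 == 0:
        with open("checkpoint.pkl", "wb") as f:
            pickle.dump(algorithm, f)
        saved_at.append(algorithm.n_gen)

print(saved_at)  # checkpoints written at generations 5, 10, 15, 20
```

Writing to the same file each time keeps only the most recent checkpoint; writing to a per-generation filename keeps a history at the cost of disk space.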
From a Text File¶
First, load the data from a file. Usually, this will include the variables X, the objective values F (and the constraints G). Here, they are created randomly. Always make sure the Problem you are solving would return the same values for the given X values. Otherwise, the data might be misleading for the algorithm.

(This is not the case here. It is really JUST for illustration purposes.)
[3]:
import numpy as np
from pymoo.factory import G1

problem = G1()

N = 300
np.random.seed(1)
X = np.random.random((N, problem.n_var))

# here F and G are re-evaluated - in practice you want to load them from files too
F, G = problem.evaluate(X, return_values_of=["F", "G"])
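Since F and G would normally come from files as well, a round trip with np.savetxt/np.loadtxt illustrates one way to store and reload such arrays; the filenames and array shapes below (13 variables and 9 constraints, matching G1) are chosen for illustration:

```python
import numpy as np

np.random.seed(1)
X = np.random.random((300, 13))
F = np.random.random((300, 1))
G = np.random.random((300, 9))

# write each array to its own plain-text file
for name, data in [("X.txt", X), ("F.txt", F), ("G.txt", G)]:
    np.savetxt(name, data)

# reading them back recovers the same values (up to text precision);
# loadtxt squeezes single-column arrays, hence the reshape
X2 = np.loadtxt("X.txt").reshape(X.shape)
F2 = np.loadtxt("F.txt").reshape(F.shape)
G2 = np.loadtxt("G.txt").reshape(G.shape)

print(np.allclose(X, X2) and np.allclose(F, F2) and np.allclose(G, G2))
```

Any format works here (CSV, NumPy binaries, a database), as long as the X you load corresponds to the F and G values stored with it.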
Then, create a population object using your data:
[4]:
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.problems.static import StaticProblem

# now the population object with all its attributes is created (CV, feasible, ...)
pop = Population.new("X", X)
pop = Evaluator().eval(StaticProblem(problem, F=F, G=G), pop)
And finally, run the algorithm with a non-random initial population by passing sampling=pop:
[5]:
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize

# the algorithm is now called with the population - biased initialization
algorithm = GA(pop_size=100, sampling=pop)

res = minimize(problem,
               algorithm,
               ('n_gen', 10),
               seed=1,
               verbose=True)
==========================================================================================
n_gen | n_eval | cv (min) | cv (avg) | fopt | fopt_gap | favg
==========================================================================================
1 | 0 | 0.00000E+00 | 0.119240090 | -3.86901E+00 | 1.11310E+01 | -1.03797E+00
2 | 100 | 0.00000E+00 | 0.00000E+00 | -3.86901E+00 | 1.11310E+01 | -2.20457E+00
3 | 200 | 0.00000E+00 | 0.00000E+00 | -3.86901E+00 | 1.11310E+01 | -2.74838E+00
4 | 300 | 0.00000E+00 | 0.00000E+00 | -4.52633E+00 | 1.04737E+01 | -3.24875E+00
5 | 400 | 0.00000E+00 | 0.00000E+00 | -5.06900E+00 | 9.930996811 | -3.65891E+00
6 | 500 | 0.00000E+00 | 0.00000E+00 | -5.41687E+00 | 9.583134303 | -4.00395E+00
7 | 600 | 0.00000E+00 | 0.00000E+00 | -5.66705E+00 | 9.332952806 | -4.38690E+00
8 | 700 | 0.00000E+00 | 0.00000E+00 | -6.10300E+00 | 8.896996672 | -4.79175E+00
9 | 800 | 0.00000E+00 | 0.00000E+00 | -6.92066E+00 | 8.079343572 | -5.25495E+00
10 | 900 | 0.00000E+00 | 0.00000E+00 | -8.20256E+00 | 6.797437932 | -5.73335E+00