Version: 0.5.0

# Repair

The repair operator is mostly problem-dependent. Most commonly, it is used to make sure the algorithm only searches in the feasible space. It is applied after the offspring have been reproduced. In the following, we use the knapsack problem to demonstrate the repair operator in pymoo.

Consider the well-known knapsack problem: a knapsack has to be filled with items without violating the maximum weight constraint. Each item $$j$$ has a value $$b_j \geq 0$$ and a weight $$w_j \geq 0$$, where $$j \in \{1, .., m\}$$. The binary decision vector $$z = (z_1, .., z_m)$$ defines whether an item is picked or not. The aim is to maximize the profit $$g(z)$$:

\begin{eqnarray} \max & & g(z) = \sum_{j=1}^{m} z_j \, b_j \\[2mm] \notag \text{s.t.} & & \sum_{j=1}^m z_j \, w_j \leq Q \\[1mm] \notag & & z = (z_1, .., z_m) \in \mathbb{B}^m \\[1mm] \notag \end{eqnarray}
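To make the objective and constraint concrete, here is a small standalone sketch; the item values `b`, weights `w`, and capacity `Q` below are made-up illustration data, not taken from pymoo:

```python
import numpy as np

# hypothetical item values, weights, and capacity (illustration only)
b = np.array([10, 40, 30, 50])   # values b_j
w = np.array([5, 4, 6, 3])       # weights w_j
Q = 10                           # knapsack capacity

# a binary packing plan z: pick items 1 and 3
z = np.array([0, 1, 0, 1], dtype=bool)

profit = (z * b).sum()           # g(z) = sum_j z_j * b_j
weight = (z * w).sum()           # total packed weight
feasible = weight <= Q           # the capacity constraint

print(profit, weight, feasible)  # 90 7 True
```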

A simple GA will have some infeasible evaluations in the beginning and then concentrate on the feasible space.

[1]:

from pymoo.factory import get_crossover, get_mutation, get_sampling
from pymoo.optimize import minimize
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.problems.single.knapsack import create_random_knapsack_problem

problem = create_random_knapsack_problem(30)

algorithm = GA(pop_size=200,
               sampling=get_sampling("bin_random"),
               crossover=get_crossover("bin_hux"),
               mutation=get_mutation("bin_bitflip"),
               eliminate_duplicates=True)

res = minimize(problem,
               algorithm,
               termination=('n_gen', 10),
               verbose=True)

===========================================================================
n_gen |  n_eval |   cv (min)   |   cv (avg)   |     fopt     |     favg
===========================================================================
1 |     200 |  2.36000E+02 |  5.18510E+02 |            - |            -
2 |     400 |  2.14000E+02 |  3.84785E+02 |            - |            -
3 |     600 |  5.80000E+01 |  3.03340E+02 |            - |            -
4 |     800 |  1.30000E+01 |  2.28240E+02 |            - |            -
5 |    1000 |  0.00000E+00 |  1.55045E+02 | -3.56000E+02 | -3.13000E+02
6 |    1200 |  0.00000E+00 |  9.43350E+01 | -4.85000E+02 | -3.29800E+02
7 |    1400 |  0.00000E+00 |  4.37700E+01 | -4.85000E+02 | -2.94053E+02
8 |    1600 |  0.00000E+00 |  1.37600E+01 | -5.18000E+02 | -2.75719E+02
9 |    1800 |  0.00000E+00 |  0.190000000 | -5.73000E+02 | -2.80931E+02
10 |    2000 |  0.00000E+00 |  0.00000E+00 | -5.73000E+02 | -3.34395E+02


The constraint $$\sum_{j=1}^m z_j \, w_j \leq Q$$ is fairly easy to satisfy. Therefore, we can make sure it is never violated by repairing individuals before the objective function is evaluated. For this purpose, a repair class has to be defined which receives the population as input and returns the repaired population.

[2]:

import numpy as np
from pymoo.core.repair import Repair

class ConsiderMaximumWeightRepair(Repair):

    def _do(self, problem, pop, **kwargs):

        # maximum capacity for the problem
        Q = problem.C

        # the packing plan for the whole population (each row one individual)
        Z = pop.get("X")

        # the corresponding weight of each individual
        weights = (Z * problem.W).sum(axis=1)

        # now repair each individual i
        for i in range(len(Z)):

            # the packing plan for i
            z = Z[i]

            # while the maximum capacity violation holds
            while weights[i] > Q:

                # randomly select an item currently picked
                item_to_remove = np.random.choice(np.where(z)[0])

                # and remove it
                z[item_to_remove] = False

                weights[i] -= problem.W[item_to_remove]

        # set the design variables for the population
        pop.set("X", Z)
        return pop
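The removal loop inside `_do` can also be exercised in isolation. The following sketch applies the same while-loop to a single made-up packing plan, without any pymoo objects; the weights `w` and capacity `Q` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

w = np.array([6.0, 4.0, 5.0, 3.0])       # hypothetical item weights
Q = 8.0                                  # hypothetical capacity
z = np.array([True, True, True, False])  # overweight plan: weight 15 > Q

weight = (z * w).sum()
while weight > Q:
    # randomly drop one of the currently packed items
    item = rng.choice(np.where(z)[0])
    z[item] = False
    weight -= w[item]

# after the loop, the plan is guaranteed to be feasible
assert (z * w).sum() <= Q
```

Because items are removed at random rather than by value, different runs can end at different feasible plans; the guarantee is only feasibility, not optimality.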

[3]:

algorithm = GA(pop_size=200,
               sampling=get_sampling("bin_random"),
               crossover=get_crossover("bin_hux"),
               mutation=get_mutation("bin_bitflip"),
               repair=ConsiderMaximumWeightRepair(),
               eliminate_duplicates=True)

res = minimize(problem,
               algorithm,
               termination=('n_gen', 10),
               verbose=True)


===========================================================================
n_gen |  n_eval |   cv (min)   |   cv (avg)   |     fopt     |     favg
===========================================================================
1 |     171 |  0.00000E+00 |  0.00000E+00 | -3.87000E+02 | -1.51398E+02
2 |     371 |  0.00000E+00 |  0.00000E+00 | -4.82000E+02 | -2.31815E+02
3 |     571 |  0.00000E+00 |  0.00000E+00 | -4.89000E+02 | -2.88540E+02
4 |     771 |  0.00000E+00 |  0.00000E+00 | -5.14000E+02 | -3.38850E+02
5 |     971 |  0.00000E+00 |  0.00000E+00 | -5.94000E+02 | -3.86450E+02
6 |    1171 |  0.00000E+00 |  0.00000E+00 | -6.09000E+02 | -4.29760E+02
7 |    1371 |  0.00000E+00 |  0.00000E+00 | -6.69000E+02 | -4.69560E+02
8 |    1571 |  0.00000E+00 |  0.00000E+00 | -6.69000E+02 | -5.03635E+02
9 |    1771 |  0.00000E+00 |  0.00000E+00 | -6.70000E+02 | -5.25405E+02
10 |    1971 |  0.00000E+00 |  0.00000E+00 | -6.70000E+02 | -5.48395E+02


As demonstrated, the repair operator makes sure that no infeasible solution is evaluated. Even though this example is rather simple, the repair operator is especially useful for more complex constraints where domain-specific knowledge can be exploited.