pymoo: Multi-objective Optimization in Python

Overview

There are some breaking changes in pymoo 0.5.0.

  • The module pymoo.models has been renamed to pymoo.core.

  • The package structure has been modified to distinguish between single- and multi-objective optimization more clearly. For instance, the implementation of PSO has been moved from pymoo.algorithms.so_pso to pymoo.algorithms.soo.nonconvex.pso.

  • Furthermore, the elementwise_evaluation flag has been replaced by the ElementwiseProblem class, which problems now inherit from (see the sketch below).
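For example, a problem that previously set elementwise_evaluation=True is now defined by inheriting from ElementwiseProblem. A minimal sketch (the SphereProblem below is a made-up example, not part of pymoo):

import numpy as np

from pymoo.core.problem import ElementwiseProblem


class SphereProblem(ElementwiseProblem):

    def __init__(self):
        # two variables, one objective, no constraints
        super().__init__(n_var=2, n_obj=1, n_constr=0,
                         xl=np.array([-5.0, -5.0]), xu=np.array([5.0, 5.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        # x is a single solution vector, since evaluation happens elementwise
        out["F"] = np.sum(x ** 2)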

Enjoy our new release!

Our framework offers state-of-the-art single- and multi-objective optimization algorithms and many more features related to multi-objective optimization, such as visualization and decision making. pymoo is available on PyPI and can be installed with:

pip install -U pymoo

Please note that some modules can be compiled to speed up computations (optional). The command above attempts to compile these modules; if compilation fails, the pure-Python version is installed instead. More information is available in our Installation Guide.

Getting Started

Getting Started Guide: walks through the most important steps in multi-objective optimization, namely Problem Definition, Optimization, Convergence Analysis, and Decision Making.
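To give a first impression of these steps, the snippet below defines a test problem, runs an optimization, and prints the objective values of the result. A minimal sketch, assuming the 0.5.0 module layout and the factory helpers (pymoo.factory); "zdt1" is one of the test problems shipped with pymoo:

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.optimize import minimize

# a bi-objective test problem shipped with pymoo
problem = get_problem("zdt1")

# run NSGA-II for 200 generations
algorithm = NSGA2(pop_size=100)
res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)

# objective values of the non-dominated solutions found
print(res.F)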

Newsletter

Newsletter: Sign up for our newsletter to stay up to date on current development and new features.

Features

Furthermore, our framework offers a variety of features covering various facets of multi-objective optimization:

List of Algorithms

Algorithms available in pymoo. Each entry lists the algorithm, its class, the convenience string used to create it by name, the supported objective(s) (single, multi, or many), and whether constraints are handled.

  • Genetic Algorithm (class GA, convenience "ga"): single-objective, handles constraints. A modular implementation of a genetic algorithm. It can be easily customized with different evolutionary operators and applies to a broad category of problems.

  • Differential Evolution (class DE, convenience "de"): single-objective, handles constraints. Different variants of differential evolution, a well-known concept in continuous optimization, especially for global optimization.

  • Biased Random Key Genetic Algorithm (class BRKGA, convenience "brkga"): single-objective, handles constraints. Mostly used for combinatorial optimization, where instead of custom evolutionary operators the complexity is put into an advanced variable encoding.

  • Nelder Mead (class NelderMead, convenience "nelder-mead"): single-objective, handles constraints. A point-by-point based algorithm which keeps track of a simplex that is either extended, reflected, or shrunk.

  • Pattern Search (class PatternSearch, convenience "pattern-search"): single-objective, handles constraints. An iterative approach where the search direction is estimated by forming a specific exploration pattern around the current best solution.

  • CMAES (class CMAES, convenience "cmaes"): single-objective, no constraint handling. A well-known model-based algorithm that samples from a dynamically updated normal distribution in each iteration.

  • Evolutionary Strategy (class ES, convenience "es"): single-objective, no constraint handling. The evolutionary strategy algorithm proposed for real-valued optimization problems.

  • Stochastic Ranking Evolutionary Strategy (class SRES, convenience "sres"): single-objective, handles constraints. An evolutionary strategy with constraint handling based on stochastic ranking.

  • Improved Stochastic Ranking Evolutionary Strategy (class ISRES, convenience "isres"): single-objective, handles constraints. An improved version of SRES, able to deal with dependent variables efficiently.

  • NSGA-II (class NSGA2, convenience "nsga2"): multi-objective, handles constraints. A well-known multi-objective optimization algorithm based on non-dominated sorting and crowding.

  • R-NSGA-II (class RNSGA2, convenience "rnsga2"): multi-objective, handles constraints. An extension of NSGA-II where reference/aspiration points can be provided by the user.

  • NSGA-III (class NSGA3, convenience "nsga3"): many-objective, handles constraints. An improvement of NSGA-II developed for multi-objective optimization problems with more than two objectives.

  • U-NSGA-III (class UNSGA3, convenience "unsga3"): many-objective, handles constraints. A generalization of NSGA-III to be more efficient for single- and bi-objective optimization problems.

  • R-NSGA-III (class RNSGA3, convenience "rnsga3"): many-objective, handles constraints. Allows defining aspiration points for NSGA-III to incorporate the user's preference.

  • MOEAD (class MOEAD, convenience "moead"): many-objective, no constraint handling. Another well-known multi-objective optimization algorithm based on decomposition.

  • AGE-MOEA (class AGEMOEA, convenience "agemoea"): many-objective, no constraint handling. Similar to NSGA-II, but estimates the shape of the Pareto front to compute a score that replaces the crowding distance.

  • C-TAEA (class CTAEA, convenience "ctaea"): many-objective, handles constraints. An algorithm with more sophisticated constraint handling for many-objective optimization.
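The convenience strings above can be used to instantiate an algorithm by name. A minimal sketch, assuming the factory helpers shipped with the 0.5.0 release (pymoo.factory), where keyword arguments are passed through to the algorithm class:

from pymoo.factory import get_algorithm

# look up the algorithm class by its convenience string and construct it;
# this is equivalent to importing and instantiating NSGA2 directly
algorithm = get_algorithm("nsga2", pop_size=100)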

Cite Us

If you have used our framework for research purposes, you can cite our publication as follows:

@ARTICLE{pymoo,
    author={J. {Blank} and K. {Deb}},
    journal={IEEE Access},
    title={pymoo: Multi-Objective Optimization in Python},
    year={2020},
    volume={8},
    number={},
    pages={89497-89509},
}

News

September 12, 2021: After quite some time, a bigger release of pymoo (version 0.5.0) is available. The project has made significant progress regarding its structure and has an entirely new module organization. Even though there might be some breaking changes for users, this will improve the clarity and readability of the code in the long term. The documentation has received a completely new design and is now responsive. In addition, some algorithms have been improved (PSO, DE) and others added (AGEMOEA, ES, SRES, ISRES). For more details, please have a look at the changelog. (Release Notes)

September 4, 2020: We are more than happy to announce that a new version of pymoo (version 0.4.2) is available. This version has some new features and evolutionary operators, as well as an improved getting started guide. For more details, please have a look at the release notes. (Release Notes)

May 4, 2020: A new release of pymoo is available. Version 0.4.1 of our framework contains a novel method to generate an arbitrary number of reference directions. Reference directions are required to run most of the many-objective optimization algorithms, such as NSGA3 or MOEAD. Moreover, we have fixed minor bugs and provided a basic implementation of the well-known Hooke and Jeeves Pattern Search algorithm for single-objective problems. (Release Notes)

More News

About

This framework is powered by anyoptimization, a Python research community. It is developed and maintained by Julian Blank, who is affiliated with the Computational Optimization and Innovation Laboratory (COIN), supervised by Kalyanmoy Deb, at Michigan State University in East Lansing, Michigan, USA.

We have developed the framework for research purposes and hope to contribute to the research area by delivering tools for solving and analyzing multi-objective problems. Each algorithm is implemented as closely as possible to its proposed version, to the best of our knowledge. NSGA-II and NSGA-III have been developed collaboratively with one of their authors, and we therefore recommend using them for official benchmarks.

If you intend to use our framework for any profit-making purposes, please contact us. Also, be aware that even state-of-the-art algorithms are just the starting point for many optimization problems. The full potential of genetic algorithms requires customization and the incorporation of domain knowledge. We have more than 20 years of experience in the optimization field and are eager to tackle challenging problems. Let us know if you are interested in working with experienced collaborators in optimization. Please keep in mind that only through such projects can we keep developing and improving our framework and make sure it meets the industry's current needs.

Moreover, any kind of contribution is more than welcome:

(i) Give us a star on GitHub. This makes not only our framework but multi-objective optimization in general more accessible, since the repository ranks higher for relevant keywords.

(ii) To offer more and more new algorithms and features, we are more than happy if somebody wants to contribute by developing code. You can see it as a win-win situation, because your development will be linked to your publication(s), which can significantly increase the visibility of your work. Please note that we aim to keep a high level of code quality, and some refactoring might be suggested.

(iii) You like our framework and would like to use it for profit-making purposes? We are always searching for industrial collaborations, because they help direct research to meet the industry's needs. Solving practical problems has a high priority in our laboratory for every student, and such projects can help you benefit from the research experience we have gained over the last years.

If you find a bug or have any kind of concern regarding correctness, please use our Issue Tracker. Nobody is perfect. Moreover, only if we are aware of issues can we start to investigate them.

Contact

Julian Blank (blankjul [at] egr.msu.edu)
Michigan State University
Computational Optimization and Innovation Laboratory (COIN)
East Lansing, MI 48824, USA