Installation

First, make sure you have a Python environment installed. We recommend Miniconda3 or Anaconda3; verify the installation with:

conda --version

Then create a fresh virtual environment for pymoo:

conda create -n pymoo -y python=3.6 cython numpy
conda activate pymoo

For the current stable release please execute:

pip install pymoo

For the current development version:

git clone https://github.com/msu-coinlab/pymoo
cd pymoo
pip install .

For speedup, some modules are also available in compiled form, and you can double-check whether the compilation worked. When executing the command, make sure you are not inside the local pymoo directory; otherwise the local sources would be imported instead of the version installed in site-packages.

python -c "from pymoo.cython.function_loader import is_compiled;print('Compiled Extensions: ', is_compiled())"

Algorithms

Genetic Algorithm

A simple genetic algorithm to solve single-objective problems.

from pymoo.optimize import minimize
from pymop.factory import get_problem

problem = get_problem("g01")

res = minimize(problem,
               method='ga',
               method_args={
                   'pop_size': 100,
                   'eliminate_duplicates': False,
               },
               termination=('n_gen', 50),
               disp=True)

print("Best solution found: %s" % res.X)
print("Function value: %s" % res.F)

NSGA2

The algorithm is implemented as described in [1].

import numpy as np

from pymoo.optimize import minimize
from pymoo.util import plotting
from pymop.factory import get_problem

# create the optimization problem
problem = get_problem("zdt1")
pf = problem.pareto_front()

res = minimize(problem,
               method='nsga2',
               method_args={'pop_size': 100},
               termination=('n_gen', 200),
               pf=pf,
               save_history=True,
               disp=True)

plot = True
if plot:
    plotting.plot(pf, res.F, labels=["Pareto-front", "F"])

# set to True if you want to save a video
animate = False
if animate:
    from pymoo.util.plotting import animate as func_animate
    H = np.concatenate([e.pop.get("F")[None, :] for e in res.history], axis=0)
    func_animate('%s.mp4' % problem.name(), H, problem)

NSGA-II is a non-dominated sorting genetic algorithm for bi-objective problems. Mating selection is done with a binary tournament that compares non-domination rank and crowding distance. The crowding distance is a niching measure which, for each objective, sums up the distance between a solution's two neighbours. Survival selection orders solutions by the front they belong to and breaks ties with the crowding distance, which preserves diversity while converging.
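
To make the niching measure concrete, here is a minimal sketch of the crowding-distance computation described above (the function name and exact normalization are illustrative, not pymoo's internal API):

```python
import numpy as np

def crowding_distance(F):
    """Crowding distance for objective vectors F of shape (n_points, n_obj).

    Per objective, solutions are sorted and each interior solution is
    credited the normalized gap between its two neighbours; boundary
    solutions get infinity so they are always preferred.
    """
    n_points, n_obj = F.shape
    dist = np.zeros(n_points)
    for m in range(n_obj):
        order = np.argsort(F[:, m])
        f = F[order, m]
        dist[order[0]] = dist[order[-1]] = np.inf
        span = f[-1] - f[0]
        if span > 0:
            for i in range(1, n_points - 1):
                dist[order[i]] += (f[i + 1] - f[i - 1]) / span
    return dist
```

In the binary tournament, a solution with a larger crowding distance wins against one of equal rank, which pushes the search towards sparsely populated regions of the front.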

NSGA3

A reference-based algorithm for solving many-objective problems [2] [3]. Survival selection uses the perpendicular distance of the solutions to the reference directions; the objectives are normalized using the boundary intersection method.
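
The survival criterion can be illustrated with a short sketch (an illustrative helper, not pymoo's internal implementation): each reference direction spans a ray from the origin, and a solution's distance to it is the norm of the solution's component orthogonal to that ray.

```python
import numpy as np

def perpendicular_distance(points, ref_dirs):
    """Distance of each objective vector to each reference-direction ray.

    points: (n_points, n_obj), ref_dirs: (n_dirs, n_obj). The distance of a
    point to a ray is the norm of the point minus its projection on the ray.
    """
    # unit vectors along the reference directions
    u = ref_dirs / np.linalg.norm(ref_dirs, axis=1, keepdims=True)
    # projection lengths of every point on every direction: (n_points, n_dirs)
    proj = points @ u.T
    # orthogonal component of each point w.r.t. each direction
    rejection = points[:, None, :] - proj[:, :, None] * u[None, :, :]
    return np.linalg.norm(rejection, axis=2)  # (n_points, n_dirs)
```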

import matplotlib.pyplot as plt

from pymoo.optimize import minimize
from pymoo.util import plotting
from pymoo.util.reference_direction import UniformReferenceDirectionFactory
from pymop.factory import get_problem

problem = get_problem("c3dtlz4", n_var=12, n_obj=3)

# create the reference directions to be used for the optimization
ref_dirs = UniformReferenceDirectionFactory(3, n_points=91).do()
#ref_dirs = UniformReferenceDirectionFactory(2, n_points=100).do()

# create the pareto front for the given reference lines
pf = problem.pareto_front(ref_dirs)

res = minimize(problem,
               method='nsga3',
               method_args={
                   'pop_size': 92,
                   'ref_dirs': ref_dirs
               },
               termination=('n_gen', 1000),
               pf=pf,
               seed=4,
               disp=True)

plotting.plot(res.F)
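
The uniform reference directions used above follow the Das and Dennis simplex-lattice construction. A minimal illustrative sketch (not pymoo's `UniformReferenceDirectionFactory` itself) that reproduces the 91 points for 3 objectives with 12 partitions:

```python
from itertools import combinations

import numpy as np

def das_dennis(n_obj, n_partitions):
    """Uniform points on the unit simplex (Das and Dennis construction).

    Every coordinate is a multiple of 1/n_partitions and each row sums to 1;
    the number of points is C(n_partitions + n_obj - 1, n_obj - 1), e.g.
    C(14, 2) = 91 for 3 objectives and 12 partitions.
    """
    points = []
    for dividers in combinations(range(n_partitions + n_obj - 1), n_obj - 1):
        prev, coords = -1, []
        for d in dividers:
            coords.append(d - prev - 1)  # "stars" between consecutive "bars"
            prev = d
        coords.append(n_partitions + n_obj - 2 - prev)
        points.append(coords)
    return np.array(points, dtype=float) / n_partitions
```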

UNSGA3

from pymoo.optimize import minimize
from pymoo.util import plotting
from pymoo.util.reference_direction import UniformReferenceDirectionFactory
from pymop.factory import get_problem

problem = get_problem("dtlz2", n_var=12, n_obj=3)

# create the reference directions to be used for the optimization
ref_dirs = UniformReferenceDirectionFactory(3, n_points=91).do()

# create the pareto front for the given reference lines
pf = problem.pareto_front(ref_dirs)

res = minimize(problem,
               method='unsga3',
               method_args={
                   'pop_size': 100,
                   'ref_dirs': ref_dirs},
               termination=('n_gen', 200),
               pf=pf,
               disp=True)
plotting.plot(res.F)

RNSGA3

import numpy as np

from pymoo.optimize import minimize
from pymoo.util import plotting
from pymoo.util.reference_direction import UniformReferenceDirectionFactory
from pymop.factory import get_problem

problem = get_problem("zdt1")
pf = problem.pareto_front()

# create the reference directions to be used for the optimization
ref_points = np.array([[0.3, 0.4], [0.8, 0.5]])

res = minimize(problem,
               method='rnsga3',
               method_args={
                   'ref_points': ref_points,
                   'pop_per_ref_point': 50,
                   'mu': 0.1
               },
               termination=('n_gen', 400),
               pf=pf,
               disp=True)
plotting.plot(pf, res.F, ref_points, show=True, labels=['pf', 'F', 'ref_points'])


problem = get_problem("dtlz4", n_var=12, n_obj=3)
ref_dirs = UniformReferenceDirectionFactory(3, n_points=91).do()
pf = problem.pareto_front(ref_dirs)

# create the reference directions to be used for the optimization
ref_points = np.array([[1.0, 0.5, 0.2], [0.3, 0.2, 0.6]])

res = minimize(problem,
               method='rnsga3',
               method_args={
                   'ref_points': ref_points,
                   'pop_per_ref_point': 91,
                   'mu': 0.1
               },
               termination=('n_gen', 400),
               pf=pf,
               disp=True)
plotting.plot(pf, res.F, ref_points, show=True, labels=['pf', 'F', 'ref_points'])

Differential Evolution

The classical single-objective differential evolution algorithm [4], for which different mutation and crossover variants can be selected. It is known to perform well on global optimization problems.
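
One generation of the DE/best/1/bin variant used below can be sketched as follows (a simplified illustration, not pymoo's implementation; the classical scheme additionally excludes the target index when drawing the difference vectors):

```python
import numpy as np

def de_best_1_bin(X, f, F_weight=0.75, CR=0.9, rng=None):
    """Build one generation of trial vectors for DE/best/1/bin.

    X: population (n, d); f: fitness values (n,), to be minimized. Each
    mutant is the best individual plus a weighted difference of two random
    distinct individuals; binomial crossover then mixes mutant and target.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    best = X[np.argmin(f)]
    trials = np.empty_like(X)
    for i in range(n):
        r1, r2 = rng.choice(n, size=2, replace=False)
        mutant = best + F_weight * (X[r1] - X[r2])
        cross = rng.random(d) < CR          # binomial crossover mask
        cross[rng.integers(d)] = True       # guarantee at least one mutant gene
        trials[i] = np.where(cross, mutant, X[i])
    return trials
```

In the full algorithm, each trial vector replaces its target only if it has a better fitness value.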

from pymoo.optimize import minimize
from pymop.factory import get_problem

problem = get_problem("rastrigin", n_var=10)

res = minimize(problem,
               method='de',
               method_args={
                   'variant': "DE/best/1/bin",
                   'CR': 0.9,
                   'F': 0.75,
                   'n_replace': 5,
                   'pop_size': 200
               },
               termination=('n_gen', 1000),
               disp=True)

print("Best solution found: %s" % res.X)
print("Function value: %s" % res.F)

Contributors

Julian Blank

Contact

Feel free to contact me if you have any questions:

Julian Blank (blankjul [at] egr.msu.edu)
Michigan State University
Computational Optimization and Innovation Laboratory (COIN)
East Lansing, MI 48824, USA

References

[1] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182–197, April 2002. doi:10.1109/4235.996017.
[2] Kalyanmoy Deb and Himanshu Jain. An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: solving problems with box constraints. IEEE Transactions on Evolutionary Computation, 18(4):577–601, 2014. doi:10.1109/TEVC.2013.2281535.
[3] K. Deb and H. Jain. An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: solving problems with box constraints. IEEE Transactions on Evolutionary Computation, 18(4):577–601, Aug 2014. doi:10.1109/TEVC.2013.2281535.
[4] Kenneth Price, Rainer M. Storn, and Jouni A. Lampinen. Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series). Springer-Verlag, Berlin, Heidelberg, 2005. ISBN 3540209506.
[5] Qingfu Zhang and Hui Li. A multi-objective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, accepted, 2007.
[6] Kalyanmoy Deb, Karthik Sindhya, and Tatsuya Okabe. Self-adaptive simulated binary crossover for real-parameter optimization. In Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO '07), 1187–1194. New York, NY, USA, 2007. ACM. doi:10.1145/1276958.1277190.

Changelog

0.2.2

  • Several improvements in the code structure
  • Make the cython support optional
  • Modifications for pymop 0.2.3

0.2.1

  • First official release providing NSGA2, NSGA3 and RNSGA3