Many-Objective Optimization

Before the introduction proper, we recommend three good many-objective optimizers:

  • (i) PICEA-g, from the preference-inspired co-evolutionary class
  • (ii) HypE, from the indicator-based class
  • (iii) NSGA-III, from the decomposition-based class

    Introduction to MOPs, EMO, and MaOPs


    Multi-objective problems (MOPs) regularly arise in real-world design scenarios, where two or more objectives must be optimized simultaneously. As such objectives are often in competition with one another, the optimal solution of a MOP is a set of trade-off solutions rather than a single solution. Because they are population based, multi-objective evolutionary algorithms (MOEAs) are well suited to solving MOPs: a single run naturally generates an approximation of the trade-off surface (or Pareto front). To date there are mainly three classes of MOEAs, according to their selection strategies: Pareto-dominance based, decomposition based, and indicator based.
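The Pareto-dominance relation that underlies these selection strategies can be sketched as follows. This is an illustrative minimal example (assuming minimization of all objectives), not code taken from any of the algorithms discussed here.

```python
# Illustrative sketch of Pareto dominance between objective vectors,
# assuming all objectives are to be minimized.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b: a is no worse in
    every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(population):
    """Return the objective vectors not dominated by any other member."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# (1, 5), (5, 1) and (3, 3) trade off against each other, while (4, 4)
# is dominated by (3, 3) and therefore removed.
print(non_dominated([(1, 5), (5, 1), (3, 3), (4, 4)]))
```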

    Pareto-dominance based MOEAs, e.g., MOGA (Fonseca and Fleming 1993), NSGA-II (Deb et al. 2002) and SPEA2 (Zitzler et al. 2002), were some of the earliest approaches and are generally accepted to perform well on MOPs with 2 and 3 objectives. However, their search capability often degrades significantly as the number of objectives increases (Purshouse and Fleming 2003). This is because the proportion of Pareto optimal (i.e., non-dominated) objective vectors in the population grows rapidly once a MOP has more than 3 objectives, that is, once it becomes a many-objective problem (MaOP). As a result, insufficient selection pressure is generated towards the Pareto front.
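The loss of selection pressure can be illustrated empirically: in a population of uniformly random objective vectors, the fraction of mutually non-dominated points rises sharply with the number of objectives. The sketch below is self-contained and uses random sampling; it is not taken from any of the cited studies.

```python
# Illustrative sketch: fraction of non-dominated points in a random
# population as the number of objectives grows. With many objectives,
# almost every point is non-dominated, so Pareto ranking alone can no
# longer discriminate between candidates.
import random

def dominates(a, b):
    """Pareto dominance for minimization."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_fraction(n_points, n_objectives, seed=0):
    rng = random.Random(seed)
    pop = [tuple(rng.random() for _ in range(n_objectives))
           for _ in range(n_points)]
    nd = [p for p in pop if not any(dominates(q, p) for q in pop)]
    return len(nd) / n_points

for m in (2, 5, 10):
    print(m, "objectives:", non_dominated_fraction(100, m))
```

With 2 objectives only a handful of the 100 random points are non-dominated, while with 10 objectives nearly all of them are.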

    To handle MaOPs, considerable effort has been invested in other types of MOEAs. Representative examples include:

    • (i) modified Pareto-dominance relation based MOEAs, e.g., epsilon-MOEA (Deb et al. 2003),
    • (ii) modified density estimation within Pareto-based MOEAs, e.g., the shift-based density estimation (SDE) strategy (Li et al. 2014) and grid based EAs (Yang et al. 2013),
    • (iii) ranking based MOEAs, e.g., the average ranking based MOEA (Corne and Knowles 2007),
    • (iv) dimension reduction based MOEAs, e.g., principal component analysis based NSGA-II (Deb and Saxena 2006, Lygoe et al. 2010) and the divide-and-conquer based MOEA (Purshouse and Fleming 2003),
    • (v) indicator based MOEAs, e.g., IBEA (Zitzler and Künzli 2004), SMS-EMOA (Emmerich et al. 2005), and HypE (Bader and Zitzler 2011),
    • (vi) decomposition based MOEAs, e.g., CMOGA (Murata et al. 2001), MSOPS (Hughes 2003), MOEA/D (Zhang and Li 2007), and NSGA-III (Deb and Jain 2014).
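The decomposition idea behind MOEA/D-style methods can be sketched briefly: the many-objective problem is split into scalar subproblems, each defined by a weight vector, here using the weighted Chebyshev (Tchebycheff) scalarizing function. The weight vectors, candidate solutions, and reference point below are arbitrary illustrative values, not taken from any paper or implementation.

```python
# Illustrative sketch of decomposition via the weighted Chebyshev
# (Tchebycheff) scalarizing function, as used in MOEA/D-style methods.
# All numbers below are made up for illustration.

def tchebycheff(f, weights, z_star):
    """Scalarize objective vector f with a weight vector relative to an
    ideal (reference) point z_star; smaller is better."""
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, z_star))

z_star = (0.0, 0.0)                        # assumed ideal point
candidates = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]

# Each weight vector defines one scalar subproblem; minimizing it steers
# the search towards a different region of the Pareto front.
for w in [(0.9, 0.1), (0.5, 0.5), (0.1, 0.9)]:
    best = min(candidates, key=lambda f: tchebycheff(f, w, z_star))
    print(w, "->", best)
```

A heavy weight on one objective penalizes deviation in that objective most, so the subproblem selects the candidate that is best in it; together the subproblems cover the front.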

    Our algorithms: PICEAs

    Please contact me for more information about the PICEAs.