This optimizer is a non-dominated sorting genetic algorithm (NSGA-II) that solves non-convex, non-smooth single- and multi-objective optimization problems. The algorithm attempts to perform global optimization while enforcing constraints using a tournament-selection-based strategy.
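The two mechanisms named above (Pareto dominance and constraint-aware tournament selection) can be sketched in plain Python. This is an illustrative sketch of Deb's constraint-domination rule, not the wrapper's actual C implementation; the tuple representation of individuals is an assumption made for the example.

```python
import random

def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2 (minimization)."""
    return all(a <= b for a, b in zip(f1, f2)) and any(a < b for a, b in zip(f1, f2))

def constrained_tournament(ind1, ind2):
    """Constraint-domination tournament in the style of NSGA-II.

    Each individual is a hypothetical (objectives, violation) pair, where
    violation is 0.0 for feasible points and the summed constraint
    violation otherwise.  A feasible point beats an infeasible one; two
    infeasible points are compared by violation; two feasible points are
    compared by Pareto dominance, with ties broken randomly.
    """
    (f1, v1), (f2, v2) = ind1, ind2
    if v1 == 0.0 and v2 > 0.0:
        return ind1
    if v2 == 0.0 and v1 > 0.0:
        return ind2
    if v1 > 0.0 and v2 > 0.0:
        return ind1 if v1 < v2 else ind2
    if dominates(f1, f2):
        return ind1
    if dominates(f2, f1):
        return ind2
    return random.choice((ind1, ind2))
```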
Currently, the Python wrapper does not catch exceptions. If there is any error in the user-supplied function, you will get a segfault with no indication of where it happened. Please make sure the objective function is free of errors before trying to use NSGA2.
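One way to protect against this is to trap errors inside the user function itself and report them through the `fail` flag that pyOptSparse objective functions return, instead of letting a Python exception propagate into the C code. A minimal sketch (the variable-group name `"xvars"` and the function keys are illustrative):

```python
def objfunc(xdict):
    """Objective with its own error handling.

    Returns the pyOptSparse-style (funcs, fail) pair.  Since the NSGA2
    wrapper does not catch exceptions, any error raised here is trapped
    and signaled via the fail flag rather than crashing the optimizer.
    """
    funcs = {}
    fail = False
    try:
        x = xdict["xvars"]
        funcs["obj1"] = (x[0] - 1.0) ** 2
        funcs["obj2"] = (x[0] + 1.0) ** 2
    except Exception:
        # Signal failure for this evaluation instead of segfaulting.
        fail = True
    return funcs, fail
```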
NSGA2 Optimizer Class - Inherited from Optimizer Abstract Class
__call__(self, optProb, storeHistory=None, hotStart=None, **kwargs)¶
This is the main routine used to solve the optimization problem.
- optProb : Optimization or Solution class instance
This is the complete description of the optimization problem to be solved by the optimizer.
- storeHistory : str
File name of the history file into which the history of this optimization will be stored.
- hotStart : str
File name of the history file to “replay” for the optimization. The optimization problem used to generate the history file specified in ‘hotStart’ must be IDENTICAL to the currently supplied ‘optProb’: every single parameter must match. As soon as an evaluation point requested by NSGA2 does not match the history, function and gradient evaluations revert back to normal evaluations.
The kwargs allow a sens= argument to be supplied for interface compatibility with other optimizers, but it is ignored by NSGA2.