Optimization

class pyoptsparse.pyOpt_optimization.Optimization(name, objFun, comm=None)[source]

Create a description of an optimization problem.

Parameters:
name : str

Name given to the optimization problem. This name is currently not used for anything, but may be in the future.

objFun : python function

Python function handle of function used to evaluate the objective function.

comm : MPI intra-communicator

The communicator this problem will be solved on. It is required both when the objective is computed in parallel and when the internal parallel gradient computations are used. Defaults to MPI.COMM_WORLD if not given.

__init__(self, name, objFun, comm=None)[source]

Initialize self. See help(type(self)) for accurate signature.
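
For illustration, a minimal sketch of constructing a problem is shown below; it assumes the standard pyOptSparse objective callback convention of returning a dictionary of function values together with a fail flag, and the names ‘xvars’, ‘obj’ and ‘con’ are placeholders.

>>> from pyoptsparse import Optimization
>>> def objfunc(xdict):
...     x = xdict['xvars']            # values of an assumed DV group 'xvars'
...     funcs = {}
...     funcs['obj'] = (x[0] - 3.0)**2 + (x[1] + 1.0)**2
...     funcs['con'] = [x[0] + x[1]]  # one value for an assumed constraint group 'con'
...     fail = False
...     return funcs, fail
>>> optProb = Optimization('example problem', objfunc)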

addCon(self, name, *args, **kwargs)[source]

Convenience function. See addConGroup() for more information.

addConGroup(self, name, nCon, lower=None, upper=None, scale=1.0, linear=False, wrt=None, jac=None)[source]

Add a group of constraints to the optimization problem. This is the main function used for adding constraints to pyOptSparse.

Parameters:
name : str

Constraint name. All names given to constraints must be unique.

nCon : int

The number of constraints in this group

lower : scalar or array

The lower bound(s) for the constraint. If it is a scalar, it is applied to all nCon constraints. If it is an array, it must have length nCon.

upper : scalar or array

The upper bound(s) for the constraint. If it is a scalar, it is applied to all nCon constraints. If it is an array, it must have length nCon.

scale : scalar or array

A scaling factor for the constraint. It is generally advisable to scale constraints so that most are around the same order of magnitude.

linear : bool

Flag to specify if this constraint is linear. If the constraint is linear, both the ‘wrt’ and ‘jac’ keyword arguments must be given to specify the constant portion of the constraint jacobian.

wrt : iterable (list, set, OrderedDict, array etc)

‘wrt’ stands for ‘With Respect To’. This specifies which design variable groups have non-zero jacobian values for this set of constraints. The order is not important.

jac : dictionary

For linear and sparse non-linear constraints, the constraint jacobian must be passed in. The structure of the jac dictionary is as follows:

{‘dvName1’: <matrix1>, ‘dvName2’: <matrix2>, …}

The keys of the jacobian must correspond to the dvGroups given in the wrt keyword argument. The dimensions of each “chunk” of the constraint jacobian must be consistent. For example, <matrix1> must have a shape of (nCon, nDVs), where nDVs is the total number of design variables in dvName1. <matrix1> may be a dense numpy array or a scipy sparse matrix. However, it is HIGHLY recommended that sparse constraints be supplied to pyOptSparse using pyOptSparse’s simplified sparse matrix format. The reason for this is that it is impossible to force scipy sparse matrices to keep a fixed sparsity pattern; if the sparsity pattern changes during an optimization, IT WILL FAIL.

The three simplified pyOptSparse sparse matrix formats are summarized below:

mat = {'coo': [row, col, data], 'shape': [nrow, ncols]}      # A coo matrix
mat = {'csr': [rowp, colind, data], 'shape': [nrow, ncols]}  # A csr matrix
mat = {'csc': [colp, rowind, data], 'shape': [nrow, ncols]}  # A csc matrix

Note that for nonlinear constraints (linear=False), the values themselves in the matrices in jac do not matter, but the sparsity structure does. It is imperative that any entry that may at some point be non-zero is given a non-zero entry in the jac argument. That is, the sparsity structure of the jacobian is not allowed to change throughout the optimization. This stipulation is automatically checked internally.
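
For illustration, a sketch of adding a sparse linear constraint group is shown below; it assumes a design variable group named ‘x’ containing 3 variables has already been added, and the values are placeholders.

>>> import numpy as np
>>> row = np.array([0, 0, 1])          # constraint index of each non-zero entry
>>> col = np.array([0, 1, 2])          # design variable index of each non-zero entry
>>> data = np.array([1.0, 2.0, -1.0])  # constant jacobian values of the linear constraints
>>> jac = {'x': {'coo': [row, col, data], 'shape': [2, 3]}}
>>> optProb.addConGroup('lincon', 2, lower=0.0, upper=None,
...                     linear=True, wrt=['x'], jac=jac)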

addVar(self, name, *args, **kwargs)[source]

This is a convenience function. It simply calls addVarGroup() with nVars=1. Variables added with addVar() are returned as scalars.

addVarGroup(self, name, nVars, type='c', value=0.0, lower=None, upper=None, scale=1.0, offset=0.0, choices=None, **kwargs)[source]

Add a group of variables into a variable set. This is the main function used for adding variables to pyOptSparse.

Parameters:
name : str

Name of variable group. This name should be unique across all the design variable groups

nVars : int

Number of design variables in this group.

type : str.

String representing the type of variable. Suitable values for type are: ‘c’ for continuous variables, ‘i’ for integer values and ‘d’ for discrete selection.

value : scalar or array.

Starting value for design variables. If it is a scalar, the same value is applied to all ‘nVars’ variables. Otherwise, it must be an iterable object with length equal to ‘nVars’.

lower : scalar or array.

Lower bound of variables. Scalar/array usage is the same as the value keyword.

upper : scalar or array.

Upper bound of variables. Scalar/array usage is the same as the value keyword.

scale : scalar or array

A user-supplied scaling for the design variable group. This is often necessary when design variables of widely varying magnitudes are used within the same optimization. Scalar/array usage is the same as the value keyword.

offset : scalar or array

A user-supplied offset for the design variable group. This is often necessary when a design variable has a large magnitude but only changes slightly about that value.

choices : list

Specify a list of choices for discrete design variables

Notes

Calling addVar() and addVarGroup(…, nVars=1, …) is NOT equivalent! The variable added with addVar() will be returned as a scalar, while the variable returned from addVarGroup() will be an array of length 1.

It is recommended that addVar() and addVarGroup() calls follow the examples below by including all the keyword arguments. This makes the script author’s intent very clear. The type, value, lower, upper and scale should be given for all variables even if the default value is used.

Examples

>>> # Add a single design variable 'alpha'
>>> optProb.addVar('alpha', type='c', value=2.0, lower=0.0, upper=10.0, scale=0.1)
>>> # Add 10 unscaled variables of 0.5 between 0 and 1 with name 'y'
>>> optProb.addVarGroup('y', 10, type='c', value=0.5, lower=0.0, upper=1.0, scale=1.0)

delVar(self, name)[source]

Delete a variable or variable group

Parameters:
name : str

Name of variable or variable group to remove

getDVs(self)[source]

Return a dictionary of the design variables. In most common usage, this function is not required.

Returns:
outDVs : dict

The dictionary of variables. This is the same as the ‘x’ dictionary that would be used to call the user objective function.
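
For illustration, a short sketch is shown below; the group names ‘alpha’ and ‘y’ are assumed to have been added earlier with addVar() and addVarGroup().

>>> dvs = optProb.getDVs()
>>> print(dvs['alpha'])  # a scalar, since 'alpha' was added with addVar()
>>> print(dvs['y'])      # an array, since 'y' was added with addVarGroup()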

printSparsity(self, verticalPrint=False)[source]

This function prints an ASCII visualization of the jacobian sparsity structure. This helps the user visualize what pyOptSparse has been given and helps ensure it is what the user expected. It is highly recommended that this function be called before the start of every optimization to verify the optimization problem setup.

Parameters:
verticalPrint : bool

True if the design variable names in the header should be printed vertically instead of horizontally. If True, this will make the printed constraint Jacobian narrower and taller.

Warning

This function is collective on the optProb comm. It is therefore necessary to call this function on all processors of the optProb comm.
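
For illustration, the call below is made unconditionally so that it executes on every processor of the optProb comm, rather than being guarded by a rank check.

>>> # Collective call: execute on all processors of optProb.comm
>>> optProb.printSparsity(verticalPrint=True)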

setDVs(self, inDVs)[source]

Set the problem design variables from a dictionary. In most common usage, this function is not required.

Parameters:
inDVs : dict

The dictionary of variables. This dictionary is like the ‘x’ that would be used to call the user objective function.
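
For illustration, a short sketch assuming the groups ‘alpha’ and ‘y’ from the earlier examples is shown below; the keys and shapes must match the variables already added to the problem.

>>> import numpy as np
>>> optProb.setDVs({'alpha': 1.5, 'y': np.full(10, 0.1)})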

setDVsFromHistory(self, histFile, key=None)[source]

Set optimization variables from a previous optimization. This is like a cold start, but some variables may have been added or removed from the previous optimization. This will try to set all variables it can.

Parameters:
histFile : str

Filename of the history file to read

key : str

Key of the history file to use for the x values. The default is None, which will use the last x-value stored in the dictionary.
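
For illustration, a short sketch is shown below; ‘opt_hist.hst’ is a hypothetical filename and the key value is a placeholder.

>>> # Reuse the last x-value stored in a previous run's history file
>>> optProb.setDVsFromHistory('opt_hist.hst')
>>> # Or select a specific stored iteration by its key
>>> optProb.setDVsFromHistory('opt_hist.hst', key='10')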