Using Symbolic Mathematics with Optimization Toolbox™ Solvers

This example shows how to use the Symbolic Math Toolbox™ functions jacobian and matlabFunction to provide analytic derivatives to optimization solvers. Optimization Toolbox™ solvers are usually more accurate and efficient when you supply gradients and Hessians of the objective and constraint functions.

Additional Requirements:

Symbolic Math Toolbox

There are several considerations in using symbolic calculations with optimization functions; the sketch after this list illustrates some of them:

1. Optimization objective and constraint functions should be defined in terms of a vector, say x. However, symbolic variables are scalars (real- or complex-valued), not vectors. This requires you to translate between vectors and scalars.

2. Optimization gradients, and sometimes Hessians, are supposed to be calculated within the body of the objective or constraint functions. This means that a symbolic gradient or Hessian has to be placed in the appropriate place in the objective or constraint function file or function handle.

3. Calculating gradients and Hessians symbolically can be time-consuming. Therefore you should perform this calculation only once, and generate code, via matlabFunction, to call during execution of the solver.

4. Evaluating symbolic expressions with the subs function is time-consuming. It is much more efficient to use matlabFunction.

5. matlabFunction generates code that depends on the orientation of input vectors. Since fmincon calls the objective function with column vectors, you must be careful to call matlabFunction with column vectors of symbolic variables.
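For instance, here is a minimal sketch of points 1, 4, and 5; the expression f is a throwaway placeholder, not the example's objective:

syms x1 x2 real
f = x1^2 + x2^2;                              % placeholder expression
% Point 4: subs evaluates by symbolic substitution, which is slow:
slow = subs(f, [x1 x2], [3 4])
% Points 1 and 5: generate numeric code that takes one column vector:
fh = matlabFunction(f, 'Vars', {[x1; x2]});
fast = fh([3; 4])                             % called the way fmincon calls it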

We write the independent variables as x1 and x2 because in this form they can be used as symbolic variables. As components of a vector x they would be written x(1) and x(2). The function has a twisty valley, which you can see in the surface plot produced by the sketch below.
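The code for this step is not reproduced in this section, so the following sketch reconstructs it. The objective is an assumption: a Rosenbrock-like function chosen because its unconstrained minimum, [4/3, 28/27], matches the global minimum [1.333, 1.037] quoted in the second example. The starting point and solver options are illustrative, not taken from this text.

syms x1 x2 real
x = [x1; x2];
% Assumed objective; its minimum is at x1 = 4/3, x2 = (4/3)^3 - 4/3.
f = log(1 + 3*(x2 - (x1^3 - x1))^2 + (x1 - 4/3)^2);
fsurf(f, [-2 2 -1 3])                  % view the twisty valley

gradf = jacobian(f, x).';              % 2-by-1 symbolic gradient
hessf = jacobian(gradf, x);            % 2-by-2 symbolic Hessian
fh = matlabFunction(f, gradf, hessf, 'Vars', {x});   % outputs [f, gradf, hessf]

% Run once with derivatives and once without; x0 = [-1; 2] is a placeholder.
opts = optimoptions('fminunc', 'Algorithm', 'trust-region', ...
    'SpecifyObjectiveGradient', true, 'HessianFcn', 'objective');
[xfinal, fval, exitflag, output] = fminunc(fh, [-1; 2], opts);

fh2 = matlabFunction(f, 'Vars', {x});  % objective only, no derivatives
opts2 = optimoptions('fminunc', 'Algorithm', 'quasi-newton');
[xfinal2, fval2, exitflag2, output2] = fminunc(fh2, [-1; 2], opts2);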

The number of iterations is lower when using gradients and Hessians, and there are dramatically fewer function evaluations:

sprintf(['There were %d iterations using gradient' ...
    ' and Hessian, but %d without them.'], ...
    output.iterations, output2.iterations)
sprintf(['There were %d function evaluations using gradient' ...
    ' and Hessian, but %d without them.'], ...
    output.funcCount, output2.funcCount)

ans =
There were 14 iterations using gradient and Hessian, but 18 without them.
ans =
There were 15 function evaluations using gradient and Hessian, but 81 without them.

Second Example: Constrained Minimization Using the fmincon Interior-Point Algorithm

We consider the same objective function and starting point, but now add two nonlinear constraints.

The constraints keep the optimization away from the global minimum point [1.333, 1.037]. Visualize the two constraints:
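The plot itself is not reproduced here. The following sketch shows one way to draw the constraint boundaries; the constraint formulas are reconstructions consistent with the reported solution (at which the first constraint is active), not read from this text:

% Assumed constraints, in the form c(x) <= 0:
%   c1 = x1^4 - 5*sinh(x2/5),   c2 = x2^2 - 5*tanh(x1/5) - 1
c1 = @(x1, x2) x1.^4 - 5*sinh(x2/5);
c2 = @(x1, x2) x2.^2 - 5*tanh(x1/5) - 1;
figure; hold on
fimplicit(c1, [-2 2 -2 2])             % boundary c1 = 0
fimplicit(c2, [-2 2 -2 2])             % boundary c2 = 0
plot(4/3, 28/27, 'r*')                 % unconstrained global minimum, infeasible here
legend('c_1 = 0', 'c_2 = 0', 'global minimum')
hold off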

The nonlinear constraints must be written in the form c(x) <= 0. We compute all the symbolic constraints and their derivatives, and place them in a function handle using matlabFunction.

The gradients of the constraints should be column vectors; they must be placed in the constraint function as a matrix, with each column of the matrix representing the gradient of one constraint function. This is the transpose of the form returned by jacobian, so we take the transpose below.

We place the nonlinear constraints into a function handle. fmincon expects the nonlinear constraints and gradients to be output in the order [c ceq gradc gradceq]. Since there are no nonlinear equality constraints, we output [ ] for ceq and gradceq.
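A sketch of these two steps, again assuming the constraint formulas above:

syms x1 x2 real
x = [x1; x2];
c = [x1^4 - 5*sinh(x2/5); x2^2 - 5*tanh(x1/5) - 1];   % assumed c(x) <= 0
gradc = jacobian(c, x).';     % transpose: column j is the gradient of c(j)
cfun  = matlabFunction(c,     'Vars', {x});
gcfun = matlabFunction(gradc, 'Vars', {x});
% Return outputs in the order [c, ceq, gradc, gradceq]; ceq and gradceq
% are empty because there are no nonlinear equality constraints.
nonlcon = @(xx) deal(cfun(xx), [], gcfun(xx), []);

Because deal simply maps its inputs to the requested outputs, this one handle serves fmincon whether it asks for two outputs or four.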

The interior-point algorithm requires its Hessian to be supplied as a separate function, rather than as part of the objective function, because the Hessian of a nonlinearly constrained problem must include terms from the constraints: it is the Hessian of the Lagrangian. See the User's Guide for more information.

The Hessian function takes two input arguments: the position vector x and the Lagrange multiplier structure lambda. The parts of the lambda structure that you use for nonlinear constraints are lambda.ineqnonlin and lambda.eqnonlin. For the current problem there are no nonlinear equality constraints, so we use the two multipliers lambda.ineqnonlin(1) and lambda.ineqnonlin(2).

We calculated the Hessian of the objective function in the first example. Now we calculate the Hessians of the two constraint functions, and make function handle versions with matlabFunction.
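A sketch of this step, reusing x, c, hessf, fh, and nonlcon from the earlier sketches; the starting point is again a placeholder:

hessc1 = matlabFunction(hessian(c(1), x), 'Vars', {x});
hessc2 = matlabFunction(hessian(c(2), x), 'Vars', {x});
hessfh = matlabFunction(hessf, 'Vars', {x});    % objective Hessian
% Hessian of the Lagrangian: objective Hessian plus multiplier-weighted
% constraint Hessians, in the (x, lambda) form fmincon requires.
hessianfcn = @(xx, lambda) hessfh(xx) ...
    + lambda.ineqnonlin(1)*hessc1(xx) ...
    + lambda.ineqnonlin(2)*hessc2(xx);

opts = optimoptions('fmincon', 'Algorithm', 'interior-point', ...
    'SpecifyObjectiveGradient', true, 'SpecifyConstraintGradient', true, ...
    'HessianFcn', hessianfcn);
[xfinal, fval, exitflag, output] = fmincon(fh, [-1; 2], ...
    [], [], [], [], [], [], nonlcon, opts);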

Local minimum found that satisfies the constraints.
Optimization completed because the objective function is non-decreasing in
feasible directions, to within the default value of the function tolerance,
and constraints are satisfied to within the default value of the constraint tolerance.
xfinal =
0.4396
0.0373
fval =
0.8152
exitflag =
1
output2 =
         iterations: 17
          funcCount: 54
    constrviolation: 0
           stepsize: 4.2417e-06
          algorithm: 'interior-point'
      firstorderopt: 3.8433e-07
       cgiterations: 0
            message: 'Local minimum found that satisfies the constraints....'
ans =
There were 10 iterations using gradient and Hessian, but 17 without them.
ans =
There were 13 function evaluations using gradient and Hessian, but 54 without them.

Cleaning Up Symbolic Variables

The symbolic variables used in this example were assumed to be real. To clear this assumption from the symbolic engine workspace, it is not sufficient to delete the variables. You must either clear the variables using the syntax

syms x1 x2 clear

or reset the symbolic engine using the command

% reset(symengine) % uncomment this line to reset the engine

After resetting the symbolic engine you should clear all symbolic variables from the MATLAB® workspace:
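For the variables used in this example, that is

clear x1 x2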