Use in the DefineOptimization function
If you can explicitly (analytically) compute the first or second derivatives of the objective and of the constraint left-hand-side functions, you can optionally supply expressions for these to the optimizer. When they are available, the optimizer evaluates these expressions rather than estimating the derivatives through finite differencing. This can speed up convergence in two ways: it reduces the number of function evaluations, and it provides a "cleaner" computation, since derivatives, and especially second derivatives, can be quite inaccurate or noisy when computed through finite differencing.
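As a rough illustration of why finite differencing is noisy (this is a Python sketch of the underlying numerics, not Analytica syntax; the function `f` and step size `h` are illustrative assumptions), compare central-difference estimates of the first and second derivatives of a smooth function against their exact values. The second-derivative estimate divides tiny cancellation errors by `h**2`, which amplifies roundoff:

```python
import math

# Illustrative smooth objective f(x) = exp(x); its exact first and
# second derivatives are both exp(x).
def f(x):
    return math.exp(x)

x0 = 1.0
exact_grad = math.exp(x0)
exact_hess = math.exp(x0)

h = 1e-6  # illustrative finite-difference step

# Central finite differences: the first derivative is quite accurate,
# but the second derivative divides roundoff-sized cancellation errors
# by h**2, so its error is typically several orders of magnitude larger.
fd_grad = (f(x0 + h) - f(x0 - h)) / (2 * h)
fd_hess = (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2

print("first-derivative error:", abs(fd_grad - exact_grad))
print("second-derivative error:", abs(fd_hess - exact_hess))
```

An analytic expression avoids this noise entirely, which is why supplying one helps most for the second derivatives.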
The first derivative of the objective is called the gradient and is specified in the optional parameter «Gradient». It should compute a value dimensioned by the Vars index. The first derivative of the constraint left-hand sides is the Jacobian, specified in the optional parameter «Jacobian». It should be dimensioned by the Constraints and Vars indexes. If one of these is provided, both must be (unless there is no objective, or there are zero constraints).
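To make the dimensionality concrete, here is a Python sketch (not Analytica syntax; the objective `f`, the constraints `g1` and `g2`, and the function names are illustrative assumptions): the gradient is a vector running over the decision variables, and the Jacobian is a matrix with one row per constraint and one column per variable:

```python
# Illustrative problem: objective f(x, y) = x**2 + 3*x*y,
# constraints g1(x, y) = x + y and g2(x, y) = x * y.

def gradient(x, y):
    # [df/dx, df/dy] -- one entry per decision variable ("Vars")
    return [2*x + 3*y, 3*x]

def jacobian(x, y):
    # Rows run over constraints ("Constraints"), columns over
    # variables ("Vars"): entry [i][j] is dg_i / dx_j.
    return [
        [1.0, 1.0],   # dg1/dx, dg1/dy
        [y,   x],     # dg2/dx, dg2/dy
    ]
```

In Analytica the same information would be expressed as array-valued expressions over the Vars and Constraints indexes rather than as Python lists.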
Analytica also allows expressions for the Hessian (second derivative of the objective) and LhsHessian (second derivative of the constraint left-hand sides) to be specified, using the optional parameters «Hessian» and «LhsHessian». «Hessian» should compute a value with the same dimensionality as the gradient (Vars), and «LhsHessian» with the same dimensionality as the Jacobian (Constraints x Vars); that is, only the diagonal second partial derivatives are supplied. The Hessian gives the optimizer information about the curvature of the objective.
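Continuing the same kind of sketch (Python for illustration, not Analytica syntax; the objective and function names are illustrative assumptions), the diagonal second partials line up entry-for-entry with the gradient:

```python
# Illustrative objective f(x, y) = x**4 + y**2.

def gradient(x, y):
    # [df/dx, df/dy] -- dimensioned by Vars
    return [4*x**3, 2*y]

def hessian_diag(x, y):
    # Diagonal second partials [d2f/dx2, d2f/dy2] -- the same
    # dimensionality as the gradient (Vars), per the text above.
    return [12*x**2, 2.0]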
The built-in GRG Nonlinear solver is the engine most likely to make good use of derivatives. To ensure they are used, set the «ObjNl» and «LhsNl» parameters to 'N' (smooth nonlinear).
The GRG Nonlinear solver will evaluate «Gradient» and «Jacobian» expressions, but not Hessians. The Knitro engine (an optional add-on) can make use of Hessian information. The Quadratic and SOCP Barrier engines also use Hessian information, but they obtain it from the supplied matrices of coefficients rather than from the parameters of DefineOptimization.