Broyden's first method, also known as the good method. The initial Jacobian is built
from the MD history if available; otherwise, the optimizer switches to SD for one SCF
iteration until a Jacobian can be built from the SCF history.
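As a generic illustration (not CP2K source code), Broyden's first method replaces a full Jacobian rebuild with a rank-one update that satisfies the secant condition J_new · Δx = Δf; a minimal NumPy sketch:

```python
import numpy as np

def broyden_good_update(J, dx, df):
    """Broyden's first ("good") rank-one update of a Jacobian approximation.

    J  : (n, n) current Jacobian approximation
    dx : x_new - x_old (change in the outer SCF variables)
    df : f(x_new) - f(x_old) (change in the CDFT gradient)

    The returned Jacobian satisfies the secant condition J_new @ dx = df.
    """
    dx = np.asarray(dx, dtype=float)
    df = np.asarray(df, dtype=float)
    return J + np.outer(df - J @ dx, dx) / np.dot(dx, dx)
```

A quasi-Newton iteration then solves J s = -f for the step s instead of recomputing J explicitly.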

BT1_EXPLICIT

Same as BT1, but computes the explicit Jacobian with finite differences. Requires
a CDFT SCF procedure to be active.
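A minimal sketch of such a finite-difference Jacobian (generic NumPy illustration, not the CP2K routine; in the CDFT setting each evaluation of f means a converged inner SCF, which is what makes this option costly):

```python
import numpy as np

def fd_jacobian(f, x, h=1e-3):
    """Explicit Jacobian of f at x by forward finite differences.

    Each column costs one extra evaluation of f, so for an n-dimensional
    constraint vector the build requires n additional evaluations.
    """
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x), dtype=float)
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h                      # perturb one variable at a time
        J[:, j] = (np.asarray(f(xp)) - f0) / h
    return J
```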

BT1_EXPLICIT_LS

Same as BT1_EXPLICIT, but uses backtracking line search for optimizing the step size.

BT1_LS

Same as BT1, but uses backtracking line search for optimizing the step size (see optimizer
NEWTON_LS).

BT2

Same as BT1, but uses Broyden's second method, also known as the bad method.
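Broyden's second method applies the rank-one update to the inverse Jacobian directly; a generic sketch in the same spirit as above, not taken from the CP2K sources:

```python
import numpy as np

def broyden_bad_update(H, dx, df):
    """Broyden's second ("bad") rank-one update of the inverse Jacobian.

    H  : (n, n) current approximation of J^{-1}
    dx : x_new - x_old
    df : f(x_new) - f(x_old)

    The returned inverse satisfies the secant condition H_new @ df = dx,
    so the next quasi-Newton step is simply s = -H_new @ f (no linear solve).
    """
    dx = np.asarray(dx, dtype=float)
    df = np.asarray(df, dtype=float)
    return H + np.outer(dx - H @ df, df) / np.dot(df, df)
```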

Continue the backtracking line search until MAX_LS steps are reached or the norm of the
CDFT gradient no longer decreases. The default (false) behavior exits the line search
procedure on the first step at which the gradient norm decreases.
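The two behaviors can be sketched generically (a Python stand-in, not the actual CP2K routine; here gnorm(x) plays the role of the full SCF evaluation returning the CDFT gradient norm, and the fallback to the full step when no trial decreases is an assumption of this sketch):

```python
def line_search(gnorm, x, step, alpha0=1.0, factor=0.5,
                max_ls=5, continue_ls=False):
    """Backtracking line search on the gradient norm.

    continue_ls=False : return the first step size that decreases the norm.
    continue_ls=True  : keep shrinking while the norm decreases and return
                        the best step size found.
    """
    g_best = gnorm(x)
    alpha_best = None
    alpha = alpha0
    for _ in range(max_ls):
        g = gnorm(x + alpha * step)
        if g < g_best:
            alpha_best, g_best = alpha, g
            if not continue_ls:
                return alpha_best          # default: accept first decrease
        elif continue_ls and alpha_best is not None:
            break                          # norm stopped decreasing
        alpha *= factor                    # shrink the trial step
    return alpha_best if alpha_best is not None else alpha0
```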

The target gradient of the outer SCF variables. Note that the EPS_SCF of the inner
loop also determines the value that can be reached in the outer loop; typically, EPS_SCF
of the outer loop must be smaller than EPS_SCF of the inner loop.

Defines parameters that control how often the explicit Jacobian is built, which is
needed by some optimizers. Expects two values: the first determines how many
consecutive CDFT SCF iterations should skip a rebuild, and the second how many
MD steps. Each value can be zero (meaning never rebuild) or positive, but the two
values cannot both be zero.
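For illustration, an input fragment might look as follows (section nesting and keyword placement should be checked against the CP2K manual for your version):

```
&OUTER_SCF
  OPTIMIZER BROYDEN
  &CDFT_OPT
    BROYDEN_TYPE BT1_EXPLICIT
    ! Rebuild the explicit Jacobian every 5 CDFT SCF iterations,
    ! never based on the MD step count
    JACOBIAN_FREQ 5 0
  &END CDFT_OPT
&END OUTER_SCF
```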

Newton's method with backtracking line search to find the optimal step size. Only
compatible with CDFT constraints. Starts from the regular Newton solution and successively
reduces the step size until the L2 norm of the CDFT gradient decreases or MAX_LS steps
are reached. Potentially very expensive, because each line-search step performs a full SCF
calculation.
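The procedure can be summarized in a short sketch (generic NumPy illustration, not the CP2K implementation; f(x) stands in for the expensive SCF evaluation of the CDFT gradient):

```python
import numpy as np

def newton_ls_step(f, x, J, max_ls=5, factor=0.5):
    """One Newton step with backtracking on the gradient norm.

    Starts from the full Newton step and halves it until the L2 norm of
    f decreases or max_ls trials have been spent.
    """
    g0 = np.linalg.norm(f(x))
    s = np.linalg.solve(J, -f(x))       # regular Newton direction
    alpha = 1.0
    for _ in range(max_ls):
        if np.linalg.norm(f(x + alpha * s)) < g0:
            break                       # accept the first decrease
        alpha *= factor                 # otherwise shrink the step
    return x + alpha * s
```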

The initial step_size used in the optimizer (currently steepest descent). Note that
in cases where a saddle point is sought (constrained DFT), this value can be negative.
For the Newton and Broyden optimizers, use a value smaller/larger than the default 1.0 (in
absolute value; the sign is not significant) to activate an under-/overrelaxed optimizer.