Alternatively, a Krylov object may be converted to a method-specific object that inherits from LinearSolverBase. This means it can be used as the BlockType of a SparseBlockMatrix, and can offer more configuration options, such as setting the 'restart' option of the krylov::GMRES method.
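For instance, such a conversion might look like the following sketch. This is illustrative pseudocode only: the accessor and setter names (GMRESType, asGMRES(), setRestart()) are assumptions based on the surrounding text, not verified bogus API calls.

```cpp
// Hypothetical sketch -- names are assumptions, not verified bogus API.
// Convert the generic Krylov object to a GMRES-specific solver object,
// which exposes method-specific options such as the restart parameter.
bogus::Krylov< Mat >::GMRESType gmresMat = krylovMat.asGMRES() ;
gmresMat.setRestart( 10 ) ;
```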

```cpp
// This performs no convergence check, and is probably a very bad idea
res = gmresMat * rhs ;
```

Constrained Iterative Solvers

The main feature of the BlockSolvers module is to provide solvers for a specific class of nonsmooth problems, which includes

Quadratic optimization under convex constraints (QCCP)

Linear systems with normal cone inclusions, such as Coulomb friction problems (which correspond to a slight modification of the optimality conditions of QCCP)

More specifically, we are concerned with problems that can be expressed as

    y = M x + b ,    y + s(y) ∈ −N_C(x)

where N_C denotes the normal cone to the set C. The dimensions of the sets should correspond to those of the blocks of rows of M. If s is zero, we retrieve the optimality conditions of the quadratic minimization problem

    min_{x ∈ C}  1/2 xᵀ M x + xᵀ b

The constraint set C and the translation term s(y) are defined by passing an NSLaw to the solve() function of the solvers. The NSLaw should conform to a specific interface, and provide at least

NSLaw::eval(i, x_i, y_i) to evaluate the residual of the inclusion at the i^th block

NSLaw::dualityCOV(i, y_i, &s) to compute s_i(y_i)

Supplemental methods may need to be provided by the NSLaw depending on the chosen solver. See LCPLaw, SOCLaw or PyramidLaw for examples of the interface an NSLaw should provide.

Projected Gauss Seidel

Constrained linear systems can be solved using the GaussSeidel or ProductGaussSeidel classes, as long as an NSLaw defining a corresponding NSLaw::solveLocal() function is available. SOCLaw, from the Second Order module, is an example of such an NSLaw.

Example code for 3D Coulomb friction (requires the Second Order module):

The ProductGaussSeidel class is useful when explicitly computing the matrix W would be expensive, but the product MInv * H' is quite sparse (that is, when updating a force component does not impact too many degrees of freedom). When MInv is the identity matrix, the above example can then be modified as

Projected Gradient

The ProjectedGradient class may be used to compute the solution of the quadratic minimization problem

    min_{x ∈ C}  1/2 xᵀ M x + xᵀ b

using a variant of the Projected Gradient or Projected Gradient Descent algorithms. Its interface is very similar to that of the above GaussSeidel. A few variants of the algorithm, such as the Nesterov [7] acceleration, are implemented; they can be selected using the ProjectedGradient::setDefaultVariant() method or using a template parameter. See projected_gradient::Variant for more information.

The NSLaw passed to the ProjectedGradient::solve() method should define a projectOnConstraint() function that computes the orthogonal projection onto the constraint set C.

Note

This algorithm should in theory only be used to solve constrained quadratic optimization problems. Coulomb friction does not belong to this class, but Linear and Cone Complementarity problems do. In practice, bogus does not disallow using a ProjectedGradient with an NSLaw for which the term s(y) is non-zero, but convergence may be degraded.