This iterative scheme can be generalized to several dimensions by replacing the derivative with the gradient, $\nabla f(x)$, and the reciprocal of the second derivative with the inverse of the Hessian matrix, $[\nabla^2 f(x)]^{-1}$. One obtains the iterative scheme

$$x_{k+1} = x_k - [\nabla^2 f(x_k)]^{-1}\, \nabla f(x_k), \qquad k \ge 0.$$

Usually Newton's method is modified to include a small step size $\gamma > 0$ instead of $\gamma = 1$:

$$x_{k+1} = x_k - \gamma\, [\nabla^2 f(x_k)]^{-1}\, \nabla f(x_k).$$
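A minimal sketch of the damped multivariate scheme, assuming NumPy and a hypothetical helper `newton_minimize`; the example objective $f(x, y) = (x-1)^2 + 10(y+2)^2$ is chosen for illustration only:

```python
import numpy as np

def newton_minimize(grad, hess, x0, gamma=0.5, tol=1e-10, max_iter=100):
    """Damped Newton iteration: x_{k+1} = x_k - gamma * H(x_k)^{-1} grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H dx = g rather than explicitly inverting the Hessian.
        x = x - gamma * np.linalg.solve(hess(x), g)
    return x

# Example: f(x, y) = (x - 1)^2 + 10*(y + 2)^2, minimum at (1, -2).
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])

x_min = newton_minimize(grad, hess, x0=[0.0, 0.0])
```

Solving the linear system instead of forming $[\nabla^2 f]^{-1}$ explicitly is the usual practice, since it is cheaper and numerically more stable.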

We assume that the (scalar) function $f$ whose minimum we seek is locally quadratic, and take the two-term Taylor expansion for the gradient of the function (we are going to use Newton–Raphson to find a zero of the gradient).

This gives:

$$\nabla f(x + \Delta x) \approx \nabla f(x) + \nabla^2 f(x)\, \Delta x.$$

Thus if $x + \Delta x$ is the minimum we seek then $\nabla f(x + \Delta x) = 0$, and so:

$$\nabla f(x) + \nabla^2 f(x)\, \Delta x = 0,$$

or:

$$\Delta x = -[\nabla^2 f(x)]^{-1}\, \nabla f(x).$$

Now we use this as an iteration formula (because the function is not really a quadratic surface):

$$x_{k+1} = x_k - [\nabla^2 f(x_k)]^{-1}\, \nabla f(x_k).$$
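The derivation above suggests a quick numerical check, sketched here with NumPy and example functions of my own choosing: for a function that really is quadratic, a single undamped Newton step lands exactly on the minimizer, while for a non-quadratic function each step is only an approximation and must be iterated.

```python
import numpy as np

def newton_step(grad, hess, x):
    """One Newton step: x - H(x)^{-1} grad(x)."""
    return x - np.linalg.solve(hess(x), grad(x))

# Exactly quadratic f(x) = 0.5 x^T A x - b^T x: one step suffices.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, -1.0])
grad_q = lambda x: A @ x - b
hess_q = lambda x: A
x1 = newton_step(grad_q, hess_q, np.array([5.0, 5.0]))
# grad_q(x1) is (numerically) zero after a single step.

# Non-quadratic f(x) = x^4: each step x -> (2/3) x only shrinks the
# error, so the formula must be applied repeatedly.
grad_c = lambda x: np.array([4.0 * x[0] ** 3])
hess_c = lambda x: np.array([[12.0 * x[0] ** 2]])
x = np.array([1.0])
for _ in range(5):
    x = newton_step(grad_c, hess_c, x)
```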