>> Only use derivative-free optimization methods if your problem is not continuous.
>> If your problem is differentiable, you should compute the Jacobian
>> yourself, e.g. with
>> def myJacobian(x):
>>     h = 10**-3
>>     # do finite differences approximation
>>     return ...
>> and provide the Jacobian to
>> scipy.optimize.leastsq(..., Dfun=myJacobian)
>> This should work much better, more reliably, and faster than any of the alternatives.
>> Maybe increasing the step length in the options to leastsq also works:
>>     epsfcn – A suitable step length for the forward-difference
>>     approximation of the Jacobian (for Dfun=None).
>> I don't think I have tried it for leastsq, but some of the fmin routines
>> work much better with a larger step length for the finite-difference
>> approximation.
>> Josef
Okay, I got leastsq working once I computed the Jacobian manually.
The function I want to fit has non-trivial dependencies on
its input parameters, and the Jacobian has some regions where it
does not change at all. Manually specifying the step length
for the finite-difference scheme in the Jacobian helps.
Cheers, Ralph
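For anyone finding this thread later, here is a minimal sketch of the approach discussed above: a forward-difference Jacobian with an explicit step length, passed to scipy.optimize.leastsq via Dfun. The exponential model, the data, and the helper names (residuals, myJacobian) are made up for illustration; only leastsq, Dfun, and epsfcn come from the thread.

```python
import numpy as np
from scipy.optimize import leastsq

def residuals(p, x, y):
    # toy model (an assumption for this sketch): y = a * exp(b * x)
    a, b = p
    return y - a * np.exp(b * x)

def myJacobian(p, x, y, h=1e-3):
    # forward-difference approximation of the Jacobian of the residuals,
    # with an explicit, manually chosen step length h
    p = np.asarray(p, dtype=float)
    r0 = residuals(p, x, y)
    J = np.empty((r0.size, p.size))
    for j in range(p.size):
        dp = p.copy()
        dp[j] += h
        J[:, j] = (residuals(dp, x, y) - r0) / h
    return J  # shape (n_residuals, n_params), as leastsq expects for col_deriv=0

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)          # synthetic, noise-free data
p0 = [1.0, -1.0]

popt, ier = leastsq(residuals, p0, args=(x, y), Dfun=myJacobian)
```

Alternatively, per Josef's second suggestion, one can drop Dfun and instead enlarge the step of leastsq's internal finite differences, e.g. `leastsq(residuals, p0, args=(x, y), epsfcn=1e-6)`; epsfcn is (roughly) the relative step used in the forward-difference approximation when Dfun is None.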