On Thu, Feb 23, 2012 at 12:09 AM, Christopher Jordan-Squire
<cjordan1@uw.edu> wrote:
> On Wed, Feb 22, 2012 at 2:02 PM, Nathaniel Smith <njs@pobox.com> wrote:
>> On Wed, Feb 22, 2012 at 8:48 PM, <josef.pktd@gmail.com> wrote:
>>> On Wed, Feb 22, 2012 at 3:26 PM, Greg Friedland
>>> <greg.friedland@gmail.com> wrote:
>>>> Hi,
>>>> Is it possible to calculate asymptotic confidence intervals for any of
>>>> the bounded minimization algorithms? As far as I can tell they don't
>>>> return the Hessian; that's including the new 'minimize' function which
>>>> seemed like it might.
>>> If the parameter ends up at the bounds, then the standard statistics
>>> don't apply. The Hessian is based on a local quadratic
>>> approximation, which doesn't work if part of the local neighborhood is
>>> out of bounds.
>>> There are special statistics for this case, but so far I have only seen
>>> a description of how GAUSS handles it.
>>> In statsmodels we sometimes use the bounds, or a transformation,
>>> just to keep the optimizer in the required range, and we assume we get
>>> an interior solution. In that case it is possible to use the standard
>>> calculations; the easiest approach is to take the local minimum that the
>>> constrained or transformed optimizer found and use it as the starting value
>>> for an unconstrained optimization, where we can get the Hessian (or
>>> just calculate the Hessian directly from the original objective function).
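A minimal sketch of this two-step idea, using a toy exponential-likelihood example (this is a generic illustration, not statsmodels code; the model and step size are assumptions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy data and a negative log-likelihood for an exponential model;
# the rate parameter lam must stay positive, hence the bound.
data = np.array([0.5, 1.2, 0.3, 2.1, 0.8, 1.5])

def nll(params):
    lam = params[0]
    return -np.sum(np.log(lam) - lam * data)

# Bounded optimization just keeps the optimizer in the valid range.
res = minimize(nll, x0=[1.0], bounds=[(1e-6, None)], method="L-BFGS-B")

def numerical_hessian(f, x, h=1e-5):
    """Central finite-difference Hessian of f at x."""
    n = len(x)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            x_pp = x.copy(); x_pp[i] += h; x_pp[j] += h
            x_pm = x.copy(); x_pm[i] += h; x_pm[j] -= h
            x_mp = x.copy(); x_mp[i] -= h; x_mp[j] += h
            x_mm = x.copy(); x_mm[i] -= h; x_mm[j] -= h
            H[i, j] = (f(x_pp) - f(x_pm) - f(x_mp) + f(x_mm)) / (4 * h**2)
    return H

# Assuming the solution is interior, invert the Hessian of the negative
# log-likelihood at the optimum to get the asymptotic covariance, and
# from that the usual Wald-type confidence intervals.
H = numerical_hessian(nll, res.x)
cov = np.linalg.inv(H)
se = np.sqrt(np.diag(cov))
ci = (res.x - norm.ppf(0.975) * se, res.x + norm.ppf(0.975) * se)
```

If the estimate had landed on a bound, the inverted-Hessian step would no longer be justified, which is exactly the caveat above.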
>> Some optimizers compute the Hessian internally. In those cases, it
>> would be nice to have a way to ask them to return that value
>> instead of throwing it away. I haven't used Matlab in a while, but I
>> remember running into this as a standard feature at some point, and it
>> was quite nice, especially when working with a problem where each
>> computation of the Hessian requires an hour or so of computing time.
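For the quasi-Newton methods, scipy's `minimize` already exposes something like this: BFGS accumulates an inverse-Hessian approximation and returns it on the result object. It is a byproduct of the iterations rather than a careful estimate at the solution, so it should be treated as rough. A quick sketch on the Rosenbrock function:

```python
import numpy as np
from scipy.optimize import minimize

# The Rosenbrock function as a stand-in objective.
def rosen(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

res = minimize(rosen, x0=[0.0, 0.0], method="BFGS")

# res.hess_inv is the inverse-Hessian approximation that BFGS built up
# from gradient differences along the way, not an exact Hessian.
print(res.x)         # close to the minimum at (1, 1)
print(res.hess_inv)  # 2x2 approximate inverse Hessian
```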
> Are you talking about analytic or finite-difference gradients and
> Hessians? I'd assumed that anything derived from finite-difference
> estimates wouldn't give particularly good confidence intervals, but
> I've never needed them, so I've never looked into it in detail.
statsmodels has both; all discrete models, for example, have analytical
gradients and Hessians.
But for models with a complicated log-likelihood function, there isn't
much choice: second derivatives with centered finite differences are
OK; scipy.optimize.leastsq is not very good. statsmodels also has
complex-step derivatives, which are numerically very accurate, but they
cannot always be used.
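The complex-step trick can be illustrated generically (this is the standard recipe, not statsmodels' implementation; it only works when the function is analytic and its code propagates complex input correctly, e.g. no abs() or comparisons):

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """First derivative via the complex step: f'(x) ~ Im f(x + i*h) / h.

    There is no subtraction of nearly equal numbers, so h can be tiny
    and the result is accurate to near machine precision.
    """
    return np.imag(f(x + 1j * h)) / h

# Example: d/dx exp(sin(x)) at x = 0.7; the exact value is
# cos(x) * exp(sin(x)).
f = lambda x: np.exp(np.sin(x))
d = complex_step_derivative(f, 0.7)
```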
I think in most cases numerical derivatives will be precise to a
few decimals, which is more precise than all the other statistical
assumptions: normality, the law of large numbers, the local definition of
the covariance matrix used to calculate "large-sample" confidence intervals,
and so on.
One problem is that the right step size depends on the data and
model. numdifftools has adaptive step-size selection for its derivatives, but
we are not using it anymore.
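The step-size tradeoff is easy to see with a quick experiment: truncation error shrinks as h shrinks, while rounding error grows, so the total error is not monotone in h (a generic illustration with exp, whose second derivative is known exactly):

```python
import numpy as np

# Central-difference second derivative of exp at x = 1; the exact
# answer is e, so we can watch the error as the step size shrinks.
x = 1.0
errs = {}
for h in (1e-2, 1e-4, 1e-6, 1e-8):
    d2 = (np.exp(x + h) - 2 * np.exp(x) + np.exp(x - h)) / h**2
    errs[h] = abs(d2 - np.e)
    print(f"h={h:.0e}  error={errs[h]:.2e}")
```

The error first improves and then blows up as h gets too small; the optimal h depends on the scale of the function and its argument, which is why a fixed default step can fail for some data/model combinations.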
Also, if the model is not well specified, then the lower precision of
finite-difference derivatives can hurt. For example, in ARMA models I
had problems when too many lags were specified, so that some
roots should almost cancel. Skipper's implementation works better
because he used a reparameterization that forces nicer behavior.
The only case in the econometrics literature that I know of is that early
GARCH models were criticized for using numerical derivatives even
though analytical derivatives were available; some parameters were not
well estimated, although the different estimates produced essentially the
same predictions (the parameters are barely identified).
Last defense: everyone else does it. Give or take a few models,
if the same statistical method is used, then the results usually
agree pretty well.
(But if different methods are used, for example when initial conditions are
treated differently in time series analysis, then the differences are
usually much larger. Something like: I don't worry about numerical
problems in the 5th or 6th decimal if I cannot figure out what these
guys are doing with their first and second decimals.)
(maybe more than anyone wants to know.)
Josef
> -Chris
>> -- Nathaniel
>> _______________________________________________
>> SciPy-User mailing list
>> SciPy-User@scipy.org
>> http://mail.scipy.org/mailman/listinfo/scipy-user