# Bug in normal distribution

## Description

There is a bug in the second derivative of the normal log density at line 164 of `+logdist\normal.m`:

    case 'info'
        Y = 1/(Std^2)*ones(size(X));

should be changed to

    case 'info'
        Y = -1/(Std^2)*ones(size(X));
The name is also misleading, since information is the inverse of the second derivative of the log density.
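
For reference, the second derivative itself is easy to check numerically: a central finite difference on the normal log density gives -1/Std^2, a negative number, since the log density is concave. A quick sketch in Python with SciPy (the toolbox itself is MATLAB; the mean and std values here are arbitrary):

```python
import numpy as np
from scipy.stats import norm

mean, std = 1.5, 2.0
logpdf = lambda x: norm.logpdf(x, loc=mean, scale=std)

# Central finite difference for the second derivative of the log density.
x, h = 0.7, 1e-4
d2 = (logpdf(x + h) - 2 * logpdf(x) + logpdf(x - h)) / h**2

print(d2)            # approximately -1/std**2 = -0.25, i.e. negative
print(-1 / std**2)
```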
Closed Feb 15 at 7:36 PM by jaromirbenes

wrote Feb 15 at 7:07 PM

This bug doesn't affect estimation, only the output matrix `Hess{2}` (the contributions of the priors to the Hessian).

wrote Feb 15 at 7:35 PM

Fisher information is defined as minus the second derivative of the log likelihood. For a univariate normal, that is exactly 1/(Std^2), as in the code.

The inverse of that (or of the Fisher information matrix in the multivariate case) is then the asymptotic covariance matrix of the estimator.
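
As a quick numerical illustration of that point (a Python sketch with arbitrary values, not toolbox code): for n draws from a normal with known standard deviation, the total information is n/Std^2, and the sampling variance of the maximum-likelihood estimator of the mean matches its inverse, Std^2/n:

```python
import numpy as np

rng = np.random.default_rng(0)
std, n, reps = 2.0, 50, 20000   # arbitrary values for the sketch

# Fisher information per observation for a normal with known std.
info = 1 / std**2

# Sampling variance of the MLE of the mean (the sample mean) across
# many simulated datasets -- it should match 1/(n*info) = std**2/n.
means = rng.normal(loc=0.0, scale=std, size=(reps, n)).mean(axis=1)
print(means.var(), 1 / (n * info))
```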

wrote Feb 15 at 8:13 PM

The bugs, then, are in the 'info' cases of the beta, gamma, and inverse gamma distributions (maybe others; I didn't check). They should all have the opposite sign, and `Hess{2}` should be negative for normal priors.

wrote Feb 15 at 9:31 PM

There is a bug at line 61 of `classlib\@estimateobj\mydiffprior.m`:

    iDiffPPrior = Pri.prior{ip}(x0(ip),'info');

should be changed to

    iDiffPPrior = -Pri.prior{ip}(x0(ip),'info');

at line 69 of `+logdist\beta.m`:

    Y(inx) = -(B - 1)./(X - 1).^2 - (A - 1)./X.^2;

should be changed to

    Y(inx) = (B - 1)./(X - 1).^2 + (A - 1)./X.^2;

at line 26 of `+logdist\mygamma.m`:

    Y(inx) = -(A - 1)/X.^2;

should be changed to

    Y(inx) = (A - 1)/X.^2;

at a line in `+logdist\invgamma.m`:

    Y(inx) = -(2*B - X*(A + 1))./X.^3;

should be changed to

    Y(inx) = (2*B - X*(A + 1))./X.^3;

and at line 64 of `+logdist\lognormal.m`:

    Y(inx) = (B^2 + A - log(X) + 1)./(B^2*X.^2);

should be changed to

    Y(inx) = (-B^2 + A - log(X) + 1)./(B^2*X.^2);
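
The corrected expressions can be cross-checked numerically: minus the second derivative of each log density, obtained by central finite differences, should match the patched formulas. A sketch in Python with SciPy (the parameter values A, B and the evaluation point X are arbitrary, and SciPy's parameterizations are mapped to the A/B shape–scale convention used above):

```python
import numpy as np
from scipy.stats import beta, gamma, invgamma, lognorm

def neg_d2(logpdf, x, h=1e-5):
    # Minus the second derivative of the log density (central differences),
    # i.e. the 'info' convention discussed in this thread.
    return -(logpdf(x + h) - 2 * logpdf(x) + logpdf(x - h)) / h**2

A, B, X = 2.5, 3.0, 0.4   # arbitrary shape/scale parameters and point

pairs = [
    # (numerical info, corrected formula from the patch)
    (neg_d2(lambda x: beta.logpdf(x, A, B), X),
     (B - 1)/(X - 1)**2 + (A - 1)/X**2),
    (neg_d2(lambda x: gamma.logpdf(x, A, scale=B), X),
     (A - 1)/X**2),
    (neg_d2(lambda x: invgamma.logpdf(x, A, scale=B), X),
     (2*B - X*(A + 1))/X**3),
    (neg_d2(lambda x: lognorm.logpdf(x, B, scale=np.exp(A)), X),
     (-B**2 + A - np.log(X) + 1)/(B**2*X**2)),
]
for numeric, formula in pairs:
    print(numeric, formula)   # each pair agrees
```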