How should one understand the statement that Fisher information measures the amount of information about an unknown parameter? What is its statistical meaning?
1 Answer
Suppose the likelihood is $L(X; \theta)$ and the log-likelihood is $l(X; \theta)$. Then:
(1) Fisher information is the second moment (and variance) of the score, i.e. the gradient of the log-likelihood:
$$I(\theta)= \mathbb{E}\!\left[\left(\frac{\mathrm{d}l}{\mathrm{d}\theta}\right)^{\!2} \,\middle|\, \theta\right] = \mathrm{Var}\!\left(\frac{\mathrm{d}l}{\mathrm{d}\theta} \,\middle|\, \theta\right)$$
since $\mathbb{E}\!\left(\frac{\mathrm{d}l}{\mathrm{d}\theta} \,\middle|\, \theta\right)=0$. Proof sketch: under regularity conditions that allow exchanging differentiation and integration,
$$\mathbb{E}\!\left[\frac{\mathrm{d}l}{\mathrm{d}\theta}\right] = \int \frac{\partial_\theta L}{L}\, L \,\mathrm{d}x = \frac{\mathrm{d}}{\mathrm{d}\theta}\int L \,\mathrm{d}x = \frac{\mathrm{d}}{\mathrm{d}\theta} 1 = 0.$$
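The two score identities above can be checked numerically. As a sketch, take $X \sim N(\theta, 1)$, for which the score is $\frac{\mathrm{d}l}{\mathrm{d}\theta} = x - \theta$ and $I(\theta) = 1$; the value $\theta = 2.0$ and the sample size are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check of the score identities for X ~ N(theta, 1):
# the score is d l / d theta = x - theta, and I(theta) = 1.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(theta, 1.0, size=1_000_000)
score = x - theta

print(np.mean(score))       # close to 0: the score has zero mean
print(np.mean(score ** 2))  # close to 1 = I(theta), the second moment
print(np.var(score))        # close to 1: equals the second moment since the mean is 0
```

Because the score has zero mean, its second moment and its variance coincide, which is exactly why both expressions in (1) define the same quantity.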
(2) Fisher information governs the asymptotic distribution of the MLE $\hat{\theta}_{MLE}$.
By the CLT and Slutsky's theorem, we can conclude that $$\sqrt{n}(\hat{\theta}_{MLE}-\theta) \overset{d}{\to} N(0, I(\theta)^{-1})$$
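This asymptotic result can be illustrated by simulation. As a sketch (the parameter values are arbitrary), take Bernoulli$(p)$ samples, whose MLE is the sample mean and whose per-observation Fisher information is $I(p) = \frac{1}{p(1-p)}$, so $\sqrt{n}(\hat{p}-p)$ should have variance close to $I(p)^{-1} = p(1-p)$:

```python
import numpy as np

# Asymptotic normality of the Bernoulli MLE (the sample mean).
# For Bernoulli(p), I(p) = 1 / (p(1-p)), so the limiting variance
# of sqrt(n) * (p_hat - p) is I(p)^{-1} = p(1-p).
rng = np.random.default_rng(1)
p, n, reps = 0.3, 2_000, 20_000
x = rng.binomial(1, p, size=(reps, n))
p_hat = x.mean(axis=1)               # MLE in each replication
z = np.sqrt(n) * (p_hat - p)

print(np.var(z))    # close to p * (1 - p) = 0.21
print(p * (1 - p))  # I(p)^{-1}
```

The larger the Fisher information, the smaller the limiting variance: a more "informative" model pins the parameter down more precisely.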
Application: the Cramer-Rao bound
Under regularity conditions, the variance of any unbiased estimator $\hat{\theta}$ of $\theta$ is bounded below by the inverse of the Fisher information $I(\theta)$:
$$Var(\hat{\theta}) \geq \frac{1}{I(\theta)}$$
Note that this CR lower bound is only a theoretical lower bound: it may be inapplicable (the regularity conditions fail) or unattainable (no unbiased estimator achieves it).
E.g., the CR bound is applicable but not attainable when estimating $\sigma^2$ from $X_i \overset{\text{i.i.d.}}{\sim} N(\mu, \sigma^2)$ with $\mu$ unknown, since $\mathrm{Var}(s^2)= \frac{2\sigma^4}{n-1} > \frac{2\sigma^4}{n} = $ CR bound.
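The gap in this example is easy to verify by simulation. A minimal sketch (with $\sigma = 1$, $n = 10$ chosen arbitrarily): the unbiased sample variance $s^2$ has $\mathrm{Var}(s^2) = \frac{2\sigma^4}{n-1}$, strictly above the Cramer-Rao bound $\frac{2\sigma^4}{n}$:

```python
import numpy as np

# Var(s^2) = 2 sigma^4 / (n - 1) for the unbiased sample variance,
# versus the Cramer-Rao bound 2 sigma^4 / n, which is not attained.
rng = np.random.default_rng(2)
mu, sigma, n, reps = 0.0, 1.0, 10, 200_000
x = rng.normal(mu, sigma, size=(reps, n))
s2 = x.var(axis=1, ddof=1)          # unbiased sample variance in each replication

var_s2 = np.var(s2)
print(var_s2)                       # close to 2 / (n - 1) = 0.2222...
print(2 * sigma**4 / n)             # CR bound = 0.2
```

The gap $\frac{2\sigma^4}{n-1} - \frac{2\sigma^4}{n}$ vanishes as $n \to \infty$, consistent with the MLE's asymptotic efficiency in (2).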
Reference:
Sinho Chewi's Theoretical Statistics Note