Questions & Answers

A Theoretical Result

Show that a Bayes estimator is a function of the sufficient statistic.

Resolved
The discussion has been resolved.

Accepted Answer

  • Replied by tsakanikasnickos on Wednesday, February 04 2015, 05:29 PM
    Let \( \displaystyle \underset{\sim}{X} =( X_{1}, \dots , X_{n} ) \) be a random sample from a density \( \displaystyle f(x|\theta) \), where \(\theta\) is a value of the random variable \(\Theta\) with prior density \( \displaystyle \pi_{\Theta} \). Let \( \displaystyle d^{*} \) be a Bayes estimator of \(\theta\) and let \( \mathcal{L}(\theta,d) \) be the loss incurred when \( d \) is used to estimate \(\theta\). By definition, \( \displaystyle d^{*} \) minimizes the Bayes risk. Equivalently, for each observed sample, \( \displaystyle d^{*} \) minimizes the posterior expected loss \( \displaystyle E_{\Theta | \underset{\sim}{X} }\left( \mathcal{L}(\theta,d) \right) \). We have that

    \begin{align*}
    \displaystyle
    E_{\Theta | \underset{\sim}{X} } \left( \mathcal{L}(\theta,d) \right) &= \int_{\Theta} \mathcal{L}(\theta,d) \pi_{\Theta | \underset{\sim}{X} }(\theta | \underset{\sim}{X} ) \mathrm{d}\theta \\
    &= \int_{\Theta} \mathcal{L}(\theta,d) \pi_{\Theta}(\theta) f(\underset{\sim}{X} | \theta) \frac{1}{c(\underset{\sim}{X} )} \mathrm{d}\theta \; \; \; (1)
    \end{align*}

    where \( \displaystyle c(\underset{\sim}{X} ) \) is the marginal density of the sample, which does not depend on \( \theta \). By the Fisher–Neyman factorization theorem, if \( \displaystyle T=T(\underset{\sim}{X} ) \) is a sufficient statistic, then
    \[ \displaystyle f(\underset{\sim}{X} | \theta) = g\left( T(\underset{\sim}{X} ) , \theta \right) h(\underset{\sim}{X} ) \; \; \; (2) \]
    Substituting (2) into (1), the factor \( \displaystyle \frac{h(\underset{\sim}{X} )}{c(\underset{\sim}{X} )} \) is positive and depends neither on \( \theta \) nor on \( d \). Hence minimizing
    \[ E_{\Theta | \underset{\sim}{X} } \left( \mathcal{L}(\theta,d) \right) \] with respect to \( d \) is equivalent to minimizing
    \[ \int_{\Theta} \mathcal{L}(\theta,d) \pi_{\Theta}(\theta) g\left( T(\underset{\sim}{X} ) , \theta \right) \mathrm{d}\theta \] with respect to \( d \), and this quantity depends on the sample only through \( \displaystyle T=T(\underset{\sim}{X} ) \). Since \( \displaystyle d^{*} = \arg\min_{d} E_{\Theta | \underset{\sim}{X} }\left( \mathcal{L}(\theta,d) \right) \), we conclude that the Bayes estimator \( \displaystyle d^{*} \) of \( \theta \) is a function of the sufficient statistic \( T(\underset{\sim}{X} ) \). A concrete conjugate-family illustration is given below.
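
    As an illustration, take squared-error loss \( \mathcal{L}(\theta,d)=(\theta-d)^{2} \), a \( \mathrm{Bernoulli}(\theta) \) sample and a \( \mathrm{Beta}(\alpha,\beta) \) prior (a standard conjugate example, used here only to make the result concrete). Under squared-error loss the Bayes estimator is the posterior mean, and

    \begin{align*}
    \displaystyle
    \pi_{\Theta | \underset{\sim}{X} }(\theta | \underset{\sim}{X} ) &\propto \theta^{\alpha-1}(1-\theta)^{\beta-1}\, \theta^{T}(1-\theta)^{n-T} = \theta^{\alpha+T-1}(1-\theta)^{\beta+n-T-1}, \qquad T=T(\underset{\sim}{X} )=\sum_{i=1}^{n} X_{i}, \\
    d^{*}(\underset{\sim}{X} ) &= E_{\Theta | \underset{\sim}{X} }\left( \Theta \right) = \frac{\alpha+T}{\alpha+\beta+n},
    \end{align*}

    so the Bayes estimator depends on the sample only through the sufficient statistic \( T=\sum_{i=1}^{n} X_{i} \), as claimed.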

