Bayesian Parameter Estimation Algorithm

Latest revision as of 07:27, 22 August 2024

A Bayesian Parameter Estimation Algorithm is a parameter estimation algorithm that is a Bayesian algorithm: it estimates model parameters by computing (or approximating) a posterior distribution that combines a prior distribution with the likelihood of the observed data.
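The definition above can be illustrated with the standard conjugate Beta-Bernoulli case. This is a minimal sketch, not from the article; the Beta(2, 2) prior and the 7-success/3-failure data are assumptions chosen for the example.

```python
# Illustrative sketch of Bayesian parameter estimation for a Bernoulli
# parameter theta with a conjugate Beta prior. The prior Beta(2, 2) and
# the observed counts (7 successes, 3 failures) are assumed for the example.
from fractions import Fraction

def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + Bernoulli data -> Beta posterior."""
    return alpha + successes, beta + failures

# Posterior is Beta(2 + 7, 2 + 3) = Beta(9, 5).
a_post, b_post = beta_binomial_update(2, 2, 7, 3)
posterior_mean = Fraction(a_post, a_post + b_post)         # E[theta | X] = 9/14
posterior_map = Fraction(a_post - 1, a_post + b_post - 2)  # mode of Beta(9, 5) = 2/3
```

The conjugacy is what makes the update a one-line arithmetic operation; for non-conjugate models the posterior must be computed numerically.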



References

2014

  • http://en.wikipedia.org/wiki/Bayesian_inference#Estimates_of_parameters_and_predictions
    • It is often desired to use a posterior distribution to estimate a parameter or variable. Several methods of Bayesian estimation select measures of central tendency from the posterior distribution. For one-dimensional problems, a unique median exists for practical continuous problems. The posterior median is attractive as a robust estimator.[1]

      If the posterior distribution has a finite mean, then the posterior mean is a method of estimation:

      [math]\displaystyle{ \tilde \theta = \operatorname{E}[\theta] = \int_\theta \theta \, p(\theta \mid \mathbf{X},\alpha) \, d\theta }[/math]

      Taking the value with the greatest probability defines maximum a posteriori (MAP) estimates:

      [math]\displaystyle{ \{ \theta_{\text{MAP}}\} \subset \arg \max_\theta p(\theta \mid \mathbf{X},\alpha) . }[/math]

      There are examples where no maximum is attained, in which case the set of MAP estimates is empty.
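The three point estimates named above (posterior median, posterior mean, and MAP) can be computed from a posterior density on a grid. The sketch below is an assumed example, not from the article; it uses an unnormalized Beta(9, 5) posterior so the results can be checked against closed forms.

```python
# Grid-based computation of the posterior mean, median, and MAP estimate
# from an unnormalized posterior density. The Beta(9, 5) posterior is an
# assumption chosen so the answers have known closed forms.
import numpy as np

def point_estimates(theta, unnorm_posterior):
    """Posterior mean, median, and MAP from an unnormalized density on a grid."""
    dtheta = theta[1] - theta[0]
    p = unnorm_posterior / (unnorm_posterior.sum() * dtheta)  # normalize to a density
    mean = np.sum(theta * p) * dtheta                         # E[theta | X]
    cdf = np.cumsum(p) * dtheta
    median = theta[np.searchsorted(cdf, 0.5)]                 # first point with CDF >= 0.5
    map_est = theta[np.argmax(p)]                             # argmax of the density
    return mean, median, map_est

theta = np.linspace(1e-6, 1 - 1e-6, 10001)
post = theta**8 * (1 - theta)**4   # unnormalized Beta(9, 5) posterior
mean, median, map_est = point_estimates(theta, post)
# Closed forms for Beta(9, 5): mean = 9/14, mode (MAP) = 8/12 = 2/3.
```

Because this posterior is skewed, the three estimates differ: the mean, median, and mode are distinct points, which is why the choice among them matters.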

      There are other methods of estimation that minimize the posterior risk (expected-posterior loss) with respect to a loss function; these are of interest to statistical decision theory using the sampling distribution ("frequentist statistics").
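Minimizing posterior expected loss can be sketched numerically. Under squared-error loss the minimizer is the posterior mean, and under absolute-error loss it is the posterior median; the Beta(9, 5) posterior below is an assumed example, not from the article.

```python
# Choosing a Bayes estimate by minimizing posterior expected loss on a grid.
# The unnormalized Beta(9, 5) posterior is an assumption for the example.
import numpy as np

theta = np.linspace(1e-6, 1 - 1e-6, 2001)
dtheta = theta[1] - theta[0]
post = theta**8 * (1 - theta)**4   # unnormalized Beta(9, 5) posterior
post /= post.sum() * dtheta        # normalize to a density

def bayes_estimate(loss):
    """Return the candidate estimate minimizing posterior expected loss."""
    risks = [np.sum(loss(t, theta) * post) * dtheta for t in theta]
    return theta[int(np.argmin(risks))]

est_squared = bayes_estimate(lambda t, th: (t - th) ** 2)    # recovers posterior mean
est_absolute = bayes_estimate(lambda t, th: np.abs(t - th))  # recovers posterior median
```

Different loss functions thus select different central-tendency summaries of the same posterior, which is the decision-theoretic view of the estimators listed above.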

      The posterior predictive distribution of a new observation [math]\displaystyle{ \tilde{x} }[/math] (that is independent of previous observations) is determined by:

      [math]\displaystyle{ p(\tilde{x}|\mathbf{X},\alpha) = \int_\theta p(\tilde{x},\theta \mid \mathbf{X},\alpha) \, d\theta = \int_\theta p(\tilde{x} \mid \theta) p(\theta \mid \mathbf{X},\alpha) \, d\theta . }[/math]
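The posterior predictive integral above can be sketched numerically for the Bernoulli case, where p(x̃ = 1 | θ) = θ. The Beta(9, 5) posterior is an assumed example so the integral has a closed form, a / (a + b).

```python
# Numerical evaluation of the posterior predictive integral
#   p(x~ = 1 | X) = ∫ p(x~ = 1 | theta) p(theta | X) dtheta
# for a Bernoulli observation with an assumed Beta(9, 5) posterior.
import numpy as np

a, b = 9, 5                                    # assumed posterior Beta parameters
theta = np.linspace(1e-6, 1 - 1e-6, 10001)
dtheta = theta[1] - theta[0]
density = theta**(a - 1) * (1 - theta)**(b - 1)
density /= density.sum() * dtheta              # normalized Beta(a, b) density

# p(x~ = 1 | theta) = theta, so the integrand is theta * p(theta | X):
pred_one = np.sum(theta * density) * dtheta    # closed form: a / (a + b) = 9/14
```

Note that the predictive probability averages over the whole posterior rather than plugging in a single point estimate, which is the distinctive feature of this formula.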

  1. Sen, Pranab K.; Keating, J. P.; Mason, R. L. (1993). Pitman's measure of closeness: A comparison of statistical estimators. Philadelphia: SIAM. 

2012

  • (Levy, 2012) ⇒ Roger Levy. (2012). “Probabilistic Models in the Study of Language - Chapter 4: Parameter Estimation.” http://idiom.ucsd.edu/~rlevy/pmsl_textbook/chapters/pmsl_4.pdf
    • QUOTE: … In this chapter we delve more deeply into the theory of probability density estimation, focusing on inference within parametric families of probability distributions (see discussion in Section 2.11.2). We start with some important properties of estimators, then turn to basic frequentist parameter estimation (maximum-likelihood estimation and corrections for bias), and finally basic Bayesian parameter estimation.