**3. Bayesian parameter estimation**

To introduce, in a very simple way, the Bayes rule for parameter estimation, we consider the case where we have a set of data $\mathbf{g} = \{g_1, \cdots, g_n\}$, to which we assign a probability law $p(g_i|\boldsymbol{\theta})$ with a set of unknown parameters $\boldsymbol{\theta}$. The question now is how to infer $\boldsymbol{\theta}$ from those data. We can immediately use the Bayes rule:

$$p(\boldsymbol{\theta}|\mathbf{g}) = \frac{p(\mathbf{g}|\boldsymbol{\theta})p(\boldsymbol{\theta})}{p(\mathbf{g})} \propto l(\boldsymbol{\theta})p(\boldsymbol{\theta}) \tag{8}$$

where:


$$p(\mathbf{g}) = \int p(\mathbf{g}|\boldsymbol{\theta}) p(\boldsymbol{\theta}) \, \mathrm{d}\boldsymbol{\theta} \tag{9}$$

is called the *evidence*.
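As a minimal numerical sketch of Eqs. (8) and (9), suppose the data are Bernoulli draws with a single unknown parameter $\theta$ (the success probability) and a flat prior; the model, the prior, and all variable names below are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.binomial(1, 0.7, size=50)       # simulated data g = {g_1, ..., g_n}

# Evaluate the posterior on a one-dimensional grid of theta values:
theta = np.linspace(1e-3, 1 - 1e-3, 1000)
dtheta = theta[1] - theta[0]
likelihood = theta ** g.sum() * (1 - theta) ** (len(g) - g.sum())  # l(theta)
prior = np.ones_like(theta)             # flat prior p(theta) on [0, 1]

unnorm = likelihood * prior             # numerator of Eq. (8)
evidence = (unnorm * dtheta).sum()      # p(g), Eq. (9), by quadrature
posterior = unnorm / evidence           # p(theta | g), which integrates to 1
```

On a one-dimensional grid the evidence integral reduces to a simple quadrature sum; in higher dimensions this normalizing integral is precisely what makes $p(\mathbf{g})$ hard to compute.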

So, the process of using the Bayes rule for parameter estimation can be summarized as follows: first, write the expression of the posterior law $p(\boldsymbol{\theta}|\mathbf{g})$ via Eq. (8); then, from this posterior, we can:

◦ Compute the expected value of the posterior, called Expected A Posteriori (EAP) or Posterior Mean (PM):

$$
\hat{\boldsymbol{\theta}}\_{\rm PM} = \int \boldsymbol{\theta} p(\boldsymbol{\theta} | \mathbf{g}) \, \mathrm{d}\boldsymbol{\theta} \tag{10}
$$

◦ Compute the value of $\boldsymbol{\theta}$ for which $p(\boldsymbol{\theta}|\mathbf{g})$ is maximum, called Maximum A Posteriori (MAP):

$$\hat{\theta}\_{\text{MAP}} = \underset{\theta}{\text{arg}\,\text{max}}\,\{p(\theta|\mathbf{g})\}\tag{11}$$

◦ Sample from the posterior and explore it [Monte Carlo methods]:

$$
\boldsymbol{\theta} \sim p(\boldsymbol{\theta} | \mathbf{g})
$$

which gives the possibility to obtain any statistical information we want about $\boldsymbol{\theta}$. For example, if we generate $N$ samples $\{\boldsymbol{\theta}_1, \cdots, \boldsymbol{\theta}_N\}$, then for large enough $N$ we have:

$$\mathrm{E}\{\theta\} \simeq \frac{1}{N} \sum\_{n=1}^{N} \theta\_n. \tag{12}$$
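The three options above can be sketched numerically on a toy Bernoulli model with a flat prior (an illustrative assumption, not from the text); with this choice the posterior is a $\mathrm{Beta}(s+1,\, n-s+1)$ density, so we can both evaluate it on a grid and draw exact samples from it:

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.binomial(1, 0.7, size=100)      # simulated Bernoulli data
s, n = int(g.sum()), len(g)

# Posterior p(theta | g) on a grid (flat prior, Bernoulli likelihood):
theta = np.linspace(1e-3, 1 - 1e-3, 2000)
dtheta = theta[1] - theta[0]
log_post = s * np.log(theta) + (n - s) * np.log1p(-theta)
post = np.exp(log_post - log_post.max())
post /= (post * dtheta).sum()           # normalize numerically

# Posterior Mean (PM), Eq. (10), by quadrature:
theta_pm = (theta * post * dtheta).sum()

# Maximum A Posteriori (MAP), Eq. (11), by grid search:
theta_map = theta[np.argmax(post)]

# Monte Carlo, Eq. (12): here the posterior is Beta(s + 1, n - s + 1),
# so we can draw exact samples and average them:
samples = rng.beta(s + 1, n - s + 1, size=10_000)
theta_mc = samples.mean()
```

For this conjugate case the closed forms are $\hat{\theta}_{\rm PM} = (s+1)/(n+2)$ and $\hat{\theta}_{\rm MAP} = s/n$, which the grid and Monte Carlo estimates reproduce to within grid and sampling error.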
