**6.4 Addressing the problem of prior-data conflict for the binomial distribution**

In the section on the Weibull distribution, the way to solve this problem was explained; we proceed analogously through the following steps, forming the four densities obtained from the extreme combinations of the prior parameters $\left(\underline{n}^0, \bar{n}^0\right)$ and $\left(\underline{y}^0, \bar{y}^0\right)$ [3]:

$$f\_1(p/s) = \frac{1}{\beta\left(\underline{n}^0 \underline{y}^0 + s,\, \underline{n}^0\left(1 - \underline{y}^0\right) + n - s\right)}\, p^{\underline{n}^0 \underline{y}^0 + s - 1} (1 - p)^{\underline{n}^0\left(1 - \underline{y}^0\right) + n - s - 1} \tag{55}$$

$$f\_2(p/s) = \frac{1}{\beta\left(\underline{n}^0 \bar{y}^0 + s,\, \underline{n}^0\left(1 - \bar{y}^0\right) + n - s\right)}\, p^{\underline{n}^0 \bar{y}^0 + s - 1} (1 - p)^{\underline{n}^0\left(1 - \bar{y}^0\right) + n - s - 1} \tag{56}$$

$$f\_3(p/s) = \frac{1}{\beta\left(\bar{n}^0 \underline{y}^0 + s,\, \bar{n}^0\left(1 - \underline{y}^0\right) + n - s\right)}\, p^{\bar{n}^0 \underline{y}^0 + s - 1} (1 - p)^{\bar{n}^0\left(1 - \underline{y}^0\right) + n - s - 1} \tag{57}$$

$$f\_4(p/s) = \frac{1}{\beta\left(\bar{n}^0 \bar{y}^0 + s,\, \bar{n}^0\left(1 - \bar{y}^0\right) + n - s\right)}\, p^{\bar{n}^0 \bar{y}^0 + s - 1} (1 - p)^{\bar{n}^0\left(1 - \bar{y}^0\right) + n - s - 1} \tag{58}$$

**The first posterior distribution**: From Eq. (55), by applying Bayes' rule, we obtain the first posterior distribution, as in the following equation:

$$f\_1(p/s) = \frac{1}{\beta\left(\underline{n}^0 \underline{y}^0 + s,\, \underline{n}^0\left(1 - \underline{y}^0\right) + n - s\right)}\, p^{\underline{n}^0 \underline{y}^0 + s - 1} (1 - p)^{\underline{n}^0\left(1 - \underline{y}^0\right) + n - s - 1} \tag{59}$$

The above equation represents the first posterior distribution, which is a Beta distribution; using the properties of the Beta distribution, we obtain its rth moment about the origin, as in the following equation:

$$M\_r = \frac{\Gamma\left(\underline{n}^0 + n\right) \Gamma\left(\underline{n}^0 \underline{y}^0 + s + r\right)}{\Gamma\left(\underline{n}^0 + n + r\right) \Gamma\left(\underline{n}^0 \underline{y}^0 + s\right)} \tag{60}$$
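As a quick numerical check, Eq. (60) can be evaluated directly. The sketch below (all parameter values are hypothetical, chosen only for illustration) computes the rth posterior moment using log-gamma functions for numerical stability:

```python
from math import exp, lgamma

def beta_posterior_moment(n0, y0, s, n, r):
    """rth moment E[p^r] of the Beta posterior in Eq. (59), whose shape
    parameters are a = n0*y0 + s and b = n0*(1 - y0) + n - s, evaluated
    via Eq. (60) with log-gamma values to avoid overflow."""
    a = n0 * y0 + s      # first shape parameter
    t = n0 + n           # a + b, the sum of the shape parameters
    return exp(lgamma(t) + lgamma(a + r) - lgamma(t + r) - lgamma(a))

# Hypothetical values: prior strength n0 = 4, prior mean y0 = 0.3,
# n = 20 trials with s = 7 successes
mean = beta_posterior_moment(4, 0.3, 7, 20, 1)
```

For r = 1 the formula collapses to the posterior mean $\left(\underline{n}^0 \underline{y}^0 + s\right)/\left(\underline{n}^0 + n\right)$, which provides an easy sanity check of an implementation.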

**The second posterior distribution**: From Eq. (56), by applying Bayes' rule, we obtain the second posterior distribution, as in the following equation:

$$f\_2(p/s) = \frac{1}{\beta\left(\underline{n}^0 \bar{y}^0 + s,\, \underline{n}^0\left(1 - \bar{y}^0\right) + n - s\right)}\, p^{\underline{n}^0 \bar{y}^0 + s - 1} (1 - p)^{\underline{n}^0\left(1 - \bar{y}^0\right) + n - s - 1} \tag{61}$$

The above equation represents the second posterior distribution, which is a Beta distribution; using the properties of the Beta distribution, we obtain its rth moment about the origin, as in the following equation:

*Bayesian Inference - Recent Advantages*

$$M\_r = \frac{\Gamma\left(\underline{n}^0 + n\right) \Gamma\left(\underline{n}^0 \bar{y}^0 + s + r\right)}{\Gamma\left(\underline{n}^0 + n + r\right) \Gamma\left(\underline{n}^0 \bar{y}^0 + s\right)} \tag{62}$$

**The third posterior distribution**: From Eq. (57), by applying Bayes' rule, we obtain the third posterior distribution, as in the following equation:

$$f\_3(p/s) = \frac{1}{\beta\left(\bar{n}^0 \underline{y}^0 + s,\, \bar{n}^0\left(1 - \underline{y}^0\right) + n - s\right)}\, p^{\bar{n}^0 \underline{y}^0 + s - 1} (1 - p)^{\bar{n}^0\left(1 - \underline{y}^0\right) + n - s - 1} \tag{63}$$

The above equation represents the third posterior distribution, which is a Beta distribution; using the properties of the Beta distribution, we obtain its rth moment about the origin, as in the following equation:

$$M\_r = \frac{\Gamma\left(\bar{n}^0 + n\right) \Gamma\left(\bar{n}^0 \underline{y}^0 + s + r\right)}{\Gamma\left(\bar{n}^0 + n + r\right) \Gamma\left(\bar{n}^0 \underline{y}^0 + s\right)} \tag{64}$$

**The fourth posterior distribution**: From Eq. (58), by applying Bayes' rule, we obtain the fourth posterior distribution, as in the following equation:

$$f\_4(p/s) = \frac{1}{\beta\left(\bar{n}^0 \bar{y}^0 + s,\, \bar{n}^0\left(1 - \bar{y}^0\right) + n - s\right)}\, p^{\bar{n}^0 \bar{y}^0 + s - 1} (1 - p)^{\bar{n}^0\left(1 - \bar{y}^0\right) + n - s - 1} \tag{65}$$

The above equation represents the fourth posterior distribution, which is a Beta distribution; using the properties of the Beta distribution, we obtain its rth moment about the origin, as in the following equation:

$$M\_r = \frac{\Gamma\left(\bar{n}^0 + n\right) \Gamma\left(\bar{n}^0 \bar{y}^0 + s + r\right)}{\Gamma\left(\bar{n}^0 + n + r\right) \Gamma\left(\bar{n}^0 \bar{y}^0 + s\right)} \tag{66}$$

From the four posterior distributions we obtain the lower and upper bounds of the generalized iLuck-model [10]:

$$\underline{y}^n = \text{lower}(y^n) = \begin{cases} \dfrac{\bar{n}^0 \underline{y}^0 + \tau(x)}{\bar{n}^0 + n} & \text{if } \dfrac{\tau(x)}{n} \ge \underline{y}^0 \\\\ \dfrac{\underline{n}^0 \underline{y}^0 + \tau(x)}{\underline{n}^0 + n} & \text{if } \dfrac{\tau(x)}{n} < \underline{y}^0 \end{cases} \tag{67}$$
 
$$\bar{y}^n = \text{upper}(y^n) = \begin{cases} \dfrac{\bar{n}^0 \bar{y}^0 + \tau(x)}{\bar{n}^0 + n} & \text{if } \dfrac{\tau(x)}{n} \le \bar{y}^0 \\\\ \dfrac{\underline{n}^0 \bar{y}^0 + \tau(x)}{\underline{n}^0 + n} & \text{if } \dfrac{\tau(x)}{n} > \bar{y}^0 \end{cases} \tag{68}$$
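The case analysis of Eqs. (67) and (68) can be sketched in code. This is a minimal illustration, written under the assumption that the conditions compare the sample proportion τ(x)/n with the prior mean bounds; the parameter values in the usage line are hypothetical:

```python
def iluck_bounds(n0_lower, n0_upper, y0_lower, y0_upper, tau, n):
    """Lower and upper posterior expectations of p (Eqs. 67-68).
    tau is the observed number of successes in n trials;
    (n0_lower, n0_upper) bounds the prior strength and
    (y0_lower, y0_upper) bounds the prior mean."""
    if tau / n >= y0_lower:          # data at or above the lower prior mean
        y_lower = (n0_upper * y0_lower + tau) / (n0_upper + n)
    else:                            # prior-data conflict: switch to n0_lower
        y_lower = (n0_lower * y0_lower + tau) / (n0_lower + n)
    if tau / n <= y0_upper:
        y_upper = (n0_upper * y0_upper + tau) / (n0_upper + n)
    else:
        y_upper = (n0_lower * y0_upper + tau) / (n0_lower + n)
    return y_lower, y_upper

# Hypothetical example: prior strength in [2, 8], prior mean in [0.2, 0.4]
lo, hi = iluck_bounds(2, 8, 0.2, 0.4, tau=7, n=20)
```

Note how a conflicting sample (τ(x)/n outside the prior mean interval) switches the update to the smaller prior strength, which widens the resulting posterior interval.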

Eqs. (67) and (68) represent a generalized iLuck-model: the expression giving the lower bound and the expression giving the upper bound are chosen according to the value of τ(x). The estimator we obtain is therefore an interval, so we take the average over that interval; the posterior distribution then takes its final form, as in the following equation:

*Robust Bayesian Estimation DOI: http://dx.doi.org/10.5772/intechopen.104090*

$$f\left(p/n^m, y^m\right) = \frac{1}{\beta\left(n^m y^m,\, n^m(1 - y^m)\right)}\, p^{n^m y^m - 1} (1 - p)^{n^m(1 - y^m) - 1} \tag{69}$$
 
where

$$n^m = \frac{\text{lower}(n^n) + \text{upper}(n^n)}{2}, \qquad y^m = \frac{\text{lower}(y^n) + \text{upper}(y^n)}{2}$$
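A minimal sketch of Eq. (69): the interval bounds are averaged into the midpoint parameters n^m and y^m, and the resulting Beta density is evaluated (the Beta function is computed from log-gamma values; the inputs below are hypothetical):

```python
from math import exp, lgamma, log

def averaged_posterior_pdf(p, n_lower, n_upper, y_lower, y_upper):
    """Beta density of Eq. (69), using the midpoint parameters
    n_m = (lower + upper)/2 and y_m = (lower + upper)/2."""
    n_m = 0.5 * (n_lower + n_upper)
    y_m = 0.5 * (y_lower + y_upper)
    a = n_m * y_m                 # first shape parameter  n^m * y^m
    b = n_m * (1.0 - y_m)         # second shape parameter n^m * (1 - y^m)
    log_beta = lgamma(a) + lgamma(b) - lgamma(a + b)   # log B(a, b)
    return exp((a - 1.0) * log(p) + (b - 1.0) * log(1.0 - p) - log_beta)
```

When the midpoint mean is y^m = 0.5 the density is symmetric about p = 0.5, which is a convenient check; and, like any proper density, it integrates to one over (0, 1).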
