**3.5 Addressing the problem of prior-data conflict for the Weibull distribution**

This problem is one of prior-data conflict, and it can be addressed by working with a set of prior parameters rather than a single pair. Quaeghebeur and Cooman (2005) [11] proposed fixing $n^0$ and varying $y^0$ over an interval, in short $\Pi^0 = \{n^0\} \times \left[\underline{y}^0, \bar{y}^0\right]$. Walter and Augustin (2009) later proposed varying both parameters, in short $\Pi^0 = \left[\underline{n}^0, \bar{n}^0\right] \times \left[\underline{y}^0, \bar{y}^0\right]$. The model obtained in this way is called the generalized iLuck-Model; from the parameter set we obtain a set of prior distributions, as shown below [3]:

$$f_1\left(\theta/\underline{n}^0, \underline{y}^0\right) = \frac{\left(\underline{n}^0 \underline{y}^0\right)^{\underline{n}^0+1}}{\Gamma\left(\underline{n}^0+1\right)}\,\theta^{-\left(\underline{n}^0+1\right)-1}\, e^{-\frac{\underline{n}^0 \underline{y}^0}{\theta}} \tag{23}$$

$$f_2\left(\theta/\underline{n}^0, \bar{y}^0\right) = \frac{\left(\underline{n}^0 \bar{y}^0\right)^{\underline{n}^0+1}}{\Gamma\left(\underline{n}^0+1\right)}\,\theta^{-\left(\underline{n}^0+1\right)-1}\, e^{-\frac{\underline{n}^0 \bar{y}^0}{\theta}} \tag{24}$$

$$f_3\left(\theta/\bar{n}^0, \underline{y}^0\right) = \frac{\left(\bar{n}^0 \underline{y}^0\right)^{\bar{n}^0+1}}{\Gamma\left(\bar{n}^0+1\right)}\,\theta^{-\left(\bar{n}^0+1\right)-1}\, e^{-\frac{\bar{n}^0 \underline{y}^0}{\theta}} \tag{25}$$

$$f_4\left(\theta/\bar{n}^0, \bar{y}^0\right) = \frac{\left(\bar{n}^0 \bar{y}^0\right)^{\bar{n}^0+1}}{\Gamma\left(\bar{n}^0+1\right)}\,\theta^{-\left(\bar{n}^0+1\right)-1}\, e^{-\frac{\bar{n}^0 \bar{y}^0}{\theta}} \tag{26}$$

where:

$\underline{n}^0$: minimum of $n^0$.

$\bar{n}^0$: maximum of $n^0$.

$\underline{y}^0$: minimum of $y^0$.

$\bar{y}^0$: maximum of $y^0$.

The above equations represent the set of prior distributions obtained through the iLuck-Model. From them we extract the posterior set of distributions according to the following steps:
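Numerically, each corner prior in Eqs. (23)–(26) is an inverse gamma density with shape $n^0 + 1$ and scale $n^0 y^0$. The following is a minimal sketch of evaluating these densities; the parameter values are hypothetical, chosen only for illustration:

```python
import math

def inv_gamma_pdf(theta, shape, scale):
    """Inverse gamma density: scale^shape / Gamma(shape) * theta^(-shape-1) * exp(-scale/theta)."""
    return (scale ** shape / math.gamma(shape)
            * theta ** (-shape - 1) * math.exp(-scale / theta))

def corner_prior_pdf(theta, n0, y0):
    """Eqs. (23)-(26): f(theta | n0, y0), inverse gamma with shape n0+1, scale n0*y0."""
    return inv_gamma_pdf(theta, n0 + 1, n0 * y0)

# Evaluate all four corner priors at theta = 2 for a hypothetical
# prior parameter set [n_lo, n_hi] x [y_lo, y_hi] = [1, 5] x [2, 4].
for n0 in (1.0, 5.0):
    for y0 in (2.0, 4.0):
        print(f"n0={n0}, y0={y0}: f(2) = {corner_prior_pdf(2.0, n0, y0):.6f}")
```

All four densities share one functional form; only the corner of the parameter rectangle changes, which is what makes the later interval analysis tractable.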

**The first posterior distribution**: From Eq. (23), using Bayes' rule, we get the first posterior distribution, as in the following equation:

$$f_1(\theta/t) = \frac{\left(\underline{n}^0 \underline{y}^0 + \tau(t)\right)^{\underline{n}^0+n+1}}{\Gamma\left(\underline{n}^0+n+1\right)}\,\theta^{-\left(\underline{n}^0+n+1\right)-1}\, e^{-\frac{\underline{n}^0 \underline{y}^0 + \tau(t)}{\theta}} \tag{27}$$

The above equation represents the first posterior distribution, which is an Inverse Gamma distribution. Using the properties of the Inverse Gamma distribution, we obtain the moments as in the following equation:

$$M_r = \frac{\left(\underline{n}^0 \underline{y}^0 + \tau(t)\right)^{r}}{\Gamma\left(\underline{n}^0+n+1\right)}\,\Gamma\left(\underline{n}^0+n-r+1\right) \tag{28}$$
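The conjugate update behind Eq. (27) and the moment formula of Eq. (28) can be sketched as follows; the log-gamma function is used for numerical stability, and all parameter values are hypothetical:

```python
import math

def posterior_params(n0, y0, n, tau):
    """Eq. (27): prior IG(n0 + 1, n0*y0) combined with data summary (n, tau(t))
    gives the posterior IG(n0 + n + 1, n0*y0 + tau(t))."""
    return n0 + n + 1, n0 * y0 + tau

def posterior_moment(r, n0, y0, n, tau):
    """Eq. (28): M_r = (n0*y0 + tau)^r * Gamma(n0+n-r+1) / Gamma(n0+n+1).

    Finite only for r < n0 + n + 1; evaluated on the log scale to avoid
    overflow in the gamma functions.
    """
    b = n0 * y0 + tau
    return math.exp(r * math.log(b)
                    + math.lgamma(n0 + n - r + 1)
                    - math.lgamma(n0 + n + 1))

# With hypothetical values n0 = 2, y0 = 5, n = 5, tau(t) = 0, the posterior is
# IG(8, 10), so M_1 = b / (n0 + n) = 10/7, the usual inverse gamma mean b/(a-1).
print(posterior_moment(1, 2.0, 5.0, 5, 0.0))
```

The same routine serves Eqs. (30), (32), and (34) as well, since the four posteriors differ only in which corner of the prior parameter set is plugged in.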

**The second posterior distribution**: From Eq. (24), using Bayes' rule, we get the second posterior distribution, as in the following equation:

$$f_2(\theta/t) = \frac{\left(\underline{n}^0 \bar{y}^0 + \tau(t)\right)^{\underline{n}^0+n+1}}{\Gamma\left(\underline{n}^0+n+1\right)}\,\theta^{-\left(\underline{n}^0+n+1\right)-1}\, e^{-\frac{\underline{n}^0 \bar{y}^0 + \tau(t)}{\theta}} \tag{29}$$

In short:

$$f_2(\theta/t) \sim \mathrm{IG}\left(\underline{n}^0+n+1,\ \underline{n}^0 \bar{y}^0 + \tau(t)\right)$$

The above equation represents the second posterior distribution, which is an Inverse Gamma distribution. Using the properties of the Inverse Gamma distribution, we obtain the moments as in the following equation:

$$M_r = \frac{\left(\underline{n}^0 \bar{y}^0 + \tau(t)\right)^{r}}{\Gamma\left(\underline{n}^0+n+1\right)}\,\Gamma\left(\underline{n}^0+n-r+1\right) \tag{30}$$

**The third posterior distribution**: From Eq. (25), using Bayes' rule, we get the third posterior distribution, as in the following equation:

$$f_3(\theta/t) = \frac{\left(\bar{n}^0 \underline{y}^0 + \tau(t)\right)^{\bar{n}^0+n+1}}{\Gamma\left(\bar{n}^0+n+1\right)}\,\theta^{-\left(\bar{n}^0+n+1\right)-1}\, e^{-\frac{\bar{n}^0 \underline{y}^0 + \tau(t)}{\theta}} \tag{31}$$
 
$$f_3(\theta/t) \sim \mathrm{IG}\left(\bar{n}^0+n+1,\ \bar{n}^0 \underline{y}^0 + \tau(t)\right)$$

The above equation represents the third posterior distribution, which is an Inverse Gamma distribution. Using the properties of the Inverse Gamma distribution, we obtain the moments as in the following equation:

$$M_r = \frac{\left(\bar{n}^0 \underline{y}^0 + \tau(t)\right)^{r}}{\Gamma\left(\bar{n}^0+n+1\right)}\,\Gamma\left(\bar{n}^0+n-r+1\right) \tag{32}$$

**The fourth posterior distribution**: From Eq. (26), using Bayes' rule, we get the fourth posterior distribution, as in the following equation:

$$f_4(\theta/t) = \frac{\left(\bar{n}^0 \bar{y}^0 + \tau(t)\right)^{\bar{n}^0+n+1}}{\Gamma\left(\bar{n}^0+n+1\right)}\,\theta^{-\left(\bar{n}^0+n+1\right)-1}\, e^{-\frac{\bar{n}^0 \bar{y}^0 + \tau(t)}{\theta}} \tag{33}$$
 
$$f_4(\theta/t) \sim \mathrm{IG}\left(\bar{n}^0+n+1,\ \bar{n}^0 \bar{y}^0 + \tau(t)\right)$$

The above equation represents the fourth posterior distribution, which is an Inverse Gamma distribution. Using the properties of the Inverse Gamma distribution, we obtain the moments as in the following equation:

$$M_r = \frac{\left(\bar{n}^0 \bar{y}^0 + \tau(t)\right)^{r}}{\Gamma\left(\bar{n}^0+n+1\right)}\,\Gamma\left(\bar{n}^0+n-r+1\right) \tag{34}$$

From the set of posterior distributions we obtain the lower and upper bounds of the updated parameter $y^n$, which define the generalized iLuck-Model [10]:

$$\underline{\mathbf{y}}^{\text{n}} = \text{lower } \left( \mathbf{y}^{\text{n}} \right) = \begin{cases} \frac{\bar{\mathbf{n}}^{0} \underline{\mathbf{y}}^{0} + \tau(\mathbf{t})}{\bar{\mathbf{n}}^{0} + \mathbf{n}} & \text{if } \overline{\tau}(\mathbf{t}) \ge \underline{\mathbf{y}}^{0} \\\\ \frac{\underline{\mathbf{n}}^{0} \underline{\mathbf{y}}^{0} + \tau(\mathbf{t})}{\underline{\mathbf{n}}^{0} + \mathbf{n}} & \text{if } \overline{\tau}(\mathbf{t}) < \underline{\mathbf{y}}^{0} \end{cases} \tag{35}$$

*Robust Bayesian Estimation DOI: http://dx.doi.org/10.5772/intechopen.104090*

$$\bar{\mathbf{y}}^{\text{n}} = \text{upper}(\mathbf{y}^{\text{n}}) = \begin{cases} \frac{\bar{\mathbf{n}}^{0}\bar{\mathbf{y}}^{0} + \tau(\mathbf{t})}{\bar{\mathbf{n}}^{0} + \mathbf{n}} \text{ if } \bar{\tau}(\mathbf{t}) \le \bar{\mathbf{y}}^{0} \\\\ \frac{\underline{\mathbf{n}}^{0}\bar{\mathbf{y}}^{0} + \tau(\mathbf{t})}{\underline{\mathbf{n}}^{0} + \mathbf{n}} \text{ if } \bar{\tau}(\mathbf{t}) > \bar{\mathbf{y}}^{0} \end{cases} \tag{36}$$

Eqs. (35) and (36) represent the generalized iLuck-Model: which expression applies in each bound is chosen based on the value of $\bar{\tau}(t)$, so the estimator we obtain takes the form of an interval. We therefore take the midpoint of that interval, and the posterior distribution takes the following form:

$$f\left(\theta/n^m, y^m\right) = \frac{\left(n^m y^m\right)^{n^m+1}}{\Gamma\left(n^m+1\right)}\,\theta^{-\left(n^m+1\right)-1}\, e^{-\frac{n^m y^m}{\theta}} \tag{37}$$

$$\mathbf{n}^{\mathrm{m}} = \frac{\mathrm{lower}(\mathbf{n}^{\mathrm{n}}) + \mathrm{upper}(\mathbf{n}^{\mathrm{n}})}{2}, \mathbf{y}^{\mathrm{m}} = \frac{\mathrm{lower}(\mathbf{y}^{\mathrm{n}}) + \mathrm{upper}(\mathbf{y}^{\mathrm{n}})}{2}$$
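The update in Eqs. (35)–(37) can be sketched as a small routine. Here $\bar{\tau}(t)$ is taken to be the sample mean $\tau(t)/n$, which is an assumption about the notation, and all numeric values are hypothetical:

```python
def iluck_update(n_lo, n_hi, y_lo, y_hi, n, tau):
    """Generalized iLuck-Model update, Eqs. (35)-(37).

    Returns (y_lower, y_upper, n_m, y_m), where n_m and y_m are the midpoint
    parameters feeding the single posterior of Eq. (37).
    Assumes tau_bar(t) = tau / n (an interpretation, not stated in the text).
    """
    tau_bar = tau / n

    # Eq. (35): lower bound of the updated parameter y^n.
    if tau_bar >= y_lo:
        y_lower = (n_hi * y_lo + tau) / (n_hi + n)
    else:
        y_lower = (n_lo * y_lo + tau) / (n_lo + n)

    # Eq. (36): upper bound of the updated parameter y^n.
    if tau_bar <= y_hi:
        y_upper = (n_hi * y_hi + tau) / (n_hi + n)
    else:
        y_upper = (n_lo * y_hi + tau) / (n_lo + n)

    # Midpoints of the updated parameter intervals, as used in Eq. (37).
    n_m = ((n_lo + n) + (n_hi + n)) / 2
    y_m = (y_lower + y_upper) / 2
    return y_lower, y_upper, n_m, y_m

# Hypothetical prior set [1, 5] x [2, 4] with n = 10 observations, tau(t) = 30:
print(iluck_update(1.0, 5.0, 2.0, 4.0, 10, 30.0))
```

Note the case structure: when the data summary falls inside the prior interval, the larger $\bar{n}^0$ is used, keeping the posterior interval tight; under prior-data conflict the smaller $\underline{n}^0$ applies, widening the interval to reflect the conflict.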
