**Abstract**

Bayesian methods provide the means for studying probabilistic models of both linear and non-linear stochastic systems. They allow tracking changes in probability distributions by applying Bayes's theorem and the chain rule for factoring joint probabilities. However, the complexity of the resulting distributions often dictates the use of numerical methods when performing statistical and causal inference over probabilistic models. In this chapter, the Bayesian methods for intractable distributions are first introduced, grouped into sampling, filtering, approximation, and likelihood-free methods. Their fundamental principles are explained, and the key challenges are identified. This concise survey of Bayesian methods is followed by an outline of their applications. In particular, Bayesian experiment design aims at maximizing information gain or utility, and it is often combined with optimal model selection. Bayesian hypothesis testing introduces optimality into data-driven decision making. Bayesian machine learning treats data labels as random variables. Bayesian optimization is a powerful strategy for configuring and optimizing large-scale complex systems for which conventional optimization techniques are usually ineffective. The chapter concludes by examining Bayesian Monte Carlo simulations. It is proposed that augmented Monte Carlo simulations can achieve explainability while also providing substantially better information efficiency.
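For reference, the posterior update via Bayes's theorem and the chain-rule factorization mentioned above can be written as follows (with $\theta$ denoting model parameters and $\mathcal{D}$ the observed data; the notation here is a generic sketch rather than the chapter's own):

```latex
% Bayes's theorem: posterior = likelihood x prior / evidence
p(\theta \mid \mathcal{D}) \;=\; \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})},
\qquad
p(\mathcal{D}) \;=\; \int p(\mathcal{D} \mid \theta)\, p(\theta)\, \mathrm{d}\theta .

% Chain rule for factoring a joint probability over variables x_1, ..., x_n
p(x_1, \ldots, x_n) \;=\; \prod_{i=1}^{n} p\!\left(x_i \mid x_1, \ldots, x_{i-1}\right).
```

The intractability driving the numerical methods surveyed in the chapter typically stems from the evidence integral $p(\mathcal{D})$, which rarely admits a closed form for non-linear or high-dimensional models.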

**Keywords:** Bayesian analysis, distribution, Monte Carlo, numerical method, machine learning, optimization, posterior, prior, simulation, statistical inference
