Statistical data analysis See the separate Wikipedia entry on
Bayesian statistics, specifically the
statistical modeling section of that article.
Computer applications Bayesian inference has applications in
artificial intelligence and
expert systems. Bayesian inference techniques have been a fundamental part of computerized
pattern recognition since the late 1950s. There is also an ever-growing connection between Bayesian methods and simulation-based
Monte Carlo techniques since complex models cannot be processed in closed form by a Bayesian analysis, while a
graphical model structure
may allow for efficient simulation algorithms such as
Gibbs sampling and other
Metropolis–Hastings schemes. Recently, Bayesian inference has gained popularity among the
phylogenetics community for these reasons; a number of applications allow many demographic and evolutionary parameters to be estimated simultaneously. As applied to
statistical classification, Bayesian inference has been used to develop algorithms for identifying
e-mail spam. Applications which make use of Bayesian inference for spam filtering include
CRM114,
DSPAM,
Bogofilter,
SpamAssassin,
SpamBayes,
Mozilla, XEAMS, and others. Spam classification is treated in more detail in the article on the
naïve Bayes classifier.
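The core of such a spam filter can be sketched as a toy naive Bayes classifier. The training messages, word counts, and 50/50 prior below are invented for illustration and do not reflect any real corpus or any of the filters named above:

```python
import math
from collections import Counter

# Hypothetical training messages (not real corpus data).
spam = ["win money now", "free money offer", "win a free prize"]
ham = ["meeting at noon", "project update attached", "lunch at noon"]

def train(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(words, counts, total):
    # Laplace smoothing avoids zero probabilities for unseen words.
    return sum(math.log((counts[w] + 1) / (total + len(vocab))) for w in words)

def p_spam(message, prior_spam=0.5):
    """Posterior probability that a message is spam, via Bayes' theorem."""
    words = message.split()
    log_spam = math.log(prior_spam) + log_likelihood(words, spam_counts, spam_total)
    log_ham = math.log(1 - prior_spam) + log_likelihood(words, ham_counts, ham_total)
    # Convert the log-odds back into a probability.
    return 1 / (1 + math.exp(log_ham - log_spam))

print(p_spam("free money"))       # close to 1
print(p_spam("meeting at noon"))  # close to 0
```

The "naive" assumption is visible in `log_likelihood`: word occurrences are treated as conditionally independent given the class, so the message likelihood factorises into per-word terms.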
Solomonoff's inductive inference is the theory of prediction based on observations; for example, predicting the next symbol based upon a given series of symbols. The only assumption is that the environment follows some unknown but computable
probability distribution. It is a formal inductive framework that combines two well-studied principles of inductive inference: Bayesian statistics and
Occam's Razor. Solomonoff's universal prior probability of any prefix
p of a computable sequence
x is the sum of the probabilities of all programs (for a universal computer) that compute something starting with
p. Given some
p and any computable but unknown probability distribution from which
x is sampled, the universal prior and Bayes' theorem can be used to predict the yet unseen parts of
x in optimal fashion.
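The universal prior itself is uncomputable, but the flavour of the framework can be sketched with a computable stand-in: a Bayesian mixture over a small, hand-picked hypothesis class (here, Bernoulli sources with fixed biases) rather than over all programs. The hypothesis class and uniform prior below are illustrative choices, not part of Solomonoff's construction:

```python
# Toy Bayesian sequence prediction: a mixture over five Bernoulli
# hypotheses stands in for the (uncomputable) universal prior.
biases = [0.1, 0.3, 0.5, 0.7, 0.9]  # hypothesis class: P(next bit = 1)
prior = [0.2] * 5                    # uniform prior, for simplicity

def posterior(bits, prior, biases):
    """Update P(hypothesis | observed bits) by Bayes' theorem, bit by bit."""
    post = list(prior)
    for b in bits:
        post = [p * (th if b == 1 else 1 - th) for p, th in zip(post, biases)]
        z = sum(post)
        post = [p / z for p in post]  # renormalise after each observation
    return post

def predict_next(bits):
    """Posterior-weighted (mixture) probability that the next bit is 1."""
    post = posterior(bits, prior, biases)
    return sum(p * th for p, th in zip(post, biases))

print(predict_next([1, 1, 1, 1, 1, 0, 1, 1]))  # high: data favour biased-to-1 sources
```

As more bits arrive, the posterior concentrates on the hypothesis closest to the true source, and the mixture prediction converges to it, mirroring (in miniature) the optimality property described above.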
Bioinformatics and healthcare applications Bayesian inference has been applied in various
bioinformatics applications, including differential gene expression analysis. Bayesian inference is also used in a general cancer risk model, called
CIRI (Continuous Individualized Risk Index), where serial measurements are incorporated to update a Bayesian model which is primarily built from prior knowledge.
Cosmology and astrophysical applications The Bayesian approach has been central to recent progress in cosmology and astrophysics, and extends to a wide range of astrophysical problems, including the characterisation of exoplanets (such as fitting the atmosphere of
K2-18b), parameter constraints from cosmological data, and the calibration of astrophysical experiments. In cosmology, it is often employed with computational techniques such as
Markov chain Monte Carlo (MCMC) and
nested sampling algorithms to analyse complex datasets and navigate high-dimensional parameter spaces. A notable application is parameter inference from the Planck 2018 CMB data. The Bayesian inference code for cosmology `cobaya` sets up cosmological runs and interfaces cosmological likelihoods and Boltzmann codes, which compute the predicted CMB anisotropies for any given set of cosmological parameters, with MCMC or nested samplers. This computational framework is not limited to the standard model; it is also essential for testing alternative or extended theories of cosmology, such as theories with early dark energy, or modified gravity theories introducing additional parameters beyond Lambda-CDM.
Bayesian model comparison can then be employed to calculate the evidence for competing models, providing a statistical basis for assessing whether the data support them over standard Lambda-CDM.
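The Metropolis–Hastings step at the heart of such MCMC samplers can be sketched in a few lines. The Gaussian log-posterior below is a hypothetical stand-in for a real cosmological likelihood, which in practice would be evaluated by a Boltzmann code at each proposed parameter point; the target value 0.3 and width 0.02 are invented:

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def log_post(theta):
    # Hypothetical 1-D posterior: Gaussian centred on theta = 0.3
    # (think of a matter-density-like parameter), width 0.02.
    return -0.5 * ((theta - 0.3) / 0.02) ** 2

def metropolis_hastings(n_steps, step=0.01, theta=0.5):
    """Random-walk Metropolis sampler targeting exp(log_post)."""
    samples = []
    lp = log_post(theta)
    for _ in range(n_steps):
        prop = theta + random.gauss(0, step)   # symmetric proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

chain = metropolis_hastings(20000)
burned = chain[5000:]                      # discard burn-in
print(sum(burned) / len(burned))           # posterior mean estimate, near 0.3
```

Real cosmological runs differ mainly in scale: the parameter is a vector (e.g. the six Lambda-CDM parameters plus any extensions), and each `log_post` evaluation is an expensive likelihood computation rather than a one-line formula.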
In the courtroom Bayesian inference can be used by jurors to coherently accumulate the evidence for and against a defendant, and to see whether, in totality, it meets their personal threshold for "
beyond a reasonable doubt". Bayes' theorem is applied successively to all evidence presented, with the posterior from one stage becoming the prior for the next. The benefit of a Bayesian approach is that it gives the juror an unbiased, rational mechanism for combining evidence. It may be appropriate to explain Bayes' theorem to jurors in
odds form, as
betting odds are more widely understood than probabilities. Alternatively, a
logarithmic approach, replacing multiplication with addition, might be easier for a jury to handle. If the existence of the crime is not in doubt, only the identity of the culprit, it has been suggested that the prior should be uniform over the qualifying population. For example, if 1,000 people could have committed the crime, the prior probability of guilt would be 1/1000. The use of Bayes' theorem by jurors is controversial. In the United Kingdom, a defence
expert witness explained Bayes' theorem to the jury in
R v Adams. The jury convicted, but the case went to appeal on the basis that no means of accumulating evidence had been provided for jurors who did not wish to use Bayes' theorem. The Court of Appeal upheld the conviction, but it also gave the opinion that "To introduce Bayes' Theorem, or any similar method, into a criminal trial plunges the jury into inappropriate and unnecessary realms of theory and complexity, deflecting them from their proper task." Gardner-Medwin argues that the criterion on which a verdict in a criminal trial should be based is
not the probability of guilt, but rather the
probability of the evidence, given that the defendant is innocent (akin to a
frequentist p-value). He argues that if the posterior probability of guilt is to be computed by Bayes' theorem, the prior probability of guilt must be known. This will depend on the incidence of the crime, which is an unusual piece of evidence to consider in a criminal trial. Consider the following three propositions:
A – the known facts and testimony could have arisen if the defendant is guilty.
B – the known facts and testimony could have arisen if the defendant is innocent.
C – the defendant is guilty. Gardner-Medwin argues that the jury should believe both
A and not-
B in order to convict.
A and not-
B implies the truth of
C, but the reverse is not true. It is possible that
B and
C are both true, but in this case he argues that a jury should acquit, even though they know that they will be letting some guilty people go free. See also
Lindley's paradox.
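The odds-form and logarithmic bookkeeping described above can be illustrated with invented numbers: a uniform prior over 1,000 possible culprits and two hypothetical pieces of evidence, each summarised by a likelihood ratio P(evidence | guilty) / P(evidence | innocent):

```python
import math

# Prior odds from a uniform prior over 1,000 qualifying people:
# P(guilt) = 1/1000, so odds of guilt are 1:999.
prior_odds = 1 / 999
likelihood_ratios = [100.0, 50.0]  # hypothetical strengths of two items of evidence

# Bayes' theorem in odds form: posterior odds = prior odds × likelihood ratio,
# applied successively, the posterior after one item becoming the next prior.
posterior_odds = prior_odds
for lr in likelihood_ratios:
    posterior_odds *= lr

posterior_prob = posterior_odds / (1 + posterior_odds)
print(round(posterior_prob, 3))  # ≈ 0.833

# Equivalent logarithmic bookkeeping: multiplication becomes addition.
log_odds = math.log10(prior_odds) + sum(math.log10(lr) for lr in likelihood_ratios)
print(round(10 ** log_odds / (1 + 10 ** log_odds), 3))  # same result
```

With these invented numbers the posterior probability of guilt is about 0.83, well short of most jurors' threshold for "beyond a reasonable doubt", illustrating how a strong-sounding likelihood ratio can be offset by a small prior.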
Bayesian epistemology Bayesian epistemology is a movement that advocates for Bayesian inference as a means of justifying the rules of inductive logic.
Karl Popper and
David Miller have rejected the idea of Bayesian rationalism, i.e. using Bayes' rule to make epistemological inferences: it is prone to the same
vicious circle as any other
justificationist epistemology, because it presupposes what it attempts to justify. According to this view, a rational interpretation of Bayesian inference would see it merely as a probabilistic version of
falsification, rejecting the belief, commonly held by Bayesians, that high likelihood achieved by a series of Bayesian updates would prove the hypothesis beyond any reasonable doubt, or even with likelihood greater than 0.
Other • The
scientific method is sometimes interpreted as an application of Bayesian inference. In this view, Bayes' rule guides (or should guide) the updating of probabilities about
hypotheses conditional on new observations or
experiments. Bayesian inference has also been applied to treat
stochastic scheduling problems with incomplete information by Cai et al. (2009). •
Bayesian search theory is used to search for lost objects. •
Bayesian inference in phylogeny •
Bayesian tool for methylation analysis •
Bayesian approaches to brain function investigate the brain as a Bayesian mechanism. • Bayesian inference in ecological studies • Bayesian inference is used to estimate parameters in stochastic chemical kinetic models • Bayesian inference in
econophysics for currency or prediction of trend changes in financial quotations •
Bayesian inference in marketing •
Bayesian inference in motor learning • Bayesian inference is used in
probabilistic numerics to solve numerical problems Bayes and Bayesian inference