

Bayesian Inference / Not an Enigma Anymore
Pages 20-23


From page 20...
... It was won in part by a statistical method called Bayesian inference, which allowed code breakers to determine probabilistically which settings of the Enigma machine (which changed daily) were more likely than others (see Figure 6).
From page 21...
... But to a Bayesian, a prior degree of belief makes perfect sense; it means that you are willing to give 3-1 odds in a bet on the hypothesis that team A will win. The key ingredient in Bayesian statistics is Bayes's rule (named after the Reverend Thomas Bayes, whose monograph on the subject was published posthumously in 1763).
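The update described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of Bayes's rule (not from the source text): the function and its argument names are my own, and the numbers are chosen only to echo the 3-1 odds example, where a prior probability of 0.75 corresponds to 3-1 odds on the hypothesis.

```python
def posterior(prior_h, like_d_given_h, like_d_given_not_h):
    """Bayes's rule: P(H|D) = P(D|H) P(H) / P(D), where the evidence
    P(D) is expanded over H and not-H."""
    evidence = like_d_given_h * prior_h + like_d_given_not_h * (1 - prior_h)
    return like_d_given_h * prior_h / evidence

# Prior belief of 0.75 (3-1 odds), updated by data that is more likely
# under H (0.9) than under not-H (0.5); the posterior rises above 0.75.
p = posterior(0.75, 0.9, 0.5)
```

After seeing data that favors the hypothesis, the degree of belief increases; data equally likely under both hypotheses would leave the prior unchanged.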
From page 22...
... In the late 1980s, statisticians realized that a technique called Markov chain Monte Carlo provided a very efficient and general way to draw samples empirically from a probability distribution too complex to be expressed in a formula. Markov chain Monte Carlo had been developed in the 1950s by physicists who wanted to simulate random processes like the chain reactions of neutrons in a hydrogen bomb.
From page 23...
... For example, an experimenter might know that a parameter will be negative without knowing anything about the specific value of the parameter. Basic research in these areas will complement the application-specific research on problems like finding breast cancer genes or building robots and will therefore ensure that Bayesian inference continues to find a wealth of new applications.
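A sign constraint like the one mentioned above can be encoded directly in the prior. The toy function below is a hypothetical sketch (names and likelihood are my own): it gives zero prior probability to nonnegative parameter values, so any sampler or optimizer working with this log-posterior automatically respects the constraint.

```python
import math

def log_posterior(theta, data):
    """Toy log-posterior encoding the constraint theta < 0: a flat prior
    on the negative half-line, with a unit-variance Gaussian likelihood."""
    if theta >= 0:
        return -math.inf  # prior probability zero outside theta < 0
    return -sum((y - theta) ** 2 for y in data) / 2.0
```

Values with log-posterior of negative infinity are never accepted, so the experimenter's qualitative knowledge (the sign) shapes the inference without specifying any particular value.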
