Reddit reviews Bayesian Data Analysis, Second Edition (Chapman & Hall/CRC Texts in Statistical Science)

We found 5 Reddit comments about Bayesian Data Analysis, Second Edition (Chapman & Hall/CRC Texts in Statistical Science). Here are the top ones, ranked by their Reddit score.

5 Reddit comments about Bayesian Data Analysis, Second Edition (Chapman & Hall/CRC Texts in Statistical Science):

u/othercriteria · 5 points · r/math

Andrew Gelman is one of the authors of Bayesian Data Analysis. He generally favors Bayesian approaches to statistics, although I get the impression he sees them as a means of getting robust, tractable, partially-pooled estimates from data, rather than as the only coherent way to make any inferences, ever.

u/orangeforahead · 5 points · r/statistics

You beat me to it! Well, here are the recommendations:

> On advanced Bayesian statistics, Cyan recommends Gelman's Bayesian Data Analysis over Jaynes' Probability Theory: The Logic of Science and Bernardo's Bayesian Theory.

> On basic Bayesian statistics, jsalvatier recommends Skilling & Sivia's Data Analysis: A Bayesian Tutorial over Gelman's Bayesian Data Analysis, Bolstad's Bayesian Statistics, and Robert's The Bayesian Choice.

u/hadhubhi · 2 points · r/MachineLearning

It's a parametric model. The parameters of the model are simply the parameters of the distributions he assumes (or the "hyperparameters", if there's some sort of multilevel modelling) over the visible data he feeds into the model (previous years' results). He's fitting it with the Stan software, which uses No-U-Turn Sampling. Once he gets all the posterior probability distributions over the parameters, it's pretty trivial to simulate the model a bunch of times to see the distribution over outcomes.
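To make the "simulate the model a bunch of times" step concrete, here is a minimal, generic sketch of propagating posterior parameter draws into a distribution over outcomes. The posterior draws and the Poisson "goals" model below are made-up illustrations, not the commenter's actual Stan model or its output:

```python
import math
import random

# Hypothetical posterior draws for a scoring rate; in practice these
# would come out of the MCMC fit (e.g. Stan), not be typed in by hand.
posterior_rate_draws = [1.2, 1.5, 1.4, 1.3, 1.6, 1.1, 1.45, 1.35]

def poisson(rng, lam):
    # Knuth's algorithm; fine for small lambda.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def outcome_distribution(rate_draws, n_sims=10000, seed=0):
    """Each simulation picks one posterior draw, then simulates data,
    so parameter uncertainty carries through to the outcomes."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_sims):
        rate = rng.choice(rate_draws)   # one draw from the posterior
        goals = poisson(rng, rate)      # toy generative model
        counts[goals] = counts.get(goals, 0) + 1
    return {k: v / n_sims for k, v in sorted(counts.items())}

dist = outcome_distribution(posterior_rate_draws)
```

The returned dictionary is the Monte Carlo estimate of the distribution over outcomes the comment describes.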

The advantage of MCMC is that you don't HAVE to calculate the normalization constant (which is hard). Look at the formal derivation of Metropolis-Hastings on Wikipedia. The basic idea is that the algorithm generates samples using only a ratio of posterior densities. Since the normalization constant appears in both the numerator and the denominator, it cancels out, so you don't need to calculate it directly; you only need to know the posterior up to a constant of proportionality. And this is generally much easier to do.
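The cancellation can be seen in a few lines of code. This is a minimal random-walk Metropolis sketch (a special case of Metropolis-Hastings with a symmetric proposal), not the No-U-Turn Sampler that Stan uses; note that the target below is the unnormalized standard normal, with no 1/sqrt(2*pi) anywhere:

```python
import math
import random

def metropolis_sample(log_unnorm, start, proposal_sd, n_steps, seed=0):
    """Random-walk Metropolis. Only log_unnorm(x') - log_unnorm(x) is
    ever used, so any additive constant (the log normalizer) cancels."""
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, proposal_sd)
        # Acceptance probability uses only the RATIO of densities.
        log_ratio = log_unnorm(x_new) - log_unnorm(x)
        accept_prob = math.exp(min(0.0, log_ratio))
        if rng.random() < accept_prob:
            x = x_new
        samples.append(x)
    return samples

# Unnormalized standard normal: exp(-x^2 / 2), normalizer omitted.
log_target = lambda x: -0.5 * x * x
draws = metropolis_sample(log_target, start=0.0, proposal_sd=1.0,
                          n_steps=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Despite never computing the normalization constant, the sample mean and variance come out close to the true values 0 and 1.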

If you want a book to look through this stuff, the classic reference is Gelman's Bayesian Data Analysis (and he'll be coming out with a third edition pretty soon).

u/hyperionsshrike · 2 points · r/statistics

If you're looking for a thorough and rigorous introduction to probability theory, I'd recommend going with An Introduction to Probability Theory and Its Applications, Vols. 1 and 2, by Feller. Another well-recommended book is Probability and Random Processes by Grimmett and Stirzaker (this starts with measure theory from the get-go).

If you're looking for general statistics, then you may want to look at All of Statistics by Wasserman and perhaps Bayesian Data Analysis by Gelman, et al.

Finally, since you're a physicist, you'll probably want to take a look at Monte Carlo methods in particular, such as Monte Carlo Statistical Methods by Robert and Casella.

u/cbrunos · 1 point · r/econometrics