Reddit reviews of All of Statistics: A Concise Course in Statistical Inference (Springer Texts in Statistics)

We found 15 Reddit comments about All of Statistics: A Concise Course in Statistical Inference (Springer Texts in Statistics). Here are the top ones, ranked by their Reddit score.

15 Reddit comments about All of Statistics: A Concise Course in Statistical Inference (Springer Texts in Statistics):

u/COOLSerdash · 9 points · r/statistics
u/sparsecoder · 7 points · r/MachineLearning

You might find Wasserman's All of Statistics useful:
http://www.amazon.com/All-Statistics-Statistical-Inference-Springer/dp/0387402721/

It's a very concise, yet broad introductory statistics text with a slant towards data mining/machine learning.

u/maxbaroi · 3 points · r/math

I really like All of Statistics by Larry Wasserman. It's still the book I pull off the shelf when I need to look something up. It's a pretty dense overview of statistics and probability.

It's not the easiest book to get into, and you probably need to have a firm grasp of real analysis to get through it.

u/schmook · 3 points · r/brasil

Actually, I'm a physicist. I think adopting a Bayesian perspective is more common among physicists than among mathematicians or even statisticians. Maybe because of the influence of Edwin T. Jaynes, who was a physicist. Maybe because of the connection to information theory and the tempting connection to thermodynamics and statistical mechanics.

My interest in the Bayesian perspective started with the research group where I did my PhD. My advisor and co-advisor are strongly Bayesian, and my PhD advisor's brother is a fairly well-known researcher on the epistemological foundations of Bayesian theory (the Uruguayan physicist Ariel Caticha).

There are several good books on Bayesian probability; it depends a lot on what you're interested in.

The first book I read on the subject was Jaynes's own Probability Theory: The Logic of Science. It's a somewhat controversial book because it takes a very strong epistemological stance and argues for it quite aggressively.

A somewhat different view, closely tied to information theory and also strongly epistemological, can be found in Ariel Caticha's Lectures on Probability, Entropy and Statistical Physics (free here: https://arxiv.org/abs/0808.0012). I was a PhD student of Ariel's brother, Nestor Caticha. Both have a fascinating view of probability theory and information theory and of their implications for physics and science in general.

Those books are more about epistemological and theoretical views and far less useful for applications. If you're interested in applications, there is the famous BDA3 - Bayesian Data Analysis, 3rd edition, and also Doing Bayesian Data Analysis by John Kruschke, which has examples in R.

There's also a very introductory little book called Bayesian Methods for Hackers by Cam Davidson-Pilon (free here: https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers) that uses Python examples (PyMC). It's quite basic, good for learning applications of Bayesian probability.

Larry Wasserman's All of Statistics also has an introductory section on Bayesian inference.

If you're interested in artificial intelligence, another great book is the one by the (recently deceased) British physicist David MacKay - Information Theory, Inference, and Learning Algorithms (free here: http://www.inference.phy.cam.ac.uk/mackay/itila/). That book was my first contact with machine learning, and it's really good.

Other good machine learning books that take a Bayesian perspective are Bayesian Reasoning and Machine Learning (David Barber) and the textbook that has become the most widely used in the field, Machine Learning: A Probabilistic Perspective (Kevin Murphy).



u/__baxx__ · 2 points · r/datascience

> All of Statistics

http://www.amazon.co.uk/All-Statistics-Statistical-Inference-Springer/dp/0387402721

I thought you were joking for a sec, but there it is.

u/Sarcuss · 2 points · r/learnmath

I personally think you should brush up on frequentist statistics as well as linear models before heading into Bayesian statistics. A list of recommendations tailored to your background:

u/Kiuhnm · 2 points · r/MachineLearning

Anyway, a great book often recommended is All of Statistics. You can't go wrong with that book.

u/sot9 · 2 points · r/UIUC

I've taken ECE 313 and STAT 410, and I'm also interested in ML, but I disagree. STAT 410 is not worth it if you're trying to improve your base of knowledge for ML. The amount of usable knowledge (from an ML context) is disproportionately low for how time consuming/stressful the class is.

If you ever feel like your statistics isn't up to snuff, you could just read/skim Wasserman, All of Statistics.

u/Ayakalam · 1 point · r/statistics

Thanks! FWIW, I just ordered two books on the subject: All of Statistics: A Concise Course in Statistical Inference and Detection Theory.

Along with a third addition, I just spent over $200 on books, ><, but they seem to have great reviews.

-----------------------------------

So let me tell you one of my biggest confusions from this post. Highlights are mine.

Ok, so to keep things simple, let's just focus on one case, on one line. I don't get how
[; R(H_0 | X) = \lambda_{01} P(H_1 | X) ;]

Questions:

  • What is [; R(H_0 | X) ;]? Is it just a number?

  • He says that [; \lambda_{01} ;] is the 'cost of accepting H_0 when in fact H_1 was true'. Fine, that makes sense.

  • So why isn't [; R(H_0 | X) ;] just [; \lambda_{01} ;]? I don't get this. What is the conceptual difference between 'cost of picking H_0' and 'risk of picking H_0' here? Neo gives me a blue pill or a red pill. The cost of picking the wrong one is that I die. So what is my risk then? I need an example for this (a small numeric sketch follows this list)...

  • [; P(H_1 | X) ;] is the probability of accepting H_1 given what you observed, X. First off, I do not know what that means. "The probability of accepting H_1 given X". What does that even mean? To me this is nonsense. I am the one making the decision. How can you place a probability on it? Are they saying that if you show me 1000 cases of X, and I say "H_1" 20% of the time, then [; P(H_1 | X) = 0.2 ;]? If not, then I am totally lost on the meaning of this.
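
In the standard Bayesian decision-theory reading, [; R(H_0 | X) ;] is the expected cost of accepting [; H_0 ;], averaged over the posterior: the cost [; \lambda_{01} ;] is only paid if [; H_1 ;] is actually true, which happens with posterior probability [; P(H_1 | X) ;], so with zero cost for correct decisions the risk comes out to [; \lambda_{01} P(H_1 | X) ;]. Here is a minimal numeric sketch of that distinction; the prior, likelihoods, and costs below are made up for illustration and are not from the linked post or the book.

```python
# Minimal sketch of posterior risk vs. raw cost in Bayesian hypothesis testing.
# All numbers are hypothetical; correct decisions are assumed to cost nothing.

def posterior_h1(prior_h1, lik_h0, lik_h1):
    """P(H1 | X) from the prior P(H1) and the likelihoods P(X | H0), P(X | H1)."""
    num = lik_h1 * prior_h1
    return num / (num + lik_h0 * (1.0 - prior_h1))

# Hypothetical setup: P(H1) = 0.3, and the observed X is twice as likely under H1.
p_h1 = posterior_h1(prior_h1=0.3, lik_h0=0.2, lik_h1=0.4)  # ~0.46

lambda_01 = 5.0  # cost of accepting H0 when H1 is true
lambda_10 = 1.0  # cost of accepting H1 when H0 is true

# Risk = expected cost of a decision under the posterior over hypotheses.
# lambda_01 is a fixed cost; R(H0 | X) weights it by how probable H1 is given X.
risk_h0 = lambda_01 * p_h1          # R(H0 | X)
risk_h1 = lambda_10 * (1.0 - p_h1)  # R(H1 | X)

print(f"P(H1 | X) = {p_h1:.3f}")
print(f"R(H0 | X) = {risk_h0:.3f}   R(H1 | X) = {risk_h1:.3f}")
print("Bayes decision:", "accept H1" if risk_h1 < risk_h0 else "accept H0")
```

So the 'cost' is a fixed penalty attached to a particular kind of error, while the 'risk' of a decision weights that cost by how probable the error is given the data you actually observed.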

-------------------------------

I'll stop here for now so it doesn't get too complicated...

Thanks!
u/DeepTruth · 1 point · r/programming

Probability and Statistics for Computer Scientists:

http://www.amazon.com/Probability-Statistics-Computer-Scientists-Trim/dp/1584886412

This book has a decent narrative.

Someone mentioned this before, but I would also recommend All of Statistics:
http://www.amazon.com/All-Statistics-Statistical-Inference-Springer/dp/0387402721/

u/kernelmode · 1 point · r/learnmachinelearning

After those, you can read Gilbert Strang's great linear algebra book (he is a great teacher and writer, and it is one of the best math books I've seen): http://math.mit.edu/linearalgebra/. MIT has also shared video lectures for the course: https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/index.htm

And for a next step in statistics I suggest reading All of Statistics: https://www.amazon.com/All-Statistics-Statistical-Inference-Springer/dp/0387402721

Also, the first chapters of the Deep Learning Book contain a good overview of the mathematics needed for ML.

u/noelsusman · 1 point · r/GradSchool

I have a statistics professor as one of my co-advisors, and she highly recommends this book. I have it on hand at all times.

u/carmichael561 · 1 point · r/datasets

This data set is used in Exercise 23.36 of Larry Wasserman's All of Statistics: A Concise Course in Statistical Inference.

It is also easy to find PDFs of this textbook online.

u/blind_swordsman · 0 points · r/statistics

The book All of Statistics gives a broad but (relatively) quick introduction to modern statistics.