Reddit reviews: An Introduction to Kolmogorov Complexity and Its Applications (Texts in Computer Science)

We found 5 Reddit comments about An Introduction to Kolmogorov Complexity and Its Applications (Texts in Computer Science). Here are the top ones, ranked by their Reddit score.


5 Reddit comments about An Introduction to Kolmogorov Complexity and Its Applications (Texts in Computer Science):

u/tsarnicky · 4 pointsr/math

The subject isn't particularly popular as far as I can tell, but Li and Vitányi is the main (only?) comprehensive reference. I thought it was quite well written, and it starts from zero. To get through the problems, though, you'll probably need an undergrad background in CS or math.

I'm not sure what your background is, but usually one starts off in TCS by reading Sipser's text on the theory of computation. I'm not particularly advanced, but I've taken a few classes in TCS, so I could elaborate more if you like.

u/moreLytes · 3 pointsr/askphilosophy

> Algorithmic information theory doesn't study induction

I'm not so sure; one-eighth of my textbook treats inductive reasoning.

> SI does help with assigning prior probabilities

Actually, Kolmogorov complexity is the proposed mechanism for assigning priors. SI is a Bayesian reasoning framework that subsumes KC when it calculates its universal prior.
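The KC-to-prior link can be sketched concretely. K(h) itself is uncomputable, so the toy code below (my own illustration, not from the book or the thread) substitutes zlib-compressed length as a crude computable upper bound on K(h), then weights each hypothesis by 2^-K(h), which is the shape of the universal prior:

```python
import zlib

def compressed_length_bits(s: bytes) -> int:
    # Crude, computable upper bound on Kolmogorov complexity:
    # the length in bits of the zlib-compressed string.
    return 8 * len(zlib.compress(s, 9))

def toy_prior(hypotheses: list[bytes]) -> dict[bytes, float]:
    # Weight each hypothesis by 2^-K(h), with compressed length
    # standing in for the uncomputable K(h), then normalize.
    weights = {h: 2.0 ** -compressed_length_bits(h) for h in hypotheses}
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

regular = b"abababababababababababababababab"
irregular = b"q7f!kz0pw:m3&xv9j2r%ncy5hd8tl4bg"
prior = toy_prior([regular, irregular])
# The highly regular string compresses better, so it receives
# the larger prior weight: simpler hypotheses are favored.
```

This is only a cartoon of the idea; the actual universal prior sums over all programs for a universal machine, which no compressor approximates faithfully.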

> The question decision theorists and epistemologists want to ask is "what explains the justificatory relation" or less question beggingly "is there a justificatory relation".

The question SI seems most invested in answering is: "how should one build, and what are the properties of, an ideal inductive engine?" An illustrative result:

> This result implies that ξ(xt|x<t) will rapidly converge to µ(xt|x<t) with probability one as t → ∞. In a stochastic environment “with probability one” is usually the strongest guarantee that can be made so this is a strong result. Roughly it states that the expected number of errors will be finite and Bayes mix will eventually be the same as the true predictive distribution.

Such a result does not directly address the justificatory relation, but I could see results like it being used to construct such an argument.
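The quoted convergence claim can be made precise. Writing µ for the true environment and ξ for the Bayes mixture, Solomonoff's theorem (a sketch; the exact constant factor varies between presentations) bounds the cumulative expected squared prediction error by the complexity of the environment:

```latex
\[
\sum_{t=1}^{\infty}
  \mathbb{E}_{\mu}\!\left[
    \bigl(\xi(x_t \mid x_{<t}) - \mu(x_t \mid x_{<t})\bigr)^{2}
  \right]
\;\le\; K(\mu)\,\ln 2 \;<\; \infty .
\]
```

Because the infinite sum is finite, the terms must vanish, so ξ(x_t|x_<t) converges to µ(x_t|x_<t) with µ-probability one, which is the "finite expected number of errors" reading in the quote.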

u/fraac · 3 pointsr/Harmontown

I bought this as a student, when I didn't have the maths to understand half of it, because the idea was so attractive. So speak for yourself.

Musk is literally attempting to save humanity.

u/tluyben2 · 2 pointsr/programming

I actually found the first page of this book on the subject (http://www.amazon.com/Introduction-Kolmogorov-Complexity-Applications-Computer/dp/0387339981/ref=sr_1_1?ie=UTF8&qid=1291565261&sr=8-1) clearer than the explanation in GEB. But it is a great book.

u/paultypes · 1 pointr/programming

I'd be careful about saying that. See chapter 8 of An Introduction to Kolmogorov Complexity and Its Applications, for example. To get heat dissipation in processors and memory down, we'll eventually have to quit pretending that bit-banging doesn't generate waste heat!
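The waste-heat point is Landauer's principle: irreversibly erasing one bit dissipates at least kT ln 2 of heat. A back-of-the-envelope check (my own illustration; the room-temperature figure is an assumption, and real processors dissipate many orders of magnitude more than this floor):

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T_ROOM = 300.0            # assumed ambient temperature in kelvin

# Landauer's principle: minimum heat dissipated per bit erased.
landauer_joules_per_bit = BOLTZMANN * T_ROOM * math.log(2)

# Floor on heat from irreversibly erasing 10^9 bits per second,
# i.e. the Landauer cost of a gigabit/s of "bit-banging".
watts_per_gigabit_per_second = landauer_joules_per_bit * 1e9
```

At 300 K this works out to roughly 3e-21 J per bit, so the thermodynamic floor is tiny today, but it is nonzero, which is exactly why reversible computation becomes interesting as dissipation budgets shrink.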