Reddit reviews: Pattern Recognition and Machine Learning (Information Science and Statistics)

We found 51 Reddit comments about Pattern Recognition and Machine Learning (Information Science and Statistics). Here are the top ones, ranked by their Reddit score.


51 Reddit comments about Pattern Recognition and Machine Learning (Information Science and Statistics):

u/majordyson · 29 pointsr/MachineLearning

Having done an MEng at Oxford where I dabbled in ML, the 3 key texts that came up as references in a lot of lectures were these:

Pattern Recognition and Machine Learning (Information Science and Statistics) https://www.amazon.co.uk/dp/0387310738/ref=cm_sw_r_cp_apa_i_TZGnDb24TFV9M

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning Series) https://www.amazon.co.uk/dp/0262018020/ref=cm_sw_r_cp_apa_i_g1GnDb5VTRRP9

(Pretty sure Murphy was one of our lecturers actually?)

Bayesian Reasoning and Machine Learning https://www.amazon.co.uk/dp/0521518148/ref=cm_sw_r_cp_apa_i_81GnDbV7YQ2WJ

There were ofc others, and plenty of other sources and references too, but you can't go buying dozens of textbooks, not least cuz they'd repeat the same things.
If you need some general maths reading too, then pretty much all the useful (non-specialist) maths we used over 4 years is in this:
Advanced Engineering Mathematics https://www.amazon.co.uk/dp/0470646136/ref=cm_sw_r_cp_apa_i_B5GnDbNST8HZR

u/ajh2148 · 11 pointsr/computerscience

I’d personally recommend Andrew Ng’s deeplearning.ai course if you’re just starting. This will give you practical, guided experience with TensorFlow using Jupyter notebooks.

If it’s books you really want, I found the following of great use in my studies, but they are quite theoretical, framework-agnostic publications. They will help explain the theory though:

Deep Learning (Adaptive Computation and Machine Learning Series) https://www.amazon.co.uk/dp/0262035618/ref=cm_sw_r_cp_api_i_Hu41Db30AP4D7

Reinforcement Learning: An Introduction (Adaptive Computation and Machine Learning series) https://www.amazon.co.uk/dp/0262039249/ref=cm_sw_r_cp_api_i_-y41DbTJEBAHX

Pattern Recognition and Machine Learning (Information Science and Statistics) https://www.amazon.co.uk/dp/0387310738/ref=cm_sw_r_cp_api_i_dv41DbTXKKSV0

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series) https://www.amazon.co.uk/dp/B00AF1AYTQ/ref=cm_sw_r_cp_api_i_vx41DbHVQEAW1

u/DoorsofPerceptron · 10 pointsr/MachineLearning

For a maths-heavy book, I'd go with Bishop's Pattern Recognition and Machine Learning.

Check out the reviews here: http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738

u/siddboots · 9 pointsr/statistics

It is hard to provide a "comprehensive" view, because there's so much disparate material in so many different fields that draw upon probability theory.

Feller is an approachable classic that covers all of the main results in traditional probability theory. It certainly feels a little dated, but it is full of the deep central limit insights that are rarely explained in full in other texts. Feller is rigorous, but keeps applications at the center of the discussion, and doesn't dwell too much on the measure-theoretical / axiomatic side of things. If you are more interested in the modern mathematical theory of probability, try Probability with Martingales.

On the other hand, if you don't care at all about abstract mathematical insights, and just want to be able to use probability theory directly for everyday applications, then I would skip both of the above and look into Bayesian probabilistic modelling. Try Gelman et al.

Of course, there's also machine learning. It draws on a lot of probability theory, but often teaches it in a very different way to a traditional probability class. For a start, there is much more emphasis on multivariate models, so linear algebra is much more central. (Bishop is a good text).

u/slashcom · 5 pointsr/compsci

In Natural Language Processing, it's Jurafsky and Martin. In Machine Learning, it's debatably the Bishop book.

u/NicolasGuacamole · 5 pointsr/MLQuestions

A good textbook will do you wonders. Get one that is fairly general and includes exercises. Do the exercises. This will be hard, but it'll make you learn an enormous amount faster.

My personal favourite book is Christopher Bishop's Pattern Recognition and Machine Learning. It's very comprehensive, has a decent amount of maths as well as good examples and illustrations. The exercises are difficult and numerous.

That being said, it is entirely Machine Learning. You mention wanting to learn about 'AI', so potentially you may want to look at a different book for some grounding in the wider, more classical field of AI than just Machine Learning. For this I'd recommend Russell and Norvig's [AI: A Modern Approach](https://smile.amazon.co.uk/Artificial-Intelligence-Modern-Approach-Global/dp/1292153962). It has a good intro which you can use to understand the structure and history of the field more generally, and following on from that it has a load of content in various areas such as search, logic, planning, probabilistic reasoning, Machine Learning, natural language processing, etc. It also has exercises, but I've never done them so I can't comment much on them.

These two books, if you were to study them deeply, would give you at least close to a graduate level of understanding. You may have to step back and drill down into the mathematical foundations if you're serious about doing the exercises in Bishop's book.

On top of this, there are many really good video series on YouTube for times when you want to do more passive learning. I must say, though, that this should not be where most of your attention rests.

Here are some of my favourite relevant playlists on YouTube, ordered roughly by difficulty / relevance. Loosely start at the top, but don't be afraid to jump around. Some are only very tenuously related, but in my opinion they all have some value.

Gilbert Strang - Linear Algebra

Gilbert Strang - Calculus Overview

Andrew Ng - Machine Learning (Gentle coursera version)

Mathematical Monk - Machine Learning

Mathematical Monk - Probability

Mathematical Monk - Information Theory

Andrew Ng - Machine Learning (Full Stanford Course)

Ali Ghodsi - Data Visualisation (Unsupervised Learning)

Nando de Freitas - Deep Learning

The late great David MacKay - Information Theory

Berkeley Deep Unsupervised Learning

Geoff Hinton - Neural Networks for ML

Stephen Boyd - Convex Optimisation

Frederic Schuller - Winter School on Gravity and Light

Frederic Schuller - Geometrical Anatomy of Theoretical Physics

Yaser Abu-Mostafa - Machine Learning (statistical learning)

Daniel Cremers - Multiple View Geometry

u/effernand · 5 pointsr/learnmachinelearning

When I started in the field I took the famous Coursera course by Andrew Ng. It helped me grasp the major concepts in (classical) ML, though it really lacked mathematical depth (truth be told, it was not really meant for that).

That said, I took a course on edX which covered things in a little more depth. As I got deeper into the theory, things became clearer. I have also read some books, such as:

  • Neural Networks, by Simon Haykin,
  • Elements of Statistical Learning, by Hastie, Tibshirani and Friedman
  • Pattern Recognition and Machine Learning, by Bishop

    All these books have their own approach to Machine Learning, and I think it is particularly important that you have a good understanding of Machine Learning, and its impact on various fields (signal processing, for instance), before jumping into Deep Learning. After almost three years of dedicated study in the field, I feel like I can walk a little by myself.

    Now, as a beginner in Deep Learning, things are a little bit different. I would like to make a few points:

  • If you have a good base in maths and Machine Learning, the algorithms used in Deep Learning will be more straightforward, as some of them are simply extensions of previous attempts.
  • The practical side of Machine Learning seems a little bit childish compared to Deep Learning. When I programmed Machine Learning models, I usually had small datasets and algorithms that could run on a simple CPU.
  • As you begin to work with Deep Learning, you will need to master a framework of your choice, which raises issues around data usage (most datasets do not fit into memory) and GPU/memory management. For instance, if you don't handle your data well, it becomes a bottleneck that slows down your code. So, compared with simple numpy + matplotlib applications, TensorFlow APIs + TensorBoard visualizations can be tough (see the sketch after this list).
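To make that last point concrete, here is a minimal sketch of a streaming input pipeline, assuming TensorFlow 2.x and hypothetical CSV shards on disk (filenames made up for illustration):

```python
import tensorflow as tf

# Minimal tf.data sketch: stream a dataset too large for memory.
# "data/shard-*.csv" is a hypothetical set of CSV shards on disk.
files = tf.data.Dataset.list_files("data/shard-*.csv")

dataset = (
    files.interleave(
        lambda f: tf.data.TextLineDataset(f).skip(1),  # skip each file's header
        cycle_length=4,                                # read 4 shards in parallel
        num_parallel_calls=tf.data.AUTOTUNE,
    )
    .shuffle(10_000)                # shuffle within a bounded buffer, not all of RAM
    .batch(256)
    .prefetch(tf.data.AUTOTUNE)     # overlap data loading with the training step
)
```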

    So, to summarize, you need to start with simple, boring things until you can be an independent user of ML methods. THEN you can think about state-of-the-art problems to solve with cutting-edge frameworks and APIs.
u/blackkettle · 4 pointsr/math

take a look at Pattern Recognition and Machine Learning by Bishop,

http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738

it's an excellent text, though not for the faint of heart. just the first chapter should provide you with a great answer to your question.

u/Jimbo_029 · 4 pointsr/ECE

Bishop's book Pattern Recognition and Machine Learning is pretty great IMHO, and is considered to be the Bible in ML - although, apparently, it is in competition with Murphy's book Machine Learning: A Probabilistic Perspective. Murphy's book is also supposed to be a gentler intro. With an ECE background, the math shouldn't be too difficult to get into in either of these books. Depending on your background (i.e. if you've done a bunch of information theory) you might also like MacKay's book Information Theory, Inference and Learning Algorithms. MacKay's book has a free digital version, and MacKay's 16-part lecture series based on the book is also available online.

While those books are great, I wouldn't actually recommend just reading through them, but rather using them as references when trying to understand something in particular. I think you're better off watching some lectures to get your toes wet before jumping in the deep end with the books. MacKay's lectures (linked with the book) are great, as are Andrew Ng's that @CatZach mentioned. As @CatZach said, Deep Learning has had a big impact on CV, so if you find that you need to go that route then you might also want to do Ng's DL course, though unlike the other courses this one isn't free :(.

Finally, all of the above recommendations (with the exception of Ng's ML course) are pretty theory-driven, so if you are more of a practical person, you might like Fast.AI's free deep learning courses, which have very little theory but still manage to give a pretty good intuition for why and how things work! You probably don't need to bother with part 2, since it is more advanced stuff (and it will be updated soon anyway, so I would wait for that if you do want to do it :))

Good luck! I am also happy to help with more specific questions!

u/ginger_beer_m · 4 pointsr/dogecoin

If you just try to eyeball patterns from historical charts, I guarantee you will see them, because that's just what the brain has evolved to do: spot patterns (e.g. Jesus on toast), even when they're actually due to random chance. That's also why most so-called technical 'analysis' is bullshit.

Instead, approach this in a systematic and principled manner. You can check out this book to get an idea of what I'm talking about: Pattern Recognition and Machine Learning. It is the standard grad-level introduction to the field, but might be rather heavy for some. An easier read is this one. You can find the PDFs of these books online through some searching, or just head to your local library. Approaching the problem from a probabilistic and statistical angle also lets you know the extent of what you can predict and, more importantly, what the limitations are and when the approach breaks down -- which happens a lot, actually.

TL;DR: predicting patterns is hard. That's why stats is the sexy new job of the century, alongside 'data science' (hate that term, uuurgh).

u/G-Brain · 3 pointsr/math

I know nearly nothing about this topic, but I find it extremely interesting and I've done some searching.

Two more lectures:

u/ChristianGeek · 3 pointsr/learnmachinelearning

Amazon links to books mentioned (no affiliate). Warning: a lot of high textbook prices here... look for eBooks and/or used copies of earlier editions:

Introduction to Mathematical Statistics (Hogg, McKean, & Craig)

All of Statistics (Wasserman)

Statistical Inference (Casella & Berger)

Pattern Recognition and Machine Learning (Bishop) (only reasonably priced as an eBook)

Hitchhiker's Guide to Python

u/mr_dick_doge · 3 pointsr/dogecoin

I'm sceptical of Elliott wave analysis. The reason is, I do data analysis at work (although in other domains), and in any stochastic system it is easy for a vague model like EW to fit any kind of pattern in the observed data (i.e. the EW 'model' underfits a lot). Worse, EW is prescriptive, and it's basically fitting a model by eye. How well does it fit? Does it offer any meaningful predictive confidence? We don't know. The human brain excels at pattern recognition, so you tend to see patterns wherever you look; this is another reason why the many so-called technical 'analysis' methods that try to eyeball patterns are a lot of bullshit.

Instead, an entire field of pattern recognition has been developed on statistically rigorous foundations (here's a good book to start with) and applied to many practical problems with much success, e.g. self-driving cars, Google Translate, Siri, etc. Personally, if I were to try price prediction, I'd start with a neural net or some kind of regression-based method using a Gaussian Process or similar. Section 4 of this paper seems to provide a decent review of the various methods applicable to price prediction. They would require a lot of tuning, and most likely they won't work well in a market as manipulated as crypto, where a lot of the data points are essentially noise.
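For illustration only, a minimal sketch of that regression-based idea, assuming scikit-learn and a synthetic stand-in for a price series (not a trading strategy):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic random-walk "prices"; in practice you'd load real market data.
t = np.arange(100, dtype=float).reshape(-1, 1)   # time index as the input
prices = 50.0 + np.cumsum(np.random.randn(100))  # stand-in closing prices

# RBF kernel models a smooth trend; WhiteKernel models observation noise.
kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t, prices)

# Predict a few steps ahead *with* uncertainty -- the predictive
# confidence that eyeballed chart patterns never quantify.
t_future = np.arange(100, 110, dtype=float).reshape(-1, 1)
mean, std = gp.predict(t_future, return_std=True)
```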

Having said that, it's fun to hear about 'predictions' where dogecoin will go to the moon :P

TL;DR: predicting the future is hard. Dogecoin will go to the moon.

u/samuelm · 3 pointsr/mlclass

He's talking about the distribution of the error of y, not J (or the distribution of the probability of the function y given x). It's explained in the lecture notes, and on page 29 (figure 1.16) of Bishop's book there's an illustration that switched on the light bulb for me (although I found the book almost incomprehensible). You can look at it using the Amazon preview: http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738/ref=sr_1_1?ie=UTF8&qid=1318610381&sr=8-1#reader_0387310738
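For reference, the model behind that figure (Bishop §1.2.5): the target is the deterministic prediction plus zero-mean Gaussian noise, so

```latex
% Gaussian noise model illustrated in Bishop Fig. 1.16:
% target t = model output y(x, w) plus Gaussian noise with precision beta.
p(t \mid x, \mathbf{w}, \beta) = \mathcal{N}\left(t \mid y(x, \mathbf{w}), \beta^{-1}\right)
```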

u/PLLOOOOOP · 3 pointsr/MachineLearning

Is this the Bishop book you guys are talking about?

u/tetramarek · 2 pointsr/compsci

I watched the entire Data Structures and Algorithms course by Richard Buckland (UNSW) and thought it was excellent.
http://www.youtube.com/playlist?list=PLE621E25B3BF8B9D1

There is also an online course by Tim Roughgarden (Stanford) currently going on. It's very good but I don't know if you can still sign up.
https://class.coursera.org/algo

Topcoder.com is a fun place to test your skills in a competitive environment.

That being said, based on the description you are interested in things which don't usually fit into algorithms books or courses. Instead, you might want to look into machine learning and maybe even NLP. For example Pattern Recognition and Machine Learning by Bishop and Foundations of Statistical Natural Language Processing by Manning & Schuetze are great books for that.

u/dfmtr · 2 pointsr/MachineLearning

You can read through a machine learning textbook (Alpaydin's and Bishop's books are solid), and make sure you can follow the derivations. Key concepts in linear algebra and statistics are usually in the appendices, and Wikipedia is pretty good for more basic stuff you might be missing.

u/TheSummarizer · 2 pointsr/programming

Mitchell's quite long in the tooth now. The current best text is probably this one.

u/upulbandara · 2 pointsr/MachineLearning

I think it is completely possible. I'm an ML engineer with an M.Sc. in Computer Science. Presently, there are so many avenues (MOOCs, Kaggle, and books) for learning ML. But I believe the best approach would be:

  1. Buy a good machine learning book. My favorite one is Pattern Recognition and Machine Learning by Christopher M. Bishop. URL: https://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738
  2. When you read the book, implement ML algorithms using Python (or R, or Julia, or etc.)
  3. Pick a few ML-related projects that are completely outside your comfort zone (for example, a toy version of TensorFlow) and somehow complete them.
  4. Create a Github account and push your projects/artifacts.
u/Broseidon241 · 2 pointsr/datascience

I did this, but I came to data science in the final year of my PhD when I got a job at a startup. I started with R, then SQL, then Python. I currently work in data science, moving internal ML products into production settings. I also do research - and knowing how to conduct proper trials is great if the company you work for gives you freedom in how something you've built is rolled out. I can also blend my degree with ML, e.g. designing batteries of questions to identify 'good fit' candidates for a given role - I combine the battery results with future performance data and continually refine the question set / improve the model. As well, I'm a good fit for UX and dabble in that. The combo skillset will give you the ability to produce value in many different ways.

The things that helped me most were:

  • Early on, Programming for Everybody - very gentle intro, and well taught.

  • Andrew Ng's machine learning course.
  • SQLzoo.
  • The Introduction to Statistical Learning course and book; then, later, The Elements of Statistical Learning.
  • Buying big fat books about the things I wanted to learn, and working through them (e.g., Probabilistic Graphical Models, Pattern Recognition).
  • Coding algorithms from scratch, starting with linear regression and working my way up to DNNs and RNNs (see the sketch after this list). Do it in R, then Python, then Scala if you're ambitious.
  • Doing the Kaggle intro competitions in R and then translating to Python - Titanic, census dataset, etc, and using a variety of approaches for each (i.e. xgboost, sklearn, tensorflow).
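For instance, a minimal from-scratch linear regression in Python (synthetic data, gradient descent on mean squared error):

```python
import numpy as np

# Linear regression from scratch: gradient descent on mean squared error.
# Synthetic data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 1.0 + rng.normal(scale=0.1, size=200)

w, b, lr = np.zeros(3), 0.0, 0.1
for _ in range(500):
    err = X @ w + b - y             # residuals
    w -= lr * X.T @ err / len(y)    # gradient step for the weights
    b -= lr * err.mean()            # gradient step for the bias

# w should now be close to [2, -1, 0.5] and b close to 1.
```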

    It can be overwhelming, but don't worry. Do one course to completion, with that as your only goal. Then do the next. Then work on a kaggle thing. Then work through a book. One thing at a time - you might get anxious or be uncertain or want to do multiple things at once, but just start with one thing and focus on that and that alone. You'll get where you want to go.

    I also brushed up on my linear algebra / probability using MITs open courses and khanacademy.

    Beyond all this, I found that learning a lot about ML/AI really expanded my thinking about how human beings work and gave me a new and better lens through which to view behaviour and psych research/theories. Very much recommend to all psychologists.

    Good luck!
u/vindvaki · 2 pointsr/math

How much depth do you need? For the basics of linear algebra, the text on Wikibooks should suffice. Make sure you read about eigenvalues. I like the coverage of PCA in section 12.1 of Bishop's book. As for differential equations, I'm not familiar enough with them to recommend a textbook on the topic.
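For a taste of what that section covers, a minimal numpy sketch of PCA as an eigendecomposition of the sample covariance matrix:

```python
import numpy as np

# PCA as in Bishop section 12.1: project centered data onto the top
# eigenvectors of the sample covariance matrix. Random data stands in.
X = np.random.randn(500, 10)

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh since cov is symmetric

# eigh sorts eigenvalues in ascending order; keep the top 2 components.
components = eigvecs[:, ::-1][:, :2]
X_reduced = Xc @ components             # coordinates in the principal subspace
```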

u/shaggorama · 2 pointsr/math

Ok then, I'm going to assume that you're comfortable with linear algebra, basic probability/statistics and have some experience with optimization.

  • Check out Hastie, Tibshirani, & Friedman - Elements of Statistical Learning (ESLII): it's basically the data science bible, and it's free to read online or download.
  • Andrew Gelman - Data Analysis Using Regression and Multilevel/Hierarchical Models has a narrower scope on GLMs and hierarchical models, but it does an amazing treatment and discusses model interpretation really well and also includes R and stan code examples (this book ain't free).
  • Max Kuhn - Applied Predictive Modeling is also supposed to be really good and should strike a middle ground between those two books: it will discuss a lot of different modeling techniques and also show you how to apply them in R (this book is essentially a companion book for the caret package in R, but is also supposed to be a great textbook for modeling in general).

    I'd start with one of those three books. If you're feeling really ambitious, pick up a copy of either:

  • Christopher Bishop - Pattern Recognition and Machine Learning - Bayes all the things.
  • Kevin Murphy - Machine Learning: A Probabilistic Perspective - Also fairly bayesian perspective, but that's the direction the industry is moving lately. This book has (basically) EVERYTHING.

    Or get both of those books. They're both amazing, but they're not particularly easy reads.

    If these book recommendations are a bit intense for you:

  • Pang-Ning Tan - Introduction to Data Mining - This book is, as its title suggests, a great and accessible introduction to data mining. The focus of this book is much less on constructing statistical models than on various classification and clustering techniques. Still a good book to get your feet wet. Not free.
  • James, Witten, Hastie & Tibshirani - Introduction to Statistical Learning - This book is supposed to be the more accessible (i.e. less theoretical) version of ESLII. Comes with R code examples; also free.
    Additionally:

  • If you don't already know SQL, learn it.
  • If you don't already know python, R or SAS, learn one of those (I'd start with R or python). If you're proficient in some other programming language like java or C or fortran you'll probably be fine, but you'd find R/python in particular to be very useful.
u/illogical_vectors · 2 pointsr/MachineLearning

The Udacity machine learning track that you've probably seen is actually wonderful. It does a good job of scaling from entry level (even going down to basic data analysis) up to DNN. They charge for the nano-degree, but you can access all of the lectures without that.

As far as reading papers goes, I would actually recommend against it at this point. They're highly specialized and really only worthwhile if you're actually doing research into new techniques; if you're mostly looking to build a portfolio for employers, they're not a good place to spend your time. If you're looking for a reading source, Bishop's Pattern Recognition and Machine Learning is one of my favorites.

u/scohan · 2 pointsr/compsci

I think this might be beyond what you're looking for, but I really enjoyed Pattern Recognition and Machine Learning. It's very heavy on statistics, and if you're looking into machine learning methods, it has a wonderful amount of mathematical information given in a fairly clear manner. It might be severe overkill if this isn't your field, but I thought I'd mention it since you said AI.

For AI in general, I see Artificial Intelligence: A Modern Approach used a lot. It gives some solid basic concepts, and will be helpful in getting you started writing basic AI in your applications.

I can't really recommend discrete math because, despite enjoying it quite a bit, I haven't found a textbook that I like enough to endorse. My textbook for it in college was by Rosen, and I despised it.

edit:
Just double-checked it, and I would stay far away from the first recommendation unless you have a very extensive knowledge of sophisticated statistics. I like it because it gives the math that other books gloss over, but it's not good as an introduction to the subject. It's almost like going through a bunch of published papers on new cutting-edge methods. The ever-popular Machine Learning by Thomas Mitchell is a much better introduction to machine learning. If you want to obtain the mathematical depth necessary for your own research into the field, go with the other book after you've gotten acquainted with the material. I'll leave my suggestion up anyway in case anyone here finds it interesting.

u/adomian · 2 pointsr/learnmachinelearning

If you're worried about not doing projects and participating in Kaggle competitions, why not do those things? They're pretty low risk and high reward. If you're feeling shaky on the theory, read papers and reference textbooks, take notes, and implement things that interest you. For deep learning stuff there are some good resources here: https://github.com/ChristosChristofidis/awesome-deep-learning. For more traditional methods you can't go wrong with Chris Bishop's book (try googling it for a cheaper alternative to Amazon ;): https://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738.
Side projects can really help here, and the key is to use references but not just copy-paste. Think of something you'd like to apply machine learning to, with a reasonable scope. Search Google Scholar/arXiv for papers that do this or something similar, read them, and learn the techniques. When reading research papers in an area where you're not extremely knowledgeable, use the references in the text or google things you don't know, and make sure you understand well enough that you could teach someone else. If you're interested in the topic and have exhausted the references, go up the tree and use Google Scholar to find papers that list the one you're reading as a reference - you'll usually find interesting applications or improvements on the technique. You can also often find open-source training data in the appendices of papers. Kaggle also has a ton of datasets, including obviously the ones they provide for competitions.

u/ANONYMOUSACCOUNTLOL · 2 pointsr/MachineLearning

May I suggest doing a search in r/statistics and r/machinelearning for learning-foundation books for ML? I think that'll turn up quite enough hits to get you pointed in the right direction.

I always talk up the one I used, which I liked:
http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738

u/noman2561 · 2 pointsr/MachineLearning

Well, I do research in pattern recognition and computer vision, so I'll try to answer this. An image is a grid of sensor readings. Each reading from a sensor is called a pixel, which is the feature vector for that location in the image plane. Features based on spectral characteristics, spatial characteristics, and even motion characteristics (in video) may be derived from the original input (the reading from the sensor). Transformations are applied to the input which consider different aspects of the pixel's spectral components ( [R,G,B] - tristimulus ). A number of different methods exploit spatial correlation too. These features are then used in ML systems as part of the feature vector ( [T1,T2,T3,F1,F2,F3,F4,...] ); there's a short sketch of this at the end of this comment. As far as books, I learned filtering methods using

"Two-Dimensional Signal and Image Processing" -Lim

I learned pattern recognition using

"Pattern Recognition" -Theodoridis and Koutroumbas

and

"Pattern Recognition and Machine Learning" -Bishop

The last one approaches the subject from more of a CS side but doesn't go as in-depth. The field of CV/PR is pretty large and includes a lot of methods that aren't covered in these books. I would recommend using OpenCV or Matlab to handle images. My personal preference is Python, but C++ and Matlab are both close seconds.
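As a concrete illustration of the feature-vector view described above (a sketch in numpy; the image here is random):

```python
import numpy as np

# Each pixel of an H x W RGB image is a tristimulus reading [R, G, B];
# derived (e.g. spatial) features can be appended to form the full
# feature vector [T1, T2, T3, F1, F2, ...] used by an ML system.
img = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)

h, w, _ = img.shape
rgb = img.reshape(-1, 3).astype(float)   # one [R, G, B] row per pixel

# Simple example of derived spatial features: the pixel coordinates.
rows, cols = np.mgrid[0:h, 0:w]
coords = np.stack([rows.ravel(), cols.ravel()], axis=1).astype(float)

features = np.hstack([rgb, coords])      # [R, G, B, row, col] per pixel
```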

u/goodbeertimes · 2 pointsr/india

Quick question: did you read the book PRML?

I found it to be an incredible book.

u/ebenezer_caesar · 2 pointsr/bioinformatics

Chapter 7 of Chris Bishop's book Pattern Recognition and Machine Learning has a nice intro to SVMs.
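As a toy illustration of the setting in the papers below (a sketch assuming scikit-learn; the "expression data" here is random, so accuracy should hover around chance):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy stand-in for microarray classification: a few samples described
# by many gene-expression features, with two tissue classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))    # 60 samples, 500 "genes"
y = rng.integers(0, 2, size=60)   # hypothetical tissue labels

# A linear kernel is a common first choice when features >> samples.
clf = SVC(kernel="linear", C=1.0)
print(cross_val_score(clf, X, y, cv=5).mean())
```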

Here is a list of papers where SVMs were used in computational biology:

> Gene Function from microarray expression data
>
> Knowledge-based analysis of microarray gene expression data by using support vector machines, Michael P. S. Brown, William Noble Grundy, David Lin, Nello Cristianini, Charles Walsh Sugnet, Terence S. Furey, Manuel Ares, Jr., David Haussler, Proc. Natl. Acad. Sci. USA, vol. 97, pages 262-267
> pdf
> http://www.pnas.org/cgi/reprint/97/1/262.pdf
>
> Support Vector Machine Classification of Microarray Gene Expression Data, Michael P. S. Brown William Noble Grundy, David Lin, Nello Cristianini, Charles Sugnet, Manuel Ares, Jr., David Haussler
> ps.gz
> http://www.cse.ucsc.edu/research/compbio/genex/genex.ps
>
> Gene functional classification from heterogeneous data Paul Pavlidis, Jason Weston, Jinsong Cai and William Noble Grundy, Proceedings of RECOMB 2001
> pdf
> http://www.cs.columbia.edu/compbio/exp-phylo/exp-phylo.pdf
>
> Cancer Tissue classification
> from microarray expression data, and gene selection:
>
> Support vector machine classification of microarray data, S. Mukherjee, P. Tamayo, J.P. Mesirov, D. Slonim, A. Verri, and T. Poggio, Technical Report 182, AI Memo 1676, CBCL, 1999.
> ps.gz
> PS file here
>
> Support Vector Machine Classification and Validation of Cancer Tissue Samples Using Microarray Expression Data, Terrence S. Furey, Nigel Duffy, Nello Cristianini, David Bednarski, Michel Schummer, and David Haussler, Bioinformatics. 2000, 16(10):906-914.
> pdf
> http://bioinformatics.oupjournals.org/cgi/reprint/16/10/906.pdf
>
> Gene Selection for Cancer Classification using Support Vector Machines, I. Guyon, J. Weston, S. Barnhill and V. Vapnik, Machine Learning 46(1/3): 389-422, January 2002
> pdf
> http://homepages.nyu.edu/~jaw281/genesel.pdf
>
> Molecular classification of multiple tumor types ( C. Yeang, S. Ramaswamy, P. Tamayo, Sayan Mukerjee, R. Rifkin, M Angelo, M. Reich, E. Lander, J. Mesirov, and T. Golub) Intelligent Systems in Molecular Biology
>
> Combining HMM and SVM : the Fisher Kernel
>
> Exploiting generative models in discriminative classifiers, T. Jaakkola and D. Haussler, Preprint, Dept. of Computer Science, Univ. of California, 1998
> ps.gz
> http://www.cse.ucsc.edu/research/ml/papers/Jaakola.ps
>
> A discrimitive framework for detecting remote protein homologies, T. Jaakkola, M. Diekhans, and D. Haussler, Journal of Computational Biology, Vol. 7 No. 1,2 pp. 95-114, (2000)
> ps.gz
> PS file here
>
> Classifying G-Protein Coupled Receptors with Support Vector Machines, Rachel Karchin, Master's Thesis, June 2000
> ps.gz
> PSgz here
>
> The Fisher Kernel for classification of genes
>
> Promoter region-based classification of genes, Paul Pavlidis, Terrence S. Furey, Muriel Liberto, David Haussler and William Noble Grundy, Proceedings of the Pacific Symposium on Biocomputing, January 3-7, 2001. pp. 151-163.
> pdf
> http://www.cs.columbia.edu/~bgrundy/papers/prom-svm.pdf
>
> String Matching Kernels
>
> David Haussler: "Convolution kernels on discrete structures"
> ps.gz
> Chris Watkins: "Dynamic alignment kernels"
> ps.gz
> J.-P. Vert; "Support vector machine prediction of signal peptide cleavage site using a new class of kernels for strings"
> pdf
>
> Translation initiation site recognition in DNA
>
> Engineering support vector machine kernels that recognize translation initiation sites, A. Zien, G. Ratsch, S. Mika, B. Scholkopf, T. Lengauer, and K.-R. Muller, BioInformatics, 16(9):799-807, 2000.
> pdf.gz
> http://bioinformatics.oupjournals.org/cgi/reprint/16/9/799.pdf
>
> Protein fold recognition
>
> Multi-class protein fold recognition using support vector machines and neural networks, Chris Ding and Inna Dubchak, Bioinformatics, 17:349-358, 2001
> ps.gz
> http://www.kernel-machines.org/papers/upload_4192_bioinfo.ps
>
> Support Vector Machines for predicting protein structural class Yu-Dong Cai*1 , Xiao-Jun Liu 2 , Xue-biao Xu 3 and Guo-Ping Zhou 4
> BMC Bioinformatics (2001) 2:3
> http://www.biomedcentral.com/content/pdf/1471-2105-2-3.pdf
>
> The spectrum kernel: A string kernel for SVM protein classification Christina Leslie, Eleazar Eskin and William Stafford Noble Proceedings of the Pacific Symposium on Biocomputing, 2002
> http://www.cs.columbia.edu/~bgrundy/papers/spectrum.html
>
> Protein-protein interactions
>
> Predicting protein-protein interactions from primary structure w, Joel R. Bock and David A. Gough, Bioinformatics 2001 17: 455-460
> pdf
> http://bioinformatics.oupjournals.org/cgi/reprint/17/5/455.pdf
>
> Protein secondary structure prediction
>
> A Novel Method of Protein Secondary Structure Prediction with High Segment Overlap Measure: Support Vector Machine Approach, Sujun Hua and Zhirong Sun, Journal of Molecular Biology, vol. 308 n.2, pages 397-407, April 2001.
>
> Protein Localization
>
>
> Sujun Hua and Zhirong Sun Support vector machine approach for protein subcellular localization prediction Bioinformatics 2001 17: 721-728
>
>
> Various
>
> Rapid discrimination among individual DNA hairpin molecules at single-nucleotide resolution using an ion channel
> Wenonah Vercoutere, Stephen Winters-Hilt, Hugh Olsen, David Deamer, David Haussler, Mark Akeson
> Nature Biotechnology 19, 248 - 252 (01 Mar 2001)
>
> Making the most of microarray data
> Terry Gaasterland, Stefan Bekiranov
> Nature Genetics 24, 204 - 206 (01 Mar 2000)

u/MrKlean518 · 2 pointsr/artificial

How mathy are you trying to get? I'm currently taking a Machine Learning/AI independent study course for my master's. The class is split into three parts:

Part 1: Multivariate statistics, based on "Multivariate Statistical Methods" by Donald F. Morrison, with Schaum's Outline of Statistics as supplemental material.

Part 2: Pattern Recognition and Machine Learning by Christopher Bishop

Part 3: Introduction to Artificial Intelligence by Philip C. Jackson

Multivariate Statistics

Machine Learning

AI

u/sciencifying · 1 pointr/math

You may want to study regression as a machine learning problem. I don't know your background, but if you are a mathematician, this approach probably isn't the best for you.

If you are only thinking of fitting polynomials, you only need a little trick to adapt linear regression.
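The "little trick" is basis expansion: a degree-d polynomial fit is still linear in its coefficients, so ordinary least squares applies. A minimal sketch:

```python
import numpy as np

# Polynomial fitting as linear regression on expanded features:
# the model a0 + a1*x + a2*x^2 is linear in the coefficients a_i.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.1, size=50)

degree = 2
X = np.vander(x, degree + 1, increasing=True)   # columns: 1, x, x^2
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
# coeffs should come out close to [1, -2, 3]
```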

u/TonySu · 1 pointr/learnmath

Well, there's an actual book called Pattern Recognition and Machine Learning. Have you had a look at it?

u/versaceblues · 1 pointr/compsci

Would this book https://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738/ref=mt_hardcover?_encoding=UTF8&me= be good for someone with no knowledge of machine learning who wants to learn?

u/g0lem · 1 pointr/datascience

A lot of techniques in machine learning can be described from a Bayesian perspective, as evinced in one of the most popular textbooks in the field.

> it is still a current research topic

Absolutely, e.g. Bayesian nonparametrics among many many others.

The argument regarding the date when some underlying principles of Bayesian analysis were discovered is moot; it's like saying computer graphics is not relevant because geometry originated in Euclid's works around 300 BC.

u/Mutzart · 1 pointr/math

Correct, that's the bachelor's I took... and I continued on to the Mathematical Modelling and Computing master's.


In regards to books, if you intend to go with Machine Learning, Christopher Bishop - Pattern Recognition and Machine Learning (2006 PDF version) is pretty much the bible. It's a bit expensive, but well worth the investment, and it goes through everything you will ever need to know.

It won't be able to replace course books, but it will be just about the best supplement you can find.

And if you're going with Neural Networks (Deep Learning), basically look up Geoffrey Hinton and start reading all his stuff (and Yann LeCun; they often wrote together).

u/oaty1 · 1 pointr/programming

Pattern Recognition by Chris Bishop. There are other books which come from the same philosophical approach. I will edit in a list of other books when I get a chance. A very good book that is also available for free is David Mackay's Information Theory, Inference and Learning Algorithms.

u/ItsGonnaBeAlright · 1 pointr/math

That's not a bad idea at all - I used EM way back (like 2002) for natural language processing, still remember it a bit, but dang gonna have to brush up :) Thx for the pointer!

Edit: Haha just realized I have that book! Recognized it from the cover shot on amazon :)

u/D_rock · 1 pointr/compsci

Agreed on ML. We used Alpaydin for the intro class and Bishop for the advanced class.

u/silverforest · 1 pointr/math

I'm a general engineer myself, with a side interest in computer science. Szeliski's book is probably the big one in the computer vision field. Another you might be interested in is Computer Vision by Linda Shapiro.

You may also be interested in machine learning in general, for which I can give you two books:

u/RobRomijnders · 1 pointr/MLQuestions

/u/Nader_Nazemi, please do your homework. This question has been asked at least a dozen times in past years.
Likewise, a quick Google search will show you that he wrote one of the popular ML books.

u/personanongrata · 1 pointr/compsci

I'd recommend you to read Bishop's PRML's Chapter 8, luckily this chapter is available on the web.

u/dwf · 1 pointr/csbooks

Not really. The closest you'll get are Chris Bishop's book and Hastie, Tibshirani and Friedman. I've had a look at the textbook above and it's a rather poor one in my opinion. I prefer Bishop because of his emphasis on the probabilistic grounding, others like the way Hastie et al approach things, YMMV.

For a good bit of free material that's worth reading I strongly suggest parts IV and V in David Mackay's book (see page 'x' in the Preface for suggested readings for a Machine Learning/Bayesian Inference course).

u/ASalvail · 1 pointr/NeuralNetwork

Start with Christopher Bishop's book, 'Pattern Recognition and Machine Learning' (http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738). It's a very good book that I would recommend to anyone new to the field. It is a bit old by now and won't go into deep learning much.

Something more recent and in video format is the excellent class on neural networks given by Hugo Larochelle (Neural networks class - Université de Sherbrooke: http://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghAJ6NAPrNmUBH). This will broach the subject of deep learning and many of the more recent advances.

Hope this helps!

u/BathroomEyes · 1 pointr/Purdue

If you really like this stuff, I would highly recommend two textbooks:

For the communications topics, reliability, optimization etc, ditch Leon-Garcia and pick up this book by Trivedi

If you're interested in Machine Learning like I am, then this book by Bishop is fantastic. You can find both in the Engineering library, I believe.

u/latent_z · 1 pointr/MachineLearning

I would draw a distinction between "complex" algorithms/methods and simple/basic ones. You seem to be at a stage where it's better to set aside all the complex methods and focus on the simple, basic ones: simple because they do not require a lot of mathematical knowledge, and basic because further theory is built upon them. This excludes, for now, all the recently published literature.

I would suggest getting one book that will ease this process, such as Bishop's. Just start with the basics of maximum likelihood and posterior inference with simple Gaussians. I assure you that this is basic, in the sense that you will recognize and use this piece of knowledge in most advanced papers. Mixtures of Gaussians and the EM algorithm are also a basic topic, as are Neural Networks (simple fully-connected sigmoid networks).
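For instance, the maximum likelihood estimates for a single Gaussian are just the sample mean and variance (a minimal sketch with synthetic data):

```python
import numpy as np

# Maximum likelihood for a 1-D Gaussian: the ML estimates are the
# sample mean and the (biased, divide-by-N) sample variance.
x = np.random.default_rng(0).normal(loc=2.0, scale=1.5, size=1000)

mu_ml = x.mean()                    # ML estimate of the mean
var_ml = ((x - mu_ml) ** 2).mean()  # ML estimate of the variance
```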

Just make sure that you know these three topics extremely well and learning the rest will be slightly easier.

BTW, this is a post for /r/MLQuestions or /r/learnmachinelearning

u/mez001 · 0 pointsr/statistics

Hi, I see you have already read some good books. I would also recommend Bishop's pattern recognition book for ML: https://www.amazon.ca/Pattern-Recognition-Machine-Learning-Christopher/dp/0387310738

However, I'm not sure whether they teach this in grad stats. Good luck!
