Reddit reviews Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems (Computational Neuroscience Series)

We found 12 Reddit comments about Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems (Computational Neuroscience Series). Here are the top ones, ranked by their Reddit score.


12 Reddit comments about Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems (Computational Neuroscience Series):

u/anastas · 22 pointsr/askscience

My main hobby is reading textbooks, so I decided to go beyond the scope of the question posed. I took a look at what I have on my shelves in order to recommend particularly good or standard books that I think could characterize large portions of an undergraduate degree and perhaps the beginnings of a graduate degree in the main fields that interest me, plus some personal favorites.

Neuroscience: Theoretical Neuroscience is a good book for the field of that name, though it does require background knowledge in neuroscience (for which, as others mentioned, Kandel's text is excellent, not to mention that it alone can cover the majority of an undergraduate degree in neuroscience if corequisite classes such as biology and chemistry are momentarily ignored) and in differential equations. Neurobiology of Learning and Memory and Cognitive Neuroscience and Neuropsychology were used in my classes on cognition and learning/memory and I enjoyed both; though they tend to choose breadth over depth, all references are research papers and thus one can easily choose to go more in depth in any relevant topics by consulting these books' bibliographies.

General chemistry, organic chemistry/synthesis: I liked Linus Pauling's General Chemistry more than whatever my school gave us for general chemistry. I liked this undergraduate organic chemistry book, though I should say that I have little exposure to other organic chemistry books, and I found Protective Groups in Organic Synthesis to be very informative and useful. Unfortunately, I didn't have time to take instrumental/analytical/inorganic/physical chemistry and so have no idea what to recommend there.

Biochemistry: Lehninger is the standard text, though it's rather expensive. I have limited exposure here.

Mathematics: When I was younger (i.e. before having learned calculus), I found the four-volume The World of Mathematics great for introducing me to a lot of new concepts and branches of mathematics and for inspiring interest; I would strongly recommend this collection to anyone interested in mathematics and especially to people considering majoring in math as an undergrad. I found the trio of Spivak's Calculus (which Amazon says is now unfortunately out of print), Stewart's Calculus (standard text), and Kline's Calculus: An Intuitive and Physical Approach to be a good combination of rigor, practical application, and physical intuition, respectively, for calculus. My school used Marsden and Hoffman's Elementary Classical Analysis for introductory analysis (which is the field that develops and proves the calculus taught in high school), but I liked Rudin's Principles of Mathematical Analysis (nicknamed "Baby Rudin") better. I haven't worked my way through Munkres' Topology yet, but it's great so far and is often recommended as a standard beginning topology text. I haven't found books on differential equations or on linear algebra that I've really liked. I randomly came across Quine's Set Theory and its Logic, which I thought was an excellent introduction to set theory. Russell and Whitehead's Principia Mathematica is a very famous text, but I haven't gotten hold of a copy yet. Lang's Algebra is an excellent abstract algebra textbook, though it's rather sophisticated and I've gotten through only a small portion of it, as I don't plan on getting a PhD in that subject.

Computer Science: For artificial intelligence and related areas, Russell and Norvig's Artificial Intelligence: A Modern Approach is a standard and good text, and I also liked Introduction to Information Retrieval (which is available online, both by chapter and in its entirety). For processor design, I found Computer Organization and Design to be a good introduction. I don't have any recommendations for specific programming languages, as I find self-teaching to be most important there, nor do I know of any data structures books that I found to be memorable (not that I've really looked, given the wealth of information online). Knuth's The Art of Computer Programming is considered to be a gold standard text for algorithms, but I haven't secured a copy yet.

Physics: For basic undergraduate physics (mechanics, e&m, and a smattering of other subjects), I liked Fundamentals of Physics. I liked Rindler's Essential Relativity and Messiah's Quantum Mechanics much better than whatever books my school used. I appreciated the exposition and style of Rindler's text. I understand that some of the later chapters of Messiah's text are now obsolete, but the rest of the book is good enough for you to not need to reference many other books. I have little exposure to books on other areas of physics and am sure that there are many others in this subreddit that can give excellent recommendations.

Other: I found Early Theories of the Universe to be good light historical reading. I also think that everyone should read Kuhn's The Structure of Scientific Revolutions.

u/xingdongrobotics · 5 pointsr/MachineLearning

For computational neuroscience, the classic textbook Theoretical Neuroscience by Peter Dayan and Larry Abbott is highly recommended.

u/Double-Down · 3 pointsr/neuro

Information theory and neural coding - Borst A, Theunissen FE (1999)

Abstract:

> Information theory quantifies how much information a neural response carries about the stimulus. This can be compared to the information transferred in particular models of the stimulus−response function and to maximum possible information transfer. Such comparisons are crucial because they validate assumptions present in any neurophysiological analysis. Here we review information-theory basics before demonstrating its use in neural coding. We show how to use information theory to validate simple stimulus−response models of neural coding of dynamic stimuli. Because these models require specification of spike timing precision, they can reveal which time scales contain information in neural coding. This approach shows that dynamic stimuli can be encoded efficiently by single neurons and that each spike contributes to information transmission. We argue, however, that the data obtained so far do not suggest a temporal code, in which the placement of spikes relative to each other yields additional information.

See also: Theoretical Neuroscience, Ch. 4 - Dayan P, Abbott LF (2005)
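The quantity at the center of the abstract, the mutual information between stimulus and response, can be estimated directly from a joint histogram of paired samples. A minimal sketch with a made-up binary stimulus and Poisson spike counts (none of the numbers come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: binary stimulus, Poisson spike counts whose rate
# depends on the stimulus. The rates (5 Hz-ish vs 1 Hz-ish) are illustrative.
stim = rng.integers(0, 2, size=100_000)
counts = rng.poisson(np.where(stim == 1, 5.0, 1.0))

def mutual_information(x, y):
    """Plug-in (histogram) estimate of I(X;Y) in bits from paired samples."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((xs.size, ys.size))
    np.add.at(joint, (x_idx, y_idx), 1.0)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x * p_y)[nz])))

mi = mutual_information(stim, counts)
# mi is bounded above by the 1-bit entropy of the binary stimulus.
print(f"I(stim; response) = {mi:.2f} bits")
```

The same estimator extends to the time-binned spike trains discussed in the paper, though the plug-in estimate is biased upward for small sample counts, which is one reason the comparisons the authors describe need care.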

u/kevroy314 · 3 pointsr/compmathneuro

Depends on the level at which you are looking. Cellular stuff? Signal processing and calculus. Population dynamics? Linear algebra and calculus. Brain regions/networks/whole brain? Graph theory and linear algebra.

And for all of them, statistics will be the tool to understand what's happening within the mathematical formulations.

On a side note: if you're looking at the level of behavior, many other discrete methods become much more important, in my opinion. However, it is fairly uncommon these days for people to use a behavioral approach without linking it to some other measurement at one of the levels I mentioned before.

See From Neuron to Cognition, Fundamentals of Computational Neuroscience, and Theoretical Neuroscience for some foundational understanding.

u/inquilinekea · 3 pointsr/neuroscience

Theoretical neuroscience. Check out this textbook: http://www.amazon.com/Theoretical-Neuroscience-Computational-Mathematical-Modeling/dp/0262541858

You should also look into the Redwood Center for Theoretical Neuroscience (at Berkeley) and at http://www.cns.nyu.edu/wanglab/ if you have the chance.

u/pushbak · 2 pointsr/neuro

I specialized in neuroengineering course-wise as a master's student (the degree was still biomedical engineering). I took an Applied Electrophysiology class that I thought was very good. Most of our neuroscience and engineering classes drew from this Principles of Neural Science book.
The applied electrophys class also used an Applied Bioelectricity text.

We also had a pretty comprehensive Computational Neuroengineering course that relied on this Theoretical Neuroscience text.

As far as teaching these topics goes, it's pretty specific. You might want to look into related neuroscience labs to apply some of these theories.

u/atomichumbucker · 1 pointr/neurology

Hmm, I'm confused... For one, it seems like people in this forum do agree with me. Additionally, I think there are enough people here with some background understanding of basic neurology... heck, anyone who has ever read any Oliver Sacks can be interested.

I'm not asking that we have a technical discussion of the benefits of a 3-hour versus 4.5-hour window of tPA administration... no, I just want to have a conversation about actual neurological topics.

I am also not saying we need to focus on textbook/well-established science. There is a great deal of new evidence, and interesting case reports that call into question currently held beliefs. Even anecdotal data that is just interesting for its presentation's sake is welcome.


I do not think we are interested in isolating neurology from the basic and behavioural sciences. But I do think we need to at the very least present actual science and not baseless personal theories.

  • However, more importantly, I think the confusion here can best be summed up by a fundamental lack of understanding of neural physiology on your behalf. You keep mentioning processing power as a function of metabolism, and energy as a function of... well, I'm guessing you mean ionic potentials. This is simply wrong. A neuron that fires more frequently is not processing more; it is just firing, just as a wire at a high voltage is not a computer. It is the connections (and aberrant connections) that determine processing capability. A neuron that is more frequently being acted upon will have an increased metabolic demand to maintain its ionic potentials, but this is an effect rather than a cause, similar to how a computer processor (a network of micro-capacitors) gets hot when being actively used.

  • Speculating on neural computational power is a very active field known as computational neuroscience. I strongly recommend Dayan and Abbott's "Theoretical Neuroscience" as a guide into this field. Mind you, it's heavy in linear algebra and not by any means a beach read. While it is not necessarily neurology, it does become important for neurologists to have an understanding of this, and so topics in this field are obviously more than welcome here as well. An example of how this is important is in the development of new prostheses and the brain/machine interaction. This is also interesting to think about from the pathophysiological standpoint in epilepsy and traumatic brain injury.

  • It appears you attend a DO school. I am certain that the MCAT requires at least some basic physiology, and medical schools also require coursework in physiology, cell biology, and neuroscience in their pre-clinical years. I am concerned because some of what you have said in this forum represents a severe misunderstanding of how the nervous system operates. This will come up on your boards and, more importantly, with your future patients.

u/RobotSpaceBrain · 1 pointr/artificial

I have heard good things about this book. I've only read the first chapter, but I think it's good for theoretical neuroscience:

http://www.amazon.com/Theoretical-Neuroscience-Computational-Mathematical-Modeling/dp/0262541858

u/pratchett2 · 1 pointr/neuroscience

First, on your broader point, you may want to look for programs that stress first-year rotations. I had a BME background, and now do neuroscience related research for my PhD, and joining a department that didn't force me to immediately join a lab was key. I second neuro_exo, it's hard to imagine a top university that won't have multiple people studying the areas you're interested in.

On your more specific question, what sort of math you should review depends on the sort of neuroscience you're talking about.

If you're referring to theoretical neuroscience/modeling, Dayan and Abbott is a standard reference. It includes the broader neuroscientific context as well as the math, so it's quite rewarding to read.

If you're talking about motor neuroscience/learning, a lot of the ideas derive from linear algebra and controls. Watch a few machine learning lectures, review those topics and you should be set.

A lot of the new ideas/excitement has recently focused on techniques to handle high dimensional datasets (see some of the discussion behind the BRAIN initiative). This gets into some rather complex math pretty quickly, so there's not too much I'd directly recommend, except that you check out recent papers in the field to see what you'd need (there's typically a lot of dynamical systems work here).

Most of the rest of neuroscience does use a fair amount of math, but what it uses tends to be very vague/operational. You'll do a lot of signal processing and a lot of digital filtering/averaging, and noise reduction will be a major focus. Review your EE class notes to get set for this.
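The kind of averaging/noise reduction mentioned above can be as simple as a moving-average (boxcar) filter. A small sketch with made-up signal parameters:

```python
import numpy as np

# Illustrative only: a 5 Hz sine buried in Gaussian noise, smoothed with a
# 25-sample moving-average (boxcar) filter. All parameters are made up.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1000)                     # 1 s at 1 kHz
signal = np.sin(2 * np.pi * 5 * t)
noisy = signal + 0.5 * rng.standard_normal(t.size)

kernel = np.ones(25) / 25                       # 25 ms boxcar window
smoothed = np.convolve(noisy, kernel, mode="same")

# Smoothing should bring the trace closer to the clean signal (lower MSE).
mse_noisy = np.mean((noisy - signal) ** 2)
mse_smoothed = np.mean((smoothed - signal) ** 2)
print(mse_smoothed < mse_noisy)
```

The trade-off, of course, is that a wide window also attenuates fast components of the real signal, which is why window (or filter cutoff) choice is such a recurring practical question.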

Edit: This was coincident with neuro-exo's response. I agree with everything he/she said.

u/[deleted] · 1 pointr/math

What type of math are you interested in?

If you like dynamical systems, there's this free book:

www.izhikevich.org/publications/dsn.pdf

If you think you'd be more interested in neural coding instead of neural dynamics (probability, stochastic processes, stats versus dynamical systems and bifurcation theory), there's this book:

http://www.amazon.com/Theoretical-Neuroscience-Computational-Mathematical-Modeling/dp/0262541858

I haven't read this one, but it's quite popular and newer (hence probably more up-to-date with current research and methods):

http://www.amazon.com/Mathematics-Neuroscientists-Fabrizio-Gabbiani/dp/0123748828/ref=sr_1_1?ie=UTF8&qid=1309541025&sr=8-1

You can also check out comp neuro articles on scholarpedia:

http://www.scholarpedia.org/article/Encyclopedia_of_computational_neuroscience

u/technically_art · 1 pointr/neuroscience

I'll try to address your questions first and give general advice at the end.

> many of these expressions have a summation of delta functions over index k. One major problem I have is that I do not know how far back my window should go when considering previous spikes. Should it just be my time increment dt=0.1ms? Or more?

This is often up to the modeler, but Dayan & Abbott's textbook has a section comparing the pros and cons of computing for single spikes vs. sliding windows vs. full history. One reasonable first approach would be to find out how long it takes for a single spike event to decay to the point of being negligible (say, 1/100th of total depolarization) and use that as your window size.
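For an exponentially decaying spike contribution, that 1/100 heuristic is a one-line calculation. A sketch, assuming a hypothetical synaptic time constant tau (the actual value would come from your model):

```python
import math

# Window-size heuristic: how long until a single exponentially decaying
# spike contribution exp(-t/tau) falls to 1/100 of its peak?
tau = 10.0          # ms, illustrative synaptic time constant (an assumption)
dt = 0.1            # ms, the simulation step from the question
threshold = 0.01    # "negligible" = 1/100 of peak

# Solve exp(-t/tau) = threshold  ->  t = tau * ln(1/threshold)
window_ms = tau * math.log(1.0 / threshold)
window_steps = math.ceil(window_ms / dt)
print(window_ms, window_steps)   # ~46.05 ms, i.e. 461 steps of dt = 0.1 ms
```

So with a 10 ms time constant, summing delta-function contributions over roughly the last 46 ms captures everything above the 1% level; shorter time constants shrink the window proportionally.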

>Another issue I'm having is that I'm confused by what they mean by w+ and w- when talking about Hebbian learning. Are these fixed values?

I think w^+ is the upper bound on weights, w^- is the lower bound. They're using a non-normalized scheme where w^+ or w^- is compared against 1 to determine synapse strength - w < 1 means depression, w > 1 means potentiation.
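One way to read such bounds is as hard clipping limits on a pair-based update rule. A hypothetical sketch (the rule shape and all constants are illustrative, not taken from the paper in question):

```python
import math

# Sketch of a hard-bounded pair-based STDP rule, with w+ and w- read as
# upper and lower clipping limits on the weight. Constants are made up.
W_PLUS, W_MINUS = 2.0, 0.0   # upper/lower bounds on the synaptic weight
LEARNING_RATE = 0.01
TAU = 20.0                   # ms, assumed STDP time constant

def update_weight(w, delta_t):
    """Pair-based STDP: potentiate if pre fires before post (delta_t > 0)."""
    if delta_t > 0:
        w += LEARNING_RATE * math.exp(-delta_t / TAU)   # potentiation
    else:
        w -= LEARNING_RATE * math.exp(delta_t / TAU)    # depression
    return min(W_PLUS, max(W_MINUS, w))                 # clip to [w-, w+]

w = 1.0  # w = 1 as the neutral point: w > 1 potentiated, w < 1 depressed
w = update_weight(w, 5.0)    # pre spike 5 ms before post -> w rises above 1
```

Whether the bounds are applied as hard clips like this or as soft (weight-dependent) factors changes the resulting weight distribution, so it's worth checking which the paper actually uses.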

> Also, why does the expression for I_GABA not have any dependence on w_j? Shouldn't there be some reliance on synaptic connectivity between presynaptic and postsynaptic neurons?

I'm not sure how the weights are being folded into the input current equations, but it's possible that I_GABA isn't affected by synaptic strength - they could compute each input current individually and scale them based on weights, for example.

---------------------


This definitely isn't a beginner-friendly model or paper. Are you recreating it as part of a class project, or for a lab? Don't be shy about asking colleagues for help, or even your PI (just make sure you know exactly what you're going to ask and why). If there isn't a harsh time constraint, I'd recommend checking out a textbook or some other modeling papers from the same lab, and/or citations from this paper.

One thing that experimentalists often have trouble with when trying to reproduce a model is that modelling is not an exact science. You're allowed to mess around with equations, parameters, thresholds and windows to make it work. For every clean equation in the paper, there are 3 or 4 very ugly equations or hacks making the graphs look pretty...it's not ideal, but that's the way the field is and has been for a long time. The point being - keep trying different things until it works. If you're close to the original model, great. If not, find out what new feature in your model makes it work, and see if you can find where the original model addressed that problem.

Good luck!