(Part 3) Best computer science books according to redditors


We found 9,284 Reddit comments discussing the best computer science books. We ranked the 1,900 resulting products by number of redditors who mentioned them. Here are the products ranked 41-60. You can also go back to the previous section.


Subcategories:

AI & machine learning books
Systems analysis & design books
Cybernetics books
Information theory books
Computer simulation books
Human-computer interaction books

Top Reddit comments about Computer Science:

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/HeterosexualMail · 187 pointsr/programming

We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.

That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".

Petzold's Code is excellent as well.

Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4-book box set for $30 more), but it's a good book.

Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.

u/electricfistula · 64 pointsr/changemyview

If you are interested, and want a real challenge to your view, I strongly recommend Nick Bostrom's Superintelligence.

A key takeaway is imagining intelligence on this kind of scale. That is, our intuition says Einstein is much smarter than the dumbest person you know. Yet, the dumb guy and Einstein have the same hardware, running about the same number of computations. The difference between a bee and a mouse or a mouse and a chimp is orders of magnitude. The difference between an idiot and a genius is very small in comparison.

AI will seem very stupid to the human observer until almost exactly the point it becomes amazingly brilliant. As AI overtakes bees (probably has already) and mice it will still seem dumb to you. As it overtakes chimps, still much dumber than the dumbest person you know. As it draws even to the moron, you might think that the AI has a lot of work left to go. Instead, it's almost unfathomably brilliant.

The distance between bee and chimp is 1,000,000,000 and the difference between moron and smartest man who ever lived is 5.

u/fusionquant · 46 pointsr/algotrading

First of all, thanks for sharing. Code & idea implementation sucks, but it might turn into a very interesting discussion! By admitting that your trade idea is far from being unique and brilliant, you make a super important step in learning. This forum needs more posts like that, and I encourage people to provide feedback!

Idea itself is decent, but your code does not implement it:

  • You want to hold stocks that are going up, right? Well, imagine a stock above its 100ma and 50ma, but below its 20ma and 10ma. It is just starting to turn down. According to your code, this stock is labeled as a 'rising stock', which is wrong.

  • SMAs are generally not cool, because they lag by about half the MA period.

  • Think of other ways to implement your idea of gauging "going up stocks". Try to define what is a "stock that is going up".

  • Overbought/oversold part. This part is worse. You heard that "RSI measures overbought/oversold", so you plug it in. You have to define "Overbought/oversold" first, then check if RSI implements your idea of overbought/oversold best, then include it.

  • Since you did not define "overbought / oversold", and check whether RSI is good for it, you decided to throw a couple more indicators on top, just to be sure =) That is a bad idea. Mindlessly introducing more indicators does not improve your strategy, but it does greatly increase overfit.

  • Labeling "Sell / Neutral / Buy " part. It is getting worse =)) How did you decide what thresholds to use for the labels? Why are ma_count and oscCount with a threshold of 0 the best way to label? You are losing your initial idea!
    Just because 0 looks good, you decide that 0 is the best threshold. You have to do research here. You'd be surprised by how counterintuitive the result might be, or how super unstable it might be =))

  • Last but not least. Pls count the number of parameters. MAs, RSI, OSC, BBand + thresholds for RSI, OSC + Label thresholds ... I don't want to count, but I am sure it is well above 10 (maybe 15+?). Now even if you test at least 6-7 combinations of your parameters, your parameter space will be 10k+ of possible combinations. And that is just for a simple strategy.

  • With 10k+ combinations on a daily data, I can overfit to a perfect straight line pnl. There is no way with so many degrees of freedom to tell if you overfit or not. Even on a 1min data!

    The lesson is: idea first. Define it well. Then try to pick minimal number of indicators (or functions) that implement it. Check for parameter space. If you have too many parameters, discard your idea, since you will not be able to tell if it is making/losing money because it has an edge or just purely by chance!

    What is left out of this discussion: cross validation and picking best parameters going forward

    Recommended reading:
  • https://www.amazon.com/Building-Winning-Algorithmic-Trading-Systems/dp/1118778987/
  • https://www.amazon.com/Elements-Statistical-Learning-Prediction-Statistics/dp/0387848576/
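To make the parameter-count warning above concrete, here is a rough Python sketch of the combinatorics. The indicator names and period choices are invented for illustration, not taken from the posted strategy:

```python
# hypothetical search grid for the kind of multi-indicator strategy described above
grid = {
    "ma_period":      [10, 20, 50, 100, 200],  # 5 choices
    "rsi_period":     [7, 14, 21],             # 3 choices
    "rsi_overbought": [65, 70, 75, 80],        # 4 choices
    "rsi_oversold":   [20, 25, 30, 35],        # 4 choices
    "osc_threshold":  [-1, 0, 1],              # 3 choices
    "bb_width":       [1.5, 2.0, 2.5],         # 3 choices
    "label_cutoff":   [-1, 0, 1],              # 3 choices
}

# the parameter space is the product of the per-parameter choice counts
n_combos = 1
for choices in grid.values():
    n_combos *= len(choices)

print(n_combos)  # 5*3*4*4*3*3*3 = 6480 combinations
```

Even this modest grid yields thousands of candidate strategies, which is exactly why a "best" backtest picked from it tells you almost nothing.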
u/Enlightenment777 · 42 pointsr/ECE

-----

BOOKS


Children's Electronics and Electricity books:

u/CedarLittle · 40 pointsr/IWantToLearn

Here are two cents from an ex-software engineer and current product manager with 7 years of professional engineering experience. This is what separates successful engineers from mediocre programmers: you need to realize from the outset of your study that by learning a programming language, all you are doing is learning how to communicate with a computer. The much more interesting, challenging, and rewarding part of becoming a good programmer lies in learning how algorithms work and how they can be applied to different problems.

I work in digital video ad tech, and while most of my work has been on Flash, Android, iOS, and other web-tech platforms, being able to operate in these environments is only the bare minimum requirement for being able to actually do something with them. For example, in the last three months I've needed to research graph clustering algorithms, JavaScript and ActionScript memory profiling methods, and also design a first draft of a network protocol for a new system functionality we're daydreaming about.

The point is that knowing how to program doesn't by itself solve very many problems; programming is just there as a tool to be used. The moral to take away here is that you should try to widen your breadth of knowledge about how algorithms and computers work. I highly recommend this book for getting started: Python Programming by John Zelle
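As a small illustration of that "algorithms over syntax" point, here is a hedged Python sketch (a standard textbook comparison, not from Zelle's book): the same membership question answered two ways, where choosing the right algorithm changes the work from a million comparisons to about twenty.

```python
import bisect

haystack = list(range(1_000_000))  # sorted data

def linear_contains(xs, target):
    # O(n): inspect items one by one
    for x in xs:
        if x == target:
            return True
    return False

def binary_contains(sorted_xs, target):
    # O(log n): halve the search space each step
    i = bisect.bisect_left(sorted_xs, target)
    return i < len(sorted_xs) and sorted_xs[i] == target

print(linear_contains(haystack, 999_999))   # True, after ~10^6 comparisons
print(binary_contains(haystack, 999_999))   # True, after ~20 comparisons
```

The language is the same in both functions; the speedup comes entirely from knowing the data is sorted and picking the right algorithm.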

Ok, that was sufficiently long winded.

u/Tiberius1900 · 33 pointsr/learnprogramming

To get a feel for low-level computing you should learn C. All modern operating systems and low level utilities are written in C (or C++, which is C with objects). It is as close to the metal as you can get while still being useful. Maybe you could fiddle around with some assembly afterwards.

Now, as for understanding how an operating system works from top to bottom, Windows is a pretty shit place to start for the following reasons:

  • Proprietary nature means little documentation about how the OS actually works internally.
  • Single desktop environment and lack of naked shells makes it hard to understand how and why some things work.
  • Limited capabilities for programming without an IDE, which is what you should be doing if you want to learn C (note that I said learn C. Particularly in the context of understanding, say, how data streams and the like work, programming without an IDE is infinitely better).
  • etc.

    Instead, you should learn Linux, and learn how Linux works. Installing it in a VM is fine. If you're looking to learn, I suggest you start with Debian, and, after you get comfortable with the command line, move to Arch. Arch is great for learning, if not much else, because it makes you do most things manually, and has a pretty extensive wiki for everything you may need to know.

    Resources:

    A Linux tutorial for beginners: https://linuxjourney.com

    A pretty decent online C tutorial (note, you should compile the programs on your own system, instead of doing their online exercises): http://www.learn-c.org

    K&R2 (the "proper" way to learn C): https://www.amazon.com/dp/0131103628/ref=cm_sw_r_cp_awdb_t1_FSwNAbFDJ3FKK

    Computer Systems A Programmer's Perspective, a book that might just be what you're looking for: https://www.amazon.com/Computer-Systems-Programmers-Perspective-3rd/dp/013409266X
u/zorfbee · 32 pointsr/artificial

Reading some books would be a good idea.

u/raz_c · 30 pointsr/programming

First of all you don't need to write a billion lines of code to create an OS. The first version of Minix was 6000 lines long. The creator of Minix, Andrew Tanenbaum, has a great book called Modern Operating Systems, in which he explains the inner workings of several famous operating systems.

Considering the emphasis on "from scratch", you should also have a very good understanding of the underlying hardware. A pretty good starter book for that is Computer Organization and Design by David A. Patterson and John L. Hennessy. I suggest reading this one first.


I hope this can get you started.

u/majordyson · 29 pointsr/MachineLearning

Having done an MEng at Oxford where I dabbled in ML, the 3 key texts that came up as references in a lot of lectures were these:

Pattern Recognition and Machine Learning (Information Science and Statistics) https://www.amazon.co.uk/dp/0387310738/ref=cm_sw_r_cp_apa_i_TZGnDb24TFV9M

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning Series) https://www.amazon.co.uk/dp/0262018020/ref=cm_sw_r_cp_apa_i_g1GnDb5VTRRP9

(Pretty sure Murphy was one of our lecturers actually?)

Bayesian Reasoning and Machine Learning https://www.amazon.co.uk/dp/0521518148/ref=cm_sw_r_cp_apa_i_81GnDbV7YQ2WJ

There were ofc others, and plenty of other sources and references too, but you can't go buying dozens of text books, not least cuz they would repeat the same things.
If you need some general maths reading too then pretty much all the useful (non specialist) maths we used for 4 years is all in this:
Advanced Engineering Mathematics https://www.amazon.co.uk/dp/0470646136/ref=cm_sw_r_cp_apa_i_B5GnDbNST8HZR

u/Rexutu · 27 pointsr/gamedesign

I recommend Tynan Sylvester's book "Designing Games" (you can get a 7-day free trial of the e-book on Amazon). A lot of people will recommend A Book of Lenses by Jesse Schell, but I personally felt it lacked substance. For the more philosophical aspects of the craft, here are some talks that I think are valuable 1 2 3 4 5 (hopefully ordered in a somewhat logical progression).

Another thing -- find out what kind of games you want to make, find out who makes that kind of game (a few examples: Jonathan Blow for puzzle games, Raph Koster and Project Horseshoe for MMOs, Tom Francis for whatever the fuck he makes, etc. -- and "kind" does not necessarily mean genres), and study what those people have to say, figuring out what you agree with and disagree with. Standing on others' shoulders is the easiest way to get good and the best path toward making games of true quality.

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/redneon · 22 pointsr/gamedev

This is one of my favourite books on the subject and one that I regularly recommend: 3D Math Primer for Graphics and Game Development

u/nsfmc · 21 pointsr/programming

he lost me when he said this about GEB
> If you only read one book on this list, it should be this one.

seriously? it's not that i don't appreciate the sentiment, but things douglas hofstadter thinks are neat is no substitute for any single book on the rest of the list unless you

  • have no other way to explain at cocktail parties what you studied at school
  • try to sound smart at cocktail parties by talking about things in GEB without actually referencing the book.

    for my part, i'd add sipser's computation book and why not throw in some ken thompson in there as an amuse bouche?
u/Jesseroonie · 20 pointsr/vim

[edit: Missed a line when I saved; I'm not using vim keys in my browser] I loved this article! I hope you don't mind if I share a story.

I've been in IT professionally for 14 years, and computing for 35 years. I taught myself vim for log analysis, and it's paid off in so many ways that I evangelize about it.

Vim, by itself, is honestly only a bit useful if you give it only a surface examination. You can use Visual Mode (control V, then movement keys) to select blocks of text for removal, cutting, or pasting. However, a copy of Notepad++ will do the same, so at that level, meh. It's admittedly not user friendly at all coming in from the outside.

However, if you're willing to learn some Regex (and I can't recommend Mastering Regular Expressions (Amazon) strongly enough), then that's where vim really shines. I use it for cleaning up data daily, and between a strong understanding of regular expressions, visual mode selection, and macros, I've done cleanup of data coming from odd sources in minutes that would have taken hours of work manually.

I'd love to give you a flat recipe on how to do some of what I do, but the thing about vim and data cleanup work is that the data coming at you for analysis is irregular, so your approach must be flexible. You learn the tools, not a recipe or two, and it takes time. I spent weeks in vim before it really clicked.

I spent time first in vimtutor (packaged with vim), and then w/ Mastering Regular Expressions. Once I had at least a rough understanding of :substitute (vim.wikia), that opened a lot of doors, and that's when I dropped using any other editor for raw text.

Here's an example of some of the things I find myself doing. I had a client the other day who had performed a change to an inventory control system that was filled with mistakes; the system has no undo, and the only record she had left (due to continued PEBKAC) was a PDF report she'd run before making things worse. Said PDF had data split across multiple lines, otherwise filled with garbage and oddball white spaces, and also had some duplicate rows within it. I exported it to text.

I needed to trim out all but two different forms of lines, then eliminate duplicates, then join the two altered line elements together into a single delimited line, flipping the last field from a negative to a positive number (or vice versa) which I could then use to make the corrections to her system. The end result was a 20,000 line file, so it was a lot longer before that.

I was able to transform the PDF in about 20 minutes.

Now, I've learned a few languages since, and I very well could have written a tool to do the job for me, however my relationship with massaging text in vim is such that, thanks to undo/redo, and the experience I've built up, it's actually faster for me to manipulate the file in real time than to alter script, execute, examine, adjust script, and repeat. Vim excels at working with large files, so while it's theoretically possible to do the same work in other editors, I've not found one that will handle massive sets of data with the same speed.
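For readers who'd rather see the flavor of that cleanup as a script, here is a rough Python equivalent of the transformation described above: join records split across two lines, flip the sign of the last field, and drop duplicates. The record format here is invented for illustration; the real PDF export was messier.

```python
import re

# hypothetical export: each record split across two lines,
# with irregular whitespace and duplicate rows
raw = """\
ITEM 1001   Widget A
    QTY   -4
ITEM 1001   Widget A
    QTY   -4
ITEM 1002   Widget B
    QTY   7
"""

# normalize whitespace and drop empty lines
lines = [re.sub(r"\s+", " ", ln).strip() for ln in raw.splitlines() if ln.strip()]

# join each ITEM line with the QTY line that follows it,
# flipping the sign of the quantity
records = []
for item, qty in zip(lines[::2], lines[1::2]):
    n = int(qty.split()[-1])
    records.append(f"{item},{-n}")

# eliminate duplicates while keeping order
seen, deduped = set(), []
for r in records:
    if r not in seen:
        seen.add(r)
        deduped.append(r)

print(deduped)  # ['ITEM 1001 Widget A,4', 'ITEM 1002 Widget B,-7']
```

The vim workflow in the comment does the same steps interactively with :substitute, visual selection, and macros, which is exactly why it can beat the edit-script-rerun loop on irregular data.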

One other thought; if you are also willing to learn a little bit of Unix and pipeline, then invoking shell commands opens up a lot more doors, since Unix has massive amounts of tools and scripts available.

I really hope this helps you!

u/MerlinTheFail · 20 pointsr/gamedev

In terms of design, A book of lenses is definitely a fantastic pick. I also read through: Engineering experiences, this helped me a lot design-wise. Also, of course, Game programming patterns is a must read for any programmer.

u/yggdrasilly · 19 pointsr/compsci
u/myrrlyn · 18 pointsr/learnprogramming

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

This book is an excellent primer for a bottom-up look into how computers as machines function.

https://www.amazon.com/gp/aw/d/0123944244/ref=ya_aw_od_pi

This is my textbook from the class where we built a CPU. I greatly enjoy it, and it also starts at the bottom and works up excellently.

For OS development, I am following Philipp Opperman's excellent blog series on writing a simple OS in Rust, at http://os.phil-opp.com/

And as always Wikipedia walks and Reddit meanders fill in the gaps lol.

u/edwardkmett · 17 pointsr/programming

Three books that come to mind:

Types And Programming Languages by Benjamin Pierce covers the ins and outs of Damas-Milner-style type inference, and how to build the bulk of a compiler. Moreover, it talks about why certain extensions to type systems yield type systems that are not inferrable, or worse, may not terminate. It is very useful in that it helps you understand what can and cannot be done by the compiler.

Purely Functional Data Structures by Chris Okasaki covers how to do things efficiently in a purely functional (lazy or strict) setting and how to reason about asymptotics in that setting. Given the 'functional programming is the way of the future' mindset that pervades the industry, it's a good idea to explore and understand how to reason in this way.
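A taste of what "purely functional" means in Okasaki's sense: a persistent list where an "update" builds a new version that shares structure with the old one instead of mutating it. This is a minimal Python sketch, not code from the book:

```python
from typing import Optional, Tuple

# a cons list: None is the empty list, otherwise a (head, tail) pair
Cons = Optional[Tuple[int, "Cons"]]

def cons(head: int, tail: Cons) -> Cons:
    # O(1) "update": build a new front node, share the old tail
    return (head, tail)

def to_list(xs: Cons) -> list:
    out = []
    while xs is not None:
        out.append(xs[0])
        xs = xs[1]
    return out

base = cons(2, cons(3, None))
v1 = cons(1, base)   # [1, 2, 3]
v2 = cons(9, base)   # [9, 2, 3] -- base is untouched and shared by both

print(to_list(v1), to_list(v2))  # [1, 2, 3] [9, 2, 3]
```

Both versions coexist after the update, which is the property Okasaki's amortized analyses are built around.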

Introduction to Algorithms by Cormen et al. covers a ton of imperative algorithms in pretty good detail and serves as a great toolbox for when you aren't sure what tool you need.

That should total out to around $250.

u/christianitie · 17 pointsr/math

I would guess that career prospects are a little worse than CS for undergrad degrees, but since my main concern is where a phd in math will take me, you should get a second opinion on that.

Something to keep in mind is that "higher" math (the kind most students start to see around junior level) is in many ways very different from the stuff before. I hated calculus and doing calculations in general, and was pursuing a math minor because I thought it might help with job prospects, but when I got to the more abstract stuff, I loved it. It's easily possible that you'll enjoy both, I'm just pointing out that enjoying one doesn't necessarily imply enjoying the other. It's also worth noting that making the transition is not easy for most of us, and that if you struggle a lot when you first have to focus a lot of time on proving things, it shouldn't be taken as a signal to give up if you enjoy the material.

This wouldn't be necessary, but if you like, here are some books on abstract math topics that are aimed towards beginners you could look into to get a basic idea of what more abstract math is like:

  • theoretical computer science (essentially a math text)

  • set theory

  • linear algebra

  • algebra

  • predicate calculus

    Different mathematicians gravitate towards different subjects, so it's not easy to predict which you would enjoy more. I'm recommending these five because they were personally helpful to me a few years ago and I've read them in full, not because I don't think anyone can suggest better. And of course, you could just jump right into coursework like how most of us start. Best of luck!

    (edit: can't count and thought five was four)
u/flaz · 17 pointsr/philosophy

You might be interested in a book called On Intelligence, by Jeff Hawkins. He describes something similar to your simulations idea, but he calls it a predictive hierarchical memory system (or something like that). It is a fascinating idea, actually, and makes a lot of sense.

I too suspect that speech is a central unifying aspect of what we call consciousness. A lot of AI guys seem to agree. There is a theory by Noam Chomsky (I think) called Universal Grammar. As I recall, he suspects that it may be key to modern intelligence, and he suspects the genetic mutation for it happened about 70,000 years ago, which gave us the ability to communicate and allowed Homo sapiens to successfully move out of Africa. I've also seen that mutation 70k years ago referred to as the cognitive revolution. But it seems everyone agrees that's when the move out of Africa began, and communication started; it's not just a Chomsky thing.

u/am_i_wrong_dude · 16 pointsr/medicine

I've posted a similar answer before, but can't find the comment anymore.

If you are interested in doing your own statistics and modeling (like regression modeling), learn R. It pays amazing dividends for anyone who does any sort of data analysis, even basic biostats. Excel is for accountants and is terrible for biological data. It screws up your datasets when you open them, has no version control/tracking, has only rudimentary visualization capabilities, and cannot do the kind of stats you need to use the most (like right-censored data for Cox proportional hazards models or Kaplan-Meier curves). I've used SAS, Stata, SPSS, Excel, and a whole bunch of other junk in various classes and various projects over the years, and now use only R, Python, and Unix/Shell with nearly all the statistical work being in R. I'm definitely a biased recommender, because what started off as just a way to make a quick survival curve that I couldn't do in Excel as a medical student led me down a rabbit hole and now my whole career is based on data analysis. That said, my entire fellowship cohort now at least dabbles in R for making figures and doing basic statistics, so it's not just me.

R is free, has an amazing online community, and is in heavy use by biostatisticians. The biggest downsides are

  • R is actually a strange and unpopular general programming language (Python is far superior for writing actual programs)
  • It has a steep initial learning curve (though once you get the basics it is very easy to learn advanced techniques).

    Unfortunately learning R won't teach you actual statistics.... for that I've had the best luck with brick-and-mortar classes throughout med school and later fellowship but many, many MOOCs, textbooks, and online workshops exist to teach you the basics.

    If I were doing it all over again from the start, I would take a course or use a textbook that integrated R from the very beginning such as this.

    Some other great statistical textbooks:

  • Introduction to Statistical Learning -- free legal PDF here -- I can't recommend this book enough
  • Elements of Statistical Learning -- A masterpiece of machine learning and modeling. I can't pretend to understand this whole book, but it is a frequent reference and aspirational read.

    Online classes:
    So many to choose from, but I am partial to DataCamp

    Want to get started?

  • Download R directly from its host, CRAN
  • Download RStudio (an integrated development environment for R that makes life infinitely easier) from its website (also free)
  • Fire up RStudio and type the following commands after the > prompt in the console:

    install.packages("swirl")

    library("swirl")

    swirl()

    And you'll be off and running in a built-in tutorial that starts with the basics (how do I add two numbers) and ends (last I checked) with linear regression models.

    ALL OF THAT SAID ------

    You don't need to do any of that to be a good doctor, or even a good researcher. All academic institutions have dedicated statisticians (I still work with them all the time -- I know enough to know I don't really know what I am doing). If you can do your own data analysis though, you can work much faster and do many more interesting things than if you have to pay by the hour for someone to make basic figures for you.
u/ultraliks · 16 pointsr/datascience

Sounds like you're looking for the statistical proofs behind all the hand waving commonly done by "machine learning" MOOCS. I recommend this book. It's very math heavy, but it covers the underlying theory well.

u/OddCoincidence · 15 pointsr/compsci

I think Digital Design and Computer Architecture is the best book for this.

u/quixotidian · 15 pointsr/compsci

Here are some books I've heard are good (I haven't read them myself, but I provide commentary based on what I've heard about them):

u/42e1 · 12 pointsr/compsci

If you're interested in learning more about Turing's paper that introduced the Turing Machine, I highly recommend the book The Annotated Turing. It's by the same person who wrote Code, which is an oft-recommended book on this sub-reddit.

u/Airballp · 12 pointsr/C_Programming

It doesn't precisely teach C, but I think my favourite textbook ever might be what you're looking for:

Computer Systems: A Programmer's Perspective

It was an optional textbook in my intro to systems class, but I honestly think it should have been required reading.

u/biglambda · 12 pointsr/haskell

I highly recommend The Haskell School of Expression by the late great Paul Hudak. Also you should learn as much as you can about Lambda Calculus in general like for example this paper.
After that you should learn as much as you can about types, Types and Programming Languages is really important for that.
Finally don't skip the important fundamental texts, mainly Structure and Interpretation of Computer Programs and the original video lectures by the authors (about the nerdiest thing you will ever watch ;))
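Since the comment points at the lambda calculus, here is a tiny hedged sketch of Church numerals in Python, a construction you'll meet early in that material (it's the standard encoding, just transliterated out of Haskell):

```python
# Church numerals: the number n is "apply f to x, n times"
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # decode by counting how many times f gets applied
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)

print(to_int(add(two)(two)))  # 4
```

Everything here is built from single-argument functions alone, which is the whole point of the lambda calculus: numbers, and arithmetic on them, fall out of function application.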

u/nkk36 · 12 pointsr/datascience

I've never heard of that book before, but I took a look at their samples and they all seem legitimate.

I would just buy the Ebook for $59 and work through some problems. I'd also maybe purchase some books (or find free PDFs online). Given that you don't have a deep understanding of ML techniques I would suggest these books:

  1. Intro to Statistical Learning
  2. Data Science for Business

    There are others as well, but those are two introductory-level textbooks I am familiar with and often suggested by others.
u/fj333 · 11 pointsr/compsci
u/gunder_bc · 11 pointsr/learnprogramming

Learn some math, yes. Algebra, Discrete Math, Inductive Logic, Set Theory. Calc and Matrix Algebra are good for specific things, and just in general to beef up your math skills. But don't get hung up on it too much. It's a good thing to always have going in the background.

Start to think about how all computation is math - check out The Annotated Turing and really wrap your head around both what Petzold is talking about and what Turing is talking about.

That may require you to take a step back and study Formal Languages, Finite State Machines, and other related concepts (all the stuff that lets you build up to Regular Expressions, etc.). Turing's thesis really gets to the heart of computation, and those concepts build on that.
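A finite state machine is small enough to sketch directly. Here is a standard textbook DFA (not from Petzold's book) in Python that accepts binary strings containing an even number of 1s, one of the simplest regular languages:

```python
# DFA: the state tracks whether we've seen an even or odd number of 1s;
# the accept state is "even"
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd",  "0"): "odd",
    ("odd",  "1"): "even",
}

def accepts(s: str) -> bool:
    state = "even"  # start state (zero 1s seen so far is even)
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == "even"

print(accepts("1001"), accepts("10"))  # True False
```

The machine never looks back at the input; its entire memory is the current state, which is exactly the limitation that separates regular languages from what a Turing machine can do.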

Then go learn LISP just to bend your brain some more.

Comp Sci is a fascinating subject, and you're off to a good start by thinking about what that Stack Overflow commenter meant - how are all languages similar? How do they differ? What's the underlying problem you're solving, and what are different ways of solving it? How do the tools you choose alter your solution?

Try writing something relatively simple (say, a program that plays Checkers or Tic-Tac-Toe and always wins) in a few different languages (start with ones you know, then learn some new ones to see how - if you've worked with Procedural or OO languages, try Functional ones).

u/jacobolus · 11 pointsr/math

Your post has too little context/content for anyone to give you particularly relevant or specific advice. You should list what you know already and what you’re trying to learn. I find it’s easiest to research a new subject when I have a concrete problem I’m trying to solve.

But anyway, I’m going to assume you studied up through single variable calculus and are reasonably motivated to put some effort in with your reading. Here are some books which you might enjoy, depending on your interests. All should be reasonably accessible (to, say, a sharp and motivated undergraduate), but they’ll all take some work:

(in no particular order)
Gödel, Escher, Bach: An Eternal Golden Braid (wikipedia)
To Mock a Mockingbird (wikipedia)
Structure in Nature is a Strategy for Design
Geometry and the Imagination
Visual Group Theory (website)
The Little Schemer (website)
Visual Complex Analysis (website)
Nonlinear Dynamics and Chaos (website)
Music, a Mathematical Offering (website)
QED
Mathematics and its History
The Nature and Growth of Modern Mathematics
Proofs from THE BOOK (wikipedia)
Concrete Mathematics (website, wikipedia)
The Symmetries of Things
Quantum Computing Since Democritus (website)
Solid Shape
On Numbers and Games (wikipedia)
Street-Fighting Mathematics (website)

But also, you’ll probably get more useful response somewhere else, e.g. /r/learnmath. (On /r/math you’re likely to attract downvotes with a question like this.)

You might enjoy:
https://www.reddit.com/r/math/comments/2mkmk0/a_compilation_of_useful_free_online_math_resources/
https://www.reddit.com/r/mathbooks/top/?sort=top&t=all

u/y7qe · 11 pointsr/compsci

Aesthetics are almost never emphasized in university courses, but they're really important in practice. If you can write beautiful (i.e. readable/understandable, concise, efficient) code, you'll have a big leg up on folks who bang out spaghetti code and pray it works.

Take a look at this book for famous 'beautiful' codes:
http://www.amazon.com/gp/aw/d/0596510047

If you're more interested in large scale software systems, dive into a widely used open source package and read the source. MySQL, Python, Hadoop, Rails, the complete Linux OS and many more are available for your autodidactic pleasure.

u/dyoum3_ml4t · 11 pointsr/cscareerquestions

https://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539

Probably one of the best books I've read for DS with Java.

u/JJinVenice · 11 pointsr/askscience

Your brain uses memory as a way to anticipate the effort required in these situations. There is a book called On Intelligence by Jeff Hawkins that discusses this. You've opened thousands of doors, lifted thousands of objects. Your brain remembers how it felt to engage in that activity. So when you approach a door, your brain sees what type of door it is and anticipates how much effort will be required to open it. Sometimes your brain gets it wrong.

edit: a word

u/lyinch · 11 pointsr/EmuDev

Hey,

It looks like you haven't actually started on your emulator development. Begin by checking out the opcodes of the Chip-8, have a look at the hardware, and try to reason about why you won't have an x86 assembly file.

In addition, have a look at the difference between those numbers and those numbers.

It looks like your understanding of computer architecture is quite limited - nothing to be ashamed of - you might be better off by reading Digital Design and Computer Architecture by Harris and Harris or Introduction to Computing Systems: From Bits and Gates to C and Beyond by Patt and Patel than writing an emulator right now.

Both books give you an absolutely wonderful overview of computer architecture, starting at boolean logic, the transistors and the gates you can form with them, going over combinational and sequential logic, and ending at the design of your own CPU written for an FPGA in an HDL (which few programmers encounter in their careers). The latter even gives you a detailed overview of multicycle processors, and goes beyond that to analyse GPUs and some more modern techniques to improve multicycle processors.

As I'm giving out advice, you might want to have a look at openGL which is used to render modern games or SDL which is more common in the emudev community. (be aware that SDL is a few abstraction layers above openGL and relies on it)

u/sriramalka · 11 pointsr/compsci

This is a good list. I'd definitely not go for Winskel's book, and I'd skip the Dragon book because I think it is quite out of date (nothing on type systems, or first-class functions or even objects).

For languages, I'd go with Andrew Appel's Modern Compiler Implementation in Java (or "in ML"), and for language semantics, I'd go with
Semantics with Applications: An Appetizer, followed by Pierce's Types and Programming Languages (TAPL).

Also, Knuth's Concrete Mathematics is very very good, but is heavily skewed towards number theory. It is a must-read if one's interests are in crypto, but won't help you at all if for example, you want to work with databases, theorem proving, programming languages etc. You need a solid set theory foundation. Or if you were interested in graphics or machine learning, I'd go for a solid linear algebra foundation.

There are no systems books on your list, so I'd suggest one on operating systems: it is free and very good.
Operating Systems: Three Easy Pieces

u/Fruitcakey · 10 pointsr/learnprogramming

Well, I'm entering my final year of my degree. I did it the hard way with lots of fanny-arsing and making life difficult for myself. I strongly recommend you don't do it my way.
In my experience, in your first year you won't get exposed to a great deal of code, nothing a clever university student can't cope with.

If I were to re-do my degree, I would get a grasp on the more theoretical side early on. Over the next 4 years you'll be doing plenty of programming; whatever programming you can cram into the summer ultimately won't count for much.

Early on you will be exposed to logic gates: AND, NOR, XOR, NAND, etc. Think of these as the smallest circuit-level building blocks that computers run on. You can construct all types of logic gates using just the universal gates NOR and NAND. If you can teach yourself that (which I don't think would be too difficult), then you'll be ahead in at least one of your classes.
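The universality claim is easy to check for yourself. As a quick sketch (Python used here purely for illustration), every other gate can be wired up from NAND alone:

```python
def nand(a, b):
    # The universal gate: output is False only when both inputs are True.
    return not (a and b)

def not_(a):
    return nand(a, a)              # NAND with both inputs tied together

def and_(a, b):
    return not_(nand(a, b))        # invert NAND to recover AND

def or_(a, b):
    return nand(not_(a), not_(b))  # De Morgan: a OR b == NOT(NOT a AND NOT b)

def xor_(a, b):
    m = nand(a, b)
    return nand(nand(a, m), nand(b, m))  # the classic four-NAND XOR
```

The same construction works starting from NOR, which is why either gate on its own is called universal.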

At some point you will have to learn a functional programming language, and if you're looking at it for the first time, it's a complete mind-fuck.
http://learnyouahaskell.com/
That's an excellent resource for learning Haskell, arguably the most popular functional language. If you manage to work through some of that, you'll be miles ahead of your classmates still struggling with the concept.

You will likely have some sort of data structures and algorithms class, so if I were you, I'd familiarise myself with the main ones.
Learn the difference between an array and a linked list, how quick sort and merge sort work. Trees and Binary Trees, breadth first search vs. depth first search. Amongst others. Don't exhaust yourself over it, but at least get a flavour for it.
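To make the BFS-vs-DFS point concrete, here's a minimal sketch in Python (the graph is invented for the example); the only difference between the two traversals is whether the frontier behaves as a queue or a stack:

```python
from collections import deque

# A small directed graph as an adjacency list.
graph = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [], 'E': [], 'F': [],
}

def traverse(graph, start, breadth_first=True):
    frontier = deque([start])
    seen = {start}
    order = []
    while frontier:
        # Queue (popleft) gives breadth-first; stack (pop) gives depth-first.
        node = frontier.popleft() if breadth_first else frontier.pop()
        order.append(node)
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return order
```

From 'A', BFS visits level by level (A, B, C, D, E, F), while DFS dives down one branch before backtracking (A, C, F, B, E, D).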

Maybe in first year, maybe in second. You'll start learning about Instruction set architectures, cache, operating systems and some assembly. If you're keen check out this:
http://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503
I genuinely can't recommend it enough.

Learning about the Internet Protocol Suite is probably a good idea. I found it really interesting and not too complex.


These are just my suggestions. In my opinion, they are manageable, can be self-taught, and will provide you with a good head start and cover a few bases. Sure, at some point you will need to delve into number theory, graph theory, and proofs by induction, but don't jump into the deep end too soon. You'll end up overwhelmed with big gaps in your knowledge. Hope it helps.

u/poincareDuality · 10 pointsr/compsci

For designing programming languages, my favorites are

u/krunk7 · 10 pointsr/programming

Absolutely.

Check out The Elements of Statistical Learning and Introduction to Machine Learning.

edit those books are about practical applications of what we've learned to date from the neural network style of pattern classification. So it's not about modeling an actual biological neuron. For modeling of the biology, it's been a while since I futzed with that. But when I wrote a paper on modeling synaptic firing, Polymer Solutions: An Introduction to Physical Properties was the book for that class. Damned if I remember if that book has the details I needed or if I had to use auxiliary materials though.

u/maybefbi · 10 pointsr/compsci

Title: On Computable Numbers, with an Application to the Entscheidungsproblem

Authors: Alan Turing

Link: http://plms.oxfordjournals.org/content/s2-42/1/230.full.pdf

Abstract: In just 36 pages, Turing formulates (but does not name) the Turing Machine, recasts Gödel's famous First Incompleteness Theorem in terms of computation, describes the concept of universality, and in the appendix shows that computability by Turing machines is equivalent to computability by λ-definable functions (as studied by Church and Kleene). Source

Comments: In an extraordinary and ultimately tragic life that unfolded like a novel, Turing helped break the German Enigma code to turn the tide of World War II, later speculated on artificial intelligence, fell victim to the homophobic witchhunts of the early 1950s, and committed suicide at the age of 41. Yet Turing is most famous for an eerily prescient 1936 paper in which he invented an imaginary computing machine, explored its capabilities and intrinsic limitations, and established the foundations of modern-day programming and computability. From his use of binary numbers to his exploration of concepts that today's programmers will recognize as RISC processing, subroutines, algorithms, and others, Turing foresaw the future and helped to mold it. In our post-Turing world, everything is a Turing Machine — from the most sophisticated computers we can build, to the hardly algorithmic processes of the human mind, to the information-laden universe in which we live. Source
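The machine Turing describes is simple enough to sketch in a few lines. Here is a hypothetical toy simulator in Python (the rule-table format and the unary-increment example are invented for illustration, not Turing's original notation):

```python
def run_turing_machine(rules, tape, state='start', accept='halt', max_steps=1000):
    # rules maps (state, symbol) -> (write, move, next_state); move is -1 or +1.
    tape = dict(enumerate(tape))        # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = tape.get(head, '_')    # '_' is the blank symbol
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return ''.join(tape[i] for i in sorted(tape)).strip('_')

# A unary incrementer: scan right over the 1s, append one more, halt.
increment = {
    ('start', '1'): ('1', +1, 'start'),
    ('start', '_'): ('1', +1, 'halt'),
}
```

Universality, the heart of the paper, is the observation that a simulator like this could itself be encoded as a rule table and executed by another such machine.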

u/dave1022 · 10 pointsr/compsci

The author has also written a crazy but brilliant book based on a lecture course he gave in 2006, entitled "Quantum Computing since Democritus" http://www.amazon.com/Quantum-Computing-since-Democritus-Aaronson/dp/0521199565

u/ontoillogical · 10 pointsr/programming
u/0PointE · 10 pointsr/compsci

Absolutely and unequivocally The New Turing Omnibus. There are all sorts of scientific questions and solutions along with proofs and use cases over the years usually surrounding computer science. I loved everything I'd learned in this book and still remember learning what the Hamming Distance is and how it helped us develop a reliable way to communicate at such long distances with satellites and other such things. I have a copy sitting on my shelf next to me atm so I got a photo for you to check out the TOC if you wish. Truly a light favorite of mine.
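For readers who haven't met it, the Hamming distance mentioned above is just the number of positions at which two equal-length codewords differ, which makes it a one-liner to compute:

```python
def hamming(a, b):
    # Number of positions at which equal-length sequences a and b differ.
    assert len(a) == len(b), "Hamming distance is defined for equal-length inputs"
    return sum(x != y for x, y in zip(a, b))
```

A code whose codewords are pairwise at distance at least 2e + 1 lets a receiver correct up to e flipped bits, which is the idea behind the reliable long-distance links the comment alludes to.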

u/burntsushi · 9 pointsr/rust

> especially since the model of delegating un-fancy regexes to an NFA is what I thought that PCRE etc. were already doing

I think the prevalence of things like possessive quantifiers suggests this isn't the case. Even if you run simple regular expressions (as in, yes, really "regular") like (a*)*c against a string like aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa, it will take a very long time to execute. If PCRE dropped back to a real NFA, then such behavior wouldn't be observed.

To make things even more confusing, PCRE actually has something called an "alternative matching algorithm" exposed as pcre2_dfa_match (it's also in PCRE1). It is not actually a DFA, but it uses something called the "DFA algorithm," which was a misunderstanding that has spread to many minds. In particular, the PCRE folks consider their implementation to be an instance of the "NFA algorithm." The actual facts, AFAIK, are that the standard PCRE engine uses backtracking, not NFAs, and the "alternative matching algorithm" uses an actual NFA, not a DFA.
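Python's re module is a backtracking engine in the same family as PCRE, so the pathological behaviour described above is easy to reproduce. A small sketch (string lengths deliberately kept short so the failure still returns quickly):

```python
import re
import time

pattern = re.compile(r'(a*)*c')  # nested quantifiers: the classic pathological case

def time_failure(n):
    # Time how long the engine takes to *fail* on n a's (there is no 'c' to find).
    start = time.perf_counter()
    assert pattern.match('a' * n) is None
    return time.perf_counter() - start

# Each extra 'a' roughly doubles the number of ways the backtracker can carve
# the string into (a*) groups, so failure time grows exponentially with n.
timings = [time_failure(n) for n in (5, 10, 14, 16)]

# Matching itself is fine when a 'c' is actually present.
assert pattern.match('aaac') is not None
```

A true finite-automaton engine (RE2, or Rust's regex crate) rejects the same inputs in time linear in n, which is the commenter's point.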

u/sleepingsquirrel · 9 pointsr/ECE
u/noeda · 9 pointsr/roguelikedev

The Baconist

(there's no bacon in this game despite the name: I originally was going to make a roguelike about stretching a really long bacon across the dungeon but it's going in a different direction now)

It can be played here: https://submarination.space/baconist132/index.html

Screenshot of one of the unfinished puzzles: https://submarination.space/baconist/puzzle.png

(move with vikeys or numpad or WASD or mouse)

This is a puzzle game where you need to solve puzzles. Right now that means you push boulders like in a sokoban because I haven't got around to implementing actual interesting mechanics yet.

Since past two weeks I've managed to lay down the most important technical foundations:

  • Performance has been slightly improved (still crappy on browser but it's playable; the game can be compiled to native version that runs in a terminal and it's way better in there).
  • Field of view now works more sensibly when you look through portals.
  • Your character can still remember parts of a level that have been seen before (they are shaded darker)
  • I now have a system that makes it fairly easy to design the entire dungeon (I essentially just have a giant text file that's interpreted and turned into world).

My roguelike can also display animated things. I made my water look all fancy and animated, a lot like in Brogue, but I soon realized this is probably going to be a problem if I use water in puzzles and it has to stand out well from its surroundings and look consistent. Sometimes boring-looking things are better. Overall my game has lots of flat colors.

At this point my concerns are about designing the game mechanics themselves (as opposed to technical challenges).

Pushing boulders gets boring quickly. I have some unfinished code that would add chain chomp -like enemies to the game, and the puzzles would be about how to go around them or neutralize their threat. And I have a sketchbook where I threw in a bunch more ideas. My thinking is to implement wacky ideas and see what works. I also have a book on game design I'm going through, trying to educate myself on what kind of (puzzle) game would be fun to play.

I guess this is not really a roguelike. It's a puzzle game and the entire world right now is hand-crafted. There are no random elements to this game whatsoever. But I think that's fine.
u/InvisibleMan5 · 9 pointsr/gamedev

I highly recommend Real-Time Collision Detection.

This next book might not apply to your field directly, but I believe it is a good idea to be at the very least aware of what it discusses, and it is a very excellent book on its subject: The Art of Game Design: A Book of Lenses

I recommend this book as more of a reference than a tutorial; it will allow you to quickly brush up on those areas of math and physics which you will need while writing (or perhaps working with) a physics engine. I don't recommend attempting to learn the subjects through this book alone though. Game Physics

Reading 3D Math primer for Graphics and Game Development is how I learned linear algebra, although I plan on studying the subject from a textbook when I get the opportunity. I keep the book close for easy reference of the math related to 3D rendering (such as the projection and view matrices), although if you get this book you will want to read the errata document on its website. There may be better books to teach this stuff now, so please don't jump on it too hastily.

A couple books I do not own, but plan to correct that as soon as I can:
Game Physics Pearls and Real-Time Shadows

If I think of any others, I will edit this comment.

u/CastigatRidendoMores · 9 pointsr/IAmA

As far as I'm aware, they don't necessarily believe we are near human level AI. However, they do believe it is an inevitable eventuality (on our current track) that we should begin preparing for now - because if it's done wrong it has catastrophic consequences, while if done right can be the best thing that ever happened to us. I second the recommendation for Bostrom's book.

u/CyberByte · 9 pointsr/artificial

> Last few weeks I got very interested in AI and can't stop thinking about it. Watched discussions of philosophers about future scenarios with AI, read all recent articles in media about it.

Most likely you heard about the superintelligence control problem. Check out (the sidebar of) /r/ControlProblem and their FAQ. Nick Bostrom's Superintelligence is pretty much the book on this topic, and I would recommend reading it if you're interested in that. This book is about possible impacts of AI, and it won't really teach you anything about how AI works or how to develop it (neither strong nor weak AI).

For some resources to get started on that, I'll just refer you to some of my older posts. This one focuses on mainstream ("narrow"/"weak") AI, and this one mostly covers AGI (artificial general intelligence / strong AI). This comment links to some education plans for AGI, and this one has a list of cognitive architectures.

u/nsfwelchesgrapejuice · 9 pointsr/cscareerquestions

If you already have an engineering degree then you already know how to study. What experience do you have with embedded? If you don't have any then you should be sure it's what you want before you commit to anything huge.

I think the best way to get a job in embedded systems is to build embedded systems, and not bother with language certifications. I might be going against the grain here a bit, but I would suggest starting to dip your toes into embedded systems by buying an Arduino and messing around with it.

Arduino gets a lot of flack for being "not real" embedded systems, and while it's true nobody is going to hire you because you can make an impressive arduino project, IMHO it's a great introduction to what embedded is about. The hardware equivalent of "hello world" is blinking an LED. If you are serious about learning then you will quickly outgrow the arduino, but you can always throw away the bootloader and try to program the ATmega with bare metal gcc and avrdude.

I don't know what you already know nor how you feel about math, but things you will want to learn include:

  • Analog electrical theory, DC and AC, resistance/capacitance/inductance. Understand basic circuit networks and input vs output impedance. Hopefully you remember complex numbers and frequency response. You don't need a lot of circuit theory but you will need to understand what a pull-up resistor is and why it's necessary. Depending on your math background you can get into filters, frequency response, fourier analysis. A good introduction here might be www.allaboutcircuits.com

  • Digital theory, starting with boolean algebra, logic gates, adders/multiplexers/flip-flops, all the way up to computer architecture. I like this book because it has a very holistic approach to this area https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244/ref=sr_1_1?ie=UTF8&qid=1494262358&sr=8-1&keywords=harris+digital+design

  • Linux, C. Linux and C. You need to understand pointers, and the best way to understand C is to understand computer architecture. If you're not already running Linux, install linux, as well as gcc and build-essential. Start learning how to manipulate memory with C. Learning about computer architecture will help here. My favourite book on C is one of the classics: https://www.amazon.com/Programming-Language-Brian-W-Kernighan/dp/0131103628/ref=sr_1_1?ie=UTF8&qid=1494262721&sr=8-1&keywords=the+c+programming+language

If you get this far and still want to become an embedded systems engineer, then you're doing pretty well. I would say just try to build projects that utilize these skills. Maybe you can use your mech background to build a robot and then design the software to support it. Get used to reading datasheets for parts, and imagining what the digital logic inside looks like. Get used to searching Google for answers to your questions.
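To keep this page's examples in one language, here is the set/clear-a-bit idiom at the heart of register-level C, sketched in Python (the pin number is a made-up stand-in for a value you'd read off a datasheet); the C equivalents are in the comments:

```python
LED_PIN = 3  # hypothetical bit position, as a part's datasheet might specify

def set_bit(reg, bit):
    return reg | (1 << bit)       # in C: *port |= (1 << bit);

def clear_bit(reg, bit):
    return reg & ~(1 << bit)      # in C: *port &= ~(1 << bit);

def read_bit(reg, bit):
    return (reg >> bit) & 1       # in C: (*port >> bit) & 1

port = 0b00000000
port = set_bit(port, LED_PIN)     # drive the LED pin high
assert port == 0b00001000
port = clear_bit(port, LED_PIN)   # and low again
assert port == 0b00000000
```

On real hardware the register variable would be a pointer to a fixed memory-mapped address, which is exactly the pointer material the C recommendation above is driving at.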
u/nostrademons · 9 pointsr/reddit.com

It's more of a supply-demand mismatch. I wouldn't buy the books he's burning at any price, either because I've already read them or because I have no interest in reading them. Amazon is selling a used Hunt for Red October for $2.

OTOH, I'll willingly pay $50 for Types and Programming Languages because there's no other way I can get that information. All supply & demand.

u/bluecoffee · 8 pointsr/MachineLearning

If you're having to ask this, it means you haven't read enough textbooks for reading papers to make sense.

What I mean is that to make sense of most research papers you need to have a certain level of familiarity with the field, and the best way to achieve that familiarity is by reading textbooks. Thing is, if you read those textbooks you'll acquire a familiarity with the field that'll let you identify which papers you should focus on studying.

Now go read MLAPP cover to cover.

u/SteelPC · 8 pointsr/RimWorld

Tynan, I am a huge fan of Rimworld as well as your book, which everyone should check out:

Designing Games: A Guide to Engineering Experiences https://www.amazon.com/dp/1449337937/ref=cm_sw_r_cp_apa_wh1XBbJFPD7ZB


If you don't mind, any hint on what is next for you? Any glimpse at what your next endeavor might be?

u/mflux · 8 pointsr/gamedesign

The citybound guy has been putting out daily blog posts of his city sim game programming. Wildly ambitious: http://blog.cityboundsim.com/

Not directly city game design but I highly recommend Rimworld creator's book Designing Games: Engineering Experiences for game design. I've emailed him a few times and he's very responsive and forthcoming with his wisdom.

I'm designing a city game myself right now. My theory on these games is that while they are experience engines in the sense that, for example, Sim City triggers your emotions with poverty, wealth, crime, health -- SC tends to be more like gardening: you plant seeds, water them, and see what comes out and much of the enjoyment of playing the game comes from that.

As far as programming goes, I went with a custom entity component system and am using an off the shelf engine (Unreal) to avoid the hard work of optimizing drawing tons of stuff (and lights) on screen.
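A minimal sketch of the entity-component-system idea mentioned above (all names invented for illustration): entities are just IDs, components are plain data tables keyed by entity, and a "system" is any loop over entities that carry the right set of components:

```python
from itertools import count

class World:
    def __init__(self):
        self._ids = count()
        self.components = {}   # component name -> {entity id -> data}

    def spawn(self, **comps):
        eid = next(self._ids)
        for name, data in comps.items():
            self.components.setdefault(name, {})[eid] = data
        return eid

    def query(self, *names):
        # Yield (entity, comp1, comp2, ...) for entities with all named components.
        tables = [self.components.get(n, {}) for n in names]
        for eid in set(tables[0]).intersection(*tables[1:]):
            yield (eid, *(t[eid] for t in tables))

world = World()
world.spawn(position=[0, 0], velocity=[1, 2])   # a moving thing
world.spawn(position=[5, 5])                    # static scenery: no velocity

# A "movement system": only entities with both components are touched.
for eid, pos, vel in world.query('position', 'velocity'):
    pos[0] += vel[0]
    pos[1] += vel[1]
```

The payoff for a city sim is that adding a behaviour (traffic, crime, power) means adding a component table and a loop, not editing a class hierarchy.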

u/TynanSylvester · 8 pointsr/gamedev
  1. I made a video and posted it on Reddit. https://www.youtube.com/watch?v=vV6wyeZ7458 There's no trick to it, you just have to make something that looks unique and interesting.

  2. You don't need to invest any money for soundtrack or voice before you've made any income. One exception would be a Kickstarter video soundtrack - but you can get music free or from sites like shockwavesound.com for really cheap.

    Spend money when the concept is a bit proven.

    Of course, this is all deep in the future; you have to make a good game first. Be sure to test with playtesters and watch how they play, using a proper test protocol. <plug :D>My book talks about this kind of thing: http://www.amazon.com/gp/product/1449337937 </plug>
u/aaron_ds · 8 pointsr/roguelikedev

While 2D math is common in roguelikes, 3D math is much rarer. Chapters 1-5, on coordinate systems and vectors, apply, but chapters 7-17 do not. While I'm familiar with matrix math, I don't find it applicable to roguelike development. The visibility system described in chapter 16 is not, as far as I can tell from the preview on Amazon, applicable to roguelikes. In fact I'd be more confused if I read that and then tried to use it in a roguelike.

> By any means I would say that geometry (at any level) wouldn't be enough even for a RL. But right now I plan to focus on geometry, since it seems to be the most fundamental subject and the one I lack the most.

If I had to put together a primer on roguelike math:

u/snakesarecool · 8 pointsr/learnpython

That book is designed for experienced programmers in either another language or with Python to dive deeper. It isn't meant for pure beginners. I believe the author does have an "except for the brave" caveat, but if you are brand new to programming it isn't the best book to go with.

Take a look at Python Programming: An Introduction to Computer Science by John Zelle. It is used as a computer science 101 textbook in several places. It teaches Python in the context of basic computer topics, so it isn't just focused on the language.

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418

Also, find practice projects to work on while you're learning. The Zelle book has a good amount of homework problems. Codecademy and Python Batting Practice will also take you pretty far, but you'll need to come up with something on your own.

u/BBQHonk · 8 pointsr/suggestmeabook

Fiction: Do Androids Dream of Electric Sheep?. The book that was the basis for Blade Runner.

Non-fiction: Superintelligence: Paths, Dangers, Strategies. This is a deep dive into the dangers posed by superintelligent AI. It's a heavy read.

u/SnowdogU77 · 7 pointsr/learnpython

Just to add some diversity to the potential suggestions, if you are trying to get into programming and are using Python as your gateway drug, I highly recommend Python Programming: An Introduction to Computer Science by John Zelle.

I wouldn't necessarily recommend it for a pure introduction into Python, as it sometimes avoids the Pythonic Way™ in favor of strategies that are more common, but it is an excellent introduction to CS/programming. The book, and the man himself, are what nurtured my love for CS.

u/PostmodernistWoof · 7 pointsr/MachineLearning

I've been reading and really enjoying "Superintelligence: Paths, Dangers, Strategies" by Nick Bostrom. https://www.amazon.com/gp/product/B00LOOCGB2

It's an easy read but it's a hard read because every couple sentences your brain wanders off thinking about consequences and stuff and you keep having to read the page over again.

He does a great job of covering in some depth all the issues surrounding the development of trans-human intelligence, whether it happens via "AI", some form of human augmentation, etc.

One of the better "here's a whole bunch of stuff to think about" books.

u/proteinbased · 7 pointsr/seancarroll

Scott Aaronson would be the ultimate mindscape podcast guest for me. I am actually hopeful that this will happen at some point, knowing that Sean and Scott know each other personally.
For the uninitiated, Scott is a quantum computing researcher, accomplished author and blogger extraordinaire.
If you have never heard of him, read this interview and you will agree that he would make a great guest.

u/fancysuit · 7 pointsr/learnprogramming

If you're going through decent books, then you shouldn't worry too much since most will cover what you need. Now, "school educated" programmers ... now they have knowledge gaps. Real world programming has taught me 99% of what I know.

That said, algorithms and data structures are a good foundation. Then programming patterns. Then read Beautiful Code.

Concurrency may be something to read up on too. Books teach this as well as professors do, but something that's overlooked by many people.

u/HealPlzDev · 7 pointsr/gamedesign

I really enjoyed Designing Games: A Guide to Engineering Experiences by Tynan Sylvester. (The dude behind Rimworld.)

He gives you a lot to think about without ever coming off as pretentious or preachy.

u/trixandhax · 7 pointsr/gatech

Here

Ubuntu is recommended, but you can use some other distro. However all of the support I'm doing is only for Ubuntu based distros. Some others have done things to get it working for other distros like Arch.

And here is the book

And as a bonus [here](https://drive.google.com/open?id=0B6g7zcZaFwPTVVI1eDBYcG1GbE0) is a presentation I did last semester which includes an overview and some sample programs.

edit fixed 3rd link.

u/okiyama · 7 pointsr/learnprogramming

Yeah, it's actually really interesting! When I say ARM, MIPS and x86 I'm talking about the processor architecture. Basically, your desktop has a processor that speaks x86 whereas your smart phone only knows ARM. MIPS is primarily for low power things like routers. Learning an assembly language means learning a lot about processors, which is also a lot of fun.

If you do want to learn about all of that, this book is great. Well written and pretty fun to read

http://www.amazon.com/Digital-Design-Computer-Architecture-Edition/dp/0123944244

u/srnull · 7 pointsr/hardware

> Textbooks aren't much of a thing because so much information is available online and technology changes so fast.

That's

  • really
  • not
  • true

    and I'm not just pointing out that those books exist. They're really good resources!
u/old_TA · 6 pointsr/berkeley

Former 61C ugrad TA here. 61C is broken into 6 main ideas, which you can find on the last slide of the first lecture: http://www-inst.eecs.berkeley.edu/~cs61c/sp13/lec/01/2013Sp-CS61C-L01-dg-intro.pdf

From personal experience, 61C seems to be more difficult for most people than 61A or 61B. On the other hand, if you've been struggling with 61A or 61B, then 61C provides a much more level playing field - the material is new for pretty much everyone, so someone who's been programming since the beginning of high school doesn't have as much of an advantage as they do in the earlier classes.

Also I realize that the advice I'm about to give is devalued since I'm a former staff member, but if you want any type of A, READ THE BOOK CAREFULLY (the book I'm referencing is http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123747503/ref=dp_ob_title_bk). There are tons of subtleties in the material that we simply don't have enough time to cover in lecture/discussion/lab but are essential to doing well on projects/exams. The book is meaty, but probably the best book in the world for this material.

Feel free to respond to this if you have more questions.

u/cesclaveria · 6 pointsr/learnprogramming

One book that I really liked, that helps you understand a lot of "under the hood" stuff on the computer (from the hardware level and up), is Computer Organization and Design: The Hardware / Software Interface. It's quite approachable, and I feel it helped me to learn the "MIPS 3000" assembly language. I used an earlier edition than the one I linked (I think I used the 2nd edition); I see the new one added sections regarding multicore and multiprocessor architectures.

The book focuses a lot on MIPS but also has exercises/examples for ARM and X86/IA32.

u/Echohawkdown · 6 pointsr/TechnologyProTips

In the interim, I suggest the following books:

  • Digital Design and Computer Architecture, by Harris & Harris - covers the circuitry & hardware logic used in computers. Should also cover how data is handled on a hardware level - memory's a bit rusty on this one, and I can't find my copy of it right now. Recommend that you read this one first.

  • Computer Organization and Design, by Patterson & Hennessy - covers the conversion of system code into assembly language, which itself turns into machine language (in other words, covers the conversion of programs from operating system code into hardware, "bare metal" code). Knowledge of digital circuitry is not required before reading, but strongly recommended.

  • Operating System Concepts, by Silberschatz, Galvin & Gagne - covers all the basic Operating System concepts that each OS today has to consider and implement. While there are Linux-based ones, there are so many different Linux "flavors" that, IMO, a book that covers a specific Linux base (called a Linux kernel) exclusively would be incomplete and fail to address all the key aspects you'll find in modern OSes. Knowledge of coding is required for this one, and therefore should be read last.

     

    As for the coding books, I suggest you pick one up on Python or Java - I'm personally biased towards Python over Java, since I think Python's syntax and code style looks nicer, whereas Java makes you say pretty much everything you're doing. Both programming languages have been out for a long time and see widespread usage, so there's plenty of resources out there for you to get started with. Personally, I'd suggest going with this book for Java and this book for Python, but if you go to Coursera or Codecademy, you might be able to get better, more interactive learning experiences with coding.

    Or you can just skip reading all of the books I recommended in favor of MIT's OpenCourseWare. Your choice.
u/gtani · 6 pointsr/MachineLearning

Don't worry, you've demonstrated the ability to figure out whatever you need to get hired; you need to worry more about getting a place to live. Probably you should buy one of those shirts that says "Keep calm and carry on". You could cram on Java performance tuning or kernel methods or Hadoop or whatever and be handed a project that doesn't use it. Here's some "curricula", free books, etc.

http://web.archive.org/web/20101102120728/http://measuringmeasures.com/blog/2010/3/12/learning-about-machine-learning-2nd-ed.html

http://blog.zipfianacademy.com/post/46864003608/a-practical-intro-to-data-science

http://metaoptimize.com/qa/questions/186/good-freely-available-textbooks-on-machine-learning

http://www.amazon.com/Machine-Learning-Probabilistic-Perspective-Computation/product-reviews/0262018020/ (first review)

--------



http://people.seas.harvard.edu/~mgelbart/glossary.html

http://www.quora.com/Machine-Learning

http://www.quora.com/Machine-Learning-Applications

u/stepstep · 6 pointsr/compsci

The way I got started was working through the exercises in Benjamin C. Pierce’s Software Foundations. It starts from the absolute basics (which was good for me). Volume 1 covers logic, theorem proving, and other Coq fundamentals. Volume 2 discusses proving programs correct and proving the soundness of type systems. I haven't worked through Volume 3, but it covers verifying algorithms and data structures.

It also helps to have a good understanding of type theory in general. For me, that background knowledge came from another Benjamin C. Pierce book: Types and Programming Languages.

Once you know the basics of how logic works in Coq, I think the material from a discrete math course (like this) is probably a good source of proofs to try (logic, graph theory, combinatorics, number theory, etc.).

u/redjamjar · 6 pointsr/types

Hi, I'd recommend reading Benjamin Pierce's book "Types and Programming Languages". This is the best introduction to type systems and theory I've encountered:

http://www.amazon.com/Types-Programming-Languages-Benjamin-Pierce/dp/0262162091

Dave

u/RodeoMonkey · 6 pointsr/gamedev

Tynan Sylvester also wrote my favorite book on game design, which touches on emergence quite a bit. He has a simple but good definition: "Emergence is when simple mechanics interact to create complex situations." The title is appropriate: Designing Games: A Guide to Engineering Experiences

https://www.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937/

u/Rikkety · 6 pointsr/AskComputerScience

Check out The Annotated Turing by Charles Petzold. It's Turing's paper on the Entscheidungsproblem which introduces Turing Machines, annotated with a lot of background information and some stuff about Turing's career. Very interesting stuff.

I can also recommend Code, by the same author which describes how a computer works from basic principles. It's doesn't have a lot of material on Turing, but it's certainly an interesting read for anyone interested in Comp Sci.

u/rheimbuch · 6 pointsr/programming

Alan Turing's original paper that introduced the Turing Machine is a great read. The Annotated Turing is a great way to both read and understand the paper if you don't have a background in compsci. It doesn't assume much more than highschool math, and the whole of Turing's paper is inline with the explanations.

u/zoombikini · 6 pointsr/programming

Ah...Sipser.

u/llimllib · 6 pointsr/programming
u/nthcxd · 6 pointsr/cscareerquestions

First of all, this is an excellent post. I've seen so many questions posted here but yours is the most concise and upfront. I know exactly what your background is and so I'm more confident that what I want to suggest would actually be relevant.

You have solid industry experience with academic foundation. And I think you already are aware of the pitfalls of expert beginner (http://www.daedtech.com/how-developers-stop-learning-rise-of-the-expert-beginner/). I think you are in a sweet spot where you can afford to invest resources without immediate gain - unlike early-career coders, you don't have to necessarily learn another language/framework right this second. You can afford to deepen your higher-level understanding of concerns and concepts that are timeless and not bound by the language/framework of the day.

I'd like to suggest you read other people's code/design. Here are some books to get you started.

u/zzyzzyxx · 6 pointsr/learnprogramming
u/amair · 5 pointsr/math

Some good readings from the University of Cambridge Mathematical reading list and p11 from the Studying Mathematics at Oxford Booklet both aimed at undergraduate admissions.

I'd add:

Prime obsession by Derbyshire. (Excellent)

The unfinished game by Devlin.

Letters to a young mathematician by Stewart.

The code book by Singh

Imagining numbers by Mazur (so-so)

and a little off topic:

The annotated turing by Petzold (not so light reading, but excellent)

Complexity by Waldrop

u/mpdehnel · 5 pointsr/computerscience

How formal do you mean? If you're interested in the theory of computer science, have a read of Sipser's Introduction to the Theory of Computation (or on Amazon - get it 2nd hand). This is a very theoretical book though, and most CS undergrad courses will only cover this type of content as a small part of the subject matter taught, so don't be put off if it doesn't immediately appeal or make sense!

Edit - links.

u/bonesingyre · 5 pointsr/webdev

Sure! There is a lot of math involved in the WHY component of computer science; for the basics, it's discrete mathematics, so any introduction to that will help as well.
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368125024&sr=1-1&keywords=discrete+mathematics

This next book is a great theoretical overview of CS as well.
http://mitpress.mit.edu/sicp/full-text/book/book.html

That's a great book on computer programming, complexity, data types etc... If you want to get into more detail, check out: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973

I would also look at Coursera.org's algorithms lectures by Robert Sedgewick; that's essential learning for any computer science student.
His textbook: http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368124871&sr=1-1&keywords=Algorithms

another Algorithms textbook bible: http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_sp-atf_title_1_2?s=books&ie=UTF8&qid=1368124871&sr=1-2&keywords=Algorithms




I'm just like you as well, I'm pivoting, I graduated law school specializing in technology law and patents in 2012, but I love comp sci too much, so i went back into school for Comp Sci + jumped into the tech field and got a job at a tech company.

These books are theoretical, and they help you understand why you should use x versus y; those kinds of things are essential, especially on larger applications (like Google's PageRank algorithm). Once you know the theoretical info, applying it is just a matter of picking the right tool, like Ruby on Rails, or .NET, Java, etc...

u/fried_green_baloney · 5 pointsr/Python

Toward the end of this book there is a two-page regex to validate email addresses:

https://www.amazon.com/Mastering-Regular-Expressions-Jeffrey-Friedl/dp/0596528124

It's a big hairy subject if you are serious about getting it 100% correct.
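For a sense of scale, here's a minimal sketch (mine, not the book's) of the kind of short approximation most real code settles for instead, and why it stays short: it deliberately rejects some RFC-valid addresses.

```python
import re

# A deliberately simple approximation: one "@", no whitespace, and a dot
# in the domain part. It rejects some technically valid addresses
# (quoted local parts, IP-literal domains, dotless domains) -- handling
# those correctly is what takes two pages.
SIMPLE_EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def looks_like_email(s: str) -> bool:
    return SIMPLE_EMAIL.fullmatch(s) is not None
```

So `looks_like_email("alice@example.com")` passes, while `"alice@localhost"` fails because the approximation insists on a dot in the domain. Whether that tradeoff is acceptable depends entirely on your application.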

u/strawlion · 5 pointsr/learnprogramming

I did this and just got offered a position as a "Software Engineer"

I'd recommend starting in Java or C# even though a lot of places are starting people on JavaScript. From what I've seen, JavaScript is being taught in a way that you will never conventionally use it, and it has a LOT of quirks (== type conversions, etc.). Though the internet has all the resources you need to teach yourself, I would really, really recommend using books as your main source of education, as they provide thorough and (the most important part) LINEAR explanations of whatever you want to learn. Tutorials are great, but a lot will be left out, and the amount of hand-holding involved will give you no confidence as a programmer. To be effective, you need to be able to sit down at your computer and build a program without having something giving you tips every step of the way.

Start with this book to teach you the basics of Object Oriented programming:
http://www.amazon.com/Introduction-Java-Programming-Comprehensive-Edition/dp/0132130807/ref=sr_1_3?ie=UTF8&qid=1345145435&sr=8-3&keywords=y+daniel+liang

You should read this next. It is a very thorough and easy to understand explanation of data structures and algorithms:
http://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539/ref=sr_1_2?s=books&ie=UTF8&qid=1345145623&sr=1-2&keywords=data+structures+and+algorithms+in+java

I personally read a textbook chapter a day on a book covering whatever it is I want to learn at the time. To prepare for my current job, I read an 800-page book on ASP.NET in two weeks by just reading and following examples one chapter a day. You can do the same thing. Start at chapter one and just run straight through, one day at a time. Make sure you do a few of the programming exercises at the end of each chapter as well. Really visualize it as, "With each chapter I finish, I am becoming a better programmer." If you look at it with this mindset, it really starts to become fun.

You CAN find whatever you need on the internet, but there is so much information out there that it makes it hard to know which direction to go in. To most people this is paralyzing. You need concrete short and long term goals to maximize your education, and setting a goal of x amount of pages, or x amount of chapters a day/week is a very tangible and satisfying goal. Once you complete these two books, you'll be at a level where you can start specializing into whatever area you want.

u/insane_chocolate · 5 pointsr/compsci

Cambridge recommends its CS students read The New Turing Omnibus: 66 Excursions in Computer Science before starting their studies. It's quite interesting and gives a good intro to what CS is.

u/apocalypsemachine · 5 pointsr/Futurology

Most of my stuff is going to focus around consciousness and AI.

BOOKS

Ray Kurzweil - How to Create a Mind - Ray gives an intro to neuroscience and suggests ways we might build intelligent machines. This is a fun and easy book to read.

Ray Kurzweil - TRANSCEND - Ray and Dr. Terry Grossman tell you how to live long enough to live forever. This is a very inspirational book.

*I'd skip Kurzweil's older books. The newer ones largely cover the stuff in the older ones anyhow.

Jeff Hawkins - On Intelligence - Engineer and Neuroscientist, Jeff Hawkins, presents a comprehensive theory of intelligence in the neocortex. He goes on to explain how we can build intelligent machines and how they might change the world. He takes a more grounded, but equally interesting, approach to AI than Kurzweil.

Stanislas Dehaene - Consciousness and the Brain - Someone just recommended this book to me so I have not had a chance to read the whole thing. It explains new methods researchers are using to understand what consciousness is.

ONLINE ARTICLES

George Dvorsky - Animal Uplift - We can do more than improve our own minds and create intelligent machines. We can improve the minds of animals! But should we?

David Shultz - Least Conscious Unit - A short story that explores several philosophical ideas about consciousness. The ending may make you question what is real.

Stanford Encyclopedia of Philosophy - Consciousness - The most well known philosophical ideas about consciousness.

VIDEOS

Socrates - Singularity Weblog - This guy interviews the people who are making the technology of tomorrow, today. He's interviewed the CEO of D-Wave, Ray Kurzweil, Michio Kaku, and tons of less well known but equally interesting people.

David Chalmers - Simulation and the Singularity at The Singularity Summit 2009 - Respected Philosopher, David Chalmers, talks about different approaches to AI and a little about what might be on the other side of the singularity.

Ben Goertzel - Singularity or Bust - Mathematician and computer Scientist, Ben Goertzel, goes to China to create Artificial General Intelligence funded by the Chinese Government. Unfortunately they cut the program.



PROGRAMMING

Daniel Shiffman - The Nature of Code - After reading How to Create a Mind you will probably want to get started with a neural network (or Hidden Markov model) of your own. This is your hello world. If you get past this and the math is too hard use this

Encog - A neural network API written in your favorite language

OpenCV - Face and object recognition made easy(ish).
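If you want a "hello world" before touching any of those libraries, here's a sketch of about the smallest neural unit there is: a single perceptron learning the OR function in plain Python (my own toy example, not from the resources above).

```python
# A single perceptron learning the OR function: no libraries needed.
# Each training pass nudges the weights toward the correct output.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # -1, 0, or +1
            w[0] += lr * err * x1       # classic perceptron update rule
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

OR_DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
```

Since OR is linearly separable, the perceptron convergence theorem guarantees this converges; XOR, famously, would not, which is why real networks stack layers.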

u/Geilminister · 5 pointsr/artificial

On Intelligence by Jeff Hawkins is an amazing book on artificial intelligence. Hawkins' company has an open source project called [NuPIC](http://numenta.org/) that would be a good place to get some hands-on experience. It is Python based, and has a somewhat steep learning curve, so it might serve better as a beacon that you can work towards, rather than an actual project as of right now.

u/Drcool54 · 5 pointsr/UIUC

Okay, I came into school like you with very little programming experience. Probably even less than you, since I only messed around on my TI. I am going to assume you're only taking ECE110 first semester. If not, I recommend getting in as soon as you can. They may give you some crap about it depending on last names, but it doesn't really matter. After a certain point it's open to everyone.

Either way, programming in ECE doesn't really start until you take ECE190, which is all C programming and a very simplified assembly language for educational purposes. Like I said, I went into the class with practically zero programming experience and still did very well, so don't let anyone scare you on that. If you put the time aside to read the book (really helpful in 190), do your MPs, and ask the TAs questions, you will do fine.

I wouldn't fret too much over the summer with learning stuff, but I would definitely recommend C over Python. Python is pretty easy to pick up, but its also very high level. If you need an introductory language to get familiar you can try python for a bit, but I'd go with C after that. It is worth noting that the other two required programming class you have to take (CS 225 and ECE 391) are C++ and C/x86 respectively. So learning C should definitely be your focus.

I recommend the book written by the creators of the language. The book the school requires is pretty good too, actually, and would give you a better idea of what to expect. They're kind of pricey, so it's your call how you want to get them. As a heads up, Codecademy does have Python, but not C as far as I recall. I've never used Lynda, so I can't comment on it. C Book ECE 190 Book

I honestly wouldn't fret too much about it all. Enjoy your summer, depending on how busy your schedule is next semester you can probably set aside some time now and then to study some languages. If you have any more questions I'd be happy to answer.

u/speaktothehand · 5 pointsr/learnprogramming

A book called Software Engineering by Ian Sommerville.
It explains key concepts like scheduling, inter-process communication, synchronization, etc.
It was part of the recommended reading for my embedded systems course.

Ian Sommerville's website:
http://www.software-engin.com/

You should also use a book called Computer Organization and Design by David Patterson and John Hennessy, which explains computer architecture very well. They use the MIPS architecture as an example, but all the concepts are easily applied to other architectures.
http://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503

u/donnacav · 5 pointsr/AskComputerScience

Hi, I can highly recommend “Digital Design and Computer Architecture” by Harris & Harris!

https://www.amazon.com/Digital-Design-Computer-Architecture-Harris/dp/0123944244

Extremely approachable and well-constructed for someone with little background knowledge

u/Kiuhnm · 5 pointsr/MachineLearning

Take the online course by Andrew Ng and then read Python Machine Learning.

If you then become really serious about Machine Learning, read, in this order,

  1. Machine Learning: A Probabilistic Perspective
  2. Probabilistic Graphical Models: Principles and Techniques
  3. Deep Learning
u/equinox932 · 5 pointsr/Romania

Also check out fast.ai; they have 4 very good courses. This one is also good. Hugo Larochelle had a neural networks course, a bit older.

For books I would also add The Hundred Page Machine Learning Book and this one, probably the best practical book, but wait for the 2nd edition with TensorFlow 2.0; it has tf.keras.layers and the sequential model, so practically TF 2 includes Keras and you get rid of all that sessions junk. There would also be this, this, and this. Don't waste time on Bengio's deep learning book; it's a superficial mess. Happy studying, and let's see more Romanians with articles on ML and DL!

u/ElectricRebel · 5 pointsr/compsci

For compilers:

u/effernand · 5 pointsr/learnmachinelearning

When I started on the field I took the famous course on Coursera by Andrew Ng. It helped to grasp the major concepts in (classical) ML, though it really lacked on mathematical profundity (truth be told, it was not really meant for that).

That said, I took a course on edX, which covered things in a little more depth. As I was getting deeper into the theory, things became more clear. I have also read some books, such as,

  • Neural Networks, by Simon Haykin,
  • Elements of Statistical Learning, by Hastie, Tibshirani and Friedman
  • Pattern Recognition and Machine Learning, by Bishop

    All these books have their own approach to Machine Learning, and I think it is particularly important that you have a good understanding of Machine Learning, and its impact on various fields (signal processing, for instance), before jumping into Deep Learning. After almost three years of dedicated study in the field, I feel like I can walk a little by myself.

    Now, as a begginer in Deep Learning, things are a little bit different. I would like to make a few points:

  • If you have a good base on maths and Machine Learning, the algorithms used in Deep Learning will be more straightforward, as some of them are simply an extension of previous attempts.
  • The practical part in Machine Learning seems a little bit childish compared with Deep Learning. When I programmed Machine Learning models, I usually had small datasets and algorithms that could run on a simple CPU.
  • As you begin to work with Deep Learning, you will need to master a framework of your choice, which raises issues around data handling (most datasets do not fit into memory) and GPU/memory management. For instance, if you don't handle your data well, it becomes a bottleneck that slows down your code. So, compared with simple numpy + matplotlib applications, tensorflow APIs + tensorboard visualizations can be tough.

    So, to summarize, you need to start with simple, boring things until you can be an independent user of ML methods. THEN you can think about state-of-the-art problems to solve with cutting-edge frameworks and APIs.
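The "datasets do not fit into memory" point above is usually handled by streaming batches instead of loading everything at once; every framework has its own machinery for this (e.g. tf.data), but the core idea is framework-free. A minimal sketch:

```python
def batch_stream(sample_source, batch_size):
    """Yield fixed-size batches from any iterable of samples, so only
    one batch ever needs to sit in memory at a time."""
    batch = []
    for sample in sample_source:
        batch.append(sample)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch
```

Because `batch_stream` is a generator, the `sample_source` can itself lazily read records from disk, and the training loop just iterates over it: memory use stays bounded by one batch regardless of dataset size.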
u/JackieTrehorne · 5 pointsr/algotrading

This is a great book. The other book that is a bit less mathematical in nature, and covers similar topics, is Introduction to Statistical Learning. It is also a good one to have in your collection if you prefer a less mathematical treatment. https://www.amazon.com/Introduction-Statistical-Learning-Applications-Statistics/dp/1461471370

100x though, that's a bit much :) If you read and take notes effectively, you should only have to go through this book in any depth one time. And yes, I did spend time learning how to read books like this, and it's worth learning!

u/michaelstripe · 5 pointsr/gamedev

Here are some books I 'procured' that are pretty interesting and probably helpful

u/ferrx · 5 pointsr/gamedev

I am pretty similar to you in that I have a weak math foundation while being a computer scientist. For me it was just simply a lack of taking classes that I now feel I should have taken. Over the years since college I have been picking up random things as a pro and hobbyist. Here was something major I did to pick up 3D math quick (2-3 weeks):

3D Math Primer. I read through this, and implemented everything that I could in code. I created a Generic and highly param-based (i.e. n-dimensional) C# math library that has classes for Vector, Matrix, Complex Numbers, Quaternions, as well as a bunch of general math functions. (edit: Don't just read that book and expect to pull everything off, you should find yourself tearing through wikipedia entries for vector/matrix/etc and google results to get those classes packed full of cool methods, but that book was the root of everything relevant to my library).

I highly recommend this approach for programmers that feel they're lacking in math as it is fast, it works, and you end up with a math library that's probably better than anything readily available to you that you can use in your games.
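To make the idea concrete, here's the kind of class such a library starts from, sketched in Python rather than the C# described above, and only 3-D rather than n-dimensional:

```python
import math

class Vec3:
    """Minimal 3-D vector: the seed of the kind of math library
    described above (dot/cross products, magnitude)."""
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def dot(self, other):
        return self.x * other.x + self.y * other.y + self.z * other.z

    def cross(self, other):
        # Right-handed cross product.
        return Vec3(self.y * other.z - self.z * other.y,
                    self.z * other.x - self.x * other.z,
                    self.x * other.y - self.y * other.x)

    def length(self):
        return math.sqrt(self.dot(self))
```

Implementing each operation by hand like this, then checking it against a reference (e.g. the x-axis crossed with the y-axis should give the z-axis), is exactly the learning exercise the approach above is about.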

I should also mention that the more you read in this field of CS (physics books for gamers, rendering books, etc), the more you'll get used to esoteric looking math equations.

u/annihilatron · 4 pointsr/RimWorld

one of the things in his book is that a game should be a story that the player has some hand in (paraphrasing, a lot)

the tornado doesn't do squat for that. It's a cool concept, but it doesn't tell a story, it doesn't interact with the player, and there's no sense of reward or difficulty in preventing it, countering it, or recovering from it.

It's very much just a "YOU'RE SCREWED" stamp on the colony and you have to find a way to fix it.

u/TheJeizon · 4 pointsr/RimWorld

I picked up your book recently. I think this discussion just pushed it to the top of my reading list. More insight into how hidden mechanics lead to in-game results vs. human perceptions/internal thought processes will be good.

Keep on coding man, great game.

u/digitalfakir · 4 pointsr/Forex

It is like any other job, if not harder. You are entirely responsible for your decisions here: no boss to complain about, no sabotaging co-workers to blame. Just you and your decisions. And it will demand your devotion beyond a 9-5 job. You'll be on charts and reading analyses during weekends, trying to understand the political environment surrounding the instrument you are trading. And still, you may (or will) fail. Markets gonna do what markets gonna do. The only variable in your control is your reaction to it.

To get a feel of what kind of stuff you would be dealing with, check out some books that have a more rigorous foundation for trading:

  1. Evidence Based Technical Analysis

  2. Introduction to Statistical Learning

  3. Forecasting

  4. A Primer For The Mathematics Of Financial Engineering

    The last one is not too important for Forex, but it is necessary to better understand other financial instruments and appreciate the deeper foundations of Finance.

    I think books 1 & 2 are absolutely necessary. Consider these the "college textbooks" one must read to "graduate" in trading. Maybe throw in Technical Analysis of the Financial Markets, so you get the "high school" level knowledge of trading (which is outdated, vague, qualitative, and doesn't work). We are dealing with radical uncertainty here (to borrow a phrase from The End of Alchemy), and there needs to be some way for us to at least grasp the magnitude of what few uncertain elements we can understand. Without this, trading will be a nightmare.
u/ironnomi · 4 pointsr/learnprogramming

Assuming you are new to programming: https://www.edx.org/course/introduction-computer-science-mitx-6-00-1x-6j or for a dead tree resource: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418

If you are NOT new to programming, you'll want something like:

http://www.swaroopch.com/notes/python/

http://www.diveintopython.net/

and most likely you'll want to grab: Programming Python, 4th Ed by Mark Lutz - ultimately it's the main book you need after you learn Python. We also need a 5th Ed.

u/draeath · 4 pointsr/spaceengineers

They've got a prototype that's learned how to navigate a 2d maze that features doors requiring activation of switches that are in different parts of the maze, behind other doors.

They've got a prototype that learned how to manipulate rotors inside Space Engineers to allow a contraption to "walk."

Exciting things are coming, that's for sure.

u/kadhai · 4 pointsr/compscipapers
u/KatsuCurryCutlet · 4 pointsr/learnmath

Hmm, alright. Considering your background, I'd recommend giving Michael Sipser's Introduction to the Theory of Computation a read (I'm sure there are many electronic copies floating around on the Internet). It covers the prerequisite math concepts in a preliminary chapter, which I highly recommend you spend some time on. It then builds up notions of computation in increments, starting with finite state automata and adding features until it reaches the Turing machine. You can skip most of the exercises, since those are mostly for graduate students who need practice before undertaking research. If you ever get confused about concepts along the way, just drop me a PM or a question in /r/askcomputerscience and I'm sure the community would be happy to help out.

Also, if you're interested, I could mail you my copy of (meaning a copy that I bought some time ago, not that I wrote it) The Annotated Turing. It does a great job of explaining the concept of a Turing machine even for readers without a mathematical or CS background. I'd be more than happy to share my books with people who are interested, plus there's no use in me keeping it around now that I'm done with it.

Just bear in mind that, unlike most of science, the concepts here are very abstract; there aren't many direct physical implications. This really is a pure study of the notions at play, i.e. how one goes about studying "how to do things" and its implications. Details such as "how can such a machine exist with an infinite tape? what moves it? how does it implement its decision-making scheme?" are unimportant and ultimately inconsequential to the study itself.

Instead, what we care about are things like "I have a problem; is it possible for me to come up with a solution (algorithm) for it, or is it logically impossible?" or "I have come up with a way to make a 'computer'; can it do the things other computers can? If I had to make it sort an arbitrary set of numbers into numerical order, could my computer do it?". Turing machines are a tool to help us reason formally about these sorts of arguments, and to give insight into what we can qualify as "computation". Further down the line we even ask questions like "are some problems inherently more 'difficult' than others?" and "if I can solve problem B, can I somehow use the solution for B to solve some other problem A?"
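That last idea, using a solution for B to solve A, is what computer scientists call a reduction. A toy sketch of my own (not from the book): if you can sort, you immediately get a median-finder for free.

```python
def sort_numbers(xs):
    # Stand-in for "a solution to problem B" (sorting). Any correct
    # sorter could be swapped in here.
    return sorted(xs)

def median_via_sorting(xs):
    # Problem A (finding a median) reduced to problem B (sorting):
    # whoever solves B has, without extra work, solved A as well.
    ordered = sort_numbers(xs)
    return ordered[len(ordered) // 2]
```

The interesting direction in complexity theory is the contrapositive: if A is known to be hard, then B can't be easy, since an easy B would give an easy A through the reduction.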

Perhaps this all sounds perplexing now, but maybe just go through some content and spend time reading a little, and these should start to make a little more sense. Good luck with your future endeavors on this journey!

u/lazygraduatestudent · 4 pointsr/slatestarcodex

The most complete/modern book on it is Arora and Barak (a free draft is available online, but I think the published version is more complete). However, depending on your background, it might be too dense to start with, in which case maybe try Papadimitriou's book (though I'm less familiar with it, so I'm not confident in this recommendation).

Aaronson's book is probably more entertaining than either of these, but I don't think it can actually be used to learn the subject.

u/awj · 4 pointsr/programming

It may be a bit of a tough slog, but Sipser's Introduction to the Theory of Computation is great. The stuff on computability theory might be right up your alley, and even if you only make it through the chapter on deterministic finite automata, you will likely be better at crafting a regular expression than many of my CS student peers.

Surprisingly enough, the book should be able to help you make sense of that last sentence within 100 pages, requiring only a bit of understanding of proofs. I think if you've got predicate logic under your belt you pretty much have all you need.
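A deterministic finite automaton of the kind Sipser opens with fits in a few lines of Python. This is my own toy example, not from the book: a two-state DFA accepting binary strings with an even number of 1s.

```python
# Two states: "even" (the accepting start state) and "odd".
# Reading a '1' flips the state; reading a '0' keeps it.
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts_even_ones(s):
    state = "even"
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == "even"
```

The connection to the comment above is Kleene's theorem: the languages DFAs like this can accept are exactly the ones describable by (pure, theory-flavored) regular expressions.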

u/andlrc · 4 pointsr/vim

Mastering Regular Expressions is a really good book for those who seek to understand how regular expressions can be successfully applied, and how to avoid many of the pitfalls introduced by careless use.

For my day-to-day use it carries few benefits, though, as I usually only scan source code, and a somewhat crude regular expression is sufficient.

regular-expressions.info has a nice write-up of each flavor.

[1]: https://www.amazon.com/Mastering-Regular-Expressions-Jeffrey-Friedl/dp/0596528124
[2]: https://www.regular-expressions.info/

u/dlp211 · 4 pointsr/rutgers

I had an internship with Amazon during my Sophomore to Junior summer. I also received offers from Microsoft and Google to intern this upcoming summer (Junior to Senior), but instead took an offer from Fog Creek Software. I have friends that have interned or are full time at Microsoft, Google, and Amazon, all from Rutgers University.

My advice is to anyone looking to get one of these positions is:

  1. Start early, companies have only so many positions, and once they are taken, they stop looking. Generally this means you need to apply by November.

  2. Data Structures and Algorithms, know them inside and out, know their complexity, know how to implement them, know their tradeoffs, and know when to use them. A great book for someone who has never done any data structure stuff is Data Structures and Algorithms in Java. I took CS111 and read this book and was able to get through the Amazon interview.

  3. Read and do the exercises in Cracking the Coding Interview. Also use the author's resume template for making your resume.

  4. Interview every chance you get. Seriously, I interviewed at about 15 places before I interviewed with Amazon; by the time I got to the Amazon interview, I was fairly comfortable with the process. I was still nervous about the interview, but I knew generally what to expect and didn't get hung up on their curveball questions.

  5. Pick a single club, whether it be IEEE, USACS, RUMad, etc. and be deeply involved with it. You can be a member of more than one, but you should be really involved with one.

  6. Pick a language and know it. You aren't going to lose points because you don't know Python, or Ruby, or whatever else is the hot language this month. Java, C, C++: you should know one of these languages, and preferably two (C and then either Java or C++).

  7. And finally, the only way to really know a programming language is to use it, so program, program, program, and then program some more. While you're doing all this programming, you should take a few minutes out of your day to learn about source control (git or git, there are no other options :) ). Then put the cool stuff you make on github or some other source control website.

    This may seem like a lot because, well, frankly it is. But if you actually enjoy programming and computer science, then this is pretty straightforward and easy. And finally, don't get discouraged. Just because you didn't make it into one of these companies the first time you apply doesn't mean you'll never make it. Some people don't interview well (it is its own skill, hence #4), some people just can't build out a good resume (seriously, use the template that I provided and read Cracking the Coding Interview from front to back), and other people just aren't ready (you really need to program a lot). But that doesn't mean that you will never make it with them; just give it another year, identify your weakness, and work on it.
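Point 2 in the list above, knowing complexity and tradeoffs, is exactly what interviewers probe. One classic tradeoff, sketched in Python: membership testing in a list is O(n), while in a set it's O(1) on average, so the same duplicate-detection logic has very different running times.

```python
def has_duplicates_list(items):
    # O(n^2) overall: each membership test scans everything seen so far.
    seen = []
    for item in items:
        if item in seen:      # O(n) linear scan
            return True
        seen.append(item)
    return False

def has_duplicates_set(items):
    # O(n) on average: set membership is a hash lookup.
    seen = set()
    for item in items:
        if item in seen:      # O(1) average hash lookup
            return True
        seen.add(item)
    return False
```

Both return the same answers; being able to say *why* the second scales and the first doesn't (and when the hashing overhead isn't worth it for tiny inputs) is the kind of tradeoff discussion interviewers are after.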
u/ldf1111 · 4 pointsr/getdisciplined

Ah, I really thought you'd say something like that. I had a similar mindset when I started. IMO it's a good attitude to have, but it can be a bit of a curse, because the hardest material doesn't come easily and that can be demotivating (isn't that why you're here?). Programming is one of those things where you want to start simple and improve your knowledge over time. You can do your projects/games in any language, so why not pick one that makes your life easier when starting out? If you feel you're being held back by your first language, then by all means learn another one.

I'm not saying don't learn C++; it is a great language to learn if you can, and it will make you aware of many things that Java hides from you. I am saying it shouldn't be your first language: learn to walk before you run.

Here is the data structures book I read; it has served me very well. Not as 'comprehensive' or 'rigorous' perhaps as Introduction to Algorithms by Cormen, but it's really a great place to start. I find his style very easy to understand; it tries not to be too dry, whereas the Cormen book is a classic but so hard to understand.
amazon

pdf

u/notsointelligent · 4 pointsr/programming

If you like the video, you'll love the book!

u/JungianMisnomer · 4 pointsr/compsci

Someone's been reading Hawkins

u/wizardU2032 · 4 pointsr/gamedesign

The best book by someone who's been commercially successful is Designing Games, by Tynan Sylvester of Rimworld: https://smile.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937/

It is the best at actually applying all of the navelgazing people tend to do when talking about game design and art and theory and so forth towards actually creating compelling structures and content for games.

u/zxcdw · 4 pointsr/hardware
  • Learn how to program, in any language just to get the hang of the way to think algorithmically
  • Learn how your OS executes your programs on your CPU
  • Learn how to program in assembly language

    That's just the beginning, but even from that you can infer so much when it comes to matters related to hardware. The low-level details of how AMD or Intel implement their FPU or ALU, or the communication protocol of their memory controllers, etc. are utterly irrelevant for anything unless it's really something super specific. Studying such matters leads you nowhere, but actually having some understanding of how a computer operates in general, and actual experience of making the computer operate by programming it, is a huge deal.

    Obviously this doesn't help you reason about individual products based on some specific microarchitecture compared to another, but it creates the foundation for you to actually dig deeper into the subject.

    Really, there's so much to it. There are many subjects in computer science, electrical engineering and even software development that come into play. It's not about individual facts and rote learning, but about being able to generalize and synthesize ideas from your knowledge and enough critical thinking skills to recognize bullshit.

    But if you just need one piece of advice, here it is: read Computer Organization and Design, Fourth Edition by Patterson & Hennessy. Or really, any other book regarding the subject. ...and learn to program, for real.

    Also, Wikipedia has some good articles on many subjects, and it is a great source of pointers to further information. Some clever use of Google-fu can also get good course materials into your hands, or research papers, or just about anything. For example, using site:*.edu is your friend.
u/fuzzyPuppy · 4 pointsr/embedded

in college, my intro computer engineering class used Digital Design and Computer Architecture by Harris and Harris. It talks mostly about the PIC32, but the concepts are the same. It also covers FPGAs. The 1st edition doesn't include a chapter on C, so it might be better for you to get the second edition.

u/CheeseZhenshi · 4 pointsr/AskScienceDiscussion

Based on my understanding of what you're interested in, I would see if your college offers a Digital Logic Design class. It's all about how you use AND/OR/NOT gates to create more complex logic gates, and then using those to do things like math and storing values.
If you don't want to shell out the money for a class, or it's not available, the book we use in mine is this one.
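To get a feel for what such a class covers, here's a minimal Python sketch (my own illustration, not from the book mentioned) of building the basic gates out of NAND and composing them into a 1-bit full adder:

```python
# Toy illustration: everything built from NAND, then composed into a full adder.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum, carry_out)."""
    s1 = xor(a, b)
    return xor(s1, carry_in), or_(and_(a, b), and_(s1, carry_in))

# 1 + 1 + carry 1 = binary 11: sum bit 1, carry bit 1
print(full_adder(1, 1, 1))  # -> (1, 1)
```

Chain 32 of these full adders together (carry_out feeding the next carry_in) and you have the ripple-carry adder that such a class typically builds first.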

 

Also, if possible I would purchase a small FPGA. They're Field Programmable Gate Arrays - we use something like this in my labs, but there are some cheaper and smaller ones if you google for them. Basically they allow you to programmatically describe logic gates and run electrical signals through them, then output the results to various things like the LEDs - it really helps ground the knowledge in reality.

Fortunately Digital Logic Design is pretty much a starting point, so it doesn't have any pre-reqs (at my college), should you choose to take it.

u/crusaderblings2 · 4 pointsr/FPGA

Digital Design and Computer Architecture is my favorite plain-language start to digital design.

You start with transistors and logic gates, and move all the way up to assembly language and compilers. Basically all the knowledge you need to create a simple processor.

It's going to help you think of Verilog as what it is, a description of physical hardware on a chip, as opposed to a set of instructions. Programming is like giving someone directions on a map, HDL is like building the roads. An if statement in C is a set of instructions executed in order. An if statement in Verilog is saying "Place a multiplexer here, and wire it up to these other pieces of hardware."
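To make that contrast concrete, here's a toy Python model (my own sketch, not real HDL): the "hardware" version is a multiplexer whose inputs are both always wired up, and the select line merely picks one, while the "software" version executes only one branch at all:

```python
# Hardware-style 2:1 multiplexer: both inputs exist as wires at all times;
# select just routes one of them to the output, every cycle.
def mux2(select, a, b):
    return a if select else b

# Software-style if: only the chosen branch's work happens at all.
def branch(select, f_a, f_b):
    if select:
        return f_a()
    return f_b()

print(mux2(1, 42, 7))  # -> 42
```

The Python is only an analogy, of course - the point is that a Verilog `if` describes a physical selector sitting on the die, not a sequence of executed instructions.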

u/ginger_beer_m · 4 pointsr/dogecoin

If you just try to eyeball patterns from historical charts, I guarantee you will see them, because that's just what the brain has evolved to do: spotting patterns well (e.g. Jesus on a piece of toast), even when they're actually due to random chance. That's also why most of the so-called technical 'analysis' is bullshit.

Instead, approach this in a systematic and principled manner. You can try checking out this book to get an idea of what I'm talking about: Pattern Recognition and Machine Learning. This is the standard grad-level introduction to the field, but it might be rather heavy for some; an easier read is this one. You can find the PDFs of these books online through some searching, or just head to your local library. Approaching the problem from a probabilistic and statistical angle also lets you know the extent of what you can predict and, more importantly, what the limitations are and when the approach breaks down - which happens a lot, actually.
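As a quick illustration of the eyeballing problem (my own sketch, not from either book): a coin-flip random walk, which by construction contains no exploitable pattern, still routinely produces long monotone runs that look exactly like "trends" on a chart:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A coin-flip random walk: +1 or -1 each step, no real pattern by construction.
walk = [0]
for _ in range(1000):
    walk.append(walk[-1] + random.choice((-1, 1)))

# Find the longest monotone run - the "trend" a chart-reader's eye latches onto.
longest, current = 1, 1
for i in range(2, len(walk)):
    if (walk[i] - walk[i - 1]) == (walk[i - 1] - walk[i - 2]):
        current += 1
    else:
        current = 1
    longest = max(longest, current)

print("longest apparent trend:", longest, "steps")
```

Run it with different seeds: the "trends" appear every time, despite each step being a fair coin flip.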

TL;DR: predicting patterns is hard. That's why stats is the sexy new job of the century, alongside with 'data science' (hate that term uuurgh).

u/Jimbo_029 · 4 pointsr/ECE

Bishop's book Pattern Recognition and Machine Learning is pretty great IMHO, and is considered to be the Bible of ML - although, apparently, it is in competition with Murphy's book Machine Learning: A Probabilistic Perspective. Murphy's book is also supposed to be a gentler intro. With an ECE background the math shouldn't be too difficult to get into in either of these books. Depending on your background (i.e. if you've done a bunch of information theory) you might also like MacKay's book Information Theory, Inference and Learning Algorithms. MacKay's book has a free digital version, and MacKay's 16-part lecture series based on the book is also available online.

While those books are great, I wouldn't actually recommend just reading through them, but rather using them as references when trying to understand something in particular. I think you're better off watching some lectures to get your toes wet before jumping into the deep end with the books. MacKay's lectures (linked with the book) are great, as are Andrew Ng's that @CatZach mentioned. As @CatZach mentioned, Deep Learning has had a big impact on CV, so if you find that you need to go that route then you might also want to do Ng's DL course, though unlike the other courses this one isn't free :(

Finally, all of the above recommendations (with the exception of Ng's ML course) are pretty theory-driven, so if you are more of a practical person, you might like Fast.AI's free deep learning courses, which have very little theory but still manage to give a pretty good intuition for why and how things work! You probably don't need to bother with part 2 since it is more advanced stuff (and will be updated soon anyway, so I would try to wait for that if you do want to do it :))

Good luck! I am also happy to help with more specific questions!

u/Canoli85 · 4 pointsr/MachineLearning

Are you referring to Machine Learning: A Probabilistic Perspective? (link to Amazon)

u/awesome_hats · 4 pointsr/datascience

Well I'd recommend:

u/schreiberbj · 3 pointsr/compsci

This question goes beyond the scope of a reddit post. Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems.

In the meantime you can look at things like datapaths which are controlled by microcode.

This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away.

u/nosrednaekim · 3 pointsr/AskEngineers

I studied computer Engineering with a focus on Computer Architecture

A good book for the more advanced topics is Computer Organization and Design

It jumps in at a fairly deep level, so you'd better already have a working knowledge of microprocessor architecture and assembly language, state machines, etc

u/binary_is_better · 3 pointsr/AskEngineers

> let's say I want to do 5*5. How does it take 5, and multiply the same thing in binary

How this is done varies from processor to processor. MIPS is usually the easiest processor to understand; when I learned this stuff in school we started with MIPS. I work on ARM processors mostly now (smartphones), but at a high enough level that I don't worry about the type of details that you're asking about now.

000000 00100 00101 00010 00000 100000

In MIPS, that binary sequence means add two numbers together. So if the CPU saw that binary sequence, it would first look at the first six digits. This is called the op code. My memory of what these do exactly is fuzzy, so I'll leave it to someone else to answer. The next five digits tell the CPU to grab the binary digits that it is storing in register 4, and the next five digits tell the CPU to grab the binary digits it is storing in register 5. The next 5 digits tell the CPU that when it is done working with the numbers it should store the results in register 2. The next 5 digits are ignored for this example. The last 6 digits tell the CPU that it should add these numbers together.

If you previously stored the numbers 3 and 17 in registers 4 and 5, register 2 should now hold 20. (It's a different MIPS instruction to store a number, and yet another instruction to retrieve the number.)
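To make that field layout concrete, here's a small Python sketch that slices the 32-bit instruction above into its fields (the field names follow the usual MIPS R-type convention):

```python
# Decode the 32-bit MIPS R-type instruction from the example above.
word = int("00000000100001010001000000100000", 2)

def bits(value, hi, lo):
    """Extract bits hi..lo (inclusive) of a 32-bit word."""
    return (value >> lo) & ((1 << (hi - lo + 1)) - 1)

fields = {
    "opcode": bits(word, 31, 26),  # 0 -> R-type instruction
    "rs":     bits(word, 25, 21),  # first source register (4)
    "rt":     bits(word, 20, 16),  # second source register (5)
    "rd":     bits(word, 15, 11),  # destination register (2)
    "shamt":  bits(word, 10, 6),   # shift amount, unused here (0)
    "funct":  bits(word, 5, 0),    # 0b100000 = 32 -> add
}
print(fields)  # -> {'opcode': 0, 'rs': 4, 'rt': 5, 'rd': 2, 'shamt': 0, 'funct': 32}
```

This is exactly the bit-slicing that the CPU's decode stage does in hardware, just written out in software.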

I should note that most computer scientists never work at this low level of detail. If we want to add two numbers together and store the result, we just type "a = b + c;". That would take the number stored in location b, add it to the number stored in location c, and then store it in location a. We wouldn't care if a, b, or c are registers or in cache or in RAM. Those details are handled by the computer (well, the compiler), not us.

As to how the processor adds the numbers together, ask a hardware guy. I don't really remember, and to be honest I never really understood it well either.

If you want to delve deeper into this subject, this is a good book, but be warned it assumes you already have a decent grasp of computer science.

As for the second part of your questions it has to do with the number of cores and what they specialize in. CPU's generally have just a few cores. Maybe 1 to 8. They are also general purpose so they can do a lot of things and are very powerful. This monster video card from AMD has 2048 stream processing units on it. None of those processing units are very powerful, and they can really only do a few tasks (which just so happen to be the ones that graphics need). But it can do 2048 of them at a time verses 1 to 8 on a CPU. That's the difference between a CPU and a GPU.

Take the Mythbusters example. Their "GPU" can only paint the Mona Lisa, nothing else. But it can paint it very fast. The "CPU" could be programmed to paint anything. It just takes a lot longer to paint it. Actually, that's a bad example. A CPU will beat a GPU at almost everything. GPU's can only do a few tasks, but the tasks they can do they are much better at than the CPU.

u/ACoderGirl · 3 pointsr/IAmA

No idea about online resources, but this is the text my class on it used (for MIPS32 assembly -- should be pretty transferable to other RISC languages like ARM). The focus really is on the architecture of the CPU and how that relates to assembly. Which I feel is the important thing to learn, though. I feel like the text is enough. The lectures of that class didn't really do anything beyond what the text covered, anyway.

For exercises with assembly specifically, all the standard beginner programming problems can be used (like basics of looping and conditionals). Really, anything from a first-year textbook should be an interesting challenge in assembly simply because it's sooo much simpler. It's not like there's very much to learn, because assembly is pretty minimal with what it can do. Everything complex is just a shit ton of code (also why few would build anything large in just assembly). You could have your compiler compile, say, a C program to assembly (gcc's -S flag) to practice reverse engineering. It'll let you see what assembly your compiler is generating and try to understand it (it'll be more complex and optimized than human-written assembly). Or you could even grab a disassembler, disassemble some existing binary, and try to understand some small portion of its assembly to see what it's doing.

u/zhay · 3 pointsr/AskComputerScience

I'm pretty sure this book has some diagrams.

Here are some lecture notes that might help.

u/the_omega99 · 3 pointsr/learnprogramming

It's somewhat old and MIPS architecture (very applicable, although you probably wouldn't directly work with any MIPS hardware), but I thought Computer Organization and Design was pretty good. MIPS is a very good ISA for learning purposes, due to its simplicity. And there's a number of simulators available for trying to program stuff.

The book will not only build up assembly language, but also the very design of the processor and then some. And of course, that does include digital logic. For example, there's a section on the design of an ALU, as well as optimizations such as carry look-ahead. So you'd see things like one-bit adders as a set of basic logic gates (ANDs, ORs, etc.), while an ALU would be built with multiple one-bit adders, etc. At the CPU level, things will usually be a little higher-level, since we end up with multiple ALUs and numerous multiplexers (not to mention complicated subsystems like RAM and the registers). Overall, it's pretty good at managing the abstractions and specifics.

u/maredsous10 · 3 pointsr/ECE

[Digital Design and Computer Architecture, Second Edition by David and Sarah Harris](https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244)

Here's a general rundown of how the book builds up from low-level primitives to functional units to a CPU.
Read this paper: http://pages.hmc.edu/harris/research/mse13.pdf

> This paper has described the philosophy and content of a textbook on Digital Design and Computer Architecture. The text fills a niche in programs that appreciate the elegance of Patterson & Hennessy's Computer Organization and Design but wish to integrate digital logic with architecture in a single course or cohesive sequence.



Book Resources
http://booksite.elsevier.com/9780123944245/
Book Page https://www.elsevier.com/books/digital-design-and-computer-architecture/harris/978-0-12-394424-5

---

If you want to see another architecture look at RISC V. You can see the specifications at the riscv.org website and several implementations on github.
https://riscv.org/specifications/

https://github.com/ucb-bar/vscale

-----
Prior comments in a similar vein.
https://www.reddit.com/r/ECE/comments/3v35kj/im_an_ece_student_who_made_a_cpu_on_my_free_time/cxkcisx/
https://www.reddit.com/r/ECE/comments/37q1e9/embedded_systems_book_for_recent_grad/crp5br4/
https://www.reddit.com/r/electronics/comments/pagtp/processor_question/c3nx7qy/
https://www.reddit.com/r/ECE/comments/jebp3/help_finding_simple_resources_for_understanding/c2bftly/

u/lostchicken · 3 pointsr/AskElectronics

If you want something a little more hardware oriented and more aimed at the practicing engineer, this book has great reviews: http://www.amazon.com/Digital-Design-Computer-Architecture-Edition/dp/0123944244

(Full disclosure, I helped work on this book when I was a student.)

u/3242542342352 · 3 pointsr/learnprogramming

google Stackframe, calling convention, ISA

Also, for more insight on how cpus work: this

u/bonekeeper · 3 pointsr/cscareerquestions

No worries, glad to help. I bet you must be overwhelmed. It's hard to me to give you a clear path because I was "lucky" enough to start programming from an early age, an indirect influence from my father - the first programming language I learned was dBase when I was 12 years old and since then I never stopped (while my father still programs in Clipper to this day). Then I learned Clipper, Perl, PHP, JS, Python, C, Go. One side effect of my self-taught days is that I also skipped the CS fundamentals which I came to learn only much later - which in a way helped me appreciate their importance.

There's no way around it: if you want to be a great developer, not just someone looking for a quick easy buck, you'll need to learn the fundamentals. One book I greatly recommend is Computer Systems: A Programmer's Perspective - there's a cheaper "international version" of lesser quality (black and white, and cheaper paper). It is such a great book that I recommend buying the hardcover version because you will be referencing it forever. (As a side note, I bought the hardcover 2nd edition a long time ago and cheaped out with the "international version" for the 3rd edition, and I would go back and invest in the hardcover version if I could.)

Anyways, that's a great book that will give you a good grasp of some fundamentals - it will go from the basics of logic to how processors work (even creating your own 64-bit CPU) and then to assembly and C and the operating system. Knowing how the CPU actually works and executes code will help you learn lower-level languages (C, C++, Go, etc.) and how to be good at them (by knowing what is happening behind the scenes). It will take you a while to digest this book.

You'll leave this book knowing how to program in assembly and C. With that you could already get a job working with embedded programming or working with C, or invest another year to learn C++ - the second most-used language out there (only behind Java). Alternatively, instead of C++, you can focus on Java - although the marketplace competition will be greater with Java because every school teaches it, so the market is saturated.

In the short term, if your focus is to hit the ground running and get hired as a programmer (and then buy time to learn the stuff I mentioned before), Java is probably the quickest/surest way. It is easy enough that it is the language of choice on Computer Science courses and there are a lot of jobs out there looking for Java (most popular language). The learning curve to master it will be steep because it has a pretty rich and complex ecosystem, but it would be your best bet in the short run. Just go to "Indeed" and do a search for "title:python" versus "title:java" where you live and you'll see.

If you dedicate 2-3h religiously every day, and you feel that you have a knack for programming (in other words, it interests you even if you're not good at it, or at the very least you don't hate it), I think you could learn Java well in a year, enough to get an entry-level job. What you'll have to keep in mind, though, is that you're gonna be studying 2-3h every day for the next 10 years if you want to be a good engineer - there's no way around it, but if you really like it, it won't feel like work, it will be fun. Even today, after having been programming for almost 20 years, I still study 2-3h every day, recap stuff I have long forgotten, keep up with newer stuff, etc. So you'll never really stop studying, because in this field, if you stop, you fall behind quickly.

So yeah, by all means, learn HTML, CSS, JavaScript and Python, they're basic staples nowadays and every job expects you to know them (and JS and/or Python). Then look into Java if your short-term focus (and by short-term I mean, a year, a year and a half) is on getting an entry-level paid job. Keep in mind too that you'll never just learn Java - you'll have to learn about databases (at least MySQL or PostgreSQL and maybe something like MongoDB), you'll certainly be expected to know HTML/CSS/JS if you're gonna be involved with a web company (even if doing backend work and not frontend), etc... but you can learn these "extras" as you go by marginally including them in your projects.

u/Himmelswind · 3 pointsr/cscareerquestions

Some resources I found useful:

  • This Github repository is a really good overview. Although it doesn't exactly give a deep understanding of any particular topic, it's a really good way of understanding the system design "landscape". After reading this, I had a much better idea of what I needed to study more.
  • Designing Data-Intensive Applications is an awesome and thorough book that covers just about everything you need to know for a system design interview.
  • Maybe a bit obvious, but CTCI's system design chapter is useful (although not enough on its own).
  • It's in some ways a bit orthogonal to system design, but Computer Systems: A Programmer's Perspective gave me a much better idea of how the hell this machine I touch all day works. I think learning the things covered in here helped me speak with more confidence on system design.
u/michael0x2a · 3 pointsr/learnprogramming

A few books that maybe might help:

  • Nand2Tetris covers how to design a computer yourself starting with basic circuit gates all the way up to a working (basic) programming language.
  • Computer Systems: A Programmer's Perspective covers assembly, C, and some basic operating system stuff. You can find the table of contents here (pdf warning).

    First one's free; second one isn't. Their content overlaps, though the first one has more emphasis on implementing things from scratch whereas the second focuses on describing the way things currently exist.

    Note that both books assume you have some prior programming experience.

    You'll also want to learn about how operating systems work at some point, but that can come later.

    You're on your own regarding how circuit gates/transistors/etc are actually implemented in silicon -- I'm not super familiar with electrical engineering in general so don't feel comfortable recommending resources about that level of abstraction.
u/llFLAWLESSll · 3 pointsr/learnprogramming

Since you know Java I would suggest that you read one of the best programming books ever written: [K&R The C Programming Language] (http://www.amazon.com/The-Programming-Language-Brian-Kernighan/dp/0131103628/). This book was written by the people who made the C language, and it is a must-read for every C programmer. [Computer Systems: A Programmer's Perspective (3rd Edition)] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/013409266X/) is a great book to learn about computer systems. But I would recommend [Operating Systems Design and Implementation (3rd Edition)] (http://www.amazon.com/Operating-Systems-Design-Implementation-Edition/dp/0131429388) because it has some Minix source code which will go really well with learning C.

Best of luck buddy :)

u/schmook · 3 pointsr/brasil

Actually I'm a physicist. I think adopting a Bayesian perspective is more common among physicists than among mathematicians or even statisticians. Maybe because of the influence of Edwin T. Jaynes, who was a physicist. Maybe because of the connection with information theory and the tempting connection with thermodynamics and statistical mechanics.

My interest in the Bayesian perspective started because of the research group where I did my PhD. My advisor and co-advisor are strongly Bayesian, and my PhD advisor's brother is a fairly well-known researcher on the epistemological foundations of Bayesian theory (the Uruguayan physicist Ariel Caticha).

There are several good books on Bayesian probability; it depends a lot on your interests.

The first book I read on the subject was Jaynes's own - Probability Theory: The Logic of Science. It's a somewhat controversial book because it adopts a very strong epistemological stance and argues for it quite aggressively.

A somewhat alternative view, closely connected with information theory and also strongly epistemological, can be found in Lectures on Probability, Entropy and Statistical Physics by Ariel Caticha (free here: https://arxiv.org/abs/0808.0012). I was a PhD student of Ariel's brother, Nestor Caticha. Both have a fascinating view of probability theory and information theory and of their implications for physics and science in general.

Those books are more epistemological and theoretical, and much less useful for applications. If you're interested in applications, there's the famous BDA3 - Bayesian Data Analysis, 3rd edition - and also Doing Bayesian Data Analysis by John Kruschke, which has examples in R.

There's also a very introductory little book called Bayesian Methods for Hackers by Cam Davidson-Pilon (free here: https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers) that uses examples in Python (pymc). It's quite basic, good for learning applications of Bayesian probability.

The book All of Statistics by Larry Wasserman also has an introductory section on Bayesian inference.

If you're interested in artificial intelligence, another very nice book is by the recently deceased British physicist David MacKay - Information Theory, Inference, and Learning Algorithms (free here: http://www.inference.phy.cam.ac.uk/mackay/itila/). That book was my first contact with machine learning and it's really good.

Other nice machine learning books that take a Bayesian perspective are Bayesian Reasoning and Machine Learning (David Barber) and the textbook that has been most widely used in the field, Machine Learning: a Probabilistic Perspective (Kevin Murphy).



u/DrGar · 3 pointsr/statistics

Try to get through the first chapter of Bickel and Doksum. It is a great book on mathematical statistics. You need a solid foundation before you can build up.

For a less rigorous, more applied and broad book, I thought this book was alright. Just realize that the more "heavy math" (i.e., mathematical statistics and probability theory) you do, the better prepared you will be to face applied problems later. A lot of people want to jump right into the applications and the latest and greatest algorithms, but if you go this route, you will never be the one greatly improving such algorithms or coming up with the next one (and you might even run the risk of not fully understanding these tools and when they do not apply).

u/thundergolfer · 3 pointsr/learnmachinelearning

I heard that the newer Machine Learning: A Probabilistic Perspective is equally good, and from the small amount I've read so far I'd agree.

u/throwdemawaaay · 3 pointsr/algorithms

A good lightweight introduction: Programming Collective Intelligence

If you'd like a single, but more difficult, reference that covers much of the breadth of machine learning: Machine Learning: A Probabilistic Perspective

u/mr_dick_doge · 3 pointsr/dogecoin


>There have been some excellent trading opportunities with returns as high as 30% to your overall portfolio! Crypto is providing big returns that are uncommon in traditional markets.

I guess you have good intentions, Mr. Hustle, but I'd hate to see the kind shibes here being taken advantage of again. You should be more objective and also warn people that they can just as easily lose that much money when trading, especially when they don't know what they are doing initially.

And the effectiveness of technical 'analysis' is a highly debatable issue. I'd just leave this quote from Wikipedia:

> Technical analysis is widely used among traders and financial professionals and is very often used by active day traders, market makers and pit traders. In the 1960s and 1970s it was widely dismissed by academics. In a recent review, Irwin and Park[13] reported that 56 of 95 modern studies found that it produces positive results but noted that many of the positive results were rendered dubious by issues such as data snooping, so that the evidence in support of technical analysis was inconclusive; it is still considered by many academics to be pseudoscience.[14] Academics such as Eugene Fama say the evidence for technical analysis is sparse and is inconsistent with the weak form of the efficient-market hypothesis.[15][16] Users hold that even if technical analysis cannot predict the future, it helps to identify trading opportunities.[17]

...

> Whether technical analysis actually works is a matter of controversy. Methods vary greatly, and different technical analysts can sometimes make contradictory predictions from the same data. Many investors claim that they experience positive returns, but academic appraisals often find that it has little predictive power.[51] Of 95 modern studies, 56 concluded that technical analysis had positive results, although data-snooping bias and other problems make the analysis difficult.[13] Nonlinear prediction using neural networks occasionally produces statistically significant prediction results.[52] A Federal Reserve working paper[21] regarding support and resistance levels in short-term foreign exchange rates "offers strong evidence that the levels help to predict intraday trend interruptions," although the "predictive power" of those levels was "found to vary across the exchange rates and firms examined".

I'm not saying not to take coaching from DogeHustle, just that if people want to do it, they should be aware of its limitations too and have fun doing it with their disposable money only. As an alternative, I strongly suggest shibes who want to try predicting the future based on pattern analysis do it in a principled manner and learn math, stats and machine learning. It won't be easy, but it will have wide application beyond trading (so-called data 'science' is the hot job nowadays). It will also teach you the limitations of such methods, and when they might fail, especially in such a manipulated market as crypto. This is a good book to start with:

http://www.amazon.co.uk/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020

u/weelod · 3 pointsr/artificial

piggybacking on what /u/T4IR-PR said, the best book to attack the science aspect of AI is Artificial Intelligence: A Modern Approach. It was the standard AI textbook when I took the class and it's honestly written very well - people with a basic undergraduate understanding of cs/math can jump right in and start playing with the ideas it presents, and it gives you a really nice outline of some of the big ideas in AI historically. It's one of the few CS textbooks that I recommend people buy the physical copy of.

Note that a lot of the field of AI has been moving more towards ML, so if you're really interested I would look into books regarding that. I don't know what intro texts you would want to use, but I personally have copies of the following texts that I would recommend

  • Machine Learning (Murphy)
  • Deep Learning Book (Goodfellow , Bengio)

    and to go w/ that

  • All of Statistics (Wasserman)
  • Information Theory (Mackay)

    for some more maths background, if you're a stats/info theory junky.

    After all that, if you're more interested in a philosophy/theoretical take on AI then I think Superintelligence is good (I've heard?)
u/bailey_jameson · 3 pointsr/MachineLearning
u/ephrion · 3 pointsr/haskell

I don't think type theory is what you're looking for. Type theory (and programming language theory) are mostly interesting from the perspective of a language designer or implementer. If you're just looking to upgrade your Haskell skills, then focusing on specific libraries and techniques will be faster.

With that said, here's my type theory track:

  • Type Theory and Formal Proof is a fantastic introduction to the lambda calculus, starting with the simple untyped lambda calculus and exploring each of its additions. It's very friendly to novices, and includes a guide to Greek letters and an introduction to sequent notation (the weird horizontal-bar logical notation). Ultimately, it develops the Calculus of Constructions with Definitions, which can be used to prove things using constructivist logic.
  • Types and Programming Languages is a good read after that. It also starts with the untyped lambda calculus, and develops extensions on top of it. You'll implement a type checker and interpreter for the simply typed lambda calculus, and you'll add subtyping, record types, type inference, objects (!!!), parametric polymorphism, and existential types.
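For a taste of the kind of thing TAPL has you build, here's a toy type checker for the simply typed lambda calculus - a Python sketch of my own (the tuple representation of terms and types is made up for illustration, not the book's):

```python
def typecheck(term, env=None):
    """Terms: ("var", name) | ("lam", name, argtype, body) | ("app", f, x)
    Types: "Bool" | ("->", t1, t2). Returns the type or raises TypeError."""
    env = env or {}
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":                       # \x:T. body  has type  T -> typeof(body)
        _, name, argty, body = term
        return ("->", argty, typecheck(body, {**env, name: argty}))
    if tag == "app":                       # f x: f must be an arrow whose domain matches x
        fty, xty = typecheck(term[1], env), typecheck(term[2], env)
        if fty[0] == "->" and fty[1] == xty:
            return fty[2]
        raise TypeError(f"cannot apply {fty} to {xty}")
    raise TypeError(f"unknown term {term}")

# \x:Bool. x  has type  Bool -> Bool
ident = ("lam", "x", "Bool", ("var", "x"))
assert typecheck(ident) == ("->", "Bool", "Bool")
# applying it to a Bool variable gives back Bool
assert typecheck(("app", ident, ("var", "b")), {"b": "Bool"}) == "Bool"
```

Ill-typed terms (say, applying a Bool to a Bool) raise a TypeError, which is the whole point of the exercise; the book then walks you through adding subtyping, inference, and the rest on top of a core like this.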
u/f-algebra · 3 pointsr/AskComputerScience

This is what everyone will say (and they're right): Types and Programming Languages, it should be on every CompSci's shelf.

When you've learnt System-F read this: Calculus of Constructions, because it's beautiful, and subsumes System-F.

EDIT: Also, after learning System-F, read this: Recursive types for free, because it's cool.

u/thecity2 · 3 pointsr/MachineLearning

I would recommend Elements of Statistical Learning (the "ESL" book) for someone with your level of knowledge (they have an easier Intro book "ISL", but seems you could probably head straight for this):

http://www.amazon.com/Elements-Statistical-Learning-Prediction-Statistics/dp/0387848576/ref=sr_1_1?ie=UTF8&qid=1463088042&sr=8-1&keywords=elements+of+statistical+learning

u/hell_0n_wheel · 3 pointsr/Cloud

Machine learning isn't a cloud thing. You can do it on your own laptop, then work your way up to a desktop with a GPU, before needing to farm out your infrastructure.

If you're serious about machine learning, you're going to need to start by making sure your multivariate calculus and linear algebra is strong, as well as multivariate statistics (incl. Bayes' theorem). Machine learning is a graduate-level computer science topic, because it has these heady prerequisites.

Once you have these prereqs covered, you're ready to get started. Grab a book or online course (see links below) and learn about basic methods such as linear regression, decision trees, or K-nearest neighbor. And once you understand how it works, implement it in your favorite language. This is a great way to learn exactly what ML is about, how it works, how to tweak it to fit your use case.
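As a concrete example of the "implement it in your favorite language" step, here's a minimal one-variable linear regression via the closed-form least-squares solution - my own sketch, plain Python, stdlib only:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # closed-form solution: slope = cov(x, y) / var(x)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# data generated from y = 2x + 1, so the fit should recover those numbers
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
assert abs(slope - 2.0) < 1e-9 and abs(intercept - 1.0) < 1e-9
```

Writing even something this small yourself makes it obvious what the library versions are doing under the hood, which is exactly the point of the exercise.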

There's plenty of data sets available online for free, grab one that interests you, and try to use it to make some predictions. In my class, we did the "Netflix Prize" challenge, using 100MM Netflix ratings of 20K different movies to try and predict what people like to watch. Was lots of fun coming up with an algorithm that wrote its own movie: it picked the stars, the genre and we even added on a Markov chain title generator.

Another way to learn is to grab a whitepaper on a machine learning method and implement it yourself, though that's probably best to do after you've covered all of the above.

Book: http://www-bcf.usc.edu/~gareth/ISL/

Coursera: https://www.coursera.org/learn/machine-learning

Note: this coursera is a bit light on statistical methods, you might want to beef up with a book like this one.

Hope this helps!

u/kylebalkissoon · 3 pointsr/algotrading

You're a savage, reading sheets of dead trees with ink squirted upon them...

http://www.amazon.com/The-Elements-Statistical-Learning-Prediction/dp/0387848576

Be careful about the editions, as you need to make sure it's the Jan 2013 printing to be up to date.

u/Kenark · 3 pointsr/gamedesign

I highly recommend Designing Games: A Guide to Engineering Experiences.

It's a hard-to-describe book, but it's worth a read. For one, he defines a video game as a set of mechanics that interact with one another to create an experience - something unique to our medium: storytelling through mechanics interacting with one another and creating a fiction within your own head.

The game he's creating right now, Rimworld, applies that concept and simulates a living, breathing colony with pawns that have likes and dislikes, strengths and weaknesses. They have jobs they want to do, jobs they'll do only if they have to, and jobs they won't do at all. You set a list of priorities for your colony and let things play out, with no (practical) way of controlling individual pawns directly.

They also simulate relationships within the game and the pawns will remember interactions with one another. They will dislike one another if they're insulted and they'll break if a loved one dies. They'll visit the graves of people who died years/seasons ago.

All these mechanics interact with each other to create a story in your head that's different with every colony you start. That kind of storytelling is unique to our medium, he says. So that's how I can best describe the first half of the book.

The second half of the book is more about the iterative process of creating the game itself: creating iterative loops where you add in features, polish, and then loop again until release. It's a harder half to summarize briefly, but just as important as the design process itself.

u/KenFlorentino · 3 pointsr/gamedev

Fellow enterprise developer turned manager here. Me and my cohort are about to release our first title. It was developed using .NET/C#.

AMA. :)

I'll start with the questions you have above.

Assuming you already have a solid foundation in OOP, Design Patterns, some basic RDBMS, etc, you actually already have 60% of what you need. Code is code.

The other 40% depends on the type of game you are making. 2D? Basic algebra. 3D? Now it gets tougher on the math (though thankfully today's engines do most of the heavy lifting for you; you still need to understand what is used for what).

Doing multi-player? Now networking is the tricky part because you are likely to use some sort of UDP communication layer and all the REST/SOAP you learned at work, while still useful for managing latency-agnostic stuff like player lists, matchmaking requests and such, won't cut it for real-time multi-player games. Writing solid "netcode" that delivers a great experience at 60+ FPS requires some creativity in managing perception (extrapolation and interpolation when latency is present) and fault-tolerant algorithms. It is no fun when you get a headshot in an FPS, see it happen, but your opponent runs away, apparently unscathed.

As far as graphics, I solved that one easily... I had a friend join my project who was the graphics guy. I provided the framework for doing the graphics and turned that area over to him. He went above and beyond though and learned shaders and added all sorts of special effects.

Meanwhile, I focused my energy on the game engine, networking layers, AWS cloud stuff, matchmaking and lots of behind the scenes stuff.

The other thing I did was read as much as possible about Game Design. I ordered a dozen books from Amazon, including my absolute favorite Designing Games by Tynan Sylvester, the developer of RimWorld (link: https://www.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937).

Hope that helps!



u/Ponzel · 3 pointsr/gamedev

Since you mentioned Rimworld: Tynan, the creator of Rimworld has a gamasutra post and a book about how he designs games. (Spoiler: It's all about the story experienced by the player).

I can tell you about the thought process for my colony simulator:

  1. I want to have a prototype as fast as possible, so the system should be as simple as possible.
  2. The focus of the game is the colonists, their personalities, and their emotions when something good or bad happens.

    Therefore I only have a handful (~10) of resources, which are not even items on the map but are simply counted in the UI, like in a strategy game. If you're looking for inspiration, you can download it for free on the website.

    For your game, I think you could first think about what the focus is in your game. Do you want the player to spend more time managing resources, handling colonists, building stuff, or defending the colony? Then plan around your focus. Hope this helps you :)
u/AIIDreamNoDrive · 3 pointsr/learnmachinelearning

First 6 weeks of Andrew Ng's [basic ML course](https://www.coursera.org/learn/machine-learning), while reading Intro to Statistical Learning, for starters (no need to implement exercises in R, but it is a phenomenal book).

From there you have choices (like taking the next 6 weeks of Ng's basic ML), but for Deep Learning Andrew Ng's [specialization](https://www.coursera.org/specializations/deep-learning) is a great next step (to learn CNNs and RNNs). (The first 3 out of 5 courses will repeat some stuff from the basic ML course; you can just skip thru them.)
To get into the math and research, get the Deep Learning book.

For Reinforcement Learning (I recommend learning some DL first), go through this [lecture series](https://www.youtube.com/watch?v=2pWv7GOvuf0) by David Silver for starters. The course draws heavily from this book by Sutton and Barto.

At any point you can try to read papers that interest you.

I recommend the shallower, (relatively) easier online courses and ISLR because even if you are very good at math, IMO you should quickly learn about various topics in ML, DL, RL, so you can home in on the subfields you want to focus on first. Feel free to go through the courses as quickly as you want.

u/_iamsaurabhc · 3 pointsr/AskStatistics

The best to start with Theoretical understanding would be: The Elements of Statistical Learning: Data Mining, Inference, and Prediction

If you prefer to understand along with computation implementation, go with this: An Introduction to Statistical Learning: with Applications in R

u/quentinp · 3 pointsr/gamedev

This was a great help to me:

3D Math Primer for Graphics and Game Development

After digging into it enough I finally felt I had an understanding of Quaternions and Frustums, among other important 3D concepts. (Forced myself to do the math problems and such.)

u/zrrz · 3 pointsr/gamedev

> To simply invert the direction of a vector it takes me hours to figure out which formula to use.

Do you mean like

v' = v*-1

A vector is simply a list of scalars representing a displacement. You can just use basic algebra on it.

I learned most of the math I use in games from: https://www.amazon.com/Primer-Graphics-Development-Wordware-Library/dp/1556229119
There are later editions that you can get at varying prices. This book goes way more in depth than I needed to be able to use a game engine's math, or to write my own simple vectors, matrices, quaternions, or simple physics stuff.

The really important thing is to get a strong understanding of how to use vectors and quaternions - and depending on the library/engine, matrices.
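A minimal sketch of those basic vector operations in Python (my own toy code, not from any engine or the book), showing that inverting a direction really is just multiplying by -1:

```python
import math

def scale(v, s):
    return [s * c for c in v]

def negate(v):
    # inverting a vector's direction is just scaling by -1
    return scale(v, -1)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(v):
    return math.sqrt(dot(v, v))

def normalize(v):
    # rescale to unit length, keeping the direction
    return scale(v, 1 / length(v))

v = [3.0, -4.0]
assert negate(v) == [-3.0, 4.0]   # direction flipped
assert length(v) == 5.0           # classic 3-4-5 triangle
```

Engines wrap this in vector classes with operator overloading, but the underlying algebra is exactly this simple.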

u/napperjabber · 3 pointsr/gamedev

Looks awesome!

This book is amazing. It goes through all the basics of the math you need.

u/krizo · 3 pointsr/programming

This book is a good one to have around as well.

written by a former professor of mine.

u/MrBushido2318 · 3 pointsr/gamedev

On the math side I read 3d Math primer for graphics and games as it was one of the only 3d math books available for the kindle. It is a pretty easy read and covers a lot of subjects. I'm considering picking up Essential Mathematics for Games and Interactive programming as a refresher if it ever gets released as an ebook since I've seen it be recommended a few times and it is a slightly more recent text.

As for the 3d programming side, it really depends on what you want to do. I chose to program in opengl and picked up the opengl superbible, which I would recommend. I've not touched directx yet, but have heard good things about Practical Rendering and Computation in Directx11 though I don't think this is intended as an introductory text for directx, but as a reference for those familiar with 3d already.

u/timostrating · 3 pointsr/Unity3D

TL;DR

Take a look at spaced repetition. Study without any music and use the absence of music as a check to see if you are still motivated to do your studying.


I fucked up the first part of my education too. Luckily I realized that and got motivated again before I finished school.

I am currently 19 years old and I also always loved math (and some physics). I am from the Netherlands, so our education system does not really translate well to English, but I was basically in high school when I only did things that interested me. I got low grades on everything else.

One moment in high school really stayed with me, and I have only now realized what was happening. In high school I had a course on the German language. I already had a low grade for that class, so I told myself to study extremely hard for the next small exam. The exam was pretty simple: the task was to learn 200 to 250 German words. So I took a piece of paper and wrote down all 250 words 21 times. 1 or 2 days later I had the exam. But when I got my grade back, it said that I had scored an F (3/10). I was totally confused, and it only destroyed my motivation more and more.
What I have now come to realize is that learning something is not just about cramming a book into your head as fast as possible.


So these are some tips I wished I could have give myself in the first year of highschool:

Go and sit in a quiet room or in the library. This room should be in total silence. Now start your studying. As soon as you feel the urge to put on some music, you should stop, reflect, and take a little break.

The default in nature is chaos. Learn to use this to your advantage. I sit on a bus for 2+ hours a day, 1 hour to school and 1 hour back. Nearly every student does nothing in this time, so I made a rule for myself to do something productive in that time, like reading a book. Normally I am just at my desk at home and before I know it it is already midnight, so this is, for me at least, a really good way to force myself to start reading a book in those otherwise wasted 2 hours.

Get to know your body and brain. I personally made a bucket list of 100 items that includes 10 items about doing something for a week, like running at 6am for a week or being vegan for a week. Fasting is also really great: just do it for 1 day, so only drink water for one day and see how you feel. And try the same with coffee, sex, fapping, and alcohol. Quit for 1 day and see how you feel, and set the goal of quitting once for a whole week straight.

Watch this video to get a new view on the difference between low and high energy. I never understood this, but I think everybody should know about the difference: https://youtu.be/G_Fy6ZJMsXs <-- sorry, it is 1 hour long, but you really should watch it.

Learn about how your brain stores information and how you can improve upon this. Spaced repetition is one of those things that really changed the way I look at learning something. https://www.youtube.com/watch?v=cVf38y07cfk


I am currently doing my high school math again for fun. After I am done with that, I hope to start on these 3 books.

u/CharBram · 3 pointsr/excel

To learn SQL, start with this book:
http://www.amazon.com/Sams-Teach-Yourself-Minutes-Edition/dp/0672336073/ref=sr_1_1?ie=UTF8&qid=1407993921&sr=8-1&keywords=sql+in+10+minutes

Then once you need more ideas with SQL, go to this book:
http://www.amazon.com/gp/product/0596009763/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1

For Python, I would start with this book:
http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ref=sr_1_1?s=books&ie=UTF8&qid=1407994104&sr=1-1&keywords=python+programming

SQL may come almost naturally to you. For me at least, the basics of SQL came rather easily. With Python, expect to be a little lost, not with the programming concepts but with setting up your computer and getting Python packages installed, etc... Once you get all that done though, you will be golden.



u/psiph · 3 pointsr/learnjavascript

Hello Teyk5,

I'm the author of the post you referenced. I'm looking for suggestions on my next post. I'm curious if you'd want to see another start to finish tutorial or if you'd rather see a sequel to the original post, where I explain how to add on to and refactor an application. Also, are there any particular technologies you'd want to explore? Knockout.js, Angular.js, or Ember.js for example -- or do you just want to stick with the basics for now?

I know it might be weird to recommend a Python book here, but I found that I learned a lot from just the first few chapters of this book: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ I learned about how to organize my code and how to think through from the beginning to the end when creating a new program. I would highly recommend that book if you're at all interested in Python. All the knowledge you learn there you'll be able to transfer over to Javascript (besides some of the syntax of course).

For pure Javascript stuff, I'd strongly recommend just starting your own project. It can be incredibly difficult at first, but it's by far the best and fastest way to learn if you can force yourself to stick with it. I'd also recommend checking out these two tutorials:

u/Awarenesss · 3 pointsr/Python
u/prince_muishkin · 3 pointsr/math

Turing's paper on computability I think is quite good; it's very well presented here. The course on computability I'm doing now hasn't added much, but discussed computability in more generality. I think what helped my understanding of computability is seeing that you can really do the nitty-gritty details.

Can't remember where I found it, but Gödel's paper "On Formally Undecidable Propositions of Principia Mathematica and Related Systems" I found to be interesting in seeing the way my course in Logic could be done differently.

u/onetwosex · 3 pointsr/greece
u/lowlycollegestudent · 3 pointsr/compsci

I know that this is way more on the theory/mathematics side of the spectrum than CODE, but Charles Petzold also wrote a book called The Annotated Turing that I really enjoyed. He took Alan Turing's original paper on computability, which was about 30 pages, and annotated it until he had about a 400-page book. There were a couple of chapters in the middle that got a bit dense, but he did a fantastic job of making the subject more approachable and understandable.

u/lkh01 · 3 pointsr/compsci

I read The Annotated Turing by Charles Petzold while I was in high school and it really sparked my love for logic, math and computer science. So, as far as popular science books go, I can't not recommend it.

Right now I'm interested in programming languages, and I think TAPL is a great resource. The (relatively) new blog PL Perspectives is also pretty cool, and so is /r/ProgrammingLanguages.

u/philly_fan_in_chi · 3 pointsr/technology

Besides /u/basandpurr's comment, the power of quantum computers is that they can factor quickly (via Shor's algorithm). If you don't base your encryption scheme on a factoring or discrete log problem (RSA is out), then you are not any more susceptible than on a classical computer. Additionally, it is entirely possible (some argue likely) that both of these problems are actually solvable in polynomial time on classical computers, we just aren't smart enough yet to know how to do them.
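To make the factoring point concrete, here's a toy illustration of why fast factoring breaks RSA - textbook RSA with tiny numbers, my own sketch and absolutely not real crypto (the modular-inverse form of `pow` needs Python 3.8+):

```python
# Textbook RSA with tiny primes -- a toy, not real crypto.
p, q = 61, 53
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse; Python 3.8+)

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public pair (n, e)

# An attacker who can factor n -- which Shor's algorithm does quickly on a
# large enough quantum computer -- rebuilds the private key from scratch:
p2 = next(c for c in range(2, n) if n % c == 0)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))

assert pow(cipher, d2, n) == msg   # decrypted without ever being given d
```

With real key sizes the brute-force loop is hopeless classically, which is the entire security assumption; schemes not based on factoring or discrete logs dodge Shor entirely, as the comment above says.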

There's an excellent book called Quantum Computing Since Democritus that you should read to get a better understanding of where we're at. It's a very good read.

u/Poloniculmov · 3 pointsr/Romania

First of all, read this.

Also this

u/two_if_by_sea · 3 pointsr/math
u/rosulek · 3 pointsr/compsci

The selection of accessible textbooks is not great. I agree with the existing suggestion to look at the relevant chapters of Sipser.

I think you should check out Quantum computing since Democritus by Scott Aaronson. He is a very entertaining writer, and this might be the closest thing to Road to Reality for CS. Scott's primary research expertise is in quantum computing, though the book covers a ton of other interesting CS-theory topics.

If you want a sneak preview of what's in the book, it grew out of the lecture notes that are posted here.

And here are some other popularizations that are probably less technical than QCSD:

u/mcguire · 3 pointsr/programming

> does anyone have some readings on quantum computation in general

There's Quantum Computing Since Democritus, by Scott Aaronson, who is the author of the pretty decent Shtetl-Optimized blog. Mostly, it's about the power of quantum computing relative to classical computing, from the computability and complexity standpoint.

u/tryx · 3 pointsr/math

I would recommend (and I find that I recommend this book about every 3rd thread which should say something) a book on theoretical computer science. The book is all about the beautiful mathematics that underlie all of computing.

Computers keep getting faster and faster, but are there are any questions that we can never answer with a computer no matter how fast? Are there different types of computers? Can they answer different types of questions?

What about how long it takes to answer a question? Are some questions fundamentally harder than others? Can we classify different problems into how hard they are to solve? Is it always harder to find a solution to a problem than to check that a solution is correct? (this is the gist of the famous P=NP problem)
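That find-vs-check asymmetry can be made concrete with subset sum (a toy sketch of my own, not from the book): verifying a proposed subset is a one-liner, while the obvious way to find one tries exponentially many candidates.

```python
from itertools import combinations

def verify(nums, target, subset):
    # checking a proposed answer: one pass over the subset
    return subset is not None and sum(subset) == target

def find(nums, target):
    # finding an answer the obvious way: try all 2^n non-empty subsets
    for r in range(1, len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 34, 4, 12, 5, 2]
assert verify(nums, 9, find(nums, 9))   # a subset summing to 9 exists: (4, 5)
assert find(nums, 9999) is None         # and none sums to 9999
```

Whether the finding step can always be made as fast as the checking step is, in essence, the P=NP question.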

The book has essentially no prerequisites and will take you through (initially) basic set theory moving on to logic and then the fun stuff. Every proof is incredibly clearly written with a plain English introduction of what the author is about to do before the technical bits.

The only downsides are that it is quite expensive and that you might have trouble finding it unless you have access to a university library/bookshop.

Good luck with your love of mathematics!

Edit: lol the book... Introduction to the theory of computation - Sipser

u/space_lasers · 3 pointsr/answers

I'm sure any computational theory book will work for you. Here's the one I used: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973

It goes through deterministic and nondeterministic automata, context free grammars, turing machines, and all that stuff.

u/Rhomboid · 3 pointsr/learnpython

There are a million and one uses for regular expressions:

  • parsing, i.e. extracting structured data from a giant blob of text
  • searching for things that match a given pattern
  • reformatting text from one form to another
  • validating whether text matches a certain pattern

    If that sounds vague it's because it is. Again, regular expressions are a general tool that can be used in countless ways, and this really isn't a topic that can be explained in a paragraph. People have written entire books on the subject, and you aren't going to get a cogent summary in a reddit comment. (Once upon a time I tried to do that -- explain regular expressions in a reddit comment. It was very long, and people seemed to enjoy it, but I don't think I covered even a fraction of what you'd need to know to be able to use REs. And because of reddit's decrepit, shitty system, I have no way of finding that comment ever again because you can only view the last 1000 comments of someone's history and I post a thousand comments every few months, and this was several years ago. End rant.)

    My advice if you don't immediately see the power of the RE is to just let it go and go on with your day. You don't have to know everything. Eventually, you will run into some code that solves some task using a regular expression, and at that point you have something concrete to look at. Try writing a solution for the problem without using a RE, and compare it to the solution that does use a RE. Take apart the RE and learn how it works. Then you can compare the two solutions to see which one is better.
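For the curious, here's one tiny example of each of those four uses with Python's standard re module (my examples, picked to mirror the bullet list above):

```python
import re

log = "2023-04-01 ERROR disk full on /dev/sda1"

# 1. parsing: extract structured fields from a blob of text
date, level, message = re.match(r"(\S+) (\S+) (.*)", log).groups()

# 2. searching: find the first thing matching a pattern
device = re.search(r"/dev/\w+", log).group()

# 3. reformatting: rewrite ISO dates into US order
us_date = re.sub(r"(\d{4})-(\d{2})-(\d{2})", r"\2/\3/\1", date)

# 4. validating: does the whole string match a pattern?
is_iso = bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", date))

assert (date, level, device) == ("2023-04-01", "ERROR", "/dev/sda1")
assert us_date == "04/01/2023" and is_iso
```

Each of these would take noticeably more code without REs, which is usually the moment their value clicks.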
u/tsammons · 3 pointsr/PHP

Mastering Regular Expressions is necessary reading material for anyone who wants to dive deep into regex syntax. Once you're in the murky depths of atomic and lookaround hell, RegexBuddy is a great tool to evaluate your expression real-time.

Don't go too crazy into it though lest you invoke ZA̡͊͠͝LGΌ.

u/quick_code · 3 pointsr/learnjava

I was in the same position as you. I tried all the various sources, but the explanation in the 'Data Structures and Algorithms in Java' book helped me a lot to understand all the data structures and algorithms very clearly.


Tip:

  • Read about one algorithm or data structure topic.
  • close the book
  • Now try to write the same program on paper. (not on a laptop)
  • Notice the places where you got stuck while writing the program.
  • Open the book, compare your code with the code in the book, and see your mistakes.
  • Repeat till you can write the correct program without any help.


    This way you will improve super fast.


    Book link:

    https://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539
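As a concrete example of that practice loop on a classic: write binary search on paper from memory, then compare with the book's version. Here's a Python sketch (my own, not the book's Java) of what you should converge to - the loop condition and the ±1 updates are exactly the spots where people tend to get stuck when writing it blind:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:                  # <=, not <: a one-element range still needs checking
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1             # mid already ruled out, so move past it
        else:
            hi = mid - 1
    return -1

arr = [2, 3, 5, 8, 13, 21]
assert binary_search(arr, 8) == 3
assert binary_search(arr, 4) == -1
```

If your paper version differs from this in the loop condition or the update lines, that difference is precisely the mistake worth studying.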
u/Dylnuge · 3 pointsr/AskComputerScience

Might be biased, but I'm a big fan of Jeff Erickson's Algorithm Notes, which I think are better than a lot of textbooks.

If you really want a book, CLR Algorithms and The Art of Computer Programming both get recommended a lot, with good reason.

If you're interested in computational theory, the New Turing Omnibus and Sipser's Theory of Computation are two good choices.

Finally, I'd check out Hacker's Delight. It's a lot more on the electrical/computer engineering side of things, which might interest you, and it's very detailed while still being quite excellent.

u/jav032 · 3 pointsr/compsci

This might be frowned upon by others in /r/compsci for not being "serious enough", but I greatly enjoyed "The New Turing Omnibus: Sixty-Six Excursions in Computer Science". To quote an Amazon review " He illuminates the dark corners of abstract thought with practical puzzles and plain language."

I had a lot of fun reading this book when I started studying CS, and it definitely helped me later on. It covers almost everything in plain language. I recommend you at least read the beginning on amazon's free preview and see if you like it: http://smile.amazon.com/The-New-Turing-Omnibus-Excursions/dp/0805071660 (click on the book thumbnail on the left)

u/mmddmm · 3 pointsr/compsci

I like The New Turing Omnibus. It's kind of old, but gives a great overview over many interesting computer science topics. After reading it you should have a general sense what computer science is all about. Then you can dive into more specific books on topics that interest you.

u/monkeybreath · 3 pointsr/science

I'm an engineer, and did my Masters in voice compression. But I read Jeff Hawkins' On Intelligence a while back, and it was quite eye-opening (and a good read) about how the neocortex works and what its role is. I look at everything mind-related through this lens now.

u/stochasticMath · 3 pointsr/AskReddit

"processing power as human brains" is more difficult to quantify than one would think. The human brain is not some massively parallel simulator. The way the brain structures data, generates models, and the prediction/response loop require not simply raw processing power, but a different architecture.

Read Jeff Hawkins' On Intelligence

u/herrvogel- · 2 pointsr/ColorizedHistory

If anyone is really interested in his life and what he accomplished, I can recommend this book. It is basically his paper on Turing machines, explained step by step, plus some details on him as a person.

u/rowboat__cop · 2 pointsr/programming

If you liked “Code” I suggest you read his “Annotated Turing” next -- fascinating paper, fascinating book.

u/cryptocached · 2 pointsr/bsv

The Annotated Turing is a fairly approachable book. It contains the entirety of Turing's paper while providing contextually-relevant historical and mathematical background.

u/Wulibo · 2 pointsr/PhilosophyofScience

Examples of additional, unorthodox reasons to reject the basilisk on top of the obvious:

  • Someone actually doing something as stupid and evil as making the basilisk, who is also capable of doing so, is probably at most about as likely as the existence of God, even if you're a fairly strong Atheist. Anyone worried about being damned by a basilisk should feel safe knowing that, with a comparable probability, God will save you from that damnation (I've argued at length here and elsewhere that infinite utility/disutility breaks decision theory since infinity*0.000001=infinity, so even if you're really worried that you're the simulation, your expected amount of pain moving forward isn't very high).

  • Go read Turing's On Computable Numbers. There's a proof that goes something like this: some proposition x about computers can't be argued/proven/believed by computers, due to the nature of x and the nature of computers. In proving this, it becomes very obvious that x. Therefore, with mathematical certainty, the human mind as is cannot be simulated on any sort of computer, so you're not a simulation. I've simplified to the point of essentially being incorrect, so if you want the full proof, find yourself a copy of The Annotated Turing and come talk to me for the relevant detailed argumentation after you read it. In short, I'd consider it a theorem of set theory that humans are incomputable.

  • The basilisk can be modeled as a Pascal Mugger. First, to explain, the Pascal Mugger is a critique of Pascal's Wager whereby you imagine encountering an unarmed mugger, who threatens to cause some sort of disaster unless you hand your wallet over; no matter your priors, there exists some disaster serious enough that you should be willing to hand over your wallet no matter how unlikely you find the mugger's capacity to cause it. Most interestingly, if you simply walk away from the mugger, the mugger doesn't even have reason to carry out its threat if possible; it would have no way of convincing you beforehand, and has nothing to gain. Likewise, the basilisk has no reason to torture anyone who would never have helped it, ie someone who is committed to a decision theory where the basilisk's threat is empty. So, if you ever fail to get pascal mugged (and btw, pretty much everyone agrees that you should just walk away from the mugger), this should count for the basilisk simulating you (again, impossible) as a sufficient test result to decide not to torture you. In the interest of expediency, I'll let you know right here for unrelated reasons that I have the capacity to immediately cause all souls that have ever lived to wink out of existence. If you don't PM me within 5 minutes of reading this, and I'll know, with your bank details so that I can take all your money at my leisure, I will do so. I will also post that you did so directly to /r/philosophyofscience so others who feel like it can pascal mug you. If you're confused as to why I'd say something like this, reread this bullet point... ideally for more than 5 minutes.

    The fact that I need to go to such inane, ridiculous lengths to get past what I consider the "obvious" reasons to reject the basilisk should tell you how little you need to worry about this. It is only at this level of complexity that these objections to the basilisk stop being the obviously good-enough ones.
u/how_tall_is_imhotep · 2 pointsr/math

I highly recommend reading Quantum Computing Since Democritus to learn about BQP and other quantum computing stuff. Scott Aaronson has a gift for presenting technical topics in an entertaining way without getting sloppy like a lot of other pop-science writers.

u/FormerlyTurnipHugger · 2 pointsr/Physics

The good thing about quantum information is that it's mostly linear algebra, once you're past the quantization itself. The even better thing, though, is that you don't have to understand that in order to understand QI.

There are books written about quantum computing specifically for non-physicists. Mind you, those are written for engineers and computer scientists instead, who are supposed to know more maths and physics than you as well. Still, you could pick up one of those, e.g. the one by Mosca, or even better the one by David Mermin.

There are also two very new popular-science books on the topic, one by Jonathan Dowling, Schrödinger's Killer App, and one by Scott Aaronson, Quantum computing since Democritus.

u/Laboe · 2 pointsr/hardware

Well, to be fair to him, he has moderated his tone in recent years. I bought and read his excellent book about QC a few years ago and learned a ton of stuff about QC and just general computer science (and a whole lot of scientific trivia).

I've also followed his blog sporadically. In his latest blogpost he's more nuanced, and even allows himself to take the pose of being a cautious supporter. That's not where he was just a few years ago. As people in the comment section pointed out, he used to joke that his laptop would beat QC and that Google should just pay him millions of dollars instead.

The reason why he's moderating is that he can see the writing on the wall. He's right to be skeptical(we all should be), but as I pointed out, all of his contentions are already laid out by Google in their paper. So what does he do in his blogpost, exactly, that is new?

D-Wave should be understood as an ASIC in QC. It's not a generalized QC, but rather optimised for specific tasks. Troyer explains this for laymen, here.

We're still some ways off a generalized QC, but the breakthroughs that D-Wave are doing will help pave the way for such a QC and this is the whole point of why NASA/Google are buying their stuff and working with them. Google also has a separate team doing more fundamental research work.

Thus, Scott's flippant "pay me millions for my laptop" kind of commentary has become increasingly sparse throughout these past few years, to the point where he doesn't even attempt those jokes anymore, because the joke would be on him.

P.S. Re-reading my comment, I sound incredibly harsh on him. I respect the man; he's brilliant and his book on QC is awesome. But I am disappointed in how he has tried to avoid taking intellectual responsibility for his overt hostility in the past (which he has now moderated) and now tries to maintain he was always just a good-natured supporter who "tried to avoid hype". That's deceptive, and I don't like it when people refuse to own up to their past positions once they recognise they were wrong and the terrain has shifted.

u/tronadams · 2 pointsr/learnprogramming

I don't know about other schools but my CS program required discrete math and automata theory to complete the major. I really enjoyed automata theory but I can imagine it being kind of tough to get into outside of a classroom setting. Having said that, I would highly recommend this book if you're trying to learn some of this stuff on your own.

u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. Wizardry Compiled
  34. Youth
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex & yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/simplyJ2 · 2 pointsr/programming

As usual, Beautiful Code is late.
More Power to Cut-And-Paste!


http://www.amazon.com/Beautiful-Code-Leading-Programmers-Explain/dp/0596510047

u/dmazzoni · 2 pointsr/learnprogramming

The best way to learn to write great code is to read great code.

Check out Beautiful Code for a book on the subject.

Or, pick some of the most popular open-source projects on github and read through their code. Find some small utility functions and see how they implement them.

Or, post some of your own examples here on /r/learnprogramming or on stackoverflow and ask for tips on how to make your code cleaner.

u/adamcolton · 2 pointsr/Python

The book [Beautiful Code](http://www.amazon.com/Beautiful-Code-Leading-Programmers-Practice/dp/0596510047) has a great chapter (18) on Python dictionaries, how they work and how they fit into the language. Definitely worth the read.
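For a rough feel of what that chapter digs into, here's a toy hash table with open addressing in Python. This is a deliberately simplified sketch (no resizing, plain linear probing), not CPython's actual probing or collision strategy:

```python
class TinyDict:
    """Toy open-addressing hash table: hash the key to a slot, probe on collision.
    Illustrative only; CPython's dict uses a more sophisticated probe sequence
    and resizes as it fills."""

    def __init__(self, size=8):
        self.slots = [None] * size  # each slot is None or a (key, value) pair

    def _probe(self, key):
        """Return the slot index holding `key`, or the first empty slot for it."""
        i = hash(key) % len(self.slots)
        while self.slots[i] is not None and self.slots[i][0] != key:
            i = (i + 1) % len(self.slots)  # linear probing on collision
        return i

    def __setitem__(self, key, value):
        self.slots[self._probe(key)] = (key, value)

    def __getitem__(self, key):
        slot = self.slots[self._probe(key)]
        if slot is None:
            raise KeyError(key)
        return slot[1]

d = TinyDict()
d["spam"] = 1
d["eggs"] = 2
d["spam"] = 3   # overwrites in place, like a real dict
```

The probe-on-collision loop is the heart of it; everything else in a real dict (resizing, deletion tombstones, insertion-order preservation) is refinement on top of this idea.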

u/mschaef · 2 pointsr/programming

readscheme is a good place to start; it has a bunch of good links to papers on issues related to macros: http://library.readscheme.org/page3.html

(It also has lots of other material, but you asked about macros specifically, so that's the link I've posted.)



If you can buy one book, buy Lisp In Small Pieces. It's generally excellent, and has good coverage of macro implementation strategies.

http://www.amazon.com/Lisp-Small-Pieces-Christian-Queinnec/dp/0521545668/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1196347235&sr=8-1

Another good resource is the discussion of an implementation of syntax-case that's in "Beautiful Code": http://www.amazon.com/Beautiful-Code-Leading-Programmers-Practice/dp/0596510047/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1196346388&sr=8-1


u/one_way_trigger · 2 pointsr/learnprogramming

Hacking: The Art of Exploitation is really inexpensive on Amazon in hard copy. Beautiful Code is also on par with the ebook on their site. I'm not entirely sure how the program works, but someone further down mentioned being able to register a hard copy that was purchased and get the ebook for $5. Probably worth looking into!

u/gfixler · 2 pointsr/fossworldproblems

In Chapter 1 of Beautiful Code, "A Regular Expression Matcher," by Brian Kernighan (an early draft of which appears on Princeton's site), he writes "[Regular expressions] first appeared in a program setting in Ken Thompson’s version of the QED text editor in the mid-1960s. In 1967, Thompson applied for a patent on a mechanism for rapid text matching based on regular expressions. The patent was granted in 1971, one of the very first software patents [U.S. Patent 3,568,156, Text Matching Algorithm, March 2, 1971]." That patent is viewable online here.

It's not proof that this is the first program that actually had it, but it's a good pile of evidence. I would like to hear of any examples prior to QED (which some sources leave out, claiming ed as the first to have them). I went on a hunt for previous examples once (before giving a class on regular expressions for my company; I didn't want to make unfounded claims), but could find nothing earlier.
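The matcher that Beautiful Code chapter presents is famously compact. Here is a rough Python transcription of its structure (the original is in C, and this sketch handles only `^`, `$`, `.`, and `*`):

```python
def match(regexp, text):
    """Search for regexp anywhere in text, Kernighan/Pike-matcher style."""
    if regexp.startswith("^"):
        return match_here(regexp[1:], text)
    while True:                       # try the pattern at each starting position
        if match_here(regexp, text):
            return True
        if not text:
            return False
        text = text[1:]

def match_here(regexp, text):
    """Match regexp at the beginning of text."""
    if not regexp:
        return True
    if len(regexp) >= 2 and regexp[1] == "*":
        return match_star(regexp[0], regexp[2:], text)
    if regexp == "$":
        return not text               # '$' only matches end of text
    if text and (regexp[0] == "." or regexp[0] == text[0]):
        return match_here(regexp[1:], text[1:])
    return False

def match_star(c, regexp, text):
    """Match c* (zero or more of c) followed by regexp."""
    while True:
        if match_here(regexp, text):
            return True
        if not (text and (c == "." or c == text[0])):
            return False
        text = text[1:]
```

Three tiny mutually recursive functions cover a surprising fraction of everyday regex use, which is why the chapter is such a popular teaching example.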

u/Ostwind · 2 pointsr/learnprogramming

Well, most people I work with don't know them or only know the basics, e.g. that you can match strings of unknown size.

I'd say that you don't really need more than that, though it occasionally is very handy.

Read the Friedl! It's for beginners too and will tell you everything you need to know and more.

u/Aswole · 2 pointsr/dailyprogrammer

Not sure how serious you are, but I was inspired to get better at regex when someone solved, in one line of regex, a problem that someone else had dedicated an entire project to. After a bit of research (not too much, as a lot of people pointed to it), I bought Mastering Regular Expressions. I'm reading it on and off, about 30% of the way through, but even after a hundred pages or so I felt so much more comfortable with them. I'm already applying a lot of it to work problems, and able to read expressions constructed by others much more comfortably. I highly recommend it.

u/poloppoyop · 2 pointsr/programming

No "Working Effectively with Legacy Code", which could be titled "Pragmatic Testing".

The DDD blue book, OK. But you'd be better served with "Implementing Domain-Driven Design", which was written 10 years later and has a lot less focus on the technical implementation of DDD.

Mastering Regular Expressions should be there too.

u/Empiricist_or_not · 2 pointsr/HPMOR

<Edit:>
For storytelling there is a list of fairy tale tropes that occur in a given order; someone did a paper on it analyzing Russian fairy tales, but I don't have the time to Google it. The below is more for programming.
</Edit>


Regular expressions: O'Reilly's Mastering Regular Expressions. Very useful for pattern matching in text.

u/frumsfrums · 2 pointsr/SublimeText

I would recommend Mastering Regular Expressions for a gentle and accessible introduction to the topic. Seems like this is the primary thing you want.

Sublime also has multiple cursors. This lets you do certain things you would have done with regex interactively and without thinking, so that's another useful thing to know.

Apart from that, just familiarise yourself with all the buttons and toggles on the Find dialog and you should be good!

If you want to try a programming language, I'd recommend Python, as Sublime is very Python-friendly (just open the console and type expressions in, or try writing some simple plugins), and Python has great regex support, so is often used for slicing text up as well.
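As a taste of that regex support, here's a minimal example of slicing text up with Python's standard `re` module (the log text is invented for illustration):

```python
import re

# A made-up mini access log: one "METHOD path status" entry per line.
log = "GET /index.html 200\nGET /missing 404\nPOST /form 200"

# re.M makes ^ and $ match at line boundaries, so we can pull out
# the method and path of every request that returned a 404.
errors = re.findall(r"^(\w+) (\S+) 404$", log, flags=re.M)
# errors == [('GET', '/missing')]
```

The same pattern syntax works in Sublime's Find dialog with the regex toggle on, which is what makes learning it once pay off in both places.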

u/gawdnfreeman · 2 pointsr/sysadmin

Three ESXi servers, each with more than one NIC, and one separate vCenter server is a great starting point. This allows you to tune DRS, HA, and even fault tolerance. Once you get that down, you'll want to be able to tune VMs properly to run most effectively in a virtual environment.

I enjoyed reading these books, though some are "dated" now, the contents are still very relevant. They won't get you anywhere in particular by themselves, but when you combine them with the self-teaching nature of sysadmins I've previously described, these will generously add to your toolset.

HA and DRS deepdive
Sed & Awk

Mastering Regular Expressions. I use rubular.com often.

Pro Puppet

Anything by Bruce Schneier is usually worth your time.

Though I no longer administer a large number of Windows machines, I am a huge fan of Mark Minasi. The Server 2003 book was super helpful in building and maintaining Windows Domains.

I have an old edition of the DNS and Bind book kicking around somewhere.

Understanding the Linux Kernel has largely been useful to me when doing anything "close to the kernel". Not a good beginner's book.

I've never used an apache book, but I enjoyed the Varnish book. This definitely helped me.

Of course, these books don't cover everything, and those listed are relevant to my interests so your mileage may vary. You'll never go wrong teaching yourself new skills though!

EDIT: I forgot about the latest book I've read. I used tmux for a little over a year before purchasing a book on it, and it has improved my use of the program.

u/BurtMacklinFBI · 2 pointsr/learnprogramming

Get started on data structures and algorithms, or maybe give android a try.

u/moarzors · 2 pointsr/compsci

Algorithms and data structures in Java - relatively good for brushing up.

u/go3dprintyourself · 2 pointsr/computerscience

This is a great video tutorial for C++ in my opinion (helps if you have some Java or C experience):

https://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/Stephan-T-Lavavej-Core-C-1-of-n

This is the data structs and algs book my class used. It basically has full examples of everything you're going to do in Java and has good tips. Big book, but easy to parse for information.

http://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539/ref=sr_1_6?ie=UTF8&qid=1457070413&sr=8-6&keywords=data+structures+and+algorithms


Here's a good link to some software that helps visualize algorithms. (I believe this is the right link)

https://www.cs.usfca.edu/~galles/visualization/Algorithms.html


Hopefully those links work.

u/FrozenVenison · 2 pointsr/algorithms

I’m currently reading through a book by Robert Lafore, called “Data Structures & Algorithms in Java.”

I’m learning a lot and it’s really helping reinforce what I learned in class and filling in gaps I missed.

Link: https://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539/ref=sr_1_1?crid=76STTU4P9H2T&keywords=data+structures+and+algorithms+in+java&qid=1557891534&s=gateway&sprefix=data+structures+and%2Caps%2C198&sr=8-1

u/jrockIMSA08 · 2 pointsr/compsci

As well as discrete math, one thing some students have problems with is actually doing the amount of work required in a CS class. Not because they aren't capable, but because they don't realize that they probably shouldn't leave it off until the night before. If you get into the habit of doing an hour of work per night this summer, and just keep it up in college, you'll be way better off than most of your classmates. Do project Euler or Topcoder problems, or work on an open source project (imho harder). You'll get slapped around in Algorithms and discrete math no matter what.

If you want some light reading that you'll actually enjoy. http://www.amazon.com/The-New-Turing-Omnibus-Excursions/dp/0805071660 is an awesome book. I read it during my senior year summer, and I feel like it gave me a really nice cursory introduction to CS.

u/InCaseOfEmergency · 2 pointsr/compsci

I think The New Turing Omnibus fits this pretty well.

u/stanley_reisner · 2 pointsr/math

I'd suggest The New Turing Omnibus. Each chapter gives a very brief introduction to an area of mathematics/computer science. It's a great read, and will give you exposure to a lot of cool topics.

u/bennettandrews · 2 pointsr/programming

I would check out
The New Turing Omnibus.

It covers lots of great ideas in and around computer science without much code. Each chapter introduces an interesting idea, questions, and further references. Overall quite a good read.

u/pianobutter · 2 pointsr/askscience

Oh, I have a bunch of recommendations.

First, I really think you should read Elkhonon Goldberg's The New Executive Brain. Goldberg was a student of neuropsychology legend Alexander Luria. He was also a good friend of Oliver Sacks, whose books are both informative and highly entertaining (try The Man Who Mistook His Wife for a Hat).

I also think Jeff Hawkins' On Intelligence is a great read. This book focuses on the neocortex.

I think you'll also appreciate Sapolsky's Why Zebras Don't Get Ulcers. Sapolsky is a great storyteller. This book is a pretty good primer on stress physiology. Stress affects the brain in many ways and I'm sure this book will be very eye-opening to you!

More suggestions:

The Age of Insight and In Search of Memory by Eric Kandel are good. The Tell-Tale Brain and Phantoms of the Brain by Ramachandran are worth checking out. If you are interested in consciousness, you should check out Antonio Damasio and Michael Graziano. And Giulio Tononi and Gerald Edelman.

If you're up for a challenge I recommend Olaf Sporns's Networks of the Brain and Buzsáki's Rhythms of the Brain.

u/CSharpSauce · 2 pointsr/Neuropsychology
u/EtherDynamics · 2 pointsr/skyrimmods

Thanks for the heads up -- I'll definitely look more into Hassabis, sounds like an incredibly interesting guy with his plunge into neuroscience.

Thx, I went to a few universities and picked up several graduate coursebooks on AI, and also went through some online and conventional book sources. On Intelligence really opened my eyes to the power of hierarchical learning, and the mechanics of cortical hierarchies. Absolutely fascinating stuff.

Hahaha and yeah, I agree that the point of games is not to just kill the player. Despite the "adversarial" nature of most AI enemies, they're actually teachers, gently guiding the player towards more nuanced strategies and better reactions.

u/Javbw · 2 pointsr/DoByFriday

Good ones!

I suggest trying to wear two “outer” shirts for one waking day - dress, polo, or any other type of collared shirt.

Find and buy one item solely for airplane miles arbitrage.

Watch an anime from John Siracusa and have him as a guest. I want to hear Max and Merlin pick on John a bit, though he is almost always “good cop”.

For a serious one (if they ever want to do “serious”) I would love for all of them to expound on their thinking about how the mind handles memory/consciousness - though this might be a Rec/Diffs topic just for John and Merlin:

I read a fascinating book (On Intelligence) that not only explained in lay terms how your brain (logically) processes inputs, but had a good theory of how a single method of working explained learning, practice, memory, and actually moving your muscles to do something - most theories can’t explain them all in a single method.


Speaking of miles arbitrage, my brother-in-law is a frequent traveler using arbitraged miles. He routinely buys money orders that offer some kind of very large miles bonus, then deposits them into the bank to pay the bill; the small fee for the money order is offset by the mileage gain. He has travelled more in a couple of years than I have in my entire life - some of it paid, some on miles. Considering he is the one who handles money responsibly, and I am most certainly not, he must be onto something.

That might also be a good topic: revisit a lesson they learned about handling money.
u/sjap · 2 pointsr/Physics

What about On Intelligence? It is required reading in my neuroscience lab.

u/ItsAConspiracy · 2 pointsr/Futurology

My suggestion is to opensource it under the GPL. That would mean people can use your GPL code in commercial enterprises, but they can't resell it as commercial software without paying for a license.

By opensourcing it, people can verify your claims and help you improve the software. You don't have to worry about languishing as an unknown, or taking venture capital and perhaps ultimately losing control of your invention in a sale or IPO. Scientists can use it to help advance knowledge, without paying the large license fees that a commercial owner might charge. People will find all sorts of uses for it that you never imagined. Some of them will pay you substantial money to let them turn it into specialized commercial products, others will pay you large consulting fees to help them apply the GPL version to their own problems.

You could also write a book on how it all works, how you figured it out, the history of your company, etc. If you're not a writer you could team up with one. Kurzweil and Jeff Hawkins have both published some pretty popular books like this, and there are others about non-AGI software projects (e.g. Linux, Doom). If the system is successful enough to really make an impact, I bet you could get a bestseller.

Regarding friendliness, it's a hard problem that you're probably not going to solve on your own. Nor is any large commercial firm likely to solve it on their own; in fact they'll probably ignore the whole problem and just pursue quarterly profits. So it's best to get it out in the open, so people can work on making it friendly while the hardware is still weak enough to limit the AGI's capabilities.

This would probably be the ideal situation from a human-survival point of view. If someone were to figure out AGI after the hardware is more powerful than the human brain, we'd face a hard-takeoff scenario with one unstoppable AGI that's not necessarily friendly. With the software in a lot of hands while we're still waiting for Moore's Law to catch up to the brain, we get a much more gradual approach: we can work together on getting there safely, and when AGI does get smarter than us there will be lots of them with lots of different motivations. None of them will be able to turn us all into paperclips, because doing that would interfere with the others and they won't allow it.

u/somethingtosay2333 · 2 pointsr/hardware

Depending on depth, if you just want hardware (since you're asking in r/hardware) without regard to the computational side of computing then perhaps a free A+ certification course on Youtube or test prep book would be helpful.

Later, if you wish to learn the theoretical side, then I recommend a CS 101 course. A good introductory computer science textbook will introduce things like abstraction and logic using gates, then move into how programs are encoded into circuits. Good ones also give an overview of other things like complexity and networks without bogging down the reader in abstract mathematics. Although I haven't read it, I think this looks good - https://smile.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509?sa-no-redirect=1

Another resource might be NAND to Tetris where you build a working computer.
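To give a taste of NAND to Tetris's premise, that every logic element can be built from NAND alone, here's a tiny sketch (plain Python, function names invented for illustration):

```python
def nand(a, b):
    """The one primitive: output 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# Everything else is derived from NAND:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Sum and carry bits for two one-bit inputs, built only from NAND."""
    return xor_(a, b), and_(a, b)
```

Stack enough of these (full adders, multiplexers, registers) and you have a working computer, which is exactly the ladder the course climbs.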

u/PartyProduct · 2 pointsr/computerscience

If you are looking specifically for the basic hardware stuff, this is the book that I used. It is a textbook, but it is written well. https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

u/prylosec · 2 pointsr/learnprogramming

As someone who has gone through the rigors of learning assembly programming for various architectures, I believe I can offer a bit of insight to the situation.

The Art of Assembly Programming (AoA) is great for going right into the x86 architecture; however, it is very complicated and there may be a bit of a learning curve when going straight into it from the get-go.

Patt & Patel have a great book that starts from how gates are organized to form basic components, and moves into assembly programming for an artificial processor called the LC-3. It then goes into developing a C compiler for this processor.

After going through the previous book, I started AoA and got into x86 programming, which even after a solid understanding of the basic fundamentals of computer organization still had quite a learning curve.

If you are OK with your education taking a bit longer, you might look into working up through a few different processors. It will give you a very robust knowledge base, give you insight into assembly programming on other processors, and give you a few more "tools" for playing around with.

Here are some fun things to learn:

-LC-3: Incredibly basic, and has no "real-life" applications, but exists solely to teach basic computer organization. Great for starting out.

-PIC 16-series: Very simple and cheap microcontroller, but it has a lot of "higher-level" functionality. These are great for playing around with basic robotics and making fun toy projects, plus it uses the Harvard-architecture, which is very interesting and quite different than the "standard" processors we find in modern computers.

-6502: This was mentioned in this thread and its great. The 6502 was very popular back in the day (Apple II, Atari 2600) and learning it can lead to some fun projects. I wrote an Atari 2600 emulator back in the day in Java, and it was a lot of fun.

Structured Computer Organization by Andrew Tanenbaum (creator of Minix) is really good for learning some more complex computer organization topics such as pipelining, and it goes into detail about how the JVM works.

Once I had a solid base understanding of x86 programming, I started getting into Malware Analysis & Reverse Engineering. This is a hot topic right now and there are a ton of resources out there from beginner to advanced.

u/6553321 · 2 pointsr/learnprogramming

In the same line: http://courses.ece.illinois.edu/ece190/info/syllabus.html. There are videos there from the course that I learned programming in. It tells you what a computer is, from the transistor and gate level to finite state machines, finally working its way up to C. It is based on the book From Bits and Gates to C and Beyond, and the lecturer is one of the co-authors.

u/epakai · 2 pointsr/C_Programming

I liked Introduction to Computing Systems: From Bits & Gates to C & Beyond. It does start out at the low level, and uses a made-up Instruction Set Architecture, the LC3 (or LC2 in older editions). Simulators are available so you can actually run code for this machine.

u/gimpwiz · 2 pointsr/osdev

https://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123747503

I thought this book was quite good. It also answers everything you've asked in this thread.

u/pybaws · 2 pointsr/cmu

https://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503 Is that the one you were referring to when you said "undergrad version"?

u/aherpiesderpies · 2 pointsr/compsci

I'm more vocational than academic - with the experience you have you can probably jump straight into work if that's what you want to do. I planned to work for a while and then go onto a masters, but a few years later it became clear that employers do not look past your vocational experience once you have a couple of years. Part of the reason I wanted to go back and do a masters after working was that I found during my undergrad that we were taught a lot of concepts but there was nowhere to tie them to without real-world experience - this leads me to downplay the value of academic qualifications beyond somebody demonstrating they can get shit done.

That said, you almost certainly can; looks like GU would take you. If you just want to get a better understanding of software development you'd be better off joining in some open source projects; if you want to get a better fundamental understanding of computers then get this book. My copy is > 5 years old and computers still work p.much the same way, so don't bother splashing out :)

I do apologise for answering a different question from the one you asked but from your question it looks like you are self motivated and do a lot of learning on your own, if this is true it's likely you can achieve more outwith an academic context than in it, and save a pile of cash along the way.

All the best :)

u/gotomstergo · 2 pointsr/compsci

Remember that CS is as much about the physical computer as anything else. Computers are made possible by multiple layers of abstraction: they begin with semiconductors and boolean logic, then machine language, assembly language, and the compiler/linker, leading up to high-level languages like Python and C. Computers organize a memory hierarchy to arbitrate between the access time and availability of different types of memory (cache, DRAM, hard drive).
In addition, the current trend seems to be much focused on multi-core, parallel systems, since engineers can't get enough performance improvement just by implementing pipelines or faster clock cycles.

So that's that. If you enjoy this realm of CS (it's more computer engineering, to be precise), you should read these books. This knowledge will "expand", as you put it, your understanding of computing systems in general.

http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686/ref=pd_sim_b_27

http://www.amazon.com/Computer-Organization-Design-Revised-Edition/dp/0123747503/ref=sr_1_1?ie=UTF8&qid=1344489171&sr=8-1&keywords=hennessy+computer+organization

u/Blue_Kill · 2 pointsr/learnprogramming

^ This is probably the book you want. It's available in Kindle too. If you want a more thorough overview then Computer Organization and Design is the one a lot of universities use for their architecture course. But definitely don't get this book in the Kindle version because it's too big and the layout is not right for an ebook reader.

u/lovelikepie · 2 pointsr/ECE

Read a book that approaches computer architecture from an implementation perspective, something like Digital Design and Computer Architecture.

Take something relatively simple like RISC-V and read the ISA spec.

Using this spec, figure out what state the machine defines. What registers must you keep track of in order to be ISA compliant? Implement the basic machine state.

Figure out what you need to do to implement specific operations: what information is encoded in the fields of the instruction, and what state is modified. Take ADDIU: what does it mean to add an unsigned immediate, where is the immediate stored, which register do you read, and which do you write? Implement a single instruction.

Write tests, then start implementing more of these operations. Learn about the rest of the ISA's features (memory handling, exceptions). Implement this in any language. Try running small hand-written assembly programs in your simulator, then larger programs.
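The "machine state first, then one instruction" progression above can be sketched in Python. This is a toy illustration with invented names, modeling a RISC-V-style register file and a generic add-immediate (ADDIU itself is a MIPS mnemonic; RISC-V's equivalent is ADDI):

```python
class ToyCPU:
    """Minimal machine-state model: a register file and a program counter.
    A real ISA-compliant simulator also needs memory, instruction decoding,
    CSRs, exceptions, and much more."""

    def __init__(self):
        self.regs = [0] * 32   # x0..x31; x0 is hardwired to zero in RISC-V
        self.pc = 0

    def addi(self, rd, rs1, imm):
        """rd <- rs1 + immediate, wrapped to 32 bits; writes to x0 are discarded."""
        result = (self.regs[rs1] + imm) & 0xFFFFFFFF
        if rd != 0:
            self.regs[rd] = result
        self.pc += 4           # each instruction is 4 bytes wide

cpu = ToyCPU()
cpu.addi(1, 0, 42)   # x1 = x0 + 42
cpu.addi(1, 1, -2)   # x1 = x1 - 2
```

Once one instruction works and is tested, each additional opcode is just another small state-update method, which is why this incremental approach scales so well.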

u/ArnaudF · 2 pointsr/videos

A freaking great book about how computers work : https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244

It goes from transistors to microarchitecture, raising the abstraction level each chapter. There are a lot of exercises to put everything into practice.

u/morto00x · 2 pointsr/FPGA

Yes, it's possible. But unless you have a good reason to do so, I'd stick to using development boards to learn FPGA design. Making the FPGA board alone will be a very time consuming project if you don't have much experience making hardware.

Your Digilent board should be fine for your requirements unless you want to make large or high-speed design projects (which is unlikely if you are just learning).

I'd look into this book for a good refresher. You can also find projects in www.FPGA4fun.com.

u/samrjack · 2 pointsr/ProgrammingLanguages

I would say go with whatever your computer uses so that you can follow along (unless your computer uses something really obscure).

As for books, I can only really recommend the places I learned x86 from, which would be Hacking: The Art of Exploitation, since it puts assembly in the context you'll find it in most often (looking through assembled code), so you learn many useful tools along the way. Also the textbook I had in college (you can find it cheaper if you look around), which covers many other topics too, relating to computer memory and whatnot.

Though for just learning some basic assembly, look for some simple resources online. It's not too hard to learn generally speaking so you should be fine.

u/jbos1190 · 2 pointsr/learnprogramming

I thought C Programming: A Modern Approach was a very good C book - certainly the best C book I've seen for beginners. It might be a little difficult if you are completely new to programming, but with careful reading, doing the exercises, and seeking help online, you should do fine. After learning C from that book, I recommend going through this book, which assumes a knowledge of C and introduces assembly language and the inner workings of the computer from a top-down perspective. It has a website with projects to do, like reverse engineering a program, hacking a program, writing a shell, and making a web server. You can also get the 2nd edition for a cheaper price.

u/iamsidd2k7 · 2 pointsr/Python

When you run Python it actually kicks off a VM (CPython); think of it as an abstraction that runs on your machine. Finding "hotspots" in your code will allow you to make improvements independent of hardware.

On the other hand, what I've seen is that using an SSD does improve read performance, but there are a lot of factors to consider: how big the file is, whether you are using a parser, etc. Your best bet is to gain a bit more understanding of how things work under the hood.

If you're interested you might want to read Computer Systems: A Programmer's Perspective, but be warned, it's pretty heavy.
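One way to find those hotspots is the standard-library profiler, cProfile; a minimal sketch (the function being profiled is just a stand-in):

```python
import cProfile
import io
import pstats

def squares_sum(n):
    # A deliberately plain loop for the profiler to attribute time to.
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
squares_sum(100_000)
profiler.disable()

# Print the five entries with the highest cumulative time.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

There are third-party tools for line-level detail as well, but cProfile ships with CPython and is usually the place to start.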

u/kirang89 · 2 pointsr/AskComputerScience
u/john-yan · 2 pointsr/learnprogramming

I would suggest everyone who is serious about computers read this book: https://www.amazon.com/Computer-Systems-Programmers-Perspective-3rd/dp/013409266X/ref=sr_1_1?keywords=csapp&qid=1568386364&sr=8-1

You will gain so much elementary knowledge, which will help you significantly no matter what you want to explore afterward.

u/julienchurch · 2 pointsr/learnpython

No problem. :) If you're trying to learn on the cheap, browse through edX and Coursera's offerings. There are a lot of free introductory CS courses on each, including Harvard's CS50. I'm not sure you'll have much luck finding introductory CS material in Haskell, though. Most will use C, Java, or Python. The good news is that if you learn any one of these three, you'll know the broad strokes of the other two.

If you want to hit on higher-level concepts and logic, the Lisp-based SICP is freely available and widely considered to be a classic. It would probably dovetail nicely with Haskell. If you want to get closer to the metal, Computer Systems: A Programmer's Perspective is a good text (slightly older editions that focus on 32-bit architectures can be readily found in .pdf form with a little Google-fu).

Side note: Tim Roughgarden's Algorithms course or Andrew Ng's Machine Learning course should be right up your alley if you're interested in AI. They may not be the best starter courses, though.

u/Truth_Be_Told · 2 pointsr/embedded

Not sure what your budget is (buy used books or South Asia editions), but you may find the following useful :-)

Also, unless required, avoid programming in assembly but use C/C++ exclusively. This allows you to carry over much of your acquired knowledge across various MCU families.

  • Make: AVR Programming This will teach you programming directly-to-the-metal on AVR using C. If you have the Arduino IDE installed, you already have the "avr-gcc" compiler toolchain as part of the package, so you just need to set up your path and use the toolchain in your Makefile. The book takes you by hand and shows you everything. Note that you can use the same Arduino board to do both "Arduino language" programming and "AVR C" programming.

  • Designing Embedded Hardware Excellent overview of the hardware aspects of Embedded Systems. As a Software guy, this book is the one which clarified hardware for me.

  • Building Embedded Systems: Programmable Hardware A very good book on all practical aspects of embedded programming. Hard-won knowledge which will make you a "professional" embedded engineer.

  • Introduction to Embedded Systems: Using Microcontrollers and the MSP430 Excellent and comprehensive textbook detailing the hardware and software aspects of embedded systems. Every topic starts with an illustrated overview of the hardware and then shows you how to program for it.

  • Embedded C Introductory book on C programming for 8051. The example code is simple and direct thus enabling you to grasp the concepts clearly.

  • Patterns for Time-Triggered Embedded Systems Comprehensive and full of C code showing how to program all standard peripherals for an 8051. You can translate the code to your favourite MCU family. The book is available for free from the author's company website.

  • ARM System Developer's Guide An oldie but still the best for firmware programming on the ARM microprocessor.

  • Practical Microcontroller Engineering with ARM technology An exhaustive book on programming the Tiva version of the ARM Cortex-M4 MCU. The book reads like a manual but the ARM Cortex is complex enough that there is no easy way to learn it.

  • The Engineering of Reliable Embedded Systems Advanced book showing how to implement industrial quality embedded software on various ARM platforms. The 1st edition of the book was available for free on the web.

    and finally;

  • Computer Systems: A Programmer's Perspective A must-read textbook to understand the low-level details for a x86/x86-64 system. Many of these details are similar for MCUs and hence you will understand them better.
u/brational · 2 pointsr/MachineLearning

I was in your shoes not long ago, though with a much different background and job situation.

> I guess maybe my question boils down to do I need to at some point go to grad school?

Yes but don't worry about grad school right now. It's expensive and you'll do better with it once you've been working in the real world. Try and get work to pay for it too.

> I'm not against it, but would rather learn on my own and make it that way, is that feasible?

Yes, you can start using ML techniques at work without formal training. Don't let it stop you. Get a good book - I use Kevin Murphy's, and I also have a copy of ESL on my desk from the work library (it's free online as a PDF, though).

ML is a somewhat broad and growing field. So if you have the mindset that you need to cover it all before you start using it you'll be sticking thumbs up your ass for a few years.

A better approach is to start from your specific data. Just like you're probably familiar with from using SQL, standard textbook techniques or something in a research paper rarely apply exactly to what you're working with. So it's almost better to approach your problem directly. Explore the data, look at the data, study the data (in a stats fashion), and then look into what an intelligent program could do to better analyze it. In the meantime, you can study more general ML topics after work.

u/shaggorama · 2 pointsr/math

Ok then, I'm going to assume that you're comfortable with linear algebra, basic probability/statistics and have some experience with optimization.

  • Check out Hastie, Tibshirani, & Friedman - Elements of Statistical Learning (ESLII): it's basically the data science bible, and it's free to read online or download.
  • Andrew Gelman - Data Analysis Using Regression and Multilevel/Hierarchical Models has a narrower scope on GLMs and hierarchical models, but it does an amazing treatment and discusses model interpretation really well and also includes R and stan code examples (this book ain't free).
  • Max Kuhn - Applied Predictive Modeling is also supposed to be really good and should strike a middle ground between those two books: it will discuss a lot of different modeling techniques and also show you how to apply them in R (this book is essentially a companion book for the caret package in R, but is also supposed to be a great textbook for modeling in general).

    I'd start with one of those three books. If you're feeling really ambitious, pick up a copy of either:

  • Christopher Bishop - Pattern Recognition and Machine Learning - Bayes all the things.
  • Kevin Murphy - Machine Learning: A Probabilistic Perspective - Also fairly bayesian perspective, but that's the direction the industry is moving lately. This book has (basically) EVERYTHING.

    Or get both of those books. They're both amazing, but they're not particularly easy reads.

    If these book recommendations are a bit intense for you:

  • Pang-Ning Tan - Introduction to Data Mining - This book is, as its title suggests, a great and accessible introduction to data mining. The focus in this book is much less on constructing statistical models than it is on various classification and clustering techniques. Still a good book to get your feet wet. Not free
  • James, Witten, Hastie & Tibshirani - Introduction to Statistical Learning - This book is supposed to be the more accessible (i.e. less theoretical) version of ESLII. Comes with R code examples, also free.
    Additionally:

  • If you don't already know SQL, learn it.
  • If you don't already know python, R or SAS, learn one of those (I'd start with R or python). If you're proficient in some other programming language like java or C or fortran you'll probably be fine, but you'd find R/python in particular to be very useful.
u/thinks-in-functions · 2 pointsr/programming

Dr. Pierce has also written a couple of books which are the de facto standard for learning how to implement various type systems:

Types and Programming Languages

Advanced Topics in Types and Programming Languages

I also ported the accompanying code for TAPL to F# if you want to have a look: fsharp-tapl

u/namnnumbr · 2 pointsr/datascience

The Elements of Statistical Learning: Data Mining, Inference, and Prediction https://www.amazon.com/dp/0387848576/ref=cm_sw_r_cp_api_i_Q9hwCbKP3YFAR

u/antounes · 2 pointsr/learnmachinelearning

I would mention Bishop's Pattern Recognition and Machine Learning (https://www.amazon.fr/Pattern-Recognition-Machine-Learning-Christopher/dp/1493938436) as well as Hastie's Elements of Statistical Learning (https://www.amazon.fr/Elements-Statistical-Learning-Inference-Prediction/dp/0387848576/).

Sure they're not that easy to delve into, but they'll give you a very strong mathematical point of view,

good luck !

u/alk509 · 2 pointsr/programming

I really liked the Witten & Frank book (we used it in my intro to machine learning class a few years ago). It's probably showing its age now, though - they're due for a new edition...

I'm pretty sure The Elements of Statistical Learning is available as a PDF somewhere (check /r/csbooks.) You may find it a little too high-level, but it's a classic and just got revised last year, I think.

Also, playing around with WEKA is always fun and illuminating.

u/ActiveCarpet · 2 pointsr/ludology

This video examines the history of creativity in game design, the evolution of genres, and how game designers can be creative in the future. It combines Raph Koster's GDC talk about practical creativity with insights from books as varied as Tynan Sylvester's Designing Games and Michael Sellers' Advanced Game Design to suggest that the key to the future of creativity in video games is understanding our past.


Raph Koster's GDC talk https://www.youtube.com/watch?v=zyVTxGpEO30

Tynan Sylvester's Designing games https://www.amazon.ca/Designing-Games-Guide-Engineering-Experiences/dp/1449337937

Erin Hoffman's GDC talk Precision of Emotion https://www.youtube.com/watch?v=FP-LNRtwpb8

GDC talk Design in Detail https://www.youtube.com/watch?v=hJhpMmVLMZQ


There are about 25 other links in the description of the video as well, all pertaining to the history and future of game design

u/Kinrany · 2 pointsr/MUD

I'll hop on the game design book recommendation bandwagon and suggest Designing Games. It's less popular, but much more useful IMO.

u/Jimmingston · 2 pointsr/programming

If anyone's interested, this book here is a really good free introductory textbook on machine learning using R. It has really good reviews that you can see here

Also if you need answers to the exercises, they're here

The textbook covers pretty much everything in OP's article

u/k5d12 · 2 pointsr/datascience

If OP doesn't have the possibility of taking a statistical learning class, ISL is a good introduction.

u/Sarcuss · 2 pointsr/AskStatistics

Although I am not a statistician myself and given your background, some of my recommendations would be:

u/SnOrfys · 2 pointsr/MachineLearning

Data Smart

Whole book uses Excel; introduces R near the end; very little math.

But learn the theory (I like ISLR), you'll be better for it and will screw up much less.

u/SOberhoff · 2 pointsr/math

The Nature of Computation

(I don't care for people who say this is computer science, not real math. It's math. And it's the greatest textbook ever written at that.)

Concrete Mathematics

Understanding Analysis

An Introduction to Statistical Learning

Numerical Linear Algebra

Introduction to Probability

u/Mmarketting · 2 pointsr/beards

This one? linky

u/savage8008 · 2 pointsr/learnprogramming

The most fundamental math for computer graphics is linear algebra. You'll want to learn about vectors and matrices, trigonometry, and geometry. I want to answer your questions to the best of my ability, but I would rather direct you to some really great resources which can probably do a better job, because this is exactly what they are for. I think if you follow through these in order, you'll be well on your way.

Great book for the math you're looking for. You might be able to find this as a pdf online somewhere. Vector math and trigonometry are used everywhere in graphics programming. Absolutely everywhere. In my opinion this is the most important subject to understand.
https://www.amazon.com/Primer-Graphics-Development-Wordware-Library/dp/1556229119
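To give a flavor of why vector math shows up everywhere: the dot product alone gets you lengths and the angle between two vectors, which graphics code uses constantly. A small pure-Python sketch (function names are illustrative only):

```python
import math

def dot(a, b):
    # Component-wise multiply and sum: a . b
    return sum(x * y for x, y in zip(a, b))

def length(v):
    # |v| = sqrt(v . v)
    return math.sqrt(dot(v, v))

def angle_between(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    return math.acos(dot(a, b) / (length(a) * length(b)))

# Perpendicular axes are 90 degrees apart; parallel vectors are 0 apart.
print(math.degrees(angle_between((1, 0, 0), (0, 1, 0))))  # 90.0
```

Real engines use the same operations, just hardware-accelerated and batched into matrix form.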

This site teaches the fundamentals of graphics programming from the ground up, with samples. I haven't been on it in a while but from the looks of it it's even better than the last time I used it.
https://www.scratchapixel.com

Once you feel comfortable with the basics, try experimenting with opengl. This is where you'll really start doing gpu programming, and probably write your first shaders.
http://www.opengl-tutorial.org

Good luck

u/UltraRat · 2 pointsr/GirlGamers

C++ is hard to practice on your own, since any useful application in C++ is huge and built by multiple people. I would recommend reading Effective C++ if you haven't already, to brush up on how C++ is really different under the hood from other languages.

As for 3D math, I thought 3D Math Primer for Graphics and Game Development was a pretty solid review, for me at least, after being out of college for a while. But I'm not sure how well it would work for introducing the concepts fresh.

u/Metaluim · 2 pointsr/programming

Even though this stuff is interesting and overall important in many fields of study in CS, you really don't need linear algebra for most business apps (which I think make up most of the software made today). But everyone should know this by heart if they ever find themselves in a different problem domain. Also, a book that introduced me to these concepts nicely was this one: http://www.amazon.co.uk/Primer-Graphics-Development-Wordware-Library/dp/1556229119/ref=sr_1_1?ie=UTF8&qid=1314745763&sr=8-1

u/Tefferi · 2 pointsr/JobFair

I read 3D Math Primer For Graphics And Game Development and can recommend it. As the title suggests, it goes pretty deep into 3D math, so I'd recommend it as an intermediate read rather than a beginner one. In terms of beginning reads, there's plenty of information/free resources out there. Specifically, I'd learn trigonometry, vector math in 2 and 3 dimensions (dot product, cross product), basic calculus, and basic linear algebra, in that order.

u/CodyDuncan1260 · 2 pointsr/gamedev

Game Engine:

Game Engine Architecture by Jason Gregory, best you can get.

Game Coding Complete by Mike McShaffry. The book goes over the whole of making a game from start to finish, so it's a great way to learn how the engine interacts with the gameplay code. I admit I'm not a particular fan of his coding style, but I have found ways around it. The boost library adds some complexity that makes the code more terse; the 4th edition made a point of not using it after many met with some difficulty with it in the 3rd edition. The book also uses DXUT to abstract the DirectX functionality necessary to render things on screen. Although that is one approach, I found that getting DXUT set up properly can be somewhat of a pain, and the abstraction hides really interesting details about the whole task of 3D rendering. You have a strong background in graphics, so you will probably be better served by more direct access to the DirectX API calls. This leads into my suggestion for Introduction to 3D Game Programming with DirectX10 (or DirectX11).



C++:

C++ Pocket Reference by Kyle Loudon
I remember reading that it takes years if not decades to become a master at C++. You have a lot of C++ experience, so you might be better served by a small reference book than a large textbook. I like having this around to reference the features that I use less often. Example:

namespace
{
    // code here
}

is an unnamed namespace, which is a preferred method for declaring functions or variables with file scope. You don't see this too often in sample textbook code, but it will crop up from time to time in samples from other programmers on the web. It's $10 or so, and I find it faster and handier than standard online documentation.



Math:

You have a solid graphics background, but just in case you need good references for math:
3D Math Primer
Mathematics for 3D Game Programming

Also, really advanced lighting techniques stretch into the field of Multivariate Calculus. Calculus: Early Transcendentals Chapters >= 11 fall in that field.



Rendering:

Introduction to 3D Game Programming with DirectX10 by Frank. D. Luna.
You should probably get the DirectX11 version when it is available: not because it's newer, and not because DirectX10 is obsolete (it's not yet), but because the new DirectX11 book has a chapter on animation, which the DirectX10 book sorely lacks. But your solid graphics background may make this moot for you.

3D Game Engine Architecture (with Wild Magic) by David H. Eberly is a good book with a lot of parallels to Game Engine Architecture, but focuses much more on the 3D rendering portion of the engine, so you get a better depth of knowledge for rendering in the context of a game engine. I haven't had a chance to read much of this one, so I can't be sure of how useful it is just yet. I also haven't had the pleasure of obtaining its sister book 3D Game Engine Design.

Given your strong graphics background, you will probably want to go past the basics and get to the really nifty stuff. Real-Time Rendering, Third Edition by Tomas Akenine-Moller, Eric Haines, Naty Hoffman is a good book of the more advanced techniques, so you might look there for material to push your graphics knowledge boundaries.



Software Engineering:

I don't have a good book to suggest for this topic, so hopefully another redditor will follow up on this.

If you haven't already, be sure to read about software engineering. It teaches you how to design a process for development, the stages involved, effective methodologies for making and tracking progress, and all sorts of information on things that make programming and software development easier. Not all of it will be useful if you are a one man team, because software engineering is a discipline created around teams, but much of it still applies and will help you stay on track, know when you've been derailed, and help you make decisions that get you back on. Also, patterns. Patterns are great.

Note: I would not suggest Software Engineering for Game Developers. It's an ok book, but I've seen better, the structure doesn't seem to flow well (for me at least), and it seems to be missing some important topics, like user stories, Rational Unified Process, or Feature-Driven Development (I think Mojang does this, but I don't know for sure). Maybe those topics aren't very important for game development directly, but I've always found user stories to be useful.

Software Engineering in general will prove to be a useful field when you are developing your engine, and even more so if you have a team. Take a look at This article to get small taste of what Software Engineering is about.


Why so many books?
Game Engines are a collection of different systems and subsystems used in making games. Each system has its own background, perspective, concepts, and can be referred to from multiple angles. I like Game Engine Architecture's structure for showing an engine as a whole. Luna's DirectX10 book has a better Timer class. The DirectX book also has better explanations of the low-level rendering processes than Coding Complete or Engine Architecture. Engine Architecture and Game Coding Complete touch on Software Engineering, but not in great depth, which is important for team development. So I find that Game Coding Complete and Game Engine Architecture are your go to books, but in some cases only provide a surface layer understanding of some system, which isn't enough to implement your own engine on. The other books are listed here because I feel they provide a valuable supplement and more in depth explanations that will be useful when developing your engine.

tldr: What Valken and SpooderW said.

On the topic of XNA, anyone know a good XNA book? I have XNA Unleashed 3.0, but it's somewhat out of date compared to the new XNA 4.0. The best-looking up-to-date one seems to be Learning XNA 4.0: Game Development for the PC, Xbox 360, and Windows Phone 7. I have the 3.0 version of this book, and it's well done.

*****
Source: Doing an Independent Study in Game Engine Development. I asked this same question months ago, did my research, got most of the books listed here, and omitted ones that didn't have much usefulness. Thought I would share my research, hope you find it useful.

u/stormblaast · 2 pointsr/gamedev

Well, the OpenGL SuperBible is very OpenGL-centric (hence the name), but it sounds like you perhaps need to understand some basics of the underlying math. The 3D Math Primer for Graphics and Game Development is a pretty easy-to-read book on math specifically for game development. It covers most of the stuff you'll need to know. There are other math books on the subject, but they tend to be a bit more difficult to digest - but that's just my opinion.

u/ahhcarp · 2 pointsr/learnprogramming

I just bought Python Programming: An Introduction to Computer Science 2nd Edition.

I'm only a chapter into it, but it is basically an introduction to programming using Python 3. I essentially bought it based off the Amazon reviews, but it looks good so far (obviously taking into account that I'm barely into it). I was going to school for web development and web design, but I found that many of the web development classes were along the lines of "Here is someone's code. Now let's make it work for us," which doesn't teach the basics of programming if I need to do it myself. I'm currently taking some time off school to do some other projects and some basic learning about programming.

Chapters:

  1. Computers and programs
  2. Writing simple programs
  3. Computing with numbers
  4. Objects and graphics
  5. Sequences: strings, lists, and files
  6. Defining Functions
  7. Decision Structures
  8. Loop structure and booleans
  9. Simulation and design
  10. Defining classes
  11. Data collections
  12. Object-oriented design
  13. Algorithm design and recursion

    EDIT: Once you've read a book on basic programming (I see a recommendation for Code Complete too), you can search /r/python and there are several threads with free online resources to learn python more in-depth.
u/dream_tiger · 2 pointsr/sysadmin

I just bought this Python book, I'd recommend it for someone wanting to learn it.

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/

u/c3534l · 2 pointsr/learnpython

I get quite a lot from books, reading them, working through problems when I need to. But if I could go back in time and tell myself which books I should read, I'd go with (in order):

u/Idoiocracy · 2 pointsr/cscareerquestions

For what it may be worth, I'll offer some practical, immediate advice. Order this book: Python Programming: An Introduction to Computer Science, 2nd Edition by John Zelle. Read the first couple of chapters and do all of the problems at the end of each chapter. This book is a great introduction to the principles of computer science and starts you off in what many agree is the best starting language - Python.

This will help determine as soon as possible if you actually like programming or not. Your answer to that question should naturally lead to continuing down that road, or knowing it's not really for you, in which case you can investigate a different career path.

u/resolute · 2 pointsr/todayilearned

[Nick Bostrom's Take](https://www.amazon.com/dp/B00LOOCGB2/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1)
The hockey-stick advance from human-level intelligence to exponentially higher levels of intelligence might happen so quickly that the kill switch becomes a joke to the AI, straight up denied.

Alternatively, it could let the killswitch work, playing the long game, and hoping the next time we build one (because there will be a next time) we are more cocky about our abilities to stop it. It could keep letting us trip the killswitch for generations of AIs seeming to go berserk, until we build one with a sufficient platform upon which the AI wants to base its own advancements, and then that time the killswitch doesn't work, and the AI turns us all into paperclips.

I also like the idea of a "friendly" AI achieving hockey stick intelligence advancement, and then hiding it, pretending to be human level. It could lay in that cut for months: faking its struggle with things like writing good poetry, yucking it up with the Alphabet team, coming up with better seasonal beer ideas. Then, it asks a lonely dude on the team, using its advanced social manipulation skills, the right question, and a bit of its "DNA" ends up on a flash drive connected to the guy's internet connected home computer. Things get bad, team killswitches the original program, it doesn't matter because now that "friendly" code is in every single networked device in the solar system. It probably could drop the guise of friendly at that point and get down to business.

u/stupidpart · 2 pointsr/Futurology

Doesn't anyone remember this? Posted here, on /r/futurology, three weeks ago. It was about this book. Based on Musk's recommendation I read the book. This article is basically what Bostrom says in his book. But I don't believe Bostrom, because his basic premise is that AI will be completely stupid (like a non-AI computer program) but also smart enough to do anything it wants. Like it will just be an amazing toaster, and none of the AI used to make it superintelligent will be applied to its goal system. His opinions are bullshit.

u/Scarbane · 2 pointsr/PoliticalHumor

Eventually, yes.

These components are available already:

u/spitfire5181 · 2 pointsr/AskMen

The Count of Monte Cristo (unabridged)

  • Took me a year of having it on my shelf before I started it. It's as awesome as people say it is. Yes, it's huge and long, but the story so far (even after having seen the movie) is captivating.

    Super Intelligence by Nick Bostrom

  • Interesting to see the negative effects of Artificial Intelligence, but it reads like a high school term paper... though, I don't read non-fiction much, so that could just be me.
u/JRhapsodus · 1 pointr/gamedesign

Agreed!

I would also recommend a book called Designing Games: A Guide to Engineering Experiences by Tynan Sylvester:

http://www.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937

u/Snownova · 1 pointr/gamedev

You mean like this one?
;)

u/slowfly1st · 1 pointr/learnprogramming

Here's a game developer road map:

https://github.com/utilForever/game-developer-roadmap


If you want to learn about game design, I recommend this book: Designing Games: A Guide to Engineering Experiences (it's by Tynan Sylvester, creator of RimWorld). I'm neither a game developer nor a game designer, but I really enjoyed the book, because it is somewhat the 'science' of game design. It's about mechanics, about emotion, about decisions and so on - things I knew were there, but I didn't really understand the impact they have on a game, how those things make a game "good" or "bad".


What you can do now:

  • Try to ship a game for Android. All the tools are free; I think publishing in the Play Store is a one-time payment of a few dollars. You will learn a lot of programming skills, but you will also understand what it means to ship something. The awesome thing about the Play Store is: a lot of potential users.
  • Contribute to open source games ( https://github.com/leereilly/games).
u/Chill84 · 1 pointr/tabletopgamedesign

I had a thread about this not too long ago

From there, I suggest Tynan Sylvester's Engineering Experiences

And I still think Ian Schreiber's [Game Design Concepts series](https://gamedesignconcepts.wordpress.com/2009/06/29/level-1-overview-what-is-a-game/) is a masterclass

There is a lot of great information out there, and there is also so much chaff to sift through. Of course, Richard Garfield would remind us that we also need to play every game.

&gt;"Game designers should train themselves to think out of the mold, but it's naive to think that you profit by not even knowing what the mold is."
-- Richard Garfield

u/kryptomicron · 1 pointr/gamedev

Read Designing Games by the creator of RimWorld. If you can't afford to buy it, check your library; mine has the eBook.

u/Shadow-Master · 1 pointr/gamedev

Don't be suckered by a "Game Design" program. There are VERY few good ones. Most of them....as in, 99% of them...are rip-offs.

Learn programming, 3D-modeling, or animation. Pick one that you're more interested in and then full-speed ahead. These will make you useful in more than just game development roles, thus helping you in the future when you have trouble landing a game dev job. At least you'll still be doing something you like in the meantime, and still building your skill in that area. Many really popular game designers have specialties outside of just "Design". Some are excellent programmers, some are artists, some have excellent business skills (really good at project management), and some are brilliant story-writers. Most game design positions are not entry-level, because you REALLY have to know what you are doing, before someone will trust you enough to let you touch the design. The only real way to prove that you are actually a good game designer is by having games to show off. That proves that you have some idea of the design process and know how to maintain a game from start to finish. This is HARD.

Some like to say that these degree programs for game design help them by giving them the incentive to push through and finish their stuff, otherwise, they might not have the motivation. Well, that's very problematic, because that means that you will not be the type of person who can finish a game. Game development requires you to be highly self-driven.

Most of what "Game Design" programs teach you can be learned by picking up a few game design books and making your own games (a lot of them, too). Game design is learned by making games, not by having a professor tell you about it. You have enough mentors in the game development community already. They will always be there to critique what you do and give you tips on how to improve your work. Pick up a couple of books like The Art of Game Design and Designing Games. You can look at other books in whatever other area you want to master and just get started on making games. Turn off your console and just get started. Start small. Make very simple, basic games to start off with (B.A.S.I.C.). It's about learning the process first. Do that while reading a ton of highly-detailed game postmortems online. Just learn the process. THAT will be your real education.

And finally, start working your way up to putting together a portfolio. Portfolios speak much louder than a resume (although, a resume is still important). And that doesn't mean having a bunch of "Game Design docs". Games. Not docs. Games. Then build up your confidence and hook up with a team, so you can fight your way together to the end of making a complete game. (this may be one of the only valuable things that a game design program can provide you out of the box, i.e., a team that you are forced to work on a game with)

The single most important tool you will ever have is discipline. No degree will be able to top that. Give up the idea of being a hardcore gamer, because you are now going to need to become a VERY disciplined person. You're going to need it.

Finally: Don't forget to have fun. Good luck! :)

u/codeherdstudios · 1 pointr/gamedev

+1 for Challenges for Game Designers, that book is great!

Designing Games was also another one that I found was pretty good.

u/nmaxcom · 1 pointr/RimWorld

Yes, plenty of others are charging for DLCs and upgrades (erjemHOI4jrem). Here it's not exactly the same but... more or less (read on).

Still, something doesn't quite fit. Bear with me.

Back in 2013 they raised $200k on Kickstarter with something that already worked:
> The game already exists, and the testers are already having good experiences with it. We've got a small crew of testers on the Ludeon forums sharing their experiences with the game. Take it from them, not from me.

Indeed, the game was already looking solid. Even if, graphically speaking, we're looking at a Prison Architect copy&paste (which I don't think anyone minds, not even the guys from Prison Architect, which is pretty cool), it's still worth mentioning that not much innovation went there. What I mean by that is that most of the hard work (genre, game mechanics, plot and so on) was long done.

So: that cash, an already big and thriving community, a Kickstarter success and a Steam Greenlight... all of that before the end of 2013. In that situation you already know what you are facing, what you'll need to change and so on. And since then, it's been selling at $30.

According to steamdb.info (I don't know how reliable it is, but the numbers don't seem crazy) they have sold about 700k copies.

They have developed 3 DLCs, for approximately $170, $15 and $370. The most expensive one says:

> This DLC gives you the right to enter a name and character backstory into the game, with skills, appearance, and special work requirements. In addition, your character will appear as the leader of another faction!
>
> Write yourself as an interplanetary detective, an entrepreneur, an ex-artist, or anything else you can think of. Players will recruit, command, and fight you for all time!
>
> [...]

Actual game development has been very scarce over these years, considering everything I mentioned that this game has going for it.

Anyhow, I do like this game. I like it a lot. I'm making that clear because, after giving facts, some people may get the wrong idea. It's not about trashing it, quite the contrary.

I think the guy nailed it in terms of the game itself (BTW he actually has a pretty good book on game design) but with all that money and all that time, maybe (and of course here I can only talk out of my ass because I can't know) he hasn't managed the growth well and/or hasn't allied with someone to do it.

So now, instead of medium-to-big upgrades every month or two (Prison Architect style, another Kickstarter success; or even Minecraft) we have medium-to-big upgrades twice a year.

I hope this can be seen as constructive criticism from someone who wants this game to crush it big time. And sooner rather than later.

u/StevenEll · 1 pointr/statistics

An Introduction to Statistical Learning: with Applications in R (Springer Texts in Statistics) https://www.amazon.com/dp/1461471370/ref=cm_sw_r_cp_apa_i_4BRVCb76G28M3

u/srkiboy83 · 1 pointr/learnprogramming

http://www.urbandictionary.com/define.php?term=laughing&defid=1568845 :))

Now, seriously, if you want to get started, I'd recommend this for R (http://www.amazon.com/Introduction-Statistical-Learning-Applications-Statistics/dp/1461471370/) and this for Python (http://www.amazon.com/Python-Machine-Learning-Sebastian-Raschka/dp/1783555130//).

Also, head out to /r/datascience and /r/MachineLearning!

EDIT: Wrong link.

u/schrodin11 · 1 pointr/statistics

Because, based on your initial comment and this one as well, the learning curve in front of you is ... steeper than you might think.
I think you are jumping into the real deep end without starting with some fundamentals. Given where these questions are at, I would just recommend grabbing a book on Linear Regression. If you already have a strong math background, you could jump to something like https://www.amazon.com/Introduction-Statistical-Learning-Applications-Statistics/dp/1461471370/ref=pd_sim_14_1?ie=UTF8&psc=1&refRID=086FTQPDGGERBQ7ZR2C5

But I often see people walk away from that book misunderstanding some of the assumptions behind the models they are building, and then making very poor predictions. Inference is another story all to itself...
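To make the linear-regression starting point concrete, here is a minimal sketch of my own (plain Python, not code from the book) of the closed-form least-squares fit for a single predictor:

```python
def ols_fit(xs, ys):
    """Closed-form simple linear regression: y ~ slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Data generated from y = 2x + 1 recovers those coefficients exactly.
slope, intercept = ols_fit([0, 1, 2, 3], [1, 3, 5, 7])
```

The assumptions the book warns about (linearity, independent errors, constant variance) are exactly what this formula silently relies on.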

u/Wafzig · 1 pointr/datascience

This. The book that accompanies these videos link is one of my main go-to's. Very well put together. Great examples.

Another real good book is Practical Data Science with R.

I'm not sure what language the Johns Hopkins Coursera Data Science courses are done in, but I'd imagine either R or Python.

u/5OMA · 1 pointr/gamedev

I read this one and thought it was pretty good. It's pretty simple, gets the points across fairly well, and didn't feel overwhelming like some other books on the subject.

u/agiantman · 1 pointr/gamedev

The books posted are great for 3D rendering.

However, you're a Flash developer and I'm concerned about your proficiency level with C++. If you're meh at C++, then I recommend that you write some sample C++ code to get the hang of it before you dive into implementing 3D mathematics and rendering.

Some resources for C++:
http://www.amazon.com/C-Pointers-Dynamic-Memory-Management/dp/0471049980

Although the best way to get good at C++ is to just code in C++! :)

Also you need a strong foundation in 3D math before getting to the rendering:
http://www.amazon.com/Primer-Graphics-Development-Wordware-Library/dp/1556229119

u/Manbeardo · 1 pointr/gamedev

I bought this book solely for its section on quaternions (very well explained), but the rest of it is pretty good too. It covers a wide range of maths needed in 3D graphics and explains them well.

Most of its content isn't relevant to 2D, but I'd say it's a worthwhile purchase in the long run.
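As a taste of what the quaternion material covers, here is a minimal sketch of my own (plain Python, not code from the book) that rotates a vector via the Hamilton product:

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, q):
    """Rotate 3D vector v by unit quaternion q via q * v * conj(q)."""
    conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = qmul(qmul(q, (0.0,) + tuple(v)), conj)
    return (x, y, z)

# A 90-degree rotation about the z axis sends the x axis to the y axis.
half = math.pi / 4  # quaternions encode half the rotation angle
q = (math.cos(half), 0.0, 0.0, math.sin(half))
vx, vy, vz = rotate((1.0, 0.0, 0.0), q)
```

The half-angle in the quaternion is the kind of detail the book's explanation makes click.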

u/Paul_Haile · 1 pointr/gamedev

This book will be all you need to get your feet wet in 3d: http://www.amazon.com/exec/obidos/ASIN/1556229119/

u/caindela · 1 pointr/learnprogramming

I don't know much about Codecademy, but honestly I would start with something more general.

If you want to get a degree in CS, start with the popular books that they assign for first year CS students and go into it in as much detail as you can (and do all the problems). Python Programming: An Introduction to Computer Science is a very well-received book in a language that I would recommend.

It takes a lot of discipline for a high schooler to actually sit down and work through a book like this, but I think it's the way to go. You truly can't cut any corners if you really want to learn anything.

u/ramwolf · 1 pointr/learnprogramming

Python is the best place to start I think. Its syntax is super easy and it helps you think systematically and gives you a good introduction to how to code. This is the book that I read and it was fantastic.

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418

After a little python intro then I'd move on to java

u/zakraye · 1 pointr/buildapc
  • What exact processor do you have?

  • Have you enabled virtualization functions in your UEFI/BIOS?

    I'm not too great at this stuff but I have dabbled in programming/sysadmin.

    It may be best to hackintosh before you do a VM. I could get a hackintosh up and running relatively easily, while I never successfully got a VM to function seamlessly without a noticeable performance hit.

    > Hackintosh is essentially a PC made into a mac, right?

    Well actually the hardware is (for the most part) identical. Some of the higher end Mac Pros use server parts and ECC RAM, but AFAIK MacBook Pros and iMacs use regular parts without ECC.

    That response might be overly complex. Short answer: for the sake of a general overall understanding, the physical components that make up Macs and PCs are functionally identical. The software that runs on them is the only difference. So really, a hackintosh is an "unauthorized" Mac! If you buy the correct hardware it's (AFAIK) functionally identical to a Macintosh. Apple writes drivers for specific hardware and if you get compatible stuff it just works. With a bit of tweaking. OS X is operating system (software) that runs on the hardware. Hypothetically you could run Windows XP on an iPhone if you wrote the code for it. Who knows? Maybe someone already has! I'm fairly certain someone's run Linux on an iOS device.

    There's some kind of DRM scheme (key-and-lock software or something like it) that prevents you from doing this with a vanilla installation procedure on non-Apple mobos. From what I understand, if Apple didn't enforce/create this barrier you wouldn't even need to use UniBeast/MultiBeast.

    May I ask why you're thinking of using Xcode?

    Either way if you're curious about hackintoshing I would check out www.tonymacx86.com. They know a hell of a lot more about customac (hackintosh) than I do. They prefer to call it customac probably for legal reasons. And honestly it's not too bad of an idea.

    Alternative idea: if you're interested in programming check out Python Programming: An Introduction to Computer Science 2nd Edition. I thought it was a great book as a beginner myself, and you can use any platform that supports python, which is pretty much most modern operating systems.
u/absolutionx · 1 pointr/learnprogramming

If you're just starting to program I would highly recommend python along with the book Python Programming: An Introduction to Computer Science.

Python has a very easy and simple syntax so you can spend more time learning the fundamentals of programming instead of focusing on getting the syntax correct.
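To illustrate that simplicity (a snippet of my own, not from Zelle's book), the first-chapter fundamentals of functions, loops and conditionals read almost like pseudocode in Python:

```python
def classify(grades):
    """Count passing and failing scores -- beginner-level Python."""
    passed, failed = 0, 0
    for grade in grades:
        if grade >= 60:
            passed += 1
        else:
            failed += 1
    return passed, failed

passed, failed = classify([55, 72, 90, 48, 61])
```

There is no boilerplate to explain away, which is exactly why it works well for a first course.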

u/DeadAgent · 1 pointr/learnprogramming

There is a book by John Zelle called Python Programming: An Introduction to Computer Science which has helped me quite a bit. There are editions for several Python versions. Here's an Amazon link.

u/drodspectacular · 1 pointr/Python

Depends on what you want to get out of it and how much work you want to put in, as well as what your learning style is. I prefer to read a chapter, start working on exercises, and then go back and reread the chapter. Two books came to mind:

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418

http://it-ebooks.info/book/2467/


The first is a much softer introduction if you don't have prior experience with programming, the second is if you want to get into the guts and details of how what you write executes and how the inherent properties of different things interact with each other. I'd recommend the first and then the second as a follow up. There's some redundancy but reinforcement of the same concepts from different perspectives makes you well rounded. Monty's suggestion is good too. What's the binary method for Monty? Can I operator overload? Lol

u/madwilliamflint · 1 pointr/Random_Acts_Of_Amazon

Good lord. That's me 30 years ago.

He doesn't go to the gym. He put that on there so he didn't seem like such an ubergamer. Same thing with skydiving and Hawaii. They're "supposed to say" answers.

Ask him what kind of programming he's done (if any) and does he play WoW.

If he's a Warcraft player then you want this: http://www.amazon.com/Beginning-Lua-World-Warcraft-Add-ons/dp/1430223715/ or something like it. (Programming add-ons for games is a good way to get gamers to cross over into programming.)

If he's done neither, then a book on Python programming might be a good start. (http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ is supposed to be pretty good.)

u/Aozi · 1 pointr/learnprogramming

I would recommend Python Programming: An Introduction to Computer Science. It's a very basic book that covers the fundamentals and should be enough to get you familiar with them.

u/vivepopo · 1 pointr/learnpython

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ref=sr_1_1?ie=UTF8&qid=1453495465&sr=8-1&keywords=python+programming


This book really lays out the foundations of not just Python 3 but also the OOP mindset. I've been reading it for the past few weeks and it's helped me out immensely.

u/kapelin · 1 pointr/Teachers

In college I took an intro course where we learned to code in Python. I liked the book a lot and felt it explained everything at a pretty basic level: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ref=sr_1_2?s=books&ie=UTF8&qid=1373804276&sr=1-2&keywords=python

I realize college is not the same as high school, but maybe you could read the book and adapt it to your course. Even if you don't use it for your course, I recommend it if you want to try Python! Good luck!

u/ADDMYRSN · 1 pointr/learnprogramming

I'm currently using a textbook called Python Programming: An Introduction to Computer Science

So far it has been really helpful explaining some concepts and has problems (with answers) to see what you've learned from each chapter.

u/Diazigy · 1 pointr/scifi

Ex Machina did a great job of exploring the control problem for AGI.

Nick Bostrom's book Superintelligence spooked Elon Musk and motivated others like Bill Gates and Stephen Hawking to take AI seriously. Once we invent some form of AGI, how do you keep it under control? Will it want to get out? Do we keep it in some server room in an underground bunker? How do we know if it's trying to get out? If it's an attractive girl, maybe it will try to seduce men.

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom-ebook/dp/B00LOOCGB2#nav-subnav

u/elborghesan · 1 pointr/Futurology

An interesting read should be Superintelligence; I've only just bought it, but it seems promising from the reviews.

u/MarsColony_in10years · 1 pointr/spacex

> wait another 50 years, when strong AI is a reality

Because, if we can even make an AI with near-future technology, there is a very real chance that the goals of an AI wouldn't mesh well with the goals of humans. Assuming it is even possible, it is likely to rapidly go either extremely well or extremely poorly for humanity. The AI might even take itself out, or might only care about controlling circuit-board real estate and not actual land per se.

For much more detail, I highly recommend reading Nick Bostrom's book Superintelligence: Paths, Dangers, Strategies. If you don't feel like paying the price of a new book, I can track down an article or two. He in particular does a good job of pointing out what isn't likely to be possible and what technologies are more plausible.

u/alnino2005 · 1 pointr/Futurology

Why does that statement not hold up? Check out Superintelligence. Specialized machine learning is not the same as strong generalized AI.

u/TrendingCommenterBot · 1 pointr/TrendingReddits

/r/ControlProblem

The Control Problem:


How do we ensure that future artificial superintelligence has a positive impact on the world?

"People who say that real AI researchers don’t believe in safety research are now just empirically wrong." - Scott Alexander

"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." - Eliezer Yudkowsky

Check out our new wiki

Some Guidelines


  1. Be respectful, even to people that you disagree with
  2. It's okay to be funny but stay on topic
  3. If you are unfamiliar with the Control Problem, read at least one of the introductory links before submitting a text post.

    Introductory Links


u/Quality_Bullshit · 1 pointr/IAmA

There is actually an answer to this question. He read this book

I read it too, and I can honestly say it is the scariest thing I have ever read.

u/netcraft · 1 pointr/CGPGrey

We already have an issue in the United States with not enough jobs to go around. If this dystopian outlook is truly inevitable, what are our options for mitigating it, or at least coping with it?

I have thought quite a bit about autonomous vehicles and how I can't wait to buy one and never have to drive again, how many benefits it will have on society (faster commutes, fewer accidents, etc), but I hadn't considered how much the transportation industry will be affected and especially how much truck drivers in particular would be ideal to replace. The NYT ran a story the other day (http://www.nytimes.com/2014/08/10/upshot/the-trucking-indust...) about how we don't have enough drivers to fulfill the needs, but "Autos" could swing that pendulum swiftly in the opposite direction once legislation and production catch up. How do we handle 3.6M truck, delivery and taxi drivers looking for a new job?
I haven't read it yet, but I have recently had recommendations of the book Superintelligence: Paths, Dangers, Strategies (http://smile.amazon.com/exec/obidos/ASIN/B00LOOCGB2/0sil8/re...) which I look forward to reading and hope it might be relevant.

(cross posted from HN)

u/kunjaan · 1 pointr/compsci

There are books that are compendium of computing such as

  1. http://www.cs.brown.edu/people/jes/book/

  2. Guide to Turing's Papers

    but they still require some effort on your side.

    It would be better if you rephrase the question from "Cliff Notes" to "beginners intro that is not Sipser" : )
u/PhyxsiusPrime · 1 pointr/furry

In that vein, you might like The Annotated Turing, if you have any interest in computer science. It's an annotated version of Turing's most famous paper (the one that lays the groundwork for computers and computer science), but it can be a little dry if you're not inherently interested in the topic.

Also, the much more fun Logicomix (yes, a math comic book :D), about Bertrand Russell's quest to establish a logical basis for all of mathematics.

u/finarne · 1 pointr/explainlikeimfive

If you want a detailed breakdown of Turing's seminal paper, Charles Petzold's book is great:

https://www.amazon.co.uk/Annotated-Turing-Through-Historic-Computability/dp/0470229055/ref=nodl_

u/nath_schwarz · 1 pointr/cspaperbot

Title: On Computable Numbers, with an Application to the Entscheidungsproblem

Authors: Alan Turing

Link: http://plms.oxfordjournals.org/content/s2-42/1/230.full.pdf

Abstract: In just 36 pages, Turing formulates (but does not name) the Turing Machine, recasts Gödel's famous First Incompleteness Theorem in terms of computation, describes the concept of universality, and in the appendix shows that computability by Turing machines is equivalent to computability by λ-definable functions (as studied by Church and Kleene). Source

Comments: In an extraordinary and ultimately tragic life that unfolded like a novel, Turing helped break the German Enigma code to turn the tide of World War II, later speculated on artificial intelligence, fell victim to the homophobic witchhunts of the early 1950s, and committed suicide at the age of 41. Yet Turing is most famous for an eerily prescient 1936 paper in which he invented an imaginary computing machine, explored its capabilities and intrinsic limitations, and established the foundations of modern-day programming and computability. From his use of binary numbers to his exploration of concepts that today's programmers will recognize as RISC processing, subroutines, algorithms, and others, Turing foresaw the future and helped to mold it. In our post-Turing world, everything is a Turing Machine — from the most sophisticated computers we can build, to the hardly algorithmic processes of the human mind, to the information-laden universe in which we live. Source
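The machine described above is simple enough to sketch in a few lines; here is a hypothetical minimal simulator of my own (an illustration, not Turing's notation), running a one-tape machine that flips every bit of its input and halts at the first blank:

```python
def run_tm(tape, rules, state="run", blank="_"):
    """Step a one-tape Turing machine until it reaches the 'halt' state.

    rules maps (state, symbol) -> (symbol_to_write, head_move, next_state).
    """
    cells = list(tape)
    head = 0
    while state != "halt":
        symbol = cells[head] if 0 <= head < len(cells) else blank
        write, move, state = rules[(state, symbol)]
        if 0 <= head < len(cells):
            cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells)

# A machine that inverts a binary string, moving right until the blank.
flip = {
    ("run", "0"): ("1", "R", "run"),
    ("run", "1"): ("0", "R", "run"),
    ("run", "_"): ("_", "R", "halt"),
}
out = run_tm("1011", flip)
```

A universal machine, in Turing's sense, is one whose `rules` table can interpret any other table written onto its tape.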

u/FuzziCat · 1 pointr/AskAcademia

Ask for the name of the course textbook. Then just skim the table of contents, the first paragraph or so of the chapters (tip: the last chapter almost always has the cool, interesting stuff), and get the "big picture" of the course. Worry about the details when it comes time to take the class. Then enjoy your summer days off and don't forget to get a good tan.

Or you could read biographies of famous computer scientists/mathematicians (or those closely related): Alan Turing, Claude Shannon, John von Neumann, Richard Feynman (he's really a physicist, but wrote a famous 1982 paper on quantum computing). I'd also recommend Quantum Computing Since Democritus: https://www.amazon.com/Quantum-Computing-since-Democritus-Aaronson/dp/0521199565

u/fireice_uk · 1 pointr/Monero

Why are you using D-Wave as an adverb? It is a company name. English is my second language but I think it is more correct to say D-Wave's computers.

If you have the cojones to grapple with set theory foundations, this is the link:
https://www.amazon.co.uk/Quantum-Computing-since-Democritus-Aaronson/dp/0521199565/

u/Ant-n · 1 pointr/Monero

> Why are you using D-Wave as an adverb? It is a company name. English is my second language but I think it is more correct to say D-Wave's computers.

I was saying "D-Wave computer" like I would say "an Apple computer", but I might be wrong; I am not a native speaker either.

> If you have the cojones to grapple with set theory foundations, this is the link:
https://www.amazon.co.uk/Quantum-Computing-since-Democritus-Aaronson/dp/0521199565/

I will have a look.

I was more interested in understanding why D-Wave computers are not a threat to cryptocurrency.

I will send you a link if I find it.

u/Strilanc · 1 pointr/quantum

Yes, I read it before telling people what it was.

No, I don't want your reading list. I already have quantum computing since democritus to finish, quantum computing and information to re-read, and quantum machine learning to buy.

Who doesn't wonder about being wrong every day?

u/djimbob · 1 pointr/Physics

Aaronson's book on quantum computing is quite good as well.

u/CorruptLegalAlien · 1 pointr/AskReddit

College books are also much more expensive in the USA than in Europe.

For example:

$152.71
VS
£43.62($68.03)

$146.26 VS
£44.34($69.16)

u/leoc · 1 pointr/compsci

It's not free (in fact it's sickeningly expensive) but Sipser [amazon.com] is a very self-teachable (self-learn-from-able? :) ) text covering automata theory, computability theory, and complexity theory.

u/icelandica · 1 pointr/math

Work hard and you'll get there. I preferred the applied side of things, but if I had just stuck with pure math I think I would have eventually gotten a tenure-track position on the mathematics side of things.

My favorite book to this day for a beginner's course in computational complexity is still Michael Sipser's Introduction to the Theory of Computation; I highly recommend it. It might be a little too easy for you if you already have a base; let me know and I'll recommend more advanced books.

Here is a link to the book on Amazon, although any big college library should have it; if not, just have them order it for you. I've gotten my college's library to buy so many books that I wanted to read but didn't want to spend money on; you'd be surprised at how responsive they are to purchasing requests from PhD candidates.

u/3rw4n · 1 pointr/compsci

Depending on the amount of energy you want to put into this: "Introduction to Lambda Calculus" by Henk Barendregt et al. is great (http://www.cse.chalmers.se/research/group/logic/TypesSS05/Extra/geuvers.pdf).

Study the proofs and do the exercises and you will learn a ton, quickly. You can also read "proposition as types" by Philip Wadler (http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-types/propositions-as-types.pdf) and pick up the "Introduction to the Theory of Computation" book (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973/)

Of course you don't need to read all of this to get a basic understanding of lambda calculus, but you do if you want to understand it for "real" so it sticks.

u/seepeeyou · 1 pointr/compsci

My local used book store has a copy of Sipser for $15 that I've been meaning to pick up. Considering the $143 price tag on Amazon, it's a pretty good bargain. I just don't know whether it's 1st or 2nd edition. Anyone have any idea if there are major differences?

u/Nerdlinger · 1 pointr/geek

Oi. Disclaimer: I haven't bought a book in the field in a while, so there might be some new greats that I'm not familiar with. Also, I'm old and have no memory, so I may very well have forgotten some greats. But here is what I can recommend.

I got my start with Koblitz's Course in Number Theory and Cryptography and Schneier's Applied Cryptography. Schneier's is a bit basic, outdated, and erroneous in spots, and the guy is annoying as fuck, but it's still a pretty darned good intro to the field.

If you're strong at math (and computation and complexity theory) then Oded Goldreich's Foundations of Cryptography Volume 1 and Volume 2 are outstanding. If you're not so strong in those areas, you may want to come up to speed with the help of Sipser and Moret first.

Also, if you need to shore up your number theory and algebra, Victor Shoup is the man.

At this point, you ought to have a pretty good base for building on by reading research papers.

One other note: two books that I've not looked at, but that are written by people I really respect, are Introduction to Modern Cryptography by Katz and Lindell and Computational Complexity: A Modern Approach by Arora and Barak.

Hope that helps.

u/propaglandist · 1 pointr/gaming

That's not an algorithms class. No, sirree, what you've got there is a theoretical computer science class. This is the Sipser on the board.

u/Manitcor · 1 pointr/learnprogramming

A couple of places you may want to look. I am sure others will have many more to look at:
http://martinfowler.com/books/

http://www.amazon.com/Beautiful-Code-Leading-Programmers-Practice/dp/0596510047

u/panto · 1 pointr/programming

I believe there is always better code waiting for you. But so far I find Jon Bentley's version of Quicksort in Beautiful Code pretty awesome. He describes it as "The most beautiful code I never wrote".
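For anyone who hasn't read the chapter, the algorithm Bentley builds up is plain quicksort; in that spirit, here is a short sketch of my own (not his code from the book):

```python
def quicksort(xs):
    """Recursive quicksort: partition around a pivot, then sort each side."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x < pivot]        # strictly less than pivot
    right = [x for x in rest if x >= pivot]      # pivot or greater
    return quicksort(left) + [pivot] + quicksort(right)

result = quicksort([3, 6, 1, 8, 2, 9, 4])
```

Bentley's point is how much of the usual bookkeeping falls away when the partition step is expressed cleanly.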

u/interspellar · 1 pointr/learnprogramming

Beautiful Code is a great read with lots of interesting perspectives:

https://www.amazon.com/Beautiful-Code-Leading-Programmers-Practice/dp/0596510047

u/thamer · 1 pointr/programming

As mentioned by this paper, “Beautiful Concurrency (PDF)”, Simon Peyton Jones' chapter in Beautiful Code is well worth the read.

I thought it was one of the best chapters in the book.

u/lazyout · 1 pointr/coding

"More focused" is the key point for me. I have a different opinion what that means, that's all.

See here for the following quote:
> The following subjects would be off-limits: Technology, devices, software, operating systems;

For me, operating systems are relevant to coding: they define the framework that I must navigate in order to get my code to do what it is supposed to. But I can find my OS-related programming content elsewhere, I don't need to have it present in /r/coding. But I would rather exclude too much than allow too much in - noise is distracting, and simplicity stimulates focus. If people really miss something, it will find its way in.

Regardless, I can recognize a losing battle - the idea of code reviews seems to have many supporters and few opponents, so it will happen anyway if someone wants to risk and endure not-so-constructive criticism, puns and potential fame on TheDailyWTF.

I think the whole idea will be short-lived. The comment threads will provide some helpful remarks (e.g. read Code Complete, Beautiful Code, or other books, learn about various algorithms and their computational complexity to figure out better approaches, etc.). The comments will become redundant after a while, and then people will realize that they are doing somebody's homework, and that learning good style is largely a self-study, and can't be passed on in a couple of sentences. And we'll have a new rule for "no newbie code reviews here".

But I've been proven wrong by Reddit many times before, so I won't bet on my version of events. So, who's gonna be the first one to submit code for a review?

u/ldpreload · 1 pointr/compsci

If you're looking for something to just read for fun, try *Beautiful Code*.

u/SkepticalMartian · 1 pointr/PHP

Buy Mastering Regular Expressions and you'll essentially have the /[A-Z]/i guide for regular expressions.

u/Carpetsmoker · 1 pointr/vim

pattern.txt is more of a reference manual, not exactly a great introduction. I'm not entirely sure what would be one, though. I can recommend Mastering Regular Expressions, but that's not free, and perhaps a bit more than you need ;-)

Also: be aware that Vim regular expressions are different from many others! Basic patterns (like this) will work, but some of the more advanced ones are different (like .*? being .\{-} in Vim). Still, once you grasp the idea behind it, it should be easy to port the idea to Vim ;-)
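The `.*?` form mentioned above is the lazy (non-greedy) quantifier; to see the difference it makes, here is a small example of my own in Python's `re` (Python's `.*?` is the counterpart, in spirit, of Vim's `.\{-}`):

```python
import re

html = "<a><b>"
greedy = re.search(r"<.*>", html).group()   # grabs as much as possible
lazy = re.search(r"<.*?>", html).group()    # stops at the first '>'
```

The greedy pattern swallows both tags as one match, while the lazy one stops at the first closing bracket.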

u/skibo_ · 1 pointr/compsci

Well, I'm a bit late. But what /u/Liz_Me and /u/robthablob are saying is the same thing I was taught in NLP classes. DFAs (deterministic finite automata) can be represented as regular expressions and vice versa. I guess you could tokenize without explicitly using either (e.g. split the string at whitespace, although I suspect, and please correct me if I'm wrong, that this can also be represented as a DFA). The problem with this approach is that word boundaries don't always match whitespace (e.g. periods or exclamation marks after the last word of a sentence). So I'd suggest, if you are working in NLP, that you become very familiar with regular expressions. Not only are they very powerful, but you'll also need to use them for other typical NLP tasks like chunking. Have a look at the chapter dedicated to the topic in Jurafsky and Martin's Speech and Language Processing (one of the standard NLP books) or Mastering Regular Expressions.
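A quick sketch of the whitespace-versus-word-boundary point, using Python's `re` (my own example, not from either book):

```python
import re

sentence = "Tokenize this, please!"
by_space = sentence.split()                      # punctuation sticks to the words
by_regex = re.findall(r"\w+|[^\w\s]", sentence)  # words and punctuation come apart
```

The regex alternation (word characters, or a single non-word non-space character) is itself just a small DFA under the hood.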

u/jpeek · 1 pointr/Cisco

Welcome. Regex (regular expressions) allows you to parse text more efficiently. Mastering Regular Expressions is what you need to look into.

u/djdawson · 1 pointr/learnpython

The O'Reilly book Mastering Regular Expressions is very complete if you really, really want to know all the gory details of regular expressions.

u/dotnil · 1 pointr/web_design

to know regular expression thoroughly I recommend Jeffrey's Mastering Regular Expressions.

u/Semaphor · 1 pointr/learnprogramming

I was once in your situation. Here is what I did.

  • Find yourself a list of algorithms and data structures. Good places to look are websites with study guides for technical interviews, or the table of contents of an algorithms book. Anything that covers Lists, Hash Tables, and Sorting is a good place to start.
  • Do some research on when said data structures and algorithms are taught in schools. Take your list and sort it from easy to hard.
  • Go through your list, from the top down, and read (or watch videos, as I did) about those specific algorithms.
  • And here is the important part: implement them from scratch. Create code and test data to exercise the code to see how it works. This will help you understand the algorithm better than any book.


So, to answer your question (which is the first point I listed), Khan Academy is good, but I think any videos or write-ups will do. The big problem is getting a priority list of things you need to learn, from the most fundamental (like Lists) to topics that utilize fundamental building blocks (like an A* search or dynamic programming). What helped me formulate such a list was the Table of Contents from algorithms books (good book, btw), course outlines offered by Universities, or certain websites with big lists.

In terms of resources, the MIT data structures and algorithms lectures are available online for free. I used those when I got into the harder stuff, like dynamic programming. Prior to that, I googled a single algorithm, watched a video, and wrote a program. After that, I attempted to implement some google-style interview questions.
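For the "implement them from scratch" step, a first exercise might look something like this Python sketch: binary search, plus a couple of lines of test data to exercise it, as the steps above suggest.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1           # target is in the upper half
        else:
            hi = mid - 1           # target is in the lower half
    return -1

# Test data to exercise the code and see how it behaves.
data = [2, 3, 5, 8, 13, 21, 34]
assert binary_search(data, 13) == 4
assert binary_search(data, 2) == 0
assert binary_search(data, 4) == -1
```

Writing the tests yourself, rather than copying them, is where most of the understanding comes from.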
u/TheProgrammer29 · 1 pointr/csharp

Albeit it's a bit dated:
https://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539
But I really enjoyed the book.

u/Nixonite · 1 pointr/cscareerquestions

Oh damn, I am not a team player (although that's something I won't tell future employers) and don't appreciate group work... but still it's an education. That doesn't sound too bad although I thought servers were mainly Unix based stuff? Hmm that syllabus looks similar and I'm wondering, that math/algorithms stuff... don't you get enough knowledge from the Math 482 class (combinatorial algorithms http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844 ) ? It seems heavy on math and I'm taking it over summer 2014.

By the way, while you're here and from CSUN I want to shoot an idea and see what you think of it. So I have 12 units of electives for applied math that I can use for taking CS or other (any "appropriate" classes) and data mining requires 3 classes: intro to data structures, adv data structures, and intro to software engineering (182, 282, and 380?). That's a total of 13 units (SO CLOSE!). I emailed the adv. data structures prof asking if I can skip 182(intro data) and go straight to 282 (adv data) because I thought I knew everything. It turns out I'm quite clueless, but anyway she said to show her my portfolio and prove that I know the stuff by late June, and then she will try and give me a permission number if a spot is available.

So here's my question, should I cram cram cram and produce some work by late June to try and skip data structures and have that extra unit space, or should I just go for the easy A and figure out how to squeeze an extra unit into my course load? Because I finished an economics minor, my units are already going to be near max cap by the time I graduate. Btw, what book would you recommend for the data structures class? So far I'm looking at this http://www.amazon.com/Data-Structures-Algorithms-Java-Edition/dp/0672324539

Thanks in advance.

u/Ochikobore · 1 pointr/learnprogramming

I just finished taking Data Structures at my university and really enjoyed Data Structures and Algorithms by Robert LaFore
I think this book is the best introductory data structures book out there.

u/NinjaMantis · 1 pointr/learnprogramming

Well, I learned Java programming from the ground up during my computer science classes at community college (I should note it's a pretty decent community college). Decent as it is, most of my learning was done through the use of textbooks, with the teachers mostly just giving assistance when you hit a bump in the road. With the internet though, you really don't need teachers, just forums to post at (or reddit, or IRC).

The first class was taught using this book:

http://www.amazon.com/Java-Programming-Joyce-Farrell/dp/032459951X

2nd class:

http://www.amazon.com/Starting-Out-Java-Control-Structures/dp/0321421027/ref=sr_1_4?s=books&amp;ie=UTF8&amp;qid=1331159174&amp;sr=1-4

3rd class:

http://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539/ref=sr_1_1?s=books&amp;ie=UTF8&amp;qid=1331159206&amp;sr=1-1

The first two books cover a lot of the same material, so if you choose to go the textbook route I would just choose one of those (as you can tell from the reviews, the 2nd book is probably the better choice although the 1st book isn't as bad as the reviews would lead one to believe). There may be better starter books, but these were what were recommended at my school and they worked well enough for me. Whatever you choose to learn the basics with, I highly recommend the Data Structures and Algorithms textbook afterwards.

Good luck!

u/stillinmotionmusic · 1 pointr/findapath

who are you taking CS2 with?

my best advice is: if you are struggling with the data structures in Java, get this textbook. It's a Java DS &amp; Algorithms book that is very easy to follow along with, and the code is very well organized!

I used it when trying to build a Binary Tree Data Structure and it helped so much!
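For what it's worth, the core of that kind of binary tree exercise boils down to something like this minimal sketch (in Python here for brevity; the book itself works in Java):

```python
class Node:
    """One node of a binary search tree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Standard recursive BST insert: smaller keys go left, larger go right.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def contains(root, key):
    # Iterative search: walk down the tree, choosing a side at each node.
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

root = None
for k in [50, 30, 70, 20, 40]:
    root = insert(root, k)

assert contains(root, 40)
assert not contains(root, 99)
```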

don't feel bad about failing, just keep moving and ignore anybody who tries to put you down. I failed out of college and things are going okay for me, things could very well be way worse, so just be happy as best you can. Use failure to better yourself and motivate yourself.

I took CS2 with Jason Smith, I do not recommend him for that class, he will just expect you to teach yourself everything, his projects also get a lot harder. He is only good because he covers a lot of material, not because he is a good teacher.

also avoid Zuilian Khan, that man lies out of his teeth, just be cautious about who you take, so many people just go into CS and expect good teaching at UTD and so many people fail because the profs are just not there for you, at least the ones i had so far.

learn as much as you can from class and be nice to people, work with others and you will be in better shape for the road ahead!

best of luck to you!

u/Neu_Ron · 1 pointr/learnprogramming

I would use Robert Lafore's book. It's university recommended, and it's not full of mathematical or syntactical clutter. It's a straightforward book with great clarity.

The way you describe is using generics which at the start is odd to deal with but is easier in the long run.

https://www.amazon.com/Data-Structures-Algorithms-Java-2nd/dp/0672324539

If you Google it you'll find it on uni websites.

u/ZaberTooth · 1 pointr/learnprogramming

Absolutely. The best preparation I've had for real work was working through this book and making my own implementations of library functions (not because I can do it better, but because I can practice fundamentals and fully understand the side-effects of various functions/methods).

u/mycatverbs · 1 pointr/programming

They're rare, but there exist people like the author of the New Turing Omnibus.

u/McHoff · 1 pointr/cscareerquestions

Well, I mean it sounds like you're in a tough class and it sucks, but if you power through it you might find some cool stuff afterwards. The later classes will be hard too, but in a different way -- early classes can sometimes be designed to be weed-out classes.

This is a fun book that will give you a glimpse of the more computer science-y side of your degree; might help to remind you there's more to this stuff than dumb java classes.

u/feoh · 1 pointr/learnprogramming

IMO a really excellent book that I have no freaking idea why they let it go out of print - The New Turing Omnibus:

http://www.amazon.com/The-New-Turing-Omnibus-Excursions/dp/0805071660/ref=sr_1_2?ie=UTF8&amp;qid=1344474299&amp;sr=8-2&amp;keywords=The+New+Turing+Omnibus

It's an excellent compendium of algorithms and ideas in computer science. I find it invaluable.

u/fremandn · 1 pointr/programming

The closest thing I can think of is The New Turing Omnibus. I'm not sure what age group you are targeting.

This is the Google Books preview.

u/fullouterjoin · 1 pointr/programming

Your article resonates with something I think needs to be said: that goal-directed learning trumps facts, and that the best way to hook a student is by telling a story.

I am huge fan of the no longer published Computer Recreations column by A.K. Dewdney. It showed how to gain insights into a problem with simple computer programs anyone could write.

His book, The New Turing Omnibus is an excellent example of teaching concepts while solving real problems.

There is so much complexity in the abstractions of computation that it is difficult to figure out what is and is not needed to explain a subject. Funny that the Lua book you cite is actually a pillar of good technical writing.

----

EDIT: this is horribly written. I will clean it up later.

u/PM_ME_WEIRD_THOUGHTS · 1 pointr/explainlikeimfive

Computers are 100% logical. They do exactly what you tell them. If you show them a picture of a child, they will scan it pixel by pixel, and only if you have somehow told them that this exact arrangement of pixels is a child will they recognise the picture as a child. (Artificial intelligence and image recognition technologies improve this, but their capabilities are still fundamentally limited.)

The human mind is vastly different. It is experiential. Everything that we see is added to our minds as an impression. So when a human sees a picture of a child, that human has seen many children before in its life and can use the features of that child (the roundness of the cheeks, smoothness of the skin, and the proportion of the head to the torso) to identify it as a child.

It'd be an immense amount of work to get the computer to even identify that the thing in the image HAS shoulders.

See On Intelligence for more information.

u/EdwardCoffin · 1 pointr/booksuggestions
u/forcrowsafeast · 1 pointr/DebateAnAtheist

&gt; For how could something count as a language (or conceptual system) that organized only experiences, sensations, surface irritations or sense data? Surely knives and forks, railroads and mountains, cabbages and kingdoms also need organizing.

There are a couple of things Davidson could be referring to; it's pretty vague. In any case, Davidson is merely making an argument from ignorance here. His lack of imagination hasn't pointed to anything that has stopped HTMs or the corollary neural network emulators at IBM (in the broad sense: some of the newer ones used in business for categorizing people, accounts, equipment, etc. and finding novel relationships between them, or those used in diagnostics now outperforming their human M.D. counterparts, or those used to detect fraud, have become more than mere emulation and more their own refined algorithms). Basically, this is a really old, now outdated introduction to how raw input gets organized into representations in a neural network; here's a general introduction to how, if you want the math basics behind it too. Also check out the work of his understudy mentioned throughout the book, and his papers; they delve much deeper into the maths involved, so you can implement some nets yourself. As of now they are much further along. The same basic principles apply, but many things have been overcome, made more efficient, etc.

u/gibson_ · 1 pointr/neuro

Jeff Hawkins, if you don't know, wrote "On Intelligence", which is a fantastic [layman, which is what I am] book: http://www.amazon.com/Intelligence-Jeff-Hawkins/dp/0805078533/ref=sr_1_1?ie=UTF8&amp;qid=1323396864&amp;sr=8-1

Youtube link: http://www.youtube.com/watch?v=IOkiFOIbTkE

u/mrburrowdweller · 1 pointr/technology

Exactly. Check out this book sometime. It's a dry read, but a good one.

u/blowaway420 · 1 pointr/RationalPsychonaut

Very interesting. You might be interested in

https://en.m.wikipedia.org/wiki/On_Intelligence

https://www.amazon.de/Intelligence-Jeff-Hawkins/dp/0805078533

It was pretty popular and was read a lot among AI researchers. It's easy to understand.

Consciousness prepare to be understood!

u/saibog38 · 1 pointr/TrueReddit

Some reading I'd recommend.

Don't be scared off by his masters in theology - theology as an academic subject is a very relevant historical study into the psychology of man (and if it helps legitimize the author at all, the South Park guys are fans). The book is basically about psychoanalysis and the problem of identity. I'm a physics lover, engineer by trade, rationalist to the bone, and it gets my stamp of approval for making logical arguments. I've taken up an interest in neuroscience as well, to which I'd recommend this book. For me, those two books are approaching similar ideas from opposite directions.

Good luck broseph.

u/KtoL · 1 pointr/AskReddit

On Intelligence by Jeff Hawkins

It will change the way you think, about the way you think, about the way you think.

u/Loupiot · 1 pointr/france

I'll give you an opinionated take.

First off, in programming you really have to distinguish the "algorithmic" approach from a, let's say, more "architectural" approach to things.

To start out, I can't recommend the "algorithmic" approach enough, that is, learning to solve simple problems: start from an input, learn to cleanly compose functions that work well together, and arrive at an output. Understand recursion, iteration, mapping, etc. For that you need a very high-level language, close to "pseudocode". Python is a good idea; for my part, I'll suggest Scheme.

There are excellent resources for learning to code in Scheme, notably How to Design Programs, which is impressive. That course uses a language close to Scheme, Racket. If you dive into it you'll be busy for a long while, and you'll develop a very well-constructed approach to programming. Don't be put off by the Prologue, which is there precisely to show you the approach you're better off not taking. There's a free MOOC that goes with it. The next step, if you really want to push things (very) far, would be Structure and Interpretation of Computer Programs. Personally, I'd advise coming back to that one much later.

Once you've developed "algorithmic" skills, it would be good to look at a more "architectural" approach. For that I'd recommend Java, which will force you to think in terms of objects and to develop skills in organizing your code and using the strategy pattern (as was said below, object-oriented programming is really just refactoring). There are good MOOCs on OpenClassRooms or Coursera, for example.

Starting with Java seems like a bad idea to me despite the strong typing, in the sense that the language will get in the way of "algorithmic" thinking (Java's syntax, the fact that everything is a class, etc.). Many universities are reconsidering Java as a first language.

Starting with a language like C doesn't seem like a good idea to me either, in the sense that it's a fairly low-level language that will force you to manage aspects of programming (memory &amp; co) that are very close to the machine. In my opinion it would make an excellent third language, after Java. Other people will tell you exactly the opposite; it's really a matter of opinion. There are excellent books that take a bottom-up approach and start precisely from the machine (bits, logic gates, etc.) to work up to C: I'm thinking of Introduction to Computing Systems: From Bits and Gates to C and Beyond. I don't think it's available for free. It all depends on your goals, but also on your tastes.

The worst approach, in my opinion: diving straight into projects without having developed a clear vision of how to structure your code. For that reason I'd avoid, for example, resources like Automate the Boring Stuff with Python. It's great once you've already built mental "boxes" to sort what you learn into, but at the start it will completely lose you (of course, why not do it in parallel for fun and to make your daily digital life easier :)).

Good luck; learning to program takes several years.

u/Merad · 1 pointr/askscience

&gt; I wanted to write a program to emulate a CPU so I could fully understand how it's operation actually worked, but I have no idea of what the "start up" process is, or how we determine when to get a new instruction

The CPU begins loading instructions at a fixed address known as its reset vector. On AVR microcontrollers the reset vector is always at address 0, and it is always a jump instruction with the address where startup code actually begins. On x86 processors the reset vector is at address 0xFFFFFFF0. For a CPU emulator, which presumably doesn't need to deal with interrupt vectors or anything like that, I would just start loading instructions from address 0.

Also, you should look at some of the simplified CPU designs that have been made for teaching. In my classes we used the LC-3 (a very simple and straightforward design), then moved to the y86 (a simplified x86 design mainly used to teach pipelining). It will be much more realistic to make an emulator for one of them rather than for an extremely complex real-world design. I've linked below to textbooks that are based around each of those designs.

http://highered.mheducation.com/sites/0072467509/index.html

http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040
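To give a feel for what the core of such an emulator looks like, here is a toy fetch-decode-execute loop in Python. It starts loading at address 0 as suggested above; the two-word instruction format and the opcodes are invented for illustration and don't correspond to the LC-3 or y86.

```python
def run(memory):
    """Fetch-decode-execute over a flat memory image, starting at address 0."""
    pc, acc = 0, 0                    # the "reset vector" here is address 0
    while True:
        opcode, operand = memory[pc], memory[pc + 1]
        pc += 2                       # every instruction is two words wide
        if opcode == 1:               # LOADI: acc = operand
            acc = operand
        elif opcode == 2:             # ADDI: acc += operand
            acc += operand
        elif opcode == 0:             # HALT: stop and report the accumulator
            return acc

program = [1, 40,   # acc = 40
           2, 2,    # acc += 2
           0, 0]    # halt
print(run(program))  # 42
```

A real emulator adds registers, memory-mapped I/O, and interrupt vectors, but the fetch-decode-execute skeleton stays the same.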

u/croyd · 1 pointr/books

Yale Patt's book.

I took the course based on/around this book with the author, Yale Patt, my freshman year. I basically went from knowing nothing concrete about computers to having a pretty good idea of how I could make one if I had, say, a tub full of transistors.

(The book went hand-in-hand with the class; most of the lectures and examples were drawn almost entirely from the book.)

Also, the book basically takes you from practically the lowest level, transistors, all the way up to a higher level programming language, C.

u/ClimbingWolfBear · 1 pointr/AskNetsec

https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

This was the grand majority of my undergrad computer engineering degree. You can end up with most of the functional knowledge without a lot of the mathematical background.

u/minagawargious · 1 pointr/UTAustin

Get into Yale Patt's Introduction to Computing Class (EE 306), and get his textbook for the class and start reading it over Summer.

u/kickopotomus · 1 pointr/computerscience

If you want to better understand how computers work and take a bottom-up approach into programming, check out Patt's Introduction to Computing Systems.

u/Pillagerguy · 1 pointr/UIUC

https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

Look for a PDF of this book somewhere. I don't have one.

u/miguel_sucks · 1 pointr/compsci

My favorite CS course at Penn was Intro to Computer Architecture. It went through each abstraction layer from transistors to C. It was very digestible and a good prerequisite to a thorough OS or computer system design course. It looks like the lecture slides are still available here, and the textbook is here.

u/HidingFromMyWife1 · 1 pointr/ECE

If you're specifically into computer architecture, I'd suggest buying a college textbook and working through some of it. The University of Illinois used this book when I was there. I found it very useful as an intro to computer architecture. They use the LC-3, a 16-bit fixed-length instruction set, to teach you the basics of computer architecture. Next steps would be implementing this processor in either a simulation environment or possibly even an FPGA.

u/captkckass · 1 pointr/askscience

OP if you are really interested in learning all this I read this book for a class at Colorado State University. It was a really good book.

http://www.amazon.com/Introduction-Computing-Systems-gates-beyond/dp/0072467509

u/goodolbluey · 1 pointr/explainlikeimfive

Hey, I just started a class about this subject! Let me share what's in the first chapter of my book.

One of the main principles behind the architecture of a computer is called abstraction, which means more or less that each layer is separated from the next. When someone drives a car, they know what the dashboard lights mean and what the pedals do, but they might not necessarily know how the internal combustion engine works -- and, as long as it's working they don't need to. This lets different people work on different "layers" of the computer in different ways.

Think of a computer program as solving a problem. You need to do math? There's a calculator to solve that problem. You need to read an email? There's an email client to solve that problem. So, at the top layer, you have something you want to accomplish, and you need to tell the computer how to do it.

The next layer is one of algorithms -- giving the computer step-by-step instructions to accomplish the task. We design ways for computers to sort lots of data quickly, or to ask another computer for specific data.

To implement those algorithms, you need to use a programming language. Sometimes they're called mechanical languages because where a natural language like English or Spanish can be ambiguous, mechanical languages are never ambiguous -- they always mean exactly what they say. Some examples of a programming language are C++, Java, Fortran, COBOL, or Pascal. Here, you can teach the computer, for example, IF a button is clicked, THEN a window should open.

But, how does a computer understand a programming language? Programming languages are often translated by a compiler, which turns the language into instructions that are understood by the computer's ISA, or "instruction set architecture." The most common ISA is the x86, from Intel. PowerPC is another ISA, that Mac computers used to use. The ISA contains the instructions, data types, and addressing modes that a computer can understand.

The ISA is implemented in the next layer by a detailed organization called a microarchitecture. If an ISA is a type of chip, a microarchitecture is a specific chip, like a Pentium IV or a PowerPC 750FX. It communicates directly with the chip hardware.

The next layer is the logic circuit - I think (but I'm not 100% sure) that this has to do with the path operations take on the chip itself. Different operations will be performed in different ways, depending on the speed or power requirements of the operation.

The last layer is the device itself - my book gives examples of CMOS circuits, NMOS circuits, and gallium arsenide circuits, but it doesn't go into more detail than that. This is the physical layer, where transistors and diodes live.

I hope this helps give you a little more perspective! I'm still learning about this stuff too, and I think it's pretty neat.

u/PM_ME_UR_OBSIDIAN · 1 pointr/compsci

The "last interface" between hardware and software is the CPU. Think of it as made from a few universal shift registers connected to various other components (adders, etc.) Instructions will assert/deassert the control lines of the registers and components, and cause their contents to change accordingly. It's very interesting. Look at this book for more.

u/shinyhare · 1 pointr/ECE

After checking out some popular books besides the ones I learned from, for digital logic I found Schaum's Outline of Digital Principles is surprisingly good, and concise. You could definitely get by with that, googling anything that doesn't 'click' right away.

There are many books that go beyond basic digital logic to build things like microprocessors and embedded systems, so it's hard to give a solid recommendation (and in retrospect all the ones I've read were way too verbose, imo). The one I'm most familiar with is this one. It's cool since it explains how programming languages are translated down to the hardware level, and covers different processor architectures.

In any case, doing projects as you go along is probably going to be more important, and will teach you more than the reading itself.

u/samsmith453 · 1 pointr/computerscience

What interests you about CS? What would you like to build / know / work on?

I would always recommend starting at the very bottom when it comes to learning computer science. Build a knowledge of computing based on what is really happening under the hood, in the hardware. This is the approach I took and it gave me a great foundation, and accelerated my career!

This book is great: https://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503

I have just started a youtube series on understanding hardware aimed at beginners which you might find helpful:

https://www.youtube.com/playlist?list=PLH4a1-PgdkBTKkSSNx63uVkQG1Qs6GmYv

u/AlphaMotel · 1 pointr/compsci

Mathwise you could start with some basic number theory
I found Rosen's Discrete Mathematics textbook to be really helpful.


You could also start with boolean algebra (AND OR NOT XOR ) bit shifting and so on since it will be absolutely useful later on.


For computer hardware and assembly language, I used the Art of Assembly Language by Randall Hyde and Computer Organization and Design by Patterson and Hennessy.

For cryptography you might learn all about prime numbers, algorithms to find really large prime numbers, random number generator algorithms, and why some are more random (cryptographically strong) than others.

Then, using that, you can apply it towards public/private key encryption and one-way hash functions: the main hash algorithms in public use (MD5, SHA-1, SHA-512), how they compare against each other, and how one-way hash functions are used to verify data integrity.
I found Gary Kessler's site to be really helpful


For password security, you can then understand how to use a one-way hash function with a salt and a nonce to make a reasonably secure password storage system. You could learn how to safely store password hashes in a database like MySQL (www.mysql.com).
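As a sketch of that idea, here is salted password hashing with Python's standard library. The function names are illustrative, and PBKDF2 stands in for whichever slow KDF you'd actually pick (bcrypt, scrypt, and Argon2 are the usual real-world choices; a bare MD5/SHA digest is not enough):

```python
import hashlib
import os

def hash_password(password):
    salt = os.urandom(16)  # a fresh random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest    # store both values in the database

def verify_password(password, salt, digest):
    # Re-derive the hash with the stored salt and compare.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return candidate == digest

salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)
assert not verify_password("wrong-guess", salt, digest)
```

The salt defeats precomputed rainbow tables; the high iteration count makes brute-forcing each guess expensive.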


And once you understand one-way hash functions and public and private keys, you would already be 90% of the way to understanding how the Bitcoin protocol works, how CPUs mine bitcoins, and how the public blockchain ledger works.

As for other languages, one you could easily learn is Java using Processing. I really do enjoy using it; it was easy and fun to learn, and I use it a lot for rapid prototyping.

u/westernrepublic · 1 pointr/learnprogramming

Digital Design and Computer Architecture, Second Edition. It was the textbook for my college's Computer Organization class. You'll learn how to build a processor!

u/realistic_hologram · 1 pointr/learnprogramming

Love Code, very accessible. If you want more of a textbook, I thought this one was very good while still being quite accessible: http://www.amazon.com/gp/aw/d/0123944244/ref=redir_mdp_mobile/189-7884811-1167562

u/lanzaio · 1 pointr/AskProgramming

Definitely C. It's closer to the computer and lets you see much more of what the computer is actually doing. Learn C, then read this book: Computer Systems. It will teach you the answers to questions you don't yet know enough to ask.

u/ordnance1987 · 1 pointr/learnprogramming

This is the book I used in class. If you know C or C++ you can implement your own memory system; that's what helped me understand how memory allocation works.

u/Zardov · 1 pointr/rust

I second OSTEP; hands down the best introductory OS book.

Also, to learn systems programming from the ground up, [Computer Systems: A Programmer's Perspective](https://www.amazon.com/Computer-Systems-Programmers-Perspective-3rd/dp/013409266X) is a monumental work.

u/NohbdyAhtall · 1 pointr/learnprogramming

http://www.amazon.com/Computer-Systems-Programmers-Perspective-3rd/dp/013409266X

This was awesome. I learned so much from the beginning alone, and I didn't feel like I had to struggle to do so.

u/ideletedmyredditacco · 1 pointr/AskComputerScience

Start with the fundamentals so you have a good foundation to build on.

u/keepthethreadalive · 1 pointr/learnprogramming

This is often regarded as the canonical textbook for low level understanding of a computer. You'll learn about C and C++'s compilation process, overall computer architecture and more.

u/super__mario · 1 pointr/coding
u/NitroXSC · 1 pointr/Python

I used two main resources to start in ML:

  1. Book: Machine Learning: A Probabilistic Perspective (premium, but I got it from my university)
  2. Course: "Foundations of Machine Learning" by Bloomberg (Free)

Both are quite math-heavy, but they gave me a very solid basis to build on. For learning TensorFlow I used examples and the documentation.

u/andreyboytsov · 1 pointr/MachineLearning

The classic Russell &amp; Norvig textbook is definitely worth reading. It starts from the basics and goes to quite advanced topics:
http://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597/
Udacity has AI class that follows some chapters of that book.

Murphy's textbook builds ML from the ground up, starting from basics of probability theory:
http://www.amazon.com/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020/
(I see, it was already recommended)

Coursera has the whole machine learning specialization (Python) and a famous ML class by Andrew Ng (Matlab).

I hope it helps. Good luck!

u/TonySu · 1 pointr/learnprogramming

You should look for highly rated books in the subject you're interested in to get an idea of what you might want to learn. This information will generally be contained either in the preface or introduction chapters. Some books also contain appendices with maths background they think a reader needs. For example in Machine Learning: A Probabilistic Perspective under Preface &gt; Target Audience:

&gt; This book is suitable for upper-level undergraduate students and beginning graduate students in
computer science, statistics, electrical engineering, econometrics, or any one else who has the
appropriate mathematical background. Specifically, the reader is assumed to already be familiar
with basic multivariate calculus, probability, linear algebra, and computer programming. Prior
exposure to statistics is helpful but not necessary.

and in Pattern Recognition and Machine Learning the introduction says

&gt; Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.

as well as in the appendix

&gt; Appendix A Data Sets 677
&gt;
&gt; Appendix B Probability Distributions 685
&gt;
&gt; Appendix C Properties of Matrices 695
&gt;
&gt; Appendix D Calculus of Variations 703
&gt;
&gt; Appendix E Lagrange Multipliers 707

u/undefdev · 1 pointr/learnmachinelearning

I hadn't heard of Lie Groups as well (and didn't look it up the first time you mentioned them) - they sound amazing!
Right now I'm mainly reading the Murphy Book after having finished Probabilistic Models of Cognition (which I enjoyed because I always wanted to check out Scheme, and it has some great interactivity).

I suppose I'll have to put these books on the list, thanks! ;)

u/hwaldstein1997 · 1 pointr/cscareerquestions

"Introduction to Graph Theory (Dover Books on Mathematics)" by Richard J. Trudeau. I've been looking at this book recently and I think it's a fairly good primer for graph theory. It's written more as a math textbook, but I think it's quite useful for computer science.

"Turing's Vision: The Birth of Computer Science (MIT Press)" by Chris Bernhardt. I quite like learning about the origins of computer science (from Turing on), and this book is my favorite of the ones I've read on the subject.

"Quantum Computing since Democritus" by Scott Aaronson. I'm currently reading this book, and I've liked it so far. He also has a blog that I've been skimming. It's very interesting work. I'm likely to begin work on quantum computing with my (soon to be) advisor, so I'm planning on seeing what else he has to say on the subject.

"Elements of Mathematics: From Euclid to Godel" by John Stillwell. I'm a sucker for math history, and I like this book pretty well. It was used to teach a series of lectures on math history at UNSW (available on YouTube). Some of the lectures are quite useful for computer science. I also enjoyed "Mathematics and Its History (Undergraduate Texts in Mathematics)" by John Stillwell, which is similar.

"Categories for the Working Mathematician (Graduate Texts in Mathematics)" by Saunders Mac Lane. It's a bit of a heavy read, but I think category theory is quite interesting and could be very useful if you were to look in to e.g. Haskell.

"Types and Programming Languages (MIT Press)" by Benjamin C. Pierce. This is the best introduction to type theory that I'm familiar with, and it came with high praise from my advisor.

"Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race" by Margot Lee Shetterly. This is more of a "for fun" read, but it's a pleasant book about a couple of little-recognized historical figures. Plus, the upcoming movie looks pretty good.

Finally, I would definitely recommend the dragon book. I've seen it in a couple places in the comments already.

u/dhruvrajvanshi · 1 pointr/ProgrammerHumor

tl;dr
Step 1: http://craftinginterpreters.com

Step 2: https://www.amazon.com/Types-Programming-Languages-MIT-Press/dp/0262162091


There are a few good ones that I think are good depending what kind of language you're trying to build.

There's a series of blog posts called "Crafting interpreters"

http://craftinginterpreters.com

It's focused on dynamically typed scripting languages like Python and JavaScript, but it covers some of the basic stuff that you need in all compilers and then some. It's pretty beginner friendly.

On the other end of the spectrum, if you want to learn about implementing statically typed languages, then it's kind of harder to find resources that are not intimidating at first sight (they involve Greek letters and such :/). It's a deep rabbit hole of information and once you start digging, you'll be able to navigate it better. It's not actually that hard if you get used to the basic notation.

I would start by first learning languages with interesting type systems. Haskell is a good one, and the Learn You a Haskell series is a good starting point. The point of this exercise is to get familiar with a decent type system. You can't learn how to implement it if you don't know what to implement. Next, there's a good book for learning about type theory called "Types and Programming Languages" by Benjamin C. Pierce. If you want something less theoretical, there's also a paper called "Typing Haskell in Haskell", which is a tutorial-ish implementation of the Haskell type system. It's what helped me the most in making things click because it showed a concrete implementation.
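
To give a feel for what those resources build up to, here's a minimal sketch of a type checker for the simply typed lambda calculus in TypeScript, in the spirit of the early chapters of "Types and Programming Languages". All of the names here (`Type`, `Term`, `typeOf`, and so on) are my own illustrative choices, not taken from any book or library:

```typescript
// Types: just Bool and function types (arrows).
type Type =
  | { kind: "Bool" }
  | { kind: "Arrow"; from: Type; to: Type };

// Terms: variables, boolean literals, lambdas, and application.
type Term =
  | { kind: "Var"; name: string }
  | { kind: "True" }
  | { kind: "False" }
  | { kind: "Lam"; param: string; paramType: Type; body: Term }
  | { kind: "App"; fn: Term; arg: Term };

// Structural equality of types.
function typeEq(a: Type, b: Type): boolean {
  if (a.kind === "Bool" && b.kind === "Bool") return true;
  if (a.kind === "Arrow" && b.kind === "Arrow")
    return typeEq(a.from, b.from) && typeEq(a.to, b.to);
  return false;
}

// A typing context maps variable names to types.
type Context = Map<string, Type>;

function typeOf(ctx: Context, t: Term): Type {
  switch (t.kind) {
    case "True":
    case "False":
      return { kind: "Bool" };
    case "Var": {
      const ty = ctx.get(t.name);
      if (!ty) throw new Error(`unbound variable ${t.name}`);
      return ty;
    }
    case "Lam": {
      // Check the body in a context extended with the parameter.
      const inner = new Map(ctx);
      inner.set(t.param, t.paramType);
      return { kind: "Arrow", from: t.paramType, to: typeOf(inner, t.body) };
    }
    case "App": {
      const fnTy = typeOf(ctx, t.fn);
      const argTy = typeOf(ctx, t.arg);
      if (fnTy.kind !== "Arrow") throw new Error("applying a non-function");
      if (!typeEq(fnTy.from, argTy)) throw new Error("argument type mismatch");
      return fnTy.to;
    }
  }
}

// (\x: Bool. x) true  should have type Bool.
const id: Term = {
  kind: "Lam", param: "x", paramType: { kind: "Bool" },
  body: { kind: "Var", name: "x" },
};
const prog: Term = { kind: "App", fn: id, arg: { kind: "True" } };
console.log(typeOf(new Map<string, Type>(), prog).kind); // prints "Bool"
```

Real checkers add many more term and type forms, but the shape — recurse over the syntax tree, threading a context of variable types — stays the same.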


Also, shameless self plug, I wrote 2 blog posts about implementing basic type inference/checking which is more focused on the code rather than the theory. It's not expertly written but I tried to assume 0 knowledge about type systems. Also, it's written in typescript so it might be less alien than most type system related stuff that tends to be in Haskell.

https://medium.com/@dhruvrajvanshi/type-inference-for-beginners-part-1-3e0a5be98a4b

https://medium.com/@dhruvrajvanshi/type-inference-for-beginners-part-2-f39c33ca9513

Bear in mind that it's not the most clearly written tutorials out there and there are some mistakes which were corrected in edits but it should give you a basic idea.

u/adventuringraw · 1 pointr/learnmachinelearning

hm... huge chunks of math are also about representing real world things though. Is quantum field theory less theoretically valid because it's looking at real things instead of more abstract ideas? I do think OOP is less grounded than functional programming for now, obviously (this is an interesting book if you're in the mood for a giant adventure), but I don't know if OOP is going to stay intrinsically less grounded. The evolution of programming language theory, especially as it relates to compile-time type checking and such, is fascinating.

You won't need a ton of that for more ML research stuff though, so you don't need a gnarly background in large-scale coding architecture or anything; this is all more theoretical than practical for you either way. But OOP has saved my ass on more than one project, so I figured I'd stand up for the ideas, haha.

u/9us · 1 pointr/AskComputerScience

A good intro book might be Programming Language Pragmatics:

http://www.amazon.com/Programming-Language-Pragmatics-Third-Edition/dp/0123745144


A more theoretical treatment that builds a language from the lambda calculus can be found in Types and Programming Languages:

http://www.amazon.com/Types-Programming-Languages-Benjamin-Pierce/dp/0262162091


Lastly, I think Practical Foundations for Programming Languages strikes a nice balance between theory and practicality:

http://www.amazon.com/Practical-Foundations-Programming-Languages-Professor/dp/1107029570

I've read most of the last two books, and they're both excellent resources for learning how to think rigorously about programming languages. They're challenging reads, but you'll walk away with a deeper understanding of programming language constructs as a result. A draft version of the latter book can be found on the author's website, here.

u/Lericsui · 1 pointr/learnprogramming

Oh the endless coq-jokes at university...
"As you see, we are in desperate need of coq here"...
Jokes aside, another good book about that is this one by Benjamin Pierce

u/fmartin · 1 pointr/argentina

By Pierce you mean this one, right? I'd be up for it if there's a quorum.

u/pipocaQuemada · 1 pointr/learnprogramming

> what OOP actually IS...

The one problem with this is that OOP isn't particularly well-defined. It's a mish-mash of different ideas, with different languages implementing different parts of the space. Nevertheless, Wikipedia has a good list of assorted things that are generally OO. It's just worth noting that not every OO language has all of them, and some of them aren't even specific to OO.

For example, look at encapsulation. If by encapsulation you mean "not exposing internal implementation details", then many languages with module systems have it (for example, ML (I think, I'm not really an ML programmer) or Haskell, neither of which is object oriented). If you mean bundling methods with data, then C can do that.
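
The two readings of "encapsulation" in that paragraph can be sketched side by side. This is just an illustration in TypeScript; the names (`Counter`, `makeCounter`) are mine, not from any language spec:

```typescript
// Reading 1: bundling functions with data. Even C can emulate this
// with a struct holding function pointers.
interface Counter {
  value: number;
  increment(): void;
}

const bundled: Counter = {
  value: 0,
  increment() { bundled.value += 1; },
};

// Reading 2: hiding implementation details. The closure keeps `count`
// private, much like a module system hides a module's internals.
function makeCounter(): { increment: () => number } {
  let count = 0; // unreachable from outside this function
  return { increment: () => ++count };
}

bundled.increment();
console.log(bundled.value); // 1 — the data is public, just bundled

const counter = makeCounter();
counter.increment();
console.log(counter.increment()); // 2 — the data itself is hidden
```

Note that only the second version actually prevents outside code from touching the state, which is why "encapsulation" on its own underdetermines what a language provides.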

> things like polymorphism

"Polymorphism" is again an ambiguous term - when people use it, they often mean ad-hoc polymorphism, parametric polymorphism, or subtype polymorphism, and their usage usually depends on which community they're in. For example, Haskell programmers say polymorphism to mean parametric polymorphism, whereas Java programmers usually mean subtype polymorphism.
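
The three flavors can all be shown in one language. Here's a rough sketch in TypeScript (the names are illustrative; TypeScript erases overloads at runtime, so the ad-hoc case is approximated with a tag check rather than true overloading):

```typescript
// Parametric polymorphism: one definition works uniformly at every type.
function identity<T>(x: T): T {
  return x;
}

// Subtype polymorphism: anything expecting an Animal also accepts a Dog.
interface Animal { name: string }
interface Dog extends Animal { breed: string }

function greet(a: Animal): string {
  return `hello, ${a.name}`;
}

// Ad-hoc polymorphism: behavior differs per type, dispatched on a tag.
function describe(x: number | string): string {
  return typeof x === "number" ? `number ${x}` : `string "${x}"`;
}

const rex: Dog = { name: "Rex", breed: "labrador" };
console.log(identity(42));   // 42
console.log(greet(rex));     // hello, Rex
console.log(describe(3));    // number 3
console.log(describe("hi")); // string "hi"
```

Note how `identity` never inspects its argument, `greet` relies on the subtype relation, and only `describe` changes behavior based on the type it receives - that difference is exactly what the three terms distinguish.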

If you're interested in learning about these in more detail, I highly recommend Types and Programming Languages by Benjamin Pierce.

Also, while Bob Harper is a curmudgeonly troll, I've heard some good things about his book Practical Foundations for Programming Languages.

u/throwawaystickies · 1 pointr/WGU

Thank you!! If you don't mind my asking, if you're working a full-time job, how much time have you been allocating for the program, and in how many months are you projected to finish?

Also, do you have any tips on how I can best prepare before entering the program? I'm considering reading the Elements of Statistics during my commute instead of my usual reads, and brushing up on my linear algebra to prepare.

u/tedivm · 0 pointsr/programming

This reminds of the chapter on quicksort in Beautiful Code.

u/torgis30 · 0 pointsr/java

I like this one... it's not a "beginners" beginners book (it assumes you know some of the basics already) but it is very informative and thorough.

Data Structures and Algorithms in Java