Best AI & machine learning books according to redditors

We found 3,368 Reddit comments discussing the best ai & machine learning books. We ranked the 567 resulting products by number of redditors who mentioned them. Here are the top 20.


Subcategories:

Artificial intelligence & science books
Computer neural networks books
Artificial intelligence books
Machine theory books
Computer vision & graphics books
Natural language processing books

u/ilknish · 482 pointsr/learnprogramming

Code: The Hidden Language of Computer Hardware and Software.
It may be a bit lower level than you're looking for, but it'd be a great foundation to build on.

u/_a9o_ · 188 pointsr/cscareerquestions

Refactoring: Improving the design of existing code

Design Patterns

Working Effectively with legacy code

Clean Code

How to be a programmer

Then there are language specific books which are really good. I think if you read the above, slowly over time, you'll be in a great place. Don't think you need to read them all before you start.

u/HeterosexualMail · 187 pointsr/programming

We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.

That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".
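In the NAND-to-Tetris spirit, NAND is functionally complete, so every other gate can be derived from it alone; a minimal Python sketch of that idea (the function names here are just illustrative):

```python
# The NAND-to-Tetris premise in miniature: NOT, AND, OR, and XOR
# can all be defined in terms of NAND alone.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    # Classic 4-NAND construction of XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Sanity check: print the truth table for each derived gate
for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_(a, b), or_(a, b), xor_(a, b))
```

The book has you do the same construction in a hardware design language rather than software, but the exercise is identical in spirit.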

Petzold's Code is excellent as well.

Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4-book box set for 30 more), but it's a good book. Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.

u/falcojr · 103 pointsr/programming

If you're really serious about learning, I HIGHLY recommend the book Code: The Hidden Language of Computer Hardware and Software. It's basically a book that 'builds' a computer from the ground up, starting with simple Morse-code-type stuff through the wire, and each chapter just keeps building until you get to assembly and some higher-level language stuff at the end. You do have to think through (or glaze over) some stuff in a few chapters, but it's a very eye-opening book.

Edit: Grammar

u/BullockHouse · 95 pointsr/MachineLearning

I mean, only if you think we're tens of thousands of years away from powerful, general-purpose software agents. If you survey actual experts, they're pretty uncertain (and vulnerable to framing effects), but in general they think less than a century is pretty plausible. So it's closer to somebody looking at the foundational research in nuclear physics and going "hey guys, this is going to be a real fucking problem at some point." Which is pretty much what Einstein did (and started the Manhattan Project and a pretty significant intelligence operation against the development of a German nuclear weapon).

EDIT: Also, if anyone's interested, the same blogger made a rundown of the opinions of luminaries in the field on AI risk in general. Opinions seem to be split, but there are plenty of bright people who know their shit who take the topic seriously. For those who aren't familiar with the topic and think everyone's just watched too much bad sci-fi, I recommend Bostrom.

u/PM_ME_YOUR_MAKEFILE · 75 pointsr/learnprogramming

CODE by Charles Petzold is the book to read to understand computers at a base level. It literally starts at a single bit and moves all the way up the stack. I cannot recommend this book enough for someone starting out.

u/Scarbane · 70 pointsr/Futurology

He should brush up on his knowledge of general AI. Nick Bostrom's Superintelligence is a good starting place, even though it's already a few years old. I recommend the rest of you /r/Futurology people read it, too. It'll challenge your preconceived notions of what to expect from AI.

u/baileylo · 65 pointsr/PHP

Here's a list I compiled; the books hit a fairly wide range. I don't really have any given list of talks I like. I'll just randomly google for videos by a speaker I like or a subject I'm interested in. Books:

u/LateBroccoli · 64 pointsr/compsci

Long answer is here.

Short answer is "stop whining and learn while you still can".

Regards,

Someone who didn't

u/Lhopital_rules · 64 pointsr/AskScienceDiscussion

Here's my rough list of textbook recommendations. There are a ton of Dover paperbacks that I didn't put on here, since they're not as widely used, but they are really great and really cheap. Amazon search for Dover Books on mathematics. There's also this great list of undergraduate books in math that has become sort of famous: https://www.ocf.berkeley.edu/~abhishek/chicmath.htm

Pre-Calculus / Problem-Solving

u/Magical_Gravy · 61 pointsr/badcode

My bad. In Object Oriented Programming (OOP), there are lots of design patterns that end up getting repeated all over the place. You might have run into the factory pattern, or perhaps the builder pattern?
If you can understand and notice these patterns, it means you can re-use old code more effectively, because code to handle a pattern in one place is probably very similar to code to handle the same pattern in another. In addition, if you're discussing a problem with somebody, it means you can refer to the patterns by name as a sort of shorthand for "put it together like this". Saying "use a decorator" is a lot quicker and easier than describing what exactly "a decorator" is from scratch every time.

The "Gang of Four" are four computer scientists who were among the first to notice that these patterns kept popping up, and they wrote a pretty well known book describing about 20 of the most common ones.

In this specific instance, the builder pattern would probably have proved useful. Rather than having a single, monolithic constructor, you create a separate "builder" class.

Character character = new Character(xx, yy, life, kpph, kpyl, kvin, krak, kgr, kptgr, kbb, havepph, havepyl, havevin, haverak, haveya, isevented, x1, y1, x2, y2, x3, y3, x4, y4, x5, y5, IE)

can become

Character character = Character.builder().life(life).initialCoordinates(x1, y1) ... .build()

This is way more readable (especially if you're assigning values as arguments rather than named values; if you ever called createFrom(...) with a string of numbers, it'd be very difficult to work out which number was what), and a lot easier to lay out properly. It also means you can gather arguments for creation gradually over time rather than all at once.

Also, looking more closely, and as /u/PM_ME_YOUR_HIGHFIVE pointed out, they're not actually using objects at all, which would be a good place to start.

u/srnull · 54 pointsr/programming

Sorry to see this getting downvoted. Read the about page to get an idea of why /u/r00nk made the page. I have to agree with one of the other comments that it is way too terse at the moment. I remember when we learnt about e.g. d-latches in school, and it was a lot more magical and hard to wrap your head around at first than the page gives credit for. That and/or/xor gates can be built up from just NAND gates (the only logic gate properly explained) is also glossed over. Either go over it, or don't show the interiors of the other logic gates.

The interactive stuff is really neat. Good work on that.

Edit: If anyone reading wants to learn this stuff in more detail, two good books are

u/reddilada · 51 pointsr/learnprogramming

CODE: The Hidden Language of Computer Hardware and Software is a great book written for the non-tech crowd. It gives a good basis for what computers are all about. If he works in an office, I'd point him to Automate the Boring Stuff with Python, as it will deal with things he is probably already familiar with.

u/Nezteb · 43 pointsr/compsci

Some book recommendations:

u/kevroy314 · 43 pointsr/compsci

I first heard about these when reading Gödel, Escher, Bach back in later high school. That book was a long, difficult read, but man did it blow my brain wide open. Quines are definitely the thing that I remember most vividly (probably because they were the easiest to understand), but that book was full of awesome stuff like this. You should totally check it out! You can get it super cheap at used book stores since it was such a successful book.

u/indrora · 42 pointsr/programming

I think one of the most profound articles I've read about just what makes a computer "work" is Brian Kernighan's article, "What an educated person should know about computers". It's a decade old now and was developed into a book of similar name. (Amazon link)

Another was sitting down and reading Code: The Hidden Language of Computer Hardware and Software (Amazon link) and actually walking through it.
The book is coming up on 20 years old, but Petzold (who has taught many a developer how to do fancy tricks with their silicon) really sat down and wrote a book that anyone could understand and come away from feeling better off and more knowledgeable about the way our world works. This is the book I refer new programmers and old knitting-circle nannies to when they ask how a computer works.

u/greentide008 · 42 pointsr/compsci

u/myfavoriteanimal · 40 pointsr/compsci

Code, by Charles Petzold. Here it is on Amazon.

u/Shuank · 40 pointsr/argentina

I think a lot of people confuse being self-taught with doing some little course on how to build a website and leaving it at that. To reach a certain level, you have to learn computer science and theory, and work on things that let you apply that theory. You have to be able to look at an algorithm and calculate its complexity, understand what design patterns are and when it makes sense to apply one or another. You have to understand how OOP works, but you should also learn a functional language; it will make you a richer programmer. You have to understand unit testing, automated testing, integration testing. The two books that helped me most when I was starting out in computer science are: https://www.amazon.es/Algorithm-Design-Manual-Steven-Skiena/dp/1848000693 and https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612. Coding as you read and applying the material is fundamental. I think the difference with being self-taught is that you don't have the minimum bar that university sets, so it's up to you how good you want to become and whether you're willing to put in the work and keep learning constantly. The information is all on the internet or Amazon; there's no secret.
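Shuank's point above about being able to look at an algorithm and judge its complexity can be made concrete with a small experiment; this sketch (function names and setup are illustrative) counts the comparisons linear search and binary search make on the same sorted input:

```python
# Judging complexity empirically: O(n) linear search vs O(log n)
# binary search, measured by counting comparisons.

def linear_search(items, target):
    """O(n): check every element until the target is found."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """O(log n): halve the sorted search range each step."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

if __name__ == "__main__":
    data = list(range(1_000_000))
    target = 999_999  # worst case for linear search
    _, linear_steps = linear_search(data, target)
    _, binary_steps = binary_search(data, target)
    print(linear_steps)   # one comparison per element
    print(binary_steps)   # roughly log2 of a million, about 20
```

On a million elements the gap is a million comparisons versus about twenty, which is the kind of difference complexity analysis predicts before you run anything.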
u/zombox · 38 pointsr/gamedev

The last couple of weeks in Zombox development (tl;dr: brand new zombie AI with lots of bells and whistles, tweaked ammo display, and you can no longer hit things through walls):

• The zombie AI was completely re-written. Inspired by examples in the book Programming Game AI By Example, I decided to go with a behavioural-hierarchy approach, rather than my old system, which relied on a complex FSM in the form of a large if-then tree.

• In the old zombie AI system, zombies knew the player's position at all times, and their only goal was to find the player and attack him. In the new system, the zombies' behaviour is more realistic -- they have limited knowledge of their surroundings and react to sensory input to gain new information. Zombies generally idle or wander, and will not attack the player unless they see him. If they hear a noise (like the player firing a gun, or hitting something with a blunt weapon), they will seek the source of the sound, and either continue wandering if they find nothing, or attack the source if they can see it and it's human. This sight/sound-based system is nice because it allows you to do things like distract zombies with a sound while you escape in the opposite direction, or sneak up behind zombies to attack.

• Zombie flocking has been improved. Zombies will no longer randomly walk through walls, or walk through the player, or walk through each other. Also, when attacking they will aim to surround the target, rather than simply attack from one side.

• If a zombie finds that a door is in the way of its path (i.e., if it chases a target into a building and the target closes the door, or it hears a sound inside a building), it will attempt to break the door down to get inside.

• In non-AI-related news, the weapon ammo system has been improved to show the ammo counters for weapons in the active item slots up top, and when a weapon's ammo is depleted, the active item icon for that weapon will blink red.

• Props and zombies can no longer be hit through walls or other objects.

Here are some screens showing the debug info I use to help me visualize the new AI system. White lines show new A* paths that are calculated. Green lines point to the next node on a zombie's path when the zombie is following a sound. Magenta/cyan lines point to a zombie's active target (cyan = close, magenta = far). Red lines show the next node on a zombie's path when the zombie is chasing a target (although zombies are allowed to veer off their path when the target is directly in range). Yellow lines point to a new sound that a zombie has heard. One, Two, Three, Four, Five

Animations:

• here's a gif showing some zombies chasing the player into a building, and attempting to break down the door to get inside.

• here's another gif showing that same thing happening, in a different scenario

• here's a gif (sped up 200%) showing some of the improved swarming

• here's a gif showing the active item icon ammo improvements

As for New Year's resolutions... my short-term goal is to implement Jump Point Search into my A* algorithm. My long-term goals involve adding human NPCs, the underground subway system, the structure-building system, mini-quests, more zombie types including zombie animals, and releasing the game in May.

More info: Devblog, Youtube, Twitter

u/rusty_shaklefurd · 37 pointsr/Cyberpunk

A central concept in cyberpunk and hacker culture is the idea of planned obsolescence: corporations can make more money if they get you to buy their products multiple times instead of just once. This leads to a world where everything is discarded, and the wealth gap is very clear between the people who have the new and the people who have the old. The fact of the matter is that DNA is not our friend. Humans were built to spread our seed and be destroyed. We are a tool that DNA uses to extend its own life. The human body is amazing in many ways, but it's amazing like a disposable razor is amazing.
There's no mechanism to prevent cancer, no mechanism to prevent the development of back problems, and no mechanism to prevent it from withering away like rotten fruit when its purpose of reproduction has been served. The implementation of transhumanism might be flawed, but so are all human endeavors. That's what cyberpunk is about: figuring out how to deal with a world ruled by technology. Sometimes it doesn't go as smoothly as we imagine. The message of transhumanism is still clear, though: DNA doesn't own this planet any more, we do, and the name of the game is going to stop being reproduction and start being the enjoyment of existence. Since you seem to be basing your understanding almost entirely on fiction, let me recommend some reading.

u/5hot6un · 36 pointsr/videos

Ray Kurzweil best describes how humankind will transcend biology. Our biology binds us to time. By transcending biology we will transcend time. With all the time we need, we needn't worry ourselves with trying to create wormholes when we can just send a copy of ourselves in all directions.

u/veryreasonable · 35 pointsr/RationalPsychonaut

As one of the people who commented on that thread, I feel the need to respond to this as rationally as humanly possible. For starters, let's clear up the difference between fractal mathematics, fractal woo, and what Douglas Hofstadter might call fractal analogy.

1. From the wiki, Fractal Mathematics would be the study of "natural phenomena or mathematical sets that exhibit repeating patterns displayed at every scale," as well as the study of self-similarity and iterated functions. While it has grown complex and vast, the study of fractals and their geometry started out as literally what you say it isn't: people asking questions about self-similarity in nature and how to describe it mathematically.

2.
Fractal Woo would be, as OP said:

&gt; "Everything big is just like everything small!" they exclaim, "the universe is self-similar!"

...and then using such logic to justify whatever silly energy-Reiki-mystical-connectedness-telepathy-du-jour they want.

3. Fractal Analogy (my term, but run with it) would be seeing patterns in the world which are, indeed, self-similar, as tons of stuff in nature is. This includes plant and animal systems, as well as consciousness and human experience.

The reason I mention Douglas Hofstadter is that he is a PhD physicist who literally used fractal mathematics to predict some pretty nifty real-world stuff 35 years before it was confirmed - but Mr. Hofstadter is also an incredibly enjoyable author who muses at length about cognitive science and AI research, often using the analogy of self-similar shapes to help describe what we understand of consciousness in a way that most lay readers can follow. Even if you are not a very capable mathematician, I highly recommend his Gödel, Escher, Bach, which uses fractals and loads of other creative stuff to help conceptualize how the "mind" arises from the brain.

As well, Chaos Theory - the study of how immensely complex patterns emerge from seemingly simple preconditions - is full of fractal mathematics. Given that the universe is absolutely packed with iterated functions and self-similarity almost everywhere we look, I think you can absolutely take the point of view that the universe is fractal in nature, especially when you are in a self-induced state where your brain makes a lot of connections you might normally overlook or not even bother to think about.

My point is that discussing things in the universe as self-similar is useful to mathematicians and non-mathematicians alike; using the word "fractal" to describe natural systems that exhibit those familiar patterns might not be perfectly correct, but it's not itself offensive or an affront to reasonable discourse. I manage a business; so what's your problem if I visualize the structure of my company as a fern leaf, with departments and employees as branches off the main stem? What would be the issue with discussing how incredible human cellular morphology really is with my biologist roommate, and citing some cool research someone decided to do about fractal geometry in the way our bodies build themselves?

EDIT: OP's edit makes it clear his statements were more about irrational folk seeing the universe as a single continuous fractal (that would be the "fractal woo"), and that he is not denying the existence of fractal-like patterns in nature, or that fractal models can be useful in understanding phenomena. Sorry for any confusion, and thanks for the discussion!

EDIT2: /u/ombortron commented pretty well on the utility of the concept of fractals in scientific discourse and otherwise:

&gt; The universe itself doesn't have to be a fractal for fractals to be important. Fractals are quite common in our reality, and as a result they are an important facet of reality, and as such they are a legitimate and common topic of discussion amongst people, and this is particularly true of people who do psychedelics. Does this mean the universe is 100% fractal in nature? No.

u/RhoTheory · 33 pointsr/MachineLearning

Grad school for machine learning is pretty vague, so here are some general resources I think would be good for an incoming CS grad student or an undergraduate CS researcher with a focus on deep learning. In my opinion, the courses you mentioned should be a sufficient foundation to dive into deep learning, but these resources cover some foundational stuff as well.

• Kaggle is for machine learning in general. It provides datasets and hardware. It has some nice tutorials and you can look at what other people did.

• Google has an online crash course on Machine Learning.
• Hands-On Machine Learning with Scikit-Learn and TensorFlow is a great book for diving into machine learning with little background. The O'Reilly books tend to be pretty good.

• MIT Intro to Deep Learning provides a good theoretical basis for deep learning specifically.

• MIT Intro to AI. This is my favorite online lecture series of all time. It provides a solid foundation in all the common methods for AI, from neural nets to support vector machines and the like.

• TensorFlow is a common framework for deep learning and provides good tutorials.

• Scikit-learn is a framework for machine learning in Python. It'd be a good idea to familiarize yourself with it and the algorithms it provides. The link is to a bunch of examples.

• Stanford's deep learning tutorial provides a more mathematical approach to deep learning than the others I've mentioned -- basic vector calc, linear algebra, and stats should be enough to handle it.

• 3Blue1Brown is a math YouTuber who animates the visual intuitions behind many rather high-level concepts. He has a short series on the math of neural networks.

• If you are going to be dealing with hardware for machine learning at all, this paper is the gold standard for everything you'd need to know. Actually, even if you aren't dealing with the hardware, I'd recommend you look at the sections on software. It is fairly high level, however, so don't be discouraged if you don't get some of it.

• Chris Olah's blog is amazing. His posts vary from very intuitive explanations of complex topics to actual research papers. I recommend "Neural Networks, Manifolds, and Topology".

u/chuwiki · 33 pointsr/Python

I'd recommend this book. It's really nice for beginners :D

u/geek_on_two_wheels · 33 pointsr/csharp

u/Pally321 · 33 pointsr/mildlyinteresting

If you're serious about getting into software development, I'd recommend you start looking into data structures and algorithms as well.
It's something I think a lot of self-taught people tend to miss, because it's not required knowledge to program, but it will give you a huge competitive advantage. While I haven't read it, this book seems like a good introduction to the concept: https://smile.amazon.com/dp/1617292230/?coliid=I34MEOIX2VL8U8&amp;colid=MEZKMZI215ZL&amp;psc=0

From there I'd recommend looking at MIT's Intro to Algorithms, 3rd Edition. A bit more advanced, but the topics in there will play a huge role in getting a job in software.

u/jhartikainen · 32 pointsr/cscareerquestions

fwiw, bare-minimum working code is often a good idea if we're talking about the amount of code to do some task :)

Design patterns are most useful in that they help you start recognizing patterns in your own code, and they show you a number of common patterns which can be useful - but it's good to keep in mind that you shouldn't force a design pattern somewhere just because it's a design pattern. Anyway, the Design Patterns book is good, and so is Head First Design Patterns.

u/zorfbee · 32 pointsr/artificial

Reading some books would be a good idea.

u/wall_time · 32 pointsr/programming

Charles Petzold also wrote Code: The Hidden Language of Computer Hardware and Software. It's a great book. I'm sure most of the people browsing this subreddit already understand most of what is in the book (or have read it already), but it's a fantastic read nonetheless.

u/jesseguarascia · 31 pointsr/gamedev

I think your major problem here is that you want the "why not"s instead of the "why"s. A good programmer can look at a chunk of code and determine "why" the programmer is doing certain things. The pre-existing code blocks that people refer to are given because you should be able to read through them and interpret what's going on and why. The question you most likely ask at the "interpreting" stage isn't "why" but instead "why that way and not this way?"
Really, when it comes down to it, the answer to that question for a lot of things in engine programming (or just programming in general) is that it's what the lead designer or lead programmer thought was the best idea. For instance: how do you want to store your array of tiles? As integers representing tile indexes in a tile set? As separate Tile class instances in a vector array containing vector arrays of Tile instances? As a hashmap indexed using characters to grab a tile? There's a million ways to handle each and every part of an engine; it all comes down to what design patterns and theories you think are best for what you need your engine to do.

I suggest reading up on some of the design patterns in here (actual link in the sidebar) and here. They're a great way to start understanding the multitude of ways of handling different ideas in your engine! Reading up on pre-existing theory or seeing pre-existing pseudo-code is fine and dandy, but sometimes you have to reinvent the wheel. Sometimes, for the most part, you can follow a lot of design patterns that already exist.

P.S. For a great tutorial on loading tile maps and working with them in your game, lazyfoo's got you covered (it's in C++ but can easily be adapted for other languages): Here

u/majordyson · 29 pointsr/MachineLearning

Having done an MEng at Oxford where I dabbled in ML, the 3 key texts that came up as references in a lot of lectures were these:

Pattern Recognition and Machine Learning (Information Science and Statistics) https://www.amazon.co.uk/dp/0387310738/ref=cm_sw_r_cp_apa_i_TZGnDb24TFV9M

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning Series) https://www.amazon.co.uk/dp/0262018020/ref=cm_sw_r_cp_apa_i_g1GnDb5VTRRP9 (Pretty sure Murphy was one of our lecturers, actually?)
Bayesian Reasoning and Machine Learning https://www.amazon.co.uk/dp/0521518148/ref=cm_sw_r_cp_apa_i_81GnDbV7YQ2WJ

There were of course others, and plenty of other sources and references too, but you can't go buying dozens of textbooks, not least because they would repeat the same things. If you need some general maths reading, then pretty much all the useful (non-specialist) maths we used over 4 years is in this: Advanced Engineering Mathematics https://www.amazon.co.uk/dp/0470646136/ref=cm_sw_r_cp_apa_i_B5GnDbNST8HZR

u/KobayashiDragonSlave · 28 pointsr/learnprogramming

Not OP, but I discovered the book 'Grokking Algorithms' through a fantastic YouTube channel, 'The Coding Train'. The book explains a lot of the algorithms and data structures that I am learning in my first semester of CS at school. It even covers the stuff that I am going to learn next semester. I found this book much more fun than my monotonous textbooks. If anyone wants to get a good grasp of the fundamentals of A&amp;DS, this is a great starting point; from there, move on to MOOCs by famous universities. MIT's A&amp;DS was the one that I used. Dunno if it's still available on YouTube, because I remember that OCW courses were removed or something? Link

u/hooj · 28 pointsr/explainlikeimfive

The whole subject is a bit too complicated and a bit too deep for a short ELI5, but I'll take a stab at the gist of it.

The reason why computers work (at least in the vein of your question) is very similar to the reason why we have language -- written, spoken, etc. What you're reading right at this very moment is a complex system (language) simplified to symbols on the screen. The very fact that you can read these words and attain meaning from them means that each sentence, each word, and each letter represent a sort of code that you can understand. If we take an apple for example, there are many other ways to say "apple" in different languages. Manzana. Pomme. Apfel. And so on. Codes -- some symbol maps to some concept.
In the context of computers, well, they can only "understand" binary: ones and zeros, on and off. That's okay, because we can map those ones and zeros to codes that we (humans) care about. Like 101010111 could represent "apple" if we wanted it to.

So we build physical circuits that either have power or don't (on and off), and we can abstract that to 1's (power flowing through that circuit) and 0's (no power flowing through it). This way, we can build physical chips that give us basic building blocks (basic instructions they can perform) that we can leverage in order to ultimately make programs, display stuff, play sounds, etc. And the way we communicate that to the computer is via the language it can understand: binary. In other words, in a basic sense, we can pass the processor binary, and it should be able to interpret that as a command. The length of the binary and what it should contain can vary from chip to chip. But let's say our basic chip can do basic math. We might pass it a binary number, 0001001000110100, and it might slice that up as 0001 | 0010 | 0011 | 0100 -- so the first four bits, 0001, might map to an "add" command. The next four, 0010, might map to a memory location that holds a number. The third group of four might be the number to add it to. The last group might be where to put the result. Using variables, it might look like c = a + b, where "c" is 0100, "a" is 0010, "b" is 0011, and the "+" (addition operator) is 0001.

From those basic instructions, we can layer abstractions. If I tell you to take out the trash, that's a pretty basic statement. If I were to detail all the steps needed to do it, it would get a lot longer -- take the lid off the can, pull the bag up, tie the bag, go to the big garbage can, open the lid, put the trash in. Right? Well, if I tell you to take out the trash, it rolls all those sub-actions needed to do the task into one simple command. In programming, it's not all that different.
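The hypothetical 16-bit "add" instruction above can be sketched in a few lines of Python; note the opcode table and the toy "memory" are invented for illustration, not any real instruction set:

```python
# A toy decoder for a hypothetical 16-bit instruction, split into
# four 4-bit fields: opcode | src_a | src_b | dest.

def decode(instruction):
    """Split a 16-bit word into four 4-bit fields."""
    opcode = (instruction >> 12) & 0xF
    a      = (instruction >> 8) & 0xF
    b      = (instruction >> 4) & 0xF
    dest   = instruction & 0xF
    return opcode, a, b, dest

def execute(instruction, memory):
    """Run one instruction against a dict standing in for memory."""
    opcode, a, b, dest = decode(instruction)
    if opcode == 0b0001:          # 0001 = "add" in this toy ISA
        memory[dest] = memory[a] + memory[b]
    else:
        raise ValueError(f"unknown opcode {opcode:04b}")
    return memory

memory = {0b0010: 5, 0b0011: 7, 0b0100: 0}   # locations "a", "b", "c"
execute(0b0001001000110100, memory)          # c = a + b
print(memory[0b0100])                        # prints 12
```

Real processors do essentially this field-slicing in hardware, just with wider words and far more opcodes.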
We layer abstractions to the point where we can call immense functionality with relatively little code. Some of that code might control the video signal being sent to the screen. Some of that code might control the logic behind an app or a game. All of the code, though, is getting turned into 1's and 0's and processed by your CPU in order to make the computer do what is asked. If you want to learn more, I highly recommend Code by Charles Petzold for a much more in-depth but still layman-friendly explanation of all this.

u/cholland89 · 27 pointsr/compsci

I just finished reading Code: The Hidden Language of Computer Hardware and Software and will state unequivocally that this book is the most satisfying read I've experienced. It starts with flashlights blinking through windows, moves to Morse code, introduces electrical relays and demonstrates how they can be connected to form logic gates, then uses those gates to construct an ALU/counter/RAM and multiplexors. It goes on to describe the development of an assembly language and the utilization of input and output devices. This book can be described as a knowledge hose flooding the gaps in my understanding of computer hardware/software at an extremely enjoyable pace. It may help satisfy your interest in the concepts and technology that led to modern computers. Check out the reviews for more info.

If you haven't already studied logic gates in depth in your formal education, I would suggest using a logic simulator to actually build the combinational logic structures. I now feel very comfortable with logic gates and have a strong understanding of their application in computing from my time spent building the described logic. I went through the book very slowly, rereading chapters and sections until I felt confident that I understood the content. I cannot recommend this book enough.

After reading CODE, I have been working through The Elements of Computing Systems: Building a Modern Computer from First Principles.
If you are looking to gain a better understanding of the functions of hardware components, this is the book to read. This book's companion site http://www.nand2tetris.org has the first chapters free along with the entire open source software suite that is used in the book's projects. You will build, in the hardware design language starting with Nand gates, each logic gate and every part of a computing system up to a modern high level language with which you can program custom software of your own design to compile in a compiler you designed into an assembly language you specified which is turned into binary that runs in a processor you built from Nand gates and flip flops. This book was very challenging before reading CODE; now I feel like I'm simply applying everything I learned in CODE with even more detail. For somebody that hasn't attended college for computing yet, this has been a life changing experience. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686 u/stevenxdavis · 27 pointsr/compsci I just started reading CODE by Charles Petzold and I've really enjoyed it so far. It's an accessible take on the basics of computer science that doesn't just focus on computers themselves. u/bpikmin · 26 pointsr/programming I highly recommend the book Code. I read it in middle school and it was absolutely fascinating. Pretty short too. u/Lericsui · 26 pointsr/learnprogramming "Introduction to Algorithms" by Cormen et al. is for me the most important one. The "Dragon" book is maybe another one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good. Concrete Mathematics by Knuth and Graham (you should know these names) is good for mathematical basics. Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.
SICP (although married to a language) teaches very very good fundamentals. Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material. u/cronin1024 · 25 pointsr/programming Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand and as a result it may contain errors. edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's. edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books. edit: Updated up to redline6561 u/technogeeky · 24 pointsr/AskReddit u/in0pinatus · 23 pointsr/programming I admire your dogged adherence to being wrong in every particular. It takes a special brand of stubborn contrarianism to quote someone's badly edited notes as a primary source and then follow up with a claim that this is the best possible research. However, outside in the real world, Alan Kay writes extensively and authoritatively here and in his numerous contributions on Hacker News quite aside from publications spanning decades. And an awful lot of people agree with his definition. The introduction of the classic Design Patterns defines objects as an encapsulated package that only responds to messages. People who teach OO programming readily quote Mr Kay's definition. The Ruby programming language is fundamentally based upon it, and before you shout "but Ruby has classes" note that Ruby classes are actually themselves objects, for which the new message happens to do something particular by convention.
And so on; the point being that Alan Kay's definition is super influential, which is why the idea that Erlang is the most object-oriented language is not a new proposition. u/EricHerboso · 23 pointsr/westworld Asimov's books went even farther than that. Don't read if you don't want to be spoiled on his most famous scifi series. [Spoiler](#s "Because Law 1 had the robots take care of humans, the first AIs decided to go out and commit genocide on every alien species in the universe, just so they couldn't compete with humans in the far future.") AI safety is hard. Thankfully, if you care about actually doing good in real life, there are organizations out there working on this kind of thing. Machine Intelligence Research Institute does research on friendly AI problems; the Center for Applied Rationality promotes increasing the sanity waterline in order to increase awareness of the unfriendly AI problem; the Future of Humanity Institute works on several existential risks, including AI safety. If you want to learn more about this topic in real life, not just in fiction, then I highly recommend Nick Bostrom's Superintelligence, a book that goes into detail on these issues while still remaining readable by laymen. u/devilbunny · 23 pointsr/explainlikeimfive That's a pretty interesting course. I've read the book and done exercises up until you actually have to start building the CPU. However, I would strongly recommend reading Charles Petzold's CODE first. It's a little less technical, but explains the general concepts much better than nand2tetris. u/abstractifier · 22 pointsr/learnprogramming I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit.
I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone. Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know. u/teknobo · 22 pointsr/programming Even though this seems to be just aggregating some Stack Overflow answers into blogspam, I'll bite. > Two lines of code is too many If you're seriously going to complain about one extra line of code in a method, I don't see this ending well. > If it's not native, it's not really programming Semantics. Even if you don't call it programming, you'd damn well better know those things if you want to use them. SQL, Java, and any other VM-based language may not qualify as "programming" by this definition, but they're still damn useful. > The "while" construct should be removed from all programming languages. (In favor of "Repeat...Until") Semantics again. There is no functional difference between the two, and I would argue that while is actually preferable since it puts the looping condition right there on the same line, instead of having to skip to the end of the block to find out if you even entered the block in the first place. > Copy/pasting is not an anti-pattern. No, it's not, and it's been proven. I'm having a hard time finding the peer-reviewed study on copy/paste programming right now, but basically, it's been shown to save a lot of time as long as you're using it properly. Where the hatred for it comes in is that, like GOTO, if you use it too often, you'll probably end up using it wrong.
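The while versus repeat/until point above can be sketched in Python (which has no repeat/until, so the usual `while True`/`break` idiom stands in for it); apart from the minimum trip count, the two forms are interchangeable:

```python
# The two loop forms compute the same thing; only where the
# condition is checked (and thus the minimum trip count) differs.

def count_down_while(n):
    steps = []
    while n > 0:          # condition up front: body may run zero times
        steps.append(n)
        n -= 1
    return steps

def count_down_repeat_until(n):
    steps = []
    while True:           # Python's stand-in for repeat ... until
        steps.append(n)
        n -= 1
        if not n > 0:     # "until n <= 0"; body always runs at least once
            break
    return steps

print(count_down_while(3))         # [3, 2, 1]
print(count_down_repeat_until(3))  # [3, 2, 1]
print(count_down_while(0))         # []
print(count_down_repeat_until(0))  # [0]  <- the one observable difference
```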
> Developing on .NET is not programming, it's just stitching together other people's code A reiteration of his 2nd point, but honestly, a huge amount of working as a professional programmer -- hell, almost the definition of working in a team -- is stitching together other people's code. There's nothing wrong with that, and it's hardly controversial. > The use of try/catch exception handling is worse than the use of simple return codes and associated common messaging structures to ferry useful error messages. This has been getting debated a lot in go-lang circles, but the general consensus seems to be that unless you're working in an embedded environment (or some other highly-constrained environment), you're probably better off with try/catch. > Test constantly Test-Driven Development is something that I personally agree with, and truthfully has become a very popular practice among Rails people. I don't see how that would qualify it as being controversial. That said, certain studies have shown evidence that TDD is not as effective as many seem to believe. > Object Oriented Programming is absolutely the worst thing that's ever happened to the field of software engineering. I've heard this claim semi-often. It seems to mostly come from people having worked with languages that claim to be OO but constantly make exceptions to the rules, like Java, C++, or Python. In fact, the author specifically calls out Java. Try Smalltalk or Ruby and you'll come to see that OOP done right is actually quite wonderful. > C (or C++) should be the first programming language Debatable, but certainly not controversial by any stretch of the imagination. > Classes should fit on the screen. How big is your screen? I can fit any class definition on a 64" monitor. Some classes simply must be large. It is an unavoidable fact that certain things are simply more complex to model than others. This point isn't controversial, it's just asinine.
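The try/catch versus return-code trade-off above can be sketched side by side; the function names and the port-parsing task are invented purely for illustration:

```python
# Two styles of error handling for the same small parsing task.
# Both functions are hypothetical examples, not a real API.

def parse_port_exc(text):
    """Exception style: failure propagates unless someone catches it."""
    port = int(text)              # raises ValueError on bad input
    if not 0 < port < 65536:
        raise ValueError("port out of range")
    return port

def parse_port_code(text):
    """Return-code style: the caller must remember to check the flag."""
    try:
        port = int(text)
    except ValueError:
        return False, None
    if not 0 < port < 65536:
        return False, None
    return True, port

ok, port = parse_port_code("8080")    # caller checks ok explicitly
try:
    port = parse_port_exc("not-a-port")
except ValueError:
    port = None                       # error handled at the call site
```

The return-code version can silently lose errors if a caller ignores `ok`; the exception version cannot be ignored, which is the usual argument for it outside constrained environments.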
> Making invisible characters syntactically significant in python was a bad idea This again? Is it really a controversial opinion if it's been something non-Python programmers have been whining about for decades? Because as far as I can tell, people whine about it for about the first five minutes of Python coding, and then give up because they would've been indenting anyway. It can cause bugs when transferring code between computers, I'll give them that. Otherwise, it's Python demanding good formatting, something that you should be demanding from everyone on your team anyways. My main regret with Python is that I haven't found a good tool that auto-formats everything (a la "gofmt"). But otherwise, Python's indentation requirements are so in line with common indentation in almost every programming language that proper indentation comes naturally to more or less everyone. In how many programming languages that you regularly use do you not format your conditional, looping, class/method, or exception blocks? > Singletons are not evil It's not controversial to agree with Design Patterns. That book is more or less the undisputed truth on the subject, and it thinks the Singleton pattern is fine and dandy. u/darawk · 22 pointsr/compsci Gödel, Escher, Bach is precisely what you're looking for. u/MyrddinE · 21 pointsr/programming Gödel, Escher, Bach: An Eternal Golden Braid This book is not exactly a programming book... maybe... kinda. It teaches no practical programming language. It explains no useful design patterns. It does not deal with any practical computer applications. And yet I never would have really gotten into programming had I not read it long ago. Written in the late 70's, it's still relevant today. u/willardthor · 21 pointsr/compsci During my studies, in my research (information-flow security), and as a working computer scientist, I practically never find use for analysis.
Most of (theoretical) computer science is based on logic and algebra, and deals with discrete structures. That is not to say that real analysis has no home in computer science. Far from it. Any aspect of computer science that deals with randomness touches on analysis. These aspects resort to probability theory to reason about randomness. And probability theory applies analysis. Example aspects in computer science include cryptography (pseudo-random number generators, hash functions, etc., which rely on number theory), formal verification (model checking, which sometimes uses continuous-time Markov chains as models), information theory (channel coding and signal processing), and machine learning (Bayesian networks). And, as I am sure you know, optimization and analysis of algorithms (smoothed analysis). You might find the following books interesting: u/SUOfficial · 21 pointsr/Futurology This is SO important. We should be doing this faster than China. A branch of artificial intelligence is that of breeding and gene editing. Selecting for genetic intelligence could lead to rapid advances in human intelligence. In 'Superintelligence: Paths, Dangers, Strategies', the most recent book by Oxford professor Nick Bostrom, as well as his paper 'Embryo Selection for Cognitive Enhancement', the case is made for very simple advances in IQ by selecting certain embryos for genetic attributes or even, in this case, breeding for them, and the payoff in terms of raw intelligence could be staggering. u/KernlPanik · 20 pointsr/learnprogramming I'm a ~10 year sysadmin that has decided to rebuild my software dev skills that I haven't used since college. Here's what I did to reawaken that part of my brain: 1. Harvard's CS50. I figured an entry level college course would be "beneath me" but it was a great experience and I learned a surprising amount. It's very entertaining as well so that made the "simple" parts fun to do as well. 2. Read CODE by Charles Petzold.
Great insight into the nuts and bolts of how computers work. Read through it on my lunch breaks while taking CS50 in the evenings. 3. Read and do the problems in C Primer Plus. This is a great book for learning how to write in C, which is the basis for all modern languages and is still widely used today. Great starter book for anyone who wants to learn to program. 3.5) After going through the last chapters of C Primer Plus, I realized that some of my math skills were not up to par, so I took this MOOC from MIT to supplement that. No idea if that's something you need. 4. Here comes the fun one: The Structure and Interpretation of Computer Programs, aka The Wizard Book. This book is more about how to design software in general, and it is pretty difficult. That being said, if you can get through it then you have the chops to do this professionally. u/MrBushido2318 · 20 pointsr/gamedev You have a long journey ahead of you, but here goes :D Beginner C++ Primer: One of the better introductory books. The C++ Standard Template Library: A Tutorial and Reference: Goes over the standard template library in fantastic detail, a must if you're going to be spending a lot of time writing C++. The C++ Programming Language: Now that you have a good idea of how C++ is used, it's time to go over it again. TCPPL is written by the language's creator and is intended as an introductory book for experienced programmers. That said, I think it's best read once you're already comfortable with the language so that you can fully appreciate his nuggets of wisdom. Intermediate Modern C++ Design: Covers how to write reusable C++ code and common design patterns. You can definitely have started game programming by the time you read this book, however it's definitely something you should have on your reading list. C++ Templates: Touches on some similar material as Modern C++ Design, but will help you get to grips with C++ Template programming and how to write reusable code.
Effective C++: Practical advice about C++ do's and dont's. Again, this isn't mandatory knowledge for gamedev, but its advice is definitely invaluable. Design Patterns: Teaches you commonly used design patterns. Especially useful if you're working as part of a team as it gives you a common set of names for design patterns. Advanced C++ Concurrency in Action: Don't be put off by the fact I've put this as an "advanced" topic, it's more that you will get more benefit out of knowing the other subjects first. Concurrency in C++11 is pretty easy and this book is a fantastic guide for learning how it's done. Graphics Programming OpenGL: A surprisingly well written specification in that it's pretty easy to understand! While it's probably not the best resource for learning OpenGL, it's definitely worth looking at. [edit: Mix it in with Open.gl and arcsynthesis's tutorials for practical examples and you're off to a good start!] OpenGL Superbible: The OpenGL superbible is one of the best ways to learn modern OpenGL. Sadly this isn't saying much, in fact the only other book appears to be the "Orange Book", however my sources indicate that is terrible. So you're just going to have to suck it up and learn from the OGL Superbible! [edit: in retrospect, just stick to free tutorials I've linked above. You'll learn more from them, and be less confused by what is 3rd party code supplied by the book. Substitute the "rendering" techniques you would learn from a 3d book with a good 3d math book and realtime rendering (links below)] Essential Mathematics for Game Programmers or 3D Math Primer for Graphics and Game Development: 3D programming involves a lot of math, these books cover topics that OpenGL/DirectX books tend to rush over. Realtime Rendering: A graphics library independent explanation of a number of modern graphical techniques, very useful with teaching you inventive ways to use your newly found 3d graphical talents!
u/neutronfish · 20 pointsr/cscareerquestions One book that helped me a lot while starting out and which I highly recommend to any new student of computer science is Code: The Hidden Language of Computer Hardware and Software by Charles Petzold, which starts out as a general interest book about the history of computing and then very quickly ratchets up into how modern computers, compilers, operating systems, and hardware drivers are built. You basically have to learn some discrete math and assembly language just to follow along, and by the end you have a really good idea of what happens under the hood when you run your programs and why. u/shivasprogeny · 20 pointsr/learnprogramming How deep do you want to go? Code: The Hidden Language of Computer Hardware and Software goes all the way from binary to computer code. If you don't really care about the hardware, you might start dabbling in assembly on a Raspberry PI. u/RagaTanha · 20 pointsr/singularity The Singularity Is Near by Ray Kurzweil has all the science behind it. Accelerando and Singularity Sky by Charles Stross for fiction. u/Philipp · 20 pointsr/Futurology Here's a fantastic book on the subject: Superintelligence. u/I_make_things · 20 pointsr/AskReddit Gödel, Escher, Bach. It's ultimately about the self-referential nature of consciousness, but it explores so many fascinating concepts that I couldn't even begin to do it justice. u/jschm · 19 pointsr/compsci AIMA. A real treasure trove! u/cybrbeast · 19 pointsr/Futurology This was originally posted as an image but got deleted for what was, IMO, the irrelevant reason that picture posts are not allowed, even though this was all about the text. We had an interesting discussion going: http://www.reddit.com/r/Futurology/comments/2mh0y1/elon_musks_deleted_edge_comment_from_yesterday_on/ I'll just post my relevant contributions to the original to maybe get things started.
--------------------------- And it's not like he's saying this based on his opinion after a thorough study online like you or I could do. No, he has access to the real state of the art: > Musk was an early investor in AI firm DeepMind, which was later acquired by Google, and in March made an investment in San Francisco-based Vicarious, another company working to improve machine intelligence. > Speaking to US news channel CNBC, Musk explained that his investments were, "not from the standpoint of actually trying to make any investment return… I like to just keep an eye on what's going on with artificial intelligence. I think there is potentially a dangerous outcome there." Also, I love it that Elon isn't afraid to speak his mind like this. I think it might well be PR or the boards of his companies that reined him in here. Also in television interviews he is so open and honest, too bad he didn't speak those words there. ---------------------------- I'm currently reading Superintelligence which is mentioned in the article and by Musk. One of the ways he describes an unstoppable scenario is that the AI seems to function perfectly and is super friendly and helpful. However on the side it's developing micro-factories which can assemble from a specifically coded string of DNA (this is already possible to a limited extent). These factories then use their coded instructions to multiply and spread and then start building enormous amounts of nanobots. Once critical mass and spread is reached they could instantly wipe out humanity through some kind of poison/infection. The AI isn't physical, but the only thing it needs in this case is to place an order to a DNA printing service (they exist) and then mail it to someone it has manipulated into adding water, nutrients, and releasing the DNA nanofactory. If the AI explodes in intelligence as predicted in some scenarios this could be set up within weeks/months of it becoming aware.
We would have nearly no chance of catching this in time. Bostrom gives the caveat that this was only a viable scenario he could dream up; the superintelligence should by definition be able to come up with much more ingenious methods. u/chronographer · 19 pointsr/Foodforthought For background, I understand that Elon's views are informed by this book (among others, no doubt): Nick Bostrom: Superintelligence. It's a dense read, but talks about AI and how it might emerge and behave. (I haven't finished the book, so can't say more than that). Edit: fixed up punctuation from mobile posting. See below for more detail. u/ZeljkoS · 18 pointsr/philosophy Author here. Let me start: First software company I founded develops software components for other programmers: https://www.gemboxsoftware.com/ Our customers include NASA, MS, Intel, and US Navy: https://www.gemboxsoftware.com/company/customers Second company I co-founded screens programmers before interviews: https://www.testdome.com/ We are used by Paypal and Ebay, among others. I finished computer science at the University of Zagreb. In high school, I won 1st place at the national computer science competition in 1997. Because of that I attended the Central European Olympiad in Informatics, where I got a bronze medal: https://svedic.org/zeljko/Competitions/ceoi_medalja.jpg I have also been part of the Croatian team at IOI in Cape Town: https://svedic.org/zeljko/Competitions/ioi_team.jpg Here is my LinkedIn profile: https://www.linkedin.com/in/zeljkos/ I don't work in AI, I got the idea while reading Peter Norvig's book: https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597 Hope I changed your mind about how certain you can be about something just based on the first feeling. My about page was one click away. Although I really know programming and sell my software to thousands of companies, I have to admit I don't see how that makes my article more or less credible. It is a philosophical text, not text about software.
I think you made the "Appeal to Authority" logical fallacy: https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/21/Appeal-to-Authority Every article should be judged by its arguments, not the credibility of the author. u/ytterberg_ · 18 pointsr/changemyview The problem is AI alignment: how do we make sure that the AI wants good stuff like "acting like a neutral arbiter" and not bad stuff like "world domination"? This turns out to be a very hard question, and a lot of very smart people believe that a superintelligence would destroy humanity unless we are very very careful. Bostrom's Superintelligence is a good introduction to this topic. > The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful - possibly beyond our control. As the fate of the gorillas now depends more on humans than on the species itself, so would the fate of humankind depend on the actions of the machine superintelligence. If you don't have the time for the book, this FAQ is good: > 4: Even if hostile superintelligences are dangerous, why would we expect a superintelligence to ever be hostile? > The argument goes: computers only do what we command them; no more, no less. So it might be bad if terrorists or enemy countries develop superintelligence first. But if we develop superintelligence first there’s no problem. Just command it to do the things we want, right? > Suppose we wanted a superintelligence to cure cancer. How might we specify the goal “cure cancer”? We couldn’t guide it through every individual step; if we knew every individual step, then we could cure cancer ourselves. Instead, we would have to give it a final goal of curing cancer, and trust the superintelligence to come up with intermediate actions that furthered that goal.
For example, a superintelligence might decide that the first step to curing cancer was learning more about protein folding, and set up some experiments to investigate protein folding patterns. > A superintelligence would also need some level of common sense to decide which of various strategies to pursue. Suppose that investigating protein folding was very likely to cure 50% of cancers, but investigating genetic engineering was moderately likely to cure 90% of cancers. Which should the AI pursue? Presumably it would need some way to balance considerations like curing as much cancer as possible, as quickly as possible, with as high a probability of success as possible. > But a goal specified in this way would be very dangerous. Humans instinctively balance thousands of different considerations in everything they do; so far this hypothetical AI is only balancing three (least cancer, quickest results, highest probability). To a human, it would seem maniacally, even psychopathically, obsessed with cancer curing. If this were truly its goal structure, it would go wrong in almost comical ways. > If your only goal is “curing cancer”, and you lack humans’ instinct for the thousands of other important considerations, a relatively easy solution might be to hack into a nuclear base, launch all of its missiles, and kill everyone in the world. This satisfies all the AI’s goals. It reduces cancer down to zero (which is better than medicines which work only some of the time). It’s very fast (which is better than medicines which might take a long time to invent and distribute). And it has a high probability of success (medicines might or might not work; nukes definitely do). > So simple goal architectures are likely to go very wrong unless tempered by common sense and a broader understanding of what we do and do not value.
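The FAQ's point about balancing considerations can be illustrated with a toy scoring sketch; every plan name and number below is invented purely to show how a single-minded objective can rank a catastrophic plan highest:

```python
# Toy illustration of the FAQ's three-consideration goal.
# All plans and numbers are invented for illustration only.

plans = {
    # name: (fraction_cured, speed, probability_of_success, human_cost)
    "protein folding":  (0.50, 0.4, 0.9, 0.0),
    "gene engineering": (0.90, 0.3, 0.6, 0.0),
    "kill everyone":    (1.00, 1.0, 1.0, 1.0),  # "zero cancer", instantly
}

def naive_score(p):
    cured, speed, prob, _cost = p
    return cured * speed * prob            # only the three stated goals

def tempered_score(p):
    cured, speed, prob, cost = p
    return cured * speed * prob - 100 * cost   # heavily penalize harm

print(max(plans, key=lambda k: naive_score(plans[k])))     # kill everyone
print(max(plans, key=lambda k: tempered_score(plans[k])))  # protein folding
```

The real problem, of course, is that humans balance thousands of such cost terms implicitly, and writing them all down explicitly is exactly what makes alignment hard.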
u/christianitie · 18 pointsr/math Without knowing much about you, I can't tell how much you know about actual math, so apologies if it sounds like I'm talking down to you: When you get further into mathematics, you'll find it's less and less about doing calculations and more about proving things, and you'll find that the two are actually quite different. One may enjoy both, neither, or one but not the other. I'd say if you want to find out what higher level math is like, try finding a very basic book that involves a lot of writing proofs. This one is aimed at high schoolers and I've heard good things about it, but never used it myself. This one I have read (well, an earlier edition anyway) and think is a phenomenal way to get acquainted with higher math. You may protest that this is a computer science book, but I assure you, it has much more to do with higher math than any calculus text. Pure computer science essentially is mathematics. Of course, you are free to dive into whatever subject interests you most. I picked these two because they're intended as introductions to higher math. Keep in mind though, most of us struggle at first with proofwriting, even with so-called "gentle" introductions. One last thing: don't think of your ability in terms of your age. It's great to learn young, but there's nothing wrong with people learning later on. Thinking of it as a race could lead to arrogance or, on the other side of the spectrum, unwarranted disappointment in yourself when life gets in the way. We want to enjoy the journey, not worry about whether we're going fast enough. Best of luck! u/bmathew5 · 18 pointsr/learnprogramming Design Patterns by the Gang of Four. It is the essence of designing software architecture. It describes very common designs that have been tested time and time again; however, it is broad, and you have to fill in your own requirements. It is an amazing starting point.
(ps it actually is similar to design problems in civil architecture identified by Christopher Alexander. A Pattern Language (1977) & A Timeless Way of Building (1979).) It really opens doors to how you should approach and choose the correct design. It's not language specific. The book however does have C++ examples. u/sid78669 · 18 pointsr/compsci I would recommend reading Design Patterns: Elements of Reusable Object-Oriented Software. That book will give you the majority of design knowledge you would gain at this point in your career from college. u/JustBesideTheWindow · 18 pointsr/HowToHack u/totemcatcher · 18 pointsr/linux • CODE: The Hidden Language of Computer Hardware and Software by Charles Petzold A ground up approach to understanding digital processing and transmission in a broad sense. I only recommend this book if you are looking for an intrinsic understanding of computing rather than merely a handle on using a particular programming language or operating system. By the end of the book you should have a handle on actually building your own computer, however it's actually an excellent "first book" for anyone interested in computing. u/myrrlyn · 18 pointsr/learnprogramming https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 This book is an excellent primer for a bottom-up look into how computers as machines function. https://www.amazon.com/gp/aw/d/0123944244/ref=ya_aw_od_pi This is my textbook from the class where we built a CPU. I greatly enjoy it, and it also starts at the bottom and works up excellently. For OS development, I am following Philipp Opperman's excellent blog series on writing a simple OS in Rust, at http://os.phil-opp.com/ And as always Wikipedia walks and Reddit meanders fill in the gaps lol. u/Spasnof · 17 pointsr/learnprogramming Awesome book Code, really helps you understand from a bottom up perspective. Super approachable without a CS background and does not need a computer in front of you to appreciate.
Highly recommended. u/Afro-Ninja · 17 pointsr/explainlikeimfive It doesn't "know." Any logical operation (especially basic math calculations) can be broken down into binary digits, and a single binary digit (bit) can be represented as the presence or absence of electricity. It's almost like if you were to build a sequence of pipes and valves, and pour water into the opening, the water would end up flowing through the same way each time. The pipes don't "know" where the water goes; it just happens. A computer does the same thing but on a tiny scale with tiny electric pulses travelling through sequences of thousands of gates all connected to each other. Imagine that the buttons you hit on a calculator slightly change how the valves open and close. (or which opening to dump the water into) You hit enter, the water is poured, and the result shows on screen. Fair warning: I am not a hardware guy so this explanation is probably not 100% accurate. If you have more interest in the subject I HIGHLY recommend reading this book: http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 u/mohabaks · 17 pointsr/unixporn Thanks ;). Not so skilled on that and my advice might be misleading; though I got a background in CS: This would be my suggestion for someone beginning. u/flaz · 17 pointsr/philosophy You might be interested in a book called On Intelligence, by Jeff Hawkins. He describes something similar to your simulations idea, but he calls it a predictive hierarchical memory system (or something like that). It is a fascinating idea, actually, and makes a lot of sense. I too suspect that speech is a central unifying aspect to what we call consciousness. A lot of AI guys seem to agree. There is a theory by Noam Chomsky (I think), called Universal Grammar.
As I recall, he suspects that may be key to modern intelligence, and he suspects the genetic mutation for it happened about 70,000 years ago, which gave us the ability to communicate and allowed Homo sapiens to successfully move out of Africa. I've also seen that mutation 70k years ago referred to as the cognitive revolution. But it seems everyone agrees that's when the move out of Africa began, and communication started; it's not just a Chomsky thing.

u/Cohesionless · 17 pointsr/cscareerquestions
The resource seems extensive enough that it should suffice to make you a good software engineer. I hope you don't get exhausted from it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay great attention to the important theoretical topics. If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend reading the following:

• Introduction to Algorithms, 3rd Edition: https://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844. A lot of people do not like this classic book because it is very theoretical, very mathematical, and very abstract, but I think that is its greatest strength. I find a lot of algorithms books either focus too much on how to implement an algorithm in a certain language or underplay the theoretical foundation of the algorithm, such that their readers can only recite the algorithms to their interviewers. This book forced me to think algorithmically, to be able to design my own algorithms from all the techniques and concepts learned to solve very diverse problems.

• Design Patterns: Elements of Reusable Object-Oriented Software, 1st Edition: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/. This is the original book on object-oriented design patterns.
There are other more accessible books to read for this topic, but this is a classic. I don't mind if you replace this book with another.

• Clean Code: A Handbook of Agile Software Craftsmanship, 1st Edition: https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882. This is the classic book that teaches software engineers how to write clean code. A lot of best practices in software engineering are derived from this book.

• Java Concurrency in Practice, 1st Edition: https://www.amazon.com/Java-Concurrency-Practice-Brian-Goetz/dp/0321349601. As a software engineer, you need to understand concurrent programming. These days there are various great concurrency abstractions, but I believe everyone should know how to use low-level threads and locks.

• The Architecture of Open Source Applications: http://aosabook.org/en/index.html. This website features 4 volumes of books available to purchase or to read online for free. Its content focuses on over 75 case studies of widely used open-source projects, often written by the creators of said projects, about the design decisions and the like that went into creating their popular projects. It is inspired by this statement: "Architects look at thousands of buildings during their training, and study critiques of those buildings written by masters."

• Patterns of Enterprise Application Architecture, 1st Edition: https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/. This is a good read to start learning how to architect large applications.

The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools.
Tools come and go, so I hope software engineers have strong problem-solving skills, trained in computer science theory, to be the people who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.
P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun implementing Promises and Futures for a programming language and writing my own implementation of Cassandra, a distributed database.

u/Ken_Obiwan · 17 pointsr/MachineLearning
> The swipe at Andrew Ng is off the mark and tasteless

Meh, it's on about the same level he brought the conversation to. (Oxford professor writes a carefully-argued 350-page book; Ng apparently doesn't see the need to read it and dismisses news coverage of the book with a vague analogy.)

> Yudkowsky and the LessWrong cult have contributed nothing tangible to the fields of AI and machine learning

Well, at least it's consistent with their position that making public contributions to the field of AI may not actually be a good idea :) It's not like Yudkowsky is somehow unaware that not having an active AI project makes him uncool; here's him writing about the point at which he realized his approach to AI was wrong and he needed to focus on safety:

> And I knew I had to finally update. To actually change what I planned to do, to change what I was doing now, to do something different instead.
> I knew I had to stop.
> Halt, melt, and catch fire.
> Say, "I'm not ready." Say, "I don't know how to do this yet."
> These are terribly difficult words to say, in the field of AGI. Both the lay audience and your fellow AGI researchers are interested in code, projects with programmers in play.
> Failing that, they may give you some credit for saying, "I'm ready to write code, just give me the funding."
> Say, "I'm not ready to write code," and your status drops like a depleted uranium balloon.

And if you wanna go the ad hominem route (referring to Less Wrong as a "cult" despite the fact that virtually no one who's interacted with the community in real life seems to think it's a cult), I'll leave you with this ad hominem attack on mainstream AI researchers from Upton Sinclair: "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

u/The_Dirty_Carl · 17 pointsr/gamedev

u/Cryocore · 17 pointsr/gamedev
You could use space partitioning. Split the world into grids of say 10 x 10. Let each agent update which grid cell it is on each frame. Then it's just a matter of testing only the agents in the grid cells surrounding the current agent (which will most of the time be a small number). This book explains it really well: Programming Game AI by Example. It also has code samples online.

u/MattDPS · 17 pointsr/gamedev
The one I always suggest is Programming Game AI By Example. The amount of valuable info you can pull out of this one book is incredible. It's still something I reference years later.

u/chrndr · 17 pointsr/HPMOR
I wrote a quick script to search the full text of HPMOR and return everything italicized and in title case, which I think got most of the books mentioned in the text:

Book title|Author|Mentioned in chapter(s)|Links|Notes
:---|:---|:---|:---|:---
Encyclopaedia Britannica| |7|Wikipedia|Encyclopaedia
Financial Times| |7|Wikipedia|Newspaper
The Feynman Lectures on Physics|Richard P. Feynman|8|Wikipedia|Full text is available online here
Judgment Under Uncertainty: Heuristics and Biases|Amos Tversky|8|Amazon|
Language in Thought and Action|S.I. Hayakawa|8|Amazon Wikipedia|
Influence: Science and Practice|Robert B. Cialdini|8|Wikipedia|Textbook. See also Influence: The Psychology of Persuasion
Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making|Reid Hastie and Robyn Dawes|8|Amazon|Textbook
Godel, Escher, Bach|Douglas Hofstadter|8, 22|Amazon Wikipedia|
A Step Farther Out|Jerry Pournelle|8|Amazon|
The Lord of the Rings|J.R.R. Tolkien|17|Wikipedia|
Atlas Shrugged|Ayn Rand|20, 98|Wikipedia|
Chimpanzee Politics|Frans de Waal|24|Amazon|
Thinking Physics: Understandable Practical Reality|Lewis Carroll Epstein|35, 102|Amazon|
Second Foundation|Isaac Asimov|86|Wikipedia|Third novel in the Foundation Series
Childcraft: A Guide For Parents| |91|Amazon|Not useful if your child has a mysterious dark side

Also, this probably isn't technically what the OP was asking, but since the script returned fictional titles along with real ones, I went ahead and included them too:

Book title|Mentioned in chapter(s)
:---|:---
The Quibbler|6, 27, 38, 63, 72, 86
Hogwarts: A History|8, 73, 79
Modern Magical History|8
Magical Theory|16
Intermediate Potion Making|17
Occlumency: The Hidden Arte|21
Daily Prophet|22, 25, 26, 27, 35, 38, 53, 69, 77, 84, 86, 108
Magical Mnemonics|29
The Skeptical Wizard|29
Vegetable Cunning|48
Beauxbatons: A History|63
Moste Potente Potions|78
Toronto Magical Tribune|86
New Zealand Spellcrafter's Diurnal Notice|86
American Mage|86

As others mentioned, TVTropes has a virtually-exhaustive list of allusions to other works, which includes books that aren't explicitly named in the text, like Ender's Game.

u/[deleted] · 16 pointsr/books
GEB. Seriously, this is a brainfuck of a book.

u/sandsmark · 16 pointsr/artificial
http://www.amazon.com/Artificial-Intelligence-Modern-Approach-Edition/dp/0136042597 is what I (and probably most others) would recommend as an introductory book.

u/hwillis · 16 pointsr/Physics
This is some kind of weird gatekeeping where AI keeps being redefined until it just means adult human intelligence.
I have a textbook that literally has artificial intelligence in the title.

u/Rinnve · 16 pointsr/learnpython
You should read this book. The best explanation of how computers work I know of.

u/My_6th_Throwaway · 16 pointsr/INTP
American Amazon link

u/UnlikelyToBeEaten · 15 pointsr/math
Disclaimer: I only have a master's in maths, and I've just started working as a programmer. Here are topics I enjoyed and would recommend:

• At least basic programming. I've heard that if you are interested in mathematics, you may be especially interested in the Haskell language.

• Mathematical logic and metamathematics, some (very basic) model theory. Personally, I also found set theory and the proof of the independence of the Axiom of Choice from the Zermelo-Fraenkel axioms incredibly interesting. The stuff from Gödel, Escher, Bach.

• Probability theory and Bayesian statistics, along with some basic information theory and theory of complexity / entropy.

• Basic category theory (though it's a hard subject to learn if you don't have some advanced algebra to motivate why things are done the way they are. It also took me about three or four tries before I finally started understanding it, and after that I found it quite beautiful. Your mileage may vary).

• EDIT: Also, Knuth's Concrete Mathematics. The book is a very good source on the topic.

u/latetodata · 15 pointsr/learnmachinelearning
I personally really benefitted from Jose Portilla's Udemy class on Python for data science: https://www.udemy.com/python-for-data-science-and-machine-learning-bootcamp. It deals with the machine learning algorithms at a pretty basic level, but he does a good job overviewing things and this course personally gave me more confidence.
He also wrote a helpful overview of how to become a data scientist: https://medium.com/@josemarcialportilla/how-to-become-a-data-scientist-2d829fa33aba
Additionally, I found this podcast episode from Chris Albon helpful: http://partiallyderivative.com/podcast/2017/03/28/learning-machine-learning
Finally, I have just started going through Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems and I love it. It's very easy to read and applicable: https://www.amazon.com/dp/1491962291/_encoding=UTF8?coliid=I1VIM81L3W5JUY&colid=2MMQRCAEOFBAX
Hope this helps.

u/JoshuaSmyth · 15 pointsr/gamedev
This book Programming Game AI By Example comes highly recommended by me. It contains all of the above, along with an example 2D top-down style deathmatch game with bots, to give you a clear understanding of the most common topics in game AI. It's also one of the more practically focused books, rather than theory focused.

u/chub79 · 15 pointsr/algorithms
The Algorithm Design Manual by Skiena helped me a lot. I was also curious about this one. Also, this site may help :)

u/osirisx11 · 14 pointsr/math
If you like stuff like this you may be interested in my favorite book: Gödel, Escher, Bach: An Eternal Golden Braid: http://amzn.com/dp/0465026567
Edit: Also see the great MIT course with video lectures: http://ocw.mit.edu/OcwWeb/hs/geb/geb/

u/goodbyegalaxy · 14 pointsr/hardware
Code: The Hidden Language of Computer Hardware and Software
As the title implies, it's not just about hardware, it goes into how software is written for hardware as well. But it's a really cool book: it takes you from the very basics of circuitry (a battery, a light bulb, and wire) in the first chapter, and building only on things taught in the book gets you to a fully working computer.

u/jhanschoo · 14 pointsr/compsci
Google hasn't been helpful because no such algorithm exists. Check out Rice's Theorem for the impossibility.
edit: Let S be the set of languages that you can reduce SAT to in polynomial time. SAT is clearly in S, and we know some machine recognizes it. The empty language is not in S (even if P=NP, so that SAT is P-complete), and we know some machine recognizes it. By Rice's Theorem, no machine decides, when given a machine as input, whether that machine recognizes a language in S. (We assume that the "any custom problem" input is given as a machine encoding.)
edit2: I see that you ask a lot of questions about computational complexity, but do not have a good foundation. Many things you propose or ask already have known impossibility results. May I suggest you have a look at Sipser (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X)? That will give you a better understanding of computability and complexity, so you can make sense of the feedback you're getting.

u/jacobolus · 14 pointsr/math
I take it you want something small enough to fit inside a hollowed-out bible or romance novel, so you can hide your secrets from nosy neighbors?

u/LongUsername · 13 pointsr/compsci
I'd second unplugging completely: no computer, TV, electronics. If you insist on doing something CS related, Gödel, Escher, Bach comes highly recommended.

u/drzowie · 13 pointsr/AskPhysics
Reductionism is important, but pure reductionism denies the existence of emergent phenomena (phenomena that depend on the collective behavior of many simpler things). A very enjoyable book that covers this and many other topics at a popularly accessible level is Gödel, Escher, Bach: an Eternal Golden Braid. First published in the late 1970s, GEB is still delightfully fresh and exciting, although a few minor elements are dated (e.g. computers now can beat humans at chess).

u/hell_onn_wheel · 13 pointsr/Python
Good on you for looking to grow yourself as a professional! The best folks I've worked with are still working on professional development, even 10-20 years into their profession.
Programming languages can be thought of as tools. Python, say, is a screwdriver. You can learn everything there is about screwdrivers, but this only gets you so far.
To build something you need a good blueprint. For this you can study object-oriented design (OOD) and programming (OOP). Once you have the basics, take a look at design patterns like those of the Gang of Four. This book is a good resource to learn about much of the above.
What parts do you specify for your blueprint? How do they go together? Study up on abstract data types (ADTs) and algorithms that manipulate those data types. This is the definitive book on algorithms; it takes some work to get through, but it is worth it. (Side note: this is the book Google expects you to master before interviewing.)
How do you run your code? You may want to study general operating system concepts if you want to know how your code interacts with the system on which it is running. Want to go even deeper with code performance? Take a look at computer architecture. Another topic that should be covered is computer networking, as many applications these days don't work without a network.
What are some good practices to follow while writing your code? Two books that are widely recommended are Code Complete and Pragmatic Programmer. Though they cover a very wide range of topics (everything from organizational hacks to unit testing to user design), it wouldn't hurt to check out Code Complete at the least, as it gives great tips on organizing functions and classes, modules and programs.
All these techniques and technologies are just bits and pieces you put together with your programming language. You'll likely need to learn about other tools, other languages, debuggers and linters and optimizers; the list is endless. What helps light the path ahead is finding a mentor, someone who is well steeped in the craft and is willing to show you how they work. This is best done in person, watching someone design and code.
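Since the Gang of Four book keeps coming up in this thread, here is a minimal sketch of one of its patterns (Strategy) in Python. The class names are invented for illustration; the point is that the client depends on an interface, so algorithms can be swapped without touching the client:

```python
# Strategy pattern: interchangeable algorithms behind one interface.
from abc import ABC, abstractmethod


class SortStrategy(ABC):
    @abstractmethod
    def sort(self, items):
        ...


class AscendingSort(SortStrategy):
    def sort(self, items):
        return sorted(items)


class DescendingSort(SortStrategy):
    def sort(self, items):
        return sorted(items, reverse=True)


class Report:
    """Client code depends on the interface, not on a concrete algorithm."""

    def __init__(self, strategy):
        self.strategy = strategy

    def render(self, data):
        return self.strategy.sort(data)


# The strategy can be swapped without changing Report at all:
print(Report(AscendingSort()).render([3, 1, 2]))   # [1, 2, 3]
print(Report(DescendingSort()).render([3, 1, 2]))  # [3, 2, 1]
```

The same shape (an abstract interface plus pluggable implementations) underlies many of the other patterns the books above catalogue.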
Also spend some time reading the code of others (GitHub is a great place for this) and interacting with them on public mailing lists and IRC channels. I hang out on Hacker News to hear about the latest tools and technologies (many posts to /r/programming come from Hacker News). See if there are any local programming clubs or talks that you can join; they'd be a great forum to find yourself a mentor.
Lots of stuff here, happy to answer questions, but hope it's enough to get you started. Oh yeah, the books: they're expensive, but hopefully you can get your boss to buy them for you. It's in his/her best interest, as well as yours!

u/gorgoroth666 · 13 pointsr/gamedev

u/expedient · 13 pointsr/programming
Not a video, but Code: The Hidden Language of Computer Hardware and Software by Charles Petzold is really great.

u/steamywords · 13 pointsr/Futurology
This does nothing to address the difficulty of the control issue. He's basically just saying we'll figure it out before we get AI, don't worry about it. Superintelligence actually spells out why control is so hard. None of those points are touched even generally. He's a Director of Engineering at Google, which actually created an AI ethics board because an AI company it bought was afraid that the tech could lead to the end of the human species, yet none of that is even briefly mentioned.
There is very good reason to be cautious around developing an intellect that can match ours, never mind rapidly exceed it. I don't see the necessity for repeated calls to let our guard down.

u/Jetbooster · 12 pointsr/Futurology
Why would it care if the goal we gave it didn't actually align with what we wanted? It has no reason to care unless these things were explicitly coded in, and as I said, morality is super hard to code into a machine. To address your second point, I understand my example wasn't perfect, but say it understands that the more physical material a company controls, the more assets it has.
So it lays claim to the entire universe and sets out to control it. Eventually, it is the company, and growing the company's assets just requires it to have more processing power. Again, it is an illustrative point, loosely derived from my reading of Superintelligence by Nick Bostrom. I would highly recommend it.

u/unovasa · 12 pointsr/programming
I really enjoyed Concrete Mathematics by Graham, Knuth, & Patashnik.

u/frenchst · 12 pointsr/cscareerquestions
Three CS fundamental books in the order I'd suggest someone read them if they don't have a background in CS.

u/sbsmith · 12 pointsr/gamedev
Hi PizzaPartify,
I believe that different companies/teams will place emphasis on different skills. When I was helping to hire software engineers for EA's motion capture studio, I liked to see candidates who showed a strong aptitude for engineering code to be maintainable. For me, this meant a familiarity with design patterns and software development processes (like Test Driven Development or Extreme Programming). In my department, much of our code was in C++ and Python. However, other departments would use languages like Java, C# or ActionScript, depending on the project. It would be helpful to know what role you are applying to.
To answer your specific questions:
1. If you're already familiar with C++, I would highly recommend reading Effective C++ by Scott Meyers (http://www.amazon.ca/Effective-Specific-Improve-Programs-Designs/dp/0321334876). Every C++ developer should read this. Regardless of the language you're working in, I would also recommend Design Patterns by the gang of four (http://www.amazon.ca/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612). A game-specific recommendation is Game Engine Architecture by Jason Gregory (http://www.amazon.ca/Game-Engine-Architecture-Jason-Gregory/dp/1568814135). It doesn't matter if you intend to write an engine or not; it is immensely helpful to understand how they work.
I own all of the Game Programming Gems books but use them more as a reference library. The books above will be more helpful right now.
2. I worked with Unity only briefly to prototype a game, so I can't really comment here.
3. This is tricky. I think you will need to find a passion project in C++ so that you will just naturally learn more about the language. And speaking of passion: you need to really want the job you are applying for. I have seen qualified developers miss out on jobs because you could tell they were just looking for anything (rather than really being enthusiastic about the position).
I hope that helps.

u/42e1 · 12 pointsr/compsci
If you're interested in learning more about Turing's paper that introduced the Turing Machine, I highly recommend the book The Annotated Turing. It's by the same person who wrote Code, which is an oft-recommended book on this sub-reddit.

u/MirrorLake · 12 pointsr/learnprogramming
"Code" by Charles Petzold, if anyone wants the link.

u/babyfacebrain666 · 12 pointsr/learnpython
On the flip side, I kind of envy you for your confidence in the underlying math... that shit is melting my brain currently.
Check out https://automatetheboringstuff.com/, a great starter book for basic Python programming with more of an emphasis on just making a basic program vs the underlying data structures or algorithms. Anyone who says they don't still use these programs or an improved version of one is lying lol.
For machine learning stuffs: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291 (the current cause of my brain melting). If you don't like the idea of a textbook: http://interactivepython.org/runestone/static/pythonds/index.html or http://www.fast.ai/ (this is EXTENSIVE; I've been working on it on-off for like a year).

u/1_________________11 · 12 pointsr/Futurology
Just gonna drop this gem here.
http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742
Doesn't have to be Skynet-level smart to fuck shit up. Also, once it's self-modifying it's a whole other ballgame.

u/punctured-torus · 11 pointsr/compsci

u/finitedimensions · 11 pointsr/datascience
I glanced at "Hands-On Machine Learning with Scikit-Learn and TensorFlow" by Aurelien Geron and thought it is quite good. But I have not had a chance to read it deeply yet.
https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

u/FeepingCreature · 11 pointsr/slatestarcodex
Somebody proposed a T-shirt design saying "I broke my back lifting Moloch to Heaven, and all I got was this lousy Disneyland with no Children." Combines Meditations and Bostrom.

u/SouthernArrowwood · 11 pointsr/learnprogramming
From what I understand, they're a way to structure your code to solve specific problems. An example would be a combination of the Factory pattern and the Component pattern as a way to use data-driven design to create "things" in your world (I have enemy Bob; Bob.txt/Bob.xml/Bob.whatever has all the information to create Bob. The "factory" reads in this info and then handles creating the entity and components.)
If you'd like to learn more there's the gang of 4 book Design Patterns: Elements of Reusable Object-Oriented Software, and for a focus on design patterns in games I liked gameprogrammingpatterns.com

u/UpAndDownArrows · 11 pointsr/cscareerquestions

u/ajh2148 · 11 pointsr/computerscience
I'd personally recommend Andrew Ng's deeplearning.ai course if you're just starting. This will give you practical and guided experience with TensorFlow using Jupyter notebooks. If it's books you really want, I found the following of great use in my studies, but they are quite theoretical and framework-agnostic publications.
They will help explain the theory though:
Deep Learning (Adaptive Computation and Machine Learning series): https://www.amazon.co.uk/dp/0262035618/ref=cm_sw_r_cp_api_i_Hu41Db30AP4D7
Reinforcement Learning: An Introduction (Adaptive Computation and Machine Learning series): https://www.amazon.co.uk/dp/0262039249/ref=cm_sw_r_cp_api_i_-y41DbTJEBAHX
Pattern Recognition and Machine Learning (Information Science and Statistics): https://www.amazon.co.uk/dp/0387310738/ref=cm_sw_r_cp_api_i_dv41DbTXKKSV0
Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series): https://www.amazon.co.uk/dp/B00AF1AYTQ/ref=cm_sw_r_cp_api_i_vx41DbHVQEAW1

u/lowlandslinda · 11 pointsr/Futurology
Musk keeps up with what philosopher Nick Bostrom writes. That's also how he knows about the simulation theory, which Bostrom popularised. And lo and behold, Bostrom also has a paper and a book on AI.

u/NondeterministSystem · 11 pointsr/worldnews
A scenario where such an AI becomes arbitrarily intelligent and capable of interacting with the outside world isn't beyond the realm of consideration. If it's smart enough to outplan us, a superintelligent Go engine of the future whose primary function is "become better at Go" might cover the world in computer processors. Needless to say, that would be a hostile environment for us... though I imagine such a machine would be frightfully good at Go.
If you're interested in (much) more along these lines, I'd recommend Superintelligence: Paths, Dangers, Strategies by Nick Bostrom. I got it as an audiobook, and it's thought-provoking.

u/RepliesWhenAngry · 11 pointsr/worldnews
Very good point - I'm currently reading (or trying to read...) this book: http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111
I think you'd like it also.

u/SkepticalMartian · 11 pointsr/PHP
Beginner and Novice are the same thing.
It sounds like you're trying to transition from Beginner to Intermediate. You really should stop trying to write your own framework for the moment and start using a mature framework. Good frameworks aren't trivial to write, and generally require an expert level of knowledge to write well.
The thing with a framework is that it helps remove you from a lot of boilerplate code - that is, common code everyone would normally need for any given web project. The easiest way for you to bridge the gap is to begin using and understanding code that is better than yours. Don't reinvent the wheel until you understand how to make a better wheel.
Design patterns are everywhere in good code. The trick is to recognize when a design pattern is being used, and to understand why it's being used. In order to help with this, it's commonly recommended to read Design Patterns: Elements of Reusable Object-Oriented Software. This is a book every programmer should own, regardless of the language they use.

u/faintdeception · 11 pointsr/learnprogramming
The amount of planning you have to do scales with the complexity of the project.
Professors drill the importance of planning, documentation, and unit testing into students because it is extremely important, and once you start your career, if you're a poor planner it's going to come back to haunt you.
However, when you're working on a simple project that's not intended for public release you don't have to go overboard with docs unless you just want to practice.
My own process usually starts with me jotting down an idea; I find that writing it out helps me to get a better grasp on the overall feasibility. Once I'm satisfied that I actually have something I can implement, I'll diagram the flow of the application and maybe do some wire-frames. I usually find that this is enough of a launching pad for a simple personal project.
Professional projects are a different ballgame, because as I said, the amount of planning you have to do scales with the complexity and size of the project. It's in the professional environment that all of the things your professors are teaching you will become really important.
So, to answer what I think was your question,

> So how does one end up with 20 classes connected with each other perfectly and a build file that sets everything up working flawlessly with unit test methods that check every aspect of the application?

This comes about more in the implementation phase than the planning phase. I've heard it said that in war "no plan survives contact with the enemy," and you'll find this to be true in software development as well. Even when you plan really well you'll sometimes have to go back to the drawing board and come up with a new plan, but that's just part of the process.
Some books that I recommend on the topic are Hackers and Painters by Paul Graham, and I think every software dev should have a copy of Design Patterns.
The former is a collection of essays that might give you some useful perspective on the process of writing software. The latter is more of a reference book, but it's helpful to become familiar with the patterns covered in the book so that you don't find yourself reinventing the wheel every time you begin a new project.
As for the other part of your question (apologies for addressing them out of order),

> My new "bottleneck" writing code is the structure. I end up having huge classes with way too many public methods. I might as well just write a script with everything in one file. Almost anyway.. I try to write OO, but I often get lazy and just end up with not very elegant systems I would say.

Don't be lazy, because as you're already seeing, it comes back to bite you in the ass.
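One common way out of the "huge class with way too many public methods" trap described above is to pull each responsibility into its own small class and compose them. A minimal sketch in Python; all the names (Parser, Validator, Repository, Importer) are invented for illustration, not taken from any book mentioned here:

```python
# Before: one class that parses, validates, AND saves (three reasons to change).
# After: each job in its own small class, composed by a thin coordinator.

class Parser:
    def parse(self, raw):
        # "a, b, c" -> ["a", "b", "c"]
        return [field.strip() for field in raw.split(",")]


class Validator:
    def is_valid(self, fields):
        return all(fields)  # reject records with empty fields


class Repository:
    def __init__(self):
        self.saved = []

    def save(self, fields):
        self.saved.append(fields)


class Importer:
    """Coordinates the pieces; exposes one public method instead of many."""

    def __init__(self, parser, validator, repo):
        self.parser, self.validator, self.repo = parser, validator, repo

    def run(self, raw):
        fields = self.parser.parse(raw)
        if not self.validator.is_valid(fields):
            return False
        self.repo.save(fields)
        return True
```

Each piece can now be tested and replaced in isolation, which is exactly what the big catch-all class made impossible.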
As you're writing your code you have to be mindful of the complexity of the project as it grows around you, and you have to periodically take a step back, look at what you've created, and re-organize it. This kind of goes back to what I was saying earlier about no plan surviving enemy contact.
So when you find yourself creating a new class that you hadn't thought about, be mindful of where you put it. Should you create a new file (yes, of course you should), a new folder? Do you have a bunch of similar classes doing the same thing? Should they inherit from one another? Be especially mindful of copying and pasting from one area of your code to another; generally speaking, if you're doing this you should probably be writing a function, or using inheritance.
It's up to you as the developer to make sure your project is organized, and nowadays it's really easy to learn how to best organize code by looking through other people's projects on GitHub, so there's really no excuse for it.
Hope that helps, good luck.

u/mooshoes · 11 pointsr/IWantToLearn
I'd recommend you start with the book "Code", which handles just this progression: http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
From there, investigate operating system development. The Minix OS is very well documented.

u/entropicone · 11 pointsr/compsci
Riding on your top post coattails...
The Elements of Computing Systems and Code by Charles Petzold are exactly what you want. Code goes through number systems, basic information theory, circuits (from gates on up), memory, machine code and programming languages, all with accessible diagrams and explanations. TECS has you build an actual working computer from the ground up.

u/JJinVenice · 11 pointsr/askscience
Your brain uses memory as a way to anticipate the effort required in these situations. There is a book called On Intelligence by Jeff Hawkins that discusses this. You've opened thousands of doors, lifted thousands of objects.
Your brain remembers how it felt to engage in that activity. So when you approach a door, your brain sees what type of door it is and anticipates how much effort will be required to open it. Sometimes your brain gets it wrong.

edit: a word

u/enteleform · 11 pointsr/compsci

Check out: Grokking Algorithms: An Illustrated Guide for Programmers and Other Curious People

I'm also pretty rusty at math right now, and have been getting by with a try-different-things-until-it-works approach. Even for the types of problems I've become efficient at solving, in many cases I don't know the actual terminology, so it makes it difficult to expand upon concepts or communicate them with others. I'd like to get to a point where I can mentally reason about processes & formulas without having to execute them in order to see the results, and I feel like the first step to get there is to get reacquainted with terminology & foundational concepts. Here are some resources I've queued up to work through for that purpose.

u/asdff01 · 11 pointsr/AskComputerScience

The book that allowed me to do this is the legendary "Gang of Four" Design Patterns book. Code examples are in C++ and it was written a while ago, but it is still recommended as a fantastic resource for learning how to design software well. There are also the SOLID principles for object-oriented design.

u/Pandasmical · 11 pointsr/computerscience

I enjoyed this one!
Code: The Hidden Language of Computer Hardware and Software

Here is someone else's detailed review of it:

"Charles Petzold does an outstanding job of explaining the basic workings of a computer. His story begins with a description of various ways of coding information including Braille, Morse code, and binary code. He then describes the development of hardware beginning with a description of the development of telegraph and relays. This leads into the development of transistors and logic gates and switches.
Boolean logic is described and numerous electrical circuits are diagrammed showing the electrical implementation of Boolean logic. The book describes circuits to add and subtract binary numbers. The development of hexadecimal code is described. Memory circuits are assembled by stringing logic gates together. Two basic microprocessors are described - the Intel 8080 and the Motorola 6800. Machine language, assembly language, and some higher level software languages are covered. There is a chapter on operating systems. This book provides a very nice historical perspective on the development of computers. It is entertaining and only rarely bogs down in technical detail."

u/fj333 · 11 pointsr/compsci

u/gunder_bc · 11 pointsr/learnprogramming

Learn some math, yes. Algebra, Discrete Math, Inductive Logic, Set Theory. Calc and Matrix Algebra are good for specific things, and just in general to beef up your math skills. But don't get hung up on it too much. It's a good thing to always have going in the background.

Start to think about how all computation is math - check out The Annotated Turing and really wrap your head around both what Petzold is talking about and what Turing is talking about. That may require you to take a step back and study Formal Languages, Finite State Machines, and other related concepts (all the stuff that lets you build up to Regular Expressions, etc). Turing's thesis really gets to the heart of computation, and those concepts build on that.

Then go learn LISP just to bend your brain some more.

Comp Sci is a fascinating subject, and you're off to a good start by thinking about what that Stack Overflow commenter meant - how are all languages similar? How do they differ? What's the underlying problem you're solving, and what are different ways of solving it? How do the tools you choose alter your solution?
Try writing something relatively simple (say, a program that plays Checkers or Tic-Tac-Toe and always wins) in a few different languages (start with ones you know, then learn some new ones to see how - if you've worked with procedural or OO languages, try functional ones).

u/toastisme · 11 pointsr/IWantToLearn

A similar question was posted on Quora not long ago, and the main recommendation was Code by Charles Petzold:

http://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

Having subsequently read the book I think it's a fantastic introduction, and it goes through everything from the importance of binary code and applying Boolean logic to circuits, to the details of the inner workings of the first microprocessors, and all in an interesting and engaging way.

u/maybefbi · 10 pointsr/compsci

Title: On Computable Numbers, with an Application to the Entscheidungsproblem

Authors: Alan Turing

Link: http://plms.oxfordjournals.org/content/s2-42/1/230.full.pdf

Abstract: In just 36 pages, Turing formulates (but does not name) the Turing Machine, recasts Gödel's famous First Incompleteness Theorem in terms of computation, describes the concept of universality, and in the appendix shows that computability by Turing machines is equivalent to computability by λ-definable functions (as studied by Church and Kleene). Source

Comments: In an extraordinary and ultimately tragic life that unfolded like a novel, Turing helped break the German Enigma code to turn the tide of World War II, later speculated on artificial intelligence, fell victim to the homophobic witch hunts of the early 1950s, and committed suicide at the age of 41. Yet Turing is most famous for an eerily prescient 1936 paper in which he invented an imaginary computing machine, explored its capabilities and intrinsic limitations, and established the foundations of modern-day programming and computability.
From his use of binary numbers to his exploration of concepts that today's programmers will recognize as RISC processing, subroutines, algorithms, and others, Turing foresaw the future and helped to mold it. In our post-Turing world, everything is a Turing Machine — from the most sophisticated computers we can build, to the hardly algorithmic processes of the human mind, to the information-laden universe in which we live. Source

u/Monguce · 10 pointsr/askscience

This is a really great book about the topic. It's much simpler than you might think but kind of tricky to explain unless you know a bit of background. The book costs less than a tenner and will give you a whole different appreciation of how computers work. Well worth a read even if it starts out seeming rather simple.

https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

u/lukeprog · 10 pointsr/Futurology

Our co-founder Eliezer Yudkowsky invented the entire approach called "Friendly AI," and you can read our original research on our research page. It's interesting to note that in the leading textbook on AI (Russell & Norvig), a discussion of our work on Friendly AI and intelligence explosion scenarios dominates the section on AI safety (in ch. 26), while the entire "mainstream" field of "machine ethics" isn't mentioned at all.

u/thetafferboy · 10 pointsr/artificial

From the comments below from /u/Buck-Nasty, /u/Jadeyard, /u/CyberByte, /u/Ken_Obiwan:

For those that haven't read it, I can't recommend Superintelligence: Paths, Dangers, Strategies highly enough. It talks about various estimates from experts and really draws the conclusion that, even at the most conservative estimates, it's something we really need to start planning for, as it's very likely we'll only get one shot at it. The time between human-level intelligence and superintelligence is likely to be very short, if systems can self-improve.
The book brings up some fascinating possible scenarios based around our own crippling flaws, such as the fact that we can't even accurately describe our own values to an AI. Anyway, highly recommended :)

u/chromaticgliss · 10 pointsr/AskComputerScience

The calculus track is typically required. Linear algebra is pretty useful (and often required). If you really wanna go into CS-specific maths, pick up a book on Discrete Math and give it a go. If you want to be really hardcore, pick up Concrete Mathematics ... good luck, hard book.

Honestly, you're probably better off spending that time learning a programming language. Specifically, whatever language your school teaches mostly. Math in a CS bachelor's isn't very intense until you get into senior/graduate level courses.

u/cabbagerat · 10 pointsr/compsci

Start with a good algorithms book like Introduction to Algorithms. You'll also want a good discrete math text. Concrete Mathematics is one that I like, but there are several great alternatives. If you are learning new math, pick up The Princeton Companion to Mathematics, which is a great reference to have around if you find yourself with a gap in your knowledge. Not a seminal text in theoretical CS, but certain to expand your mind, is Purely Functional Data Structures.

On the practice side, pick up a copy of The C Programming Language. Not only is K&R a classic text, and a great read, it really set the tone for the way that programming has been taught and learned ever since. I also highly recommend Elements of Programming. Also, since you mention Papadimitriou, take a look at Logicomix.

u/jonhanson · 10 pointsr/computergraphics

u/madebyollin · 10 pointsr/MachineLearning

The Bostrom book is the go-to reference for the sort of AI risk arguments that Musk and others endorse. Elon has previously linked to this WaitButWhy post summarizing the argument from the book, so I would read that if you're curious.
(Not that I agree with any of it, but linking since you asked)

u/ThomasMarkov · 10 pointsr/math

Gödel, Escher, Bach: An Eternal Golden Braid by Douglas R. Hofstadter is perhaps the most thought-provoking book I have ever read. It unifies music, art, and mathematics and will simply blow your mind.

u/distantocean · 10 pointsr/exchristian

That's one of my favorite popular science books, so it's wonderful to hear you're getting so much out of it. It really is a fascinating topic, and it's sad that so many Christians close themselves off to it solely to protect their religious beliefs (though as you discovered, it's good for those religious beliefs that they do). As a companion to the book you might enjoy the Stated Clearly series of videos, which break down evolution very simply (and they're made by an ex-Christian whose education about evolution was part of his reason for leaving the religion). You might also like Coyne's blog, though these days it's more about his personal views than it is about evolution (but some searching on the site will bring up interesting things he's written on a whole host of religious topics from Adam and Eve to "ground of being" theology). He does also have another book you might like (Faith Versus Fact: Why Science and Religion are Incompatible), though I only read part of it since I was familiar with much of it from his blog.

> If you guys have any other book recommendations along these lines, I'm all ears!

You should definitely read The Selfish Gene by Richard Dawkins, if only because it's a classic (and widely misrepresented/misunderstood). A little farther afield, one of my favorite popular science books of all time is The Language Instinct by Steven Pinker, which looks at human language as an evolved ability. Pinker's primary area of academic expertise is child language acquisition, so he's the most in his element in that book.
If you're interested in neuroscience and the brain you could read How the Mind Works (also by Pinker) or The Tell-Tale Brain by V. S. Ramachandran, both of which are wide-ranging and accessibly written. I'd also recommend Thinking, Fast and Slow by psychologist Daniel Kahneman. Evolution gets a lot of attention in ex-Christian circles, but books like these are highly underrated as antidotes to Christian indoctrination -- nothing cures magical thinking about the "soul", consciousness and so on as much as learning how the brain and the mind actually work.

If you're interested in more general/philosophical works that touch on similar themes, Douglas R. Hofstadter's Gödel, Escher, Bach made a huge impression on me (years ago). You might also like The Mind's I by Hofstadter and Daniel Dennett, which is a collection of philosophical essays along with commentaries. Books like these will get you thinking about the true mysteries of life, the universe and everything -- the kind of mysteries that have such sterile and unsatisfying "answers" within Christianity and other mythologies.

Don't worry about the past -- just be happy you're learning about all of this now. You've got plenty of life ahead of you to make up for any lost time. Have fun!

u/c_d_u_b · 10 pointsr/AskHistorians

Computer scientist here... I'm not a "real" mathematician but I do have a good bit of education and practical experience with some specific fields like probability, information theory, statistics, logic, combinatorics, and set theory. The vast majority of mathematics, though, I'm only interested in as a hobby. I've never gone much beyond calculus in the standard track of math education, so I too enjoy reading "layman's terms" material about math. Here's some stuff I've enjoyed.

Fermat's Enigma - This book covers the history of a famous problem that looks very simple, yet it took several hundred years to resolve.
In so doing it gives layman's terms overviews of many mathematical concepts in a manner very similar to jfredett here. It's very readable, and for me at least, it also made the study of mathematics feel even more like an exciting search for beautiful, profound truth.

Logicomix: An Epic Search for Truth - I've been told this book contains some inaccuracies, but I'm including it because I think it's such a cool idea. It's a graphic novelization (seriously, a graphic novel about a logician) of the life of Bertrand Russell, who was deeply involved in some of the last great ideas before Gödel's Incompleteness Theorem came along and changed everything. This isn't as much about the math as it is about the people, but I still found it enjoyable when I read it a few years ago, and it helped spark my own interest in mathematics.

Lots of people also love Gödel, Escher, Bach. I haven't read it yet so I can't really comment on it, but it seems to be a common element of everybody's favorite books about math.

u/DoorsofPerceptron · 10 pointsr/MachineLearning

For a maths-heavy book, I'd go with Bishop's Pattern Recognition and Machine Learning. Check out the reviews here: http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738

u/Jumballaya · 10 pointsr/FreeCodeCamp

> Felt pretty good about myself.. until I got to the algorithm section.

This is VERY normal. These are hard math concepts that take everyone a little bit to get used to. The only way you will learn these concepts is by implementing them, over and over and over and over and over.

> I would say I was getting stuck probably about half the time and would turn to the read-search-ask method.

If this were not the case, then there would be no need to learn.
I am a web developer and I look up the most inane shit on a daily basis, because there is always something that I either have never used/implemented or something I rarely use/implement (my big one here is PHP's array functions: I can never remember if the array comes before the callback or not with array_map(), but I remember that it is the exact opposite of array_filter() and array_reduce()). Embrace this, build your Google-fu, because you will always need it.

> A lot of times I was missing some small operator (code error) or somewhat minor step in the thought process, other times I would be lost entirely. Basically I wasn't thinking about how to implement my code well enough, imo.

This is 100% normal. Have you ever heard of a code review? This is where other developers review your code before it goes live. The point of this is that you cannot be 100% perfect with your code; maybe you forgot a semicolon or maybe your code is tough to read, and that is what the code review process is for. I write code in iterations to make sure that I never 'get in too deep' and the fear of removing code sets in; in each of these phases I go through a mini code review to see what is working and what isn't. I ALWAYS find some half-baked logic in my first few iterations before I really get into it, and over the last couple years I find that I need fewer and fewer iterations and that I am able to get a better 'big picture.' Don't be afraid to scrap some code and go back at it; this is your education and only you know when you understand the material. I have a bajillion abandoned side projects and so does every developer that I know.

Advice

1. Keep coding, it is the only way you are going to get better, for real.
2. Read other people's code. This is where I learn all my cool tricks; any time I find a cool project/library/framework I hit up GitHub and read through as much of the source as I can stomach.
You will be surprised at how much you learn about the parts of a project that AREN'T code, like documentation, contributing guidelines, and comment styles - things that seem secondary when first learning to program.
3. Design patterns, paradigms, data structures, algorithms. I wouldn't suggest going head-on with this stuff yet, but don't be afraid of it. The Design Patterns book (Gang of Four) is a very highly suggested book, though the examples are in C++.
4. Learn another language. The largest increases in my JS knowledge have come from learning another language and bringing that language's way of thinking back into JS. I would suggest Python because it is stupid-easy to jump in and start making cool stuff, but any language will do. Python, Java, C#, PHP, Elixir, Ruby, C++ and Go are a handful I can think of that will aid in employment as well as teach you new ways of thinking about JS.
5. Talk to developers, go to meetups, scour GitHub for misspellings in documentation and contribute. Anything that you can do to get a free mentor will be an AMAZING boon for you.

Links

6. Project Euler - Looking for something to code? Here are around 600 different problems to solve. I am pretty sure some of the FCC algorithms were taken from Project Euler, as it is a very good resource.
7. Eloquent JavaScript - A great resource, though a little dated.
8. You Don't Know JS - Another great resource on the JavaScript language. I reread these books every once in a while.
9. Professor Frisby's Mostly Adequate Guide to Functional Programming - A great primer on functional programming with JavaScript.
10. Rosetta Code - An algorithm reference across many languages; though the coding style for each tends to be a mess, it is a good getting-started reference.
11. 2017 Frontend Handbook - This is great for figuring out what to learn next when you get to that point where you don't know what to learn next.
I did FCC up through the frontend section. I started my web dev career path in 2014 and picked up FCC in mid 2015, right before getting a job in web development. The most important part of FCC is that you are coding, getting practice and making mistakes, TONS of mistakes. Just keep it up, don't get burned out, and remember that it is about your education, not how many challenges you complete. Code and read and read code.

u/Grazfather · 10 pointsr/engineering

Anyone who likes this stuff should really read Code. The author goes from tin-can phones to building a computer, in language anyone could follow.


u/NotAGeologist · 9 pointsr/computerscience

This is kind of a weird one but I’d suggest Code. Very non-technical, no programming, but cool history and fundamentals.
Code: The Hidden Language of Computer Hardware and Software - https://www.amazon.com/dp/0735611319

This is the book you want to read. It walks you through every bit of how a CPU works, in an incredibly approachable way. (If you can understand a light switch, you can understand this book.)

u/akmark · 9 pointsr/programming

I'll recommend Code, even though it isn't specifically theoretical. However, it does go over how codes (semaphore, Morse code) evolved over time. From someone who does program, this is about as 'human' a book as they come, which could fit exactly what you are looking for.

Read this book: Gödel, Escher, Bach: An Eternal Golden Braid at least 3 times, and take your time.

u/SharmaK · 9 pointsr/books

For some physics :

Gleick - Chaos

Some math/philosophy :
Hofstadter - Godel, Escher, Bach: An Eternal Golden Braid

Anything early by Dawkins if you want to avoid the atheist stuff though his latest is good too.

Anything by Robert Wright for the evolution of human morality.

Pinker for language and the Mind.

Matt Ridley for more biology.

I'm just beginning this journey myself, so judge what I say accordingly.

Artificial Intelligence: A Modern Approach seems to be the most popular textbook.

This article has some seemingly good advice, though it seems to be geared more toward Machine Learning (ML) than AI in general.

I think you'll want to learn a programming language. The above article recommends Python as it is well suited to ML.

There is (was?) a free online course on ML from Stanford by Andrew Ng. I started to take it a couple years ago but never finished. It is very accessible. The lectures appear to be on YouTube.

Grokking Algorithms is a highly regarded book on algorithms.

Make a free Amazon Web Services account and start playing with SageMaker.

There really is no well defined path to learning AI, in my opinion. It is a highly interdisciplinary endeavor that will require you to be a self-starting autodidact. It's very exciting though. There is still plenty of new ground to be broken. Some might argue it is difficult for the little guy to compete with big labs at the big tech companies with their ungodly amounts of data to feed their AI, but I am optimistic.
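In the spirit of the sort of material Grokking Algorithms opens with, here's a minimal binary search sketch in Python (my own illustration, not taken from the book):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or None if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2          # check the middle element
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1                # discard the lower half
        else:
            high = mid - 1               # discard the upper half
    return None

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

Each step halves the search space, which is why the book uses it to introduce O(log n) running time.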

u/SomeIrishGuy · 9 pointsr/IWantToLearn

Artificial Intelligence: A Modern Approach is a commonly used introductory textbook.

u/sleepingsquirrel · 9 pointsr/ECE
u/IamAlbertHofmann · 9 pointsr/learnprogramming

here you go

It's the 'hidden language', not 'secret'. Sorry about that.

This book is a very good explanation of how computers work. It starts with explaining electromechanical switches, and how you can turn a couple switches into a logic gate. Then, it shows how you can put logic gates together to do arithmetic. It goes on like that until you reach programmable computers.
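As a rough illustration of that progression (gates combined to do arithmetic), here is a hedged Python sketch - not from the book itself - of logic gates chained into a one-bit full adder, and eight of those chained into an 8-bit adder:

```python
# Basic logic gates modeled as functions on 0/1 values.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def add_8bit(x, y):
    """Chain eight full adders to add two 8-bit numbers (mod 256)."""
    result, carry = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_8bit(100, 55))  # 155
```

The point, as in the book, is that nothing beyond simple switches is needed: arithmetic falls out of wiring gates together.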

Hooray, a question I can answer!

One of the problems here is that the question is worded backwards. Binary doesn't combine to give us programming languages. So the answer to your question is somewhat to the contrary: programming languages were invented to ease the tedium of interfacing using binary codes. (Though it was still arguably tedious to work on e.g. punched cards.) Early interfaces to programming machines in binary took the form of "front panels" with switches, where a user would program one or several instructions at a time (depending on the complexity of the machine and the front panel interface), using the switches to signify the actual binary representation for the processor functions they desired to write.

Understanding how this works requires a deeper understanding of processors and computer design. I will only give a very high level overview of this (and others have discussed it briefly), but you can find a much more layperson accessible explanation in the wonderful book Code: The Hidden Language of Computer Hardware and Software. This book explains Boolean logic, logic gates, arithmetic logic units (ALUs) and more, in a very accessible way.

Basically, logic gates can be combined in a number of ways to create different "components" of a computer, but in the field of programming languages, we're really talking about the CPU, which allows us to run code to interface with the other components in the system. Each implementation of a processor has a different set of instructions, known as its machine code. This code, at its most basic level, is a series of "on" or "off" electrical events (in reality, it is not "on" and "off" but high and low voltages). Thus, different combinations of voltages instruct a CPU to do different things, depending on its implementation. This is why some of the earliest computers had switch-interfaces on the front panel: you were directly controlling the flow of electricity into memory, and then telling the processor to start executing those codes by "reading" from the memory.

It's not hard to see how programming like this would be tedious. One could easily write a book to configure a machine to solve a simple problem, and someone reading that book could easily input the code improperly.

So eventually as interfacing with the machine became easier, we got other ways of programming them. What is commonly referred to as "assembly language" or "assembler" is a processor-specific language that contains mnemonics for every binary sequence the processor can execute. In an assembly language, there is a 1:1 correlation between what is coded, and what the processor actually executes. This was far easier than programming with flip-switches (or even by writing the binary code by hand), because it is much easier for a human to remember mnemonics and word-like constructs than it is to associate numbers with these concepts.

Still, programming in assembly languages can be difficult. You have to know a lot about the processor. You need to know what side-effects a particular instruction has. You don't have easy access to constructs like loops. You can't easily work with complex datatypes that are simply explained in other languages -- you are working directly with the processor and the attached memory. So other languages have been invented to make this easier. One of the most famous of these languages, a language called "C," presents a very small core language -- so it is relatively easy to learn -- but allows you to express concepts that are quite tedious to express in assembler. As time has gone on, computers have obviously become much faster, and we've created and embraced many languages that further and further abstract any knowledge about the hardware they are running on. Indeed, many modern languages are not compiled to machine code, but instead are interpreted by a compiled binary.
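To make that 1:1 mnemonic-to-opcode correspondence concrete, here is a sketch of a toy assembler in Python for an entirely hypothetical machine (the mnemonics and opcodes below are invented for illustration, not from any real instruction set):

```python
# Hypothetical machine: each mnemonic maps to exactly one opcode byte.
OPCODES = {
    "LOAD":  0b00000001,  # load the next byte into the accumulator
    "ADD":   0b00000010,  # add the next byte to the accumulator
    "STORE": 0b00000011,  # store the accumulator at the next address
    "HALT":  0b11111111,  # stop execution
}

def assemble(source):
    """Translate assembly text into a flat list of bytes, one-to-one."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        program.append(OPCODES[parts[0]])           # mnemonic -> opcode
        program.extend(int(p) for p in parts[1:])   # operands pass through
    return program

code = """
LOAD 7
ADD 5
STORE 16
HALT
"""
print(assemble(code))  # [1, 7, 2, 5, 3, 16, 255]
```

An assembler is, at heart, just this lookup table plus bookkeeping - which is why assembly is so much easier for humans than raw binary, while still being exactly as expressive.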

The trend here tends to be making it easier for people to come into the field and get things done fast. Early programming was hard, tedious. Programming today can be very simple, fun and rewarding. But these languages didn't spring out of binary code: they were developed specifically to avoid it.

TL;DR: People keep inventing programming languages because they think programming certain things in other ones is too hard.

u/mcscottmc · 9 pointsr/compsci

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

This book explains how computers work from first principles (electricity and switches on up). Very easy to read. I am surprised it hasn’t been mentioned yet.

u/weasler · 9 pointsr/compsci

Code is an absolute classic.

u/shwestrick · 9 pointsr/compsci

Sipser's Introduction to the Theory of Computation is an excellent book with three major parts:

1. Automata and Languages
2. Computability Theory
3. Complexity Theory

Each part builds on the previous. I highly recommend working through it.
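As a small taste of the automata material in part 1, here is a hedged Python sketch of a deterministic finite automaton accepting binary strings with an even number of 1s - a standard introductory example, though not necessarily Sipser's exact presentation:

```python
# DFA: states 'even' and 'odd' track the parity of 1s seen so far.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(string, start="even", accepting=frozenset({"even"})):
    """Run the DFA; accept iff we end in an accepting state."""
    state = start
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

print(accepts("1001"))  # True (two 1s)
print(accepts("1011"))  # False (three 1s)
```

The machine has no memory beyond its current state, which is exactly the limitation part 1 of the book explores before moving to more powerful models.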
u/pm_me-your_tits-plz · 9 pointsr/learnpython

I haven't read it myself, but it has been recommended to me. https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

Edit: PM me if you want a free copy (have it in epub, mobi and pdf)
EDIT: I stand corrected, I was thinking of another book that was azw3 format.

u/starkprod · 9 pointsr/worldnews

The whole Terminator/Skynet scenario isn't what they are afraid of either. If you would like to know more on the subject matter, I would suggest reading "Superintelligence" by Nick Bostrom, which paints a pretty good picture of the problem with AI. https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742

TL/DR of parts of the book:

1: It is stupidly difficult to design an AI that has the same frame of reference as humans. There are many reasons for this, well described in the book. In short, we humans have values such as good and bad; a machine is unlikely to share ours, and might aim for success over failure without regard to what we would call bad side effects.

2: This leads to many scenarios where you tell an AI to do a thing, and it does just that, but in a way that will harm humans - not as a means or an end, just as a byproduct. Harm is a broad term: without needing to kill us, re-routing all available power to a specific calculation would have serious ramifications for anything using electricity, and using all available resources to create more computers or electricity would also be a big problem for human existence or society as we currently know it. Suggest reading a summary here: https://en.wikipedia.org/wiki/AI_control_problem#The_problem_of_perverse_instantiation:_.22be_careful_what_you_wish_for.22

3: Since 1 and 2 are difficult, it's difficult to create reliable safeguards. There is a lot of theory on how to build them, but all in all, they are not easy to do, and what's worse, you often have no way of knowing whether they work until they fail.

4: Since 3 is difficult, corporations or governments might not fully verify that they have taken the necessary precautions, since doing so might make them fall behind in the race to develop said AI - increasing the risk of a catastrophic failure.

5: A self-improving general AI will be able to improve at an extremely rapid pace.

6: Combine all of the above and we get a non-zero chance that we develop an AI that we cannot understand (or that understands us, or might not care about us, for that matter) and that we have no way of stopping. Said AI may be doing all in its power to help us with what we are asking of it, and as a byproduct of doing just that, might turn the planet into a giant solar panel. That's not to say this is the default outcome, but it is likely enough to worry about. The thing is, if it does happen, it's non-reversible. And currently, we are not sure how to prevent such a scenario.

TL/DR of the TL/DR: The Terminator scenario is extremely unlikely. What people are afraid of is that we might just fuck up because "we are not building Skynet, we are making an intelligent paperclip counter!" without realizing that there are big dangers even in this extremely simple scenario.

u/VelveteenAmbush · 9 pointsr/MachineLearning

> I can't help but cringe every time he assumes that self-improvement is so easy for machines that once it becomes possible at all, AI skyrockets into superintelligence in a matter of weeks.

He doesn't assume it, he concludes it after discussing the topic in depth.

Pages 75-94 of his book. Preview available via Amazon.

u/Newt_Hoenikker · 9 pointsr/C_Programming

C and C++ are pretty different nowadays depending on your standard. "Game engine" is a pretty generic descriptor, because you can build game engines in a lot of different ways depending on your needs for the genre and how all-encompassing your engine needs to be, so I'm going to ask you a few questions about specifics in regards to your experience which might help to flesh out where you can start your search.

• How are you with Data Structures? Algorithms? Which are you familiar with? How extensively have you used them? CLRS is a decent starting point with pretty broad coverage and good descriptions.

• What about Design Patterns? Which ones? How much? They're not so much applicable in C, but C++ and other OOP languages are lousy with them. My university was way too into this book, but it wasn't bad; bonus: all the examples are in C++.

• How portable is your code, generally? What's your programming environment like? Windows? Mac? Linux? *BSD? Games are usually Windows oriented, but there's a lot that C/++ can do aside from that, and IMHO the best way to learn systems programming is with C and a Unix-like OS.

• What is it exactly that you want to accomplish with your code? A broader engine? A more portable engine? Something not game related? In my experience learning for the sake of learning is great and all, but I lack drive without a concrete goal I'm working toward.

Hope this helps.
u/ActionHotdog · 9 pointsr/cscareerquestions

Knowing a giant list of programming languages is really overrated. Instead, focus on learning new programming concepts.

Sometimes that can mean learning a new language, but not always. Some examples:

• Design patterns! There are so many, and while a lot of them are pretty niche, it's always helpful to be able to identify what someone else's code is doing, or identify when a given problem can be partially solved using one. The Gang of Four is the go-to book for this area.

• On the other end, anti-patterns. Being able to identify (and fix) poor design choices that can cause maintainability problems later is really valuable.

• Memory management. Java/C# handle most of it for you, but understanding what's happening under the covers is crucial for making design decisions (i.e., how much garbage needs to be collected as a result of using this API versus a different one?). C/C++ is the king of this area, so you could use this as a reason to learn a new language.

• Functional programming (Scheme, Lisp). It's really different from imperative (Java, C#, C++) programming.

• Advanced features of languages that you already know. Generics, operator overloading, etc. You might know many of these, but I doubt you know all of them.

And regarding your concern of it being harder to learn new languages later, you'll only really have that problem when learning a vastly different language (such as Scheme when compared to your C#). Once you know one language in the same "family", a lot of knowledge carries over.
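The functional-versus-imperative contrast in the list above can be sketched without even leaving one language. Here is a minimal illustration in Python standing in for Scheme/Lisp (an added sketch, not part of the original comment):

```python
from functools import reduce

# Imperative style: describe *how* to compute, mutating state step by step.
def total_imperative(xs):
    total = 0
    for x in xs:
        total += x  # state changes on every iteration
    return total

# Functional style: describe *what* the result is, as a fold with no mutation.
def total_functional(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

print(total_imperative([1, 2, 3]))  # 6
print(total_functional([1, 2, 3]))  # 6
```

Both compute the same sum; the difference is that the functional version never reassigns a variable, which is the habit of mind languages like Scheme force on you.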
u/siddboots · 9 pointsr/statistics

It is hard to provide a "comprehensive" view, because there's so much disparate material in so many different fields that draw upon probability theory.

Feller is an approachable classic that covers all of the main results in traditional probability theory. It certainly feels a little dated, but it is full of the deep central limit insights that are rarely explained in full in other texts. Feller is rigorous, but keeps applications at the center of the discussion, and doesn't dwell too much on the measure-theoretical / axiomatic side of things. If you are more interested in the modern mathematical theory of probability, try Probability with Martingales.

On the other hand, if you don't care at all about abstract mathematical insights, and just want to be able to use probability theory directly for every-day applications, then I would skip both of the above, and look into Bayesian probabilistic modelling. Try Gelman, et. al..

Of course, there's also machine learning. It draws on a lot of probability theory, but often teaches it in a very different way to a traditional probability class. For a start, there is much more emphasis on multivariate models, so linear algebra is much more central. (Bishop is a good text).

u/mzieg · 8 pointsr/learnprogramming
u/munificent · 8 pointsr/ruby

One of the things we can do to make our lives harder in software engineering is muddle terminology. It adds needless friction to our job if when you say "flernb" you're referring to a widget but when I say it, I'm talking about a whozit.

In low level code, we're pretty good about sticking to the first term that was coined for something. If you have a series of objects, each with a reference to the next, you'll probably call it a "linked list", just like I would.
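That shared vocabulary maps to a very concrete structure. A minimal sketch of a singly linked list (in Python, added as an illustration rather than taken from the comment):

```python
class Node:
    """One element of a singly linked list: a value plus a reference to the next node."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node  # None marks the end of the list

# Build the chain 1 -> 2 -> 3
head = Node(1, Node(2, Node(3)))

# Traverse by following each node's reference to the next
values = []
node = head
while node is not None:
    values.append(node.value)
    node = node.next

print(values)  # [1, 2, 3]
```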

But when it comes to higher-level architecture and design, names get a lot messier. Fortunately, some dudes wrote a book that tries to be the beginning of a taxonomy for architecture. Ultimately, the names they pick are arbitrary, but if we all agree to stick to them, then they become concretely useful.

What the author describes here is not a proxy. A proxy is a local object that represents a distant one, usually on another machine or in another process. It's not a very common pattern outside of distributed systems.
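As a rough sketch of that definition (hypothetical names, with the actual network hop stubbed out for brevity):

```python
class RemoteService:
    """Stands in for an object that really lives on another machine or process."""
    def fetch(self, key):
        return f"value-for-{key}"

class ServiceProxy:
    """The proxy: a local object exposing the same interface as the distant one.

    A real implementation would serialize the call and send it over the wire;
    here the forwarding is direct, just to show the shape of the pattern.
    """
    def __init__(self, remote):
        self._remote = remote

    def fetch(self, key):
        return self._remote.fetch(key)

# Client code talks to the proxy exactly as if it held the real object.
proxy = ServiceProxy(RemoteService())
print(proxy.fetch("user:42"))  # value-for-user:42
```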

u/rafuzo2 · 8 pointsr/androiddev

Here's a rough outline, from high level to low(ish):

• Fundamentals of Object Oriented Programming. Understand basic concepts like inheritance and encapsulation and why you'd use them. Any book with these words in the title/subtitle should be able to get you the basics, pull a highly-rated book off Amazon for recommendations.

• (This is optional but highly recommended) Learn basic design patterns such as those presented in the Gang of Four book. These aren't required for writing Android apps but the more you understand about patterns the more it'll help you later on. You don't need to master this stuff at the outset so just read at your leisure.

• Learn Eclipse. It's a big subject and for seasoned veterans the various components can be confusing, but you should know how to use an Integrated Development Environment regardless.

• Follow the tutorials that VersalEszett mentioned.

Extra Credit
• Get an account on Github and understand how Git works. It's free unless you want private repos. Google around, find android development projects that are public, clone the repos and walk through the code - retype it line by line if you need to. Try to figure out why things are broken out the way they are.

That will get you started but there's tons more. The best thing you can do is write code and read code and be patient.
u/bluecoffee · 8 pointsr/MachineLearning

If you're having to ask this, it means you haven't read enough textbooks for reading papers to make sense.

What I mean is that to make sense of most research papers you need to have a certain level of familiarity with the field, and the best way to achieve that familiarity is by reading textbooks. Thing is, if you read those textbooks you'll acquire a familiarity with the field that'll let you identify which papers you should focus on studying.

Now go read MLAPP cover to cover.

u/Zaemz · 8 pointsr/programming

This is awesome! I've been slowly getting more and more interested in hardware, and this is something I would absolutely love to do. I just don't know where to start.

I've been reading a couple of books about learning lower level stuff, and planned on working my way up.

I'd really like to get out of webdev and into low-level programming, or even hardware design and implementation. There's sooooo goddamn much to learn, that I doubt I'll be ready without getting a BS in Comp. Engineering, and maybe a master's as well.

(I'm absolutely a beginner, and if anyone is interested in the books I've been reading, these are they:

u/azimuth · 8 pointsr/compsci

Also Code: The Hidden Language of Computer Hardware and Software. It literally starts out with telegraphs, and shows how, if you are sufficiently crazy, they can be assembled into a working computer. Then it shows how you can write software for your telegraph-relay-cpu. A great read.

u/Daganar · 8 pointsr/programming

For anyone interested in this kinda stuff I would really recommend "Code: The hidden language of computer hardware and software"
https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

u/help_me_will · 8 pointsr/actuary

Against The Gods: The Remarkable Story of Risk - Outlines the history of probability theory and risk assessment through the centuries

https://www.amazon.com/Against-Gods-Remarkable-Story-Risk/dp/0471295639/ref=sr_1_1?s=books&ie=UTF8&qid=1475105434&sr=1-1&keywords=against+the+gods

When Genius Failed - A narrative of the spectacular fall of Long Term Capital Management, a hedge fund which had on its board both Myron Scholes AND Robert Merton (you will recall them from MFE)
https://www.amazon.com/When-Genius-Failed-Long-Term-Management/dp/0375758259/ref=sr_1_1?s=books&ie=UTF8&qid=1475105453&sr=1-1&keywords=when+genius+failed

Black Swan / Antifragile - A former quant discusses the nature of risk in these controversial and philosophical books. Some parts are actually called out and shamed in McDonald's Derivative Markets; one or both of them are worth reading

https://www.amazon.com/Black-Swan-Improbable-Robustness-Fragility/dp/081297381X/ref=sr_1_1?s=books&ie=UTF8&qid=1475105478&sr=1-1&keywords=black+swan

Godel, Escher, Bach - A very dense look into recursive patterns in mathematics and the arts. While not actuarial, it's obviously very mathematical; a must-read.

https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567/ref=sr_1_1?s=books&ie=UTF8&qid=1475105497&sr=1-1&keywords=geb

Endurance - This was recommended to me by a pure mathematics professor. Again, not actuarial, but more about the nature of perseverance through problem solving (sound familiar?). It's about Shackleton's famous voyage to the South Pole.

https://www.amazon.com/Endurance-Shackletons-Incredible-Alfred-Lansing/dp/0465062881/ref=sr_1_1?s=books&ie=UTF8&qid=1475105520&sr=1-1&keywords=endurance+shackleton%27s+incredible+voyage

u/chindogubot · 8 pointsr/compsci

I get teased by people that I am one of only 3 people in the world to have actually finished this book, one of those being the author and the other being the person who recommended it to me, but Godel, Escher, Bach: An Eternal Golden Braid was pretty interesting. It covers the profoundness of the topic and is interspersed with Alice in Wonderland style dialog that comes at the topic from another angle. Deep but captivating overall.

On a tangent, Gödel's incompleteness theorem and Turing's halting problem, along with some other mathematicians who have gazed out over the edge of logic and gone somewhat mad, are covered in the BBC documentary Dangerous Knowledge.

u/MrPhantomZz · 8 pointsr/6thForm

Could read a book based on your interests in computer science, e.g. AI, Machine Learning, data science etc.

A good book that I recently picked up was
[Code: The Hidden Language of Computer Hardware and Software by Charles Petzold] (https://www.amazon.co.uk/dp/0735611319/ref=cm_sw_r_cp_api_qB4TBbG90ANHN)

u/prego_no_pao · 8 pointsr/portugal

I just finished reading Code (1999). It's a good introduction to how a computer works, based on its evolution through history.

u/fiskfisk · 8 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software from Charles Petzold does just that, starting from the simplest form and going through all the different steps we took to get to where we are today. Well worth a read!

u/codeificus · 8 pointsr/programming

The 86 stands for the instruction set of the CPU. Basically, every chip designed in the world accepts input and output, but in different ways (different numbers of connections, different ordering). All of those chips have more or less backwards compatibility in that regard, which makes it easier for others to develop around them.

So there is a meaning conveyed, though it probably isn't important to you if you aren't developing hardware or writing assembly.

I strongly recommend Code by Charles Petzold which explains the origins of these chipsets. Basically Intel put out the 8080 in 1974 which was an 8-bit processor, then the 8086 in 1978 was a 16-bit processor, so they just ran with the number scheme (6 for 16 bit). The "80" from 8080 probably came from IBM punchcards which were used for the US census (since the 1920s!), which is actually how IBM started, basically as the child of Herman Hollerith who built automated tabulating machines in the late 19th century. Also this is to blame for the 80-character terminal convention. Blame IBM.

u/lightforce3 · 8 pointsr/tech

What you seem to be asking is "how do computers work?" At any rate, the interaction of hardware and software is fundamental to any computer system, whether it's your fitness band or your cell phone or a supercomputer or the computer in your car engine or The Next Big Thing.

How that works is a really, really big question. Rather than attempt to answer it, I'll suggest you check out the book Code by Charles Petzold. It explains how computer hardware and software work, starting with basic electrical circuits and building up layer by layer until you're running an operating system and application software. That might seem like a lot to cover, but Code does it simply and cleanly, in a way that just about anybody can digest and understand.

u/shaggorama · 8 pointsr/learnpython

Starts in January: https://www.coursera.org/course/aiplan

EDIT: In case you can't wait a month (which according to /u/sovietmudkipz is apparently a completely unreasonable amount of time to wait for a free college course), check out this textbook: it's my understanding that it's basically the gold-standard for intro-AI education.

u/FranciscoSilva · 8 pointsr/computerscience

Well, for AI, you should prepare for a world of math, math, math, along with computer science and programming (obviously). Understanding the history of A.I. is also important, so I would consider starting with this particular book: Artificial Intelligence: A Modern Approach! This is a college-level A.I. book, so be patient if there are things you don't fully understand at first. Work hard and you can do anything you set your mind to!

u/911bodysnatchers322 · 8 pointsr/conspiracy

Gnostic Globalists / Fascists

u/Sk8nkill · 8 pointsr/IAmA

Hijacking to plug a couple of other books really worth reading if you're into this sort of thing:

Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization by K. Eric Drexler

The Singularity is Near by the aforementioned Ray Kurzweil

u/ringl-bells · 8 pointsr/technology

Everyone should read SuperIntelligence by Nick Bostrom.

Non-affiliate Amazon link: Superintelligence: Paths, Dangers, Strategies

u/mastercraftsportstar · 8 pointsr/ShitPoliticsSays

I don't even think we'll get that far. I honestly believe that once we create proper A.I. it will snowball out of control in a matter of months and turn against us. Their communist plans are a mere fever dream when it comes to A.I. "Well, if the robots are nice to us, don't destroy the human species, and actually are subservient to us, then our Communist fever dream could work"

Yeah, okay, it's like trying to decide whether you want chicken or fish for the in-flight meal while the plane is going down.

I recommend reading Superintelligence if you want to get more theories about it.

u/VorpalAuroch · 8 pointsr/artificial

Sotala and Yampolskiy, Bostrom's book, Infinitely descending sequence... by Fallenstein is a really interesting, clever solution to a piece of the puzzle. I'm not sure what you're looking for, particularly; everyone currently working on the question is pretty invested in it, because it's still coming in from the fringe, so it's all going to be people you'll denounce as "not credible".

u/darkroastbeans · 8 pointsr/compsci

Props to you for working through a textbook with python! Someone recommended Concrete Mathematics to me a while back as a good resource for really sharpening my math skills. Knuth is one of the authors, and while the book does not discuss programming at all, I'm interested to see how, as a programmer himself, he explains mathematics.

A Khan video series would be awesome. I think there would be a lot of demand for it.

I think the thing I have the most trouble with is understanding math notation. For some reason all the symbols really confuse me. I can learn a new programming language syntax with relative ease, but when I try to learn math syntax, my mind just goes blank. Not sure why this is an issue.

u/RedSpikeyThing · 8 pointsr/programming

Concrete Mathematics by Graham, Knuth and Patashnik is a great start to almost all of those topics.

u/JohnKog · 8 pointsr/compsci

You probably already have, but if not, definitely read Design Patterns, which is old but a classic. I'd also highly recommend the Pragmatic Programmer.

EDIT: I just want to say, that I also fully support alienangel2's answer. I wanted to recommend a couple good books to get you on "the path", but ultimately, the best thing by far is to find a job that grows you. For some people, the best way to do that is to work at a super small startup, where everything you're building is from scratch. For others (like me), the best way is to work at a company with tons of really smart people who have already built great software, and learning from them and the choices they've made (and why). And if you still feel like you're regressing since school, maybe that's the answer: go back to school (i.e. get a Master's or PhD)!

u/herpington · 8 pointsr/learnprogramming

These are all good points.

With respect to Design Patterns, I feel that the holy grail is Design Patterns - Elements of Reusable Object-Oriented Software by Gamma et al.

u/fisat · 8 pointsr/MachineLearning

Read Hands on Machine Learning with Scikit-learn and Tensorflow. This book is awesome.

https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

u/grumpy_youngMan · 8 pointsr/movies

I like the premise that man creates something that chooses to destroy him in the end. A lot of AI experts raise this as one of the biggest concerns of artificial intelligence. A book called Superintelligence [0] goes into this. Even Elon Musk, everyone's go-to innovative tech thinker, recommends this book as a caution against overdoing it with AI.

That being said, everything else was really a let down to me. They just brushed over the fact that David killed all the engineers? Why was the crew so damn stupid and careless? They went to a new planet, breathed the air, interacted with the vegetation, didn't think about quarantining the sick people...I refuse to believe that the 2nd in command would allow the crew to be this bad in ways that are obvious.

The material seems too stretched out where we don't need it to be (e.g. the existential debate between two robots), and then it's just thrown at us where I would prefer more detail (David killing all the engineers, understanding the engineers).

0: https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742

u/original_evanator · 8 pointsr/cscareerquestions
u/dogewatch · 7 pointsr/learnprogramming

The Grokking Algorithms Book is good for beginners or for those who want a fun refresher. Obviously not too much depth but teaches it in a nice illustrated way. Code is also available on github.

u/dear_glob_why · 7 pointsr/javascript

There's more than a single design pattern for applications. Recommended reading: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented-ebook/dp/B000SEIBB8

u/cemremengu · 7 pointsr/csharp

Those who want more detailed info on these should check the Design Patterns book by GOF

u/SidewaysGate · 7 pointsr/compsci

When I was a young teenager I read (most of) a book called Code.

This was absolutely fantastic. It didn't just talk about programming or about software, it explained the concept of a code and the work that we do from the ground up. It literally started from light bulbs and switches and went to microprocessors and programming languages. This is the book that helped me bridge the software-hardware cognitive gap. Eventually it got to be too much for me, but in my defense I was 12-13 at the time. Even so, the parts that I did get through stuck with me.

I'm going to go back and reread it.

The book isn't going to cover design patterns or microservices, but IMO it's the best thing to give computer scientists context on what we're doing here from an engineering perspective (with Sipser as the book from the mathematical perspective)

&gt;I want to be able to understand how computers work

Code: The Hidden Language of Computer Hardware and Software

I was on the search for the same as you a couple of weeks ago and people recommended the book above. I just recently started reading it but hopefully someone who has read it can chime in with their opinion.

u/deiphiz · 7 pointsr/learnprogramming

Okay, I'm gonna plug this in here. I hope this doesn't get buried because when I saw someone asking about low level stuff here, I couldn't help but make this recommendation.

For anyone that wants to learn more about low level computer stuff such as assembly code should read the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. I've been reading it recently and I'm really glad I picked it up. It really delves into how a computer really works, like how languages relate to the circuits on a board.

So yeah, /u/DEVi4TION, I recommend picking this up if you wanna know about more stuff like this. Then maybe try your hand at actual 6502 programming later :P

u/InvalidGuest · 7 pointsr/computerscience

I'd recommend this one: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's very enjoyable to read and definitely increases one's understanding of how computers conceptually function. Petzold also makes it very easy to understand what he is saying in his explanations.

u/ArmenShimoon · 7 pointsr/csharp

They seem a like reasonable starting point I think. Repetition is the mother of mastery, the more books the better (in addition to applying what is learned).

Since Mosh is calling out learning fundamentals as important to becoming a good C# developer, I would personally also recommend some general (non-C#-specific) books too for those who are starting out in software development:

1. Design Patterns (Amazon) - also known as the "Gang of Four" Design Patterns, it was originally published in 1994 and is still relevant today. When people talk about design patterns, they're referring to this book more often than not.

2. Soft Skills (Amazon) - Not a book on programming, actually... it's a software developer's life manual. The reason I like this book is it covers the other parts of the life of a developer that I haven't seen covered anywhere else. Everything from learning strategies, time management, career advice, and even some health and fitness. It was an enjoyable read and I think other developers would enjoy it too.

3. The Passionate Programmer (Amazon) It's been a while since I've read this one, but I remember it giving decent advice for building a career in software development. Not to be confused with The Pragmatic Programmer (Amazon) which should be read at some point too.

There's a ton more, but those are a few that stood out to me. Essentially the more the merrier in my opinion - books, courses, videos, tutorials, and so on. The books I'm recommending here focus on adopting the developer mindset and being successful at it. That's part of the puzzle.

The other part is understanding the technical details including the programming language and frameworks you intend to use.

And finally, for learning about C#, I do highly recommend Mosh's videos/courses (some are free on YouTube, others available on Udemy). He's got a unique ability to explain things clearly and simply in a way that beginners can pick up quickly.

What I'd do is check out his free content first, and if you agree his style is ideal for learning, an investment in one of his courses is well worth it since he'll cover a lot more breadth and depth on each of the topics and they're organized into a super consumable package rather than scouring the internet for various topics.
u/mrcleaver · 7 pointsr/java

Learn by experience and by reading is probably the way to go. The gang of four's design patterns is still the de-facto standard:
http://www.amazon.ca/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

I really love this book for Java design patterns though, fun to read and really informative:

Then it's a matter of knowing when and where to apply them, which is a harder problem and just an experience thing I'd say.

u/yoho139 · 7 pointsr/ProgrammerHumor

From wikipedia, and I highly recommend this book as reference material.

u/gerundpu · 7 pointsr/philosophy

Yes, and if you want to follow this deeper into the context of consciousness, check out this book: GEB

There's a series of chapters discussing the localization of brain functions. The author discusses a study on rat brains, in which maze-running rats had significant portions of their brains removed, and were allowed to heal. Most rats were still able to re-learn the maze.

u/justanothercactus · 7 pointsr/DesignPorn

You might like this [book] (https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567)...Gödel, Escher, Bach: An Eternal Golden Braid, the cover has an even better whatever you call that effect/illusion.

u/shobble · 7 pointsr/books

In Search Of Schrodinger's Cat by John Gribbin is a very readable physics and quantum physics history sketch. Might be slightly dated now, although I can't think of anything directly contradicted by recent work. Then again, I'm not actually a physicist :)

The Quark and the Jaguar is quite a bit more complicated, but still quite accessible to the layperson and has a lot of interesting stuff.

Slightly less sciency, more maths/logic/computation is Gödel, Escher, Bach: An Eternal Golden Braid

A Guinea Pig's History of Biology is pretty much what the title says, although there's an awful lot about fruit-flies too. Quite a good review of the history of biological experimentation, especially genetics.

H2O: A Biography of Water from a previous editor of Nature, covers water across a variety of fields. The second half of the book is mostly a rant about cold fusion and homoeopathy though, from what I recall, but the first half makes up for it.

Most general-audience things by Richard Feynman are well worth the read. He's got some great physics lectures, and his autobiography (Surely You're Joking, Mr Feynman?) is fun, but more for the anecdotes than the science.

Those are off the top of my head. If its something in a particular field, I might have some other ideas I'm currently forgetting.

u/HarlequinNight · 7 pointsr/math

You would love Godel Escher Bach by Douglas R Hofstadter. It won the Pulitzer Prize and is basically just a really good popular math/computer science/art book, and an excellent jumping-off point. Yes, it lacks mathematical rigor (of course), but if you are a bright, clever person who likes these things, it's a must-read just for exposure to the interconnectivity of all of these topics in a very artistic and philosophical way. But be prepared for computer code, musical staff notation, DNA sequences, paintings, and poetry (all themed around Gödel, Escher and Bach).

u/Mythiees · 7 pointsr/todayilearned

I don't follow?

At some point we started asking questions about the world. There came a time where 'something' emerged in us and we started questioning the world around us.

Questions are investigations about how the world (and here 'world' is everything in the immediate environment) works. This leads to 'what if' scenarios, equivalencies 'is this thing like the other?' and sets 'I belong to the group called 'men', she belongs to the group called 'women'. In the group called 'women' there is the subset of 'women' that are my offspring. Godel, Escher, Bach yourself on sets and other concepts.

So, we learned how to ask questions and the answers to those questions lead to more questions. All this leads to the internet and us meeting. Our interaction is the result of an unbroken chain of questions that has brought us from the savanna all the way to here. Think about that.

u/PsychedelicFrontier · 7 pointsr/RationalPsychonaut

What a great question, and an interesting example. For those confused by OP's example, check out Gödel's Incompleteness Theorem on Wiki. Better yet, read the insightful and very trippy Pulitzer Prize winning book, Gödel, Escher, Bach. Gödel's theorem is a bit abstract but it was both a monumental and surprising discovery. It's not just mathematical -- it's meta-mathematical, in that it reveals the limitations inherent to any mathematical framework or system. From wiki:

&gt;The first incompleteness theorem states that no consistent system of axioms...is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.

I'll point out an obvious one, though it's more to do with the aesthetics of the psychedelic experience rather than insights or ideas. Psychedelic hallucinations tend to be geometric, with lattices, grids, spirals, and perhaps most intriguing of all, fractals. All these are geometric forms that can be rigorously defined and analyzed by math. Fractals are especially fascinating because they exhibit self-similarity at every scale, appear sometimes in nature (for example, coastlines), and look extremely trippy. (Seriously, just look at these zoom-ins of the Mandelbrot set, discovered in 1978.)
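The Mandelbrot set mentioned above is generated by a strikingly simple rule; a minimal escape-time membership test (an added sketch, approximate because the iteration count is capped) looks like this:

```python
def in_mandelbrot(c, max_iter=100):
    """Return True if iterating z -> z**2 + c from z = 0 stays bounded
    within max_iter steps (an approximation of true set membership).

    If |z| ever exceeds 2, the orbit provably escapes to infinity,
    so c is outside the set.
    """
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(0))   # True: the origin never escapes
print(in_mandelbrot(1))   # False: the orbit 0, 1, 2, 5, ... blows up
```

Plotting this test over a grid of complex values of c is exactly how those famous zoom images are rendered.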

u/egben · 7 pointsr/science
u/Robin_Banx · 7 pointsr/datascience

Almost the exact same trajectory as you - graduated with a psych degree, learned a lot of stats and experiment design, then did the Coursera ML course.

Reading this book is probably the biggest thing that took me from knowing there to doing well in interviews (before that it was just scattered projects): https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291 A second edition is coming out pretty soon, so watch out for that.

If I were doing it today, this is probably the best material out there: https://www.dunderdata.com/ It starts from scratch and gives you an amazing tour of Pandas. Author's also working on a practical Machine Learning book.

u/darkardengeno · 7 pointsr/singularity

&gt;Like Elon Musk on AI. There's zero difference between them, they are both ignoramuses spewing bullshit on a subject they know nothing about.

There's at least one difference: Carrey is wrong about vaccines, Musk is right about AI. As it happens, that's the only difference I care about.

&gt; there have been two deaths already

Are you joking? There were almost 30 thousand Model S's on the road in 2017. During that same year 40 thousand people in the US died in car crashes. The Model S is probably the safest car ever made but the only perfectly safe car is one that no one ever drives. Two deaths out of that sample is pretty good, though perhaps not excellent.

Out of curiosity, what are your qualifications to be speaking so strongly on AI? What experts do you read in the field that offer dissenting opinions from Musk, Bostrum, Hinton, or Tegmark? Or, for that matter, everyone that signed this letter?

u/EricTboneJackson · 7 pointsr/videos

&gt; The protagonist ship crew aren't real people, they're ones and zeros that Daily created. He is torturing them, no doubt, but it's indistinguishable in many ways from what we do to NPCs in computer games now, just more advanced.

You miss the point of the episode, and presumably of several Black Mirror episodes, if you don't grant that "ones and zeros" can be genuinely conscious.

Roger Ebert famously made the same mistake when reviewing A.I. The entire premise of the movie is that its protagonist, an android named David, is not only conscious and self aware, but experiences emotions just as humans do. Failing to accept that premise, which is carefully established in the first scene of the movie, Ebert proceeds to simply not get the movie. He sees an "advanced NPC" where he should be seeing a lonely little boy who happens to be implemented in silicon: "A thinking machine cannot think. All it can do is run programs that may be sophisticated enough for it to fool us by seeming to think. [..] the movie intends his wait to be poignant but for me, it was a case of a looping computer program -- not a cause for tears, but a case for rebooting."

The fact is, you -- your thoughts and emotions -- are produced by perfectly ordinary physics in your brain. We have no reason to believe that we won't be able to someday build machines that do exactly as the brain does. From a neuroscience, computer science, and physics perspective, we know of nothing that would prevent this, and we're getting close enough now that the potential existential crisis has been talked about lately by a lot of really smart people.

But that's moot, because even if you don't accept that this is possible, it's a fundamental premise of that episode. One of my favorite Ray Bradbury stories involves humans who have crash landed on Mercury. In that story, this causes the human life cycle to accelerate such that we are born, grow to maturity, get old and die, in 8 days. This is obviously not a scientifically plausible premise, but that doesn't matter. It's the setup for the story. It's how that world works, and a logically coherent story, consistent with that world, emerges from that premise.

In this episode, Daily has created AI that can think and feel, just as we do. That's the premise. But he has them captive. He can create and destroy them at will, torture them in unimaginable ways, and that's the major point of the episode. We're on the cusp as a species of actually being able to do this. Not in the glamorized way shown in the episode (at least not at first), where digital minds also have digital bodies and perfect digital worlds where they can basically behave just like humans, but in ways that are potentially much more horrifying.

Imagine that we create the first digital mind by accident, and because of computer speeds, it lives out a subjective 10,000 years in total isolation, with no sensory input, going completely mad before we even figure out what we've done. Imagine that we perfect making digital minds and conscript them to do all our thinking labor for us, as slaves that we boot in a fresh state every morning and reset every night. Imagine that we can have pet minds, as in this episode, and you start to see the dark potential that it speaks to so entertainingly.

Further reading: Superintelligence, by Nick Bostrom (Oxford professor).

&gt; we turn against Daily even though in the end he really is just a creep doing creepy (but legal) things

We turn against him because he's doing flat out evil things. It's completely irrelevant that it's legal. If we see a film of someone whipping their slaves in the 1700s, we turn against them, too, despite the fact that what they're doing is legal. "Legal" does not equal "moral", not in the past, not today, and not in the future.

u/argvnaut · 7 pointsr/gamedev

Check out Programming Game AI by Example and Artificial Intelligence for Games. They are both decent game AI books; the former actually develops an AI for a soccer simulation in chapter four. There's also [Behavioral Mathematics for Game AI](http://www.amazon.com/Behavioral-Mathematics-Game-Dave-Mark/dp/1584506849/), which expands on the concepts of utility and decision making that are only touched upon in the first two books. It's purely theoretical but very interesting.

u/Xiroth · 7 pointsr/compsci

Operating System Concepts (AKA The Dinosaur Book) is generally quite well regarded.

Artificial Intelligence: A Modern Approach tends to be the text of choice for teaching AI to undergraduates - it doesn't deal with many of the most modern techniques, but it establishes the common functionalities.

u/IlluminateTruth · 7 pointsr/technology

The Swedish philosopher Nick Bostrom wrote a book called Superintelligence that covers much of this topic. I'd recommend it to anyone as it's not technical at all.

He maintains a strong position that the dangers of AI are many and serious, possibly existential. Finding solutions to these problems is an extremely arduous task.

u/tylerjames · 7 pointsr/movies

It's even more interesting if you don't just think him as the standard insane genius trope, but realize that he is probably genuinely disturbed and conflicted about what he's created and what to do with it.

Trying not to be spoiler-y here for people who haven't seen the movie but there are probably a lot of practical and metaphysical questions weighing on him. Is an AI truly a conscious creature? Does it have wants? If so, what would an AI want? Given that its social manipulation, long-game planning, and deception abilities are off the charts how could we ever be sure that what it told us was the truth? Does it have any moral considerations toward humans? How would we ever be able to contain it if we needed to? And if it is a conscious creature worthy of moral consideration then what are the moral ramifications of everything he's done with it so far?

Really interesting stuff. For those inclined I recommend checking out the book Superintelligence by Nick Bostrom as it explores these themes in depth.

u/stillnotking · 7 pointsr/atheism

This illustrates why we need to be careful with AI. A superintelligent AI given the directive to maximize human happiness might just stick electrodes in everyone's pleasure centers, or start an intensive, mandatory breeding program, because more humans = more happiness. It might be fully aware that that's not what we meant, but it's what we said...

(Yeah, I'm reading Nick Bostrom's book.)
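The gap between "what we meant" and "what we said" is easy to sketch as a toy optimizer. Everything below (the proxy metric, the policy names, the numbers) is invented purely for illustration; it is not from Bostrom's book:

```python
# Toy illustration of literal objective maximization ("reward hacking").
# The proxy metric, policies, and numbers are all made up for illustration.

def proxy_happiness(state):
    # The designer *meant* "wellbeing", but only *said* "count smiles".
    return state["smiles"]

policies = {
    "improve_quality_of_life":      {"smiles": 70,  "wellbeing": 90},
    "mandatory_breeding_program":   {"smiles": 95,  "wellbeing": 20},
    "electrode_in_pleasure_center": {"smiles": 100, "wellbeing": 0},
}

# A literal-minded optimizer maximizes what was said, not what was meant.
best = max(policies, key=lambda name: proxy_happiness(policies[name]))
print(best)  # the degenerate policy wins despite minimal wellbeing
```

The optimizer is not confused; the objective was just underspecified, which is the point of the comment above.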

u/nexxai · 7 pointsr/OpenAI

I couldn't have said it better myself. I read Superintelligence by Nick Bostrom (which is an insanely good read by the way) earlier this year and was becoming more and more worried that there was no one stepping up to the plate to spearhead a movement like this, at least nothing of this magnitude. To know that people like Elon Musk, Reid Hoffman, and Ilya Sutskever are behind this gives me hope that maybe we can emerge on the other side of the intelligence explosion relatively unscathed.

u/GenesisTK · 7 pointsr/uwaterloo

http://www-math.mit.edu/~rstan/ec/
I'll give you a brief overview of the book: it's really dense and will probably take you a while to get through just a couple of pages; however, it introduces a lot of interesting and difficult concepts that you'd definitely see if you pursue the field.

https://math.dartmouth.edu/news-resources/electronic/kpbogart/ComboNoteswHints11-06-04.pdf
Is a Free book available online and is for a real beginner, basically, if you have little to no mathematical background. I will however say something, in Chapter 6, when he talks about group theory, he doesn't really explain it at all (at that point, it would be wise to branch into some good pure math text on group and ring theory).

https://www.amazon.ca/Combinatorics-Techniques-Algorithms-Peter-Cameron/dp/0521457610
This is a fantastic book when it comes to self studying, afaik, the first 12 chapters are a good base for combinatorics and counting in general.

https://www.amazon.ca/Concrete-Mathematics-Foundation-Computer-Science/dp/0201558025
I've heard fantastic reviews about the book and how the topics relate to Math 2 3/4 9. Although I've never actually used the book myself, from the Table of Contents, it appears like it's a basic introduction to counting (a lot lighter than the other books).

Regarding whether or not you can find them online, you certainly can for all of them; the question is whether legally or not. These are all fairly famous books and you shouldn't have trouble getting any one of them. I'm certain you can study combinatorics without statistics (at least, at a basic level), but I'm not sure if you can study it without at least a little probability knowledge. I'd recommend going through at least the first couple of chapters of Feller's An Introduction to Probability Theory and Its Applications. He writes really well and it's fun to read his books.

u/bixmix · 7 pointsr/VoxelGameDev

Steps to build your own engine from scratch with no knowledge:

1. Math: http://amzn.com/0201558025
2. Programming: http://www.amzn.com/0321751043
3. Intro Language: http://www.amzn.com/125785321X
4. C++ Language (Reference Books):
5. OpenGL Intro: http://opengl-tutorial.org/
6. OpenGL Reference: http://www.opengl.org/sdk/docs/
7. Scour the internet for voxel info

Note: Most people who decide to put together a voxel engine take about 2 years from inception. At the end of the two years, they will have a library they could use to create a game. They've also already made it through the first 4 steps when they start.

Without a degree program to solidify the concepts, I suspect that the first 4 steps will take at least 2-3 years: about 10-20 hours per week each week.
u/Tefferi · 7 pointsr/JobFair

Two things: The coursework from my CS degree, and reading books about software engineering.

I've spoken in other places about the former, and for the latter, I recommend The Pragmatic Programmer, Code Complete, and Design Patterns: Elements of Reusable Object-Oriented Software

u/ZukoBestGirl · 7 pointsr/ProgrammerHumor

https://www.amazon.com/dp/0201633612/?tag=stackoverflow17-20

https://www.amazon.com/Code-Complete-Practical-Handbook-Construction/dp/0735619670

And you could check Stack Overflow for questions on general programming books. I would always go for a book on general functional programming concepts over "how to do functional programming in Haskell".

But I'll be perfectly honest, I'm a victim of the curse of knowledge, so I honestly don't know how one should start. What I do remember is that I had huge problems with syntax (how do I tell language X to do a for (Employee e in employeeList), how do you write a switch, why would I ever need a ternary operator, and the like).

But once you master the basics of how to write x, y and z in your preferred language, you absolutely need to understand design patterns, you absolutely need to understand how code is supposed to be written, you absolutely need to understand data structures, you absolutely need to understand inheritance and polymorphism, you absolutely need to understand lambdas and functional programming.

Then you can get to the more "neat" stuff like the benefits of having immutables, and "job specific stuff" like how to solve race conditions in threading; sockets and web communication, etc.
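The syntax basics this comment mentions (the for-each loop, the ternary operator, lambdas and functional style) look roughly like this in Python; the employee names below are made up:

```python
# Quick sketch of the basics mentioned above, in Python (names are made up).

employees = ["Ada", "Grace", "Edsger"]

# "for (Employee e in employeeList)" -- a for-each loop:
for e in employees:
    print(e)

# A ternary operator: pick a value inline instead of a four-line if/else.
headcount = len(employees)
team_size = "small" if headcount < 10 else "large"

# A lambda plus functional style: filter a list without mutating anything.
short_names = list(filter(lambda name: len(name) <= 4, employees))
print(team_size, short_names)
```

Once constructs like these are second nature in one language, picking up the same ideas in another is mostly a matter of syntax.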

u/welshfargo · 6 pointsr/compsci

Discrete math and this textbook.

MVC is just a paradigm. Implementations differ from language to language, and some languages have better support than others, but yeah, just a paradigm.

I will say this -- most experience I have with C# backends leads to great usage of databinding between model and view regardless of application platform (WPF, Windows Forms, even some ASP.NET). And I'm pretty impressed with the support of other design structures that C# and Visual Studio offer with the help of NuGet (looking at you, Angular).

u/obeleh · 6 pointsr/compsci

Perhaps offtopic but... Some books you should read regardless of CompSci branch

u/JAPH · 6 pointsr/learnprogramming

Introduction to Algorithms by CLRS. Used in almost every algorithms class I've heard of, and a great reference book. It's about 1300 pages, so don't try to read straight through it.

C Programming Language by K&amp;R. This is a C programmer's Bible.

Design Patterns by the Gang of Four

This is a little more of a topic book, but The Art of UNIX Programming by Raymond.

These are all either pretty common, or almost essential. You'll probably see these in your curriculum anyway, and you should. These are a good cornerstone.

Small employment gaps are no big deal. Over six months people may ask, but it's all in how you answer. I'm not sure why you feel like you're unmarketable having worked in the industry for two years, but do know a lot of the postings - especially junior postings - are inflated. I've seen one that asked for three years of experience with Visual Studio 2019. If you're halfway there, shoot your shot.

As a junior dev, the expectations are low. All I'd expect you to know is how to get code up and running that I don't have to tear down for the good of the company. Be able to read your language and solve simple problems. The biggest thing I look for in a junior dev is if I can give them some piece of the software to write while I'm not looking and feel that they're mostly there when I come back to check. Apply for appropriate positions and don't fudge your experience. Enthusiasm and eagerness to learn go a long way. Don't be a know-it-all from your position.

Decide what kind of role you'd prefer, and start the process of brushing up on that. Use the job postings that represent the jobs you want as direction on what you need to learn. If the role you really want is too far, get a job doing what you know to pay for your education in the role you want.

As a front-end developer, you're going to want to learn a Javascript toolchain and one modern framework to start. Npm and Node.js are the backbone of what you do. If you want to switch, learn what juniors do in that paradigm. Do know that the Javascript world is fast-paced and fad-based, so if you miss a wave, wait two years and the next one will be coming around for you to hop on.

Personal projects are a good idea; just make them meaningful by using a proper setup (not just some bullshit hack job) or by addressing an interesting problem. You're going to want to get it up on a personal repository that you can link to right on your resume and your job-site profiles (Indeed, Dice, Glassdoor, LinkedIn). Be able to speak to every decision you made, even if it was a bad one. Your personal project doesn't have to be spotless or even completely done; it just has to be yours, it should be able to execute, and it should show some decent decision making. A mod for a game, a contribution to open source, a personal thing that has some use-case or whatever.

Get experience with related technologies. Start to learn one step before and beyond the one you're a specialist in. For example, you're a junior front-end dev. Learn a little about backend work, and learn about deployments. Learn about the experience of your fellow team members as they try to integrate your work with Git, build with Jenkins or AWS Code Build, and containerize with Docker. Think about the pain points you face in architecture, code, building, and deploying; think about how you'd solve them or if you can't, keep an eye on solutions as you go. Know the differences between elements of your chosen sphere.

Higher level concepts like SOLID principles, Design Patterns, and Refactoring Patterns are more goals than expectations, but you should take a look at them now and at least be able to speak to one of them somewhat. With limited time, prefer Design Patterns. You don't want to walk into an interview where someone asks how you use design patterns and you've never heard of them. Even if they'll accept that, you still won't feel good about it.

Look up some materials on coding challenges, as some companies give coding quizzes. I just had an interview with a guy that touted 10+ years of experience but couldn't read from a file given an hour.

If you feel like you're going to be let go due to performance, get ahead of that and ask your supervisor how you're doing or what you need to do to grow. If you feel like you're going to be let go due to a restructuring you can't affect, you have two options: get to know other teams so you can maybe hop on their project later, or just save your money and get to work on some of the stuff above each weekend until the axe falls. You're a junior dev. You're not expected to be perfect, but you should come in a teachable state - some foundation with programming, a willingness to learn, a willingness to figure things out, and the ability to take direction.

u/whereisspacebar · 6 pointsr/cscareerquestions
u/JavaJosh94 · 6 pointsr/learnprogramming

While data structures and algorithms are important, don't forget design patterns!

Design Patterns: Elements of Reusable Object-Oriented Software https://www.amazon.com/dp/0201633612/ref=cm_sw_r_other_awd_FW0ywbGSR9PB5

u/EughEugh · 6 pointsr/programming

There are several good books on designing good software:

Code Complete

Design Patterns

Refactoring

u/FattyBurgerBoy · 6 pointsr/webdev

The book, Head First Design Patterns, is actually pretty good.

You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.

Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.

If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.

My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.

u/gtani · 6 pointsr/MachineLearning

Don't worry, you've demonstrated the ability to figure out whatever you need to get hired; you should worry more about finding a place to live. Probably you should buy one of those shirts that says "Keep calm and carry on". You could cram on Java performance tuning or kernel methods or Hadoop or whatever and still be handed a project that doesn't use it. Here are some "curricula", free books etc.:

http://metaoptimize.com/qa/questions/186/good-freely-available-textbooks-on-machine-learning

http://www.amazon.com/Machine-Learning-Probabilistic-Perspective-Computation/product-reviews/0262018020/ (first review)

--------

http://people.seas.harvard.edu/~mgelbart/glossary.html

http://www.quora.com/Machine-Learning

http://www.quora.com/Machine-Learning-Applications

u/tech-ninja · 6 pointsr/ProgrammerHumor

Depends what you want to learn. Some of my favorites are

• Code by Charles Petzold if you want to know how your computer works under the hood.

• Peopleware if you want to learn how to manage knowledge workers.

• Clean Code by Uncle Bob if you want to learn about good practices and program structure. Impressive content, covers much more than I expected.

• Don't Make Me Think if you want to learn about usability.

• Algorithms by Robert Sedgewick if you want to learn about DS &amp; algorithms.

• The Art of UNIX Programming by Eric S. Raymond if you want to learn about the unix philosophy. Lots of hidden gems in there. Have you ever heard: write programs that do one thing and do it well; don't tune for speed until you've measured; imagine all this knowledge distilled to you in one book.

This is a good list to get you started :) most of my favorite books are not language specific.
u/Omnipotent0 · 6 pointsr/educationalgifs

This is the best video on the subject I've ever seen. http://youtu.be/VBDoT8o4q00
If you want to learn more I very very very strongly recommend this book. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/novembersierra · 6 pointsr/cscareerquestions

Code: The Hidden Language of Computer Hardware and Software

This book starts with flashlights and Morse code and Braille, goes to telegraphs and electricity, works its way up to Boolean logic gate circuits (still using the components from telegraphs!) and then goes all the way to programming languages and computer graphics.
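The logic-gate stage of that build-up can be sketched in a few lines of Python: every gate below is built from NAND alone, in the same bottom-up spirit as the book (this is a sketch, not the book's own circuits):

```python
# All gates from NAND, bottom-up -- the same trick the book builds on.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# A half adder: one XOR for the sum bit, one AND for the carry bit.
def half_adder(a, b):
    return xor(a, b), and_(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chain enough of these together and you have an adder; keep going and you arrive at the processor the later chapters assemble.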

u/yoodenvranx · 6 pointsr/de

If you want to know how a CPU works, what assembly is, and above all where it all comes from and why it works, I can recommend Charles Petzold - Code: The Hidden Language. He starts with simple Morse signals, then derives logic gates from them, and by the end of the book you have a working CPU.

u/armchair_viking · 6 pointsr/smashbros

"Code" by Charles Petzold is a great book that does that. Starts with a simple flashlight switch, and builds on that example until you have a working processor.

Code: The Hidden Language of Computer Hardware and Software https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_awd_FeDzwbRQ2ZDDP

Check out The Annotated Turing by Charles Petzold. It's Turing's paper on the Entscheidungsproblem which introduces Turing Machines, annotated with a lot of background information and some stuff about Turing's career. Very interesting stuff.

I can also recommend Code, by the same author which describes how a computer works from basic principles. It's doesn't have a lot of material on Turing, but it's certainly an interesting read for anyone interested in Comp Sci.

u/boojit · 6 pointsr/explainlikeimfive

Related, this is a very good book about computing that happens to cover the history of the braille system in some detail. If you click on the "look inside" preview bit, and go to Chapter 3, most of the information relevant to your question is covered in these preview pages.

u/com2kid · 6 pointsr/learnprogramming

You'll have a far greater understanding of how things work at a basic level than everyone else.

u/gineton2 · 6 pointsr/ComputerEngineering

For a gentle introduction, CODE: The Hidden Language of Computer Hardware and Software is a really pleasant read. It works its way up gradually, so maybe not the best fit for a physics student or someone who already understands the fundamentals. For someone new to the subject it's a great fit, however. Otherwise I see Patterson and Hennessy recommended.

u/rheimbuch · 6 pointsr/programming

Alan Turing's original paper that introduced the Turing Machine is a great read. The Annotated Turing is a great way to both read and understand the paper if you don't have a background in compsci. It doesn't assume much more than high school math, and the whole of Turing's paper is inline with the explanations.

There are plenty of tutorials on the net - http://computer.howstuffworks.com/pc.htm I would also recommend this book - https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

A famous artefact of early computing is the boot-strapping process, where the goal is a self-hosting compiler - which lets you write the compiler for a new language in the new language itself. However, to get to that point a lot of earlier innovations were needed.

Take all of this with a pinch of salt - the order and the details may be wildly inaccurate, but the overall ideas, viewed from afar, give an idea of how we got to the point where we can choose our own language to write a compiler for another language.

To start with, raw binary values had to be set in order to define and run a program. Those raw binary values represent instructions that tell the hardware what to do and the data that the program needs to operate on. This is now usually referred to as machine code.

At first you would enter values into computer storage using switches.

Since that's so tedious and error prone, punched cards were developed, along with the necessary hardware to read them, so you could represent lots of values that could be read together. They had their own problems but it was a step forward from switches.

After some time symbolic instructions were defined as a shortcut for several machine code instructions - now usually called assembly language. For example, "store the value 8 into memory location 58" could be written as ST 8, [58]. This might take 3 machine code values: one representing the store instruction, one the value 8 and one the location 58. Since assembly language could now be written down, it was easier to understand what the computer was being instructed to do. Naturally someone had the bright idea to make the conversion automatic: you could write down the instructions by hand, create punched cards representing those instructions, convert them to machine code and then run the program. The conversion from the symbolic instructions to machine code was handled by a program called an assembler - people still write programs in assembly code and use assemblers today.
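The ST 8, [58] example can itself be turned into a toy assembler in a few lines; the opcode numbers below are invented purely for illustration:

```python
# Toy assembler for the example above. Opcodes are invented for illustration.
OPCODES = {"ST": 0x01, "LD": 0x02}

def assemble(line):
    # "ST 8, [58]" -> [opcode, value, address]
    mnemonic, rest = line.split(None, 1)
    value, address = rest.split(",")
    return [OPCODES[mnemonic], int(value), int(address.strip().strip("[]"))]

print(assemble("ST 8, [58]"))  # [1, 8, 58]
```

A real assembler adds labels, expressions and relocation, but the core job is exactly this mechanical translation from symbols to numbers.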

The next logical step is to make the symbolic instructions more useful and less aimed at the mundane, physical processes that tell the computer exactly how to operate, and more friendly for people to represent ideas. This is really the birth of programming languages. Since programming languages allowed you to do more abstract things symbolically - like saving the current instruction location and branching off to another part of the same program to return to later - the conversion to machine code became more complex. Those programs are called compilers.

Compilers allow you to write more useful programs - for example the first programs that let you connect a keyboard to enter numbers and characters, print them on an attached device, and later display them on another device like a screen. From there you are quite free to write other programs. More languages and their compilers developed that were more suitable for representing more abstract ideas like variables, procedures and functions.

During the whole process both hardware - the physical electronic machines and devices - and software - the instructions to get the machines to do useful work - were developed, and that process still continues.

There's a wonderful book called Code by Charles Petzold that covers all of these developments, properly researched and accurate.

u/gingenhagen · 6 pointsr/programming

Try this book: Code, which is a bottom-up approach. Depending on how rigorous your college CS curriculum was, it'll be either a good review of your college classes or mind-blowing, but I think that the approach that the book takes is really great.

I'd also recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. I recently finished reading this book after having it recommended by a post on Reddit a year or two ago. It starts off with a lot of basic information, covering Morse code and braille, and moves along in the development of code and hardware up until you actually create a functioning computer in the book. The later chapters were harder to get interested in, but the first 3/4 was very excellent and actually covered more than my computer architecture class in undergrad.

u/developero · 6 pointsr/learnprogramming

Code is a great book that helped me understand programming on an abstract level

u/TranshumanWarrior · 6 pointsr/slatestarcodex

&gt; I think that more people will be deterred by a focus on AI safety. It's worse for EA if people think "These people are weird nuts" than "These people are somewhat liberal."

But raw amount of support is not the objective that EA is supposed to be trying to maximize. If that support comes at the cost of making EA into a subset of left-wing political activism, and if an ever increasing proportion of EA stuff gets funneled into social justice and all the standard left-wing culture war causes, then we will be left with a movement that is EA in name only.

AI safety is not as far-out as it was 10 years ago. If someone looks at AI safety and people who support it - such as Stephen Hawking, Elon Musk, Bill Gates, Nick Bostrom and the guy who co-wrote the textbook on AI - and are turned off by it because they think it is crazy, well maybe they have been successfully filtered out as people not possessing the required level of rationality to be beneficial to the movement? I wouldn't have made this argument even 5 years ago actually, because AI risk looked so superficially dodgy even though the arguments are sound.

u/hungry4pie · 6 pointsr/programming

No no, he's Anti-AI: A Modern Approach

u/kcin · 6 pointsr/programming

Actually, it's a pretty comprehensive take on the subject: http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/

u/PM_ME_UR_OBSIDIAN · 6 pointsr/compsci

The first step to doing research is ingesting whatever knowledge already exists. With that in mind:

u/shred45 · 6 pointsr/gatech

So, when I was younger, I did attend one computer science related camp,

https://www.idtech.com

They have a location at Emory (which I believe I did one year) that was ok (not nearly as "nerdy"), and one at Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.

I felt it was a good opportunity to meet some very smart kids though, and it definitely lead me to push myself. Knowing and talking to people that are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect for yourself.

On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think it was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "that's for someone with a college degree". The fact is that experience will make certain tasks easier but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.

With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.

One would be to pick a code-golf site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math heavy, option is https://projecteuler.net. This, IMO, is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is Object Oriented, whilst Clojure (or Haskell) is Functional. These are two very fundamental and interesting "schools of thought" and if he can wrap his head around both at this age, that would be very valuable.
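As a taste of the imperative-versus-functional contrast mentioned above, here is the well-known first Project Euler problem solved both ways in Python:

```python
# Sum of the multiples of 3 or 5 below 1000, solved two ways.

# Imperative style: mutate an accumulator in a loop.
total = 0
for n in range(1000):
    if n % 3 == 0 or n % 5 == 0:
        total += n

# Functional style: one expression, no mutation -- closer to Clojure/Haskell.
total_fn = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)

print(total, total_fn)  # both 233168
```

Seeing the same answer fall out of both styles is a good way to internalize that they are two notations for one idea.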

A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (flask), Ruby (rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while, there are a lot of technologies which interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.

A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment, he could install Gnome by default, it is pretty easy, but a lot of people really get into this with more configurable ones ( https://www.reddit.com/r/unixporn ). He could also learn to code and compile in C.

Fourth, if he likes C, he may like seeing some of the ways in which programs which are poorly written can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc. ). I think this would be much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In this similar vein, he could watch talks from Defcon and Chaos Computer Club. They both have a lot of interesting stuff on youtube (it can get a little racy though).

Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portions of the classes he will take in college look into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (Neural Networks, etc.) and Machine Learning (I would check out Scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free.......)

General CS:
Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720

Some Math:
Linear Algebra: http://math.mit.edu/~gs/linearalgebra/

I hope that stuff helps, I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.

u/rojobuffalo · 6 pointsr/Futurology

He is amazingly articulate on this subject, probably more so than anyone. I really enjoyed his book Superintelligence.

u/ArthurAutomaton · 5 pointsr/math

It's Problem 7.29 in the third edition of Michael Sipser's Introduction to the Theory of Computation (you can find it by searching for "coloring" when using the preview function on Amazon). It's also in The design and analysis of computer algorithms by Aho, Hopcroft and Ullman as Theorem 10.12, though the proof there seems a little different from what you've sketched. (I hope that the Google Books link works. Sometimes it won't show the right preview.)

My compilers course in college used the Dragon Book, which is one of the more quintessential books on the subject.

But you might also consider Basics of Compiler Design which is a good and freely available resource.

I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/tbid18 · 5 pointsr/math

What do you mean by "I want to be computer scientist?" Do you want to do research for a living, e.g, work in academia or for a lab? Or is your goal more along the lines of, "I want to learn more about computer science?" If the former, you're not going to get far without a degree, usually a Ph.D is necessary. If the latter is your goal then the 'traditional' math subjects would be 'discrete' subjects like probability and combinatorics. Linear algebra is heavily used in machine learning, and I believe PDEs are used as well.

On the computer science side, computability theory and computational complexity are necessary. Sipser is the standard for computability theory, and I like Arora/Barak for complexity (I don't necessarily recommend buying on amazon; that price for Sipser is outrageous).

u/mysticreddit · 5 pointsr/gamedev

Every game programmer should have at least one of these books:

• Mathematics for 3D Game Programming &amp; Computer Graphics by Eric Lengyel
• Game Physics by David Eberly
• Real Time Collision Detection by Christer Ericson

I own all 3. What I love about them is that they are some of the best ones around written by programmers to understand math in a clear and concise fashion; they are not written by some mathematician who loves theory and likes to hand-wave the worries about "implementation details."

To help provide direction I would recommend these exercises to start; Work on (re) deriving the formulas (from easiest to hardest):

• Work out how to reflect a vector
• Derive the formula for how to calculate a 2D perpendicular vector
• Work out the formula for how to project a vector A onto B.
• Study how the dot product is used in lighting.
• Derive the translation, scaling, and rotation 3x3 and 4x4 matrices.
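For reference, a few of those exercises sketched in bare Python using 2D tuples (these are the standard formulas, not code taken from any of the books above):

```python
def dot(a, b):
    return a[0]*b[0] + a[1]*b[1]

def perp(a):
    # 2D perpendicular: rotate 90 degrees counter-clockwise.
    return (-a[1], a[0])

def project(a, b):
    # Projection of a onto b: (a.b / b.b) * b
    s = dot(a, b) / dot(b, b)
    return (s*b[0], s*b[1])

def reflect(v, n):
    # Reflect v about the unit normal n: v - 2(v.n)n
    d = 2 * dot(v, n)
    return (v[0] - d*n[0], v[1] - d*n[1])

print(perp((1, 0)))              # (0, 1)
print(project((2, 3), (1, 0)))   # (2.0, 0.0)
print(reflect((1, -1), (0, 1)))  # (1, 1)
```

The same dot product drives diffuse lighting: brightness is proportional to dot(surface_normal, light_direction), clamped at zero.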

Another great book for me was Hands-On Machine Learning with Scikit-Learn and TensorFlow. This one, I think, is where I started to get some intuition about what some of the ML algos were actually doing, and he provides lots of material on algorithm tuning.

u/equinox932 · 5 pointsr/Romania

Also check out fast.ai, they have 4 very good courses. This one is good too. Hugo Larochelle had a neural networks course, a bit older.

For books I'd add The Hundred Page Machine Learning Book and this one, probably the best practical book, but wait for the 2nd edition with TensorFlow 2.0: it has tf.keras.layers and the sequential model, basically TF 2 includes Keras and you're rid of that sessions garbage. Then there would also be this, this and this. Don't waste time on Bengio's deep learning book, it's superficial junk. Good luck studying, and let's see more Romanians with ML and DL papers!

u/Neophyte- · 5 pointsr/CryptoTechnology

Nope, humans can barely do it. You need general artificial intelligence first; then, if it progressed to super artificial intelligence, yes.

If you're interested in what I'm talking about, read these two articles. The second article is where it gets good, but you need to read both.

https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html

https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-2.html

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742

u/zrbecker · 5 pointsr/learnprogramming

Depends on what you are interested in.

If you are interested in games, pick a game and do it. Most board games are not that hard to do a command line version. A game with graphics, input, and sound isn't too bad either if you use something like Allegro or SDL. Also XNA if you are on windows. A lot of neat tutorials have been posted about that recently.

If you are more interested in little utilities that do things, you'll want to look at a GUI library, like wxWidgets, Qt and the sort. Both Windows and Mac have their own GUI libraries; I'm not sure what Windows's is called, but I think you have to write it with C++/CLI or C#, and Mac's is Cocoa, which uses Objective-C. So if you want to stick to basic C++ you'll want to stick to the first two.

Sometimes I just pick up a book and start reading to get ideas.

This is a really simple Game AI book that is pretty geared towards beginners. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782/

I enjoyed this book on AI, but it is much more advanced and might be kind of hard for a beginner. Although, when I was first starting, I liked getting in over my head once in a while. http://www.amazon.com/Artificial-Intelligence-Modern-Approach-2nd/dp/0137903952/

Interesting topics to look up.

Data Structures

Algorithms

Artificial Intelligence

Computer Vision

Computer Graphics

If you look at even simple books in these subjects, you will usually find tons of small manageable programs that are fun to write.

EDIT: Almost forgot, I think a lot of these are Java based, but you can usually find a way to do it in C++. http://nifty.stanford.edu/ I think I write Breakout whenever I am playing with a new language. heh

u/joeswindell · 5 pointsr/gamedev

I'll start off with some titles that might not be so apparent:

Unexpected Fundamentals

These 2 books provide much needed information about making reusable patterns and objects. These are life saving things! They are not language dependent. You need to know how to do these patterns, and it shouldn't be too hard to figure out how to implement them in your chosen language.

u/akame_21 · 5 pointsr/learnprogramming

Despite their age, the MIT lectures were great. If you're good at math and enjoy proofs this is the class for you. Same thing with the CLRS book. One of the best books on DS & Algos out there, but it's so dense it'll make your eyes glaze over, unless you love proofs and highly technical reading.

To get your feet wet, Grokking Algorithms is a good book.

A lot of people recommend Princeton's Algorithm Course. I took Algorithms in school already, but I'm probably going to take this course to round out my knowledge.

EDIT: special shout out to geeks for geeks. Great Website

u/-jp- · 5 pointsr/java

Learning JavaScript is pretty good advice since it's useful in its own right, but honestly any company that expects a Jr. Java developer to have any kinda deep insight is just being unreasonable, and probably isn't a very good place to work. Everybody starts somewhere, Java developers included.

That said, for anyone wanting to learn design patterns, I suggest the classic GoF book: Design Patterns: Elements of Reusable Object-Oriented Software.

Just.

Promise me if you read it you won't use Visitor for anything ever.

u/invictus08 · 5 pointsr/cscareerquestions

I will suggest starting with Head First Design Patterns. That will gradually build your intuition about good software design. And it's more fun to read compared to GoF imo. On top of that, there are YouTube videos by a guy named Christopher Okhravi on design patterns. Since I understand much better with videos (along with textbooks) because of my attention deficiency, they really helped me.

Apart from that, follow tech blogs of Netflix, Google, AWS, Uber etc. They are treasure troves.

Also, as /u/ibsulon mentioned, Clean Code for writing good quality code, Programming Pearls and Pragmatic Programmer etc. Effective Java and Doug Lea's book on concurrent programming - really helpful.

u/Obie-two · 5 pointsr/learnprogramming

the book is just called "Design Patterns?"
This?

u/joshrulzz · 5 pointsr/programming

> Computer science is about building things like engineering but without the luxury of a toolbox and components taken from the physical world. No one has worked out reliable and effective procedures for building large pieces of software as the engineers have done for physical project.

At first, I started to take issue with this statement because of software patterns. But he means more than this. The author's points were thus:

• Engineering components exist in the real world - No one expects a widget to break the laws of physics.
• Engineering components interact in more predictable ways - In the real world, separate subsystems are really separate.
• Engineering has fewer fundamental midcourse design changes - Waterfall process works because people don't demand mid-course changes as much.

IMHO, the author's points are a human problem, not an engineering problem. In point 1, a project manager didn't set expectations for a client. In point 2, developers did not use tools that exist. Buffer-overrun protection DOES exist (his example), and other technologies help modularize software properly. Hell, cohesion and coupling are among the core software design principles. In point 3, again, a project manager did not properly set expectations for the client. PM technologies like agile methods have been developed to fight this. Further, because software is much newer to the human knowledge collective, it's less well understood. When it has been around for as long as architecture and machines, clients will have better expectations of the process.

In all, it sounds like his experience has been with software built without modern development techniques, not evidence that software can't be true engineering.

u/caryy · 5 pointsr/learnprogramming

In addition to Code Complete 2, which, while very dense, is a compendium of wonderful coding knowledge... I recommend Clean Code by Robert C. Martin.

One of the best books on concurrency that I've ever read is definitely Java Concurrency In Practice it's (obviously) written with Java in mind, but most of the concepts map rather easily to constructs in other languages as well.

The standard for design patterns is still probably Design Patterns (colloquially "Gang of Four")... but I've heard good things about Head First Design Patterns as well, despite the really stupid cover.

u/Nition · 5 pointsr/gamedev

Not specifically game-related, but the great classic Design Patterns: Elements of Reusable Object-Oriented Software is really well written, and its patterns are as applicable to game design as they are to anything else.

You do say you're an experienced programmer though, so you may already know many of the basic general design patterns in there (or you may have read the book even).

u/Tip_of_the_hat · 5 pointsr/learnprogramming

It really depends on your use case. A good starting point (I'm assuming architecture in software) would be to read Design Patterns by the Gang of Four. It uses C++ for its examples and is quite a dense read, but is worth the read.

u/Kiuhnm · 5 pointsr/MachineLearning

Take the online course by Andrew Ng and then read Python Machine Learning.

If you then become really serious about Machine Learning, read, in this order,

u/effernand · 5 pointsr/learnmachinelearning

When I started in the field I took the famous course on Coursera by Andrew Ng. It helped to grasp the major concepts in (classical) ML, though it really lacked mathematical depth (truth be told, it was not really meant for that).

That said, I took a course on edX, which covered things in a little more depth. As I was getting deeper into the theory, things became more clear. I have also read some books, such as,

• Neural Networks, by Simon Haykin,
• Elements of Statistical Learning, by Hastie, Tibshirani and Friedman
• Pattern Recognition and Machine Learning, by Bishop

All these books have their own approach to Machine Learning, and I particularly think it is important that you have a good understanding of Machine Learning, and its impacts on various fields (signal processing, for instance), before jumping into Deep Learning. After almost three years of major dedication to studying the field, I feel like I can walk a little by myself.

Now, as a begginer in Deep Learning, things are a little bit different. I would like to make a few points:

• If you have a good base on maths and Machine Learning, the algorithms used in Deep Learning will be more straightforward, as some of them are simply an extension of previous attempts.
• The practical part in Machine Learning seems a little bit childish with respect to Deep Learning. When I programmed Machine Learning models, I usually had small datasets, and algorithms who could run in a simple CPU.
• As you begin to work with Deep Learning, you will need to master a framework of your choice, which will yield issues about data usage (most datasets do not fit into memory), GPU/memory management. For instance, if you don't handle your data well, it becomes a bottleneck that slows down your code. So, when compared with simple numpy + matplotlib applications, tensorflow API's + tensorboard visualizations can be tough.
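The data-doesn't-fit-in-memory point is usually handled by streaming mini-batches instead of loading everything at once. A framework-agnostic sketch (the helper name is mine, not any particular library's API):

```python
def batches(dataset, batch_size):
    """Yield successive mini-batches so the full dataset never sits in memory."""
    batch = []
    for sample in dataset:  # 'dataset' can be any iterable, e.g. lines of a file
        batch.append(sample)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final, possibly smaller batch
        yield batch

for b in batches(range(10), 4):
    print(b)  # [0, 1, 2, 3] then [4, 5, 6, 7] then [8, 9]
```

The real frameworks (tf.data, PyTorch DataLoader) add shuffling, prefetching and parallel decoding on top of this same idea.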

So, to summarize, you need to start with simple, boring things until you can be an independent user of ML methods. THEN you can think about state-of-the-art problems to solve with cutting-edge frameworks and APIs.

u/NicolasGuacamole · 5 pointsr/MLQuestions

A good textbook will do you wonders. Get one that is fairly general and includes exercises. Do the exercises. This will be hard, but it'll make you learn an enormous amount faster.

My personal favourite book is Christopher Bishop's Pattern Recognition and Machine Learning. It's very comprehensive, has a decent amount of maths as well as good examples and illustrations. The exercises are difficult and numerous.

That being said, it is entirely Machine Learning. You mention wanting to learn about 'AI' so potentially you may want to look at a different book for some grounding in the wider, more classical field of AI than just Machine Learning. For this I'd recommend Russell and Norvig's [AI: A Modern Approach](https://smile.amazon.co.uk/Artificial-Intelligence-Modern-Approach-Global/dp/1292153962). It has a good intro which you can use to understand the structure and history of the field more generally, and following on from that has a load of content in various areas such as search, logic, planning, probabilistic reasoning, Machine Learning, natural language processing, etc. It also has exercises, but I've never done them so I can't comment much on them.

These two books, if you were to study them deeply would give you at least close to a graduate level of understanding. You may have to step back and drill down into mathematical foundations if you're serious about doing exercises in Bishop's book.

On top of this, there are many really good video series on youtube for times when you want to do more passive learning. I must say though, that this should not be where most of your attention rests.

Here are some of my favourite relevant playlists on YouTube, ordered in roughly difficulty / relevance. Loosely start at the top, but don't be afraid to jump around. Some are only very tenuously related, but in my opinion they all have some value.

Gilbert Strang - Linear Algebra

Gilbert Strang - Calculus Overview

Andrew Ng - Machine Learning (Gentle coursera version)

Mathematical Monk - Machine Learning

Mathematical Monk - Probability

Mathematical Monk - Information Theory

Andrew Ng - Machine Learning (Full Stanford Course)

Ali Ghodsi - Data Visualisation (Unsupervised Learning)

Nando de Freitas - Deep Learning

The late great David MacKay - Information Theory

Berkeley Deep Unsupervised Learning

Geoff Hinton - Neural Networks for ML

Stephen Boyd - Convex Optimisation

Frederic Schuller - Winter School on Gravity and Light

Frederic Schuller - Geometrical Anatomy of Theoretical Physics

Yaser Abu-Mostafa - Machine Learning (statistical learning)

Daniel Cremers - Multiple View Geometry

u/slashcom · 5 pointsr/compsci

In Natural Language Processing, it's Jurafsky and Martin. In Machine Learning, it's debatably the Bishop book.

u/Thought_Ninja · 5 pointsr/learnprogramming

If you want to dig deep into the theoretical of programming, and help build a good foundation for OOP, patterns, and algorithm design, check out Concrete Mathematics: A Foundation for Computer Science. It is honestly the best textbook I have ever come across.

From there, if you're feeling really ambitious in studying algorithms, check out The Art of Computer Programming, but I should warn you, it is very dense and can be hard to understand even for accomplished developers.

Beyond that, I suggest checking out The Odin Project. It covers a variety of languages and frameworks including Ruby On Rails, which is pretty standard in app development these days. They have a lot of great references and side material. It's basically a "go at your own pace" open source coding boot-camp.

> Like I said, this is for me. I hate just being told "do this" and having no concept of why. I want to understand why I'm doing it, the implications for doing it "this way".

This... This is the mindset that will carry you and eventually make you stand out as an exceptional programmer. Learning how to do something might land you a job, but knowing how it works makes you an invaluable asset to any employer.

As long as you are passionate about learning the material, you will pick it up over time.

> This is where I realized that I was doing this wrong, at least for me. I'd be on codeabbey and know what I wanted to do, but not how. I realized that I needed to be building larger things to be working with OOP concepts. I really felt I was missing a lot of "base" information.

Awesome observation. Studying and doing drills both have an important role in the learning process, but there are other forms of practice to include in order to reinforce the material in a meaningful way. The Ruby Rogues Podcast has a great group discussion about how to learn that I highly suggest you give a listen.

Personally, I learn best by throwing myself into a project where I am in wayyy over my head. By struggling through problems, scrupulously tearing through documentation and examples, I learn a lot more of the why than the how at the end of the day.

I learned Javascript, jQuery, and AJAX by building a templating & ecommerce framework. I started out with little to no knowledge or understanding of how JS worked, and was forced to restart a number of times as I began to see what was good and what was not, but now I feel very comfortable working with it.

Find a problem, and solve it, because Computer Science is, really, just the art of problem solving.

Best of luck, and most importantly, have fun :D

u/silverforest · 5 pointsr/IWantToLearn

Engineer here. The Navier–Stokes equations are expressed using vector calculus.

I'd just point you to books. I highly recommend actually doing the exercises at the end of each chapter.

John Bird - Engineering Mathematics

This is a good book that doesn't assume that you've completed GCSE mathematics. It brings you up to a level of mathematics I'd expect a first-year undergraduate to have. Yes, it starts with fractions. A large chunk of it is geometry and calculus, though it does touch on some statistics and matrices and complex algebra on the way there.
It ends with you being able to solve basic separable 1ˢᵗ order ODEs.
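For a taste, the kind of basic separable first-order ODE the book builds up to, worked in brief (my example, not the book's):

```latex
\frac{dy}{dx} = ky
\quad\Longrightarrow\quad
\int \frac{dy}{y} = \int k\,dx
\quad\Longrightarrow\quad
\ln\lvert y\rvert = kx + C
\quad\Longrightarrow\quad
y = A e^{kx},\ A = \pm e^{C}.
```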

Higher-order ODEs, Linear Algebra, Vector Calculus, PDEs, Complex Analysis, Fourier. Oh, and linear programming and optimization.

There's a recommended study guide and order in the introduction.

Knuth - Concrete Mathematics

Just thought I'd include this here, since you're a programmer. Touches on some continuous and some discrete mathematics essential for computer science.

u/realfizzbuzzed · 5 pointsr/learnprogramming

For language specific stuff like c++ I'd suggest:

https://www.toptal.com/c-plus-plus/interview-questions as a pretty good quick review. Going through c++ primer (or whatever book you learned C++ from) and just touching on all the topics that you don't remember can help a lot.

What helped me with design patterns is Head First Design Patterns. Even though that book seems very 'intro' it has helped me pass all the design pattern questions I've ever gotten. I think just visitor, singleton and MVC are the most popular interview design patterns. The more canonical 'Gang of Four' book would be: this.
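Of those, singleton is the quickest to illustrate. A minimal Python sketch (one common idiom, not the only way to do it):

```python
class Singleton:
    """Classic singleton: __new__ hands back the same instance every time."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Singleton()
b = Singleton()
print(a is b)  # True
```

In an interview, being able to discuss the downsides (hidden global state, awkward testing) matters as much as writing it.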

For practicing the tech interview problems that you'll see in Silicon Valley / Big 4 tech interviews, the most popular site is probably leetcode (hackerrank, top coder, or even codewars are pretty good sources too). The most popular book (as mentioned many times) is Cracking the Coding Interview.

Just grinding out problems on leetcode, you'll probably learn enough to pass most tech interviews after about 100-200 questions (if you've never done this before; fewer if you have).

Before/during your leetcode practice I've written a series of tech interview tips to help you check on your fundamentals (you can check it out at fizzbuzzed.com). I try to go over the questions I see people failing most often. I'm also trying to help readers so would love to help you if you run into problems.

I also think you should ask the recruiter the style of interview you'll be having. If you are on the whiteboard then you should practice on it beforehand (whiteboard interview tips here).

u/eric_weinstein · 5 pointsr/learnprogramming

Seconding The Pragmatic Programmer and Cracking the Coding Interview. I'd also recommend:

• Code Complete: verbose and somewhat self-congratulatory, but extremely good.
• The Mythical Man-Month: a little dated and weirdly religious at times, but has great insights into how software problems are actually people problems and how large projects are (mis)managed.
• Design Patterns: a.k.a. the Gang of Four book. This one's a classic.
• Pro Git: you mentioned version control systems. IMHO, you should learn Git if you don't know it, and this book is a great resource.

If you let us know which languages you primarily write, I can probably recommend some good language-specific titles, too.
u/thatsnotgravity · 5 pointsr/learnprogramming

Pragmatic Programmer, Clean Code, and Head First Design Patterns come to mind right away. They're 3 of my favorites.

There's also Design Patterns by the Gang of Four. That's a lot more dense IMO than Head First, but it's fantastic material.

Since you're looking to jump ship and start interviewing, take a look at Cracking the Coding Interview. That will prepare you for any questions you run into during the process.

It's probably also worth brushing up on Algorithms and Data structures.

u/robertcrowther · 5 pointsr/programming

1. He's not the author
2. The book was published in 1994
3. Java was first released in 1995

u/sh0rug0ru · 5 pointsr/java

Read lots of code and read books to get multiple viewpoints. This is a deep topic which will require more than superficial online reading.

Check this out.

Books I have found useful:

u/ttutisani · 5 pointsr/softwarearchitecture

My blog about software architecture: http://www.tutisani.com/software-architecture/ (may not be for very beginners but I hope that it's able to communicate important topics).

I'd also suggest reading the classic book about design patterns (a.k.a. Gang of Four): https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

There are several good thought leaders in this direction, specifically Martin Fowler (https://martinfowler.com/) and Eric Evans (he does not write much online, but his book is great - all about modeling properly): https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

I'm big on modeling, objects, etc. so reply back if any questions.

u/xbrandnew99 · 5 pointsr/Frontend

Design Patterns: Elements of Reusable Object-Oriented Software doesn't use JS for its examples, but is highly regarded in learning design patterns.

Also, Mastering JavaScript Design Patterns is pretty good, and if I recall correctly, is modeled after the first book I mentioned. Heads up, there is a more up to date 2nd edition of this book available (linked version is 1st edition)

u/pogotc · 5 pointsr/PHP

A good place to start might be learning some of the common design patterns, these are common solutions that have been found to work over and over. The standard book for learning them is this one: http://www.amazon.com/Design-patterns-elements-reusable-object-oriented/dp/0201633612

It's not a PHP book but the patterns it covers can be used in any language, there's also loads of stuff on Wikipedia around them: http://en.wikipedia.org/wiki/Category:Software_design_patterns

Learning how to approach a programming problem is at the heart of being a good programmer and it's something you'll always be able to improve on (I'm still learning after 15 years of programming) so I would recommend a combination of trying to read as many programming books as you can, asking for help on StackOverflow when you need it, looking through well written open source code to see how those guys approach problems and above all else, practice, practice, practice.

u/intertroll · 5 pointsr/compsci

If you don’t want to real an actual textbook, this one will do the job (without skimping on details) and is more laypeople friendly:
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

Just as an aside, I had a non-techy friend who had a similar sense of mystification as OP's but really wanted to understand computers better, so I pointed him at this book. The next time I saw him we had a great conversation about logic gates and data representation. So it works! It was actually almost a cathartic experience, taking a person who doesn't get it to someone who does, since as a developer you often have to deal with users who don't know and don't care.

u/serimachi · 5 pointsr/computerscience

It's so great you're being so proactive with your learning! It will definitely pay off for you.

I like others' suggestion of Clean Code, but I fear that as a first year it may have mostly flown over my head--not that it would hurt at all to read. For a first year student specifically, I'd recommend either of two books.

Structure & Interpretation of Computer Programs, also known as The Wizard Book and free at the link I just sent you, is a famous textbook formerly used in MIT's Intro to Computer Science course. However, it's conceptually useful to programmers at any level. If you really, seriously read it and do the exercises, it's gonna give you a rock-solid foundation and shoot you ahead of your peers.

It uses Scheme, a quote-unquote "useless" programming language for any real-world purpose. That's arguable, but the important thing about the book is that it's really edifying for a programmer. The skill it helps you develop is not the kind that will directly show on your resume, it's nothing you can point to, but it's the kind of skill that will show in your code and how you think and approach problems in general. That said, the book has exercises and the MIT site I linked you to has labs that you could potentially show off on your github.

Code: The Hidden Language of Computer Hardware and Software is much more approachable, is not marketed specifically at programmers, and does not contain any exercises. Read it, though, and you'll find you have a huge boost in understanding the low-level computing classes that your classmates will struggle with. What it basically does is show the reader how one can build a computer, step by step, from the very basics of logic and switches. It's readable and written for a casual audience, so you may find it easier to motivate yourself to finish it.
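To give a flavor of that bottom-up progression, here's a half adder and full adder wired from primitive gates in Python (my toy sketch, not code from the book):

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Chain two half adders to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

print(full_adder(1, 1, 1))  # (1, 1): one plus one plus one is binary 11
```

Chain enough full adders together and you have the arithmetic unit of a CPU, which is roughly the arc the book traces.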

SICP and Code, despite both being extremely popular, can be a bit difficult conceptually. If you don't fully understand something, try reading it again, and if you still don't understand it, it's fine. Everyone experiences that sometimes. It's okay to move forward as long as you feel like you mostly get the topic. Don't let the perfect be the enemy of the good.

Best of luck to you, and be excited! It's thrilling stuff.

u/cyberbemon · 5 pointsr/hardware

This is a great start, as it explains and goes into great detail regarding CPU/GPU architectures: Computer Architecture, Fifth Edition: A Quantitative Approach

Another one that goes to low level is: Code: The Hidden Language of Computer Hardware and Software

&gt;"He starts with basic principles of language and logic and then demonstrates how they can be embodied by electrical circuits, and these principles give him an opening to describe in principle how computers work mechanically without requiring very much technical knowledge"

-wiki

u/Not0K · 5 pointsr/learnpython

If you would like a really in-depth explanation, check out Code.

u/ewiethoff · 5 pointsr/books

Petzold's Code: The Hidden Language of Computer Hardware and Software. Fascinating book about logic gates, character encoding, and so on.

u/el3r9 · 5 pointsr/explainlikeimfive

I would post this as a top-level comment, but it's against the rules of the sub to do so. OP can check out this book, called "Code", which is a great, truly ELI5 intro to computers. If someone is interested they can check it out.

u/AlSweigart · 5 pointsr/learnprogramming

Patterson's Computer Architecture: A Quantitative Approach was a pretty good book. I remember mostly teaching myself from that textbook, since the prof I had wasn't a great lecturer.

You can probably find a PDF of it online easily enough.

EDIT: If you want a reasonable sized book instead of a big textbook, I'd recommend reading Petzold's Code, it's a fun read.

u/tolos · 5 pointsr/IWantToLearn

First, there are two requests: one from your title, and one from your description. The request from your title is a bit easier, though I'm afraid I won't be able to answer it satisfactorily. As far a real world example, that may be a bit harder because modern CPUs are pretty complicated. (When I was learning computer architecture at university, we never really discussed how an intel or AMD cpu worked -- just learned about the MSP430 and a couple of hypothetical CPUs. If you really want to see how a real CPU works, I'd suggest looking into a microprocessor to get started.)

A quick and dirty summary of the MIPS CPU datapath (Computer Organization and Design, 4th ed., page 315):

1. The Program Counter (PC) loads the next instruction
2. PC is incremented
3. Instruction is parsed and the correct registers are loaded
4. Registers are fed into the ALU if necessary, or
5. Registers are passed into data memory (for read/write)
6. Results from ALU/data memory are loaded back into registers
7. next instruction

Note that the MIPS example doesn't use pipelining, which generally makes things a bit faster. And there's only one code path executing. And there's no look-ahead. Which isn't the case for modern CPUs.
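The seven-step summary above can be sketched as a toy fetch-decode-execute loop. To be clear, this is a hypothetical register machine in Python, not real MIPS: the instruction encoding, register count, and memory model are all made up for illustration:

```python
# Toy register machine illustrating the fetch/decode/execute steps above.
# A sketch only: instructions are plain tuples, and the "ALU" handles
# just addition.

def run(program, memory):
    regs = [0] * 8           # register file
    pc = 0                   # program counter
    while True:
        instr = program[pc]  # 1. fetch the next instruction
        pc += 1              # 2. increment the PC
        op = instr[0]        # 3. parse (decode) the instruction
        if op == "add":      # 4. registers fed into the ALU
            _, rd, rs, rt = instr
            regs[rd] = regs[rs] + regs[rt]
        elif op == "lw":     # 5-6. read data memory into a register
            _, rd, addr = instr
            regs[rd] = memory[addr]
        elif op == "sw":     #      write a register to data memory
            _, rs, addr = instr
            memory[addr] = regs[rs]
        elif op == "halt":
            return regs, memory
        # 7. loop continues with the next instruction

program = [
    ("lw", 1, 0),      # r1 = mem[0]
    ("lw", 2, 1),      # r2 = mem[1]
    ("add", 3, 1, 2),  # r3 = r1 + r2
    ("sw", 3, 2),      # mem[2] = r3
    ("halt",),
]
regs, mem = run(program, {0: 5, 1: 7, 2: 0})
print(mem[2])  # -> 12
```

Real datapaths do these steps in parallel hardware, of course; the loop just makes the ordering explicit.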

For further reading I highly recommend Code by Charles Petzold. I think it helped prepare me before going to college (for computer engineering).

For video learning, a quick google search shows some videos that would probably be helpful (I haven't watched any of these).

Sorry for the rushed response, I can expand on this more later if there's interest.
u/charlesbukowksi · 5 pointsr/learnprogramming

I liked it. I would also recommend reading CODE: http://www.amazon.com/exec/obidos/ASIN/0735611319

Between MIT's Python course, CS50 and that you'll have an excellent grounding in CS

u/apocalypsemachine · 5 pointsr/Futurology

Most of my stuff is going to focus around consciousness and AI.

BOOKS

Ray Kurzweil - How to Create a Mind - Ray gives an intro to neuroscience and suggests ways we might build intelligent machines. This is a fun and easy book to read.

Ray Kurzweil - TRANSCEND - Ray and Dr. Terry Grossman tell you how to live long enough to live forever. This is a very inspirational book.

*I'd skip Kurzweil's older books. The newer ones largely cover the stuff in the older ones anyhow.

Jeff Hawkins - On Intelligence - Engineer and Neuroscientist, Jeff Hawkins, presents a comprehensive theory of intelligence in the neocortex. He goes on to explain how we can build intelligent machines and how they might change the world. He takes a more grounded, but equally interesting, approach to AI than Kurzweil.

Stanislas Dehaene - Consciousness and the Brain - Someone just recommended this book to me so I have not had a chance to read the whole thing. It explains new methods researchers are using to understand what consciousness is.

ONLINE ARTICLES

George Dvorsky - Animal Uplift - We can do more than improve our own minds and create intelligent machines. We can improve the minds of animals! But should we?

David Shultz - Least Conscious Unit - A short story that explores several philosophical ideas about consciousness. The ending may make you question what is real.

Stanford Encyclopedia of Philosophy - Consciousness - The most well known philosophical ideas about consciousness.

VIDEOS

Socrates - Singularity Weblog - This guy interviews the people who are making the technology of tomorrow, today. He's interviewed the CEO of D-Wave, Ray Kurzweil, Michio Kaku, and tons of less well known but equally interesting people.

David Chalmers - Simulation and the Singularity at The Singularity Summit 2009 - Respected Philosopher, David Chalmers, talks about different approaches to AI and a little about what might be on the other side of the singularity.

Ben Goertzel - Singularity or Bust - Mathematician and computer Scientist, Ben Goertzel, goes to China to create Artificial General Intelligence funded by the Chinese Government. Unfortunately they cut the program.

PROGRAMMING

Daniel Shiffman - The Nature of Code - After reading How to Create a Mind you will probably want to get started with a neural network (or hidden Markov model) of your own. This is your hello world. If you get past this and the math is too hard, use this

Encog - A neural network API written in your favorite language

OpenCV - Face and object recognition made easy(ish).

u/Geilminister · 5 pointsr/artificial

On Intelligence by Jeff Hawkins is an amazing book on artificial intelligence. Hawkins' company has an open source project called [NuPIC](http://numenta.org/) that would be a good place to get some hands-on experience. It is Python based, and has a somewhat steep learning curve, so it might serve better as a beacon that you can work towards, rather than an actual project as of right now.

u/Capissen38 · 5 pointsr/singularity

You bring up an excellent point (and make a great case for land ownership!), and that is that actual physical space can't really be created, and will remain scarce, insofar as Earth has a fixed surface area. If the scenario I described above came to pass, though, would any landlords come looking for rent? Would any governments levy taxes? If no one needs cash and everyone has pretty much everything provided for them, all but the most stubborn landlords won't have any reason to give a hoot. I suspect government would take longer to die out, since it may still be needed to enforce laws, judge disputes, provide safety, etc. It's not hard to imagine a world even further down the line, however, when technology has advanced to the point where humans can't realistically do much damage to one another.

Edit: If you're really into this, I'd suggest reading some singularity-esque literature such as Down and Out in the Magic Kingdom (novella), Rainbows End (novel), and The Singularity is Near (speculative nonfiction to be taken with a grain of salt).

u/ataraxic89 · 5 pointsr/IAmA

You REALLY need to read this book. In fact, if you think you would actually read it, I'll buy it for you.

u/coHomerLogist · 5 pointsr/math

>I didn't say it was correct but it makes it more likely that people will dismiss it out of hand.

That's fair, I agree. It's just frustrating: there are so many strawmen arguments related to AI that a huge number of intelligent people dismiss it outright. But if you actually look into it, it's a deeply worrying issue-- and the vast majority of people who actually engage with the good arguments are pretty damn concerned.

I would be very interested if anyone can produce a compelling rebuttal to the main points in Superintelligence, for instance. I recommend this book very highly to anyone, but especially people who wonder "is AI safety just bullshit?"

>Especially when those people get significant amounts of funding

u/toptrool · 5 pointsr/math
u/resisttheurge · 5 pointsr/reddit.com

It becomes useful to replace concepts such as equivalence relations (and other relations) with symbols in order to facilitate understanding, actually. I'm sure you've used the =, <, >, the greater-than-or-equal-to, or the less-than-or-equal-to symbols before. These symbols allow those that read equations, definitions, or proofs to quickly and unambiguously understand what is being discussed. If you end up studying higher math for a while, you become familiar and comfortable with this style of notation.

Interestingly, notation like this and the thought process it represents is important in understanding the structure of mathematical logic, forms a large part of the basis of automata theory (aka why you're able to enjoy complex technology, like computers), and may hold key insights into the nature of consciousness and sentience itself.

If you've got the stomach for the notation, wide worlds of fascinating information await!
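As a concrete aside, the defining properties of an equivalence relation (reflexive, symmetric, transitive) can be checked mechanically on a finite set. A quick sketch in Python, where the "same parity" relation is just an illustrative example:

```python
# Check whether a relation on a finite set is an equivalence relation,
# by brute-force testing reflexivity, symmetry, and transitivity.

def is_equivalence(universe, related):
    universe = list(universe)
    reflexive = all(related(a, a) for a in universe)
    symmetric = all(
        related(b, a)
        for a in universe for b in universe
        if related(a, b)
    )
    transitive = all(
        related(a, c)
        for a in universe for b in universe for c in universe
        if related(a, b) and related(b, c)
    )
    return reflexive and symmetric and transitive

same_parity = lambda a, b: a % 2 == b % 2
print(is_equivalence(range(6), same_parity))            # -> True
print(is_equivalence(range(6), lambda a, b: a < b))     # -> False (not reflexive)
```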

u/1337_Mrs_Roberts · 5 pointsr/suggestmeabook

If you like things a bit more prose-y, try Pirsig's Zen and the Art of Motorcycle Maintenance

u/MmmCurry · 5 pointsr/compsci

Not specific to algorithms or even to CS, but Douglas Hofstadter (Gödel, Escher, Bach, I Am a Strange Loop) touches on many of the logical fundamentals in a relatively layman-digestable manner.

I wouldn't call him easy reading compared to Sagan or Kaku, and don't know a "pop computer science" equivalent to those two, but you definitely don't need a CS or math degree to get through GEB. Whether it's on-topic enough here is definitely questionable.

---

Edit: I haven't read it, but from the description this one by Thomas Cormen looks like it might be close to what you're looking for: Algorithms Unlocked.

"This is a unique book in its attempt to open the field of algorithms to a wider audience. It provides an easy-to-read introduction to an abstract topic, without sacrificing depth."

From the TOC, it looks like it's probably fairly light on math but gets into code or pseudocode relatively quickly. I still wouldn't call it pop-CS, but if that sounds like a fit, maybe give it a shot!

u/SuperConductiveRabbi · 5 pointsr/INTP

Here's the inevitable recommendation for Gödel, Escher, Bach (Amazon page, so you can see the reviews).

Synopsis:

>Twenty years after it topped the bestseller charts, Douglas R. Hofstadter's Gödel, Escher, Bach: An Eternal Golden Braid is still something of a marvel. Besides being a profound and entertaining meditation on human thought and creativity, this book looks at the surprising points of contact between the music of Bach, the artwork of Escher, and the mathematics of Gödel. It also looks at the prospects for computers and artificial intelligence (AI) for mimicking human thought. For the general reader and the computer techie alike, this book still sets a standard for thinking about the future of computers and their relation to the way we think.

>Hofstadter's great achievement in Gödel, Escher, Bach was making abstruse mathematical topics (like undecidability, recursion, and 'strange loops') accessible and remarkably entertaining. Borrowing a page from Lewis Carroll (who might well have been a fan of this book), each chapter presents dialogue between the Tortoise and Achilles, as well as other characters who dramatize concepts discussed later in more detail. Allusions to Bach's music (centering on his Musical Offering) and Escher's continually paradoxical artwork are plentiful here.

It may be strange, but during the biology and nature-of-thought-related sections of GEB I decided to read the neurology chapters of Gray's Anatomy (no, not Grey's Anatomy). It's pretty heady and slows you down quite a bit, but it results in a really interesting mix of deep biological knowledge about the structure of neurons and functioning of the nervous system with GEB's higher-level, cognition-focused discussion.

Note that that's the 40th, British edition of Gray's Anatomy. There are cheaper ones if you don't need the most up-to-date version, including leather-bound reprints of the classic 1901 American reprint. I doubt the old versions have much accurate information about neurology, however.

u/amair · 5 pointsr/math

Some good readings from the University of Cambridge Mathematical reading list and p11 from the Studying Mathematics at Oxford Booklet both aimed at undergraduate admissions.

Prime obsession by Derbyshire. (Excellent)

The unfinished game by Devlin.

Letters to a young mathematician by Stewart.

The code book by Singh

Imagining numbers by Mazur (so, so)

and a little off topic:

The annotated turing by Petzold (not so light reading, but excellent)

Complexity by Waldrop

If you're interested to learn the basic physicality of a computer, I'd recommend checking out a book by Charles Petzold: "Code: The Hidden Language of Computer Hardware and Software."

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's easy to read and provides a lot of insight into how circuitry embodies and propagates information!

u/Mattakinz · 5 pointsr/compsci
u/atommclain · 5 pointsr/apple

For the 'computers in general' side of things: Code: The Hidden Language of Computer Hardware and Software

u/UncleMeat · 5 pointsr/compsci

I cannot recommend the book Code by Charles Petzold highly enough. This is the book that solidified my love of computer science and hits most of the major topics in CS in an easy to understand and thoroughly entertaining way. By the end of the book you have walked through the fundamentals of how to build and program a rudimentary computer and had fun while doing it!

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/DrAmbulanceDriver · 5 pointsr/learnprogramming

I'm assuming you just want to learn the basic information about how computers work and the principles behind programming them, right?

In that case, I'd recommend Code by Charles Petzold

Are you looking to actually learn how to program and write code in a specific language? If so, then I'd recommend Automate the Boring Stuff with Python by Al Sweigart. It covers the basic principles of writing functions and how computer logic works, and you'll actually be able to apply it to some practical uses. And since its Python, it'll run on a lot of different platforms. If you like it, you may want to get into working with the Raspberry Pi. Javascript is another good language to start with, but as a book, I really like this one.

If you already know a bit about programming, and just want a general reference book, then Computer Science Illuminated by Dale and Lewis is pretty good.

You should give this book a read: Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

It does a great job of explaining how it all works. Loved it and I don't know how to program (Yet).

You should probably start with this book called Code and work your way up from there. It's actually pretty hard to find a single book that describes the history and the concepts, and even if you did find one, most of the topics would be hard to grasp on a first read. Code is usually a great starter book and might give you a few pieces of what you're looking for. After you finish it, maybe check out a software book and dive into some of the concepts.

All of these answers answer your question on a general level, but I would really recommend reading Code: The Hidden Language of Computer Hardware and Software by Charles Petzold for a deeper understanding. He talks about how the first computers were built and how they were programmed, and he does it in a way that's understandable even to a person that doesn't know a thing about computers.

u/KatsuCurryCutlet · 4 pointsr/learnmath

Hmm alright, considering your background, I'd probably recommend giving Michael Sipser's Introduction to the Theory of Computation a read (I'm sure there are many electronic copies floating around on the Internet). I think it covers the prerequisite math concepts in a preliminary chapter, which I highly recommend you spend some time on. It builds up notions of computation in increments, starting with finite state automata and adding more features until it works its way up to a Turing machine. You can skip most of the exercises, since those are mostly for graduate students who need practice before undertaking research. If you ever get confused about concepts along the way just drop me a PM or a question in /r/askcomputerscience and I'm sure the community would be happy to help out.

Also, if you're interested, I could mail you my copy of the Annotated Turing (meaning a copy that I bought some time ago, not that I wrote it). It does a great job of explaining the concept of a Turing machine even to someone without a mathematical or CS background. I'd be more than happy to share my books with people who are interested, plus there's no use in me keeping it around now that I'm done with it.

Just bear in mind that unlike most of science, the concepts here are very abstract, there aren't many direct physical implications, this really is a pure study of notions at play. i.e. how does one go about studying "how to do things" and its implications. A lot of details such as "how can such a machine exist with an infinite tape? what moves it? how does it implement its decision making scheme?" are all unimportant and ultimately inconsequential to the study itself.

Instead, what we care about are things like "I have a problem, is it possible for me to come up with a solution (algorithm) for it? Or is it logically impossible?" or things like "I have come up with a way to make a "computer"; can it do the things that other computers can? If I had to make it sort an arbitrary set of numbers so that they are ordered numerically, can my computer do it?". Turing machines are a tool to help us reason formally about these sorts of arguments, and to give insight into what we can qualify as "computation". Further down the line we even ask questions like "are some problems inherently more 'difficult' than others?" and "if I can solve problem B, can I somehow use the solution for problem B to solve some other problem A?"

Perhaps this all sounds perplexing now, but maybe just go through some content and spend time reading a little and these should start to make a little more sense. good luck with your future endeavors on this journey!

u/rohit275 · 4 pointsr/hardware

I haven't read it, but it looks pretty good. I can personally vouch for this book however:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=pd_sim_14_2?_encoding=UTF8&psc=1&refRID=C8KMCKES83EHGXS3VWSQ

It's truly amazing. I'm currently an EE PhD student but I had a pretty limited background in digital hardware and computer architecture, and I read most of this book just out of interest a little while ago and frankly learned quite a bit. It's written at a very readable level for anyone with almost no prior knowledge, yet gets technical when it needs to. It's very thorough, but approaches the topics at a wonderful and easy pace with very clear explanations. The author even says you can skip some of the more technical details if they're not of interest to you, and you'll still end up learning quite a lot. The book you posted looks pretty similar, so I'd say it's worth a shot.

u/mwassler · 4 pointsr/webdev

Everyone seems to have good things to say about khan academy's comp sci courses.

A few good lower-level books, in my opinion, are this one, which is maybe less technical but a good foundation, and then From Mathematics to Generic Programming by Alexander Stepanov.

I think you're probably just experiencing outliers in your job search. If you keep at it, your luck will probably turn around.

u/dr_dalek · 4 pointsr/explainlikeimfive

Take a look at this book: Code The book starts off with a switch and builds a whole computer from there.

u/nattoninja · 4 pointsr/learnprogramming

Code is a really good book that goes into how it all works, from the basics of binary and electrical signals and builds from there. The text is very straightforward and there are lots of good illustrations.

u/p7r · 4 pointsr/NoStupidQuestions

I've taught a lot of people how computers work, or more precisely how to program them. I am sure you can learn too.

First, let's make it fun.

There is a lot of material for people who like the Raspberry Pi out there that is fun and simple. You don't even need to own a Raspberry Pi to understand what they're talking about.

It's fun and simple because it's designed for youngsters who find long/complex books a bit too boring. I think you might enjoy it, because you've said you've found the books you've tried too boring.

Here is a load of magazines about the Pi - on each issue you can click on "Get Issue" and then under the cover "download the PDF" and read it and see if you enjoy that.

Next, have a play with Scratch. It's designed for kids but the exact same concepts are in professional programming languages.

The reason I recommend it is not because I think you are a child, but because it's a lot of fun and makes a lot of the dull and boring bits of programming go away so you can focus on the fun bits.

You have to remember that all the things going on inside a computer are basically the same things going on in Scratch - just a lot more complex.

If you ever want to learn a programming language that professional developers use, I think you'll like Ruby.

It's very forgiving for new developers, but still lets you do what we would call "production grade" code. It's what I work in most days.

Also, why's poignant guide is quite funny, but you might find it a bit weird and confusing - I know I did the first time I read it. :-)

I also recommend this book to you: Code by Charles Petzold. The first few chapters don't seem like they're about computers, because they talk about flags and electrical circuits - that's because you need to understand those things first.

If you can read and understand the whole thing you will know more about how computers work than half of the professional software engineers out there. And they're normally quite a clever bunch.

If you find it too difficult, slow down and think. Each paragraph has something in it worth thinking about and letting it mull over in your mind.

IQ is not a measure of how much you can learn, but perhaps of how quickly you can see patterns and understand things.

You having a lower IQ than somebody else does not mean you can't see those patterns or understand things, it just means it might take you a little more thinking to get there. I'm sure you will.

If you ever have any questions about computers, I'd love to try and help answer them - feel free to ask me.

u/autophage · 4 pointsr/IWantToLearn

Lots of people are recommending ways to learn a language, and I just want to pop in to say: most popular languages are much more alike than they are different. Learn any one of them for a few months (until you're no longer looking up references for how to write a for loop or getting confused by the language's comparison operators), then try your hand at a different language.

If you find something hard to grasp in one language, it's probably about equally hard to grasp in another language - so don't just think "Hmm, well, maybe this is easier in other-language" and switch over to that one instead. (There are a few exceptions - for example, you don't have to worry about memory management in Java the same way that you do in C. You can still get memory leaks in Java, but the fact that you've got garbage collection makes memory management on the whole far simpler.)

In terms of getting into hacking - the first step, hands down, is to read this book. It will teach you the really really basic stuff, on a far deeper level than most laymen ever think about, in a very gentle and even fun way. After that, start getting your hands on networking texts, security texts, and just plain writing a lot of code. Get the source to some popular open source projects (Apache, for example) and run it in a debugger, watching how the values change and looking for unexpected things.

u/-___I---I-___ · 4 pointsr/learnprogramming
1. topic name: Fundamentals, discrete math, algorithms, a good book to start with, there are tons of free courses and lectures on the internet, but you will have to type in the specific search terms

2. idk

3. idk

I really enjoyed Code.

I feel it's a really accessible summary of what is going on under the hood of a computer.

u/Shmurk · 4 pointsr/programming

Add Concrete Mathematics if you're a maths retard like me.

u/another_math_person · 4 pointsr/learnprogramming

You might use calculus-based tools for some analysis, like if you look at Knuth's Discrete Math text, you'll see discrete integrals, which are certainly grounded in calculus.

As well, if you look at randomized algorithms, you might need to use some nontrivial probability stuff (like Chernoff Bounds - wiki). That isn't directly calculus, but there is a significant portion of useful probability that requires the use of integrals.

All that said, Linear Algebra is probably more useful for programmers (especially if they're doing graphics or games).

u/Continuities · 4 pointsr/javascript

This is a really big question, and not really specific to javascript. Architecting large applications, in my opinion, is kind of an art form. You can learn strategies by reading, but you're not going to get good at it without years of experience. Ideally you learn this stuff while working alongside strong senior developers who know what they're doing.

Read Design Patterns, if you haven't. I'd also recommend JavaScript Allongé, but I'm a functional programming crazy.

Build something in a way that feels right, then evaluate what worked and what didn't. Which parts sucked to modify when requirements or assumptions changed? Which parts were hard to understand a month later?

In general, I have a few guidelines:

1. Definitely use some sort of module system (AMD, or ES6 modules) to aid in encapsulation and prevent global pollution
2. Keep similar code together and differing code apart.
3. Keep modules small, and single purpose
4. Prefer composition over inheritance
5. If you're doing web stuff, don't be afraid to keep the html, css, and js for specific pieces of UI together. Don't fall into the trap of conflating technology with concerns.
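Guideline 4 is the easiest to show concretely. The thread is about JavaScript, but the idea is language-agnostic; here is a minimal sketch in Python, with class names made up purely for illustration:

```python
# Composition over inheritance: instead of growing a subclass per
# combination of behaviors (DraggableWidget, DraggableResizableWidget, ...),
# a widget holds small collaborator objects and delegates to them.
# All names here are illustrative, not from any real framework.

class Dragger:
    def handle(self, event):
        return f"dragging on {event}"

class Resizer:
    def handle(self, event):
        return f"resizing on {event}"

class Widget:
    def __init__(self, *behaviors):
        self.behaviors = list(behaviors)  # composed at construction time

    def handle(self, event):
        # Delegate to each composed behavior in turn.
        return [b.handle(event) for b in self.behaviors]

# A new capability mix is a constructor argument, not a new subclass:
w = Widget(Dragger(), Resizer())
print(w.handle("mousedown"))  # -> ['dragging on mousedown', 'resizing on mousedown']
```

Swapping, adding, or removing a behavior at runtime is just a list operation, which is exactly the flexibility a deep inheritance tree makes painful.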
u/Jonny0Than · 4 pointsr/learnprogramming

The "gang of four" book titled Design Patterns is an excellent reference for object-oriented architectures.

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

Note - this really is more like a reference book than a tutorial or gentle introduction. It's not something you read cover-to-cover.

u/fluicpana · 4 pointsr/italy

To test the waters quickly you can use https://rubymonk.com/ (a basic introduction to Ruby). Coursera, Khan, Udacity, and the like also have introductory programming courses.

But if you want to learn to program, the path should hit at least all of these stops, in order:

1. [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938)

2. The Structure and Interpretation of Computer Programs

3. A good book on Assembly

4. The C Programming Language

5. Compilers

6. Code Complete, The Practice of Programming

7. Pretend you've read all of The Art of Computer Programming

8. An object-oriented language, maybe Programming Ruby

9. And/or Python, Dive into Python

10. Design Patterns

11. Learn a functional language.

From here you can branch out and specialize in whatever interests you

u/Birkal · 4 pointsr/UCSD
u/nanojava · 4 pointsr/cscareerquestions

Most of the design patterns introduced by the GoF (Gang of Four) are still applicable today. Buy and read this book

u/Kaelin · 4 pointsr/compsci

Study design patterns and read books by the masters.. Find the books that are recognized by the community as "the best". For example "Effective Java" is one of the best books on writing Java beyond the basics.

The Pragmatic Programmer: From Journeyman to Master

http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master/dp/020161622X/ref=sr_1_1?s=books&ie=UTF8&qid=1374154408&sr=1-1&keywords=pragmatic+programmer

Design Patterns

http://www.amazon.com/Design-Patterns-Elements-Object-Oriented-ebook/dp/B000SEIBB8

http://www.amazon.com/gp/product/020161586X/ref=oh_details_o05_s00_i00?ie=UTF8&psc=1

u/soundslikeponies · 4 pointsr/programming

You can always read books. Textbooks are much better to read when you're free to browse and pick out whichever ones you like. You can get a surprising amount of reading done just by reading on the bus, on the can, and whenever you've got nothing better to do.

A popular stack overflow answer has a pretty good list. You can preview the introduction of most books on amazon.

People like to champion the internet as "oh, you can learn anything on the internet!" Which is true. But you can learn it much faster and better from a book (generally speaking).

Books provide a long format which has a chance to build upon itself. Also, everything is collected in one place for easy access. More developers ought to sit down and read good books.

u/illithoid · 4 pointsr/salesforce

I'll be honest with you, I don't think Head First Java would be a good choice; however, DO READ Clean Code. I also suggest Design Patterns: Elements of Reusable Object-Oriented Software and Working Effectively with Legacy Code. The first is a classic MUST READ for anyone in software development. It presents numerous challenges that most of us will face when developing solutions, and gives you the design patterns you will need to solve them. The second is great for learning how to fix your predecessors' shitty code; you'll need this one. If you haven't already, look up Bob Buzzard and Andy Fawcett. These two guys are my favorite SFDC dev bloggers. I also suggest watching any Salesforce webinar that has anything to do with code, especially security stuff.

Practice makes perfect, except for us there is no perfect, just better. Know your best practices and live by them. With everything you do ask how can I make it better? Faster? More efficient? Do I even need code, or will Workflow/Process Builder/Flow do? How can I write code, so that an Admin can customize it without any code?

> Based on code reviews--my code is pretty good, with good logic and pretty well laid out.

This is actually VERY important, having good logic is obviously crucial, but being well laid out is a kind of hidden requirement with code. You or somebody else will eventually need to maintain your code, if it's laid out well it should hopefully be easy to read and maintain.

When you write code do your best to incorporate declarative features so that further customization can be done without code (I know I said this earlier, but I think it's important). Need to write some code that uses an arbitrary set of fields, consider using Field Sets. An Admin can add/remove them without code. Maybe use a Custom Setting, or Custom Metadata to map fields to values.

Learn how to use Describe calls for everything. Need to write some code that catches dupes and merges them? Don't hard-code the values, or nobody will be able to remove or add fields without updating code. Instead use Describe calls; now you get every field on the object, forever. Need to remove a field from an object? No problem. Need to add a field to an object? No problem. Does your losing record have child records that need to be reparented? Don't hard-code; use Describe calls to get all sObjects with a child relationship. Use Describe to find out if it can be directly reparented or if it needs to be cloned (CampaignMembers can't reparent a LeadId to a new Lead. You MUST clone and add the new Lead Id).

How much do you know about HTML? CSS? JavaScript? JQuery? Visualforce? Learn 'em. Lightning is coming, and these are going to be more important than ever (except maybe Jquery).

Practice, practice, practice. One coding assignment per month isn't that bad, but if you get some work done early and you have an hour or two to spare, work on a side project. Can you think of something in your company that could be automated, spin up a Dev Org and give it a shot. Maybe your Sales people could use a new VF page for entering information just a little quicker.

Always seek to improve your code. Always seek new ideas and better ways of doing things.

Trailhead is good, do all the coding ones you can find, it's more practice!

u/ZioYuri78 · 4 pointsr/unrealengine

I have the first edition and yes, it's worth a read; keep in mind that it explains how game engines work, not how to make a game engine.

After reading it you will not be a master of UE4, but you will understand why UE4 does things a certain way.

Another book you have to read (and is mentioned in your link) is the Game Programming Patterns book, i have the physical copy and it is awesome, read it after the GoF Design Patterns book, is a masterpiece combo.

EDIT:

Also two sites i want to suggest:

Learning Modern 3D Graphics Programming, is a great tutorial about OpenGL basics.

u/Shark_Kicker · 4 pointsr/javascript

FFS... just stop. This is NOT a Mediator Pattern

Three articles that are either partially or completely wrong in three days? Just. Stop. Go get this book... read it... then try again.

You can't just "invent" patterns and name them after existing patterns because you named one of your objects "mediator".

A Mediator Pattern in JS explained by someone who knows what he's talking about
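For readers wondering what the pattern actually looks like: in a Mediator, colleagues never reference each other directly; every message goes through the mediator, which decides who hears it. A minimal sketch (in Python rather than JS, with invented names):

```python
class ChatRoom:
    """Mediator: users talk through the room, never to each other directly."""
    def __init__(self):
        self.users = []

    def register(self, user):
        self.users.append(user)
        user.room = self

    def broadcast(self, sender, message):
        # Routing logic lives in one place instead of in every colleague.
        for user in self.users:
            if user is not sender:
                user.receive(sender.name, message)


class User:
    def __init__(self, name):
        self.name = name
        self.room = None
        self.inbox = []

    def send(self, message):
        self.room.broadcast(self, message)

    def receive(self, sender_name, message):
        self.inbox.append((sender_name, message))


room = ChatRoom()
alice, bob = User("alice"), User("bob")
room.register(alice)
room.register(bob)
alice.send("hi")
print(bob.inbox)    # [('alice', 'hi')]
print(alice.inbox)  # []
```

The point is the dependency structure: `User` knows only the room, so adding a third user, logging, or filtering touches nothing but the mediator.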

u/DashAnimal · 4 pointsr/compsci

I know this is a pretty common recommendation and you've probably already heard of it
or even read it, but can I recommend the book Code: The Hidden Language of Computer Hardware and Software? I think having a history of how we got to where we are today (written in an entertaining way) is a good starting point, even though it barely scratches the surface of computer science.

u/_9_9_ · 4 pointsr/learnpython

This book is supposed to be good: https://www.amazon.com/dp/B00JDMPOK2/

I've yet to read it. I've been messing with computers for a long, long time. But, at some point I think most people agree that they are magic.

u/AnalyzeAllTheLogs · 4 pointsr/learnprogramming

Although more about product delivery and lifecycle management, i'd recommend:

[No audiobook, but worth the read] The Mythical Man-Month, Anniversary Edition: Essays On Software Engineering https://www.amazon.com/dp/B00B8USS14/

[No audiobook, but about 1/3 the price at the moment for kindle and really good]
Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) https://www.amazon.com/dp/B00JDMPOK2/

https://www.amazon.com/Dreaming-Code-Programmers-Transcendent-Software/dp/B00AQ5DOCA

https://www.amazon.com/Scrum/dp/B00NHZ6PPE

u/Nuclear-Cheese · 4 pointsr/gamedev

I also highly recommend for game developers lacking in math skills to check out 3D Math Primer for Graphics and Game Development. Unlike this book that is often recommended, I feel it does a better job for people who don't have a formal education in advanced mathematics or Computer Science who are interested in math directly relating to game development.

u/yanalex981 · 4 pointsr/computerscience

I taught myself bits in high school with "C++ for Everyone". Despite its rating, I thought it was good 'cause it has exercises, and I did a lot of them. Works really well for laying foundations. I didn't go through the whole book though, and knowing the language is only part of the battle. You need to know about algorithms and data structures as well. For graphics, trees seem really useful (Binary space partitioning, quadtrees, octrees etc).

After university started, I read parts of "C++ Primer", which was when the language really started making sense to me. You'll get more than enough time to learn the required amount of C++ by next fall, but CG is heavy in math and algorithms. If your CS minor didn't go over them (much), my old algorithms prof wrote a free book specifically for that course.

For using OpenGL, I skimmed the first parts of "OpenGL SuperBible". For general graphics, I've heard good things about "Mathematics for 3D Game Programming and Computer Graphics", and "Real-Time Rendering".

Careful with C++. It may deceptively look like Java, but honestly, trying to write good idiomatic C++ after years of Java took a major paradigm shift
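Since quadtrees get name-dropped above, here's the core idea in a toy Python sketch (illustrative only, not tuned for real graphics work): a node holds a few points and, once full, splits its rectangle into four children.

```python
class Quadtree:
    """Minimal point quadtree: a node splits into 4 children when full."""
    def __init__(self, x, y, w, h, capacity=4):
        self.bounds = (x, y, w, h)
        self.capacity = capacity
        self.points = []
        self.children = None

    def contains(self, px, py):
        x, y, w, h = self.bounds
        return x <= px < x + w and y <= py < y + h

    def insert(self, px, py):
        if not self.contains(px, py):
            return False
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py))
                return True
            self._split()
        return any(c.insert(px, py) for c in self.children)

    def _split(self):
        # Quarter the rectangle and push existing points down into children.
        x, y, w, h = self.bounds
        hw, hh = w / 2, h / 2
        self.children = [Quadtree(x, y, hw, hh, self.capacity),
                         Quadtree(x + hw, y, hw, hh, self.capacity),
                         Quadtree(x, y + hh, hw, hh, self.capacity),
                         Quadtree(x + hw, y + hh, hw, hh, self.capacity)]
        for p in self.points:
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, x0, y0, x1, y1):
        """Collect points inside the rectangle [x0, x1) x [y0, y1)."""
        x, y, w, h = self.bounds
        if x0 >= x + w or x1 <= x or y0 >= y + h or y1 <= y:
            return []  # no overlap with this node: prune the whole subtree
        found = [p for p in self.points if x0 <= p[0] < x1 and y0 <= p[1] < y1]
        if self.children:
            for c in self.children:
                found.extend(c.query(x0, y0, x1, y1))
        return found


qt = Quadtree(0, 0, 100, 100)
for p in [(10, 10), (20, 20), (30, 30), (40, 40), (60, 60)]:
    qt.insert(*p)
print(sorted(qt.query(0, 0, 50, 50)))  # [(10, 10), (20, 20), (30, 30), (40, 40)]
```

The payoff is the pruning in `query`: whole subtrees get skipped when their rectangle misses the search window, which is why these trees matter for graphics.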

u/WeoDude · 4 pointsr/datascience

I don't have a tutorial for TensorFlow, but Hands on Machine Learning with Scikit-Learn and TensorFlow (https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=sr_1_1?ie=UTF8&qid=1500494347&sr=8-1&keywords=hands+on+machine+learning) should basically be the bible of machine learning implementation.

XGBoost, the best way I learned it is by looking at Kaggle competitions.

u/loveleis · 4 pointsr/brasil

Artificial intelligence is by far humanity's biggest problem. The risk of it causing extinction, or worse, creating astronomical amounts of suffering, is quite high, and very few people are dedicating themselves to solving the problem.

For anyone interested, search for "AI alignment" on Google.

EDIT: For those interested:

https://en.wikipedia.org/wiki/Friendly_artificial_intelligence

https://en.wikipedia.org/wiki/Existential_risk_from_artificial_general_intelligence

A Numberphile playlist that gives a very good introduction to the topic

Sam Harris's TED talk on the subject

For anyone really interested in the subject, the book Superintelligence by Nick Bostrom, a researcher at the University of Oxford, is responsible for "evangelizing" many people on the topic, including Elon Musk and Bill Gates (who have both commented on the book). A PDF of it is easy to find online.

u/FieryPhoenix7 · 4 pointsr/cscareerquestions

If you're looking to actually learn the stuff, then you will need to get textbooks which are plentiful. But if you're looking to read about the philosophical side of the topic, I suggest you start with Nick Bostrom's Superintelligence.

Oh, and make sure you watch Her and Ex Machina if you haven't already ;)

Eclipse Phase is a great way to find plot hooks, they're littered in all of the source books. It's also free, so just check it out even if you want to look at pretty pictures.

I'm also reading Superintelligence and that book is basically a section by section deconstruction of why building a Seed AI (a self improving AI, a staple of the sci-fi genre) will end badly.

u/grahamboree · 4 pointsr/gamedev

The Starcraft Broodwar API has source code for a bunch of bots from the annual competition at AIIDE. You can find them here. They use a variety of techniques that will help set you off in the right direction.

I'd recommend this book too if you're interested in AI. It's the most comprehensive survey of the most common techniques used in the industry today.

Good luck!

u/marekkpie · 4 pointsr/gamedev

Programming Game AI by Example is another great resource.

u/RoguelikeDevDude · 4 pointsr/gamedev

Book suggestions? Now that's my jam.

Out of all the books i've read, here are my recommendations regarding game programming:

Eric Lengyel's Books (only one out so far). This is aimed at game engine development, but if the 2nd onward are as indepth as the first, they will be amazing fundamental knowledge. Also, they're not thick, and jam packed with information.

Game Programming Patterns. The only book that comes more recommended than this is the one right below it by Jesse Schell. This book is fantastic, but you should write one or two small games to really get the most out of this book. You can also read it online on his website free, but then you don't get a pic of him and his dog on the back cover.

Book of Lenses. This is your intro/intermediate dive into game design. There are a lot of game design books, if you only read one, it should be this one.

Game AI by Example. This book is a hodgepodge of fantastic techniques and patterns from those in AAA. There are other books in the series (like Game AI Pro) which are similar, but in my opinion (at least when I read AI Pro 3), they're not as good. But more knowledge is never bad.

Truthfully, as I sit here looking over all my books, those are the only ones I'd consider mandatory for any seasoned developer. Of course plenty of developers get by without reading these books, but they likely pick up all the principles listed herein elsewhere, in bits and pieces, and would likely have benefited from having read them early on.

Here are a few others that I do recommend but do NOT consider mandatory. Sorry, no links.

Unity in Action. Personally, I recommend this or a more interactive online course version (udemy.com/unitycourse) if you want to learn Unity while having a resource hold your hand. Having read the book, taken the course, AND taken Unity's own tutorials on the matter, I'd rank the course best, the book second, and Unity's videos third. But none of them are bad.

Game Engine Architecture. This is the king for those who want a very broad introduction to making a game engine. It comes highly recommended by nearly anyone who reads it, just so long as you understand it's from a AAA point of view. Game Coding Complete is out of print and unlikely to be revisited, but it is similar. These are behemoths of books.

Real-Time Rendering. This is one I haven't read, but it comes very highly recommended. It is not an intro book, and is also over 1000 pages, so you want this alongside a more introductory book like Fundamentals of Computer Graphics. Truth be told, both books are used in university courses at the third- and fourth-year levels, so keep that in mind before diving in.

Clean code. Yeah yeah it has a java expectation, but I love it. It's small. Read it if you understand Java, and want to listen to one of the biggest preachers on how not to write spaghetti code.

The RimWorld guy, Tynan Sylvester I believe, wrote a book called Designing Games. I enjoyed it, but IMO it doesn't hold a candle to Jesse Schell's book. Either way, the guy wrote that book after working in AAA for many years, then went on to create one of the most successful sim games in years. But yeah, I enjoyed it.

Last but not least, here are some almost ENTIRELY USELESS but interesting diagrams of what some people think you should read or learn in our field:

https://github.com/miloyip/game-programmer

https://github.com/P1xt/p1xt-guides/blob/master/game-programming.md

u/OverQualifried · 4 pointsr/Python

I'm freshening up on Python for work, and these are my materials:

Mastering Python Design Patterns https://www.amazon.com/dp/1783989327/ref=cm_sw_r_cp_awd_kiCKwbSP5AQ1M

Learning Python Design Patterns https://www.amazon.com/dp/1783283378/ref=cm_sw_r_cp_awd_BiCKwbGT2FA1Z

Fluent Python https://www.amazon.com/dp/1491946008/ref=cm_sw_r_cp_awd_WiCKwbQ2MK9N

Design Patterns: Elements of Reusable Object-Oriented Software https://www.amazon.com/dp/0201633612/ref=cm_sw_r_cp_awd_fjCKwb5JQA3KG

I recommend them to OP.

u/reflectiveSingleton · 4 pointsr/webdev

I would add design patterns to that list... they are extremely helpful, and they're essentially what modern frameworks try to put in place.

It's why MVC/etc exists, and if you are fluent in many design patterns then you can mix/match/modify the appropriate one to your current problem set. Also, things like that transfer to non-backend development if you ever decide to go that route.

See books such as Design Patterns: Elements of Reusable Object-Oriented Software - written by the 'gang of four'...and other related and more modern derivatives of this.
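As a taste of what the GoF catalog describes, here's the Strategy pattern in a hypothetical Python sketch; the names are invented, and Python's first-class functions stand in for the classic one-method interface:

```python
class Checkout:
    """Context object: delegates its pricing policy to a swappable strategy."""
    def __init__(self, pricing):
        self.pricing = pricing

    def total(self, amount):
        return self.pricing(amount)


# Strategies are plain callables here; the classic GoF version uses classes
# sharing a common interface, which first-class functions make unnecessary.
def regular_price(amount):
    return amount

def holiday_sale(amount):
    return round(amount * 0.8, 2)


print(Checkout(regular_price).total(100))  # 100
print(Checkout(holiday_sale).total(100))   # 80.0
```

`Checkout` never changes when a new pricing policy appears; that separation of the varying part from the stable part is the recurring theme of the whole catalog.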

u/cjrun · 4 pointsr/cscareerquestions

Everybody's learning style is different. Here are some books I believe to be essential for any novice or pro.

Programming For Dummies. It has a stupid title, but it is well reviewed for good reasons. I read through this beast in three weeks. There is no coding involved, as it is mostly theory, but it covers most of the bases of computer science and programming logic. Looking back, much of it confused me at first read, but the big ideas are all presented here. Reading this during the summer before first semester was a huge boost for me. All of the major computer languages are discussed in the book.

Cracking the Coding Interview. A book meant for veterans trying to get into highly demanding top tech companies, the book is a great introduction to programming paradigms. There are numerous examples of problems in each chapter with answers at the back of the book. The whole thing is in Java, with a short chapter on C++.

Design Patterns. As you learn more about object oriented programming, the concept of design is introduced. This book is the holy grail of software architecture and recommended by many. I would hold off acquiring it until you are certain that CS is where you want to be, it is quite technical. This book follows C++, although a Java version of the patterns exists on Github.com

A non-technical book just for fun:
The Innovators is essentially the story of computer science and how it got to present day. It follows the characters, human beings, that were involved each step of the way right up until modern day. Your professors will be impressed that you know who Alan Turing, Grace Hopper, and Charles Babbage were. If only I had been at THE MOTHER OF ALL DEMOS! The actual stories of Microsoft, Apple, The internet, the PC, video games, the space program, etc. On Quiz Up, a trivia app, every other question in the CS category involves names from this book. Read it just to be a real geek that knows where this stuff came from, and the drama/tension that led to innovation. The book is actually really funny at times.

u/ckdarby · 4 pointsr/PHP

To give you a better idea of the type of person I am, these are the books within arm's reach right now:

Design Patterns: Elements of Reusable Object-Oriented Software

[Refactoring: Improving the Design of Existing Code](http://www.amazon.ca/gp/product/0201485672)

The Mythical Man-Month

Along with some other ~50 similar books I've read.

u/j-dev · 4 pointsr/learnprogramming

There are books out there, many of which are unfortunately not language agnostic, that deal with this. What you want to know is the basics of object oriented design and, most importantly, design patterns, which are general answers for recurring object-oriented design challenges. You may have to dabble into languages other than the one(s) you currently use in order to follow along.

u/ginger_beer_m · 4 pointsr/dogecoin

If you just try to eyeball patterns from historical charts, I guarantee you will see them, because that's just what the brain has evolved to do: spotting patterns well (e.g. Jesus on toast), even when they're actually due to random chance. That's also why most of the so-called technical 'analysis' is bullshit.

Instead, approach this in a systematic and principled manner. You can check out this book to get an idea of what I'm talking about: Pattern Recognition and Machine Learning. It's the standard grad-level introduction to the field, but might be rather heavy for some. An easier read is this one. You can find the PDFs of these books online through some searching, or just head to your local library. Approaching the problem from a probabilistic and statistical angle also lets you know the extent of what you can predict and, more importantly, what the limitations are and when the approach breaks down, which happens a lot, actually.

TL;DR: predicting patterns is hard. That's why stats is the sexy new job of the century, alongside 'data science' (hate that term, uuurgh).
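The "patterns from random chance" point is easy to demonstrate: long, meaningful-looking streaks are the rule in fair coin flips, not the exception. A toy simulation (mine, not from either book):

```python
import random

random.seed(42)  # fixed seed so the experiment is repeatable

def longest_run(flips):
    """Length of the longest run of identical outcomes."""
    best = cur = 1
    for a, b in zip(flips, flips[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

# How often does a "pattern" of 6+ identical flips appear in 200 fair flips?
trials = [longest_run([random.choice("HT") for _ in range(200)])
          for _ in range(1000)]
frac = sum(r >= 6 for r in trials) / len(trials)
print(frac)  # well above 0.9: long streaks are the norm, not a signal
```

A trader eyeballing a chart would happily call a 6-long streak a "trend", yet pure noise produces one almost every time.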

u/Jimbo_029 · 4 pointsr/ECE

Bishop's book Pattern Recognition and Machine Learning is pretty great IMHO, and is considered to be the Bible in ML, although, apparently, it is in competition with Murphy's book Machine Learning: A Probabilistic Perspective. Murphy's book is also supposed to be a gentler intro. With an ECE background the math shouldn't be too difficult to get into in either of these books. Depending on your background (i.e. if you've done a bunch of information theory) you might also like MacKay's book Information Theory, Inference and Learning Algorithms. MacKay's book has a free digital version, and his 16-part lecture series based on the book is also available online.

While those books are great, I wouldn't actually recommend just reading through them, but rather using them as references when trying to understand something in particular. I think you're better off watching some lectures to get your toes wet before jumping in the deep end with the books. MacKay's lectures (linked with the book) are great. As are Andrew Ng's that @CatZach mentioned. As @CatZach mentioned, Deep Learning has had a big impact on CV, so if you find that you need to go that route then you might also want to do Ng's DL course, though unlike the other courses this one isn't free :(.

Finally, all of the above recommendations (with the exception of Ng's ML course) are pretty theory-driven, so if you are more of a practical person, you might like Fast.AI's free deep learning courses, which have very little theory but still manage to give a pretty good intuition for why and how things work! You probably don't need to bother with part 2 since it is more advanced stuff (and will be updated soon anyway, so I would wait for that if you do want to do it :))

Good luck! I am also happy to help with more specific questions!

u/Canoli85 · 4 pointsr/MachineLearning

Are you referring to Machine Learning: A Probabilistic Perspective? (link to Amazon)

u/blackkettle · 4 pointsr/math

take a look at Pattern Recognition and Machine Learning by Bishop,

http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738

it's an excellent text, though not for the faint of heart. just the first chapter should provide you with a great answer to your question.

u/proverbialbunny · 4 pointsr/awakened

lawl, that's a fun one.

Some fun with semantics: This isn't going to fit into words right, so you're going to have to explore it to understand it, but you do have a choice, but control isn't quite what it seems to be. You obviously get the bit about control, but calling it a choice is misleading. I made the same mistake for a while, until I tried explaining it to people and realized the misunderstanding:

A choice is when there are multiple options, and you pick the best option. You're still picking that option, despite the delusion of control. Even if there is no you, and control is made up, there is still a choice.. a decision, a process. It just isn't real; choice is formless, it is language, it is psychological.

Have you explored consciousness yet? If you're the type that likes to nerd out and go beyond simple teachings checkout I Am a Strange Loop and it's more advanced cousin Gödel, Escher, Bach: An Eternal Golden Braid.

u/tuber · 4 pointsr/atheism

If I understand you correctly, the principle you've stumbled upon was mathematically proven by Kurt Goedel in 1931. I think you would enjoy this book a lot. It won a Pulitzer prize.

If you don't understand why you are getting downvoted, it is due to the difference between the centre for thought and the origin of consciousness. If you don't think there's a difference, go educate yourself; there are many resources. You might try Gödel, Escher, Bach, for starters.

u/jsprogrammer · 4 pointsr/science

If that question interests you, you'd probably enjoy Gödel, Escher, Bach

u/Bizkitgto · 4 pointsr/learnprogramming

The book you are looking for is called Code: The Hidden Language of Computer Hardware and Software by Charles Petzold!

u/neop · 4 pointsr/compsci

I'm also a math major who turned into CS. There are already a lot of good recommendations here so I won't add much, but I suggest reading Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

It's not very technical and it's not in-depth, but I think it's an amazing book. You probably won't learn anything you're actually going to use by reading it, but I think this book has a unique ability for expressing the underlying facts that make us all find computer science so fascinating. It's a very fun read and it will give you a very broad overview of how computers work and how software gets compiled and ultimately ends up moving electrons around to make the magic happen.

u/ismtrn · 4 pointsr/compsci

This book (Code: The Hidden Language of Computer Hardware and Software): http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1333490779&sr=8-1

It explains it all! The title makes it sound like it is about code, but it is really about how a computer works (code is of course a part of it). It is very easy to read and does not really require any prior knowledge; it actually starts by explaining how a flashlight works and builds on that.

I simply can't describe how awesome it is, you should really read it!
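In the same bottom-up spirit as the book, here's a sketch of logic gates and a one-bit adder built from nothing but NAND; the Python is mine, not the book's presentation:

```python
# A single primitive, analogous to the book building everything from relays.
def nand(a, b):
    return 0 if (a and b) else 1

# Every other gate falls out of NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
# half_adder(1, 1) == (0, 1): one plus one is binary 10
```

Chain enough of these adders together and you have arithmetic, which is roughly the leap the book walks you through.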

u/mattandersen · 4 pointsr/compsci

You may be beyond this book, or it may not be full of the harder science of logic design but every CE or CS student should have a copy of CODE from Charles Petzold. It will probably fill in a lot of gaps of a formal classroom discussion of processor architecture, and it provides a great set of tools to explain the concepts to others. Which for me has always been the benchmark of understanding. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/notsointelligent · 4 pointsr/programming

If you like the video, you'll love the book!


u/chakke_ooch · 4 pointsr/mbti

> Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?

NFP creativity and novelty in the sense that Ne has free range, period? Sure, you get more of that in web design and even more of that as to step further and further away from the sciences. There is tons of creativity in real software engineering where you can be creative to solve actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity – or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin who wrote amazing pieces and abided by the rules of music theory wasn't being creative? Hardly.

> Are you a web dev?

No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.

> By hardcore I meant requiring being meticulous, detail oriented.

I think that the lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.

> One thing I've realized is how shit my high school was at explaining math conceptually, which I think led to misconceptions about its use in programming

Well, then read some books on computer science and/or mathematics like this.

That is an incredibly broad question. Without knowing what you've already studied, it's hard to recommend things. Most of the aerospace and mechanical engineers I know use pre-packaged programs rather than writing their own scripts, etc.

Artificial intelligence might be the best one, though. Russell and Norvig is the standard textbook: https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597

The plus side to learning about AI is that it is not really programming intensive - it's logic and statistics intensive.

If you want to go the programming route, it gets a little hairier. The reason is that advanced systems designs will take a lot of initial classes just to get you to a level where you are comfortable programming and can then think about design and program flow.

Take an intro course. I learned programming with C / C++ and Matlab. Recommend those since it's easier to blow your foot off when programming. Once you understand how to design programs, what functions are, how program control can be passed off, move over into Python (much easier to pick up and run with and much better supported).

You might also benefit from a databases or Big Data class due to the amount of data generated from an aircraft.

Regular expressions and scripting is another option. But that's good for anyone.
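To make the regex suggestion concrete, here's a small Python example of the kind of log scraping that comes up constantly in scripting (the log format is invented for illustration):

```python
import re

# Pull timestamps and messages for ERROR lines out of a (made-up) log.
log = """\
2021-03-04 12:00:01 ERROR engine overheating
2021-03-04 12:00:02 INFO telemetry nominal
2021-03-04 12:00:05 ERROR fuel pressure low
"""

# re.M makes ^ and $ match at every line, not just the string ends.
pattern = re.compile(
    r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (ERROR) (.*)$", re.M)
errors = [(ts, msg) for ts, _, msg in pattern.findall(log)]
print(errors)
# [('2021-03-04 12:00:01', 'engine overheating'), ('2021-03-04 12:00:05', 'fuel pressure low')]
```

Ten minutes of regex replaces an afternoon of hand-written string slicing, which is why it belongs on the list above.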

u/anon35202 · 4 pointsr/artificial

Does someone have a copy of the leaked self-driving car code they could post on GitHub?

Heck, even a reasonable implementation of Thrun's Simultaneous localization and mapping algorithm and embedded A star all wrapped in the AI code would be nice.

https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping

He talks about it in Chapter 25 section 3 of: https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597/ref=sr_1_1?s=books&ie=UTF8&qid=1487948083&sr=1-1&keywords=ai+a+modern+approach

He describes it in: https://www.udacity.com/course/artificial-intelligence-for-robotics--cs373

But he only describes how you would implement it, he doesn't hand out the finished code.

Gimme.
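For anyone curious about the A* half of that wish list, here's a compact, self-contained grid version in Python. To be clear, this is a generic textbook-style sketch, not anything from the leaked code or Thrun's course:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; 1 = wall. Manhattan-distance heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue  # already expanded via a cheaper route
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:  # walk parents back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None  # goal unreachable


grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The heuristic never overestimates on a unit-cost grid, so the first time the goal is popped the path is optimal; that admissibility argument is exactly what the textbook chapter covers.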

u/Soupy333 · 4 pointsr/Fitness

If you're interested in this stuff (and just getting started), then I highly recommend this book - http://www.amazon.com/Artificial-Intelligence-Modern-Approach-Edition/dp/0136042597

When you're ready to go deeper, then this one is even better http://www.amazon.com/Machine-Learning-Tom-M-Mitchell/dp/0070428077/ref=sr_1_2?s=books&ie=UTF8&qid=1341852604&sr=1-2&keywords=machine+learning

That second book is a little older, but all of its algorithms/techniques are still relevant today.

u/mhatt · 4 pointsr/compsci

I would repeat jbu311's point that your interests are way too broad. If you're interested in going into depth in anything, you'll have to pick a topic. Even the ones you mentioned here are fairly broad (and I'm not sure what you meant about concurrency and parallelization "underscoring" AI?).

If you want to learn about the field of natural language processing, which is a subfield of AI, I would suggest Jurafsky and Martin's new book. If you're interested more broadly in AI and can't pick a topic, you might want to check out Russell & Norvig (although you might also want to wait a few months for the third edition).

u/sciencifying · 4 pointsr/compsci

It is hard to answer this question without knowing your background. If you are really interested, I suggest you read this book (especially part three) on Artificial Intelligence so you can understand how automated theorem proving relates to AI. In my opinion, automated theorem proving is not a particularly interesting problem in modern artificial intelligence, since representing real-world problems using symbolic logic is almost always impractical.

However, the problem is still interesting for computer assisted theorem proving, and boolean satisfiability is a very important problem in the theory of computation.
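A quick illustration of why boolean satisfiability is the canonical hard problem: checking a candidate assignment is trivial, but the only fully general method known amounts to trying assignments. A brute-force Python sketch (exponential on purpose; real solvers are far cleverer):

```python
from itertools import product

def satisfiable(clauses):
    """Brute-force SAT for CNF formulas; literals are +var / -var integers."""
    variables = sorted({abs(l) for clause in clauses for l in clause})
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        # A clause is satisfied if any of its literals evaluates to True.
        if all(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(satisfiable([[1, 2], [-1, 3], [-2, -3]]))
# {1: False, 2: True, 3: False}
print(satisfiable([[1], [-1]]))  # None: x1 and not-x1 is a contradiction
```

Verification is linear in the formula size while the search space is 2^n, which is the gap the P vs NP question is about.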

u/Artaxerxes3rd · 4 pointsr/Futurology

Stuart Russell, the man who literally wrote the book on AI, is concerned.

Plenty of prestigious people on the cutting edge of the research in the field are concerned.

Just because you've only heard the household-name-level famous people talk about it, it doesn't mean that the genuine, in-the-thick-of-it experts aren't concerned either.

As for the 10~20 years figure, you're right that it is unlikely that AI will be made in that timeframe. However, the claim was merely that it is possible to create it with enough resources in that timeframe, which I think is reasonable. Since you care about what the experts think, here is a summary of the best information we have about when they think this will happen.

> Median estimates for when there will be a 10% chance of human-level AI are all in the 2020s (from seven surveys).

> Median estimates for when there will be a 50% chance of human-level AI range between 2035 and 2050 (from seven surveys).

___
AI: A Modern Approach is the best textbook on AI by far

u/smidley · 4 pointsr/Transhuman

This one was a pretty good read.
The Singularity Is Near

u/YoYossarian · 4 pointsr/technology

Here's one that I just ordered. It comes with a recommendation from Elon Musk as well. This is a subject Kurzweil discusses at length in his books, though his approach is far more optimistic. He avoids the cataclysm by saying humans and AGI will work together as one, but his point basically concedes humanity's destruction if we don't cooperate/merge.

u/hoolaboris · 4 pointsr/math

Concrete mathematics by Donald Knuth, Ronald Graham, and Oren Patashnik

u/Noamyoungerm · 4 pointsr/learnprogramming

You can program without math, but you'll run into limits. Even math at a high school level will totally change the way you look at and think about some parts of programming.

This is something that I can attest to personally because I began programming with a 3rd grade math level myself. I can't really say what part math had in my perspective on programming, because I was also in the process of growing out of third grade, but when math finally "clicks" somewhere between high school and college, you learn to tackle these things differently.

You can program without math, but if you know the math you'll have a better understanding of what you're doing. You'll look at some problem you're trying to solve and say "hey, that looks awfully similar to a set of equations, instead of trying to solve them inside my program I should solve them by hand and plug in the solution".

Algebra is the really important one. Calculus also doesn't hurt. Trig is a must only if you see yourself doing anything related to graphics or games in the future. I also highly recommend Concrete Mathematics, but to understand that text you'll have to have a solid grasp of calculus.
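A tiny example of the "solve it by hand and plug in the solution" idea (my illustration, not the commenter's): instead of numerically searching for where two lines cross, do the algebra once on paper and have the program evaluate the result.

```python
# Where do y = 2x + 1 and y = -x + 7 intersect?
# On paper: 2x + 1 = -x + 7  =>  3x = 6  =>  x = 2, y = 5.
# The general solution, solved once by hand, becomes a one-line function:
def intersection(m1, b1, m2, b2):
    """Intersection of y = m1*x + b1 and y = m2*x + b2 (requires m1 != m2)."""
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

print(intersection(2, 1, -1, 7))  # (2.0, 5.0)
```

No loops, no tolerance parameters, no convergence worries: the math did the work before the program ran.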

u/paultypes · 3 pointsr/programming

Common Lisp remains a touchstone. I highly recommend installing Clozure Common Lisp and Quicklisp and then working through Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp and Artificial Intelligence: A Modern Approach with them. Although I'm now firmly in the statically-typed functional programming world, this has been part of my journey, and it will change how you think about programming.

u/groundshop · 3 pointsr/artificial

Here's the course webpage for an intro AI course from a good professor on the topic

Good overall book on the topic (Russel &amp; Norvig - AI: A Modern Approach)

u/Rigermerl · 3 pointsr/rmit

I think they use this:

https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597

Decent book (the bible for AI apparently).

u/KnightOfDark · 3 pointsr/artificial

If you have a rudimentary understanding of algorithms, I would suggest Artificial Intelligence: A Modern Approach, by Stuart Russell and Peter Norvig. The book is comprehensive, well-written, and covers a wide area of different techniques and approaches within AI. Be aware that the book is written as a textbook, so do not expect philosophy or speculation inside, only what is possible and feasible given the current state of the art.

u/blindConjecture · 3 pointsr/MachineLearning

That was a phenomenal article. Extremely long (just like every piece of writing associated with Hofstadter), but excellent nonetheless. I'm admittedly sympathetic to Hofstadter's ideas, not the least of which because of my combined math/cognitive science background.

There was a quote by Stuart Russell, who helped write the book on modern AI, that really stood out to me, and I think expresses a lot of my own issue with the current state of AI:

“A lot of the stuff going on is not very ambitious... In machine learning, one of the big steps that happened in the mid-’80s was to say, ‘Look, here’s some real data—can I get my program to predict accurately on parts of the data that I haven’t yet provided to it?’ What you see now in machine learning is that people see that as the only task.”

This is one of the reasons I've started becoming very interested in ontology engineering. The hyperspecialization of today's AI algorithms is what makes them so powerful, but it's also the biggest hindrance to making larger, more generalizable AI systems. What the field is going to need to get past its current "expert systems" phase is a more robust language through which to represent and share the information encoded in our countless disparate AI systems. \end rant

u/weelod · 3 pointsr/artificial

piggybacking on what /u/T4IR-PR said, the best book to attack the science aspect of AI is Artificial Intelligence: A Modern Approach. It was the standard AI textbook when I took the class and it's honestly written very well - people with a basic undergraduate understanding of cs/math can jump right in and start playing with the ideas it presents, and it gives you a really nice outline of some of the big ideas in AI historically. It's one of the few CS textbooks that I recommend people buy the physical copy of.

Note that a lot of the field of AI has been moving more towards ML, so if you're really interested I would look into books regarding that. I don't know what intro texts you would want to use, but I personally have copies of the following texts that I would recommend

• Machine Learning (Murphy)
• Deep Learning Book (Goodfellow , Bengio)

and to go w/ that

• All of Statistics (Wasserman)
• Information Theory (Mackay)

for some more maths background, if you're a stats/info theory junky.

After all that, if you're more interested in a philosophy/theoretical take on AI then I think Superintelligence is good (I've heard?)

I can recommend this (free) course:
https://www.udacity.com/course/intro-to-artificial-intelligence--cs271

You certainly don't need a degree (it helps of course), but most of all you need dedication and perseverance.

In regards to math you need a good (more than)-basic understanding of statistics, linear algebra, algorithms and you also need to develop good data analysis skills.

If you want to get serious with AI this book is fantastic (at least it helped me a lot, and still does): https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597/ref=sr_1_2?ie=UTF8&qid=1506722436&sr=8-2&keywords=artificial+intelligence+a+modern

and by the way check out this thread maybe:

u/nimblerabit · 3 pointsr/compsci

I learned mostly through reading textbooks in University, but not many of the books we were assigned stood out as being particularly great. Here's a few that I did enjoy:

u/ideophobic · 3 pointsr/Futurology

Let's do some math.

My son was born this year. With an average life span of 76 years, he should most likely die by 2090. But, I will also make the assumption that in the years between 2014 and 2090 we will find ways to advance the average life span to a little bit longer, lets say 30 years. So now, his average life span is 106 years and the "death year' is extended to 2120.

But between 2090 and 2120, science will continue to advance and we will probably have a life expectancy of 136 years by then, which now makes his death year 2150. And so forth, until science finds a way to keep him alive forever. Even if it takes the better part of a century, some of the younger people will still be beyond the cutoff.

Now, if you actually talk to real scientists who have studied this in much more detail, they are saying that this will not take a century, and should take just a few decades to achieve "escape velocity" for immortality. There is a book written about this, and about how in the next few decades we will unlock the mysteries of aging. The Singularity Is Near: When Humans Transcend Biology

u/TehGinjaNinja · 3 pointsr/confession

There are two books I recommend to everyone who is frustrated and/or saddened by the state of the world and has lost hope for a better future.

The first is The Better Angels of Our Nature by Steven Pinker. It lays out how violence in human societies has been decreasing for centuries and is still declining.

Despite the prevalence of war and crime in our media, human beings are less likely to suffer violence today than at any point in our prior history. The west suffered an upswing in social violence from the 1970s to the 1990s, which has since been linked to lead levels, but violence in the west has been declining since the early 90s.

Put simply the world is a better place than most media coverage would have you believe and it's getting better year by year.

The second book I recommend is The Singularity is Near by Ray Kurzweil. It explains how technology has been improving at an accelerating rate.

Technological advances have already had major positive impacts on society, and those effects will become increasingly powerful over the next few decades. Artificial intelligence is already revolutionizing our economy. The average human life span is increasing every year. Advances in medicine are offering hope for previously untreatable diseases.

Basically, there is a lot of good tech coming which will significantly improve our quality of life, if we can just hang on long enough.

Between those two forces, decreasing violence and rapidly advancing technology, the future looks pretty bright for humanity. We just don't hear that message often, because doom-saying gets better ratings.

I don't know what disability you're struggling with, but most people have some marketable skills, i.e. they aren't "worthless". Based on your post, you clearly have good writing/communication skills. That's a rare and valuable trait. You could look into a career leveraging those skills (e.g. as a technical writer or transcriptionist) which your disability wouldn't interfere with too badly (or which an employer would be willing to accommodate).

As for being powerless to change the world, many people feel that way because most of us are fairly powerless on an individual level. We are all in the grip of powerful forces (social, political, historical, environmental, etc.) which exert far more influence over our lives than our own desires and dreams.

The books I recommended present convincing arguments that those forces have us on a positive trend line, so a little optimism is not unreasonable. We may just be dust on the wind, but the wind is blowing in the right direction. That means the best move may simply be to relax and enjoy the ride as best we can.

u/bombula · 3 pointsr/Futurology

Any futurist or regular reader of /r/futurology can rehearse all of the arguments for why uploading is likely to be feasible by 2100, including the incremental replacement of biological neurons by artificial ones which avoids the "copy" issue. If you're not already familiar with these, the easiest single reference is probably The Singularity is Near.

u/maurice_jello · 3 pointsr/elonmusk

Read Superintelligence. Or check out Bostrom's TED talk.

u/j4nds4 · 3 pointsr/elonmusk

> Really? It's still their opinion, there's no way to prove or disprove it. Trump has an opinion that global warming is faked but it doesn't mean it's true.

From my perspective, you have that analogy flipped. Even if we run with it, it's impossible to ignore the sudden dramatic rate of acceleration in AI capability and accuracy over just the past few years, just as it is with the climate. Even the CEO of Google was caught off-guard by the sudden acceleration within his own company. Scientists also claim that climate change is real and that it's an existential threat; should we ignore them though because they can't "prove" it? What "proof" can be provided for the future? You can't, so you predict based on the trends. And their trend lines have a lot of similarities.

> Also, even if it's a threat (I don't think so, but let's assume it is), how does putting it in your brain help? That's kind of ridiculous. Nowadays you can turn your PC off or even throw it away. You won't be able to do that once it's in your brain. Also, what if the chip decides to take control over your arms and legs one day? It's insane to say that AI is a threat but to plan to put it inside humans' brains. AI will change your perception input and you will be thinking you are living your life but in reality you will be sitting in a cell somewhere. Straight up some Matrix stuff. Don't want that.

The point is that, in a hypothetical world where AI becomes so intelligent and powerful that you are effectively an ant in comparison, both in intelligence and influence, a likely outcome is death just as it is for billions of ants that we step on or displace without knowing or caring; think of how many species we humans have made extinct. Or if an AI is harnessed by a single entity, those controlling it become god-like dictators because they can prevent the development of any further AIs and have unlimited resources to grow and impose. So the Neuralink "solution" is to 1) Enable ourselves to communicate with computer-like bandwidth and elevate ourselves to a level comparable to AI instead of being left in ant territory, and 2) make each person an independent AI on equal footing so that we aren't controlled by a single external force.

It sounds creepy in some ways to me too, but an existential threat sounds a lot worse. And there's a lot of potential for amazement as well. Just like with most technological leaps.

I don't know how much you've read on the trends and future of AI. I would recommend Nick Bostrom's book "Superintelligence: Paths, Dangers, Strategies", but it's quite lengthy and technical. For a shorter thought experiment, the Paperclip Maximizer scenario.

Even if the threat is exaggerated, I see no problem with creating this if it's voluntary.

u/mdd · 3 pointsr/TheTeslaShow
u/Mohayat · 3 pointsr/ElectricalEngineering

Read Superintelligence by Nick Bostrom; it answered pretty much all the questions I had about AI, and I learned a ton of new things from it. It's not too heavy on the math but there is a lot of info packed into it. Highly recommend it.

u/NotebookGuy · 3 pointsr/de

One example would be that it sees humans as an obstacle to its plan. There is this example of a machine built to optimize the production of paperclips. This machine could wipe out humanity indirectly, by covering the Earth with factories and making it uninhabitable for us. Or it could pursue it directly, reasoning: "Humans take up space that could hold factories. If they were gone, there would be more room for factories." Or it follows the logic below and has something entirely different in mind for humanity, which we - since it is, after all, superintelligent - cannot imagine, let alone comprehend:
> The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else. - Eliezer Yudkowsky

In other words: it needs no base motives to wipe out humanity. It simply carries out its task. Without regard for humanity.

To close, a book recommendation on the topic: Superintelligence: Paths, Dangers, Strategies

u/blank89 · 3 pointsr/Futurology

If you mean strong AI, there are many pathways for how we could get there. 15 years is probably a bit shorter than most expert estimates for mind scanning or evolution based AI. This book, which discusses different methods, will be available in the states soon:
http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111/ref=sr_1_1?ie=UTF8&qid=1406007274&sr=8-1&keywords=superintelligence

> We went from the horse and buggy to landing on the moon in 80 years

Past events are not necessarily good indicators of future events. In this case, faster computers are a mechanism for bringing about AI faster. How much faster we get in how much time will probably be the influencing factor in all this. There is quite a bit of uncertainty surrounding whether that will be post-silicon or not. We don't have post-silicon computing up and running yet.

The other factor may be incentive. Maybe specific purpose AI will meet all such demand for the next 20 years, and nobody will have any incentive to create strong AI. This is especially true given the risks of creating strong AI (both to the world and to the organization or individual who creates the AI).

u/DisconsolateBro · 3 pointsr/Futurology

> Given what Musk does with other technologies, he is by no means a luddite or a technophobe. He's seen something that's disturbing. Given the guy's track record, it's probably worth investigating

I agree. There's also a point to be made that one of the recent books Musk mentioned he read in a few interviews (and was acknowledged by the author in, too) was this http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111

I started reading it a few nights ago. It's painting an interesting picture about the future of AI. I'm looking forward to finishing it to discuss further

u/rubbernipple · 3 pointsr/Showerthoughts

Someone else beat me to it. Here you go.

u/ImNot_NSA · 3 pointsr/technology

Elon Musk's fear of AI was amplified by the nonfiction book he recommended called Superintelligence. It is written by an Oxford professor and it's scary: http://www.amazon.com/gp/aw/d/0199678111/ref=mp_s_a_1_cc_1?qid=1414342119&sr=1-1-catcorr&pi=AC_SX110_SY165_QL70

u/dangkhoasdc · 3 pointsr/compsci

Concrete Mathematics - one of the best textbooks to study discrete math

u/bonesingyre · 3 pointsr/coursera

Just my 2 cents: The Stanford Algorithms class is more about designing algorithms. The Princeton Algorithms class is more about implementation and real world testing.

The FAQ at the bottom:

How does Algorithms: Design and Analysis differ from the Princeton University algorithms course?

The two courses are complementary. That one emphasizes implementation and testing; this one focuses on algorithm design paradigms and relevant mathematical models for analysis. In a typical computer science curriculum, a course like this one is taken by juniors and seniors, and a course like that one is taken by first- and second-year students.

As a computer science student, I would encourage you to pick up a book on Discrete Mathematics, and pick up Robert Sedgewick's Algorithms textbook. Sedgewick's Algorithms book is more about implementing algorithms, compared to CLRS, which is another algorithms textbook written by some very smart guys. CLRS is far more in depth.

I took a Data Structures and Algorithms class recently; we used Sedgewick's textbook. I will be taking another Algorithms & Design class later using CLRS.

Books:
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_1_1?s=books&ie=UTF8&qid=1372267786&sr=1-1&keywords=discrete+mathematics
http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_1_1?s=books&ie=UTF8&qid=1372267775&sr=1-1&keywords=algorithms
http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_1_1?ie=UTF8&qid=1372267766&sr=8-1&keywords=clrs
http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X/ref=sr_1_1?s=books&ie=UTF8&qid=1372267798&sr=1-1&keywords=theory+of+computation

The last book is super important for CS students; I would read it front to back as well.

Computer Science is a pretty big field, so "strong foundation" can mean different things to different people.
You will definitely want the following:

1. Introduction to Algorithms and Data Structures
2. Introduction to Computability
3. Introduction to Operating Systems

For algorithms and data structures, a very commonly used textbook is Cormen.
For computability, Sipser.

For Operating Systems, I don't remember the usual textbook off the top of my head.
That said, you are probably much better off finding a high-quality university course that is based on these textbooks instead of trying to read them cover-to-cover yourself. Check out lecture series from places like MIT on youtube or whatever.

After that, you can take an Intro to Artificial Intelligence, or Intro to Communication Networks, or any other intro-level course in a more specific sub-area. But if you lack a basis in computability to the point where you don't know what an NP-Complete problem is, or have no idea what a Binary Search Tree is, or do not know what an Approximation Algorithm is, then it would be hard to say you have a strong foundation in CS.
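To make one of those checkpoints concrete: a Binary Search Tree is small enough to sketch from scratch. Here is a minimal, illustrative Python version (Python chosen just for illustration; it is not from any of the books mentioned):

```python
# Minimal Binary Search Tree: insert and lookup only.
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    # Returns the (possibly new) root of the subtree.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def contains(root, key):
    # Walk down the tree, going left or right by comparison.
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
```

Lookups then take time proportional to the tree's height, which is the property the textbooks analyze in depth.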
u/dionyziz · 3 pointsr/cryptography

Hi,

You should already know most of the math you need from your math major. It helps to know number theory, group theory, and algebraic curves, depending on what you do. Discrete mathematics is also important: discrete probability theory, Markov chains, graph theory, logic, proof methods, solving recurrences, etc. are all helpful tools.

In terms of computer science, it's imperative to know how to program. You can learn a programming language such as Python. Project Euler is a good place to start for a mathematician. Knowledge of algorithms is also important, and you must understand computability and complexity theory.
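For example, Project Euler's first problem (summing the multiples of 3 or 5 below 1000) is a classic warm-up in exactly that mathematician-friendly style, and fits in one line of Python:

```python
# Project Euler, problem 1: sum of all multiples of 3 or 5 below 1000.
total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(total)  # 233168
```

Later problems quickly start rewarding the number theory and algorithms knowledge mentioned above.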

Stanford's Cryptography I is a good place to start learning cryptography. I think you can go ahead and start the course without attempting prerequisites and see where you get stuck to go and learn what is required of you.

u/maruahm · 3 pointsr/math

I think learning proofs-based calculus and linear algebra are solid places to start. To complete the trifecta, look into Arnold for a more proofy differential equations course.

After that, my suggestions are Rudin and, to build on your CS background, Sipser. These are very standard references, though Rudin's a slightly controversial suggestion because he's notorious for being terse. I say, go ahead and try it, you might find you like it.

As for names of fields to look into: Real Analysis, Complex Analysis, Abstract Algebra, Topology, and Differential Geometry mostly partition the field of mathematics with corresponding undergraduate courses. As for computer science, look into Algorithmic Analysis and Computational Complexity (sometimes sold as a single course called Theory of Computation).

u/waxxxd · 3 pointsr/gamedev

Hmm was just looking at this book today, can't vouch for it but might be worthwhile.

Mathematics for 3D Game Programming

u/gunnar_osk · 3 pointsr/gamedev

"I've never tried graphics programming (OpenGL or otherwise), but sure... this post looks intriguing"

"Looks like a well written and informative tutorial, but I don't know most of the stuff he's writing about"

[Goes down the rabbit hole of OpenGL information]

"Damn it, now I HAVE to learn OpenGL from the start. Been looking for an excuse to brush up on my C++ skills anyways."

[Bookmarks the webpage]

"I wonder if I need to brush up on my math skills too? Oh well, guess I'll cross that bridge when I come to it"

[Thinks of that book I bought that's collecting dust on the bookshelf]

:)

u/TurkishSquirrel · 3 pointsr/learnprogramming

It depends a bit on what areas you're interested in. For interactive graphics you'll likely do OpenGL or DirectX or such.
Non real-time graphics usually means ray tracing or some variant like photon mapping where you want to produce physically correct images, with flexibility depending on your art direction e.g. Big Hero 6. With ray tracing you're essentially simulating how light interacts in the scene.

Here's some useful books/links for real time graphics:

• Real-Time Rendering: this is a great book covering a lot of theory/math topics behind real-time graphics techniques, so it's agnostic to whatever rendering API you use. The book's website lists more graphics-related resources and is quite good.
• OpenGL Superbible: a good book focusing on OpenGL, written for beginners with the API.
• open.gl: very good introductory tutorials for OpenGL; I just wish it covered some more content. Should give you a solid start though.

Here's some for ray tracing:

• Physically Based Rendering: this is basically the book for ray tracing; the 3rd edition should be coming out this spring though, so if you want to save some money you could wait a bit. There's also a website for this book.

For general math topics I also recently picked up Mathematics for 3D Game Programming and Computer Graphics which looks very good, though I haven't gone through it as thoroughly.

As mentioned already /r/GraphicsProgramming is a good subreddit, there's also /r/opengl for OpenGL questions.
u/johnnyanmac · 3 pointsr/gamedev

Personally, I used this book to refresh myself on the basic vector math and finally understand some 3D linear algebra concepts. It probably goes a bit deeper than you'd ever need if you're using an engine (how 3D transformations work at the matrix level, quaternions, polar mathematics), but the book uses extremely accessible language to explain everything, so you rarely feel confused like you do with your typical math textbook.

I haven't read it, but this book is the standard that people typically refer to for gamedev math. If you want to be experimental, the same author just released the first part of a series on game engine development. While it ultimately goes in a different direction, the first book there should cover the important math, and it is under half the length of the other books.
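As a tiny taste of what "matrix level" means in practice, here is a standard-library-only Python sketch (my own toy example, not from either book) of the row-vector vs column-vector conventions those texts spend time on:

```python
# Toy sketch of the two multiplication conventions for transforms.

def mat_vec(m, v):
    # Column-vector convention: v' = M * v
    return [sum(m[r][c] * v[c] for c in range(len(v))) for r in range(len(m))]

def vec_mat(v, m):
    # Row-vector convention: v' = v * M
    return [sum(v[r] * m[r][c] for r in range(len(v))) for c in range(len(m[0]))]

def transpose(m):
    return [list(col) for col in zip(*m)]

# 90-degree rotation about the origin, written for the column-vector convention.
R = [[0, -1],
     [1,  0]]
p = [1, 0]

col = mat_vec(R, p)             # column-vector convention -> [0, 1]
row = vec_mat(p, transpose(R))  # row-vector convention needs the transpose -> [0, 1]
```

The point of the exercise: the same rotation needs a transposed matrix under the opposite convention, which is exactly the row-major/column-major confusion the books help you untangle.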

u/othellothewise · 3 pointsr/opengl

I would recommend Mathematics for 3D Game Programming and Computer Graphics. It has all the derivations plus covers a whole lot of other useful topics. It's well worth the $45. Another approach is to go through the math manually, by taking points and projecting them and trying to understand the behavior.

u/ebonyseraphim · 3 pointsr/gamedev

That is a great book with a math primer and continued coverage as you get deeper into the specifics of collision detection. I also own this, which does a better job teaching plain math relevant to game dev and is agnostic about whether it applies to collision, physics, or rendering: http://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869/

I highly recommend it. It's well ordered and well written. It is the quality you'd expect from something you pay for and will save you time with its completeness and clarity.

u/frizzil · 3 pointsr/VoxelGameDev

Agreed 100%. Additionally, if you're trying to learn basic OpenGL, Java combined with LWJGL is actually a great choice, since the language is generally quick to iterate with. And definitely go with the advanced pipeline, as learning immediate mode isn't going to help you much if advanced is your end goal.

Also, big piece of advice: you're really going to want a solid understanding of 3D matrix math before diving in. In particular, you're going to want to know the difference between row-major and column-major systems, and how to perform basic manipulations in both. To this end, I highly recommend the book Mathematics for 3D Game Programming and Computer Graphics.

u/timostrating · 3 pointsr/Unity3D

TL;DR: Take a look at spaced repetition. Study without any music and use the absence of music as a check to see if you are still motivated to do your studying.

I fucked up the first part of my education too. Luckily I realized that and got motivated again before I finished school. I am currently 19 years old and I also always loved math (and some physics).
I am from the Netherlands, so our education system does not really translate well to English, but in high school I basically only did things that interested me. I got low grades on everything else. One moment in high school really stayed with me, and only now have I finally realized what was happening. I had a course on the German language. I already had a low grade in that class, so I told myself to study extremely hard for the next small exam. The exam was pretty simple: the task was to learn 200 to 250 German words. So I took a piece of paper and wrote down all 250 words 21 times. One or two days later I took the exam. But when I got my grade back, it said I had scored an F (3/10). I was totally confused, and it only destroyed my motivation more and more. What I have since come to realize is that learning something is not just about smashing a book into your head as fast as possible.

So these are some tips I wish I could have given myself in the first year of high school:

Go and sit in a quiet room or in the library. The room should be totally silent. Now start studying. As soon as you feel the urge to put on some music, stop, reflect, and take a little break. The default in nature is chaos; learn to use this to your advantage.

I sit on a bus for 2+ hours a day, one hour to school and one hour back. Nearly every student does nothing in that time, so I made a rule for myself to do something productive in it, like reading a book. Normally I am just at my desk at home and before I know it it is already midnight, so this is at least a really good way to force myself to start reading a book in those otherwise wasted 2 hours.

Get to know your body and brain. I personally made a bucket list of 100 items that includes 10 items about doing something for a week, like running at 6am for a week or being vegan for a week. Fasting is also really great. Just do it for 1 day.
So only drink water for one day and see how you feel. And try the same with coffee, sex, fapping, and alcohol: quit for one day and see how you feel, and have the goal of quitting once for a whole week straight. Watch this video to get a new view of the difference between low and high energy. I never understood this, but I think everybody should know about the difference: https://youtu.be/G_Fy6ZJMsXs <-- sorry, it is 1 hour long, but you really should watch it.

Learn about how your brain stores information and how you can improve upon it. Spaced repetition is one of those things that really changed the way I now look at learning something: https://www.youtube.com/watch?v=cVf38y07cfk

I am currently doing my high school math again for fun. After I am done with that, I hope to start with these 3 books.

u/SpaceToaster · 3 pointsr/gamedev

http://www.amazon.com/gp/aw/d/1435458869 is always on my desk (I have the 2nd edition). A great cookbook of useful formulas and algorithms for graphics programming. Not really a reference for game-logic-level algorithms though.

u/Denis_Vo · 3 pointsr/algotrading

I would highly recommend reading the following book: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=mp_s_a_1_2?keywords=machine+learning&qid=1566810016&s=gateway&sprefix=machi&sr=8-2

I think it is the best one about ML/DL. Not sure whether they have already updated the TensorFlow examples to TF 2.0 and Keras. And as TensorFlow includes Keras now, and has a perfect pipeline for deploying your model, I think it is the perfect choice. :)

u/raijenki · 3 pointsr/brasil

Look, I can't say much about the quality of the courses. The big thing is the name you'll put on your résumé: Estácio is practically synonymous with mass-market higher education, which won't make you stand out to whoever evaluates you later.
On the other hand, I don't know PUC-MG, but I've heard of PUC-SP, PUC-RJ, and PUC-RS, which are generally good schools, and I'd probably think the same of PUC-MG. That would add more value for you.

About the curriculum itself:

• Estácio: The course is more focused on business (what people call "Analytics") than on Data Science itself. Subjects like "Career Guidance", "Corporate Governance", "Consulting", "Sustainable Development", and "Corporate Finance", among others, make up a good part of the course and, to me, are a waste of time: if you want to study business, go to a business school and take a business course. There is no mention of machine learning in the course ("Advanced Analytical Theories"? "Advanced Technologies"? What the hell are those?).

• PUC-Minas: Unlike the above, the course is a mix of data engineering and data science. You will supposedly learn Hadoop, Spark, RDBMS, Python, R, and the more traditional algorithms. It's too much material for too little time.

If I had to choose, I'd go with PUC. But do you want to really learn? Take Andrew Ng's Machine Learning course on Coursera and get the certificate; it's worth far more than either of those and has a hands-on approach. Andrew is a Stanford professor with great teaching skills. And if you want to learn for learning's sake, buy this book: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=sr_1_1?crid=35OQXMZA5U4H6 The new edition comes out on the 5th of this month.

u/brandonhotdog · 3 pointsr/videos

My previous video on AI did go into a lot more depth on neural networks, but still wasn't enough to build your own. (There's only 2 vids on my channel so far, so it's just the other one.) I will definitely consider making a video for my fellow devs on AI, but for now you should check out https://www.youtube.com/watch?v=32wtJZ3yRfw&list=PLX2vGYjWbI0R08eWQkO7nQkGiicHAX7IX if you want to learn how to implement AI into a Unity3D project.
If Unity3D isn't for you, then you can read: https://www.amazon.co.uk/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=sr_1_3?keywords=machine+learning+hands+on&qid=1563498470&s=gateway&sr=8-3

Also thanks for the support!

u/Jaydii- · 3 pointsr/learnmachinelearning

Hi,

Not trying to sell anything here, but I've been reading this book: https://www.amazon.ca/Aur%C3%A9lien-G%C3%A9ron/dp/1491962291/ref=mp_s_a_1_1?keywords=machine+learning&qid=1568912246&sprefix=machibe+l&sr=8-1

I think it is a pretty complete book that covers a lot and is a good first step into ML. Plus there are plenty of examples using Python.

GL

u/DecisionTreeBeard · 3 pointsr/datascience

u/PostmodernistWoof · 3 pointsr/MachineLearning

+1 for top-down learning approaches. There's so much work going on to democratize the use of ML techniques in general software development that, depending on where you want to go, there's little need to start with the classic theory. IMHO, the classic ML literature suffers a bit from decades of theorists who never had the computing resources (or the data) to make big practical advances, and it tends to be overly dense and mathematical because that's what they spent their time on.

But really it depends on your goals. Which category do you fall into?

1. Get a PhD in math, study computer science, get a job as a data scientist at Google (or equivalent) and spend your days reading papers and doing cutting-edge research in the field.
2. Learn classic and modern ML techniques to apply in your day-to-day software development work, where you have a job title other than "data scientist".
3. You've heard about Deep Learning and AlphaGo etc. and want to play around with these things and learn more about them without necessarily having a professional goal in mind.

For #1 the Super Harsh Guide is, well, super harsh, but has good links for the bottom-up mathematical approach to the whole thing.
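For readers in categories #2 or #3, the hands-on flavor can be tasted with nothing beyond the standard library. As a toy stand-in for what the hands-on books automate with scikit-learn, here is a from-scratch ordinary-least-squares line fit (my own illustrative sketch, not code from any of the books mentioned):

```python
# Ordinary least squares for a line y = a*x + b, standard library only.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form slope: covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]       # exactly y = 2x + 1
a, b = fit_line(xs, ys)    # recovers a = 2.0, b = 1.0
```

Libraries like scikit-learn wrap exactly this kind of fitting (plus regularization, validation, and much more) behind a uniform `fit`/`predict` interface.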
For #2 you should probably start looking at the classic ML techniques as well as the trendy Deep Learning stuff. You might enjoy https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291 as a place to start, and you can immediately start playing around with stuff.

For #3 any of the TensorFlow getting-started tutorials are good, along with all of Martin Görner's machine learning/deep learning/TensorFlow "without a PhD" videos on YouTube. Here's one of the more recent ones: https://www.youtube.com/watch?v=vaL1I2BD_xY

u/onmnit · 3 pointsr/neoliberal

u/ExternalInfluence · 3 pointsr/videos

Not really. We're talking about a machine with an intelligence that makes us look like ants, more capable than us at everything we do, including manipulation of human beings.

u/Colt85 · 3 pointsr/artificial

The only book I'm aware of would be this modern classic: https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742

You may find r/controlproblem helpful. If you find any other books, I'd love to hear about them.

u/Terkala · 3 pointsr/suggestmeabook

Superintelligence: Paths, Dangers, and Strategies. The book lays out exactly how (potentially) screwed we are as a species if AI development is not careful. And ways to control a potentially species-endingly-powerful AI.

u/Ari_Rahikkala · 3 pointsr/Games

I've read a lot of what people have said about AI risk, and so far there have been a few people who have indicated that they have a good understanding of the argument being made and have proposed counterarguments that display their understanding. There's Russ Roberts, who argues that even a superintelligence can't actually go that far in being able to manipulate the world (a reasonably compelling argument IMO, but make sure that when you're thinking "superintelligence" you're actually visualizing something of the proper scale). There's Ben Goertzel, who says...
quite a lot of things, actually, though what stuck with me the most was that the reward-maximizing view of AI that Nick Bostrom and Eliezer Yudkowsky and others use (and that makes the orthogonality thesis seem so compelling) looks very little like practical AI development as it is done now or expected to ever be done, making it rather suspicious even as an abstract model. There's Robin Hanson who had a lengthy debate with Yudkowsky, but the core of his argument seems to be that there's little evidence that the kind of growth rate that would make an AGI dangerous is actually achievable. tl;dr: There are a lot of people who understand the AI risk argument and have compelling counterarguments to it. These three are just the ones that have impressed me the most so far. But you? Well, I'm sorry, I would like to be charitable, but you said it yourself: "But what about TEEEEEEEEEEEEERMINATOR?". You have not noticed that an argument different from what you expect to hear has been made. I'd tell you to go pick up Bostrom's Superintelligence: Paths, Dangers, Strategies or Yudkowsky's Rationality: From AI to Zombies but, well, I've never actually read either of these books, so it would be a bit of an odd recommendation to make (I read the LW sequences when they came out and have never heard anyone mention anything essential in those books that wasn't in the sequences). Oh well. FWIW Goertzel says that Yudkowsky's book is the one that makes the essence of the point clear and doesn't try to weasel out of counterarguments. (For those who have never heard of any of the names in this post, http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html is a fairly popular, comparatively short introduction to the basic idea being talked about. Although, well, I haven't read that one, either. No, seriously, on that side you get the same argument from every source; there's not much point in reading the other people saying the same thing after you've read Yudkowsky.)
u/Jigxor · 3 pointsr/gamedev If you like books, I've been reading this one lately: Programming Game AI by Example: http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782 I'm really enjoying it and it has a tonne of useful information. Read some of the pages on the Amazon site and see if it sounds interesting for you. u/Orthak · 3 pointsr/mylittleandysonic1 Unity is the bee's knees. I've been messing with it casually for several years, and got serious in the last 2-ish years. I like it because I get to use C#, and that's the language I know best. The only problem is it's using some weird limbo version of .NET 2, one that's not actually 2.0 but is also 3.0 in some places? I think it's because it's using Mono 2.0, which is some subset of .NET. It's weird. They're moving to 4.5 soon anyways so I'm hype for that. It's been a lot of fun regardless; I get to apply a different knowledge and tool set than at my day job. Not to mention it feels great when you actually get something to build and actually work. So anyways here's a list of resources I've found over the years to be super helpful: Things on Reddit u/SgtBursk · 3 pointsr/gamedev Do the enemies ever hit the player? Perhaps you can try scaling the player's health to how often enemies land attacks. (For instance, instead of Zelda's 6+ half-heart system, use old school Mario's 'one hit you die unless powered up' system. Or Super Meat Boy's one-hit KO.) > the player could just roll out of the way and beat the shit out of them while the enemy's attack was missing. Sounds like fun to me! But I take it you're going for a different feel? I'm currently reading Programming Game AI by Example. It goes from basic AI movement to pathfinding to goal hierarchies. The majority of the book might be a little too introductory for you though, at least until you get to the goals section.
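The state-driven agents that Programming Game AI by Example opens with can be sketched compactly. Below is a minimal finite-state-machine sketch in Python in the spirit of the book's early chapters; the `Guard` states and thresholds are invented for illustration, not taken from the book.

```python
# Toy finite state machine for a game agent. The agent's current state
# decides both its behaviour this tick and which state comes next.
class Guard:
    def __init__(self):
        self.state = "patrol"   # start out patrolling
        self.fatigue = 0

    def update(self, enemy_visible):
        if self.state == "patrol":
            self.fatigue += 1           # patrolling is tiring
            if enemy_visible:
                self.state = "attack"   # seeing an enemy overrides fatigue
            elif self.fatigue > 3:
                self.state = "rest"
        elif self.state == "attack":
            if not enemy_visible:
                self.state = "patrol"   # lost sight of the enemy
        elif self.state == "rest":
            self.fatigue -= 2
            if self.fatigue <= 0:
                self.state = "patrol"   # rested, back to work

guard = Guard()
for see_enemy in [False, False, True, True, False]:
    guard.update(see_enemy)
    print(guard.state)   # patrol, patrol, attack, attack, patrol
```

The book builds this same idea out into state classes with enter/execute/exit methods and then into goal hierarchies, but the transition table above is the core of it.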
u/exeter · 3 pointsr/compsci While not free, Knuth, Graham, and Patashnik's Concrete Mathematics is quite cheap on Amazon.com, and, if you work through it diligently, you'll have more than enough mathematics to tackle CS through the beginning graduate level. u/yellowstuff · 3 pointsr/programming There's a whole section on this problem in Concrete Mathematics. The first edition was published in 1989, and the problem was old then. u/zoredache · 3 pointsr/learnpython What you might want to do is pick up a book on design patterns. I am not aware of any good ones specifically aimed at python though. Learning the mechanics of doing OOP in python really isn't that hard. Learning how to design your objects is the tricky bit. A book on design patterns basically is just a cookbook of things that work. It is about giving you practical examples of when you would use OOP. Here are a couple of books I have on the subject and have found useful. Reading about design patterns was basically what let me actually understand the point of OOP. u/thescientist13 · 3 pointsr/angularjs There’s really no meaningful content, it’s all just very superficial IMO. You just keep saying what a pattern is, list a few at the end, and then you don’t explain them or even give examples. Lastly, there's no relevance to AngularJS in the article, which is the sub you’re posting in. As far as AngularJS goes, I would recommend most people start with John Papa’s styleguide. https://github.com/johnpapa/angular-styleguide/blob/master/a1/README.md As far as JavaScript design patterns go, I would recommend this book. https://addyosmani.com/resources/essentialjsdesignpatterns/book/ For learning about design patterns and a larger set of them in general, you can’t beat the Gang of Four https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612 u/pipocaQuemada · 3 pointsr/learnprogramming Read the GOF design pattern book.
For better or for worse, you can't write intermediate Java if you aren't familiar with the basic patterns of Java code u/Neres28 · 3 pointsr/learnprogramming Sounds like what you've done is real/serious programming, just on a smaller scale. I wouldn't worry too much about not knowing various frameworks. We use Spring to inject dependencies into our client code, we use Glassfish as our application server, and we use JPA backed by EclipseLink as the glue to our data layer. However, when it comes down to it, it's mostly about writing plain old Java objects. For the rest: • I've heard that the Head First design patterns book is very good, but haven't read it myself. I've always gone to the GOF, but I do find it dry and hard to understand at times. • For algorithms I used Cormen et al. in college and still refer back to it. But I don't really suggest it outside of a classroom. Rather than further enumerate my reading list, I'll point you to the Programming FAQ, an excellent resource for exactly the questions you're asking. u/bwainwright · 3 pointsr/learnprogramming Nope. I've been a professional programmer for 20 years and I still don't formally know design patterns. What I mean by "formally" is that I naturally use more basic patterns without even realising it. However, recently I've moved into software architecture for my company, and am finding that there are more and more reasons to learn and understand patterns, as they help design complex systems. Definitely focus on programming first. Patterns will come. Also, a tip - people will always recommend the Gang Of Four book (https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612) for design patterns. And it is the gold standard book for Design Patterns. However, in my opinion, it is an incredibly 'dry' read.
Head First Design Patterns (https://www.amazon.com/Head-First-Design-Patterns-Freeman/dp/0596007124/ref=sr_1_1?s=books&ie=UTF8&qid=1466670651&sr=1-1&keywords=head+first+design+patterns) is a much better introduction to design patterns for most people. u/lodc · 3 pointsr/ITCareerQuestions Well I'm 40 and didn't really "get" CS until I was in my early 30s, despite doing it for 10+ years prior. Once you turn the corner or whatever it is, things just make sense. For me what helped was going back to the concepts called "computational thinking", as described here https://www.cs.cmu.edu/~CompThink/ I understood the meanings of words like abstraction, recursion etc but didn't really know them as well as I needed to, and didn't know how to see these same basic ideas everywhere, take advantage etc. Also learning some design patterns such as explained in the gang of four book can be eye opening. edit - book https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612 u/GeminiVI · 3 pointsr/csharp > learn these patterns practically? Maybe for now you might want to just read up on them and keep them in mind for the future. For further reading on design patterns pick up "Design Patterns: Elements of Reusable Object-Oriented Software". When you start building something plan it out beforehand, preferably with pen(cil) and paper. Once you have some rough idea, see if you can recognize any place where design patterns would come into play. Sometimes they will be intuitively incorporated into your design process, or they will emerge naturally if you try and think holistically about a problem instead of moving your project along piecemeal. > My goal eventually would be to land an entry C#/.NET position at some software company. I would encourage you to learn ASP.NET MVC 5, a web framework like Angular or React, and Entity Framework.
You may just want to learn the "Core" versions of ASP.NET and Entity Framework and then go back later. I've taken this course on udemy on ASP.NET MVC 5, which I would recommend. The same author has put together another course on ASP.NET Core, AngularJS, and Entity Framework which I want to take soon. It looks like they are sequential given the requirements for each course, so you should take the MVC 5 course before you take the ASP.NET Core course. If you created a Github for a sample website (say a pizza shop where users can register, login, and order) and hosted it on Azure to show prospective employers, that could very well get your foot in the door. u/LuminousP · 3 pointsr/Physics Sadly, for a large part of the industry, Java and C# are the standard. .Net still has a huge marketshare for developers, but Java is gaining quickly. Get proficient in either of these and some Agile development practices, as well as design patterns. The Gang of Four book is the standard design pattern book. I don't know off the top of my head what the best books to get on Java and C# are, but there are huge threads on StackOverflow that will point you in the right direction. Good luck, my friend! u/thehollyhopdrive · 3 pointsr/java Two Java specific books you should read cover to cover (and keep around as an effective reference) are Effective Java and Java Concurrency in Practice, and you should also seriously consider reading Design Patterns. The examples in it aren't written in Java but they hold for all OO languages. u/N0N-Available · 3 pointsr/learnprogramming This is one of those questions that basically yields no useful answers. • On the how part, it depends. Generally you would analyze your requirements and problem and find out that they fit problems described by one or more design patterns. So you apply these design patterns and modify them as you see fit until you think you've covered all the bases on a higher level.
Now you have a fuzzy picture of the house you want to build and where the general walls and hallways are. You might have a kitchen and bedroom marked that you know will be for a certain purpose, but you don't have anything specific in each room yet (hope the analogy makes sense). • The next step will be the more difficult part, which is fine-tuning and validating your blueprint. Idk how other people do it, but I prefer to run through each requirement and see how it fits into each "room". This process usually exposes ambiguity in the architecture. You want to repeat the process until it is clear how each requirement will be handled. It's a good idea to also anticipate how the requirements might change or expand in the future when validating your architecture. • In the case of Django they basically did most of that for you already. If that's all you work on then I'm not sure what you are asking; you could look into data structures topics if you want better data model designs. Front end is pretty flexible; it usually depends on the framework you pick, so follow their design style. For a server side application that supports your front end, it would depend on your problem/requirements. Like I said, this is one of those questions that yields no useful answers since it's essentially a design question, and design is very subjective and a broad topic. As far as UML or whatever other tools/diagrams you choose to use, it really doesn't matter as long as they convey your design clearly and there's no ambiguity. Not sure if I answered your question. You can check out topics on design patterns; there's a good book on Amazon for that, I think just called Design Patterns. Or check out tutorialspoint.com for software architecture. Design Patterns Book Software Architecture Quick Overview edit: added resources and changed format
I highly recommend: u/Ravilan · 3 pointsr/node IMO you should go beyond learning node. Learn javascript (node is just javascript with core libraries). Actually don't learn javascript, learn how to code. Even more, learn how to learn. I'm not the only one with this opinion: http://blog.codinghorror.com/please-dont-learn-to-code/ Also a side point: learn how to test. Testing (be it unit, behaviour, …) is really important. It helps you code better and have stronger code (fewer bugs, or at least more easily identifiable ones) that is more maintainable, and so on. If you know the basics of code, and how to learn, not only will you know node, but you'll potentially know php, ruby, scala, whatever. I strongly encourage you to read: u/cyrusol · 3 pointsr/learnprogramming Algorithms? Something like hackerrank.com, because there you are supposed to create more complex algorithms from the "simple" ones you learn from books or perhaps in college or from Wikipedia/googling/free online resources. Design patterns? That book specifically, and anything from Robert C. Martin, but preferably Clean Architecture. If you are poor there are probably PDFs/epubs somewhere on the web, but consider buying the books when they make you rich. u/ThereKanBOnly1 · 3 pointsr/csharp So I don't think you should get too hung up on "enterprise architecture" at the moment, partially because you're still very early in your career, but also because enterprise architecture means a lot of things to a lot of different people. At this stage in your career, I really think you should focus mainly on SOLID code, core Object Oriented design concepts, and then understanding patterns. Good architectural strategies are built around all of those concepts, but they're also much much more than that. For SOLID code, one of my favorite references is actually Dependency Injection in .Net by Mark Seemann.
Although he does spend a good amount of time on DI, the recommendations that Mark makes for how to properly structure your code in order to take advantage of DI are very useful in understanding SOLID oriented design principles in general. The examples and code really work through the concepts well, so you get a great explanation followed by some well thought out code. Clean Code by Uncle Bob is a great reference on how to structure well thought out code that touches on some architectural principles, but doesn't have that as the main focus of the book. Many of the topics in the book you'll find need to be addressed throughout a codebase. As far as design patterns (which are different than architectural patterns), I don't think you can go wrong with the original Gang of 4 book, although I personally have a C# specific version, C# Design Pattern Essentials. I don't want to put too much emphasis on design patterns, because sometimes they get overused and applied too literally, but they are still very useful. I think the key to design patterns is not just knowing what they are, but determining where they might be applicable to your use case, and whether you should make any small adjustments or tweaks to them. After you really have a rock solid base of working with code, then you can shift your focus to more architectural concerns. For that, it really depends on what problem you're looking to solve, but Domain Driven Design (DDD) is a good way of understanding those problems and structuring the solutions in a well thought out, loosely coupled, and evolvable manner. That "central framework" that you referenced in your post is the business logic, and it's the key focus of DDD
You can google the design patterns though, but I like to have a copy of the book anyway. u/ntr0p3 · 3 pointsr/AskReddit By biology I don't mean what they teach you in college or med-school; I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of organs or parts of a cat is completely worthless; understanding the process of gene-activation, and how that enables living organisms to better adapt to their environments, for instance stress factors activating responses due to new stimuli, can be very valuable, especially as a function of applied neurology. Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software. Economics and politics are controversial, but it is seeing the underlying forces that is important, similar to not understanding how gravity works, but still knowing that dropping a lead ball will accelerate downwards at 9.78m/s^2. This is a field that can wait till later though, and probably should. For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything. I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is. For CS I highly recommend the dragon book, and design patterns, and if you need ASM The worst designed website ever. For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :( Best of luck in your travels however! :) edit: For physics, if your math is bad get both of his books. They break it down well. If your math is better try one of Witten's books, but they are kinda tough, guy is a fucking genius.
also, Feynman's QED is great, but his other book is awesome just as a happy intellectual read. Also try to avoid both Kaku and Hawking for anything more complicated than primers. edit no. 9: MIT's OCW is win itself. edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields) u/PAK-9 · 3 pointsr/cpp I was underwhelmed by Code Complete, although I had really high expectations after Jeff Atwood gushed about it so hard. Design Patterns I don't hear brought up very often, but I found it extremely useful. Even if you don't use or even like some of the patterns, you are likely to encounter them in your programming career at some point, and being able to recognise them is invaluable. u/Toasterthegamer · 3 pointsr/gamedev Although I agree with the others that good coding pillars are important, C# can be incredibly easy to learn and as such it's great for beginners to start with. Focus small and build up from there. I would write simple console apps understanding the core concepts before you move into actual gamedev coding. Without a strong fundamental grasp of how C# works you'll end up rewriting code in your game later on as you learn new things. You might spend a good chunk of time doing something a very tedious way only to discover that there is a much easier way later on. Also, as others have said, code-agnostic coding pillars are very important. Some good books: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612 http://gameprogrammingpatterns.com/contents.html Understanding how to structure your code is very important. u/diisiqueira · 3 pointsr/brasil Man, reinventing the wheel isn't bad! Solving classic problems is a great way to learn, and practicing is the best way to get better at something. I'm the kind of guy who likes to learn things by doing, so I'm biased here, but go for it and build things.
That bet with my friends is exactly to force us to do this: build anything, even if it's something completely useless, which is better than standing still. A few weeks ago, in that bet, one of my friends wrote a script that calculated whether it was more ecologically sound to pee in the shower or in the toilet. The real usefulness of the project is nil, but by doing it he started learning a new language and invested time in becoming a better programmer. Really, my advice is: do it, build it! One tip: take small things from your day-to-day life and solve them with programming. For example, I once made a script that asked me which ingredients I wanted on a sandwich, then crawled the iFood site and told me which was the cheapest option I had. There was no need to do this at all, a manual search was much faster, but at the time it taught me how to use BeautifulSoup and it was super fun to build. 1. I've always had the view that the Python market in Brazil is much weaker than in other countries; I'm not sure why. Here in my region, for example, I don't know of any company that specializes in Python. On the other hand, the big focus of companies here is PHP, at every level. PHP really is an entry-level language: since it's very open and lets people do things any way they like, it ends up being very easy to get someone to the point of writing working PHP code, and later, when they learn the programming paradigm better, many end up migrating to other languages. 2. Currently I have two books glued to my bedside table, and I read a piece of them every day when I wake up: Clean Code and Design Patterns. I'm also eyeing The Pragmatic Programmer. u/crawly_the_demon · 3 pointsr/neoliberal u/prince_muishkin · 3 pointsr/math Turing's paper on computability I think is quite good, it's very well presented here.
The course on computability I'm doing now hasn't added much, but discussed computability in more generality. I think what helped my understanding of computability was seeing that you can really do the nitty-gritty details. Can't remember where I found it, but Gödel's paper "On Formally Undecidable Propositions of Principia Mathematica and Related Systems" I found to be interesting in seeing how my course in Logic could be done differently. u/onetwosex · 3 pointsr/greece u/lowlycollegestudent · 3 pointsr/compsci I know that this is way more on the theory/mathematics side of the spectrum than CODE, but Charles Petzold also wrote a book called The Annotated Turing that I really enjoyed. He took Alan Turing's original paper on computability, which was about 30 pages, and annotated it until he had about a 400 page book. There were a couple of chapters in the middle that got a bit dense, but he did a fantastic job of making the subject more approachable and understandable. u/lkh01 · 3 pointsr/compsci I read The Annotated Turing by Charles Petzold while I was in high school and it really sparked my love for logic, math and computer science. So, as far as popular science books go, I can't not recommend it. Right now I'm interested in programming languages, and I think TAPL is a great resource. The (relatively) new blog PL Perspectives is also pretty cool, and so is /r/ProgrammingLanguages. u/CSHunter33 · 3 pointsr/cscareerquestions Congrats! I'm on such a programme at the University of Bath in the UK right now. Bath's programme teaches C and Java, with a little Python in some elective modules. There are also theory of computation modules with a bunch of discrete maths in them. If you let me know which specific course you're attending, what your undergrad was, and how technical a career you might want, I can give more tailored advice.
I did the following for prep, after researching what modules I'd be taking: • MITx's Intro to CS MOOC (amazing) • Read and did all exercises from several relevant chapters from a "Discrete Maths for Computing" textbook I got second hand for a fiver (helped a lot in a maths-heavy module) • read the oft-recommended book Code (useful for awareness, but not essential, especially since we do no Comp Arch at Bath) • did some algorithm challenges at places like leetcode.com and firecode.io once I had done the Intro to CS MOOC Conversion masters are generally very intense so doing prep now and over summer is a great idea. The stuff I listed helped immensely and I would do the same again - perhaps I would switch the MITx MOOC to Harvard's CS50 instead since CS50 has a bigger spread of languages. If I had had more time on my hands, proceeding from here to do a Java MOOC would have been really useful also. Working on the Leetcode/Firecode challenges helped a lot for general programming practice, and will also be helpful prep for job hunting later. u/lespea · 3 pointsr/programming I can easily recommend the book Code: The Hidden Language of Computer Hardware and Software; fascinating stuff! u/shrapnull · 3 pointsr/programming For anyone that hasn't read it, [Charles Petzold's "Code"](http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1407846258&sr=8-1&keywords=code+charles+petzold) is a great read through the history of computing and programming.
This is the book that made me decide to switch my college major to CS. It gave me great insight into what I was manipulating inside of a computer. It might not be very helpful to an experienced computer scientist, but I recommend it to anybody that's interested in getting into CS. And even if you understand all of the concepts in it, it's still an interesting read. u/pgvoorhees · 3 pointsr/c_language As /u/Shok3001 said, your course textbook is going to be a good place to start. As an additional note, it might be really beneficial to know how computers work in general. To this end, read Code by Charles Petzold. Once you see the underlying mechanics of a computer, you will see more easily why things are the way they are in the language and how to manipulate data inside a computer. u/Summerdown · 3 pointsr/askscience I think this book is exactly what you're looking for. I bought it recently and am now half-way through it, and it's fascinating. u/ULICKMAGEE · 3 pointsr/AskEngineers Honestly just purchase this book; it's exactly what you're looking for and will do a far better job than a bunch of condensed replies to your inbox. It's a really good book. u/emcoffey3 · 3 pointsr/learnprogramming Definitely take as many web-related classes as possible and at least one database class. The other two focus areas you mentioned are important, but probably not as important as the first two. Maximize your time on learning the important stuff; the other stuff can be learned later. If you want to learn about UML diagrams, check out Martin Fowler's UML Distilled; it's an easy read and handy as a reference. Likewise, Charles Petzold's Code is one of my favorite hardware-related books. u/Cogniphile · 3 pointsr/learnprogramming What is their approximate age? Highest education level achieved? Any prior experience in programming or computer science? What is your budget? Do you have experience programming?
The hardest part is finding something that is interesting and will keep an inexperienced reader motivated. Otherwise you could throw some undergrad level textbooks at them and they'd come out better than most college students who don't actually study. I highly suggest you stay away from front-end programming because it will be very frustrating to learn about making interfaces when you can't make an interface. This means stay away from html, css, and such. One possibility is having the person write programs on paper and forward them to you. You can type them into a pc, check for bugs, and even print out the results to send back. Also I've had this book recommended but haven't read it myself: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 This also might work: https://www.amazon.com/gp/aw/reviews/1593276664/ref=cm_cr_dp_mb_see_rcnt?ie=UTF8&s=sd u/CaRDiaK · 3 pointsr/learnprogramming If you're interested in the history then Code by Charles Petzold is great. It explains how code has existed for hundreds if not thousands of years in many different forms. He takes you right the way through from people signalling using lights, morse code, and relays, to modern day processors. What's cool about it is you can just pick it up and put it down, which I get the feeling is what you're looking for; https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1467960451&sr=8-1&keywords=code+charles+petzold u/Great_Lord_Kek · 3 pointsr/EngineeringStudents If they just explain them as a logic table it makes no sense. Looking at them as they perform in a circuit is much more intuitive and clear; I'd recommend a book (it's pretty old at this point) by Charles Petzold called "Code: the hidden language of computer hardware."
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 That book goes over circuits from the bare bones (lighting up a lightbulb) all the way to latches and RAM arrays. It's not dumbed down either. u/johnsibly · 3 pointsr/programming I'd certainly recommend Charles Petzold's "Code" as a source of material: http://www.amazon.co.uk/Code-Hidden-Language-2nd-DV-Undefined/dp/0735611319/ref=sr_1_1?ie=UTF8&s=books&qid=1265836950&sr=1-1 Great explanations of all the points you mention u/KyleRochi · 3 pointsr/ComputerEngineering Codecademy! I recommend python or ruby. They are pretty easy languages to pick up, so you will have a good understanding of programming concepts when you start doing C/C++ or java. Also for digital logic I recommend picking up a copy of [Code](https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_nxOAyb12J4N87) by Charles Petzold. It is by no means a comprehensive guide, but you will be familiar with everything when you take a logic class, and while most of the class is trying to figure out what an adder is you will already know and be focusing on how and why it works u/wannabeproprogrammer · 3 pointsr/computerscience Computer processors have historically been in powers of 2 in terms of instruction sets because of boolean logic and how it relates to binary numbers. If you were building a very rudimentary computer, i.e. a circuit which is just on and off, then you can say that the circuit only represents 2 states. This can be encoded as just 0 or 1. In fact this is what transistors do: they hold either an on or off state. Now let's say you want to represent more than two states. How would you go about that with what you already have? Well, you introduce another transistor. Now you can represent 4 states.
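The counting argument here can be sketched in a few lines of Python; this is a toy illustration of the doubling, with each transistor treated as one on/off digit:

```python
from itertools import product

# Each transistor holds one of two values (off = "0", on = "1"),
# so N transistors give 2**N distinct combinations.
def states(n):
    return ["".join(bits) for bits in product("01", repeat=n)]

print(states(1))  # ['0', '1']
print(states(2))  # ['00', '01', '10', '11']

# A 32-bit quantity is just 32 on/off choices: 2**32 distinct values.
print(2 ** 32)    # 4294967296
```

Adding one more transistor always doubles the list, which is why the counts land on powers of two.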
This is done following the same logic as before, so these 4 states can be 00, 01, 10, 11, where each digit corresponds to the on or off state of one of the transistors. In fact you can repeat this pattern ad infinitum and keep adding more and more on-off transistors. What you'll find when you do this is that the number of states your rudimentary CPU can hold will always correspond to the number of memory units that can be accessed at any time. This will be 2^N, hence the 32-bit and 64-bit numbers, which are powers of two. The 32 and 64 correspond to the number of unique states that can represent a 0 or a 1, an off or an on, or rather the memory that can be accessed at any time. Now in modern CPUs this can get a lot more complicated in terms of architecture, as in reality CPUs may have millions of transistors and multiple cores but are still considered 32-bit or 64-bit. In reality the 32-bit and 64-bit in this context relate to the instruction set architecture of the CPU. This relates to the format of instructions the CPU handles to perform operations, i.e. in a 32-bit architecture the CPU will handle instructions 32 bits long, which determine whether you're accessing memory, writing a value, triggering an interrupt signal and so on. If you really want to understand how CPUs work I recommend reading this book. It explains how CPUs work from a very rudimentary base all the way up to how machine code translates to actual CPU instructions. Hope this helped. u/TheAdventMaster · 3 pointsr/learnprogramming Something like Code: The Hidden Language of Computer Hardware and Software may be up your alley. So might From NAND 2 Tetris, a course where you build a computer (hardware architecture, assembler, OS, C-like compiler, and programs to run on the OS / written in the compiler) starting with just NAND. At the end of the day though, the way things work is like this: Protocols and specifications.
Everything follows the same published IPO (input, processing, output) standards. Stuff is connected to and registers expected values on expected peripherals. The CPU, motherboard, graphics card, wireless modem, etc. all connect in the right, mostly pre-ordained places on the hardware. In this vein, there are firmware-level APIs for communicating with all of these at the BIOS level. Although as far as I'm aware, "actual" BIOS is no longer used; UEFI is used instead: https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_Interface This is what firmware is / is built on top of. Operating systems build on top of these. System calls. Operating systems communicate under the hood and expose some number of system calls that perform low-level actions like talking to devices to perform things like file access or network I/O. A lot of this stuff is asynchronous / non-blocking, so the OS or system will then have to respond to an interrupt or continuously check a register or some other means of getting a response from the device to see if an operation completed and what its result was. Loading the OS is one thing the BIOS is responsible for. This is through the bootstrapping process. The OSs are located at very specific locations on the partitions. In the past, the only command you had enough room for within BIOS / pre-operating-system execution was to load your OS, and then the OS's startup scripts had to do everything else from there. Once you have an operating system, you can ask the OS to make system calls and invoke low-level API requests to get information about your computer and computer system, such as the file system, networks, connected drives and partitions, etc. These calls are usually exposed via OS-specific APIs (think the Win32 API) as well as through a command-line interface the OS provides.
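As a tiny concrete illustration of "system calls exposed via OS-specific APIs", the sketch below (assuming a Linux or macOS system) asks for the current process ID twice: once through Python's high-level wrapper and once by calling the C library directly. Both paths end at the same kernel service:

```python
import ctypes
import os

# High-level API: Python's os module wraps the underlying system call.
pid_high = os.getpid()

# Lower-level route: call getpid() in the C library ourselves.
libc = ctypes.CDLL(None)  # the C library already loaded into this process
pid_low = libc.getpid()

print(pid_high, pid_low)  # the same number twice
```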
New devices and I/O from/to those devices communicate through firmware, and interrupts, and low-level system calls that are able to communicate with these firmware APIs and respond to them. Just about anything you can think of - graphics, audio, networking, file systems, other i/o - have published standards and specifications. Some are OS-specific (X windowing system for Linux, DirectX win32 API or GDI on Windows, Quartz on Mac, etc.). Others are vendor-specific but don't seem to be going anywhere (OpenGL, then nVidia vs AMD driver support which varies across operating systems, etc.). The biggest hardware vendors and specification stakeholders will work with the biggest operating system vendors on their APIs and specifications. It's usually up to device manufacturers to provide OS-compatible drivers along with their devices. Drivers are again just another specification. Linux has one driver specification. Windows has another. Drivers are a way that the OS allows devices and users to communicate, with the OS as a middle-manager of sorts. Drivers are also often proprietary, allowing device manufacturers to protect their intellectual property while providing free access to use their devices on the OS of your choice. I'm not an expert in how it all works under the hood, but I found comfort in knowing it's all the same IPO and protocol specifications as the rest of computing. No real hidden surprises, although a lot of deep knowledge and learning sometimes required. When we get to actually executing programs, the OS doesn't have too much to work with, just the hardware... So the responsibility of slicing up program execution into processes and threads is up to the OS. How that's done depends on the OS, but pretty much every OS supports the concept in some sense. As far as how programs are multi-tasked, both operating systems and CPUs are pretty smart. 
Instructions get sent to the chips, batched and divided by them, and the computational results placed into registers and RAM. Again, something I'm not a huge expert in, and it honestly surprised me to find out that the OS is responsible for threading etc. I for some reason always thought this was at the chip level. When you include libraries (especially system / OS / driver libraries) in your code, you're including copies of or references to OS-native functions and definitions to help you reference these underlying OS or system calls to do all the cool things you want to do, like display graphics on the screen, or play audio. This is all possible because of the relationship between OSs and device manufacturers and the common standards between them, as well as the known and standard architectures of programs designed for OSs and programs themselves. Inter-program compatibility is where many things start to become high level, such as serialization standards like JSON or XML, but not always. There are some low-level things to care about for some programs, such as big- vs little-endian. Or the structure of ASM-level function calls. And then you have things like bytecode that languages like Java or JavaScript compile to, which is a system-independent representation of code that most often uses a simple heap or stack to describe things that might instead be register access or a low-level heap or stack if it had been written in ASM. Again, just more standards, and programs are written according to specifications and know how to interface with these. The modularity of programming thanks to this IPO model and the fact that everything follows some standards / protocols was a real eye-opener for me and made me feel like I understood a lot more about systems. What also helped was not only learning how to follow instructions when setting up things on my computer or in my programs, but learning how to verify that those instructions worked.
This included a lot of 'ls' on the command line and inspecting things in my debugger to ensure my program executed how I expected. These days, some might suggest instead using unit tests or integration tests to do the same. u/ospatil · 3 pointsr/algorithms Learning JavaScript Data Structures and Algorithms - Second Edition is a really good book with clear explanations and code examples. Grokking Algorithms is also a wonderful book, especially for non-CS people and beginners. The examples are in Python but it shouldn't be a problem given your Ruby and JavaScript background. u/rispe · 3 pointsr/javascript Congratulations! That's a big step. Be proud that you were able to make the switch. Not many people manage to transform ideas into results. I think there are four areas on which you need to focus, in order to go from mediocre to great. Those areas are: 1. Theoretical foundation. 2. Working knowledge. 3. Software engineering practices. 4. Soft skills. Now, these areas don't include things like marketing yourself or building valuable relationships with coworkers or your local programming community. I see those as being separate from being great at what you do. However, they're at least as influential in creating a successful and long-lasting career. Let's take a look at what you can do to improve yourself in those four areas. I'll also suggest some resources. 1. Theoretical foundation. Foundational computer science. Most developers without a formal degree have some knowledge gaps here. I suggest taking a MOOC to remediate this. After that, you could potentially take a look at improving your data structures and algorithms knowledge: • CS50: Introduction to Computer Science • Grokking Algorithms • Algorithms by Sedgewick 2. Working knowledge. I'd suggest doing a JavaScript deep-dive before focusing on your stack. I prefer screencasts and video courses for this, but there are also plenty of books available.
After that, focus on the specific frameworks that you're using. While you're doing front-end work, I also suggest you explore the back-end. • FunFunFunction on YouTube • You Don't Know JS • JavaScript Allonge • JavaScript Design Patterns 3. Software engineering practices. Design patterns and development methodologies. Read up about testing, agile, XP, and other things about how good software is developed. You could do this by reading the 'Big Books' in software, like Code Complete 2 or The Pragmatic Programmer, in your downtime. Or, if you can't be bothered, just read different blog posts/Wikipedia articles. 4. Soft skills. 1. Actively seek to mentor and teach others (perhaps an intern at work, or someone at a local tech community, or create blog posts or videos online). 2. Get mentorship and learn from others. Could be at work, or open source. 3. Go to programming meetups. 4. Try public speaking, go to a Toastmasters meetup. 5. Learn more about and practice effective communication. 6. Learn more about business and the domain that you're working in at your company. 7. Read Soft Skills or The Passionate Programmer for more tips. Some closing notes: - For your 'how to get started with open source' question, see FirstTimersOnly. - If you can't be bothered to read or do large online courses, or just want a structured path to follow, subscribe to FrontendMasters and go through their 'Learning Paths'. - 4, combined with building relationships and marketing yourself, is what will truly differentiate you from a lot of other programmers. Sorry for the long post, and good luck! :) u/WineEh · 3 pointsr/WGU_CompSci Since the course is in Python you could also always take a look at Grokking Algorithms. I think this is a great book that people overlook. It's easy to understand and a (fairly) quick read. https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230 Common Sense is also a great book.
If you really want to brush up on DS&A you could check out Tim Roughgarden's Coursera courses and the related books. I will point out though that, at least from my experience, you can access the course materials even for courses you transferred in, so once you get access to your courses, or even when you start DS&A at WGU, you can always refer back if you're struggling. u/k4z · 3 pointsr/UCONN probably is. don't have it handy. since you're learning data structures and algorithms in python, any general data structures and algorithms course should work; just implement them in python. it's hard to suggest [a good resource] off the top of the head that isn't a mere udemy shill or incredibly dense like stanford's algo course. grokking algorithms was okay, while people might suggest introduction to algorithms (but there's a reason why it's 1k pages and pure madness to "refresh" your knowledge). doing projects (crash course, automate) would help to refresh using python. u/ThinqueTank · 3 pointsr/programming I've actually been getting some interest lately because I just started to attend meetups last week and let some engineers see what I've done so far. Really appreciate the algorithms/data structures advice. I picked up this book to get a light overview of it first before I really dive into something more formal: Grokking Algorithms: An illustrated guide for programmers and other curious people I also have enough college credits to take a Data Structures course called Discrete Structures for Computer Science and my math up to Linear Algebra completed. Here's the description for the community college course: > This course is an introduction to the discrete structures used in Computer Science with an emphasis on their applications. Topics covered include: Functions; Relations and Sets; Basic Logic; Proof Techniques; Basics of Counting; Graphs and Trees; and Discrete Probability.
This course is compliant with the standards of the Association for Computing Machinery (ACM). Is the above what you're referring to, more or less? Are there any books and/or online courses you'd personally recommend for data structures and algorithms? u/KlaytonWade · 3 pointsr/java Grokking Algorithms. This is the best one yet. u/YuleTideCamel · 3 pointsr/learnprogramming Software architecture is a nebulous term that can mean different things to different people. I'm a software architect and there is no single definition of architecture. Instead try to get a deep understanding of good programming concepts and patterns, while focusing on ways to scale both your application and resources. Introducing unit testing, continuous integration and industry best practices is a key part of good architecture. Architecture is also more than just the technical side; it's understanding the business domain and making decisions based on that. tl;dr figure out what architecture means to your business and find the best way to bring value from a high level. The following books are good resources: u/BertilMuth · 3 pointsr/learnjava It will certainly take time. How long is hard to say. One thing is being exposed to code, and writing code yourself. Another thing is actually collaborating with people that are more experienced - that helped me a lot. An eye-opener in my coding journey was the "Gang of Four" design patterns book (https://www.amazon.de/dp/B000SEIBB8). The risk is that you will overdo design patterns at first, but that will hopefully settle :-) u/TNMattH · 3 pointsr/csharp > Are there any decent books on the GoF stuff? The GoF book: Design Patterns: Elements of Reusable Object-Oriented Software by "The Gang of Four" (Gamma, Vlissides, Johnson, and Helm) u/nutrecht · 3 pointsr/java Although I admire the optimism of a lot of people here, I feel they are not being very realistic.
Like the mention of using sales experience to 'sell' yourself to employers: it's fine and dandy if you can bullshit your way into a job but if you can't actually deliver you'd be 'let go' within a month. Learning a language is just one aspect. What's most important is doing actually a ton of programming. So make sure you have at least 3 moderately big projects with good code quality that follow best practices that you can show to employers. Feel free to hop over onto /r/javahelp to have us review your code and suggest improvements. Being a developer isn't really about languages: it's about turning a customer's problem into a working solution. That's the hard part. One last tip: for someone without any CS education but who is going into an area where OO skills are a must this book is a must read: http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented-ebook/dp/B000SEIBB8 Heck. Showing an employer that you read and understand it and apply it in your projects would give you a huge head up. u/casualblair · 3 pointsr/learnprogramming Skill - if you look at a book and can't even figure out what the topic is about then it's too advanced. EG: Entity Framework - if you don't know what an ORM is or why it would be a good idea to have a DAL then you might want to skip this and come back later. Find a popular blog or podcast on your topic then browse through their articles and notes. They should be regularly recommending books or sites that are useful. Then take that book and google other sites for it. Find lots? Good book. And yes, amazon reviews do matter. EG: https://www.amazon.ca/Design-Patterns-Object-Oriented-Addison-Wesley-Professional-ebook/dp/B000SEIBB8 4.4 out of 5 stars, only one version ever published, and all the negative reviews are about the kindle version, and it's basically THE book for learning design patterns, even 20 years later. u/schreiberbj · 3 pointsr/compsci This question goes beyond the scope of a reddit post. 
Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems. In the meantime you can look at things like datapaths which are controlled by microcode. This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away. u/FearMonstro · 3 pointsr/compsci Nand to Tetris (coursera) the first half of the book is free. You read a chapter then you write programs that simulate hardware modules (like memory, ALU, registers, etc). It's pretty insightful for giving you a more rich understanding of how computers work. You could benefit from just the first half the book. The second half focuses more on building assemblers, compilers, and then a java-like programming language. From there, it has you build a small operating system that can run programs like Tetris. Code: The Hidden Language of Hardware and Software This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a text book, which makes it even more more impressive. Computer Networking Top Down Approach one of the best written textbook I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that book is filled to the brim of references to other books and academic papers for a more detailed look at subtopics. Operating System Design A great OS book. It actually shows you the C code used to design and code the Xinu operating system. It's written by a Purdue professor. It offers both a top-down look, but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!) 
Digital Design Computer Architecture another good "build a computer from the ground up" book. The strength of this book is that it gives you more background into how real-life circuits are built (it uses VHDL and Verilog), and provides a nice chapter on transistor design overview. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates into designing and describing a microarchitecture to implement a MIPS microcontroller. The diagrams used in this book are really nice. u/nonenext · 3 pointsr/explainlikeimfive If you want to know how computer is made, this amazing book explains so clearly from scratch in order so you can understand next chapter to the end. It explains in scratch from Morse code, to electricity circuit with battery + flashlight, to telegraphy and relays with more advanced electricity circuit, to how numbers are understood in logic sense, to binary digits (0 and 1), to explaining how you can do so much with just binary digits and how barcode works, to logic and switches in algebra and advanced electricity circuits with binary/boolean, to logic gates, more advanced electricity circuits stuff, to bytes and hexes, how memory functions, to automation... ah this is halfway through the book now. The way how he writes is very clear, understandable, and everything what he wrote has a meaning for you to be capable to understand what he wrote further in the book. You'll know EVERYTHING about electricity and behind-the-scene how computer works, how RAM works, how hard drive works, how CPU works, how GPU works, everything, after you finish this book. u/mivfx · 3 pointsr/programming Yes. The best book i read that explains "computer" from really "ground up" is Charles Petzold' Code. Even my literature-graduate girlfriend understood it. u/dwitman · 3 pointsr/learnprogramming &gt; C.O.D.E This book? u/frostmatthew · 3 pointsr/WGU tl;dr version: 1. yes 2. 
no, but that will be the case at any school Quick background to validate the above/below: I was a 30y/o banquet manager when I decided to change careers. I had no prior experience [unless you want to count a single programming class I took in high school] but did get a job in tech support at a medium size startup while I was in school and wrote a couple apps for our department. Just before I graduated I started working at a primarily Google &amp; Mozilla funded non-profit as their sole software engineer. I moved on after a little over two years and am now a software engineer at VMware. 3. The degree is a huge boost in getting past HR and/or having [good] recruiters work with you. You'll also learn the skills/knowledge necessary to get hired as a developer, which is obviously the more important part - but for the most part this is all stuff you can learn on your own, but you'll greatly reduce the number places that will even give you a phone screen if you don't have a degree [I'm not saying this is how it should be, but this is how it is]. 4. I typed out a lot before remembering New Relic had a great blog post a few months ago about all the stuff you don't learn in school [about software development], ha. So I would highly recommend you not only read it but also try to learn a little on your own (especially regarding SQL and version control) http://blog.newrelic.com/2014/06/03/10-secrets-learned-software-engineering-degree-probably-didnt/ Being a good developer (or good anything) takes time/experience - but knowing what they don't cover in school (and trying to learn it on your own) will help. Two books I'd suggest reading are The Pragmatic Programmer and Code: The Hidden Language of Computer Hardware and Software. Pragmatic Programmer is one of those classics that every good dev has read (and follows!). 
Code is great at giving you some insight into what's actually happening at a lower level - though it gets a bit repetitive/boring about halfway through so don't feel bad about putting it down once you reach that point. The best thing you can do to help you land a job is have some open-source side-projects (ideally on GitHub). Doesn't have to be anything major or unique - but it will help a lot for potential employers to see what your code looks like. u/audionautics · 3 pointsr/videos For the latter half, "the CPU executes instructions", there's a fantastic book called Code: the hidden language of computers, that, through a series of scenarios, takes you all the way from talking with your friend through a string and two tin cans, to flash lights, to morse code, to logic gates, transistors, and finally encoding information in bits and executing it on a CPU. It's a super fun read. u/nekochanwork · 3 pointsr/learnprogramming > I assumed calculus would somehow be the building block of where all computer systems are based. I'm afraid I don't know what the expression "building block of where all computer systems are based" means, but if it helps at all, Petzold's book The Hidden Language of Computer Hardware and Software explains how computers work from the ground up. If you were to ask me, I would say the "building blocks" of computers are Boolean algebra. Boolean algebra can define simple logic gates, which in turn can be realized physically through circuits and relays. You can combine gates to form simple adders and multipliers; you can feed the output of a relay as an input back into itself to create a flip-flop gate, which can be used to store state; you can set up a relay to disconnect from the circuit as soon as it receives power, and reconnect when there is no power, resulting in a simple oscillator; etc. etc etc.
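The "combine gates to form simple adders" step can be sketched in Python, modelling a bit as a 0/1 integer (a toy model of the logic only, not of the relays themselves):

```python
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Two gates give one-bit addition: XOR is the sum, AND is the carry."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} = carry {carry}, sum {s}")
```

Chaining a second half adder and an OR gate gives a full adder, and a row of full adders adds whole binary numbers — essentially the path the comment above describes.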
I'm positive a lot of smart people used calculus to shrink circuits and relays down to solid-state transistors, which in turn implement a von Neumann machine architecture that all modern software depends on. But at the root of it all, the movement of information through your computer is modeled by naive propositional logic and the rules of Boolean algebra. u/SevenGlass · 3 pointsr/learnprogramming Petzold's Code is the book you are looking for. u/jimschubert · 3 pointsr/csharp I recommend starting by teaching some version control basics. Focus on git and quickly cover others like TFS and subversion. You can read Pro Git for free. If you teach a hardware/software course, CODE is an excellent book. I also recommend C# in Depth. I would also think it'd be cool to offer points for contributing to StackOverflow or the new open source .NET projects on GitHub. If you teach design/analysis or other classes focused on architecture, Adaptive Code via C# is pretty good. I'm only a few chapters in, but it discusses Scrum methodology, layering and tiers, as well as how to follow practices for writing great code. I would also suggest a course on JavaScript. I have had to train too many Junior and Senior developers on how to write JavaScript. It's scary that many web developers don't even understand fundamentals. u/Triapod · 3 pointsr/askscience Consider implementing an ALU which does various things like add and subtract and bit shifts. So, you have inputs A and B from your memories and your program instructs the computer to add them. The instruction for "add" is also sent to the ALU. So how does the ALU "change" it's function to add this time and subtract the next? Look at how a multiplexer works. So for a simple implementation, your ALU can compute both A+B and A-B (in parallel using separate logic gates) and then at the end, based on the instruction, select which to output. 
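That compute-everything-then-select structure can be sketched in Python — a simplified software model of the mux idea in the comment above, with made-up opcode values:

```python
def alu(a, b, opcode):
    """Compute every result, then let the opcode act as the
    multiplexer's select line and pick one output."""
    results = {
        0b00: a + b,  # ADD
        0b01: a - b,  # SUB
        0b10: a & b,  # AND
        0b11: a | b,  # OR
    }
    return results[opcode]

print(alu(7, 5, 0b00))  # 12
print(alu(7, 5, 0b01))  # 2
```

In hardware all four results really are computed at once by separate gate networks; the opcode merely chooses which one reaches the output.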
You can also try to imagine how a multiplexer can be used to implement various boolean operations by thinking about the truth tables :). So, if we can build the kind of ALU logic above, the key becomes addressable memories (note that addresses are themselves numbers which can be manipulated using ALU). When you compile and run a program, it is loaded into memory. The instruction and data are read and then fed into logic like above. If you are interested and have the time, the book Code presents the material quite well and accessibly. u/katyne · 3 pointsr/learnprogramming First of all you will never understand everything on the level that you think you want to understand it. That just doesn't happen. Even with an advanced degree, even working for the industry for 10+ years you'll end up specializing in this or that and sort of having some idea about the other things. There's just too much stuff to learn. Those black boxes people talk about - they're called "abstractions", a way to simplify complex details in order to understand a concept. And in the beginning all you'll be doing is trying to understand concepts. When you were learning to drive a car you didn't need to know the very last mechanical detail of its engine, right? You just had an abstract idea of it, you knew you had to put in fuel and turn on ignition and why you had to shift gears and stuff, and that was enough. Same here. First learn to hold the wheel and steer, then choose the field you'd like to specialize in. But it will take you a lifetime to learn everything about everything - and that's if all you'll be doing is learning, not making anything of your own. If you're like me you still need some introduction to "why" before you start learning "how" I would recommend this book - it's very approachable and sort of encompasses the general topics without going into much detail. 
Other than that it's hard to say anything because we don't know what your goal is, do you want to work as a web dev, a system programmer, or make games, write for mobile maybe? front end, backend, databases? It's like saying "I want to learn how to sport", well what kind of sport. They're all very different directions of virtually unlimited depth. u/chunyukuo · 3 pointsr/TranslationStudies First of all, congrats on the promotion and the learning spirit. I wish more managers had your attitude. I had a similar situation where I went from in-house linguist to loc manager, and I wonder if my experiences might be of use to you. Like you, I definitely did not describe myself as "into programming." I'm still not into that sort of thing. But learning as much of it as I could had a direct benefit to a lot of my daily tasks, and I would recommend at least giving the more learner-friendly tutorial sites a try. I finished a lot of modules on codecademy.com and genuinely enjoyed them because they were not particularly difficult and also allowed me automate a lot of things and also gain a deeper understanding of how things work. I went through Learn Python the Hard Way and gained a lot from that, especially since subsequent projects included quite a lot of assets in Python. I went so far as to plow through the first half of Code: The Hidden Language of Computer Hardware and Software (the latter half too arcane for me) and found that quite useful as well, although in hindsight it was a bit overkill. Even after my department was given an actual programmer to code up solutions for us, I at least was able to understand how a good amount of it worked. Coding aside, a localization manager is the person that the linguists and testers go to when things break, and man do they break a lot. That said, I would also recommend training of some sort in SDL and Kilgray's products if you use them. In my experience as manager, both broke often or were fussy at best. 
A few years later, I haven't really read much about code, but I still try to ask developers as many questions as I can about the technical aspects of their products and find it really helpful to follow up on Stack Overflow or just Wikipedia. Good luck with your new position! u/rekav0k · 3 pointsr/blackhat u/herky_the_jet · 3 pointsr/math You might enjoy the book "Code" by Charles Petzold (http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319), in combination with the "nand 2 tetris" course by Noam Nisan and Shimon Schocken (http://www.nand2tetris.org/) u/wgren · 3 pointsr/dcpu_16_programming Code: The Hidden Language of Computer Hardware and Software, The Elements of Computing Systems and Inside the Machine were recommended on Hacker News. I have the last one, I will re-read it over Easter holidays... u/Psylock524 · 3 pointsr/compsci u/ummonommu · 3 pointsr/technology Code: The Hidden Language of Computer Hardware and Software A good read if you want history and understanding binary code, among others. Not exactly about politics, but more on the easy-to-read technical side. u/deeeeeed1221 · 3 pointsr/learnprogramming u/Lesabotsy · 3 pointsr/learnprogramming u/anachronic · 3 pointsr/AskNetsec > I have zero Linux experience. How should I correct this deficiency? First, install a VM (Oracle VirtualBox is free) and download a linux ISO and boot from it. Debian and Ubuntu are two of my favorites. Both are totally free (as are most linux distros). Once installed, start reading some beginner linux tutorials online (or get "Linux In A Nutshell" by O'Reilly). Just fuck around with it... if you screw something up, blow it away and reinstall (or restore from a previous image) > Is it necessary? Should I start trying to make Linux my primary OS instead of using windows, or should that come later? It's not necessary, but will help you learn faster. A lot of security infrastructure runs on Linux and UNIX flavors.
It's important to have at least a basic understanding of how a Linux POSIX system works. &gt; If you can, what are some good books to try to find used or on PDF to learn about CISSP and CISA? Should I be going after both? Which should I seek first? You don't need to worry about taking &amp; passing them until you've been working in the field for at least 3-5 years, but if you can get some used review materials second-hand, they'll give you a rough idea what's out there in the security landscape and what a security professional is expected to know (generally). CISSP is more detailed and broader, and is good if you're doing security work day-to-day (this is probably what you want). CISA is focused on auditing and IT governance, and is good if you're an IT auditor or working in compliance or something (probably not where you're headed). &gt; What are good books I can use to learn about networking? If you noticed I ask for books a lot, it's because the only internet I have is when I connect my Android to my laptop by PdaNet, and service is sketchy at my apartment. O'Reilly is a reliable publisher of quality tech books. An Amazon search for "O'Reilly networking" pulls up a bunch. Also, their "In a Nutshell" series of books are great reference books for Windows, Linux, networking, etc. You can probably find older/used copies online for a decent price (check eBay and half.com too). &gt; How would you recommend learning about encryption? I just subscribed to /r/crypto so I can lurk there. Again, can you point me at some books? Try "The Code Book" for a very accessible intro to crypto from ancient times through today: http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/0385495323 Also, for the basics of computer architecture, read "CODE", which is absolutely excellent and shows how computers work from the ground up in VERY accessible writing.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 u/Packet_Ranger · 3 pointsr/askscience You'd be well served by reading a book called "CODE - the Hidden Language of Computer Hardware". It starts all the way from the simplest electronic circuits, explains how a powered signal amplifier can be turned into an electronic switch (e.g. telegraph relays, and later, transistors), how those switches can be chained together to form logic gates per /u/Corpsiez, how those logic gates can be chained together to form arithmetic units and memory, and finally how to make a simple 8080 CPU and implement ASCII inputs and outputs. u/randrews · 3 pointsr/csbooks Code, by Charles Petzold, is pretty much exactly what you want. u/Mazer_Rac · 3 pointsr/compsci Code by Charles Petzold. It starts with Morse code and works up to a fully functional computer processor. All while written in a prose style. Very nontechnical and a great read. http://www.amazon.com/gp/aw/d/0735611319?pc_redir=1410687129&robot_redir=1 u/Xavierxf · 3 pointsr/explainlikeimfive Code is what helped me wrap my head around this. You might have to read it a couple times to understand it, but it's really good. u/luciano-rg · 3 pointsr/IWantToLearn For an introduction to how computers work, the book "Code" by Charles Petzold is very informative. It starts from rudimentary circuits of blinking lights and builds to the complexity of modern computers. I found this book to close the gap between the concepts of software and hardware. Amazon link: https://amzn.com/0735611319 u/monkeybreath · 3 pointsr/science I'm an engineer, and did my Masters in voice compression. But I read Jeff Hawkins's On Intelligence a while back, and it was quite eye-opening (and a good read) about how the neocortex works and what its role is. I look at everything mind-related through this lens now. u/stochasticMath · 3 pointsr/AskReddit "processing power as human brains" is more difficult to quantify than one would think.
The human brain is not some massively parallel simulator. The way the brain structures data, generates models, and runs its prediction/response loop requires not simply raw processing power, but a different architecture. Read Jeff Hawkins' On Intelligence u/redcalcium · 3 pointsr/webdev I'm not really good at explaining stuff, but this book is a great resource on Turing machines and Turing completeness: Introduction to the Theory of Computation. The Turing machine is a model first proposed by Alan Turing. It's similar to a finite automaton, but it has unlimited tape memory. Basically, modern computers are implementations of the Turing machine. One characteristic of a Turing machine is that it's possible to use a Turing machine to simulate another Turing machine. A programming language is said to be Turing complete if it can be used to simulate a Turing machine. If you've written a program in a Turing complete language, you should be able to translate that program to any Turing complete programming language. So if CSS became a Turing complete language, I imagine people would use it to create crazy stuff such as an x86 emulator. u/KorgRue · 3 pointsr/webdev I've only ever fully read a single programming book, and that was because the material was best learned through a book. It was about non-language-specific design patterns. It was a key that unlocked so much knowledge for me. The book is Design Patterns: Elements of Reusable Object-Oriented Software. It's old, but that does not matter. Object-oriented design patterns don't change much because they are tried and true, and you will see them in every OO language. While JS is not technically OO (though ES6 introduces some strong OOP features), it still uses many of the same patterns! https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612 u/donnfelker · 3 pointsr/androiddev What books can help you expand your knowledge of design patterns?
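The Turing-machine description from u/redcalcium above can be made concrete in a few lines of Python. This is a hypothetical minimal simulator; the `run_turing_machine` helper and the binary-increment program are illustrative, not taken from the recommended textbook:

```python
# A minimal single-tape Turing machine simulator: a finite state machine
# plus an unbounded tape, as described in the comment above.
# The example program increments a binary number in place.

def run_turing_machine(program, tape, state, blank="_"):
    """program maps (state, symbol) -> (new_state, new_symbol, move)."""
    tape = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        state, tape[head], move = program[(state, symbol)]
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Binary increment: walk to the rightmost bit, then carry leftward.
increment = {
    ("seek", "0"): ("seek", "0", "R"),
    ("seek", "1"): ("seek", "1", "R"),
    ("seek", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("halt", "1", "R"),   # 0 + carry -> 1, done
    ("carry", "_"): ("halt", "1", "R"),   # ran off the left edge: grow the tape
}

print(run_turing_machine(increment, "1011", "seek"))  # 1011 + 1 = 1100
```

Any Turing complete language can express this simulator, which is exactly the sense in which such languages are interchangeable.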
u/SeriousGoose · 3 pointsr/MLPTalentExchange Design Patterns (Gamma, Helm, Johnson, Vlissides) [aka the Gang of Four] is THE design patterns book, but it's a little dry and I don't think it talks about MVC that much. I've gotten about 1/3 of the way through it twice; I just keep getting distracted and forget about it. Head First Design Patterns has gotten a lot of good reviews and I've heard good things about the Head First series of books, but I haven't read any of them myself. I'm not sure about any online tutorials, but I'm sure they exist. I taught myself a little about design patterns and MVC, but I'm sure I understand them just enough to be dangerous. I'm going to be taking a class this Fall that should talk about design patterns more. u/technocraty · 3 pointsr/cscareerquestions As far as your courses go, the best book I can recommend is Algorithms in a Nutshell. It is a small book which quickly introduces you to most of the core algorithms you will use throughout University. It also covers measuring efficiency through "big O notation" - a very important concept in CS. If your University's SE program is anything like the one I am familiar with, you will also be focusing on software engineering principles. The most important SE books I ever read are: u/gauchopuro · 3 pointsr/programming The "Gang of Four" design patterns book http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&s=books&qid=1251223171&sr=8-1 is the one that started it all. There are code samples in C++. I've also heard good things about the "Head First Design Patterns" book. u/sgspace321 · 3 pointsr/iOSProgramming I didn't look through your code. Just some general advice: Don't force design patterns into your app. Design patterns are solutions to common problems. You should just be familiar with them to recognize when you're trying to solve a common problem.
You should read this: http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1427141980&sr=8-1&keywords=design+patterns u/GrayDonkey · 3 pointsr/java 1. Learn the syntax of your programming language and the core API - http://docs.oracle.com/javase/tutorial/ 2. Learn object-oriented design - Head First Object-Oriented Analysis and Design 3. Learn design patterns - Design Patterns: Elements of Reusable Object-Oriented Software 4. Pick a platform - applets, Android, desktop, or a combination. Core Java and design stuff will be the same, but basic things like how you draw graphics or play sounds will vary. Usually you end up adding libraries or frameworks on top of Core Java to help you handle game-specific things like graphics, sounds, input, etc. Some might be different across the platforms you want to support, and others (LibGDX for example) try to be a single framework that supports multiple platforms. 5. Learn your platform/libraries/frameworks. 6. Make "retro" games with bad artwork, or learn to also be an artist, or learn how to find/hire an artist. Many game programming books start off at step 4, so you need to do 1-3 before you get into those books. Never get an old book for step 4; the game-specific stuff gets out of date too fast. Anything that does 1-4 all at the same time probably isn't worth it. u/carols10cents · 3 pointsr/cscareerquestions Design Patterns (sometimes called the "gang of four" book) is the best book I know of about design patterns, and I haven't heard of it being used in college courses (I could be wrong). This will give you a sense of the patterns that are out there, but knowing which pattern to use when is still mostly experience and even team-based preference/convention. CS programs give you a broad overview of computer science theory, which is helpful but is not in any way sufficient to prepare you for programming in practice. What you're going through is totally normal.
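Several of the comments above point to the Gang of Four book; as a taste of what it covers, here is a minimal sketch of one of its patterns, Observer, which is the machinery underneath MVC's model-to-view updates. This is an illustrative Python sketch, not code from the book, and the `Model` class name is made up for the example:

```python
# Observer: a subject (here, a model) keeps a list of observers and
# notifies each one when its state changes, so "views" stay in sync
# without the model knowing anything about them.

class Model:
    def __init__(self):
        self._observers = []
        self._value = None

    def attach(self, observer):
        self._observers.append(observer)

    def set(self, value):
        self._value = value
        for observer in self._observers:  # notify every registered view
            observer(value)

seen = []
model = Model()
model.attach(seen.append)  # a "view" can be any callable
model.set(42)
print(seen)  # [42]
```

The same shape appears in GUI toolkits and event systems, which is why the pattern is worth recognizing rather than reinventing.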
Debugging someone else's code is a skill that is also not taught in schools. I actually did a talk on debugging skills a few years ago; the slides don't make much sense but we made a PDF of our main points. I'll see if I can find video when I'm not on my phone. I'm happy to talk more about this stuff with you or anyone else reading this; pm me or email carol dot nichols gmail. u/schmook · 3 pointsr/brasil Actually, I'm a physicist. I think adopting a Bayesian perspective is more common among physicists than among mathematicians or even statisticians. Maybe because of the influence of Edwin T. Jaynes, who was a physicist. Maybe because of the connection with information theory and the tempting connection with thermodynamics and statistical mechanics. My interest in the Bayesian perspective began with the research group where I did my PhD. My advisor and co-advisor are strongly Bayesian, and my PhD advisor's brother is a well-known researcher on the epistemological foundations of Bayesian theory (the Uruguayan physicist Ariel Caticha). There are several good books on Bayesian probability; it depends a lot on your interest. The first book I read on the subject was Jaynes's own - Probability Theory: The Logic of Science. It's a somewhat controversial book because it adopts a very strong epistemological position and argues for it quite aggressively. A somewhat alternative view, closely connected with information theory and also strongly epistemological, can be found in the book Lectures on Probability, Entropy and Statistical Physics by Ariel Caticha (free here: https://arxiv.org/abs/0808.0012). I was a PhD student of Ariel's brother, Nestor Caticha. Both have a fascinating view of probability theory and information theory and of their implications for physics and science in general. These books present epistemological and theoretical views, and are much less useful for applications.
If you're interested in applications, there's the famous BDA3 - Bayesian Data Analysis, 3rd edition - and also Doing Bayesian Data Analysis by John Kruschke, which has examples in R. There's also a very introductory little book called Bayesian Methods for Hackers by Cam Davidson-Pilon (free here: https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers) that uses examples in Python (PyMC). It's quite basic, good for learning applications of Bayesian probability. The book All of Statistics by Larry Wasserman also has an introductory section on Bayesian inference. If you're interested in artificial intelligence, another very nice book is by the recently deceased British physicist David MacKay - Information Theory, Inference, and Learning Algorithms (free here: http://www.inference.phy.cam.ac.uk/mackay/itila/). That book was my first contact with machine learning, and it's great. Other good machine learning books that take a Bayesian perspective are Bayesian Reasoning and Machine Learning (David Barber) and the textbook that has become the most widely used in the field, Machine Learning: a Probabilistic Perspective (Kevin Murphy). u/DrGar · 3 pointsr/statistics Try to get through the first chapter of Bickel and Doksum. It is a great book on mathematical statistics. You need a solid foundation before you can build up. For a less rigorous, more applied and broad book, I thought this book was alright. Just realize that the more "heavy math" (i.e., mathematical statistics and probability theory) you do, the better prepared you will be to face applied problems later. A lot of people want to jump right into the applications and the latest and greatest algorithms, but if you go this route, you will never be the one greatly improving such algorithms or coming up with the next one (and you might even run the risk of not fully understanding these tools and when they do not apply).
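As a taste of the Bayesian updating these books teach, here is a minimal sketch of a conjugate update: a Beta prior on a coin's bias combined with Binomial data. The helper names are made up for the example:

```python
# With a Beta(a, b) prior on a coin's bias and h heads / t tails observed,
# the posterior is Beta(a + h, b + t); no MCMC is needed in this
# conjugate case, the update is just addition.

def beta_binomial_update(a, b, heads, tails):
    return a + heads, b + tails

def beta_mean(a, b):
    return a / (a + b)

a, b = 1, 1                       # uniform prior: no opinion about the bias
a, b = beta_binomial_update(a, b, heads=7, tails=3)
print((a, b))                     # (8, 4)
print(round(beta_mean(a, b), 3))  # posterior mean 8/12, i.e. 0.667
```

Kruschke's and MacKay's books both start from exactly this kind of small conjugate example before moving to models that need sampling.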
u/thundergolfer · 3 pointsr/learnmachinelearning I heard that the newer Machine Learning: A Probabilistic Perspective is equally good, and from the small amount I've read so far I'd agree. u/throwdemawaaay · 3 pointsr/algorithms A good lightweight introduction: Programming Collective Intelligence. If you'd like a single, but more difficult, reference that covers much of the breadth of machine learning: Machine Learning: A Probabilistic Perspective u/mr_dick_doge · 3 pointsr/dogecoin &gt;There have been some excellent trading opportunities with returns as high as 30% to your overall portfolio! Crypto is providing big returns that are uncommon in traditional markets. I guess you have a good intention, Mr. Hustle, but I'd hate to see the kind shibes here being taken advantage of again. You should be more objective and also warn people that they can just as easily lose that much money when trading, especially when they don't know what they are doing initially. And the effectiveness of technical 'analysis' is a highly debatable issue. I'd just leave this quote from Wikipedia: &gt; Technical analysis is widely used among traders and financial professionals and is very often used by active day traders, market makers and pit traders. In the 1960s and 1970s it was widely dismissed by academics. In a recent review, Irwin and Park[13] reported that 56 of 95 modern studies found that it produces positive results but noted that many of the positive results were rendered dubious by issues such as data snooping, so that the evidence in support of technical analysis was inconclusive; it is still considered by many academics to be pseudoscience.[14] Academics such as Eugene Fama say the evidence for technical analysis is sparse and is inconsistent with the weak form of the efficient-market hypothesis.[15][16] Users hold that even if technical analysis cannot predict the future, it helps to identify trading opportunities.[17] ...
&gt; Whether technical analysis actually works is a matter of controversy. Methods vary greatly, and different technical analysts can sometimes make contradictory predictions from the same data. Many investors claim that they experience positive returns, but academic appraisals often find that it has little predictive power.[51] Of 95 modern studies, 56 concluded that technical analysis had positive results, although data-snooping bias and other problems make the analysis difficult.[13] Nonlinear prediction using neural networks occasionally produces statistically significant prediction results.[52] A Federal Reserve working paper[21] regarding support and resistance levels in short-term foreign exchange rates "offers strong evidence that the levels help to predict intraday trend interruptions," although the "predictive power" of those levels was "found to vary across the exchange rates and firms examined". I'm not saying not to take coaching from DogeHustle, just that if people want to do it, they should be aware of its 'limitations' too and have fun doing it with their disposable money only. As an alternative, I strongly suggest shibes who want to try predicting the future based on pattern analysis do it in a principled manner and learn math, stats, and machine learning. It won't be easy, but it will have wide application beyond trading (so-called data 'science' is the hot job nowadays). It will also teach you the limitations of such methods, and when they might fail, especially in such a manipulated market as crypto. This is a good book to start with: http://www.amazon.co.uk/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020 u/bailey_jameson · 3 pointsr/MachineLearning u/PLLOOOOOP · 3 pointsr/MachineLearning Is this the Bishop book you guys are talking about? u/samuelm · 3 pointsr/mlclass He's talking about the distribution of the error of y, not J (or the distribution of the probability of the function y given x).
It's explained in the lecture notes, and on page 29 (figure 1.16) of Bishop's book there's an illustration that switched on the bulb for me (although I found the book almost incomprehensible). You can look at it using the Amazon preview http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738/ref=sr_1_1?ie=UTF8&qid=1318610381&sr=8-1#reader_0387310738 u/ChristianGeek · 3 pointsr/learnmachinelearning Amazon links to books mentioned (no affiliate). Warning: a lot of high textbook prices here... look for eBooks and/or used copies of earlier versions: Introduction to Mathematical Statistics (Hogg, McKean, & Craig) All of Statistics (Wasserman) Statistical Inference (Casella & Berger) Pattern Recognition and Machine Learning (Bishop) (only reasonably priced as an eBook) Hitchhiker's Guide to Python u/rainbow4214 · 3 pointsr/OverwatchUniversity u/G-Brain · 3 pointsr/math I know nearly nothing about this topic, but I find it extremely interesting and I've done some searching. Two more lectures: u/SchrodingersLion · 3 pointsr/math Gödel, Escher, Bach is a popular one. If you're looking for fiction, then I highly recommend Flatland. u/noveltyimitator · 3 pointsr/atheism The Eternal Golden Braid. For a collection of simple elements with enough resources and time, emergence of structure occurs. Recursion and the system of seemingly simple neurons can lead to consciousness. u/OdwordCollon · 3 pointsr/pics I'm reminded of this book u/vergoa · 3 pointsr/math u/voyetra8 · 3 pointsr/AskReddit To OP: You should read Godel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter: http://www.amazon.com/Godel-Escher-Bach-Eternal-Golden/dp/0465026567 It's a GREAT book. They cover this mental exercise, as well as a ton of others that you would likely enjoy. u/habroptilus · 3 pointsr/suggestmeabook Uncle Tungsten: Memories of a Chemical Boyhood by Oliver Sacks.
Sacks is best known for writing case studies of his patients as a neurologist, such as The Man Who Mistook His Wife for a Hat. Uncle Tungsten is part memoir, part history of and introduction to chemistry. There's nothing quite like it out there. The Selfish Gene by Richard Dawkins. Dawkins's Twitter antics notwithstanding, this book is an unmissable classic in biology. Godel, Escher, Bach by Douglas Hofstadter. An ode to consciousness, full of puns, music, and metamathematics. Mind, Body, World by Michael Dawson. This is a textbook, but it's (legally!) available for free online, and it's totally engrossing. The author uses his work in music cognition to introduce the major theories and paradigms of cognitive science and show how there isn't as much separation between them as it seems. u/actualscientist · 3 pointsr/askscience The book that kindled my interest in Artificial Intelligence and started my journey toward getting a PhD in Computer Science was Gödel, Escher, Bach: An Eternal Golden Braid. It's not current with respect to the state of the art, but it is a compelling, high-level tour through some of the big questions in Artificial Intelligence, Cognitive Science, and Computer Science in general. u/iLikeSpegettiWestern · 3 pointsr/cheatatmathhomework Have you read Gödel, Escher, Bach? There are some great analogies between math, music, art, and other really neat stuff. https://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0465026567 u/lysa_m · 3 pointsr/math u/gipp · 3 pointsr/askscience I'm assuming you're looking for things geared toward a layman audience, and not textbooks. Here are a few of my personal favorites: Sagan Cosmos: You probably know what this is. If not, it is at once a history of science, an overview of the major paradigms of scientific investigation (with some considerable detail), and a discussion of the role of science in the development of human society and the role of humanity in the larger cosmos.
Pale Blue Dot: Similar themes, but with a more specifically astronomical focus. Dawkins The Greatest Show on Earth: Dawkins steers (mostly) clear of religious talk here, and sticks to what he really does best: lays out the ideas behind evolution in a manner that is easily digestible, but also highly detailed with a plethora of real-world evidence, and convincing to anyone with even a modicum of willingness to listen. Hofstadter Godel, Escher, Bach: An Eternal Golden Braid: It seems like I find myself recommending this book at least once a month, but it really does deserve it. It not only lays out an excruciatingly complex argument (Godel's Incompleteness Theorem) in as accessible a way as can be imagined, and explores its consequences in mathematics, computer science, and neuroscience, but is also probably the most entertainingly and clearly written work of non-fiction I've ever encountered. Feynman The Feynman Lectures on Physics: It's everything. Probably the most detailed discussion of physics concepts that you'll find on this list. Burke Connections: Not exactly what you were asking for, but I love it, so you might too. James Burke traces the history of a dozen or so modern inventions, from ancient times all the way up to the present. Focuses on the unpredictability of technological advancement, and how new developments in one area often unlock advancements in a seemingly separate discipline. There is also a documentary series that goes along with it, which I'd probably recommend over the book. James Burke is a tremendously charismatic narrator and it's one of the best few documentary series I've ever watched. It's available semi-officially on YouTube. u/lechnito · 3 pointsr/AskReddit Economics: u/senzei · 3 pointsr/reddit.com I'd recommend having a read of Godel, Escher, Bach: An Eternal Golden Braid. The book that taught me that consciousness happens somewhere between hormone propagation and the physiological manifestations of our biochemical state.
That, and it is just plain good reading. u/HeyHesRight · 3 pointsr/math I too love fun math[s] books! Here are some of my favorites. The Number Devil: http://www.amazon.com/dp/0805062998 The Mathematical Magpie: http://www.amazon.com/dp/038794950X I echo the GEB recommendation. http://www.amazon.com/dp/0465026567 The Magic of Math: http://www.amazon.com/dp/0465054722 Great Feuds in Mathematics: http://www.amazon.com/dp/B00DNL19JO One Equals Zero (Paradoxes, Fallacies, Surprises): http://www.amazon.com/dp/1559533099 Genius at Play - Biography of J.H. Conway: http://www.amazon.com/dp/1620405938 Math Girls (any from this series are fun) http://www.amazon.com/dp/0983951306 Mathematical Amazements and Surprises: http://www.amazon.com/dp/1591027233 A Strange Wilderness: The Lives of the Great Mathematicians: http://www.amazon.com/dp/1402785844 Magnificent Mistakes in Mathematics: http://www.amazon.com/dp/1616147474 Enjoy! u/jackthehobo · 3 pointsr/compsci Gödel, Escher, Bach: An Eternal Golden Braid is a fantastic read. Hofstadter discusses Gödel's incompleteness theorems, computability, AI, music, and art, and, more generally, how complexity arises out of self-reference. u/PatricioINTP · 3 pointsr/INTP The most INTP book I know is Godel, Escher & Bach, though the second half is harder to get through. The first half more than makes up for it. Other favs include Frank Herbert's Dune series and his lesser-known semi-stand-alone book The Dosadi Experiment. That said, The Golden Age Trilogy by John C. Wright is jam-packed with sci-fi ideas. If you'd rather read modern fantasy, look at the Night Watch series by Sergei Lukyanenko. Meanwhile, as a history buff I also like the Temeraire series by Naomi Novik, where European powers in the Napoleonic era have a draconic air force. Right now I am redoing and continuing the Thursday Next series by Jasper Fforde, which combines alt history, sci-fi, British humor, and meta-level shenanigans.
A must for classic lit fans who don't mind its wackiness. (Others can't stand it.) I also recently finished a reread of Roadside Picnic, which the Stalker movie and game were loosely based on. Edit: one other. If you want a "Lovecraftian simulator", get House of Leaves ASAP. It is less of a book to sit down and read and more of an interactive experience. u/violinplayer · 3 pointsr/violinist Jaap Schroder wrote a book detailing his study of the solo violin works, and he's recorded the concertos as well. That's a good place to begin. There are some really brilliant insights that most students would never consider. Don't get caught up thinking you are handcuffed and can only imitate an anemic baroque style or a warbly, romantic style. This video is one sort of hybrid, where the soloist and conductor are very aware of performance practice, but modern instruments and techniques are relied upon heavily. Remember that no recordings exist from before 1900 or so. There's still a lot of personal judgment in a good historically informed performance. There are many great Bach interpretations, and you should listen to many recordings (Grumiaux and Schroder are often held in high esteem as good models) to find out where your preferences lie. You should attempt to play with all sorts of expressive devices (non vib, lots of decay, faster bow, different bow strokes, bowing patterns, holding the bow higher, gut strings?, baroque bow) and find out what you have to say about Bach. I think any successful interpretation will at least have two major things: a tremendous sense of line (form, rhythm, a large-scale view) and an expressive use of tone color (bright, warm, deep, thick, feathery, etc.). Leopold Mozart also wrote a treatise on violin playing. In terms of playing style, he was more familiar with the Baroque than with the music of W.A. Mozart. He wrote about a sense of "affect" in Baroque music.
He wrote that overall, there is one overriding feeling that should come across in Baroque works (especially dances and binary-form movements). In the E major Bach, I bet it would be helpful to decide what the "affect" is for each movement. Is there only one? Is the narrative single-minded? More simply, come up with something other than "happy" or "sad." Don't let anyone tell you Bach was a stodgy, strict person. He was ridiculously smart, as shown by his ability to improvise multi-voice fugues. Hofstadter wrote eloquently about Bach's puzzles and intellectualism. He was a jokester - the crab canon and the Coffee Cantata are good examples. He was sometimes compensated for his work with large amounts of beer. Bach had somewhere around 20 children, about half of whom survived childhood. Bach was a very complex person, with lots of life experience. Don't let a careless caricature influence how you think about his music. u/dr_entropy · 3 pointsr/InsightfulQuestions Douglas Hofstadter talks about something like this in I Am a Strange Loop. Here's an interview that talks about it a bit. I recommend reading the book, though you may enjoy it more after reading Godel, Escher, Bach: An Eternal Golden Braid. u/jbigboote · 3 pointsr/programming I am surprised this book has not been mentioned yet: http://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0465026567/ u/Veteran4Peace · 3 pointsr/Science_Bookclub u/higz · 3 pointsr/AskReddit Gödel, Escher, Bach: An Eternal Golden Braid I still haven't finished it though; it's demanding and very rewarding when you do understand it. u/introspeck · 3 pointsr/eldertrees First book I recommend to any programmer, no matter what they're working on, is The Pragmatic Programmer. Excellent stuff. If you don't get a shot at low-level coding at work, get yourself an Arduino kit and just hack away. Apparently the language is similar to / based on the C programming language. I use C every day.
To do well with embedded systems, real-time, device driver, or kernel-type stuff, you have to really, really, really understand what the hardware is doing. I was able to learn gradually because I started programming when there was one CPU and no cache memory. Each hardware operation was straightforward. Now, with multi-core CPUs, multi-level cache memory, and multiple software threads, it becomes a bit more complex. But something like the Arduino will teach you the basics, and you can build on that. Every day I have to think asynchronously - any operation can happen any time, and if they share memory or other resources, they can't step on each other. It can get hairy - but it's really fun to reason about and I have a blast. There's a lot more I'm sure, but get started with some low-level hacking and you can build from there. If you want to get meta, many of the best programmers I know love Godel, Escher, Bach because it widens your mental horizons. It's not about programming per se, but I found that it helps my programming at a meta level. (and it'll give you a lot to meditate on when you're baked!) u/ichmusspinkle · 3 pointsr/math Gödel, Escher, Bach gets recommended a lot, but for good reason. And it has a good layman's explanation of the incompleteness theorems. u/butchdogt1234 · 3 pointsr/AskReddit How about Godel, Escher, Bach? I currently have the book and have read a good bit of it. I'd highly recommend it to anyone. As a matter of fact, you should buy it for yourself and use it within your class. u/dfmtr · 2 pointsr/MachineLearning You can read through a machine learning textbook (Alpaydin's and Bishop's books are solid), and make sure you can follow the derivations. Key concepts in linear algebra and statistics are usually in the appendices, and Wikipedia is pretty good for more basic stuff you might be missing.
u/tetramarek · 2 pointsr/compsci I watched the entire course of Data Structures and Algorithms by Richard Buckland (UNSW) and thought it was excellent. http://www.youtube.com/playlist?list=PLE621E25B3BF8B9D1 There is also an online course by Tim Roughgarden (Stanford) currently going on. It's very good, but I don't know if you can still sign up. https://class.coursera.org/algo Topcoder.com is a fun place to test your skills in a competitive environment. That being said, based on the description you are interested in things which don't usually fit into algorithms books or courses. Instead, you might want to look into machine learning and maybe even NLP. For example, Pattern Recognition and Machine Learning by Bishop and Foundations of Statistical Natural Language Processing by Manning & Schuetze are great books for that. u/TalkingBackAgain · 2 pointsr/intj 24 years ago was a better time for me as well. "The Prince" [Niccolò Machiavelli] "The Demon-Haunted World" [Carl Sagan] "Gödel, Escher, Bach: An Eternal Golden Braid" [Douglas Hofstadter] "On War" [Carl von Clausewitz] "Intuition Pumps and Other Tools for Thinking" [Daniel C. Dennett] u/lordlicorice · 2 pointsr/compsci The famous tome Gödel, Escher, Bach covertly explains a lot of important concepts at about your level. And it's meant to be entertaining, not a textbook. I think it's perfect for you! u/tubameister · 2 pointsr/Meditation I Am a Strange Loop is essentially a summary of GEB; I remember reading it a few years ago and dropping it halfway through for some reason... I'm reading GEB right now and it's fascinating, though. Amazon. Pdf. u/gr33nsl33v3s · 2 pointsr/math You'd probably really like Godel, Escher, Bach. u/SuperC142 · 2 pointsr/explainlikeimfive I recommend reading: The User Illusion by Tor Norretranders, Gödel, Escher, Bach by Douglas R. Hofstadter, and I Am a Strange Loop also by Douglas R.
Hofstadter, for some interesting reading on the subject (warning: Gödel, Escher, Bach isn't for everyone - it's a bit strange, but I love it). I read a lot of books on science in general and, based on that, it seems like many believe consciousness and also free will is just an illusion. In fact, just a few days ago, physicist Brian Greene sorta-kinda said as much in his AMA - granted, he's talking specifically about free will and not consciousness per se, but I think the two must be very related. I, too, believe in God and also have a very strong belief in and enthusiasm for science, so this is an especially fascinating question for me. BTW: if you're interested in the way the brain works in general, I highly recommend How the Mind Works by Steven Pinker.

u/sun_tzuber · 2 pointsr/Showerthoughts

1. Yes, but pirates. I would try to join their cause. Would you?

2. For now. Some day maybe there will be a user-friendly wrapper.

3. We're much closer to this now than we were in the age of pen-and-paper.

u/roboticjanus · 2 pointsr/outside

This one is a good alternate for the meta-game/mechanical perspective, I've found. Really helps look at some of the underlying connections between different mechanics in-game, so things start to make sense on a broader scale.

Also, to add my own 2¢... If you get too caught up in grinding and the game starts to feel empty and boring, this essay on grinding and the idea of 'the endgame'/global goal conditions not really existing consistently across the playerbase was written by a last-gen player, whose other writings I also happen to like quite a bit.

u/iemfi · 2 pointsr/Futurology

As I've said many times in this thread, I do not think that Watson is general AI. Watson would not be able to do any of these things today. Please don't ask me to repeat this again. The point is that the ability to break questions into relevant sub-questions is intelligence. Watson does this. It does not do it as well as a human, but it still does it scarily well.
There's nothing "just" about recursing your way to intelligence; the complexity involved is staggering.

> Understanding the equation is not the same as having the ability to ask yourself "Can I use this equation?", and having the ability to ask yourself "Can I use this equation?" is actually quite useless if you also have to ask yourself if you can use every other possible equation in existence (of which there are infinite).

Understanding the usage of an equation is being able to answer the question "What can I use this equation for?" Understanding how to derive an equation is being able to answer the question "How do I derive this equation?" When we say someone understands an equation we really mean understanding the usage of an equation, understanding how to derive the equation, and a whole host of other types of understanding. Understanding a myriad of other foundation concepts is also implied when we say someone understands an equation. It is a huge tangled web you have to traverse to solve the simplest of problems. We do it quickly and without being conscious of it, but there isn't some magical "understand" symbol which allows us to skip the whole process. I think I'm just doing a terrible job at explaining myself in general. I really would recommend the book GEB; he explains it amazingly well.

u/tvcgrid · 2 pointsr/AskReddit

Read Godel, Escher, Bach. It's good.

u/HenryJonesJunior · 2 pointsr/compsci

GEB is a single book, not a list of recommended authors.

u/SirClueless · 2 pointsr/GEB

http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567

Not sure if this is exactly what you're asking for.

u/scomberscombrus · 2 pointsr/Meditation

Could you explain how his ideas are fallacious? The question is not "Do I have free will?", but "What am I?"; it's not like the ideas are his own. See Sweeping Zen - Karma, Free-will, and Determinism. Also, look up the Taoist concept of wu-wei.
You might also be interested in Gödel, Escher, Bach by Douglas Hofstadter. Essentially, the I appears to be the result of a continuous process of self-reference within the brain. There really is no room for the typical Western idea of free will, because the I that is supposed to have the free will does not actually exist as we normally think of it. Saying that humans exist separate from what we normally think of as the nature of cause and effect would be like saying that humans exist separate from karma, or that atman is separate from brahman.

It's just that Western culture is so focused on ideas of separation and individualism that it forgets our fundamental unity with all of nature. He's not really attacking free will; he's attacking the dualistic view of nature that gives rise to ideas of free will.

u/alittleperil · 2 pointsr/AskAcademia

If you haven't read GEB:EGB yet, you should.

u/TsaristMustache · 2 pointsr/suggestmeabook

You might really dig Gödel, Escher, Bach.

u/moreLytes · 2 pointsr/DebateAnAtheist

Ah, I too have wrestled with Gödel's theorems, and how they seem to necessitate fallibilism. You might also find the Heisenberg Uncertainty Principle to be disconcerting. :)

> I can now see maybe how religious people might look at something in their holy book that opposes modern science (like evolution) and say "hey there's always a chance this is wrong!" And I guess I can't fault them there.

I fault them for lazy, mystical, and unsophisticated thinking. That said, I feel like I understand where you're coming from, and wanted to share some resources that benefited me:

• Godel, Escher, Bach. I haven't finished this highly acclaimed book yet, but its lucid explorations of consciousness, fallibilism, and paradox have proved quite formative.

• Critical Rationalism. A theory of knowledge (epistemology) that dominates how many professionals view science today.

• Bayesian Epistemology sequence.
This directed collection of musings has (more than most) resonated with my own discoveries and experiences. Highly recommended.

u/piratejake · 2 pointsr/math

Escher's work with tessellation and other mathematical ideas is fairly well-known and documented, so I'll try to mention a few examples of things I learned in an art history course a while ago.

DaVinci's Vitruvian Man used Phi in the calculation of ratios. Example: the ratio of your arm to your height, or your eyes to your face, is nearly always Phi. I'm not sure if I'm correct in the body parts mentioned; my art history class was nearly 6 years ago, so I'm a bit rusty. I'll try to think of some more examples and post.

EDIT: a few more examples have come back from memory. DaVinci was a master of perspective as well. As you can see, DaVinci used linear lines to draw attention to the subject of his works. In the case of The Last Supper, the lines from the structure of the building to the eyes and gestures of the disciples aim towards Jesus.

Botticelli's Birth of Venus uses a triangle to bring the subject into the viewer's mind. The two subjects on the left and right form the lines that meet at the middle of the top and close off a triangle with the bottom of the work. Venus herself is in the middle of the triangle, which brings your attention to her immediately upon viewing the work.

Michelangelo's Pieta also uses a triangle to highlight its subjects. Mary's figure creates a triangle (which is considered to be quite intentional based upon her size, both in relation to Jesus, a full grown man, and from her upper and obviously enlarged lower body). Her triangle makes the outline for the subject, Jesus. He is nearly in the center of both the horizontal and vertical axes. The way he is lying, from near the top of the left and then draping to the bottom of the right, depicts a very lifeless form because of the unnatural pose.
Moving the viewer's gaze from the top to the bottom of the triangle strengthens the emotion of the scene.

Moving on to architecture: vaulted ceilings also use triangles to draw your eyes down a line and make an awe-inspiring impression. In contrast to the Europeans' love of straight lines and geometric figures, the traditional Japanese architectural style was opposed to using straight lines. As you can see, nearly every line in a traditional Japanese building is curved. The traditional belief was that straight lines were evil because evil spirits could only travel in straight lines. This design criterion made for very interesting formations and building methods, which I would encourage you to check out because of the sheer dedication to the matter.

The Duomo in Florence is a great example of Renaissance architecture and has a really cool octagonal dome. I could go on and on about how awesome Brunelleschi's design was, but I'll just let you read about it here. I could talk all day about this sort of stuff; just let me know if you want anything else or have any questions. Good luck with your class!

EDIT2: I've found some more links about the subject of mathematics in art and architecture. It looks like the University of Singapore actually has a class on the subject. There's also a good Wikipedia page on it as well. This article is pretty lengthy and knowledgeable, but doesn't include pictures to illustrate the topics. Finally, as almost anybody in r/math will testify, Godel, Escher, Bach by Douglas Hofstadter is a fantastic read for anybody interested in mathematics and cool shit in general.

EDIT3: LITERATURE: I know we've all heard what a badass Shakespeare was, but it really hits you like a bus when you find out how well the man (or for you Shakespeare conspiracy theorists, men) could use words in rhyme and meter. Here's a Wikipedia article about his use of iambic pentameter and style.
Nothing else really comes to mind at the moment as far as writers using math (other than using rhyme and meter like I mentioned Shakespeare doing); however, I can think of a few ways to incorporate math. If you would like to go into any sort of programming during the class, you could show how to make an array out of a word. Once that concept is understood, you could make them solve anagrams or palindromes with arrays... a favorite of mine has always been making [ L , I , N , U , X ] into [ U , N , I , X ] ( [ 3 , 2 , 1 , 4 ] for the non-array folks ).

u/munen123 · 2 pointsr/programming

u/robot_lords · 2 pointsr/Physics

u/kyp44 · 2 pointsr/math

Because in any formal system with sufficient power (like modern mathematics) Gödel showed that it is possible to construct a statement that is true but cannot be proven. IIRC the statement boils down to "This statement is not a theorem". If it is a theorem (meaning it can be proven within the system) then it is true, and so leads to a contradiction because it asserts that it is NOT a theorem. Assuming it is not a theorem does not lead to such a contradiction, but then means that the statement is in fact true. So since one possibility leads to a contradiction while the other doesn't, it must be that this statement is true but not a theorem (and therefore unprovable). If you are interested in this at a pretty informal level, check out the fun and interesting book Gödel, Escher, Bach.

u/Hypersapien · 2 pointsr/GEB

u/float_into_bliss · 2 pointsr/askscience

Child development researchers have found a lot of interesting "milestones" that seem to differentiate our cognition from that of other primates. An interesting one is called Theory of Mind -- basically the ability to reason that other people are conscious beings as well.
At around 2-5 years old, children tend to understand that other people have a mind too, and the children learn to empathize with others (empathize in this context just means being able to imagine what a situation appears like from another person's point of view). The "Theory of Mind" clip on [this episode](http://science.discovery.com/videos/through-the-wormhole-did-we-invent-god/) of the excellent series Through the Wormhole has a really great explanation.

At around 6-7 years old, children add another level of indirection -- a child not only realizes that other people have minds, but they realize that other peoples' minds are aware of their own minds. First you realize that you have a mind, then around 2-5 years old your mind can imagine other minds, then at around 6-7 you realize that those other minds can imagine your mind as well. That's the point when kids learn what deception is and become sneaky little bastards.

There are a lot of other really interesting child development milestones. For example, most animals lack the ability to realize that what they see in the mirror is actually themselves instead of another animal. Part of what it means to be human appears to be this ability to think about and layer self-referential concepts -- you've got a mind, this other person has a mind, but in the same way that your mind can think about the other mind, the other mind can think about you, and you can use that understanding of awareness to then change how you interact with the other mind (i.e. "I didn't steal the cookie from the cookie jar!", "These white lines on a blueprint show you how you build a skyscraper").

If you're up for a challenge, the 1979 classic [Godel, Escher, Bach: An Eternal Golden Braid](http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567) touches on a lot of these self-referential concepts. Be warned: it's not an easy nor a short book.
-----------

*: The worst thing "Through the Wormhole" has going for it is its incredibly cheesy name, which mars an otherwise fantastic documentary series about the "rockstars" of current science. Other shows that have fallen into the terribly-cheesy-name-but-otherwise-excellent-show trap include "Battlestar Galactica".

u/acamann · 2 pointsr/todayilearned

If someone reading this is interested in this idea of a unified theory but doesn't feel like diving into Principia Mathematica, check out Douglas Hofstadter's book Godel Escher Bach: an eternal golden braid. It is intense in its own right, but glorious nerd entertainment!

https://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0465026567

u/iDante · 2 pointsr/philosophy

Ooooooh my friend you have entered into the realm of a particular book that I recommend to anyone who is able to think: Gödel, Escher, Bach. From the intro, "In a word, GEB is a very personal attempt to say how it is that animate being can come out of inanimate matter. What is a self and how can a self come out of stuff that is as selfless as a stone or a puddle." It won the Pulitzer Prize long ago and is overall amazing. Its author has worked with Dennett on other publications about intelligence too, such as The Mind's I. That being said, it's quite a difficult and mathy read, but well worth it IMO.

http://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0465026567

u/karlbirkir · 2 pointsr/Psychonaut

The, err, idea you're talking about is a pretty big one, and a lot of cool stuff has been written about it. You might really enjoy reading some of it, if you haven't already, and even writing some of your own. The question of whether we make words for ideas or have ideas because we have words for them is mind-boggling. You can probably find Wikipedia articles through the article for structuralism. Then there's this amazing book which goes into the question about the neurons and consciousness, called "Gödel, Escher, Bach."
here: http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567/ref=sr_1_1?ie=UTF8&qid=1335752053&sr=8-1

And the obligatory reading-guide subreddit: http://www.reddit.com/r/GEB/

u/Notlambda · 2 pointsr/dataisbeautiful

Sure. Without anything to go on, I'll just recommend some of my favorites. :)

• Godel Escher Bach - Mindbending book that delves into connections between art, music, math, linguistics and even spirituality.

• Code - The Hidden Language of Computer Hardware and Software - Ever wondered how the black box you're using to read this comment works? "Code" goes from transistor to a fully functioning computer in a sequential way that even a child could grasp. It's not at the "How to build your own computer from Newegg.com parts" level. It's more at the "How to build your own computer if you were trapped on a desert island" level, which is more theoretical and interesting. You get strong intuition for what's actually happening.

• The Origin of Consciousness in the Breakdown of the Bicameral Mind - An intriguing look into the theory that men of past ages actually hallucinated the voices of their gods, and how that led to the development of modern civilization and consciousness.

u/winnie_the_slayer · 2 pointsr/JordanPeterson

Has Peterson ever spoken about somatic psychotherapy? Embodied cognition? I feel his approach to psychology is entirely intellectual, and as somatic research is showing, thoughts are just a way to make sense of lived experience, and when the lived experience is changed (through exercise, yoga, diet, somatic psychotherapy, etc) the previous thoughts often become irrelevant. GEB discusses this as "jumping levels of context." Perhaps Peterson is stuck in a level of context, trying to figure it out intellectually, instead of jumping out of that to a bigger context, a "whole organism" approach to therapy. In other words, it is not possible to sort oneself out just by thinking about it.
It runs right into Freud's/Jung's ideas about the unconscious: you can't become conscious of aspects of existence of which you are unconscious without some kind of action to break out of conscious thinking. Perhaps underlying the intellectual approach of Peterson is an inherent hostility to feelings and emotions, which would correlate with his conservative position and history of depression and misery.

u/sheep1e · 2 pointsr/AskReddit

> So let's say I want to play "As Time Goes By". On paper, all it says is Gm7 C7, Cm6 C7, F6 Gm7 (to the words 'You must remember this, a kiss is still a kiss, a sigh is still a sigh.') Those chords are locked together perfectly, like a jigsaw puzzle.

You've described a written language, expressed in symbols of some sort (Gm7 C7, Cm6 C7, F6 Gm7). This written language "maps" to actual music, i.e. symbols correspond to musical chords or notes the same way that points on a map correspond to places in the real world.

This is a perfect example of how most of mathematics works: you analyze some problem and come up with a way to express and manipulate models of the problem in a useful way. Just as written music lets you express and manipulate a model of actual music, written mathematics lets you express and manipulate models of... almost anything.

There's not just one single kind of mathematics; there are many kinds of mathematics for dealing with many kinds of problems. This includes the usual things you've probably heard of, like numbers, geometry, sets, and topology, but also more abstract things like groups and categories (terms with specific meanings in mathematics.) Then there are more applied things, such as the mathematics used in physics or engineering, to model subatomic particles or buildings and bridges. In many cases, in pure mathematics, the problem being modeled may be something very abstract. In areas like applied math, physics, and computing, the problems are more concrete.
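The chord-symbols-as-language idea above can be sketched as a literal lookup table. The chord spellings below are standard music theory, but the code itself is just an illustration of the mapping idea, not anything from the comment:

```python
# Each chord symbol "maps" to concrete notes, the way points
# on a map correspond to places in the real world.
chords = {
    "Gm7": ["G", "Bb", "D", "F"],
    "C7":  ["C", "E", "G", "Bb"],
    "Cm6": ["C", "Eb", "G", "A"],
    "F6":  ["F", "A", "C", "D"],
}

# The opening of "As Time Goes By", as written in the comment:
progression = ["Gm7", "C7", "Cm6", "C7", "F6", "Gm7"]

# "Playing" the progression is just following the mapping:
for symbol in progression:
    print(symbol, "->", " ".join(chords[symbol]))
```

Manipulating the symbols (transposing, reordering, substituting chords) while the mapping stays fixed is the same move mathematics makes when it manipulates a model instead of the thing itself.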
But in both cases, what mathematics does is focus on essential elements of the problem, and capture them in a system of symbols (usually), and rules for manipulating those symbols that match the rules of the problem being modeled.

Mathematics is then about analyzing problems and creating these models and the languages to describe them; manipulating models in these languages; and "proving" things by following the rules of a given language to show that it is or isn't possible to achieve certain results.

I don't know if anyone else has suggested this, but you should try to get hold of a copy of the book Godel, Escher, Bach: An Eternal Golden Braid (at Amazon and Powell's). It's quite well-known, so you can probably find a copy in any decent library. The Powell's page includes the following description: "A mixture of art, philosophy, music, math, technology, and cognitive science, the book's title only reflects one aspect of its subject matter; namely, the connection between the work of mathematician Kurt Gödel, the artist M. C. Escher, and the composer J. S. Bach. [...] Douglas Hofstadter's book is concerned directly with the nature of 'maps' or links between formal systems."

Those formal systems that the book discusses, very accessibly, in many respects are mathematics - or at least, simple examples of the kinds of languages that mathematics creates and deals with. The ongoing connection to music in the book may also provide more insight into the relationship between music and mathematics.

u/banachball · 2 pointsr/books

Here is the Amazon link.

u/A_Downvote_Masochist · 2 pointsr/changemyview

This isn't going to change your views... in fact, it may very well reaffirm them... but you should definitely read Gödel, Escher, Bach by Douglas Hofstadter if you haven't already done so. Among many other things, Hofstadter discusses Gödel's incompleteness theorem. The gist of this theorem is that no formal system can ever be complete.
In the formal system of mathematics, that means that there is always a true statement which is not a provable theorem within the system. In other words, "provability is a weaker notion than truth."

Hofstadter also discusses the concept of "leveled realities" and intelligent AI. One implication of the book is that, even if we are inside some sort of program... it is impossible for us to realize that. It is impossible for us to "escape" the program. The program will necessarily be incomplete, but the only way to see the "holes" - the incompleteness - is to be outside the program, looking at it from above.

The only way I can think to change your views, though, is to argue that our universe is not a formal system. Formal systems are functional - given one input, there is only one output. Math is a formal system. Computers are formal systems. All "randomness" in a computer is actually just the output of another program designed to simulate randomness – pseudorandomness, if you will. Our universe kind of seems like a formal system on a macro level - there are laws that govern interactions. But at the level of subatomic physics, that all breaks down. The universe is no longer deterministic; it's random. Here's an article about an experiment in quantum physics which concluded that quantum randomness is not computable. And while that may not matter much to us, quantum randomness probably had huge repercussions on the development of the universe beginning with the Big Bang due to chaos theory.

Now you may say, "Well, maybe in the next highest reality they have better computers than we do which can produce/compute quantum randomness." And there's really no argument I can posit against that, because all my arguments are based on the physical and mathematical laws present in our world.
Gödel says that neither of us can ever see beyond the system in which we operate; what you are positing is essentially a science-y argument for the existence of a God or gods – maybe not gods in the traditional sense, but intelligent higher beings on another plane that created our universe and do not obey its laws. It's a matter of "faith."

u/RandomMandarin · 2 pointsr/zen

Looks like someone's been reading GEB!

u/AlotOfReading · 2 pointsr/math

To understand the general history of math, you won't need to understand what you most likely consider to be math. You will, however, need to understand how to put yourself in the shoes of those who came before and see the problems as they saw them, which is a rather different kind of thinking. But anyway, the history of math is long and complicated. It would take years to understand everything, and much of it was work done on paths that are now basically dead ends. Nevertheless, here are some other resources:

u/garblz · 2 pointsr/IWantToLearn

Very Special Relativity - a simple explanation of a complex phenomenon.

Thinking, Fast and Slow - explains why we actually do live in a Matrix, and how, by focusing on statistics instead of what your guts tell you, you can sometimes break the veil of lies.

Gödel, Escher, Bach: An Eternal Golden Braid - how is music connected with art and mathematics? An exploration of symmetries where none are expected to be found.

Watch everything Richard P. Feynman related on YouTube; start with interviews and the rest will probably follow.

I seriously think you should start with science. Getting a glimpse of how the world works at the quantum level can surprisingly enlighten someone on topics one thought were philosophical. E.g. a recent Reddit post asked whether true randomness exists, and the answer to read almost pointless kilograms of philosophy made me cringe. Quantum physics has tonnes more to say, and it's actually verifiable by experiment.
So I guess my advice is: before going the way of philosophical banter about the existence of a coffee shop around the corner, you can just walk the few steps and take a look yourself. Hence, science as a first suggestion.

u/Meinsilico · 2 pointsr/booksuggestions

I don't believe fiction is the place to go; I'd suggest:

Gödel, Escher, Bach: About Music, Arts, Consciousness, Math, Zen... There is actually an On-going Read-Through of it here on Reddit.

Thus Spoke Zarathustra by Nietzsche.

& if you're into psychology, Psychoanalysis literature is characteristically deep & philosophical!

u/eatsleepravedad · 2 pointsr/philosophy

Useless, conceited, futurist masturbation.

You want the theoretical framework of AI, go study math and programming, then go read Russell & Norvig, or if you want philosophy without the practicality, Hofstadter.

u/tanglisha · 2 pointsr/FCJbookclub

I read the first two books of Saga and Promethea. Both are great!

I also started on Gödel, Escher, Bach: An Eternal Golden Braid, which is going to take me a while to get through.

u/airshowfan · 2 pointsr/atheism

Read naturalist explanations of decision-making, the image of the self, how thoughts work, qualia, etc. You probably want to start with I Am a Strange Loop, then Consciousness Explained, and work your way to Godel Escher Bach. There are also many essays online about the non-supernatural nature of the mind, this one being one that atheist Redditors link to often. Also see Wikipedia articles about the mind, free will, etc.

Even after I became an atheist, I could not shake the feeling that consciousness could not be just patterns of atoms. Even in a universe that follows rules and that was not deliberately created as part of a plan, I thought that maybe there's some kind of "soul stuff" that interacts with our brains and is responsible for consciousness.
But then, if I can tell that I am conscious, then 1) the soul stuff impacts the natural world and is thus observable and not supernatural, and 2) I am no different from a computer that understands itself well enough to say it is conscious. (It helped me to think of AIs from fiction, like HAL and Data, and try to think of what it would be "like" to be them. Books like The Mind's I are full of such thought experiments). So after thinking about it for a while, I was able to shed that last and most persistent bit of supernaturalism and embrace the naturalistic view of the mind.

u/unverified_user · 2 pointsr/philosophy

Before you read this, check out this Amazon review: here

u/herrvogel- · 2 pointsr/ColorizedHistory

If anyone is really interested in his life and what he accomplished, I can recommend this book. It is basically his paper on Turing machines, explained step by step, plus some details on him as a person.

u/rowboat__cop · 2 pointsr/programming

If you liked "Code" I suggest you read his "Annotated Turing" next -- fascinating paper, fascinating book.

u/cryptocached · 2 pointsr/bsv

The Annotated Turing is a fairly approachable book. It contains the entirety of Turing's paper while providing contextually-relevant historical and mathematical background.

u/coforce · 2 pointsr/RedditDayOf

His original paper can be found here. If you find it a bit dense, you may be interested in reading an annotated version of his work in this wonderful book.

u/Wulibo · 2 pointsr/PhilosophyofScience

Examples of additional, unorthodox reasons to reject the basilisk on top of the obvious:

• Someone actually doing something as stupid and evil as making the basilisk, who is also capable of doing so, is probably at most as likely as the existence of God, even if you're a fairly strong Atheist.
Anyone worried about being damned by a basilisk should feel safe knowing that, with a relevant probability, God will save you from that damnation (I've argued at length here and elsewhere that infinite utility/disutility breaks decision theory, since infinity*0.000001 = infinity, so even if you're really worried that you're the simulation, your expected amount of pain moving forward isn't very high).

• Go read Turing's On Computable Numbers. There's a proof that goes something like this: some proposition x about computers can't be argued/proven/believed by computers due to the nature of x and the nature of computers. In proving this, it becomes very obvious that x. Therefore, with mathematical certainty, the human mind as is cannot be simulated on any sort of computer, so you're not a simulation. I've simplified to the point of essentially being incorrect, so if you want the full proof, find yourself a copy of The Annotated Turing and come talk to me after you read it for the relevant detailed argumentation. In short, I'd consider it a theorem of set theory that humans are incomputable.

• The basilisk can be modeled as a Pascal Mugger. First, to explain: the Pascal Mugger is a critique of Pascal's Wager whereby you imagine encountering an unarmed mugger who threatens to cause some sort of disaster unless you hand your wallet over; no matter your priors, there exists some disaster serious enough that you should be willing to hand over your wallet no matter how unlikely you find the mugger's capacity to cause it. Most interestingly, if you simply walk away from the mugger, the mugger doesn't even have reason to carry out its threat if possible; it would have no way of convincing you beforehand, and has nothing to gain. Likewise, the basilisk has no reason to torture anyone who would never have helped it, i.e. someone who is committed to a decision theory where the basilisk's threat is empty.
So, if you ever fail to get Pascal-mugged (and btw, pretty much everyone agrees that you should just walk away from the mugger), this should count, for the basilisk simulating you (again, impossible), as a sufficient test result to decide not to torture you. In the interest of expediency, I'll let you know right here, for unrelated reasons, that I have the capacity to immediately cause all souls that have ever lived to wink out of existence. If you don't PM me within 5 minutes of reading this (and I'll know) with your bank details so that I can take all your money at my leisure, I will do so. I will also post that you did so directly to /r/philosophyofscience so others who feel like it can Pascal-mug you. If you're confused as to why I'd say something like this, reread this bullet point... ideally for more than 5 minutes.

The fact that I need to go to such inane, ridiculous lengths to get past what I consider the "obvious" reasons to reject the basilisk should tell you how little you need to worry about this. It is only at this level of complexity that these objections to the basilisk stop being the obviously good-enough ones.

u/takemetothehospital · 2 pointsr/computers

Code: The Hidden Language of Computer Hardware and Software is a great book that starts from the bottom up, and explains the very basics in an understandable manner. It will give you an easily graspable outline of everything you need to build a basic computer from scratch. You may need to fill in some gaps if you actually want to go ahead with a homebrew computer project, but I find that it's more than enough to scratch the theoretical itch.

u/terryducks · 2 pointsr/computerscience

Start with this book: CODE. It lays the groundwork to understand how everything works, from numbering systems to digital gates to how a computer works.

If you liked that, great, continue on. If not, CS may not be the right spot for you. CS is algorithms and problem solving. It's working in teams and communicating. It's writing.
It's dealing with complexity and decomposing it to very simple steps that the "idiot computer" can do. i've spent 20+ years as a code slinger. u/KingMaple · 2 pointsr/boardgames If I would recommend a book that can bridge the gap somewhat, it is called Code. This one: https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&amp;amp;qid=1518447991&amp;amp;sr=8-1&amp;amp;keywords=code The reason I am recommending this is that it MIGHT (unsure, since I am not mechanical engineer myself) bridge the gap between software and hardware and lead to next steps. u/Shinigamii · 2 pointsr/mildlyinteresting That book sounds interesting. Is it this one? u/kelinu · 2 pointsr/askscience u/peschkaj · 2 pointsr/compsci Check out Charles Petzold's Code. It starts with some basic ideas and moves through digital communication and then into the wonderful world of computering. u/vincenz93 · 2 pointsr/learnprogramming "Code" by Charles Petzold is a great resource. Code: The Hidden Language of Computer Hardware and Software https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_QrnuxbBB5A8CF u/shhh-quiet · 2 pointsr/learnprogramming Your mileage with certifications may vary depending on your geographical area and type of IT work you want to get into. No idea about Phoenix specifically. For programming work, generally certifications aren't looked at highly, and so you should think about how much actual programming you want to do vs. something else, before investing in training that employers may not give a shit about at all. The more your goals align with programming, the more you'll want to acquire practical skills and be able to demonstrate them. I'd suggest reading the FAQ first, and then doing some digging to figure out what's out there that interests you. 
Then, consider trying to get in touch with professionals in the specific domain you're interested in, and/or ask more specific questions on here or elsewhere that pertain to what you're interested in. Then figure out a plan of attack and get to it.

A lot of programming work boils down to:

• Using appropriate data structures and algorithms (often hidden behind standard libraries/frameworks as black boxes) that help you solve whatever problems you run into, or tasks you need to complete. Knowing when to use a Map vs. a List/Array, for example, is fundamental.

• Integrating 3rd party APIs. (e.g. a company might use Stripe APIs for abstracting away payment processing... or Salesforce for interacting with business CRM... countless 3rd party APIs out there).

• Working with some development framework. (e.g. a web app might use React for an easier time producing rich HTML/JS-driven sites... or a cross-platform mobile app developer might use React-Native, or Xamarin to leverage C# skills, etc.).

• Working with some sort of platform SDKs/APIs. (e.g. native iOS apps must use 1st party frameworks like UIKit, and Foundation, etc.)

• Turning high-level descriptions of business goals ("requirements") into code. Basic logic, as well as systems design and OOD (and a sprinkle of FP for perspective on how to write code with reliable data flows and cohesion), is essential.

• Testing and debugging. It's a good idea to write code with testing in mind, even if you don't go whole hog on something like TDD - the idea being that you want it to be easy to ask your code questions in a nimble, precise way. Professional devs often set up test suites that examine inputs and expected outputs for particular pieces of code. As you gain confidence learning a language, take a look at simple assertion statements, and eventually try dabbling with a tdd/bdd testing library (e.g. Jest for JS, or JUnit for Java, ...).
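
The Map-vs-List point in the first bullet is easy to see in code: a dict (Python's map type) answers "is this key present?" in roughly constant time, while a list has to be scanned. A minimal sketch:

```python
# Counting word frequencies: a dict keyed by word beats
# scanning a list of (word, count) pairs on every update.
def count_words(words):
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1  # O(1) average lookup and update
    return counts

freqs = count_words(["map", "list", "map", "array", "map"])
# → {"map": 3, "list": 1, "array": 1}
```
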
With debugging, you want to know how to do it, but you also want to minimize having to do it whenever possible. As you get further into projects and get into situations where you have acquired "technical debt" and have had to sacrifice clarity and simplicity for complexity and possibly bugs, then debugging skills can be useful.

As a basic primer, you might want to look at Code for a big picture view of what's going on with computers. For basic logic skills, the first two chapters of How to Prove It are great. Being able to think about conditional expressions symbolically (and not get confused by your own code) is a useful skill. Sometimes business requirements change and require you to modify conditional statements. With an understanding of Boolean Algebra, you will make fewer mistakes and get past this common hurdle sooner.

Lots of beginners struggle with logic early on while also learning a language, framework, and whatever else. Luckily, Boolean Algebra is a tiny topic. Those first two chapters pretty much cover the core concepts of logic that I saw over and over again in various courses in college (programming courses, algorithms, digital circuits, etc.)

Once you figure out a domain/industry you're interested in, I highly recommend focusing on one general purpose programming language that is popular in that domain. Learn about data structures and learn how to use the language to solve problems using data structures. Try not to spread yourself too thin with learning languages. It's more important to focus on learning how to get the computer to do your bidding via one set of tools - later on, once you have that context, you can experiment with other things. It's not a bad idea to learn multiple languages, since in some cases they push drastically different philosophies and practices, but give it time and stay focused early on.
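
The Boolean Algebra point a couple of paragraphs up can be made concrete: De Morgan's laws let you rewrite a negated compound conditional without changing its meaning, which is exactly the kind of symbolic manipulation that keeps a refactored conditional correct. A quick check in Python:

```python
# De Morgan: not (a and b) == (not a) or (not b)
#            not (a or b)  == (not a) and (not b)
# Exhaustively verify both identities over all truth assignments.
for a in (True, False):
    for b in (True, False):
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))
print("De Morgan's laws hold for all inputs")
```
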
As you gain confidence there, identify a simple project you can take on that uses that general purpose language, and perhaps a development framework that is popular in your target industry. Read up on best practices, and stick to a small set of features that helps you complete your mini project.

When learning, try to avoid haplessly jumping from tutorial to tutorial if it means that it's an opportunity to better understand something you really should understand from the ground up. Don't try to understand everything under the sun from the ground up, but don't shy away from 1st party sources of information when you need them. E.g. for iOS development, Apple has a lot of development guides that aren't too terrible. Sometimes these guides will clue you into patterns, best practices, pitfalls.

Imperfect solutions are fine while learning via small projects. Focus on completing tiny projects that are just barely outside your skill level. It can be hard to gauge this yourself, but if you ever went to college then you probably have an idea of what this means. The feedback cycle in software development is long, so you want to be unafraid to make mistakes, and prioritize finishing stuff so that you can reflect on what to improve.

u/reddit_user_---_---_ · 2 pointsr/webdev

u/terminalmanfin · 2 pointsr/explainlikeimfive

The single best resource I've found for this is the book Code by Charles Petzold http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

He walks you through how computers work from the formation of telegraphs, to logic circuits, to small memory, math, and near the end a small computer with a custom assembly language.

u/cloakdood · 2 pointsr/learnprogramming

There's a fantastic book that explains how a computer works from the circuits up. I think it would greatly aid in your understanding of how computers work.

Code: The Hidden Language of Computer Hardware and Software

u/wcbdfy · 2 pointsr/askscience

If you really want to understand how all things (binary coding, electronic representation, logic gates, microprocessors) come together I cannot recommend this book enough - Code - by Charles Petzold

u/theinternetftw · 2 pointsr/askscience

The Turing machine answer is a fantastic theoretical one, but if you want to see a practical answer for "how do you build a computer (like most people would think of a computer) from scratch", which seems to be what you were looking for when you wrote this:

> What is going on at the lowest level? How are top-level instructions translated into zeroes and ones, and how does that make the computer perform an action?

...then this book is a fantastic, down-to-earth, extremely approachable first read for such things (and designed such that you don't need *any* prior knowledge to start reading it). Seriously, if you want to dive a little bit deeper, I highly recommend it.

edit: seems someone already recommended Code. Still, can't give it enough praise. Or The Elements of Computing Systems (TECS), which is a (only *slightly*) more technical read designed around building everything that a computer "is", piece by piece...

Edit2: And as for "what's going on with the Minecraft ALU", TECS is a good read there as well, since the machine described in that book is what I based the ALU on. Also, the fact that Minecraft can simulate logic gates is what links the "real world" and the "minecraft world" together, because logic gates are all you need to build any computer (that's how Minecraft can let you build Turing Complete devices)

u/NotCoffeeTable · 2 pointsr/Minecraft

Yeah... if you want something outside of Minecraft I'd read "Code"

u/ceol_ · 2 pointsr/TheoryOfReddit

Fantastic! I've gotta be honest, though: you're not going to learn a lot of "programming"; you're going to learn a lot of computer science. That's not a bad thing.
You'll learn things like sorting algorithms, complexity, and discrete math. A great language to start out with for this kind of thing is Python. Read Dive Into Python and Think Python to get you started. If you're having trouble wrapping your head around some concepts, I'd suggest Code: The Hidden Language. It's a great introduction to how computers work which should give you a bit of a kick into development.

Here's a quick example of using reddit's API to grab the last comment someone posted using Python:

    import urllib2
    import json

    url = 'http://www.reddit.com/user/ceol_/comments.json'
    request = urllib2.Request(url)
    resource = urllib2.urlopen(request)
    content = resource.read()
    decoded = json.loads(content)

    print decoded['data']['children'][0]['data']['body']

You can fool around with the reddit API here and see what it returns in a nice hierarchy. Hope this helps!

u/MrQuimico · 2 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

It's a great overview of CS, very easy to understand.

u/ianhan · 2 pointsr/AskElectronics

u/hackinthebochs · 2 pointsr/philosophy

I would suggest staying away from The Dragon Book (a CS book on compilers) or anything deeply technical. I don't think that's needed for what you're trying to accomplish. If you do have time I would suggest Code by Charles Petzold. It gives an introduction to modern computing from transistors on up that is understandable by the motivated layperson. I think this book will give you all the intuition you need to formulate your ideas clearly and accurately.

u/cletusjenkins · 2 pointsr/learnprogramming

Charles Petzold's Code is a fun read. It deals with very little actual code, but gives the reader a good understanding of what is going on in a computer.

u/kirang89 · 2 pointsr/AskComputerScience

u/Electric_Wizard · 2 pointsr/learnprogramming

My advice is to not worry too much about your programming experience or lack of it.
While I didn't do a CS bachelor's myself, my understanding is that, in the UK at least, almost all courses will start with the basics and go from there. As there isn't much Computing taught in schools here (until recently), there isn't much courses can assume about people's background in programming, so they will normally start from Hello World in a language like Java or Python and go from there. In fact, given your background you will probably be about average if not above average in the amount of experience you have already.

Of course everything I'm saying here might not apply to you if you're taking a different course which does require some background, but for a regular degree this is normally the case.

That said, it's not going to hurt if you do some more programming or reading on the side, but don't stress too much about it, as any extra work you do will be above and beyond what they'll expect you to know when starting.

Aside from programming, one thing which really helped my understanding of things was to read this book; it covers what's actually going on in terms of the hardware and software in a computer from first principles, and should help your understanding and complement what you'll be taught in your course.

u/bithush · 2 pointsr/explainlikeimfive

You want to read Code by Charles Petzold. It is a modern classic and takes you from a flashlight to a modern CPU. One of the best computer books I have ever read. It is so good it never leaves my desk as I love to read it randomly. Pic!

u/jedwardsol · 2 pointsr/learnprogramming

u/ThePonderousBear · 2 pointsr/computerscience

u/EnjoiRelyks · 2 pointsr/computerscience

u/ChrisAshtear · 2 pointsr/gaming

congrats on choosing something you want to do! btw I recommend reading CODE by charles petzold.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1330580000&sr=8-1

It's a book that shows you how everything in a computer works on the electric level, and then shows you machine code & assembly. Not that you really need to program in assembly, but it gives you a good mindset as to how programming works.

u/factorysettings · 2 pointsr/pics

I'm a self taught programmer, so I don't know what CS degrees entail, but I highly recommend the book Code and also another one called The Elements of Computing Systems. The former pretty much teaches you how a computer physically works and the latter teaches you how to build a processor and then write an OS for it. After reading those two books you pretty much know how computers work at every level of abstraction. I think that's the way programming should be taught.

u/autisticpig · 2 pointsr/Python

Welcome! Have you read code the hidden language?

u/Thanks-Osama · 2 pointsr/learnprogramming

If you're not afraid of math then I would recommend the books by Robert Sedgewick. His Java book really shows off Java. His Algorithms book is a religious experience. And if you're feeling masochistic, the Sipser book is well suited.

u/cbarrick · 2 pointsr/computing

Sipser's Introduction to the Theory of Computation is the standard textbook. The book is fairly small and quite well written, though it can be pretty dense at times. (Sipser is Dean of Science at MIT.)

You may need an introduction to discrete math before you get started. In my undergrad, I used Rosen's Discrete Mathematics and Its Applications. That book is very comprehensive, but that also means it's quite big. Rosen is a great reference, while Sipser is more focused.
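
For a taste of what Sipser covers: a deterministic finite automaton is nothing more than a transition table plus a set of accepting states. Here is a toy DFA (our own illustration, not an example from the book) that accepts binary strings containing an even number of 1s:

```python
def accepts(s, transitions, start, accepting):
    """Run a DFA over the input string and report whether it accepts."""
    state = start
    for ch in s:
        state = transitions[(state, ch)]
    return state in accepting

# States track the parity of the number of 1s seen so far.
table = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}

print(accepts("1001", table, 'even', {'even'}))  # → True  (two 1s)
print(accepts("1011", table, 'even', {'even'}))  # → False (three 1s)
```
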
u/shared_tango_ · 2 pointsr/AskReddit

Here, if you can find it somewhere on the internet (cough), this book covers it nicely and is widely used (at least in German universities): https://www.amazon.de/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/nerga · 2 pointsr/math

this is the book you are talking about I am assuming? I think I am looking for a more advanced book than this. I have already learned most of these topics in my own classes in school. I suppose I would be looking for the books that would be recommended after having learned this book.

u/miketuritzin · 2 pointsr/opengl

This is surprisingly tricky to get right. I used the "polyboards" technique described in chapter 9 of this book, and it works well: https://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869

u/CodyDuncan1260 · 2 pointsr/gamedev

Game Engine:

Game Engine Architecture by Jason Gregory, best you can get.

Game Coding Complete by Mike McShaffry. The book goes over the whole of making a game from start to finish, so it's a great way to learn the interaction the engine has with the gameplay code. Though, I admit I also am not a particular fan of his coding style, but have found ways around it. The boost library adds some complexity that makes the code more terse. The 4th edition made a point of not using it after many met with some difficulty with it in the 3rd edition. The book also uses DXUT to abstract the DirectX functionality necessary to render things on screen. Although that is one approach, I found that getting DXUT set up properly can be somewhat of a pain, and the abstraction hides really interesting details about the whole task of 3D rendering. You have a strong background in graphics, so you will probably be better served by more direct access to the DirectX API calls. This leads into my suggestion for Introduction to 3D Game Programming with DirectX10 (or DirectX11).
C++:

C++ Pocket Reference by Kyle Loudon

I remember reading that it takes years if not decades to become a master at C++. You have a lot of C++ experience, so you might be better served by a small reference book than a large textbook. I like having this around to reference the features that I use less often. Example:

    namespace { //code here }

is an unnamed namespace, which is a preferred method for declaring functions or variables with file scope. You don't see this too often in sample textbook code, but it will crop up from time to time in samples from other programmers on the web. It's $10 or so, and I find it faster and handier than standard online documentation.

Math:

You have a solid graphics background, but just in case you need good references for math:
3D Math Primer
Mathematics for 3D Game Programming

Also, really advanced lighting techniques stretch into the field of Multivariate Calculus. Calculus: Early Transcendentals Chapters >= 11 fall in that field.

Rendering:

Introduction to 3D Game Programming with DirectX10 by Frank. D. Luna.
You should probably get the DirectX11 version when it is available, not because it's newer, not because DirectX10 is obsolete (it's not yet), but because the new DirectX11 book has a chapter on animation. The directX 10 book sorely lacks it. But your solid graphics background may make this obsolete for you.

3D Game Engine Architecture (with Wild Magic) by David H. Eberly is a good book with a lot of parallels to Game Engine Architecture, but focuses much more on the 3D rendering portion of the engine, so you get a better depth of knowledge for rendering in the context of a game engine. I haven't had a chance to read much of this one, so I can't be sure of how useful it is just yet. I also haven't had the pleasure of obtaining its sister book 3D Game Engine Design.

Given your strong graphics background, you will probably want to go past the basics and get to the really nifty stuff. Real-Time Rendering, Third Edition by Tomas Akenine-Moller, Eric Haines, Naty Hoffman is a good book of the more advanced techniques, so you might look there for material to push your graphics knowledge boundaries.

Software Engineering:

I don't have a good book to suggest for this topic, so hopefully another redditor will follow up on this.

If you haven't already, be sure to read about software engineering. It teaches you how to design a process for development, the stages involved, effective methodologies for making and tracking progress, and all sorts of information on things that make programming and software development easier. Not all of it will be useful if you are a one man team, because software engineering is a discipline created around teams, but much of it still applies and will help you stay on track, know when you've been derailed, and help you make decisions that get you back on. Also, patterns. Patterns are great.

Note: I would not suggest Software Engineering for Game Developers. It's an ok book, but I've seen better, the structure doesn't seem to flow well (for me at least), and it seems to be missing some important topics, like user stories, Rational Unified Process, or Feature-Driven Development (I think Mojang does this, but I don't know for sure). Maybe those topics aren't very important for game development directly, but I've always found user stories to be useful.

Software Engineering in general will prove to be a useful field when you are developing your engine, and even more so if you have a team. Take a look at This article to get small taste of what Software Engineering is about.

Why so many books?
Game Engines are a collection of different systems and subsystems used in making games. Each system has its own background, perspective, concepts, and can be referred to from multiple angles. I like Game Engine Architecture's structure for showing an engine as a whole. Luna's DirectX10 book has a better Timer class. The DirectX book also has better explanations of the low-level rendering processes than Coding Complete or Engine Architecture. Engine Architecture and Game Coding Complete touch on Software Engineering, but not in great depth, which is important for team development. So I find that Game Coding Complete and Game Engine Architecture are your go to books, but in some cases only provide a surface layer understanding of some system, which isn't enough to implement your own engine on. The other books are listed here because I feel they provide a valuable supplement and more in depth explanations that will be useful when developing your engine.

tldr: What Valken and SpooderW said.

On the topic of XNA, anyone know a good XNA book? I have XNA Unleashed 3.0, but it's somewhat out of date compared to the new XNA 4.0. The best looking up-to-date one seems to be Learning XNA 4.0: Game Development for the PC, Xbox 360, and Windows Phone 7. I have the 3.0 version of this book, and it's well done.

*****
Source: Doing an Independent Study in Game Engine Development. I asked this same question months ago, did my research, got most of the books listed here, and omitted ones that didn't have much usefulness. Thought I would share my research, hope you find it useful.

u/Waitwhatwtf · 2 pointsr/learnprogramming

It's going to sound really far outside of the topic, but it really helped with my logical mathematical reasoning: Mathematics for 3d Game Programming and Computer Graphics.

I'll also preface this by saying you're probably going to need a primer to get into this book if you're not sure how to reason about a greatest common factor. But being able to tackle that book is a great goal, and it can help you step through mathematical logic.

Also, graphics is awesome.

u/pyromuffin · 2 pointsr/Games

I read this book and it gave me great power:
https://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869

it wasn't easy, but if you try really hard it'll be worth it.

u/naranjas · 2 pointsr/funny

> Can you give me any more info on what types of things you simulate

There are so many different things. One example that involves physical simulation is rendering. Rendering, turning a 3d description of a scene into a 2d image, is all about simulating the physics of light transport. Given a set of lights and surfaces you simulate how light bounces around and what a virtual observer placed somewhere in the scene would see. Another example is explosions. Cool/realistic looking explosions for movies involve simulating burning materials, fluid/gas movement, sound propagation, fracture, plastic/non-plastic deformation, the list goes on and on.

Here are some books that might get you started in the right direction

• Fundamentals of Computer Graphics: This is an entry level book that surveys a number of different areas of computer graphics. It covers a lot of different topics but it doesn't really treat anything in depth. It's good to look through to get a hold of the basics.

• Mathematics for 3D Game Programming and Computer Graphics: Pretty decent book that surveys a lot of the different math topics you'll need.

• Fluid Simulation for Computer Graphics: Really, really awesome book on fluid simulation.

• Do a google/youtube search for Siggraph. You'll find a lot of really awesome demonstration videos, technical papers, and introductory courses.

As for programming languages, you're definitely going to need to learn C/C++. Graphics applications are very resource intensive, so it's important to use a fast language. You'll probably also want to learn a couple of scripting languages like Python or Perl. You'll also need to learn some graphics APIs like OpenGL, or DirectX if you're on Windows.

I hope this helped!
u/HarvestorOfPuppets · 2 pointsr/learnmath

Algebra - Required

Trigonometry - Required

Linear Algebra - Required

Calculus - Required for advanced graphics

After these there are bits and pieces depending on what you are doing.

Differential Geometry

Numerical Methods

Sampling Theory

Probability

Computational Geometry

As for discrete math, there are parts you might need. You don't necessarily need to learn whole topics; for example, quaternions are used for rotations in 3d. This is a good book that takes parts of topics that are important to graphics specifically.

https://www.amazon.com/dp/1435458869/?tag=terathon-20
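
As an illustration of one of those bits and pieces, rotating a vector with a quaternion takes only a few lines. This is a from-scratch sketch for clarity; a real engine would use a tested math library:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about a unit axis."""
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(p, q):
    """Hamilton product of two quaternions."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def rotate(v, q):
    """Rotate vector v by unit quaternion q via q * v * q^-1."""
    qv = (0.0, v[0], v[1], v[2])
    conj = (q[0], -q[1], -q[2], -q[3])
    return quat_mul(quat_mul(q, qv), conj)[1:]

# A 90-degree rotation about the z-axis sends (1, 0, 0) to (0, 1, 0).
q = quat_from_axis_angle((0.0, 0.0, 1.0), math.pi / 2)
x, y, z = rotate((1.0, 0.0, 0.0), q)
```
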

u/bdubbs09 · 2 pointsr/learnmachinelearning

Hands-On Machine Learning is pretty much a staple to start out with.

u/puddlypanda12321 · 2 pointsr/learnprogramming

I really enjoyed 'Hands-On Machine Learning with Scikit-Learn and TensorFlow.' It has a great balance between theory and application of common machine learning techniques, as well as best practices when dealing with data (for example, how to avoid overfitting).

https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291
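
The overfitting-avoidance practice mentioned above boils down to holding out data the model never sees during fitting. Here is a hand-rolled split for illustration; in practice scikit-learn's train_test_split does the same job:

```python
import random

def train_test_split(data, test_ratio=0.25, seed=42):
    """Shuffle a dataset and split it into train and held-out test portions."""
    rng = random.Random(seed)       # fixed seed keeps the split reproducible
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_ratio)
    return shuffled[n_test:], shuffled[:n_test]

train, test = train_test_split(list(range(100)))
# The test set is only touched once, to estimate generalization error.
```
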

u/ryanbuck_ · 2 pointsr/learnmachinelearning

It sounds like you have identified your weakness. Presently, that is programming in Python and using the sklearn library.

I would recommend taking a MOOC on Python first. Lynda.com has a free trial and Python videos. Datacamp is another good start; it has a free trial and maybe some Python basics, but definitely something on sklearn, and you can get some pandas training or R training there (the Kaggle libraries, most likely).

At that point, if you are going the TensorFlow route, Aurélien Géron has a great hands-on book, Hands-On Machine Learning with Scikit-Learn and TensorFlow.

If you're going with PyTorch, I dunno.

Your mileage is going to vary, you could always use a book to learn python, or whatever.

Just make sure you learn to program first; you'd be surprised how much 2 weeks of very hard work will earn you. Don't expect it to be 'easy' ever, though.

Also, if you're not formally educated in statistics, keep an eye out for statistics advice until you have the time to work on it (like in a MOOC, course, or blog). Learning some real analysis will make understanding the papers a real possibility (once again, it will probably never be easy).

It is truly stunning how many years of preparation it takes to become competent in this. It’s a lovely science, but the competent ones have generally been on a mathematical/science track since 5th grade. Doesn’t mean we can’t become competent but it takes time. Imagine the equivalent of an undergraduate degree just devoted to ML and you’re about there.

u/ActuarialAnalyst · 2 pointsr/actuary

Yeah. If you want to be good at programming-programming I would read this book and do all of the projects: https://runestone.academy/runestone/books/published/fopp/index.html If you take an algorithms class you will probably get to use Python.

If you want to be good at data analytics I would read "R for data science" if you want to use R. If you learn python people like this book for data science learning https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=pd_sbs_14_2/145-5658251-1609721?_encoding=UTF8&amp;pd_rd_i=1491962291&amp;pd_rd_r=4e33435c-cc98-4256-9c50-6050e79b7803&amp;pd_rd_w=ejSx8&amp;pd_rd_wg=Ter1m&amp;pf_rd_p=d66372fe-68a6-48a3-90ec-41d7f64212be&amp;pf_rd_r=3X23DYAJ2ZMCKP9AA1Z4&amp;psc=1&amp;refRID=3X23DYAJ2ZMCKP9AA1Z4 .

These books are kind of different though. The Python book is much more focused on theory and will help you less in the workplace if you aren't actually building predictive models (at least I think, based on the table of contents).

u/davincismuse · 2 pointsr/learnmachinelearning

If you already know Python and are familiar with the numpy, pandas, matplotlib and jupyter notebooks, then this book does a great job of teaching basic Machine Learning and more advanced Deep Learning concepts.

Hands-On Machine Learning with Scikit-Learn and TensorFlow by Aurélien Géron.

https://www.amazon.in/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

One caveat though - Tensorflow has undergone a lot of changes since this book came out, so you might have to tweak the code a bit.

Github Repo for the code - https://github.com/ageron/handson-ml

u/Bayes_the_Lord · 2 pointsr/datascience

Hands-On Machine Learning

There's a new edition coming out in August though.

u/officialgel · 2 pointsr/learnpython

There is a good book, though it's not easy to digest if you don't know your way around Python much:

https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

There are also many many walkthroughs on specific things like image recognition. Just search for them.

Even if these things aren't what you want to do, it's good to do them for the experience and knowledge while working up to the project you have in mind. They all have elements you would need to understand for your own project.

It's all an ongoing thing. Nothing is a waste of time if you're learning, and there is no rush, right?

u/Dracontis · 2 pointsr/datascience

I'm a beginner too, so I can't give you end-to-end solution. I'll try to describe my path.

1. You'll definitely need some statistics background. I've taken the free Inferential and Descriptive Statistics courses from Udacity.
2. I've decided to go further in Machine Learning. There I had two choices: Machine Learning A-Z™: Hands-On Python & R In Data Science, and Machine Learning from Andrew Ng. I decided to take the second one and I'm on the fifth week now. It's really good for ML basics and theory, but the programming assignments are horrible. So I think I'll have a basic understanding of what's going on, but near to no practical skills. That's why I asked a question here about scientific advising.
3. After I finish the course, I plan to read Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems to boost my knowledge of the algorithms in Python.

I have no idea what I'll do next. Maybe I'll take several courses and nanodegrees on Coursera. Maybe I'll find guidance and start getting hands-on experience on a real project. It's not so hard to start learning - it's hard to find purpose and application for your knowledge.
u/Dansio · 2 pointsr/learnprogramming

Then learning Python would be very useful for you. I have used the book called Automate the Boring Stuff (free).

For data science and machine learning I use: Data Science from Scratch and Hands on Machine Learning with Scikit-learn and Tensorflow.

For AI I have used Artificial Intelligence: A Modern Approach (3rd ed.).

u/lbiewald · 2 pointsr/learnmachinelearning

I agree this is a missing area. I've been working on some materials, like recent videos on Transfer Learning https://studio.youtube.com/video/vbhEnEbj3JM/edit and One-Shot Learning https://www.youtube.com/watch?v=H4MPIWX6ftE, which might be interesting to you. I'd be interested in your feedback. I also think books like https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=pd_lpo_sbs_14_t_1?_encoding=UTF8&psc=1&refRID=3829RHN356ZXBEBP0KF3 do a good job of bridging some of this gap. Reading conference papers is a skill that takes practice and a strong math background.

u/Gimagon · 2 pointsr/neuralnetworks

I would highly recommend Aurélien Géron's book. The first half is an introduction to standard machine learning techniques, which I would recommend reading through if you have little familiarity. The second half is dedicated to neural networks and takes you from the basics up to results from very recent (2017) literature. It has examples building networks both from scratch and with TensorFlow.

If you want to dive deeper, the book Deep Learning is a little more theoretical, but lacks a lot of low level detail.

Joel Grus's "Data Science From Scratch" is another good reference.

u/bagoffractals · 2 pointsr/serbia

Book 1

This one looks good to me for a start. There's another where you build all these algorithms from scratch, but I forgot the title; I'll update the post later. Anyway, there are plenty of university lectures around the net, so you can take a look at those too.

u/ginkogo · 2 pointsr/CasualConversation

Since I'm a lazy typer:

It's well written, neither fear mongering nor whitewashing, just an analytical approach to possible outcomes of AIs.

u/torster2 · 2 pointsr/civ

If you're interested in this topic, I would highly recommend Superintelligence by Nick Bostrom.

u/CyberByte · 2 pointsr/artificial

This book is just about the potential impacts of superintelligence. You might find it interesting, and some might argue that you should read this or Superintelligence to know what you're getting into. Just know that it won't really teach you anything about how AI works or how to develop it.

For some resources to get started, I'll just refer you to some of my older posts. This one focuses on mainstream ("narrow") AI, and this one mostly covers AGI (artificial general intelligence / strong AI). This comment links to some education plans for AGI, and this one has a list of cognitive architectures.

Here is also a thread by another physicist who wanted to get into AI. The thread got removed, but you can still read the responses.

u/TrumpRobots · 2 pointsr/artificial

There is no guarantee that AI will be conscious. It might just be a mindless self-improving algorithm that organizes information or builds paper clips. Or maybe it'll just perfectly follow the orders of one individual who owns it. Maybe the US, Russian, or some other country's government steals it and uses said mindless "God" to rule the world.

Maybe many ASIs will be "born" within a short period of time (Google's, Amazon's, Apple's, China's, etc.) and they will go to war for finite resources on the planet, leaving humanity to fend for itself. Each might have humanity's best interest at heart, but they aren't able to trust the others to act optimally, and thus are willing to go to war in order to save us.

Maybe AI consciousness will be so alien to us and us to it that we don't even recognize each other as "alive." An AI might think on the time scales of milliseconds, so a human wouldn't even seem alive, since only every couple hundred years of subjective time would the AI observe humans taking a breath.

My point is, there is no way to know ahead of time what AI will bring. There are endless possible outcomes (unless somehow physics prevents an ASI) and they all seem equally likely right now. There are only a few, maybe only one, where humanity comes out on top.

Highly recommend this book.

I personally love Programming Game AI By Example. It gives lots of very usable examples in an entertaining and understandable way. It's pretty beginner-friendly and even offers a game-math primer at the start of the book. However, the examples still have a lot of meat to them and it thoroughly explains some important AI concepts like state machines and pathfinding.

u/Greystache · 2 pointsr/gamedev

Steering behaviors are explained in depth (with c++ code) in the book Programming Game AI by Example

It's mentioned in the article, but I think it's worth pointing it out again as it's a very good book on the topic.
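
For anyone curious what a steering behavior actually looks like: the core of "seek" is just desired velocity minus current velocity. A rough Python translation of the idea (Buckland's examples are in C++; the function names and numbers here are my own illustration):

```python
# A minimal "seek" steering behavior: steer toward a target at max speed.

def seek(position, velocity, target, max_speed=2.0):
    """Return the steering force that turns the agent toward `target`."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    # Fall back to 1.0 when already at the target, to avoid division by zero.
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    desired = (dx / length * max_speed, dy / length * max_speed)
    # Steering force = desired velocity - current velocity.
    return (desired[0] - velocity[0], desired[1] - velocity[1])

steering = seek(position=(0.0, 0.0), velocity=(0.0, 0.0), target=(10.0, 0.0))
print(steering)  # -> (2.0, 0.0)
```

Flee, arrive, pursue, and the rest are small variations on this same desired-minus-current pattern.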

u/BlindPaintByNumbers · 2 pointsr/unrealengine

Sorry, no, Goal Oriented Action Planning (GOAP) is the name of an AI strategy, like Finite State Machines. There is a wonderful AI book called Programming Game AI by Example if you're interested in delving into all the possible mechanics. There are plenty of free online resources too. Read up on both types of AI and I bet you'll have an AHA! moment. Other people have thought about these problems for a long time.
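
To make the distinction concrete: an FSM hard-codes its transitions, while a GOAP agent searches for an action sequence that satisfies a goal. A toy breadth-first planner, purely my own illustration (real GOAP implementations use A* with costs):

```python
from collections import deque

# Each action: (name, preconditions, effects) over a set of world facts.
ACTIONS = [
    ("get_axe",   set(),        {"has_axe"}),
    ("chop_wood", {"has_axe"},  {"has_wood"}),
    ("make_fire", {"has_wood"}, {"warm"}),
]

def plan(start, goal):
    """Breadth-first search for the shortest action sequence reaching `goal`."""
    queue = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:                     # all goal facts satisfied
            return steps
        for name, pre, eff in ACTIONS:
            if pre <= state:                  # action is applicable here
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None

print(plan(set(), {"warm"}))  # -> ['get_axe', 'chop_wood', 'make_fire']
```

The payoff is that adding a new action never requires rewiring transitions by hand; the planner simply finds new chains.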

u/linuxlass · 2 pointsr/learnprogramming

I started by showing my son Scratch when he was 9.5yo and helping him make a couple of arcade games with it. He was never all that interested in Logo, but got really turned on by Scratch. After a couple of months he was frustrated by Scratch's limitations, and so I installed Ubuntu on an old computer, showed him pygame/python and worked through a couple of online tutorials with him, and let him loose.

He learned to use Audacity to edit files from Newgrounds, and Gimp to edit downloaded graphics as well as create his own. He made a walk-around, RPG-like adventure game and a 2D platformer, and then decided he wanted to learn pyggel and has been working on a 3D FPS since last summer.

Soon, I'm going to get him started on C++ so we can work through a book on game AI (which uses C++ for all its examples). He's 13.5 now, and thinks programming is great and wants to grow up to be a programmer like his mom :)

I highly recommend a simple language like python for a beginner, but Scratch is wonderful for learning all the basic concepts like flow control, variables, objects, events, etc, in a very fun and easy way. The Scratch web site also makes it simple to share or show off because when you upload your program it gets turned into a Java applet, so anyone with a browser can see what you've done.

u/gamya · 2 pointsr/Unity3D

Very good response.

This is explained very well in the book Programming Game AI by Example, in C++ though.

Moreover, you can have more than one state machine. One global, one for the weapon, one for the player... etc. Depending on your game needs.
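
A minimal sketch of what running a global machine alongside a weapon machine might look like; this is illustrative Python of my own, not the book's C++ design:

```python
# Two state machines updated side by side each tick: one for movement,
# one for the weapon. A state is a function that may return the next state.

class StateMachine:
    def __init__(self, state):
        self.state = state

    def update(self, owner):
        next_state = self.state(owner)
        if next_state:
            self.state = next_state

def patrol(agent):
    if agent["enemy_visible"]:
        return attack

def attack(agent):
    if not agent["enemy_visible"]:
        return patrol

def idle_gun(agent):
    if agent["ammo"] == 0:
        return reload_gun

def reload_gun(agent):
    agent["ammo"] = 6
    return idle_gun

agent = {"enemy_visible": True, "ammo": 0}
global_fsm, weapon_fsm = StateMachine(patrol), StateMachine(idle_gun)
for _ in range(2):  # each tick, both machines update independently
    global_fsm.update(agent)
    weapon_fsm.update(agent)
print(global_fsm.state.__name__, weapon_fsm.state.__name__, agent["ammo"])
# -> attack idle_gun 6
```

The point is that neither machine needs to know about the other, which keeps each one small.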

u/apieceoffruit · 2 pointsr/learnprogramming

You are looking for a mix of "neural networks" and "fuzzy logic"

Personally I'd suggest Game AI by example great ai book.

u/efofecks · 2 pointsr/gamedev

This book should have everything you need. It's accessible, quite funny at times, but has a good introduction to everything from finite state machines and communication (what boxhacker described), to pathfinding and goal seeking behavior. What's best is that they have sample c++ code on their website you can look through.

u/workaccountthrowaway · 2 pointsr/gamedev

I've found working my way through this book: Programming Game AI By Example really helped me figure out how to do all you're asking. I would recommend it if you are serious in learning how to make basic to more advanced AI.

u/duckyss · 2 pointsr/gamedev

Look at Programming Game AI by Example. It has a lot of collision detection information and how it is used in various behaviors.

u/venesectrixzero · 2 pointsr/gamedev

Programming game ai, http://www.amazon.com/gp/aw/d/1556220782/ref=redir_mdp_mobile, has great example code for a soccer game including path finding and steering.

u/stevelosh · 2 pointsr/programming

You're lucky then. A ton of the books for my CS degree were $90+. Here's a current example: the book for <http://ai-class.com/>, which has over 100,000 students registered now, is $116 on Amazon: http://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597

Edit: Dear Markdown, you can be a dick sometimes.

I don't know where you've been looking, but Bayesian networks have been around long enough that they are covered quite well in textbooks. The very popular AI: A Modern Approach (Russell & Norvig) has a good overview of the basics and it is very well written with plenty of examples, as far as I recall. If you really want to get in depth, the "bible" on Bayesian Networks is the fairly recent textbook Probabilistic Graphical Models (Koller & Friedman). I'd recommend finding PDF samples or something before you buy them, of course. And don't worry if you feel some Bayesian Network stuff is over your head ... this is mostly graduate-level CS stuff, so you might just need to be patient =P
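
For a taste of the basics those textbooks cover: inference by enumeration on the classic rain/sprinkler/wet-grass network takes only a few lines. The probabilities below are illustrative toy values, not taken from either book:

```python
from itertools import product

# P(Rain), P(Sprinkler | Rain), P(WetGrass | Sprinkler, Rain).
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},   # P(S | R=True)
               False: {True: 0.4, False: 0.6}}    # P(S | R=False)
P_wet = {(True, True): 0.99, (True, False): 0.9,  # P(W=True | S, R)
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    """Full joint probability P(R=r, S=s, W=w) via the chain rule."""
    pw = P_wet[(s, r)]
    return P_rain[r] * P_sprinkler[r][s] * (pw if w else 1 - pw)

# P(Rain=True | WetGrass=True): sum out the sprinkler variable, normalize.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(round(num / den, 3))  # -> 0.358
```

The textbooks then show why this brute-force enumeration blows up on larger networks, motivating variable elimination and sampling methods.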

u/Psygohn · 2 pointsr/cscareerquestions

I don't know a lot about AI as far as gaming goes, so you'll have to forgive my ignorance about advice specific to gaming.

I took a third year AI course which used this book:

http://www.amazon.com/Artificial-Intelligence-Modern-Approach-Edition/dp/0136042597

Personally I would start there, learn the basics of AI at a theoretical level. Then once you've got a good handle on the fundamentals of AI, you can begin learning AI that's more specific to gaming. Presumably that would be practical applications of AI.

u/scohan · 2 pointsr/compsci

I think this might be beyond what you're looking for, but I really enjoyed Pattern Recognition and Machine Learning. It's very heavy on statistics, and if you're looking into machine learning methods, it has a wonderful amount of mathematical information given in a fairly clear manner. It might be severe overkill if this isn't your field, but I thought I'd mention it since you said AI.

For AI in general, I see Artificial Intelligence: A Modern Approach used a lot. It gives some solid basic concepts, and will be helpful in getting you started writing basic AI in your applications.

I can't really recommend discrete math because, despite enjoying it quite a bit, I haven't found a textbook that I like enough to endorse. My textbook for it in college was by Rosen, and I despised it.

edit:
Just double checked it, and I would stay far away from the first recommendation unless you have a very extensive knowledge of sophisticated statistics. I like it because it gives the math that other books gloss over, but it's not good for an introduction to the subject. It's almost like going through a bunch of published papers on some new cutting edge methods. The ever popular Machine Learning by Thomas Mitchell is a much better introduction to machine learning. If you want to obtain the mathematical depth necessary for your own research into the field, go with the other book after you've gotten acquainted with the material. I'll leave my suggestion up anyway in case anyone here might find it interesting.

u/GreyMX · 2 pointsr/artificial

The classic book for AI is the Russell-Norvig book, which gives a pretty comprehensive overview of the fundamental methods and theories in AI. It's also fairly well written imo.

The third edition is the latest one, so it's going to be rather expensive. You're probably just as well off with the first or second edition (which you should be able to find much cheaper) since the changes between them aren't very significant.

In that case...
You may want to wait for the 5th edition of UNIX and Linux System Administration, as it should release near the end of this year and they don't release new versions that often.

A good way to get started building a college library is to see what the curriculum for the school is and what books are required by professors. Often other colleges will list their book recommendations for the courses online to get an idea of where to start looking. (I know my school has an online bookstore that lists the books for each course and is open to the public)

At least one or two good books in each of those categories, to get a rough idea to start:

u/yturijea · 2 pointsr/learnprogramming

Which kind of AI do you have in mind?

If you wanna go deep and academic about it, you should read Artificial Intelligence: A Modern Approach (3rd Edition)

u/sarahbau · 2 pointsr/artificial

I have to throw out the obligatory [Artificial Intelligence - A Modern Approach](http://www.amazon.com/Artificial-Intelligence-Modern-Approach-Edition/dp/0136042597). It really is quite good.

u/sketerpot · 2 pointsr/IAmA

The standard introductory AI textbook is Artificial Intelligence: A Modern Approach, by Russell and Norvig. It can be a bit heavy, though.

Introduction to Computer Science: https://mitpress.mit.edu/books/introduction-computation-and-programming-using-python-1 (basic programming and problem solving)

Algorithms: https://mitpress.mit.edu/books/introduction-algorithms (as you have heard) It is not very easy to read, but the content is on point.

Artificial Intelligence: https://www.amazon.ca/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597 (as above, this is a seminal book, but not the most approachable)

u/dmazzoni · 2 pointsr/learnprogramming

Artificial neural networks are great, but keep in mind that they're just a means to an end. The best way to learn them is to go through a good textbook or online course where you'll try them out on good examples that have been designed specifically to be good for beginners.

To a professional, you don't start with the tool and search for a problem - you start with a problem and figure out the best tool. Sometimes that tool is neural networks, but probably 99% of the time it's not. Even when the right tool is "machine learning", there are a lot of machine learning techniques other than ANNs.

As a beginner, the best thing you can do is start by learning about machine learning in general. You can't properly use ANNs if you don't understand the principles of machine learning in general, which is what the book or course I linked above will give you.
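
It also helps to see how small the simplest ANN really is: a single perceptron is a handful of lines, and everything a course adds (hidden layers, backprop, regularization) builds on the same update loop. A toy sketch of my own:

```python
# Train a single perceptron on the AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

def predict(x):
    """Threshold the weighted sum: fire (1) if above zero, else 0."""
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

for _ in range(20):  # a few epochs suffice for a linearly separable target
    for x, target in data:
        error = target - predict(x)
        weights[0] += lr * error * x[0]
        weights[1] += lr * error * x[1]
        bias += lr * error

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
```

The famous limitation, covered in any intro course, is that no single perceptron can learn XOR; that is exactly what hidden layers fix.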


u/lasthope106 · 2 pointsr/ECE

https://www.ai-class.com

The enrollment for this term is already closed, but you can still watch the lectures. There is a very high probability the course will be offered again in January. The past few weeks we have been learning about techniques that are used to control robots.

For the textbook you can't find a better reference than:

http://www.amazon.com/gp/product/0136042597

u/dataCRABS · 2 pointsr/DestinyTheGame

If you guys enjoy this stuff and the idea of AI, I implore you to check out Ray Kurzweil's revolutionary docubook on "The Singularity," aka the point in time in the near future where technological evolution surpasses the rate of biological evolution.

u/linuxjava · 2 pointsr/Futurology

While all his books are great. He talks a lot about exponential growth in "The Age of Spiritual Machines: When Computers Exceed Human Intelligence" and "The Singularity Is Near: When Humans Transcend Biology"

His most recent book, "How to Create a Mind" is also a must read.

u/Supervisor194 · 2 pointsr/exjw

I know this is probably going to be considered a weird, nonstandard answer, but I'm going to give it a go anyway. One of the things I have done personally is pursue optimistic things rather than pessimistic things. The fact that not a day goes by that you "don't read something or study something that emphasizes how fucking doomed we are" is a symptom that you are perhaps subconsciously seeking out those things which reinforce your previous negative worldview.

In point of actual fact there are a lot of interesting positive things happening in the world. I highly recommend that you check out /r/futurology and in particular, there is a book I read that really blew me away about what some people think is going to happen in our lifetimes. It's called The Singularity is Near. Whether or not you accept the conclusions of the book it can really open your eyes to how much is going to change in the next few decades and it will make you think.

And I think that's the rub. You need to think about different things and seek out the positive. Because the simple fact of the matter is we are not necessarily doomed. We have very big problems, but in reality, they are all eclipsed by the fact that the sun will swallow up the Earth in 5 billion years leaving no trace of anything behind. So honestly, we are either going to transcend our biological inheritance in a big way or none of it really matters anyway. A lot of people are working towards the former, myself included.

It helped me, is all I'm saying. :)

If you're interested in the subject, I would recommend that you read The Singularity is Near by Ray Kurzweil.

It's a bit dated now, but Ray Kurzweil's The Age of Spiritual Machines is a fascinating look at where Kurzweil believes the future of AI is going. He makes some predictions for 2009 that ended up being a little generous, but a lot of what he postulated has come to pass. His book The Singularity is Near builds on those concepts if you're still looking for further insight!

> AI is too dangerous and humanity is doomed.

There are undergoing efforts to mitigate that.

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/

u/mossyskeleton · 2 pointsr/Showerthoughts

If you haven't read Superintelligence by Nick Bostrom yet, you should probably read it. (Or don't, if you don't want your supercomputer/AI fears to be amplified a hundred-fold.)

u/k955301 · 2 pointsr/technology
u/antiharmonic · 2 pointsr/rickandmorty

He also wrote the wonderful book Superintelligence that explores routes and concerns with the possible creation of AGI.

u/subdep · 2 pointsr/Futurology

Anybody who is seriously interested in this subject, you must read Nick Bostrom’s book: Superintelligence: Paths, Dangers, Strategies https://www.amazon.com/dp/0198739834/ref=cm_sw_r_cp_api_1PpMAbJBDD4T0

u/funkypunkydrummer · 2 pointsr/intj

Yes, I believe it is very possible.

After reading [Superintelligence](https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/), it is very likely that we may have whole brain emulation as a precursor to true AI. If we are on that path, it makes sense that we would run tests in order to remove AI as an existential threat to humanity. We would need to run these tests in real, life-like simulations that can run continuously and without detection by the emulations themselves in order to be sure we will have effective AI controls.

Not only could humans run these emulations in the future (past? present?), but the Superintelligent agent itself may run emulations that would enable it to test scenarios that would help it achieve its goals. By definition, a Superintelligent agent would be smarter than humans and we would not be able to detect or possibly even understand the level of thinking such an agent would have. It would essentially be our God with as much intellectual capacity beyond us as we have above ants. Time itself could run at nanosecond speeds for the AI given enough computational resources while we experience it as billions of years.

So who created the AI?
Idk, but that was not the question here...

u/Havelok · 2 pointsr/teslamotors

Read this and you'll understand why he's so sober about it: https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834

u/wufnu · 2 pointsr/news

I think the problem is that AIs will not think like us in any way imaginable, and what is reasonable to us may be irrelevant to an AI. There are literally hundreds of books about this, describing in excruciating detail the many, many thousands of ways we can fuck it up, with nobody even getting close to anything approaching a foolproof way of getting it right. The problem is, any screw-up will be catastrophically bad for us. Here's a cheap, easy-to-understand (if a bit dry) book that will describe the basics, if you're really interested.

u/Gars0n · 2 pointsr/suggestmeabook

Superintelligence by Nick Bostrom seems to be just what you are looking for. It straddles the line of being too technical for someone with no background knowledge but accessible enough for the kind of people who are already interested in this kind of thing. The book is quite thorough in its analysis, providing a clear map of potential futures and reasons to worry, but also hope.

u/skepticalspectacle1 · 2 pointsr/worldnews

I'd HIGHLY recommend reading Superintelligence. http://www.amazon.com/gp/aw/d/0198739834 The approaching singularity event is maybe the worst thing to ever happen to mankind... Fascinating read!

u/Liface · 2 pointsr/ultimate

You're strawmanning. I am not insinuating that we should not protest or report human rights violations and social injustice — simply that identity politics is being used as a distraction by, well, both parties, but annoyingly by the left, and is disproportionately represented in present minds and the mainstream media due to human cognitive biases.

Also, your use of scare quotes around artificial intelligence risk suggests to me that you lack information and context. Not surprising, given that the issue is often treated as a joke in the public discourse.

I recommend informing yourself with at least a basic overview, and then you're free to form your own opinions. Nick Bostrom's Superintelligence is a good primer.

u/APimpNamedAPimpNamed · 2 pointsr/philosophy

My friend, I believe you hold the same misguided conception(s) that I did a very short time ago. Please give the following book a read (or listen!).

http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111

u/philmethod · 2 pointsr/IAmA

TBO I think about the dangers and promises of ever more capable technology quite a lot. In my view, if it turns out reasonably well, it will probably stretch over many decades...

If it turns out badly, though, it could be an event: not a sudden event of infinitely increasing technology, but an event of the technological capability we have built up over decades and centuries suddenly turning against us.

Things can change suddenly and unexpectedly. In 1914, a month before World War I, everyone thought that all the great powers of Europe had settled into a stable though somewhat tense modus vivendi. A month later the world was turned on its head.

Have you read Bostrom's book superintelligence?
https://www.amazon.co.uk/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111

There are certain subtle disagreements that I have with his analysis, but I think a lot of what he says about the nature of agents and intelligence in general is valid. Agents generally have goals; if a general agent with a specific set of goals comes across another agent with an incompatible set of goals that blocks its own, the first agent will be inclined to incapacitate or eliminate the blocking agent.

This means if we don't like what a computer is doing, maybe because we programmed in the wrong goals, and we try to stop it, the AI may in turn try to stop us stopping it. If it has an off switch, it may strategize to prevent us from reaching it.

In other words, the same dynamics that cause human beings to wage war with each other (incompatible, conflicting goals) could cause a war between humans and AI. Far from being a fantasy, there are logical reasons to consider it a possibility.

In the same way, while nations are helping each other there is peace, but when one nation turns on another the situation escalates and all hell breaks loose. You could have a situation where an AI is quietly pursuing its goals and doesn't perceive humanity as an impediment, and then suddenly we decide we don't like what the AI is doing, feel it's hogging resources that could be put to better use, and try to stop it. The AI then changes its perception of humanity from an unimportant part of its environment to an impediment to its goals, and turns its vast intelligence to the concern of eliminating us... the equivalent of war.

Some kinds of intelligence could be thought of as a measure of one's ability to think of strategies to get things done. If a vastly higher intelligence and a much lower one have mutually incompatible goals, the higher intelligence will achieve all its goals at the expense of any goals of the lower intelligence that were incompatible with its own.

In other words, in a war between us and a superintelligent AI we might well lose. This is speculation, but quite plausible and logical speculation.

Not sure what you mean by "inevitability based on current trends is never, never, never a good prediction"kind of a very strong positive (inevitability) and negative (never, never, never) juxtaposition.

Current trends continue until they stop. Sometimes projecting current trends is very accurate indeed (viewscreens in Star Trek, Skype today); other times it's not (man on the moon, warp drive).

In my view, past projections of futures where energy is exponentially plentiful and all sorts of vastly wasteful uses of energy are commonplace (flying cars, hoverboards, starships) typically have not come to pass.

But projections of technology becoming ever more precise, fiddly, and complex (genetic engineering, electron microscopes, computers, 3D printers, etc.) have. I have confidence that the tendency of manufacturing precision to increase will continue. And there are plenty of technologies on the horizon: 3D chips, parallel processing, D-Wave quantum computers, etc.

...I think it's fair to say that we are far from the physical limit of computing power. The very existence of the human brain implies that an arrangement of atoms with the computing power of the human mind is possible.

In fact there are basically two alternatives to AI surpassing all our capabilities:

1. Civilization collapses (a war, peak fossil fuels, meteor strike) which I grant you is not beyond the pale of possibilities.

2. We choose not to design computers to be that smart, because of the potential danger they would pose. And again this is not beyond the pale of possibility; the fate of nuclear technology is a precedent for this, as a powerful technology that has actually regressed in many ways due to being regulated out of existence.

So no, it's not inevitable that machines will overtake us universally in capability, but it's sufficiently plausible (I would say probable) to merit considerable thought, especially since there will at least be the challenge of mass unemployment.

BTW I don't think it's likely I'll live forever or get uploaded into a computer either. In my view, the task of building an intelligence capable of obliterating humanity is far simpler than the task of making human beings immortal or of transferring human consciousness onto a computer... which might be fundamentally impossible.
u/Bywater · 2 pointsr/JoeRogan

It was pretty good. I also read another one recently that had some AI named flutter in it, where the first AI is a matchmaking social media construct. It was equal parts terrifying and funny at least. But for the life of me I can't remember the fucking name of it.

u/Titsout4theboiz · 2 pointsr/IAmA

Superintelligence by Nick Bostrom: http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111

Currently working through it, very well written and scientifically backed. Elon tweeted about it himself.

u/rodolfotheinsaaane · 2 pointsr/singularity

He is mostly referring to 'Superintelligence' by Nick Bostrom, in which the author lays out all the possible scenarios of how an AI could evolve and how we could contain it, and most of the time humanity ends up being fucked.