(Part 2) Best computer science books according to redditors


We found 9,284 Reddit comments discussing the best computer science books. We ranked the 1,900 resulting products by number of redditors who mentioned them. Here are the products ranked 21-40. You can also go back to the previous section.


Subcategories:

AI & machine learning books
Systems analysis & design books
Cybernetics books
Information theory books
Computer simulation books
Human-computer interaction books

Top Reddit comments about Computer Science:

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/10_6 · 126 pointsr/learnprogramming

What I did a few years ago after graduating from college in CS to brush up on my DS + Algs knowledge was the following:

  • Read the Algorithm Design Manual.

  • Go through some of the challenges on this interactive python algorithms website.

  • Practice coding simple and then more advanced algorithms on sites like Coderbyte (my site) and HackerRank, which provide good explanations and solutions as well. Here's a list of popular coding challenge websites in 2017.

  • Read as many algorithm explanations and code examples as you can on GeeksforGeeks.

  • Try to implement basic algorithms yourself, like shortest path, minimum spanning tree, DFS + BFS, tree traversals, different sorting algorithms, and min/max heap, and learn their running times (Big-O).

  • Look at some interview questions posted on CareerCup and try to understand how other users solved the questions. Like this example.

  • Aside from coding challenge sites, try to solve common coding interview questions you find online, such as this list.

    Eventually, when you get a coding problem, it will be like a switch going off in your head: you will have had so much practice with different types of algorithms and data structures that you'll be able to reduce the problem to a simpler one you've done before. This is especially the case with dynamic programming problems. Once you've completed 50+ DP challenges and understand how they work, you'll be able to solve practically any DP problem, because they're all very similar.
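    To make that "reduce it to a simpler problem" idea concrete, here is a minimal dynamic programming sketch (my own example, not taken from any of the sites above): classic coin change, where each amount is solved using smaller amounts that were already solved.

        # Fewest coins needed to reach `amount` -- each subproblem builds on smaller ones.
        def min_coins(coins, amount):
            best = [0] + [None] * amount          # best[a] = fewest coins for amount a
            for a in range(1, amount + 1):
                options = [best[a - c] for c in coins
                           if c <= a and best[a - c] is not None]
                if options:
                    best[a] = min(options) + 1    # one coin, plus the best smaller answer
            return best[amount]                   # None means the amount is unreachable

        print(min_coins([1, 5, 10, 25], 63))      # -> 6  (25 + 25 + 10 + 1 + 1 + 1)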
u/Scarbane · 70 pointsr/Futurology

He should brush up on his knowledge about general AI. Nick Bostrom's Superintelligence is a good starting place, even though it's already a few years old.

I recommend the rest of you /r/Futurology people read it, too. It'll challenge your preconceived notions of what to expect from AI.

u/WarrenHarding12 · 53 pointsr/AskReddit

Solar Plane

http://www.solarimpulse.com/en/our-adventure/the-first-round-the-world-solar-flight/

Sustainable City

https://www.youtube.com/watch?v=7UMvj2ZYnU8

New Horizons

http://pluto.jhuapl.edu/index.php

Electric Cars

http://cleantechnica.com/2014/07/28/electric-vehicle-revolution-nigh-infographic/

http://www.iea.org/publications/globalevoutlook_2013.pdf

http://www.abb-conversations.com/2014/03/electric-vehicle-market-share-in-19-countries/

http://www.futuretimeline.net/blog/2014/08/3.htm

http://www.futuretimeline.net/blog/2012/04/9.htm

https://en.wikipedia.org/wiki/Charging_station#Battery_swapping

http://www.futuretimeline.net/21stcentury/images/future-car-technology-2020.gif

Moon Ad

http://www.independent.co.uk/life-style/gadgets-and-tech/the-first-advert-on-the-moon-japanese-soft-drink-manufacturer-will-deliver-a-can-of-pocari-sweat-to-the-lunar-surface-in-2015-9382535.html

https://www.otsuka.co.jp/en/company/release/2014/0515_01.html

Virtual Reality

http://www.computerandvideogames.com/309486/sony-virtual-reality-gamings-going-to-be-absolutely-amazing/

http://www.futuretimeline.net/blog/2013/06/7-2.htm

Genome Sequencing

http://www.amazon.com/Singularity-Near-Humans-Transcend-Biology/dp/0143037889/

http://www.nature.com/nbt/journal/v30/n7/full/nbt.2241.html

http://www.genome.gov/sequencingcosts/

http://www.wired.com/2014/01/the-fda-may-win-the-battle-this-holiday-season-but-23andme-will-win-the-war/

http://www.prnewswire.com/news-releases/global-next-generation-sequencing-report-market-size-segmentation-growth-and-trends-by-provider---2013-edition-213150681.html

http://diginomica.com/2013/09/05/four-reasons-big-data-breathes-life-health-care/

http://www.futuretimeline.net/blog/2012/02/24.htm

http://www.genome.gov/27555651

http://www.theguardian.com/society/2013/may/14/angelina-jolie-mastectomy-breast-cancer

u/bcorfman · 53 pointsr/Python

Fluent Python. I've learned more per page from that book than any other Python resource I've come across. For example, studying the book's material on hashable objects got me over a problem I was having with a planning algorithm. When I finished, I had successfully implemented a knowledge base and a total-order planner that could directly leverage the A*-search in the AIMA-Python code base. Kudos to the book's author, Luciano Ramalho, as that is just one example that helped my problem-solving skills in Python.
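Not an excerpt from the book, but a minimal sketch of the hashability contract Fluent Python covers: define __eq__ and __hash__ over the same data, so instances behave correctly in sets and dicts.

    class GridPos:
        """Treated as immutable; safe to use as a dict key or set member."""
        def __init__(self, x, y):
            self.x = x
            self.y = y

        def __eq__(self, other):
            if not isinstance(other, GridPos):
                return NotImplemented
            return (self.x, self.y) == (other.x, other.y)

        def __hash__(self):
            # Equal objects must hash equal, so hash exactly what __eq__ compares.
            return hash((self.x, self.y))

    visited = {GridPos(0, 0)}
    print(GridPos(0, 0) in visited)  # True: equality and hash agree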

u/reddilada · 51 pointsr/learnprogramming

CODE: The Hidden Language of Computer Hardware and Software is a great book written for the non-tech crowd. It gives a good basis for what computers are all about.

If he works in an office, I'd point him to Automate the Boring Stuff with Python as it will deal with things he is probably already familiar with.

u/QSIT_Researchers · 50 pointsr/science

Here are some resources I like (I might update this list). LdR

Books:

u/n3xg3n · 47 pointsr/networking

Leased phone lines.

A book I highly recommend reading (it's light on technical matters, but it is a really interesting read... at least for me since I wasn't quite alive to experience most of it) is Where Wizards Stay Up Late: The Origins of the Internet, which looks at the ideation, implementation, and growth of the ARPANET, various regional networks, and eventually the Internet.

u/zombox · 38 pointsr/gamedev

The last couple of weeks in Zombox development:

(tl;dr: brand new zombie AI with lots of bells and whistles, tweaked ammo display and you can no longer hit things through walls)

  • The zombie AI was completely re-written. After I was inspired by examples in the book Programming Game AI By Example, I decided to go with a behavioural hierarchy approach, rather than my old system, which relied on a complex FSM in the form of a large if-then tree.
  • In the old zombie AI system, zombies knew the player's position at all times, and their only goal was to find the player and attack him. In the new system, the zombies' behaviour is more realistic -- they have limited knowledge of their surroundings, and react to sensory input to gain new information. Zombies generally idle or wander, and will not attack the player unless they see him. If they hear a noise (like the player firing a gun, or hitting something with a blunt weapon), they will seek the source of the sound, and either continue wandering if they find nothing, or attack the source if they can see it and it's human. This sight/sound based system is nice because it allows you to do things like distract zombies with a sound while you escape in the opposite direction, or sneak up behind zombies to attack.
  • Zombie flocking has been improved. Zombies will no longer randomly walk through walls, or walk through the player, or walk through each other. Also, when attacking they will aim to surround the target, rather than simply attack from one side.
  • If a zombie finds that a door is in the way of its path (i.e., if it chases a target into a building and the target closes the door, or it hears a sound inside a building), it will attempt to break the door down to get inside.
  • In non-AI related news, the weapon ammo system has been improved to show the ammo counters for weapons in the active item slots up top, and when a weapon's ammo is depleted, the active item icon for that weapon will blink red.
  • Props and zombies can no longer be hit through walls or other objects.

    Here are some screens showing the debug info I use to help me visualize the new AI system. White lines show new A* paths that are calculated. Green lines point to the next node on a zombie's path when the zombie is following a sound. Magenta/cyan lines point to a zombie's active target (cyan = close, magenta = far). Red lines show the next node on a zombie's path when the zombie is chasing a target (although zombies are allowed to veer off their path when the target is directly in range). Yellow lines point to a new sound that a zombie has heard.

  • One, Two, Three, Four, Five

    Animations:

  • here's a gif showing some zombies chasing the player into a building, and attempting to break down the door to get inside.
  • here's another gif showing that same thing happening, in a different scenario
  • here's a gif (sped up 200%) showing some of the improved swarming
  • here's a gif showing the active item icon ammo improvements

    As for New Year's resolutions...my short term goal is to implement Jump Point Search into my A* algorithm. My long term goals involve adding human NPCs, the underground subway system, the structure-building system, mini-quests, more zombie types including zombie animals, and releasing the game in May.

    More info: Devblog, Youtube, Twitter
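    The Zombox source isn't public, but the sight/sound sensing described above can be sketched roughly like this (a Python sketch; every name in it is hypothetical):

        import math

        def update_zombie(zombie, player, sounds, sight_range=8.0, hearing_range=15.0):
            """One AI tick of the sight/sound model: see -> attack, hear -> investigate, else wander."""
            if (math.dist(zombie.pos, player.pos) <= sight_range
                    and zombie.has_line_of_sight(player.pos)):      # hypothetical helper
                zombie.state, zombie.target = "attack", player.pos  # seen: chase and attack
                return
            audible = [s for s in sounds if math.dist(zombie.pos, s.pos) <= hearing_range]
            if audible:
                nearest = min(audible, key=lambda s: math.dist(zombie.pos, s.pos))
                zombie.state, zombie.target = "seek_sound", nearest.pos  # investigate the noise
            else:
                zombie.state = "wander"                             # no stimuli: idle or wander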
u/5hot6un · 36 pointsr/videos

Ray Kurzweil best describes how humankind will transcend biology.

Our biology binds us with time. By transcending biology we will transcend time. With all the time we need, we needn't worry ourselves with trying to create wormholes when we can just send a copy of ourselves in all directions.

u/Pally321 · 33 pointsr/mildlyinteresting

If you're serious about getting into software development, I'd recommend you start looking into data structures and algorithms as well. It's something I think a lot of people who were self-taught tend to miss because it's not required knowledge to program, but it will give you a huge competitive advantage.

While I haven't read it, this book seems like a good introduction to the concept: https://smile.amazon.com/dp/1617292230/?coliid=I34MEOIX2VL8U8&colid=MEZKMZI215ZL&psc=0

From there I'd recommend looking at MIT's Intro to Algorithms, 3rd Edition. A bit more advanced, but the topics in there will play a huge role in getting a job in software.

u/YeWenjie · 31 pointsr/Python

In case you're unaware, or anyone else is looking for a more substantial book: Fluent Python covers Pythonic usage through 3.5, which should at least get you most of the way there.

u/dupelize · 30 pointsr/compsci

>So do you guys have any ideas on a title?

"Everything I Learned About Quantum Computing After I Stopped Worrying About the Title and Learned About the Content"

>and can you recommend any good books?

What level? The standard intro is Mike and Ike if you have a calculus and linear algebra background.

It sounds like you might be in high school (or equivalent) so you probably don't have much linear algebra knowledge beyond knowing what a matrix is.

It isn't a book, but Scott Aaronson has a decent blog. There is a lot of non-quantum talk too, but if you sift through it, there is a lot of interesting stuff.

u/KobayashiDragonSlave · 28 pointsr/learnprogramming

Not OP, but I discovered the book 'Grokking Algorithms' through a fantastic YouTube channel, 'The Coding Train'. The book explains a lot of the algorithms and data structures that I am learning in my first semester of CS at school. It even has the stuff that I am going to learn in the next semester. I found this book much more fun than my monotonous textbooks.
If anyone wants to get a good grasp of the fundamentals of A&DS, this is a great starting point; then move on to MOOCs by famous universities. MIT's A&DS was the one that I used. Dunno if it's still available on YouTube, because I remember that OCW courses were removed or something?

Link

u/DarkAnt · 26 pointsr/compsci

I don't know how to tell you how to code well, because I don't know how to do it myself. I look at John Carmack, Bjarne Stroustrup, Guido van Rossum, Herb Sutter and co. and I realize how poorly I measure up. That said, I do know of some things that will certainly help you. I believe getting good at something takes time and dedication. The following is in the order that I thought of it. I'm not sure how you should attempt to learn this material; hopefully someone else can help you out with that.


Learning how to recognize potential solutions to classes of problems and of course having the basic tools to design a solution.

u/Lericsui · 26 pointsr/learnprogramming

"Introduction to Algorithms"by Cormen et.al. Is for me the most important one.

The "Dragon" book is maybe antoher one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good.


Concrete Mathematics by Graham, Knuth, and Patashnik (you should know these names) is good for mathematical basics.


Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.


SICP (although married to a language) teaches very good fundamentals.


Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/KeepingItClassy11 · 25 pointsr/learnpython

I really like The Algorithm Design Manual by Steven Skiena.

u/victorioushermit · 24 pointsr/learnpython

I'm working my way through Python Crash Course right now and recommend it. It starts from basics but doesn't treat the reader like an idiot. And the exercises are good for helping you to think through how to format your code. From there I'm planning to go through Automate the Boring Stuff with Python, and after I get a bit better at it, Fluent Python

Python Crash Course

Automate the Boring Stuff with Python

Fluent Python

u/compSecurity · 24 pointsr/netsecstudents

I'd recommend learning to use Linux well first, since that is what you will need in order to use a lot of the tools for pen testing. After that you can choose an area to start with; most go with web app sec or net sec, since those are most in use right now. From there you can move into areas like cloud security, forensics, or some other specialty.

As far as resources go, there are a lot out there; I'll link some good ones that I use:

https://github.com/wtsxDev/Penetration-Testing

https://github.com/jivoi/offsec_pdfs

Those two should keep you going for a while at least.

As for coding, I'd recommend learning to use Bash first, then Python. Bash is the Bourne Again SHell, a scripting language used on Linux that you will use a lot, and Python is a language that is used a lot in offsec.

Here is a place where you can learn some Bash:
https://www.tldp.org/LDP/Bash-Beginners-Guide/html/Bash-Beginners-Guide.html

There are two books I'd recommend for Python; I'll link them here:
https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

https://www.amazon.com/Black-Hat-Python-Programming-Pentesters/dp/1593275900

The book in the second link is a bit easier to approach in my opinion, but both require some basic knowledge of Python, so YouTube or Google some tutorials and I'm sure you'll do fine.
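To give a taste of the kind of tooling those two books build toward, here is a minimal TCP connect scan using only Python's standard library (my own sketch, not from either book; only point it at hosts you own or have permission to test):

    import socket

    def connect_scan(host, ports, timeout=0.5):
        """Very basic TCP connect scan: a completed handshake means the port is open."""
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                    open_ports.append(port)
        return open_ports

    print(connect_scan("127.0.0.1", [22, 80, 443, 8080]))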

If you want to get into pen testing web apps, then you will want to learn some PHP and JavaScript: a lot of websites are written in PHP, and a lot of exploits are executed with JS, cross-site scripting in particular. You should also learn some SQL, since that is another common one for manipulating databases, and it can be attacked in a method known as SQL injection.

If you want a place to practice the things you are learning, go here: http://overthewire.org/wargames/
They offer some pretty basic wargames for things like Linux commands and whatnot, so you can really test your knowledge and learn a lot of the things you will have to do to progress through the games.

That's all I can think of at the moment, but I'm sure the other people in here will be happy to give you some more suggestions.

good luck!

u/JoseJimeniz · 22 pointsr/ProgrammerHumor

It reminds me of one of Raymond Chen's blog posts, where he reminds you that a null garbage collector is a perfectly valid implementation:

------------

Everybody thinks about garbage collection the wrong way
-----

When you ask somebody what garbage collection is, the answer you get is probably going to be something along the lines of "Garbage collection is when the operating environment automatically reclaims memory that is no longer being used by the program. It does this by tracing memory starting from roots to identify which objects are accessible."

This description confuses the mechanism with the goal. It's like saying the job of a firefighter is "driving a red truck and spraying water." That's a description of what a firefighter does, but it misses the point of the job (namely, putting out fires and, more generally, fire safety).

Garbage collection is simulating a computer with an infinite amount of memory. The rest is mechanism. And naturally, the mechanism is "reclaiming memory that the program wouldn't notice went missing." It's one giant application of the as-if rule.

Now, with this view of the true definition of garbage collection, one result immediately follows:

> If the amount of RAM available to the runtime is greater than the amount of memory required by a program, then a memory manager which employs the null garbage collector (which never collects anything) is a valid memory manager.

--------------------

Bonus Reading
------

MIT Press - Maintaining the Illusion of Infinite Memory, Chapter 5, p. 540:

> If we can arrange to collect all the garbage periodically, and if this turns out to recycle memory at about the same rate at which we construct new pairs, we will have preserved the illusion that there is an infinite amount of memory.
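To make the null collector concrete, here is a minimal sketch (mine, not from Chen's post or the MIT text) of a memory manager whose collector reclaims nothing, which is valid exactly as long as allocations fit in RAM:

    class NullGCHeap:
        """A memory manager with a null garbage collector: it never reclaims anything."""
        def __init__(self):
            self._heap = []

        def alloc(self, obj):
            self._heap.append(obj)        # hold every object forever
            return len(self._heap) - 1    # "address" of the allocation

        def collect(self):
            pass  # the null collector: do nothing, preserving the illusion of infinite memory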

u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/ataraxic89 · 20 pointsr/videos

YOU HAVE COME TO THE RIGHT PLACE MY BOY! TODAY I GOT LINKS FOR DAYZZZ


IBM Q is an online cloud-based 5-qubit quantum computer open to the public. You can "write" simple algorithms there, and it has resources for learning: https://www.ibm.com/quantum-computing/learn/what-is-quantum-computing

This video is a good introduction to quantum computing by one of the lead QC scientists/engineers at IBM: https://youtu.be/JRIPV0dPAd4

A video overview of the math involved in quantum computing: https://youtu.be/IrbJYsep45E

For a full understanding of the topic at a college introductory level, buy this book: https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176

Along the way you should also look into languages like Q# by Microsoft. I believe Google uses a Python library for it.

u/____candied_yams____ · 20 pointsr/learnpython

OOP in Python is a bit different from other languages, because Python doesn't necessarily advocate for private data members the way many statically typed languages like C++ and Java do. Additionally, the built-in property decorator eliminates the need for getters and setters up front in class design.
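As a minimal sketch of that second point (my example, not from the books below): callers use plain attribute access, and validation can be bolted on later via property without breaking them.

    class Account:
        def __init__(self, balance):
            self.balance = balance  # goes through the setter below

        @property
        def balance(self):          # read access: acct.balance
            return self._balance

        @balance.setter
        def balance(self, value):   # write access: acct.balance = x
            if value < 0:
                raise ValueError("balance cannot be negative")
            self._balance = value

    acct = Account(100)
    acct.balance = 50               # validated assignment, no get_/set_ methods needed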

I have lots of Python books, but the two that helped me write Pythonic classes were Effective Python (I actually only have the 1st edition) and Fluent Python. The first book is probably a great crash course, so if you get just one book, get that one; but the latter book is good too, and goes into much more detail without coming across like a dense reference text.

James Powell also has some great videos on pythonic programming, a lot of which involves OOP style.

u/DucBlangis · 20 pointsr/netsecstudents

Here is a "curriculum" of sorts I would suggest, as it's fairly close to how I learned:

  1. Programming. Definitely learn "C" first, as all of the Exploitation and Assembly courses below assume you know C. The bible is pretty much Dennis Ritchie and Kernighan's "The C Programming Language", and here is the .pdf (this book is from 1988, I don't think anyone would mind). I actually prefer Kochan's book "Programming in C", which is very beginner friendly and was written in 2004 rather than 1988, making the language a little more "up to date" and accessible. There are plenty of "C Programming" tutorials on YouTube that you can use in conjunction with either of the aforementioned books as well. After learning C, you can then try out some other languages. I personally suggest Python as it is very beginner friendly and well documented. Ruby isn't a bad choice either.

  2. Architecture and Computer basics:
    Generally you'll probably want to look into IA-32, and the best starting point is the Intel Architecture manual itself; the .pdf can be found here (pdf link).
    Because of the depth of that .pdf, I would suggest using it mainly as a reference guide while studying "Computer Systems: A Programmer's Perspective" and "Secrets of Reverse Engineering".

  3. Operating Systems: Choose which you want to dig into, Linux or Windows, and put the effort into one of them; you can come back to the other later. I would probably suggest Linux unless you are planning on specializing in Malware Analysis, in which case I would suggest Windows. Linux: No Starch's "How Linux Works" is a great beginner resource, as is their "Linux Command Line" book. I would also check out "Understanding the Linux Kernel" (that's a .pdf link). For Windows you can follow the Windows Programming wiki here, or you can buy the book "Windows System Programming". The Windows Internals books are generally highly regarded; I didn't learn from them, I use them more as a reference, so I can't really speak to how well they would teach a beginner.

  4. Assembly: You can't do much better than OpenSecurityTraining's "Introductory Intel x86: Architecture, Assembly, Applications, & Alliteration" class lectures from Xeno Kovah, found here. The book "Secrets of Reverse Engineering" has a very beginner friendly introduction to Assembly as does "Hacking: The Art of Exploitation".

  5. Exploitation: OpenSecurityTraining also has a great video series for Introduction to Exploits. "Hacking: The Art of Exploitation" is a really, really good book that is completely self-contained and will walk you through the basics of assembly. The author does introduce you to C and some basic principles of Linux, but I would definitely suggest learning the basics of C and the Linux command line first, as his teaching style is pretty "hard and fast".

  6. Specialized fields such as Cryptology and Malware Analysis.


    Of course, if you just want to do "pentesting/vuln assessment", in which you rely more on toolsets (for example, Nmap > Nessus > Metasploit) structured around a methodology/framework, then you may want to look into one of the PACKT books on Kali or Backtrack, get familiar with the tools you will use such as Nmap and Wireshark, and learn basic networking (a simple CompTIA Network+ book will be a good enough start). I personally did not go this route, nor would I recommend it, as it generally shies away from the foundations and seems to me to be settling for becoming comfortable with tools that abstract you away from the real "meat" of exploitation and all the things that make NetSec great, fun and challenging in the first place. But everyone is different and it's really more of a personal choice. (By the way, I'm not suggesting this is "lame" or anything, it was just not for me.)

    *edited a name out





u/RagaTanha · 20 pointsr/singularity

The Singularity Is Near by Ray Kurzweil has all the science behind it.

Accelerando

and Singularity Sky by Charles Stross for fiction.

u/CiscoJunkie · 20 pointsr/networking

On mobile, but there's a book called "Where Wizards Stay Up Late". Will see if I can get you a link, but it should be easily found on Amazon.

Edit: Here ya go!

u/RobertJacobson · 19 pointsr/ProgrammingLanguages

Honestly, if they are already using DrRacket, The Structure and Interpretation of Computer Programs (SICP) has aged very well and, for an undergraduate class, is probably at least as good as anything else. You will want to strategically choose which parts you cover. It is still being used at MIT, for example.

(Edit: SICP, not SCIP.)

u/Ibrey · 19 pointsr/badphilosophy

That's what they teach in universities, isn't it?

> Educators, generals, dieticians, psychologists, and parents program. Armies, students, and some societies are programmed.

- First words of Structure and Interpretation of Computer Programs

u/tazzy531 · 19 pointsr/google

If you want a job at Google, look up Steve Yegge's article on how to prep for the interview.

There are no shortcuts around actually knowing your shit. Code, algorithms, design, Big O... stuff that you actually do on the job.

Don't waste time on questions mentioned in this article.

When I interviewed, I read Skiena's Algorithm Design Manual cover to cover for a couple of months leading up to the interview.

u/christianitie · 18 pointsr/math

Without knowing much about you, I can't tell how much you know about actual math, so apologies if it sounds like I'm talking down to you:

When you get further into mathematics, you'll find it's less and less about doing calculations and more about proving things, and you'll find that the two are actually quite different. One may enjoy both, neither, or one, but not the other. I'd say if you want to find out what higher level math is like, try finding a very basic book that involves a lot of writing proofs.

This one is aimed at high schoolers and I've heard good things about it, but never used it myself.

This one I have read (well, an earlier edition anyway) and think is a phenomenal way to get acquainted with higher math. You may protest that this is a computer science book, but I assure you, it has much more to do with higher math than any calculus text. Pure computer science essentially is mathematics.

Of course, you are free to dive into whatever subject interests you most. I picked these two because they're intended as introductions to higher math. Keep in mind though, most of us struggle at first with proofwriting, even with so-called "gentle" introductions.

One last thing: Don't think of your ability in terms of your age, it's great to learn young, but there's nothing wrong with people learning later on. Thinking of it as a race could lead to arrogance or, on the other side of the spectrum, unwarranted disappointment in yourself when life gets in the way. We want to enjoy the journey, not worry about if we're going fast enough.

Best of luck!

u/ytterberg_ · 18 pointsr/changemyview

The problem is AI alignment: how do we make sure that the AI wants good stuff like "acting like a neutral arbiter" and not bad stuff like "world domination"? This turns out to be a very hard question, and a lot of very smart people believe that a superintelligence would destroy humanity unless we are very, very careful. Bostrom's Superintelligence is a good introduction to the topic.

> The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful - possibly beyond our control. As the fate of the gorillas now depends more on humans than on the species itself, so would the fate of humankind depend on the actions of the machine superintelligence.

If you don't have the time for the book, this FAQ is good:

> 4: Even if hostile superintelligences are dangerous, why would we expect a superintelligence to ever be hostile?

> The argument goes: computers only do what we command them; no more, no less. So it might be bad if terrorists or enemy countries develop superintelligence first. But if we develop superintelligence first there’s no problem. Just command it to do the things we want, right?

> Suppose we wanted a superintelligence to cure cancer. How might we specify the goal “cure cancer”? We couldn’t guide it through every individual step; if we knew every individual step, then we could cure cancer ourselves. Instead, we would have to give it a final goal of curing cancer, and trust the superintelligence to come up with intermediate actions that furthered that goal. For example, a superintelligence might decide that the first step to curing cancer was learning more about protein folding, and set up some experiments to investigate protein folding patterns.

> A superintelligence would also need some level of common sense to decide which of various strategies to pursue. Suppose that investigating protein folding was very likely to cure 50% of cancers, but investigating genetic engineering was moderately likely to cure 90% of cancers. Which should the AI pursue? Presumably it would need some way to balance considerations like curing as much cancer as possible, as quickly as possible, with as high a probability of success as possible.

> But a goal specified in this way would be very dangerous. Humans instinctively balance thousands of different considerations in everything they do; so far this hypothetical AI is only balancing three (least cancer, quickest results, highest probability). To a human, it would seem maniacally, even psychopathically, obsessed with cancer curing. If this were truly its goal structure, it would go wrong in almost comical ways.

> If your only goal is “curing cancer”, and you lack humans’ instinct for the thousands of other important considerations, a relatively easy solution might be to hack into a nuclear base, launch all of its missiles, and kill everyone in the world. This satisfies all the AI’s goals. It reduces cancer down to zero (which is better than medicines which work only some of the time). It’s very fast (which is better than medicines which might take a long time to invent and distribute). And it has a high probability of success (medicines might or might not work; nukes definitely do).

> So simple goal architectures are likely to go very wrong unless tempered by common sense and a broader understanding of what we do and do not value.

u/Cryocore · 17 pointsr/gamedev

You could use space partitioning. Split the world into a grid of cells, say 10 x 10.
Let each agent update which grid cell it is in on each frame. Then it's just a matter of testing only the agents in the grid cells surrounding the current agent (which will most of the time be a small number).

This book explains it really well: Programming Game AI by Example. Also has code samples online
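A rough sketch of that grid idea (the book's examples are in C++; this Python version and all its names are just for illustration):

    from collections import defaultdict

    CELL = 10.0  # width/height of one grid cell

    def cell_of(pos):
        return (int(pos[0] // CELL), int(pos[1] // CELL))

    def build_grid(agents):
        grid = defaultdict(list)
        for agent in agents:                   # re-register each agent every frame
            grid[cell_of(agent["pos"])].append(agent)
        return grid

    def nearby(agent, grid):
        """Only agents in the 3x3 block of cells around us can possibly be close."""
        cx, cy = cell_of(agent["pos"])
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for other in grid[(cx + dx, cy + dy)]:
                    if other is not agent:
                        yield other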

u/MattDPS · 17 pointsr/gamedev

The one I always suggest is Programming Game AI By Example. The amount of valuable info you can pull out of this one book is incredible. It's still something I reference years later.

u/edwardkmett · 17 pointsr/programming

Three books that come to mind:

Types And Programming Languages by Benjamin Pierce covers the ins and outs of Damas-Milner-style type inference, and how to build the bulk of a compiler. Moreover, it talks about why certain extensions to type systems yield type systems that are not inferrable, or worse, may not terminate. It is very useful in that it helps you shape an understanding of what can be done by the compiler.

Purely Functional Data Structures by Chris Okasaki covers how to do things efficiently in a purely functional (lazy or strict) setting and how to reason about asymptotics in that setting. Given the 'functional programming is the way of the future' mindset that pervades the industry, it's a good idea to explore and understand how to reason in this way.

Introduction to Algorithms by Cormen et al. covers a ton of imperative algorithms in pretty good detail and serves as a great toolbox for when you aren't sure what tool you need.

That should total out to around $250.

u/mightybyte · 16 pointsr/haskell

I actually had this exact discussion today. A number of people argue that type classes must have laws. I definitely share the general sentiment that it is better for type classes to have laws. But the extreme view that ALL type classes should have laws is just that...extreme. Type classes like Default are useful because they make life easier. They reduce cognitive overload by providing a standardized name to use when you encounter a concept. Good uniformly applied names have a host of benefits (see Domain-Driven Design for more on this topic). They save you the time and effort of thinking up a name to use when you're creating a new instance and also avoids the need to hunt for the name when you want to use an instance. It also lets you build generic operations that can work across multiple data types with less overhead. The example of this that I was discussing today was a similar type class we ended up calling Humanizable. The semantics here are that we frequently need to get a domain specific representation of things for human consumption. This is different from Default, Show, Pretty, Formattable, etc. The existence of the type class immediately solves a problem that developers on this project will encounter over and over again, so I think it's a perfectly reasonable application of a useful tool that we have at our disposal.

EDIT: People love to demonize Default for being lawless, but I have heard one idea (not originally mine) for a law we might use for Default: def will not change its meaning between releases. This is actually a useful technique for making an API more stable. Instead of exporting field accessors and a data constructor, export a Default instance and lenses. This way you can add a field to your data type without any backwards-incompatible changes.

u/RoboticHam · 16 pointsr/HowToHack

Hi! Saying this as constructively as possible...but I would argue that you do not need Kali to learn about pentesting. In fact, I would go as far as saying to not install Kali until you already know something about pentesting.

If I may recommend some reading material, I think it does a good job of explaining what is going on, and gives you the opportunity to write your own scripts and learn some cool (and reusable) stuff along the way.

I just don't think installing Kali anywhere is a great place to really start. I believe you will become a little bit overwhelmed and miss out on what it really means to pentest.

u/Horizivertigraph · 16 pointsr/QuantumComputing

Don't get discouraged, it's possible to get to a reasonable understanding with some sustained effort. However, you need to get the following into your head as quickly as possible:

Popular level explanations of anything quantum are a waste of your time.

Go back and read that again. You will never get close to understanding the field if you rely on someone else managing to "find the right metaphors" for you. Quantum computing is a mathematical field, and if you want to understand a mathematical field, you need to do mathematics. This sounds super scary, but it's actually no problem! Math is not what you think it is, and is actually a lot of fun to learn. You just need to put some work in. This just means maybe doing an hour or so of learning every day before you go to work, or afterwards.

Let's look at a little bit of a roadmap that you can follow to get to a reasonable understanding of quantum computing / quantum information. This is pretty much the path I followed, and now I am just about to submit my PhD thesis on quantum computational complexity. So I guess it worked out OK.

  1. You can get really far in quantum computing with some basic understanding of linear algebra. Go to Khan Academy and watch their fantastic introduction. (A short numpy sketch at the end of this comment shows where that linear algebra leads.)

    If Sal asks you to do an exercise, do the exercise.

  2. Once you know what a vector is, can kind of grasp what a vector space is, and have some good intuition for how matrix-vector and matrix-matrix multiplication work, you can probably make a reasonable start on this great intro book: https://www.amazon.co.uk/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965

    Start from the start, take it slowly, and do all of the exercises. Not some of the exercises, do all of the exercises. If you don't know a term, then look it up on wikipedia. If you can't do an exercise, look up similar ideas on Google and see if you can muddle your way through. You need to get good at not being scared of mathematics, and just pushing through and getting to an answer. If there is an explanation that you don't understand, look up that concept and see if you can find somebody else's explanation that does it better. Do the first few intro chapters, then dip in to some of the other chapters to see how far you get. You want to get a pretty good coverage of the topics in the book, so you know that the topics exist and can increase your exposure to the math involved.

  3. If you manage to get through a reasonable chunk of the book from point 2), then you can make a start on the bible: Quantum information and computation by Nielsen and Chuang (https://www.amazon.co.uk/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=pd_lpo_sbs_14_img_1?_encoding=UTF8&psc=1&refRID=S2F1RQKXKN2268JJF3M2). Start from the start, take it slowly, and do all of the exercises.

    Nielsen and Chuang is not easy, but it's doable if you utilise some of the techniques I mention in point 2): Google for alternative explanations of concepts that the book explains in a way that confuses you, do all of the exercises, and try to get good coverage throughout the whole book. Make sure you spend time on the early linear algebra and basic quantum chapters, because if you get good at that stuff then the world is your oyster.

    Edit:

    Just remembered two more excellent resources that really helped me along the way

    A) Quantum mechanics and quantum computation, a video lecture course by Umesh Vazirani (YouTube playlist here) is fantastic. Prof. Vazirani is one of the fathers of the field of quantum computing, with a bunch of great results. His lecture course is very clear, and definitely worth devoting serious attention to. Also, he has a wonderful speaking voice that is very pleasant to listen to...

    B) Another lecture course called "Quantum Computing for the determined", this time given by Michael Nielsen (YouTube playlist here). In my opinion Nielsen is one of the best scientific communicators alive today (see also his unrelated discourse on neural networks and machine learning, really great stuff), and this series of videos is really great. Communicating this sort of stuff well to non-practitioners is pretty much Nielsen's whole jam (he quit academia to go on and write about science communication), so it's definitely worth looking at.
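    As a tiny illustration of where that linear algebra leads (my own sketch, not from any of the resources above): a qubit is just a length-2 complex vector and a gate is a 2x2 unitary matrix, so simulating one qubit is a few lines of numpy:

        import numpy as np

        ket0 = np.array([1, 0], dtype=complex)                       # the |0> state
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

        state = H @ ket0            # applying a gate is matrix-vector multiplication
        probs = np.abs(state) ** 2  # Born rule: |amplitude|^2 = measurement probability
        print(probs)                # [0.5 0.5] -- equal odds of measuring 0 or 1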
u/JoshuaSmyth · 15 pointsr/gamedev

This book, Programming Game AI By Example, comes highly recommended by me.

It contains all of the above along with an example 2D topdown style deathmatch game with bots to give you a clear understanding of the most common topics in Game AI. It's also one of the more practically focused books, rather than theory focused.

u/shadowblade7536 · 15 pointsr/hacking

There are online forums that provide tutorials on how to hack certain things, so read those and try them on your own devices, or on devices you have permission to attack.

Examples of those forums : [NullByte] (https://null-byte.wonderhowto.com/) and [BlackMOREOps] (https://www.blackmoreops.com/)

Download Kali, load it onto a USB, and look at the tools, especially [Metasploit] (https://www.metasploit.com/), and play with port scanners and such. I'd also recommend running vulnerable VMs such as Metasploitable, and vulnerable web apps such as [DVWA] (http://www.dvwa.co.uk/).

When it comes to writing code, Python excels for writing hacking tools. There are books about that, such as [Violent Python] (https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579) and [Black Hat Python] (https://www.nostarch.com/blackhatpython). I'm sure there are some about writing payloads and exploits in C, but I can't really remember the names.

If you have any questions, feel free to ask! And remember one thing: Be as creative as you can when experimenting. You'll learn a great deal that way.

u/chub79 · 15 pointsr/algorithms

The Algorithm Design Manual by Skiena helped me a lot.

I was also curious about this one.

Also, this site may help :)

u/NilsLandt · 14 pointsr/rails

Smaller business logic frameworks would be mutations and ActiveInteraction.

They would replace the operations (and parts of reform) of TB.
Personally, I wouldn't use either of them over TB; they still add complexity but don't offer too much over self-written stuff. YMMV, of course.

If you want to start simple: create POROs for your "operations" with 2 public methods - initialize and run (or call, execute, apply, process, etc.). Put your logic in them; create and execute them in your controllers.
Call them services, workflows, procedures, operations, scenarios, whatever.
Try to put no persistent state in them - let them do their thing and return some sort of result (true / false, model / nil, small result object).

This fulfills a number of your criteria: it shouldn't slow you down much at all, it's simple, fairly maintainable and easily unit testable.
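A minimal sketch of such an operation object (shown in Python for brevity; the Ruby version has the same shape, and every name here is invented):

    class CreateOrder:
        """Plain operation object: construct with inputs, call run(), get a result back."""
        def __init__(self, customer, items):
            self.customer = customer
            self.items = items

        def run(self):
            if not self.items:
                return {"ok": False, "error": "no items"}      # small result object
            order = {"customer": self.customer, "items": list(self.items)}
            return {"ok": True, "order": order}                # no state kept around

    result = CreateOrder("alice", ["apple"]).run()             # created and run in a controller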

If you would like to research a different approach, look into DDD. The Arkency book should make for a good start, with the original DDD book giving quite a bit more background information.

> I'm not coding SPAs, so I still need awesome logic for Views / Presenters.

If you liked the Cells from TB, you can use them without using the rest of TB.
If you want something simpler, use a decorator like draper with ERB or Slim.

u/mcur · 14 pointsr/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tannenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&ie=UTF8&qid=1377402221&sr=1-1

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&ie=UTF8&qid=1377402371&sr=1-1

Edit: Got the wrong Hennessy/Patterson book...

u/vvillium · 14 pointsr/compsci

https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176

Best book hands down. This will bring you to the frontier of quantum computing. The book is also very approachable and meant for people trying to learn. It covers some linear algebra as well as physics in order to bring you up to speed.



Michael Nielsen is an amazing educator and expert in the field. His YouTube lecture course https://www.youtube.com/playlist?list=PL1826E60FD05B44E4, Quantum Computing for the Determined, is a short version of that book. He also has a free online book on neural networks that is probably the most referenced source on the matter: http://neuralnetworksanddeeplearning.com/index.html

u/jhanschoo · 14 pointsr/compsci

Google hasn't been helpful because no such algorithm exists. Check out Rice's Theorem for the impossibility.

edit:

Let S be the set of languages that you can reduce SAT to in polynomial time.

SAT is clearly in S, and we know some machine recognizes it.

The empty language is not in S (even if P=NP, so that SAT is P-complete), and we know some machine recognizes it.

By Rice's Theorem, no machine decides, when given a machine as input, whether that machine recognizes a language in S.

(we assume that the "any custom problem" input is given as a machine encoding)

edit2:

I see that you ask a lot of questions about computational complexity, but do not have a good foundation. Many things you propose or ask about already have known impossibility results. May I suggest you have a look at Sipser (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X)? That will give you a better understanding of computability and complexity, and of the feedback you're getting.

u/lennyjump · 14 pointsr/gamedev

Designing Virtual Worlds by Bartle

Theory of Fun for Game Design by Koster is a classic and still largely valid.

u/gatewaynode · 14 pointsr/pythontips

A nice learn-as-you-build web project is: https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world

If Python is just a new language for you, this is a really good book for hitting that next level: https://www.amazon.com/Fluent-Python-Concise-Effective-Programming/dp/1491946008

Here's a site worth following for updates: https://www.planetpython.org/

And definitely subscribe to r/learnpython, it's a friendly place for all levels of Pythonistas.

u/The_Serious_Account · 13 pointsr/QuantumComputing

Quantum Computation and Quantum Information by Nielsen and Chuang is the standard intro text-book on the subject. It's how I got into the field and have taught classes using it as well. Highly recommended.

u/scandii · 13 pointsr/AskProgramming

As a beginner you're stuck in a position where you want to learn how to do things, but also how to write them; i.e. you don't only want to paint a painting, you also want it to be pretty and admired.

For programming there are a lot of schools of thought on this subject: some prefer test-driven development, others domain-driven design.

Some think comments outside of method parameters are good coding practice; others think they're a code smell, because if you have to explain your code you probably wrote it in a way that makes it difficult to understand.

Some think patterns are for hipsters; others are of the correct opinion (ahem) that they are standardised solutions to common problems.

All in all, if I could go back in time 15 years to when I started programming, I would read the following, had they been available at the time:

https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

Domain-Driven Design is the concept of breaking your code into logical, real-world units and bundling your code around these objects, so that the program makes sense to work on if you understand the real world it mirrors. E.g., if you're making a fruit shop program, you might have a fruit seller, a register, a fruit warehouse, a process to deal with ordering fruit, etc.

https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882

Clean Code talks not so much about architectural patterns (outside of test-driven development), but rather about what's good and bad about code at its core, e.g.:

if (fruitStand.Amount < 5)
{
    fruitHelper(fruitStand, 3434);
}

vs

if (fruitStand.Amount < dailySoldAverage)
{
    OrderNewFruit(fruitStand, wholesaleDiscountCode);
}

Outside of that, I think you'll find a lot more resources by simply researching the concepts these guys talk about. Programming is constantly changing, but as long as you have the fundamentals these guys describe in place, you're golden, provided you're willing to keep learning.

u/[deleted] · 13 pointsr/DotA2

I enjoyed seeing the topic being brought up in a public format as AI becomes more popular among gamers, but I wasn't a huge fan of the type of discussion going on in the thread. The main commenters were definitely knowledgeable, but a lot of the important info needed to really express their points was left out. This book is a really good introduction to game AI for anyone interested in the field:
http://www.amazon.com/Programming-Example-Wordware-Developers-Library/dp/1556220782

You can easily find a pdf of the book by searching for it on google.

edit: It goes into specifics for coding purposes, but I wouldn't say it's difficult to read without any programming experience. It's definitely intermediate concepts described for beginners.

u/j3toler · 13 pointsr/blackhat

How comfortable are you with Python in general? There are some sites out there, like codesignal.com, that offer many small Python challenges you can do reasonably quickly. As for "Black Hat Python" drills, there aren't many that can be finished in 20 minutes, but you can always follow the Violent Python chapters while at work. That's what I did, and it seems to go pretty smoothly.

u/hell_onn_wheel · 13 pointsr/Python

Good on you for looking to grow yourself as a professional! The best folks I've worked with are still working on professional development, even 10-20 years in to their profession.

Programming languages can be thought of as tools. Python, say, is a screwdriver. You can learn everything there is to know about screwdrivers, but this only gets you so far.

To build something you need a good blueprint. For this you can study objected oriented design (OOD) and programming (OOP). Once you have the basics, take a look at design patterns like the Gang of Four. This book is a good resource to learn about much of the above

What parts do you specify for your blueprint? How do they go together? Study up on abstract data types (ADTs) and algorithms that manipulate those data types. This is the definitive book on algorithms; it does take some work to get through, but it is worth the work. (Side note: this is the book Google expects you to master before interviewing.)

How do you run your code? You may want to study general operating system concepts if you want to know how your code interacts with the system on which it is running. Want to go even deeper with code performance? Take a look at computer architecture. Another topic that should be covered is computer networking, as many applications these days don't work without a network.

What are some good practices to follow while writing your code? Two books that are widely recommended are Code Complete and Pragmatic Programmer. Though they cover a very wide range (everything from organizational hacks to unit testing to user design) of topics, it wouldn't hurt to check out Code Complete at the least, as it gives great tips on organizing functions and classes, modules and programs.

All these techniques and technologies are just bits and pieces you put together with your programming language. You'll likely need to learn about other tools, other languages, debuggers and linters and optimizers, the list is endless. What helps light the path ahead is finding a mentor, someone that is well steeped in the craft, and is willing to show you how they work. This is best done in person, watching someone design and code. Also spend some time reading the code of others (GitHub is a great place for this) and interacting with them on public mailing lists and IRC channels. I hang out on Hacker News to hear about the latest tools and technologies (many posts to /r/programming come from Hacker News). See if there are any local programming clubs or talks that you can join, it'd be a great forum to find yourself a mentor.

Lots of stuff here, happy to answer questions, but hope it's enough to get you started. Oh, yeah, the books, they're expensive but hopefully you can get your boss to buy them for you. It's in his/her best interest, as well as yours!

u/Jetbooster · 12 pointsr/Futurology

Why would it care if the goal we gave it didn't actually align with what we wanted? It has no reason to care unless these things were explicitly coded in, and as I said, morality is super hard to code into a machine.

To address your second point: I understand my example wasn't perfect, but say it understands that the more physical material a company controls, the more assets it has. So it lays claim to the entire universe and sets out to control it. Eventually, it is the company, and growing the company's assets just requires it to have more processing power. Again, it is an illustrative point, loosely derived from my reading of Superintelligence by Nick Bostrom. I would highly recommend it.

u/biglambda · 12 pointsr/haskell

I highly recommend The Haskell School of Expression by the late great Paul Hudak. Also, you should learn as much as you can about lambda calculus in general, for example from this paper.
After that you should learn as much as you can about types; Types and Programming Languages is really important for that.
Finally, don't skip the important fundamental texts, mainly Structure and Interpretation of Computer Programs and the original video lectures by the authors (about the nerdiest thing you will ever watch ;)

u/HenryJonesJunior · 12 pointsr/cscareerquestions

Skiena's Algorithm Design Manual - It gives you an overview of what classes of problems exist and how real world problems can be expressed as instances of them. It doesn't always give you the step-by-step directions of how certain algorithms work, but it gives you enough of an overview to understand the problem and points you towards existing implementations.

It's certainly one of the most useful books I used when preparing for interviews (and comes in handy in the real world as well). As an anecdote, in one interview at a big-N company, I was presented with a problem, said "based on these factors I'd treat this as a network flow problem by doing X", and that was the only buzzword needed - rather than watch me try to write a solution to a known problem, we were able to move on to other questions. Without knowing that term, I probably would have spent the remainder of the interview trying to optimize a solution to the problem instead.

u/jpjandrade · 11 pointsr/Python

In my personal opinion, any Python book list that doesn't include Fluent Python is pure garbage.

u/enteleform · 11 pointsr/compsci

Check out:
Grokking Algorithms: An illustrated guide for programmers and other curious people
 
I'm also pretty rusty at math right now, and have been getting by with a try-different-things-until-it-works approach. Even for the types of problems I've become efficient at solving, in many cases I don't know the actual terminology, which makes it difficult to expand upon concepts or communicate them with others. I'd like to get to a point where I can mentally reason about processes & formulas without having to execute them in order to see the results, and I feel like the first step to get there is to get reacquainted with terminology & foundational concepts. Here are Some Resources I've queued up to work through for that purpose.

u/khedoros · 11 pointsr/EmuDev

I don't know about "beginner", but I was introduced to a lot of the key ideas when I took my Computer Architecture course in college, using a book like this.

Emulator 101 should be a good guide for getting started, and other posts, like Imran Nazar's on emulating the Game Boy in JavaScript, would be useful.

Chip-8 is a simple starting point (just a few operations, very simple architecture, only expects to run at about 500-1000 Hz, there are timers but not really interrupts, etc). Makes sense that it's simple and slow; it's actually a VM that ran on some microcomputers in the latter half of the 70s.
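To show how small a Chip-8 core starts out, here is a sketch of the fetch-decode-execute skeleton with two of the real opcodes (a full interpreter implements roughly 35):

    class Chip8:
        def __init__(self):
            self.memory = bytearray(4096)   # programs are loaded at 0x200
            self.V = [0] * 16               # registers V0..VF
            self.pc = 0x200

        def step(self):
            # Fetch: every Chip-8 opcode is two bytes, big-endian.
            op = (self.memory[self.pc] << 8) | self.memory[self.pc + 1]
            self.pc += 2
            # Decode/execute: dispatch on the top nibble.
            if op & 0xF000 == 0x6000:       # 6XNN: set register VX to NN
                self.V[(op >> 8) & 0xF] = op & 0xFF
            elif op & 0xF000 == 0x1000:     # 1NNN: jump to address NNN
                self.pc = op & 0x0FFF
            else:
                raise NotImplementedError(f"opcode {op:04X}")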

Space Invaders (the arcade game) has a more complex CPU, straightforward graphics and audio, and predictable timing on the interrupts.

Game Boy is a cleaner design than the NES, and the CPU can very nearly be adapted from the Space Invaders one. It introduces interrupts, interrupt priorities, memory bank-switching, more complex graphics and audio.

NES is similar to the Game Boy in some ways, but I feel like the quirkiness is even closer to the surface. Fewer interrupts, split memory map between CPU and PPU (the graphics chip), and a horrendous number of bank-switchers used in different games.

A lot of Sega's hardware, the SNES, or even something more obscure might make sense at this point.

My own path, so far, has been NES, started Game Boy (took a small break to build Chip-8), then finished Game Boy, added Color. Took a bit more time, then jumped into Game Boy Advance, which is my current thing (and being fair, I've taken a lot of breaks...I think I was seriously looking into GBA over a year ago).

u/lowlandslinda · 11 pointsr/Futurology

Musk keeps up with what philosopher Nick Bostrom writes. It's the same reason he knows about the simulation theory, which was also popularised by Bostrom. And lo and behold, Bostrom also has a paper and a book on AI.

u/punctured-torus · 11 pointsr/compsci
u/Pandasmical · 11 pointsr/computerscience

I enjoyed this one!
Code: The Hidden Language of Computer Hardware and Software

Here is someone else's detailed review on it

"Charles Petzold a does an outstanding job of explaining the basic workings of a computer. His story begins with a description of various ways of coding information including Braille, Morse code, and binary code. He then describes the development of hardware beginning with a description of the development of telegraph and relays. This leads into the development of transistors and logic gates and switches. Boolean logic is described and numerous electrical circuits are diagramed showing the electrical implementation of Boolean logic. The book describes circuits to add and subtract binary numbers. The development of hexadecimal code is described. Memory circuits are assembled by stringing logic gates together. Two basic microprocessors are described - the Intel 8080 and the Motorola 6800. Machine language, assembly language, and some higher level software languages are covered. There is a chapter on operating systems. This book provides a very nice historical perspective on the development of computers. It is entertaining and only rarely bogs down in technical detail."

u/charles__l · 11 pointsr/lisp

Lisp is like magic - it's the programmable programming language - if you learn it, everything else kind of pales in comparison :P

One fascinating aspect of lisp is that it's based on lambda calculus, which is basically a cleaner alternative to Turing machines (Turing machines are basically a mathematical way to describe computable problems). After learning about lambda calculus, Turing machines looked like a hack to me. A decent non-mathematical guide to lambda calculus that I found was this: http://palmstroem.blogspot.com/2012/05/lambda-calculus-for-absolute-dummies.html
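To get a taste of the lambda calculus without any special notation, here's a tiny illustrative sketch of Church numerals written with plain Python lambdas (Scheme's lambda works exactly the same way):

```python
# A Church numeral n is a function that applies f to x exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))            # n + 1
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Convert a Church numeral back to a Python int by counting applications.
to_int = lambda n: n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(succ(two))))  # 2 + 3 = 5
```

The striking part is that numbers, arithmetic, and even control flow can all be encoded as nothing but function application, which is the whole point of the formalism.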

Even though lisp allows for a lot of functional programming, it's not purely functional, and can be used to write object oriented code, or anything else really.

The books I'd recommend to learning it are:

  • The Little Schemer - a lovely, beginner friendly book that introduces Lisp and computation in a rather unique way.
  • Structure and Interpretation of Computer Programs - this is the book that was used to teach a bunch of programming classes at MIT, and is a classic text for computer science. Despite its advanced topics, it's still rather approachable, especially if you have a decent amount of programming background.
u/davepeck · 10 pointsr/programming

I'm often surprised when the people I work with (at a large software company) have never heard of key moments in our industry such as this one.

This has got me thinking: what other key events are there that everyone who works in the industry should know about?

A few to start the conversation -- personally, I think these are essential reading for anyone who claims to be interested in computers:

u/vorpal_potato · 10 pointsr/csMajors

I learned most of what I know from Robert Sedgewick, whose prose is exceeded in clarity only by his diagrams:

https://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X

Steve Skiena is also excellent, and writes a bit more accessibly:

https://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202

You can usually find these on the shelves of a university library.

u/Echrome · 9 pointsr/hardware

If you want textbooks, Hennessy and Patterson's Computer Architecture 5th ed. is the de facto standard taught at most universities. It's a pretty good book, and shouldn't bog you down too much with equations. It won't help you evaluate components, but you will learn a lot of the underlying principles that are used in modern processors.

u/Philipp · 9 pointsr/Showerthoughts

For a deeper look into the subject of what the AI may want, which goes far beyond "it clearly won't harm us/ it clearly will kill us", I recommend Superintelligence by Nick Bostrom. Fantastic book!

u/arsenalbilbao · 9 pointsr/learnpython
  1. if you want to LEARN how to write programs - read "Structure and Interpretation of Computer Programs" on Python - SICP (project: you will write an interpreter of the "scheme" programming language in Python)

  2. if you want to TRAIN your OOP skills - Building Skills in Object-Oriented Design (you will code 3 games - roulette, craps and blackjack)

  3. Helper resources on your way:
    3.1. Dive into python 3 (excellent python book)
    3.2. The Hitchhiker’s Guide to Python! (best practice handbook to the installation, configuration, and usage of Python on a daily basis.)
    3.3 Python Language Reference ||| python standard library ||| python peps

  4. if you want to read some good python code - look at the flask web framework (if you are interested in web programming, also look at fullstackpython)

  5. good but non-free books
    5.1. David Beazley "Python cookbook" (read code snippets on python)
    5.2. Dusty Phillips "Python 3 Object Oriented Programming" (learn OOP)
    5.3. Luciano Ramalho "Fluent python" (Really advanced python book. But I haven't read it YET)

  6. daily challenges:
    6.1. r/dailyprogrammer (easy, intermediate and advanced challenges) (an easy challenge example)
    6.2. mega project list

  7. BONUS
    From NAND to Tetris (build a general-purpose computer system from the ground up) (part1 and part2 on coursera)
u/sarevok9 · 9 pointsr/learnprogramming

Here's an entire book about it: https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

Short answer -- it makes a lot of sense for whipping up quick tools for network based red-team work. If you already have an entirely custom set of tools in Delphi / Pascal / C# -- it might not be entirely sensible for you to use. It really depends on if your kit is lacking something specific.

Lastly -- from a red-team perspective, it seems like the plurality of hosts support Python out of the box now. OSX and many unix distros ship with it, which will give you a scripting framework to work on within the host that is a bit more concise / readable than bash.

u/galahadredgrave · 9 pointsr/ArtificialInteligence

I'm just beginning this journey myself, so judge what I say accordingly.

Artificial Intelligence: A Modern Approach seems to be the most popular textbook.

This article has some seemingly good advice, though it seems to be geared more toward Machine Learning (ML) than AI in general.

I think you'll want to learn a programming language. The above article recommends Python as it is well suited to ML.

There is (was?) a free online course on ML from Stanford by Andrew Ng. I started to take it a couple years ago but never finished. It is very accessible. The lectures appear to be on YouTube.

Grokking Algorithms is a highly regarded book on algorithms.

Make a free Amazon Web Services account and start playing with Sagemaker.

There really is no well defined path to learning AI, in my opinion. It is a highly interdisciplinary endeavor that will require you to be a self-starting autodidact. It's very exciting though. There is still plenty of new ground to be broken. Some might argue it is difficult for the little guy to compete with big labs at the big tech companies with their ungodly amounts of data to feed their AI, but I am optimistic.

u/sporksporksporkspork · 9 pointsr/compsci

I highly, highly, highly recommend reading Skiena:

http://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202/ref=sr_1_5?ie=UTF8&qid=1320821047&sr=8-5

It's really readable, and is a really good refresher. I've actually reread it a couple times now (like, actually read it like a novel, unlike what you do with CLRS), and each time I've been glad I've done so.

u/mm256 · 9 pointsr/programming

Introduction to Algorithms, Second Edition. Of course, don't miss Mr. Krumins' video lectures, the perfect complement.

u/sea_turtles · 9 pointsr/networking

awesome videos you linked there.

EDIT: if you are interested in this type of stuff check out the book Where Wizards Stay Up Late: The Origins Of The Internet

u/slacker87 · 9 pointsr/networking

I LOVE following the history of networking, awesome find!

If you end up wanting more, Where Wizards Stay Up Late and Dealers of Lightning are great reads about the people behind the early internet.

u/adventuringraw · 9 pointsr/MachineLearning

dude, ten hours of intro that can help you intuitively navigate relevant research questions when jumping into the actual research is completely fine and appropriate. You're welcome to your opinion, but a roadmap is all the more helpful when the challenge of Arxiv for a beginner is the double whammy of finding 'worthwhile papers' to read in the first place (citation count? Topic? Survey papers? Which papers are most important to start with?) along with the timesink of parsing even a single individual paper. Concept learning in deep RL is also an incredibly active area of research (one I'm just wading into), but if I could have a really engaging, intuitive, hands on 5 hour whirlwind tour through different established results, theories, contrasting approaches and so on, then sign me up, that sounds great to me. You'll still need to roll up your sleeves and get into some gnarly concepts and really intense math if you want to actually implement one of the cutting edge approaches, but starting with this kind of high level eli5 overview can be immensely helpful when deciding how to use your precious time. Even in a 100 lifetimes I don't know I could do all the things I want to do, so any time savings are more than welcome.

Granted, this particular course might not function well as a road map, but that would be a specific critique on this course in particular. I call bullshit that a course of this kind is useless in general in an emergent field. Perhaps it is for you, but not everyone learns like you, let others have their road if it suits them. We're all adults here, and I hope we can judge for ourselves where our time is most wisely spent.

Shitty courses being slapped together to take advantage of novices and pop science hype is a potential related problem, but if that's the chip on your shoulder, I'd challenge that potentially perverse incentive structure giving rise to a high number of worthless courses doesn't mean the 'ideal' intro course couldn't exist and be valuable.

also for what it's worth... I'm dabbling in this book, and it's doing a great job of laying framework. There might be divergent ideas and theories, but they'll all share a unified framework... why not start by exploring there? even bleeding edge doesn't have NOTHING but disconnected ideas.

u/shwestrick · 9 pointsr/compsci

Sipser's Introduction to the Theory of Computation is an excellent book with three major parts:

  1. Automata and Languages
  2. Computability Theory
  3. Complexity Theory

    Each part builds on the previous. I highly recommend working through it.
u/DiggyDog · 9 pointsr/gamedev

Hey there, I'm a game designer working in AAA and I agree with /u/SuaveZombie that you'll probably be better off with a degree in CS. BUT... don't give up on wanting to be a designer!

 

You should realize that it's not giving up on your dream at all, in fact, it's great advice for how to reach that dream. A designer with an engineering background is going to have a lot more tools at their disposal than one who doesn't.

 

Design is way more than just coming up with a bunch of cool, big ideas. You need to be able to figure out all the details, communicate them clearly to your teammates, and evaluate how well they're working so you can figure out how to make something people will enjoy. In fact, working on a big game often feels like working on a bunch of small games that all connect.

Take your big game idea and start breaking it down into all the pieces that it will need to be complete. For example, GTA has systems for driving and shooting (among many other things). Look at each of those things as its own, smaller game. Even these "small" parts of GTA are actually pretty huge, so try to come up with something as small as possible. Like, super small. Smaller than you think it needs to be. Seriously! You'll eventually be able to make big stuff, but it's not the place to start. Oh, and don't worry if your first game(s) suck. They probably will, and that's fine! The good stuff you make later will be built on the corpses of the small, crappy games you made while you were learning.

 

If you're truly interested in design, you can learn a lot about usability, player psychology, and communication methods without having to shell out $17k for a degree. Same goes for coding (there are tons of free online resources), though a degree will help you get in the door at companies you might be interested in and help provide the structure to keep you going.

 

Here's some books I recommend. Some are specific to games and some aren't, but are relevant for anything where you're designing for someone besides yourself.

 

Universal Principles of Design

The Design of Everyday Things

Rules of Play

The Art of Game Design This and the one below are great books to start with.

A Theory of Fun This is a great one to start with.

Game Feel

• Depending on the type of game you're making, some info on level design would be useful too, but I don't have a specific book to recommend (I've found pieces of many books and articles to be useful). Go play through the developer commentary on Half-Life 2 or Portal for a fun way to get started.

 

Sounds like you're having a tough time, so do your best to keep a positive attitude and keep pushing yourself toward your goals. There's nothing to stop you from learning to make games and starting to make them on your own if that's what you really want to do.

Good luck, work hard!

u/lostchicken · 8 pointsr/compsci

https://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871

SICP, and Lisp/Scheme programming in general, got me to really appreciate computing on a mathematically elegant level. Read the first couple of chapters (http://web.mit.edu/alexmv/6.037/sicp.pdf, free PDF!) and you'll be hooked.

u/phao · 8 pointsr/cscareerquestions

The best way I know how is by solving problems yourself and looking at good solutions of others.

You could consider going back to "fundamentals".

Most programming courses, IMO, don't have nearly as many exercises as I think they should have. Some books are particularly good on their exercise lists, for example K&R2, SICP, and TC++PL. Deitel's books have long exercise lists, but I don't think they're particularly challenging.

There are some algorithms/DS books which focus on the sort of problem solving that is about finding solutions to problems in context (not always a "realistic" one). Like the "Programming Challenges" book. In a book like that, a problem won't be presented in a simple abstract form, like "write an algorithm to sort numbers". It'll be inside some context, like a word problem. And to solve that "word problem", you'll have to figure out which traditional CS problems you could solve/combine to get the solution. Sometimes, you'll just have to roll something on your own, like a new algorithm for the problem at hand. In general, this helps you work out your reduction skills, for one. It also helps you spot applications of those classical CS problems, like graph traversal, finding shortest paths, and so forth.
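As a concrete example of that kind of reduction: a word problem like "what's the fewest number of bus rides between two stops?" is really just breadth-first search on an unweighted graph. A minimal Python sketch (the bus map is invented):

```python
from collections import deque

def fewest_hops(graph, start, goal):
    """BFS: returns the minimum number of edges from start to goal, or -1."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1

bus_map = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(fewest_hops(bus_map, "A", "E"))  # 3 (A -> B or C -> D -> E)
```

Spotting that the word problem maps onto BFS is exactly the skill those books train.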

Most algorithms/DS books though will present problems in a pretty abstract context. Like Cormen's.

I think, however, people don't give enough credit to the potential of doing the exercises on the books I've mentioned in the beginning.

Some books I think are worth reading which also have good exercises:

u/doctor_midnight · 8 pointsr/technology

if you like this subject matter, "Where Wizards Stay Up Late" is a must read... read it while getting my BS in Comp Sci

u/AutomaticSector · 8 pointsr/GoldandBlack

Here's the issue. AT&T had a government-granted monopoly over the phone lines at the time, and AT&T did not believe in any of this "computers talking to each other over phone lines" stuff. AT&T did not want to let anybody use their phone lines or provide any support for such a project, and as a government-granted monopoly, there was no other option.

The government basically told AT&T that they had to let their lines be used for this. That's pretty much the only thing the government was necessary for, and it was only to overcome a government-created problem. The amount of funding given towards networking research was pretty paltry, and was less than the government spent on voodoo mind control and LSD remote viewing experiments.

However, similar technology was also being worked on in Europe at the same time, and had they finished it first, the internet would have just grown out of that.

A great book about all this is Where Wizards Stay Up Late.

u/juliansorel · 8 pointsr/AskComputerScience

Yes, but computer architecture is way more than just a set of instructions. If you wanna learn computer architecture, I would recommend the Patterson book: https://www.amazon.com/Computer-Architecture-Quantitative-John-Hennessy/dp/012383872X

u/rtz90 · 8 pointsr/embedded

Sounds like you don't know some of your low-level computing fundamentals as well as you should for the jobs you want. I recommend studying up on those, and then developing more familiarity with them by tinkering or doing relevant projects.


If you're looking for a book recommendation, try Computer Systems: A Programmer's Perspective. If you read and understand chapter 2 (it's dry, hang in there), your question #1 will seem trivial to you (and you'll learn much more as well; pretty much all of it is important material). The book overall is a great read for embedded programmers, and anyone doing any form of low-level computing. There is a newer edition but the one I linked is the one I read.


u/911bodysnatchers322 · 8 pointsr/conspiracy

Ask and ye shall receive.

Gnostic Globalists / Fascists

u/Sk8nkill · 8 pointsr/IAmA

Hijacking to plug a couple of other books really worth reading if you're into this sort of thing:

Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization by K. Eric Drexler

The Singularity is Near by the aforementioned Ray Kurzweil

u/ringl-bells · 8 pointsr/technology

Everyone should read SuperIntelligence by Nick Bostrom.

Non-affiliate Amazon link: Superintelligence: Paths, Dangers, Strategies

u/mastercraftsportstar · 8 pointsr/ShitPoliticsSays

I don't even think we'll get that far. I honestly believe that once we create proper A.I. it will snowball out of control in a matter of months and turn against us. Their communist plans are a mere fever dream when it comes to A.I.: "Well, if the robots are nice to us, don't destroy the human species, and actually are subservient to us, then our Communist fever dream could work"

Yeah, okay, it's like trying to decide whether you want chicken or fish for the in-flight meal while the plane is going down.



I recommend reading Superintelligence if you want to get more theories about it.

u/fnord123 · 8 pointsr/compsci

There is also Violent Python

u/_kaine_ · 8 pointsr/HowToHack

I found Violent Python a very useful starting point, particularly when someone else walks through it on video. I find it harder to pick up computer science concepts when I can only read about them rather than follow someone actually doing and explaining them, like in a college course.

u/coned88 · 8 pointsr/compsci

The best way to learn these structures is to simply make a list of the ones you need to know, which can be done by looking at Wikipedia and http://xw2k.nist.gov/dads/

Once you have that list, take any language you know and program them without copying the code/pseudocode from the web/book. You should know the general idea of the structure and be able to implement it. Then have a worker function that populates and tests all of them (see the sketch after the list below).

This book is also quite good if you want one:

http://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202/ref=sr_1_3?s=books&ie=UTF8&qid=1292699709&sr=1-3



  • Queue
  • Stack
  • Linked List
  • Dictionary/AA
  • Hash Table
  • Heap
  • Binary search tree
  • Binary tree
  • etc...
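For instance, here's what that might look like in Python for a singly linked list, with the worker function that populates and tests it. This is just a sketch; the names are illustrative.

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    """A tiny singly linked list written from memory, not copied from a book."""
    def __init__(self):
        self.head = None

    def push_front(self, value):
        node = Node(value)
        node.next = self.head
        self.head = node

    def to_list(self):
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out

def test_worker():
    """Populate the structure and assert it behaves as expected."""
    ll = LinkedList()
    for i in range(5):
        ll.push_front(i)
    assert ll.to_list() == [4, 3, 2, 1, 0]
    print("linked list ok")

test_worker()
```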
u/argvnaut · 7 pointsr/gamedev

Check out Programming Game AI by Example and Artificial Intelligence for Games. They are both decent game AI books; the former actually develops an AI for a soccer simulation in chapter four. There's also [Behavioral Mathematics for Game AI](http://www.amazon.com/Behavioral-Mathematics-Game-Dave-Mark/dp/1584506849/), which expands on the concepts of utility and decision making that are only touched upon in the first two books. It's purely theoretical but very interesting.

u/dogewatch · 7 pointsr/learnprogramming

The Grokking Algorithms Book is good for beginners or for those who want a fun refresher. Obviously not too much depth but teaches it in a nice illustrated way. Code is also available on github.

u/Y0tsuya · 7 pointsr/hardware

Low level stuff is pretty academic. You need textbooks just to get started.

These were my college textbooks (I have earlier editions):

Computer Architecture: A Quantitative Approach

Computer Organization and Design: The Hardware/Software Interface

The material is pretty dry but if you can slog through it you will gain good insight into how and why chips and systems are designed the way they are.

Below this level is logic gate design where if you don't have a background in semiconductor physics you'd never get through it.

u/srnull · 7 pointsr/hardware

> Textbooks aren't much of a thing because so much information is available online and technology changes so fast.

That's

  • really
  • not
  • true

    and I'm not just pointing out that those books exist. They're really good resources!
u/Salyangoz · 7 pointsr/Turkey

First of all, good luck.

Stanford and MIT have online courses on iTunes University; you could start by looking there if you like.

To avoid wasting time, you can grab the syllabus of the schools you like online and start studying from the books they list. If you want, I can PM you the courses from my own transcript, in order.

You can use the .edu email you get from your school to use most software for free. (GitHub gives an incredible pack; I recommend you abuse it 100%.)

Read books and write lots and lots of code. Write even if it's crappy or broken. Don't shy away from writing with pen and paper either (in job interviews they'll put you at a whiteboard, and unfortunately there will be no debugger/syntax checker).

Of the standard books that come to mind, this one is at the front:

  • Introduction to Algorithms : Facebook and Google ask questions straight out of this book in their hiring. A lot of it can feel very hard at first; no need to be afraid, if you're getting through 2 pages every 3 days or so, that's very good.


  • For computer fundamentals, Computer Organization

    If you're worried about your high school math, there's nothing too crazy involved (it can vary by sector, of course). Refresh your Linear Algebra (image processing/game development, etc.) and Probability (AI, machine learning, data analysis, etc.). If you want to do machine learning and the like, your probability knowledge will need to be strong later on.

    You don't need to follow a very strict path. A learn-as-needed policy has worked out for me so far, but that's debatable.

    If anything else comes up, I'll try to answer.
u/IlluminateTruth · 7 pointsr/technology

The Swedish philosopher Nick Bostrom wrote a book called Superintelligence that covers much of this topic. I'd recommend it to anyone as it's not technical at all.

He maintains a strong position that the dangers of AI are many and serious, possibly existential. Finding solutions to these problems is an extremely arduous task.

u/sinesha · 7 pointsr/quantum

I work on quantum information theory, and there are lots of researchers with a maths, computer science or electrical engineering background (I did physics). So the answer is no, you don't need to go through classical physics. You do need linear algebra, and things like general algebra and calculus are also important. Then, work through Nielsen and Chuang's book Quantum Computation and Quantum Information. Where in Australia are you based?

(edit: link)
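To give a flavor of why linear algebra is the prerequisite: in quantum computation a qubit is just a length-2 complex vector, and a gate is a 2x2 unitary matrix. A quick illustrative NumPy sketch, applying a Hadamard gate to |0>:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

state = H @ ket0             # put the qubit into an equal superposition
probs = np.abs(state) ** 2   # Born rule: measurement probabilities
print(probs)                 # [0.5 0.5] -- equal chance of measuring 0 or 1
```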

u/YuleTideCamel · 7 pointsr/learnprogramming

Pick up these books:

u/Eleglac · 7 pointsr/programming

It was the first incarnation of what would later become the Internet, if that's not immediately apparent. The network started with four hosts (I think they were called nodes), one of which was at UCLA.

If you're interested in the subject, this is an excellent book to get you started.

u/unshift · 6 pointsr/programming
u/cparen · 6 pointsr/learnprogramming

Keeping in mind that you may not have access to a computer:

If you've got no background already, I'd recommend The Little Schemer / The Seasoned Schemer books as they teach programming as if it were arithmetic, so it can be studied without a computer, working out the examples with pencil and paper or just very carefully in your head -- the examples are all very small, and worked through step by step. It moves very slowly though.

If you're looking for something more advanced, there is Structure and Interpretation of Computer Programs (SICP), which again takes a mathematical approach to computing, so the examples can be worked out like algebra just as well as they can be typed into a computer. It helps a lot to have a computer, but I had fun working through it on a long plane flight, taking notes and working through the code on paper.

SICP's examples can all be worked through on a single sheet of paper, but ramp up quickly in difficulty, so they can definitely keep you busy.

u/scarthearmada · 6 pointsr/explainlikeimfive

The internet isn't a specific 'thing'; there is no internet box that you can point to and say, "that's the internet!" The internet is an abstract term applied to a series of computer networks of an indeterminate number greater than one. This is important because prior to the networking of two distinct networks together, you only had two distinct, non-communicating networks.

There is a varying level of redundancy in the connections between the various networks, all with one specific thing in common these days: the TCP/IP internet protocol suite. It was the best way of allowing for common communication between distinct computer networks.

If you visualize a long line -- a wire -- and then envision computer networks connecting to it via servers and more wire, you're envisioning what the internet is at a basic, broad level. There is a great video on YouTube that explains the internet this way. I'm trying to locate it now. However, if you enjoy reading about such things, there are two fantastic books that I recommend on the subject:

  1. Where Wizards Stay Up Late: The Origins Of The Internet

  2. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web

    The former explores the history of the internet, taken as a summation of its parts and their creation. The latter explores the origins (and potential futures) of the World Wide Web, a specific application of hosting and sharing documents (and other media) across the internet conveniently. It's written by Tim Berners-Lee, the number one scientist behind its creation. I include this link because it is a common misconception that "the internet" is "the world wide web."
u/MGJon · 6 pointsr/amateurradio

A bit unrelated, but as a computer geek from my earliest days, "ARPA" will always mean Advanced Research Projects Agency. You know, the folks who brought us what we now call the internet (among many many other useful things).

(Also, if you're interested in that sort of thing, Where Wizards Stay Up Late is an excellent history of how we came to have the internet)

u/Narbas · 6 pointsr/compsci

Can we not put this in the sidebar by now? This question has been posted so many times lately. Use this book. The prerequisites are basically nothing.

u/PM_ME_UR_OBSIDIAN · 6 pointsr/compsci

The first step to doing research is ingesting whatever knowledge already exists. With that in mind:

u/shred45 · 6 pointsr/gatech

So, when I was younger, I did attend one computer science related camp,

https://www.idtech.com

They have a location at Emory (which I believe I did one year) that was ok (not nearly as "nerdy"), and one in Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.

I felt it was a good opportunity to meet some very smart kids though, and it definitely lead me to push myself. Knowing and talking to people that are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect for yourself.

On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "thats for someone with a college degree". The fact is that experience will make certain tasks easier but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.

With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.

One would be to pick a code-golf site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math heavy, option is https://projecteuler.net. This, IMO, is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is Object Oriented, whilst Clojure (or Haskell) is Functional. These are two very fundamental and interesting "schools of thought", and if he can wrap his head around both at this age, that would be very valuable.
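To give a sense of the kind of warm-up those sites start with, here's Project Euler's first problem (sum of the multiples of 3 or 5 below 1000) in Python; redoing the same exercise in Clojure or Haskell is a nice way to feel the difference between the two schools of thought:

```python
# Project Euler #1: sum the natural numbers below 1000
# that are multiples of 3 or 5.
print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))  # 233168
```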

A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (flask), Ruby (rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while, there are a lot of technologies which interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.
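For scale, here's roughly where those Flask tutorials begin: a complete (if minimal) web application in about ten lines of Python. This is a sketch of the standard Flask starting point, not a production setup.

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello, World!"

if __name__ == "__main__":
    app.run(debug=True)  # serves on http://127.0.0.1:5000 by default
```

Everything after this (templates, databases, user logins) is layered on top of that same core, which is what makes web frameworks a good project for learning to manage growing complexity.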

A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment, he could install Gnome by default, it is pretty easy, but a lot of people really get into this with more configurable ones ( https://www.reddit.com/r/unixporn ). He could also learn to code and compile in C.

Fourth, if he likes C, he may like seeing some of the ways in which programs which are poorly written can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc. ). I think this would be much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In this similar vein, he could watch talks from Defcon and Chaos Computer Club. They both have a lot of interesting stuff on youtube (it can get a little racy though).

Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portions of the classes he will take in college look into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (Neural Networks, etc.) and Machine Learning (I would check out Scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free.......)

General CS:
Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720

Some Math:
Linear Algebra: http://math.mit.edu/~gs/linearalgebra/
Probability and Stats: http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/readings/

I hope that stuff helps, I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.

u/iamktothed · 6 pointsr/Design

An Essential Reading List For Designers

Source: www.tomfaulkner.co.uk

All books have been linked to Amazon for review and possible purchase. Remember to support the authors by purchasing their books. If there are any issues with this listing let me know via comments or pm.

Architecture

u/TheMiamiWhale · 6 pointsr/battlestations

If you are a CS major I'd go with the $15 package just so you can get the two books on security. In terms of helping your job prospects, I'd think about the following:

  • Code as much as possible (be very comfortable with at least one language)
  • Practice algorithms
  • Have a project or two that isn't trivial

    Assuming you already know how to program in a handful of languages, you might find the following useful:

  • Fluent Python

  • Cracking the Coding Interview

    Of course, depending on what you're interested in (e.g., mobile, web, systems, etc.) there are additional great resources, but these will at least get you moving in the right direction. The biggest thing is just practice writing code as much as possible.
u/theootz · 6 pointsr/cscareerquestions

TL;DR Improve yourself, invest in your future, don't worry about the mistakes...read the books listed at bottom, and practice!

A few months ago I royally fucked up an interview at Microsoft. It was a really simple question, but I had no experience coding on paper instead of on a computer.

I spent a lot of time studying various books and paper coding to make sure it wouldn't happen again.

I then had an interview for another (in my mind at the time) dream job. I did fine in all the phone interviews and they flew me over to the west coast for an in-person interview for the day. I did well for the first bit until they started pulling out dynamic programming and integer programming questions on me and expecting me to solve them. Once again, something I didn't prepare for, and I f'd up. Didn't get this job either. For the longest time I was really hard on myself about fucking up on both these interviews one after another. Especially this second one, since a lot more was riding on it than just the job (another story).

But then I decided I didn't want to have this sort of experience again and expected better of myself. I made myself further improve and brush up on all those concepts as well. Did a few mock interviews with friends, spent some time working on interview type questions on both the computer and on paper. A month or two later I started interviewing again. By this point I was an interviewing machine - and I'm now able to do just about anything thrown at me. I've had my choice of employers and until just recently, was in the situation where I had so many offers I didn't know which one I wanted most. I'll be heading to silicon valley soon at one of the top tech companies in the world with a fantastic offer considering I just graduated.

The point is - learn from the mistakes and improve yourself. I realize you don't want to be that guy spending heaps of time coding outside of work or whatever... but this is an investment in yourself and your career. Do it once, and then just brush up on your skills from time to time. Get into the interviewing mindset and just rock them so you can have your choice of job - and then you can go about your thing once you have the job locked. The up front investment will be worth it!

Things that helped me:

  • www.hackerrank.com - practiced a lot of questions on here
  • www.careercup.com - another great site for questions
  • Cracking the Coding Interview More help on questions, but also some great insights into the interview process for the larger tech companies and many hints and tips on how to go about solving the more complex problems
  • Code Complete A great book for helping you to refresh or learn about software design
  • Eternally Confuzzled Great resource to learn how to think about common data structures and algorithms

    Having trouble with Algorithm design/analysis? These are some of the go-to books for that:

  • The Algorithm Design Manual Probably the defacto for learning about algorithm design and analysis
  • Introduction to Algorithms A great book with many different algorithms and data structures to learn about
  • Algorithm Design A great book if you want to dive deeper into more complex subjects like graph theory, dynamic programming, search algorithms, etc.. etc..
u/Gaff_Tape · 6 pointsr/ECE

Not sure about EE-related topics, but for CE you're almost guaranteed to use these textbooks:

u/ayequeue · 6 pointsr/learnprogramming

If you're trying to learn any assembly language (not specifically x86 based), I know there are several books out there for MIPS. I've used [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269/ref=sr_1_1?ie=UTF8&qid=1396238245&sr=8-1&keywords=computer+organization+and+design) by Patterson and can say I found it very helpful. On top of that, [MARS](http://courses.missouristate.edu/kenvollmar/mars/), a combination IDE/emulator, can be used with it (and is open source/free).

u/Echohawkdown · 6 pointsr/TechnologyProTips

In the interim, I suggest the following books:

  • Digital Design and Computer Architecture, by Harris & Harris - covers the circuitry & hardware logic used in computers. Should also cover how data is handled on a hardware level - my memory's a bit rusty on this one, and I can't find my copy of it right now. Recommend that you read this one first.

  • Computer Organization and Design, by Patterson & Hennessy - covers the conversion of system code into assembly language, which itself turns into machine language (in other words, covers the conversion of programs from operating system code into hardware, "bare metal" code). Knowledge of digital circuitry is not required before reading, but strongly recommended.

  • Operating System Concepts, by Silberschatz, Galvin & Gagne - covers all the basic Operating System concepts that each OS today has to consider and implement. While there are Linux-based ones, there are so many different Linux "flavors" that, IMO, a book that covers a specific Linux base (called a Linux kernel) exclusively would be incomplete and fail to address all the key aspects you'll find in modern OSes. Knowledge of coding is required for this one, and therefore should be read last.

     

    As for the coding books, I suggest you pick one up on Python or Java - I'm personally biased towards Python over Java, since I think Python's syntax and code style looks nicer, whereas Java makes you say pretty much everything you're doing. Both programming languages have been out for a long time and see widespread usage, so there's plenty of resources out there for you to get started with. Personally, I'd suggest going with this book for Java and this book for Python, but if you go to Coursera or Codecademy, you might be able to get better, more interactive learning experiences with coding.

    Or you can just skip reading all of the books I recommended in favor of MIT's OpenCourseWare. Your choice.
u/joatmon-snoo · 6 pointsr/explainlikeimfive

Disclaimer: I don't know the EE stuff very well, but I do know enough to explain everything that comes after.

Here are two explanations of how you build logic gates from transistors: a simple one and courtesy of the EE StackExchange, a more technical one. (The value of an input and output is taken relative to V-/GND.)

Before you can build a CPU with logic gates, there are two concepts you need: (1) Boolean algebra and (2) memory cells.

----

If you look up Boolean algebra, you're going to get a lot of results that only math majors really understand (e.g. the Wikipedia page). To simplify it all, Boolean algebra is essentially the field of study that asks "if I only have two values to work with, TRUE and FALSE, what kind of math can I do?" Notice that TRUE and FALSE map neatly to 1 and 0 (hello, binary math!) as well as HIGH and LOW (V+ and 0V).

This means that you can make all sorts of circuits, like binary adders, multipliers, dividers, and so on. (Subtraction involves some extra logical tricks.)

At this point, what you essentially have is the ability to create any function.
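As a small illustration of that claim, here's a one-bit full adder built from nothing but AND/OR/XOR, chained into a ripple-carry adder, sketched in Python (bit lists are least-significant-bit first):

```python
def full_adder(a, b, carry_in):
    """One-bit full adder expressed purely as Boolean gate operations."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry adder: chain full adders from the least significant bit up."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 6 (011, LSB-first [0,1,1]) + 3 (LSB-first [1,1,0]) = 9 (LSB-first [1,0,0,1])
print(add_bits([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1]
```

The hardware version is exactly this, except each `^`, `&`, and `|` is a physical gate made of transistors.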

----

Now what we need is some way to remember data: that's where memory cells come into play. (This is basically your RAM.)

The primitive form that gets taught in introductory EE courses is the flip-flop circuit: a circuit with two stable states. The stable part here is important: it means that if such a circuit enters this state, it will not leave this state until an input changes. (Similarly, if such a circuit enters an unstable state, generally, it will eventually transition into a stable state.) There are a lot more ways to construct memory cells, of course, but flip-flops are a simple way to see how you can store and manipulate data in a circuit.
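Here's a toy Python sketch of that bistability: an SR latch made of two cross-coupled NOR gates, iterated until the outputs settle. Real circuits settle via propagation delays; the loop below just approximates that.

```python
def nor(a, b):
    return int(not (a or b))

def sr_latch(s, r, q=0, q_bar=1):
    """Settle a cross-coupled NOR latch into a stable state."""
    for _ in range(4):  # a few passes are enough for these inputs
        q_next = nor(r, q_bar)
        q_bar_next = nor(s, q)
        if (q_next, q_bar_next) == (q, q_bar):
            break  # stable: outputs no longer change
        q, q_bar = q_next, q_bar_next
    return q, q_bar

q, q_bar = sr_latch(s=1, r=0)                     # SET: q becomes 1
print(q, q_bar)                                   # 1 0
q, q_bar = sr_latch(s=0, r=0, q=q, q_bar=q_bar)   # HOLD: remembers the 1
print(q, q_bar)                                   # 1 0
q, q_bar = sr_latch(s=0, r=1, q=q, q_bar=q_bar)   # RESET: q back to 0
print(q, q_bar)                                   # 0 1
```

The HOLD case is the punchline: with both inputs low, the latch keeps whatever value it had, which is what makes it a memory cell.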

With memory cells and Boolean algebra, you can now build state machines. Again, if you google this you're going to end up finding a lot of academic technical nitty-gritty, but at its most basic, a state machine has a finite number of states (A, B, C, ...), and its outputs and next state are some function of the current state and its inputs.

----

The canonical example is a vending machine (keep in mind, all the electromechanical stuff is abstracted away here - we're only thinking about the control logic).

Let's start with a really simple vending machine. It only accepts $1 coins, it only dispenses one type of soda, and all sodas are $1 each. It's not our job to worry about restocking or counterfeit money or whatnot: our job is just the dispensing logic circuit. We know we're going to have one input and one output: an input for "is there a dollar coin being inserted" and an output for "dispense one can of soda". And if we think about it, the circuit should only have two states: dispensing a soda and not dispensing a soda.

That's pretty simple, then: we use one memory cell to distinguish between the dispensing and not-dispensing state. The output will always reflect our internal state (i.e. output goes HIGH when dispensing, LOW when not dispensing); and if our input goes HIGH when we're not dispensing, we transition to dispensing, and no matter what our input is when we're dispensing, we transition to not dispensing.

Now we can start adding some complexity to our vending machine: let's accept pennies, nickels, dimes, and quarters too. How about dollar bills? To deal with this, clearly our state machine is going to need some kind of internal counter for how much money has been inserted. We're also going to need logic to compare how much money has been inserted to how much soda costs right now ($1), and also logic to dispense change.

But not everyone's a fan of Generic Soda™ so we're going to need some variety. Now we need a way for people to choose a soda. And since some people are snobs and want pricey stuff - they're willing to pay $2 for their canned beverage of choice (gasp! shock! horror!) - we need to add logic to handle different prices.
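Pulling the example together, here's an illustrative Python sketch of that upgraded vending machine as a state machine, where the state is simply the credit inserted so far (the coin values and prices are invented for this sketch):

```python
COIN_VALUES = {"penny": 1, "nickel": 5, "dime": 10, "quarter": 25, "dollar": 100}
PRICES = {"generic": 100, "fancy": 200}  # prices in cents

class VendingMachine:
    """State = how much money has been inserted so far (in cents)."""
    def __init__(self):
        self.credit = 0

    def insert(self, coin):
        self.credit += COIN_VALUES[coin]  # the internal counter

    def select(self, soda):
        price = PRICES[soda]
        if self.credit < price:          # compare credit against the price
            return f"need {price - self.credit} more cents"
        change, self.credit = self.credit - price, 0
        return f"dispensing {soda}, {change} cents change"

vm = VendingMachine()
for coin in ["quarter", "quarter", "quarter"]:
    vm.insert(coin)
print(vm.select("generic"))   # need 25 more cents
vm.insert("quarter")
print(vm.select("generic"))   # dispensing generic, 0 cents change
```

In hardware, `credit` would be a register built from memory cells, and `insert`/`select` would be combinational logic updating it, which is exactly the gates-plus-memory recipe described above.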

----

CPUs are built up much in the same way like the hypothetical vending machine above. A program is supplied as input, in the form of a list of instructions, and the CPU is basically a really big state machine that goes through the program line-by-line.

Explaining the details of how a basic CPU is designed is a full undergraduate course (Computer Organization/Architecture, usually), and seeing as how I've already outlined its prerequisite (Digital Logic) above, I'm going to stop here. The text I learned from was Patterson and Hennessy's Computer Organization and Design (you can find free PDFs of older versions floating around if you just google it).

----

Aside: if you have Steam and are interested in assembly-level programming, I've heard great things about Shenzhen I/O.

u/Gibborim · 6 pointsr/EngineeringStudents

It seems like you are looking for a textbook on computer architecture?

If so, this book would be pretty standard.

u/kcin · 6 pointsr/programming

Actually, it's a pretty comprehensive take on the subject: http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/

u/cyberbemon · 5 pointsr/hardware

This is a great start, as it goes into great detail regarding CPU/GPU architectures: Computer Architecture, Fifth Edition: A Quantitative Approach

Another one that goes to low level is: Code: The Hidden Language of Computer Hardware and Software

>"He starts with basic principles of language and logic and then demonstrates how they can be embodied by electrical circuits, and these principles give him an opening to describe in principle how computers work mechanically without requiring very much technical knowledge"

-wiki

u/beeff · 5 pointsr/computing

The short answer is "because it doesn't increase performance".

The long answer involves the Von Neumann bottleneck, memory wall, power wall and ILP wall. For the long story I refer to the relevant chapter in Computer architecture: a quantitative approach

Adding more cores like we've been doing with multi-cores is a stop-gap measure, allowing manufacturers to keep claiming increased performance.

u/bonekeeper · 5 pointsr/compsci

Computer Systems - A Programmer's Perspective is a great book IMO, from Carnegie Mellon's CS course.

u/Capissen38 · 5 pointsr/singularity

You bring up an excellent point (and make a great case for land ownership!), and that is that actual physical space can't really be created, and will remain scarce, insofar as Earth has a fixed surface area. If the scenario I described above came to pass, though, would any landlords come looking for rent? Would any governments levy taxes? If no one needs cash and everyone has pretty much everything provided for them, all but the most stubborn landlords won't have any reason to give a hoot. I suspect government would take longer to die out, since it may still be needed to enforce laws, judge disputes, provide safety, etc. It's not hard to imagine a world even further down the line, however, when technology has advanced to the point where humans can't realistically do much damage to one another.

Edit: If you're really into this, I'd suggest reading some singularity-esque literature such as Down and Out in the Magic Kingdom (novella), Rainbows End (novel), and The Singularity is Near (speculative nonfiction to be taken with a grain of salt).

u/coHomerLogist · 5 pointsr/math

>I didn't say it was correct but it makes it more likely that people will dismiss it out of hand.

That's fair, I agree. It's just frustrating: there are so many strawmen arguments related to AI that a huge number of intelligent people dismiss it outright. But if you actually look into it, it's a deeply worrying issue-- and the vast majority of people who actually engage with the good arguments are pretty damn concerned.

I would be very interested if anyone can produce a compelling rebuttal to the main points in Superintelligence, for instance. I recommend this book very highly to anyone, but especially people who wonder "is AI safety just bullshit?"

>Especially when those people get significant amounts of funding

Numerically speaking, this is inaccurate. Cf. this article.

u/joenyc · 5 pointsr/compsci

Doesn't get better than CLRS.

EDIT: My bad, that's a dead-trees thing.

u/zrbecker · 5 pointsr/learnprogramming

Depends on what you are interested in.

If you are interested in games, pick a game and do it. Most board games are not that hard to do as a command-line version. A game with graphics, input, and sound isn't too bad either if you use something like Allegro or SDL, or XNA if you are on Windows. A lot of neat tutorials have been posted about that recently.

If you are more interested in little utilities that do things, you'll want to look at a GUI library, like wxWidgets, Qt and the sort. Both Windows and Mac have their own GUI libraries; I'm not sure what Windows' is called, but I think you have to write for it with C++/CLI or C#, and Mac's is Cocoa, which uses Objective-C. So if you want to stick to basic C++, you'll want to stick to the first two.

Sometimes I just pick up a book and start reading to get ideas.

This is a really simple Game AI book that is pretty geared towards beginners. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782/

I enjoyed this book on AI, but it is much more advanced and might be kind of hard for a beginner. Although, when I was first starting, I liked getting in over my head once in a while. http://www.amazon.com/Artificial-Intelligence-Modern-Approach-2nd/dp/0137903952/

Interesting topics to look up.

Data Structures

Algorithms

Artificial Intelligence

Computer Vision

Computer Graphics

If you look at even simple books in these subjects, you will usually find tons of small manageable programs that are fun to write.

EDIT: Almost forgot, I think a lot of these are Java based, but you can usually find a way to do it in C++. http://nifty.stanford.edu/ I think I write Breakout whenever I am playing with a new language. heh

u/joeswindell · 5 pointsr/gamedev

I'll start off with some titles that might not be so apparent:

Unexpected Fundamentals

These 2 books provide much-needed information about making reusable patterns and objects. These are life-saving things! They are not language dependent. You need to know how to do these patterns, and it shouldn't be too hard to figure out how to implement them in your chosen language.

u/yashinm92 · 5 pointsr/netsec

Choice of programming language differs among researchers but Python seems to be pretty common. I suggest you get the books Violent Python and Grey Hat Python. The former is more beginner-friendly for people new to security. As for getting started with InfoSec, maybe try reading the Security+ books?

u/kkoppenhaver · 5 pointsr/HowToHack

Along the same lines, I've very much enjoyed what I've read from Violent Python so far.

http://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

u/proproseprowess · 5 pointsr/adventuretime
u/akame_21 · 5 pointsr/learnprogramming

Despite their age, the MIT lectures were great. If you're good at math and enjoy proofs, this is the class for you. Same thing with the CLRS book. It's one of the best books on DS & Algos out there, but it's so dense it'll make your eyes glaze over unless you love proofs and highly technical reading.

To get your feet wet, Grokking Algorithms is a good book.

A lot of people recommend Princeton's Algorithm Course. I took Algorithms in school already, but I'm probably going to take this course to round out my knowledge.

EDIT: special shout out to geeks for geeks. Great Website

u/baddox · 5 pointsr/programming

No. People say Cormen's Intro to Algorithms text is dense and inaccessible (though I thought it was very accessible when I first encountered it in college). TAoCP is way more dense and inaccessible than Intro to Algorithms.

I would recommend Intro to Algorithms if you want to seriously dive into algorithms. The newest version features improved pseudocode, which is pretty clear and instructive—especially compared to the MIX assembly code in TAoCP.

u/TotalPerspective · 5 pointsr/bioinformatics

Here are some books that I feel have made me better professionally. They tend toward the comp sci side, some are more useful than others.

  • Bioinformatics: An Active Learning Approach: Excellent exercises and references. I think most chapters evolved out of blog posts if you don't want to buy the book.
  • Higher Order Perl: I like perl to start with, so your mileage may vary. But learning how to implement an iterator in a language that doesn't have that concept was enlightening. There is a similar book for Python but I don't remember what it's called. Also, you are likely to run into some Perl at some point.
  • SICP: Power through it, it's worth it. I did not do all the exercises, but do at least some of the first ones to get the ideas behind Scheme. Free PDFs exist, also free youtube vids.
  • The C Programming Language: Everyone should know at least a little C. Plus so much has evolved from it that it helps to understand your foundations. Free PDFs exist
  • The Rust Programming Language: Read this after the C book and after SICP. It explains a lot of complex topics very well, even if you don't use Rust. And by the end, you will want to use Rust! :) It's free!

    Lastly, find some open source projects and read their papers, then read their code (and then the paper again, then the code...etc)! Then find their blogs and read those too. Then find them on Twitter and follow them. As others have said, the field is evolving very quickly, so half the battle is information sourcing.
u/sh0rug0ru · 5 pointsr/java

Read lots of code and read books to get multiple viewpoints. This is a deep topic which will require more than superficial online reading.

Check this out.

Books I have found useful:

u/ttutisani · 5 pointsr/softwarearchitecture

My blog about software architecture: http://www.tutisani.com/software-architecture/ (it may not be for complete beginners, but I hope it communicates the important topics).

I'd also suggest reading the classic book about design patterns (a.k.a. Gang of Four): https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_3?crid=1XRJO0L09LHLY&keywords=design+patterns+elements+of+reusable+object+oriented+software&qid=1557502967&s=gateway&sprefix=design+patterns%2Caps%2C162&sr=8-3

There are several good thought leaders in this direction, specifically Martin Fowler (https://martinfowler.com/) and Eric Evans (he does not write much online, but his book is great - all about modeling properly): https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

I'm big on modeling, objects, etc. so reply back if any questions.

u/ArthurAutomaton · 5 pointsr/math

It's Problem 7.29 in the third edition of Michael Sipser's Introduction to the Theory of Computation (you can find it by searching for "coloring" when using the preview function on Amazon). It's also in The Design and Analysis of Computer Algorithms by Aho, Hopcroft and Ullman as Theorem 10.12, though the proof there seems a little different from what you've sketched. (I hope that the Google Books link works. Sometimes it won't show the right preview.)

u/fbhc · 5 pointsr/AskComputerScience

My compilers course in college used the Dragon Book, which is the quintessential book on the subject.

But you might also consider Basics of Compiler Design which is a good and freely available resource.

I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/tbid18 · 5 pointsr/math

What do you mean by "I want to be a computer scientist?" Do you want to do research for a living, e.g., work in academia or for a lab? Or is your goal more along the lines of, "I want to learn more about computer science?" If the former, you're not going to get far without a degree; usually a Ph.D. is necessary. If the latter is your goal, then the 'traditional' math subjects would be 'discrete' subjects like probability and combinatorics. Linear algebra is heavily used in machine learning, and I believe PDEs are used as well.

On the computer science side, computability theory and computational complexity are necessary. Sipser is the standard for computability theory, and I like Arora/Barak for complexity (I don't necessarily recommend buying on amazon; that price for Sipser is outrageous).

u/OceansOnPluto · 5 pointsr/compsci

This is a little less scholarly and a little more geared towards history, but it's a fascinating read and one of the best books I've read in the last couple of years. It doesn't start with Claude Shannon, but rather with the different ways that human beings have disseminated information to each other (talking drums, the telegraph, etc.) over the years. Definitely, after you're done with everything else, put this on your list; it's great.

Edit: Apparently I forgot the link. http://www.amazon.com/The-Information-History-Theory-Flood/dp/1400096235

u/an-anarchist · 5 pointsr/cryptography

Yes and no. If you're asking these questions you'll probably be very interested in Claude Shannon's work. Take a read of his seminal information theory paper: http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

For an easy read and a fun intro take a look at "The Information: A History, A Theory, A Flood":
https://www.amazon.com/Information-History-Theory-Flood/dp/1400096235/

u/mysticreddit · 5 pointsr/gamedev

Every game programmer should have at least one of these books:

  • Mathematics for 3D Game Programming & Computer Graphics by Eric Lengyel
  • Game Physics by David Eberly
  • Real Time Collision Detection by Christer Ericson

    I own all 3. What I love about them is that they are some of the best ones around, written by programmers, to explain the math in a clear and concise fashion; they are not written by some mathematician who loves theory and likes to hand-wave away the worries about "implementation details."

    To help provide direction I would recommend these exercises to start. Work on (re)deriving the formulas, from easiest to hardest (a short sketch of the first three follows the list):

  • Work out how to reflect a vector
  • Derive the formula for how to calculate a 2D perpendicular vector
  • Work out the formula for how to project a vector A onto B.
  • Study how the dot product is used in lighting.
  • Derive the translation, scaling, and rotation 3x3 and 4x4 matrices.
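
A minimal sketch of the first three exercises in Python (my own summary of the standard formulas, with 2D vectors as plain tuples):

    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1]

    def scale(v, s):
        return (v[0]*s, v[1]*s)

    def perp(v):
        # 2D perpendicular: rotate 90 degrees counterclockwise.
        return (-v[1], v[0])

    def project(a, b):
        # Projection of a onto b: (a.b / b.b) * b
        return scale(b, dot(a, b) / dot(b, b))

    def reflect(v, n):
        # Reflect v about a *unit* normal n: v - 2(v.n)n
        d = scale(n, 2 * dot(v, n))
        return (v[0] - d[0], v[1] - d[1])

    print(reflect((1.0, -1.0), (0.0, 1.0)))   # (1.0, 1.0): a bounce off the floor

The dot product in lighting (the fourth exercise) is the same dot() as above: Lambertian brightness is just max(0, dot(surface_normal, light_direction)) with unit vectors.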
u/swirlingdoves · 5 pointsr/Polska

@1. I think the question you first need to ask yourself is "what is good game design". How many people will play a given game? How much money will the game make? What effect will it have on players? In general I recommend forums or even subreddits like /r/gamedesign. There are also courses offered for free online by serious universities, e.g. MIT. As for books, I recommend Theory of Fun and The Art of Game Design

@2 Yes, look at Notch ;)

@3 Make small, simple games. I recommend "game jams". I don't know how popular they are in Poland, but there are plenty of them online; in short, it's a "sprint" (for example 24 hours or a single weekend) during which the goal is to make a game based on a given theme or simple criteria. Find other people and, instead of working alone, work in a group of, say, three people so you can motivate each other.

u/Strilanc · 4 pointsr/programming

Honestly not sure. Maybe some sort of university course? I gained my understanding almost by happenstance. I took a computer science degree, got interested in QC, bought and read Mike and Ike, and spent a lot of time developing/playing-with/trying-to-solve-problems-using my simulator Quirk. The simulator time probably helped the most.

u/fishoutofshui · 4 pointsr/QuantumComputing

I feel like I gained traction coming from statistics by ping-ponging between these three books. Nielsen and Chuang is a great place to start, especially the first two chapters. There's a lot that will go over your head, but you will pick up enough. Then Aaronson, as you have been doing, for a different perspective. Then McMahon holds your hand a bit on the computations, which will help if you aren't familiar with quantum mechanics, as I was not. When you get stuck, switch books. I feel like once I bought all three books and started going back and forth, rereading earlier chapters, things started to click and I gained some maturity. I have a long way to go, but this has been the greatest self-learning journey I've been on in the past year. I hope you get as much out of it as I have. Good luck.

https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=nodl_

https://www.amazon.com/Quantum-Computing-Explained-David-Mcmahon/dp/8126564377/ref=mp_s_a_1_fkmrnull_1?crid=382OF32JOGTRH&keywords=quantum+computing+explained+mcmahon&qid=1551223235&s=gateway&sprefix=quantum+computing+explained&sr=8-1-fkmrnull

u/Statici · 4 pointsr/Physics

I got the most understanding out of reading Nielsen and Chuang's Quantum Computation and Quantum Information.

It delves into what happens and what can be done with quantum information - that is, how qubits are different from bits. Philosophically, I don't think there is anything more important than that; it's nice to see what particles make up reality, but you don't get much idea as to what those particles are actually doing. As a forewarning though: this book will probably push you towards a many-worlds interpretation. Not because they push it; it's just (kind of) necessary to think that way when considering large sets of interacting quantum information.
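
To make the bit/qubit difference concrete, here is a minimal sketch in Python (mine, not from the book): a bit is 0 or 1, while a qubit is a pair of amplitudes you can rotate before measuring.

    import math

    s = 1 / math.sqrt(2)
    H = [[s, s], [s, -s]]                  # Hadamard gate as a 2x2 matrix

    def apply(gate, state):
        # Matrix-vector multiply: rotates the qubit's amplitude vector.
        return [gate[0][0]*state[0] + gate[0][1]*state[1],
                gate[1][0]*state[0] + gate[1][1]*state[1]]

    state = apply(H, [1.0, 0.0])           # |0> becomes (|0> + |1>)/sqrt(2)
    print([round(a*a, 3) for a in state])  # Born rule: [0.5, 0.5], a fair coin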

In terms of physics, it has only a single chapter dedicated to the direct exploration of Schrodinger's equation. After that, it starts to dig into "what's it like when we have more than one quanta?" which is...well, I can't summarize it in a post. If you would like a PDF copy, I found one online a long time ago, I could PM it to you :)

In any sense: I've had this book for three years now and it is by far the best buy I have made in ever. QI is growing in importance (mostly with regards to the AdS/CFT correspondence in quantum gravity theories) and it is also always nice to know (ahead of time) how quantum computers are going to be working!

u/chakke_ooch · 4 pointsr/mbti

> Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?

NFP creativity and novelty in the sense that Ne has free range, period? Sure, you get more of that in web design and even more of that as to step further and further away from the sciences. There is tons of creativity in real software engineering where you can be creative to solve actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity – or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin who wrote amazing pieces and abided by the rules of music theory wasn't being creative? Hardly.

> Are you a web dev?

No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.

> By hardcore I meant requiring being meticulous, detail oriented.

I think that the lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.

> One thing I've realized is how shit my high school was at explaining math conceptually. Which I think lead to misconceptions about its use in programming

Well, then read some books on computer science and/or mathematics like this.

u/rickg3 · 4 pointsr/FCJbookclub

I read books 4-6 of the Dresden Files. I blame Patrick Rothfuss for getting me started and duckie for keeping me going. Coupla assholes. After I finish the other 8 books, I have some nice, solid non-fiction lined up.

In no particular order, I'm going to read:

The Information by James Gleick

The Better Angels Of Our Nature by Steven Pinker

The Math Book by Clifford A. Pickover

The Know-It-All by A.J. Coastie Jacobs

And others. I'm gonna nerd out so hard that I'll regrow my virginity.

u/grandzooby · 4 pointsr/compsci

Not exactly hardware focused, but I really enjoyed Gleick's "The Information: A History, A Theory, A Flood"

http://www.amazon.com/The-Information-History-Theory-Flood/dp/1400096235/

It does a great job of looking at computing in terms of information and its history. I loved the look at prominent figures like Babbage, Lovelace, Turing, etc.

u/CSMastermind · 4 pointsr/learnprogramming

I've posted this before but I'll repost it here:

Now in terms of the question that you ask in the title - this is what I recommend:

Job Interview Prep


  1. Cracking the Coding Interview: 189 Programming Questions and Solutions
  2. Programming Interviews Exposed: Coding Your Way Through the Interview
  3. Introduction to Algorithms
  4. The Algorithm Design Manual
  5. Effective Java
  6. Concurrent Programming in Java™: Design Principles and Patterns
  7. Modern Operating Systems
  8. Programming Pearls
  9. Discrete Mathematics for Computer Scientists

    Junior Software Engineer Reading List


    Read This First


  10. Pragmatic Thinking and Learning: Refactor Your Wetware

    Fundamentals


  11. Code Complete: A Practical Handbook of Software Construction
  12. Software Estimation: Demystifying the Black Art
  13. Software Engineering: A Practitioner's Approach
  14. Refactoring: Improving the Design of Existing Code
  15. Coder to Developer: Tools and Strategies for Delivering Your Software
  16. Perfect Software: And Other Illusions about Testing
  17. Getting Real: The Smarter, Faster, Easier Way to Build a Successful Web Application

    Understanding Professional Software Environments


  18. Agile Software Development: The Cooperative Game
  19. Software Project Survival Guide
  20. The Best Software Writing I: Selected and Introduced by Joel Spolsky
  21. Debugging the Development Process: Practical Strategies for Staying Focused, Hitting Ship Dates, and Building Solid Teams
  22. Rapid Development: Taming Wild Software Schedules
  23. Peopleware: Productive Projects and Teams

    Mentality


  24. Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency
  25. Against Method
  26. The Passionate Programmer: Creating a Remarkable Career in Software Development

    History


  27. The Mythical Man-Month: Essays on Software Engineering
  28. Computing Calamities: Lessons Learned from Products, Projects, and Companies That Failed
  29. The Deadline: A Novel About Project Management

    Mid Level Software Engineer Reading List


    Read This First


  30. Personal Development for Smart People: The Conscious Pursuit of Personal Growth

    Fundamentals


  31. The Clean Coder: A Code of Conduct for Professional Programmers
  32. Clean Code: A Handbook of Agile Software Craftsmanship
  33. Solid Code
  34. Code Craft: The Practice of Writing Excellent Code
  35. Software Craftsmanship: The New Imperative
  36. Writing Solid Code

    Software Design


  37. Head First Design Patterns: A Brain-Friendly Guide
  38. Design Patterns: Elements of Reusable Object-Oriented Software
  39. Domain-Driven Design: Tackling Complexity in the Heart of Software
  40. Domain-Driven Design Distilled
  41. Design Patterns Explained: A New Perspective on Object-Oriented Design
  42. Design Patterns in C# - Even though this is specific to C#, the patterns can be used in any OO language.
  43. Refactoring to Patterns

    Software Engineering Skill Sets


  44. Building Microservices: Designing Fine-Grained Systems
  45. Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools
  46. NoEstimates: How To Measure Project Progress Without Estimating
  47. Object-Oriented Software Construction
  48. The Art of Software Testing
  49. Release It!: Design and Deploy Production-Ready Software
  50. Working Effectively with Legacy Code
  51. Test Driven Development: By Example

    Databases


  52. Database System Concepts
  53. Database Management Systems
  54. Foundation for Object / Relational Databases: The Third Manifesto
  55. Refactoring Databases: Evolutionary Database Design
  56. Data Access Patterns: Database Interactions in Object-Oriented Applications

    User Experience


  57. Don't Make Me Think: A Common Sense Approach to Web Usability
  58. The Design of Everyday Things
  59. Programming Collective Intelligence: Building Smart Web 2.0 Applications
  60. User Interface Design for Programmers
  61. GUI Bloopers 2.0: Common User Interface Design Don'ts and Dos

    Mentality


  62. The Productive Programmer
  63. Extreme Programming Explained: Embrace Change
  64. Coders at Work: Reflections on the Craft of Programming
  65. Facts and Fallacies of Software Engineering

    History


  66. Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
  67. The New Turing Omnibus: 66 Excursions in Computer Science
  68. Hacker's Delight
  69. The Alchemist
  70. Masterminds of Programming: Conversations with the Creators of Major Programming Languages
  71. The Information: A History, A Theory, A Flood

    Specialist Skills


    In spite of the fact that many of these won't apply to your specific job, I still recommend reading them for the insight they'll give you into programming language and technology design.

  72. Peter Norton's Assembly Language Book for the IBM PC
  73. Expert C Programming: Deep C Secrets
  74. Enough Rope to Shoot Yourself in the Foot: Rules for C and C++ Programming
  75. The C++ Programming Language
  76. Effective C++: 55 Specific Ways to Improve Your Programs and Designs
  77. More Effective C++: 35 New Ways to Improve Your Programs and Designs
  78. More Effective C#: 50 Specific Ways to Improve Your C#
  79. CLR via C#
  80. Mr. Bunny's Big Cup o' Java
  81. Thinking in Java
  82. JUnit in Action
  83. Functional Programming in Scala
  84. The Art of Prolog: Advanced Programming Techniques
  85. The Craft of Prolog
  86. Programming Perl: Unmatched Power for Text Processing and Scripting
  87. Dive into Python 3
  88. why's (poignant) guide to Ruby
u/vmsmith · 4 pointsr/statistics

A couple of these have been mentioned already:

  • Fooled by Randomness -- Nassim Nicholas Taleb
  • The Black Swan -- Nassim Nicholas Taleb
  • The Drunkard's Walk
  • The Signal and the Noise (I'm almost finished reading it, and it's very good)

    [Note: Nassim Nicholas Taleb is an overbearing, insufferable egotist, but he says very interesting things, and I think his books are worth reading. I think he had an AMA on Reddit not too long ago.]

    Somewhat related, you might also consider The Information, by James Gleick. It pays to know something about the where and how of the raw material of statistics.
u/Nuclear-Cheese · 4 pointsr/gamedev

I also highly recommend for game developers lacking in math skills to check out 3D Math Primer for Graphics and Game Development. Unlike this book that is often recommended, I feel it does a better job for people who don't have a formal education in advanced mathematics or Computer Science who are interested in math directly relating to game development.

u/yanalex981 · 4 pointsr/computerscience

I taught myself bits in high school with "C++ for Everyone". Despite its rating, I thought it was good 'cause it has exercises, and I did a lot of them. Works really well for laying foundations. I didn't go through the whole book though, and knowing the language is only part of the battle. You need to know about algorithms and data structures as well. For graphics, trees seem really useful (Binary space partitioning, quadtrees, octrees etc).

After university started, I read parts of "C++ Primer", which was when the language really started making sense to me. You'll get more than enough time to learn the required amount of C++ by next fall, but CG is heavy in math and algorithms. If your CS minor didn't go over them (much), my old algorithms prof wrote a free book specifically for that course.

For using OpenGL, I skimmed the first parts of "OpenGL SuperBible". For general graphics, I've heard good things about "Mathematics for 3D Game Programming and Computer Graphics", and "Real-Time Rendering".

Careful with C++. It may deceptively look like Java, but honestly, trying to write good idiomatic C++ after years of Java took a major paradigm shift.

u/lazylex · 4 pointsr/MMORPG

These are interesting and insightful reads:

https://www.amazon.com/MMOs-Inside-Out-Massively-multiplayer-Role-playing/dp/1484217233/

https://www.amazon.com/MMOs-Outside-Massively-Multiplayer-Role-Playing-Psychology/dp/1484217802

Might be available cheaper at some other locations -- google Bartle mmo book

Also Raph Koster's more universal video game book:
https://www.amazon.com/Theory-Game-Design-Raph-Koster/dp/1449363210/

u/YUM_BlueberryMuffins · 4 pointsr/learnpython
  1. Code more.
  2. Watch Raymond Hettinger videos on youtube.
  3. Watch David Beazley videos on youtube.
  4. Read Fluent Python.
  5. Podcasts? Talk python to me.
  6. Code more again.
  7. Repeat.
u/jungrothmorton · 4 pointsr/learnpython

Fluent Python. This book took my Python and coding in general to the next level.

u/Earhacker · 4 pointsr/learnprogramming

Websites are built with HTML, CSS, JavaScript, and that's it. You can have a back end (the bit that handles and serves up the data) written in any language, but every website you see is just HTML, CSS and JavaScript.

If you're underwhelmed by Python, I don't think you'll like JavaScript. It's like Python, only weird. But if you really want to get into web development, I'd recommend Eloquent JavaScript, which is free online, though the paper book is a slightly older edition until later this year, hopefully. It's an excellent book on JavaScript, but a little too terse for total newbies. It's great for someone learning JavaScript as a second language. If it's too much for you, then the best choice for newer readers is The Modern JavaScript Tutorial.

But I suspect that you haven't taken Python as far as you think you have. It's a great language for beginners because it offers quick wins, and you can build cool little apps very quickly. But that belies its depth. You can build website back ends with the Flask framework for Python. Miguel Grinberg has written the gold standard of Flask learning in the Mega Tutorial, and has expanded it into a paper book in Flask Web Development on O'Reilly. Or, if you want to explore the nerdy depths of Python and know it inside out, get Fluent Python.

If you really weren't impressed with GCSE Python, the next level up is probably learning Java or C#, maybe even Go or Rust if you fancy something a little more cutting edge. I'm not an expert in these languages and I haven't read much on them, so I'll defer you to other answers.

u/CrimsonCuntCloth · 4 pointsr/learnpython

Depending on what you want to learn:

PYTHON SPECIFIC

You mentioned building websites, so check out the flask mega tutorial. It might be a bit early to take on a project like this after only a month, but you've got time and learning-by-doing is good. This'll teach you to build a twitter clone using python, so you'll see databases, project structure, user logons etc. Plus he's got a book version, which contains much of the same info, but is good for when you can't be at a computer.

The python cookbook is fantastic for getting things done; gives short solutions to common problems / tasks. (How do I read lines from a csv file? How do I parse a file that's too big to fit in memory? How do I create a simple TCP server?). Solutions are concise and readable so you don't have to wade through loads of irrelevant stuff.
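
In that recipe spirit, here's a minimal sketch (my own, not from the book) for the "file too big to fit in memory" case: stream it with a generator instead of loading it all. The file name and column names are made up for the example.

    import csv

    def high_scores(path, min_score):
        # Streams rows one at a time, so memory use stays flat at any file size.
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if float(row["score"]) >= min_score:
                    yield row

    for row in high_scores("scores.csv", min_score=0.9):
        print(row["name"], row["score"])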

A little while down the road if you feel like going deep, fluent python will give you a deeper understanding of python than many people you'll encounter at Uni when you're out.

WEB DEV

If you want to go more into web dev, you'll also need to know some HTML, CSS and Javascript. Duckett's books don't go too in depth, but they're beautiful, a nice introduction, and a handy reference. Once you've got some JS, Secrets of the javascript ninja will give you a real appreciation of the deeper aspects of JS.

MACHINE LEARNING
In one of your comments you mentioned machine learning.

These aren't language specific programming books, and this isn't my specialty, but:

Fundamentals of Machine Learning for Predictive data analytics is a great introduction to the entire process, based upon CRISP-DM. Not much of a maths background required. This was the textbook used for my uni's first data analytics module. Highly recommended.

If you like you some maths, Flach will give you a stronger theoretical understanding, but personally I'd leave that until later.

Good luck and keep busy; you've got plenty to learn!

u/OverQualifried · 4 pointsr/Python

I'm freshening up on Python for work, and these are my materials:

Mastering Python Design Patterns https://www.amazon.com/dp/1783989327/ref=cm_sw_r_cp_awd_kiCKwbSP5AQ1M

Learning Python Design Patterns https://www.amazon.com/dp/1783283378/ref=cm_sw_r_cp_awd_BiCKwbGT2FA1Z

Fluent Python https://www.amazon.com/dp/1491946008/ref=cm_sw_r_cp_awd_WiCKwbQ2MK9N

Design Patterns: Elements of Reusable Object-Oriented Software https://www.amazon.com/dp/0201633612/ref=cm_sw_r_cp_awd_fjCKwb5JQA3KG

I recommend them to OP.

u/fajitaman · 4 pointsr/learnprogramming

The usual advice is "get out and program!" and that works, but it can be very tricky coming up with something to write that's also satisfying. The idea is that you learn best by doing, and that many topics in programming can't really be learned without doing. All that stuff is true and I'm not denying that at all, but some of us need more. We need something juicier than spending hours configuring a UI for a project we couldn't care less about. It shouldn't be an exercise in masochism.

I guess what I'm saying is that there are a lot of ways to learn to write code, and books are great if you can really sink your teeth into them (a lot of people can't). Code Complete is a great book on the practice of programming. You also say that you "get" OO pretty well, but it might open your eyes to read up on design patterns (e.g., Head First Design Patterns). You have a long way to go before you really get it.

In addition to those, you could delve deeper into your languages of choice. There's no way around JavaScript if you're a web programmer, and a book like JavaScript: The Good Parts is pretty enlightening if you've got some experience in JavaScript already. It's a pretty interesting and unusual language.

But sometimes programming is about building gumption, so instead of just being practical, try to figure out what you like about computers and keep going deeper into it. If you have an interest in computer science and not in just building apps, then something like Structure and Interpretation of Computer Programs could instill in you an enthusiasm for computers that trickles down to everything else you do. If you're more interested in web design, there are probably similarly interesting books on artistic design principles.

I think what I'm ultimately saying is that you should find what you enjoy doing and just go deeper down the rabbit hole, getting your hands dirty when it's appropriate and interesting.

u/rbudhrani · 4 pointsr/QuantumComputing

There was a similar post a while back and I listed out some resources. Here they are, edited for your case:

Intro book (good to start with or use as a companion in your case): https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176 (you can probably find this online)

Youtube playlist by Nielsen: https://www.youtube.com/playlist?list=PL1826E60FD05B44E4

Quantum cryptography: https://www.edx.org/course/quantum-cryptography-0

Understanding the basics of how computers works: https://www.coursera.org/learn/digital-systems

I added a basic course for digital systems and how computers work. I guess you don’t need any intro courses on Quantum physics. The cryptography course really starts off with basic stuff and it’s easy to pick up.

I would recommend just getting started right away with these and looking for resources on classical computing as you make your way through the resources. You can hit me up if you want to get started with something more advanced like quantum error correction and fault tolerant quantum computing.

u/YuvalRishu · 4 pointsr/QuantumComputing

Hi, I work on programming quantum computers. I studied in Canada (PhD from the Institute for Quantum Computing at the University of Waterloo) and I now live and work in Sydney, Australia. Your TL;DR is actually a bit different from the rest of your post, so I'll answer the questions in the TL;DR first.

I started getting interested in quantum computing when I was an undergraduate in Physics. I began with an interest in quantum entanglement and did a couple of summer research projects in the subject. I did my Master's degree with my supervisor for the last of those projects, and even wrote my first paper based on that work.

Quantum entanglement is of course very important in quantum computing but the study of the subject is more under the heading of quantum information theory. I switched over to quantum computing when I was deciding where to go for my PhD, and decided that I wanted to do the PhD to answer one simple question to myself: how far away are we, really, from a quantum computer? While I was finishing my PhD, the opportunity in Sydney came up and I decided that I liked the work happening here. I was (and am) interested in simulating quantum fields on a quantum computer, and have gotten interested in simulating physics in general (doesn't have to be quantum) as well as solving problems on a quantum computer in general (doesn't have to be physics).

We're talking about close to half my life at this point, so it's hard to summarise that story in any reasonable way. But if I had to try, I'd say that I followed my nose. I was interested in stuff, so I found ways to learn as much as I could about that stuff from the best people I could find who would give me the time of day or, better yet, a pay check. One of the nice things about doing science as a student is that there are plenty of people willing to pay you to study science if you know how to ask nicely.

Training a scientist is a long and arduous process, from the perspective of the student, the teacher, and the society as a whole. Take your time to learn properly. Don't let the bumps in the road stop you!

With the motivational stuff out of the way, my best advice is to learn everything. I mean everything. Physics, maths, computer science, engineering, chemistry, philosophy, sociology, history, everything. I know you can't possibly become an expert in all of that, but get at least a passing knowledge in whatever strikes your interest. When you hit on the thing that you simply can't stop thinking about, the thing that you literally lose sleep over, then you've found the topic for your PhD thesis. Find a supervisor and work on that as hard as you can for as long as you can until they tell you to get out and get a real job.

If that's not the advice you're looking for, then I'll try another piece. Go study functional analysis. You can't possibly understand quantum physics without knowing some functional analysis. If you're serious about quantum physics, this is now your bible. And when you give up on that book (and you will give up on that book), read this. When you're done, read this.

u/PastyPilgrim · 4 pointsr/AskComputerScience

Ah. I would start with computer architecture before getting to systems programming then. Hennessy and Patterson is the book that (I think) most computer architecture classes use to teach the material. Before that, however, you may want to learn circuit basics and introductory computer organization.

It's going to be pretty difficult material to teach yourself unfortunately. Computer Architecture is hard and requires a lot of foundational knowledge (circuits, organization, logic, etc.), but once you understand how hardware works, then you should be able to move into systems programming with more ease.

u/Brianfellowes · 4 pointsr/askscience

The other answer in this thread is a bit simplistic, and not quite correct in some cases.

First, let's look at where differences in IPC (instructions per cycle) can arise. In essence, all Intel and AMD CPUs are Von Neumann machines, meaning they perform computations by reading data from memory, performing an operation on that data, and then writing the result back to memory. Each of those actions takes time, more specifically cycles, to perform. Computers can read from memory using load instructions. Computers operate on data through logical instructions (add, subtract, multiply, divide, bit shift, etc.) or control instructions (conditional/unconditional branches, jumps, calls... basically instructions to control what code gets executed). Computers write data to memory through store instructions. All useful programs will use all 4 types of instructions to some degree. So in order to improve IPC, you can implement features which will speed up the operation of those instructions, at least over the lifetime of the program (in other words, they improve the average performance of an instruction).

So how can you improve the operation of these instructions? Here's a crash course in (some) major features of computer architectures:

  1. Pipelining: Instead of doing an instruction in 1 cycle, you can do 1/Nth of an instruction in 1 cycle, and the instruction will take N cycles to complete. Why do this? Let's say you split an instruction's execution into 3 parts. Once the first 1/3 of instruction 0 is completed on cycle 0, you can execute the second 1/3 of instruction 0 in cycle 1 as well as the first 1/3 of instruction 1. The overall benefit is that if you can execute 1 instruction in t time, you can execute 1/n of an instruction in t/n time. So our 3-stage pipeline can still on average do 1 instruction per cycle, but it can run 3 times faster. Practical impact: the processor frequency can be greatly increased. In this case, by 3x.
  2. Caching: Believe it or not, loads and stores to memory take far far far longer than logical or control instructions. Well, at least without the caching optimization. The idea of caching is to keep a small, fast memory close to the processor and the larger, slower memory farther away. For example, if you sat down at your desk and wanted a pencil, where would you want to have it? On the desk? Inside the desk drawer? In your storage closet? Or down the street at the office supply store? You have a small number of things you can fit on top of your desk, but keeping your pencil there is the best if you use it frequently. Practical impact: the average time it takes to access RAM is somewhere between 50 and 120 cycles. The average time to access the L1 cache (the fastest and smallest cache) is 3-5 cycles.
  3. Superscalar processing: Let's say that you have the following code:

    a = b + c
    d = e + f
    This code will form two add instructions. One key thing to note is that these two instructions are completely independent, meaning that the instructions can be performed in any order, but the result will be the same. In fact, the two instructions can be executed at the same time. Therefore, a superscalar processor will detect independent instructions, and try to execute them simultaneously when possible. Practical impact: allows the processor to reach an IPC higher than 1. Code varies a lot, but the theoretical IPC maximum for most single-thread programs is somewhere between 2-5.
  4. Branch prediction: When we introduce pipelining, we run into a problem where we might not be able to execute the first 1/3 of an instruction because we don't know what it is yet. Specifically, if we have a control instruction, we need to complete the control instruction before we can figure out what the next instruction to execute is. So instead of waiting to finish the control instruction, we can predict what the next instruction will be and start executing it immediately. The processor will check its prediction when the control instruction finishes. If the prediction is correct, then the processor didn't lose any time at all! If it guesses incorrectly, it can get rid of the work it did and restart from where it guessed wrong. Practical impact: modern processors predict correctly 98+% of the time. This saves many, many cycles that would otherwise be spent waiting. (A toy predictor sketch appears after this list.)
  5. Out of order / speculative processing: Building on superscalar processing, processors can try to guess on a lot of things in advance. Let's say there's a load instruction 10 instructions ahead of where the processor is currently executing. But, it doesn't look like it depends on any of the previous 9 instructions? Let's execute it now! Or, what if it depends on instruction 5? Let's guess at the result of instruction 5 and use it to execute instruction 10 anyways! If the processor guesses wrong, it can always dump the incorrect work and restart from where it guessed wrong. Practical impact: it can increase IPC significantly by allowing instructions to be executed early and simultaneously.
  6. Prefetching: A problem with caching is that the cache can't hold everything. So if the processor needs data that isn't in the cache, it has to go to the large, slow RAM to get it. Think of it like when you look in your refrigerator for milk, but you don't have any, so you have to spend time going to the store. Well, processors can try to guess at the data they will need soon, and fetch that data from RAM to put it in the cache before it is needed. Think of it like realizing your milk is low, so you stop by the store and pick some up on the way home from work. That way, when you actually need the milk it will already be there! Practical impact: prefetching can significantly reduce the average time it takes to get data from RAM, thus increasing IPC.
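
    To make one of these concrete, here is a toy sketch in Python (mine, not modeled on any specific CPU) of the classic 2-bit saturating-counter branch predictor from item 4:

        def predict(state):
            # States 0-1 predict "not taken"; states 2-3 predict "taken".
            return state >= 2

        def update(state, taken):
            # Nudge the counter toward the observed outcome, saturating at 0
            # and 3, so a single anomaly doesn't flip a strongly biased branch.
            return min(state + 1, 3) if taken else max(state - 1, 0)

        # A loop branch: taken 9 times, then falls through once, repeated.
        outcomes = ([True] * 9 + [False]) * 100
        state, correct = 3, 0
        for taken in outcomes:
            correct += predict(state) == taken
            state = update(state, taken)

        print(f"accuracy: {correct / len(outcomes):.0%}")   # 90% on this pattern

    Real predictors are far more sophisticated (they track history and correlate across branches), but the saturating counter is the standard textbook starting point.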

    The conclusion
    Knowing exactly why AMD's architecture doesn't have the same IPC as Intel's is a bit difficult, because essentially no one has access to the internal design details of both Intel and AMD simultaneously. It would be like trying to tell how someone got sick - you can come up with a lot of educated guesses and theories, but there's not really a way to tell for sure.

    Another reason is that many of the inventions that go into CPU microarchitectures are patentable. So it could easily be that Intel has certain patents that they are unwilling to license or AMD doesn't want to license.

    To put things in perspective, both Intel and AMD perform all of the above items I listed. The main difference between the two is how they are implemented. Their architectures will differ in how many pipeline stages they use, how many instructions they can execute at the same time, how far ahead they can look for independent instructions, how they decide which data to prefetch, etc. These will cause minor differences in IPC.

    The bottom line
    One of the larger differences between the two recently, in my opinion, has been in the small techniques they use to implement speculation around load instructions. Intel pulls more tricks when trying to guess the value of a load before it is known for sure, and to do so quickly and correctly. It's hard for AMD to replicate these because the techniques are either trade secrets or patented.

    Edit: many of these techniques can are described in the "bible" of computer architecture:
    J. Hennessy, and D. A. Patterson. Computer architecture: a quantitative approach. Elsevier, 2011.
u/KibblesNKirbs · 4 pointsr/hardware

I used this book for my first comparch course; it was written by the guys who pioneered RISC and MIPS.

It goes over just about everything for the basics of how computer processors work; you can find a PDF online pretty easily.

u/MushinZero · 4 pointsr/ComputerEngineering

Do you understand a current ISA? This will become clearer once you do.

MIPS or RISC-V are the recommended ones to start with.

https://smile.amazon.com/dp/0124077269/ref=cm_sw_em_r_mt_dp_U_kfG4Db0D3JR91

https://smile.amazon.com/dp/0128122757/ref=cm_sw_em_r_mt_dp_U_hfG4DbCTTF7H3

Also, it is going to be much faster to implement a processor in an HDL than in Minecraft.

u/SneakingNinjaCat · 4 pointsr/AskComputerScience

I read Computer Systems: A Programmer's Perspective in my first BSc year in Software Engineering.

u/smidley · 4 pointsr/Transhuman

This one was a pretty good read.
The Singularity Is Near

u/s-ro_mojosa · 4 pointsr/learnpython

Sure!

  1. Learning Python, 5th Edition. This book is huge and it's a fairly exhaustive look at Python and its standard library. I don't normally recommend people start here, but given your background, go ahead.
  2. Fluent Python: Clear, Concise, and Effective Programming. This is your next step up. This will introduce you to a lot of Python coding idioms and "soft expectations" that other coders will have of you when you contribute to projects with more than one person contributing code. Don't skip it.
  3. Data Structures and Algorithms in Python. I recommend everybody get familiar with common algorithms and their use in Python. If you ever wonder what guys with CS degrees (usually) have that self-taught programmers (often) don't: it's knowledge of algorithms and how to use them. You don't need a CS degree (or, frankly any degree) to prosper in your efforts, but this knowledge will help you be a better programmer.

    Also, two other recommendations: either drill or build pet projects out of personal curiosity. Try to write code as often as you can. I block out time three times a week to write code for small pet projects. Coding drills aren't really my thing, but they help a lot of other people.
u/grahamboree · 4 pointsr/gamedev

The Starcraft Broodwar API has source code for a bunch of bots from the annual competition at AIIDE. You can find them here. They use a variety of techniques that will help you set you in the right direction.

I'd recommend this book too if you're interested in AI. It's the most comprehensive survey of the most common techniques used in the industry today.

Good luck!

u/marekkpie · 4 pointsr/gamedev

Programming Game AI by Example is another great resource.

u/RoguelikeDevDude · 4 pointsr/gamedev

Book suggestions? Now that's my jam.

Out of all the books I've read, here are my recommendations regarding game programming:

Eric Lengyel's books (only one out so far). This is aimed at game engine development, but if the 2nd onward are as in-depth as the first, they will be amazing fundamental knowledge. Also, they're not thick, and they're jam-packed with information.

Game Programming Patterns. The only book that comes more recommended than this is the one right below it by Jesse Schell. This book is fantastic, but you should write one or two small games to really get the most out of this book. You can also read it online on his website free, but then you don't get a pic of him and his dog on the back cover.

Book of Lenses. This is your intro/intermediate dive into game design. There are a lot of game design books, if you only read one, it should be this one.

Game AI By Example. This book is a hodgepodge of fantastic techniques and patterns by those in AAA. There are other books in the series (like Game AI Pro) which are similar, but in my opinion (at least when I read AI Pro 3), they're not as good. But more knowledge is never bad.

Truthfully, as I sit here looking over all my books, those are the only ones I'd consider mandatory for any seasoned developer. Of course plenty of developers get by without reading these books, but they likely pick up all the principles listed herein elsewhere, in bits and pieces, and would likely have benefited from reading them early on.

Here are a few others that I do recommend but do NOT consider mandatory. Sorry, no links.

Unity in Action. Personally, I recommend this or a more interactive online course version (udemy.com/unitycourse) if you want to learn Unity while having a resource hold your hand. Having read the book, taken the course, AND taken Unity's own tutorials on the matter, I'd rank them with the course being best, the book second, and the videos from Unity third. But none of them are bad.

Game Engine Architecture. This is the king for those who want a very broad introduction to making a game engine. It comes highly recommended by nearly anyone who reads it, just so long as you understand it's from a AAA point of view. Game Coding Complete is out of print and unlikely to be revisited, but it is similar. These are behemoths of books.

Real-Time Rendering. This is one I haven't read, but it comes very highly recommended. It is not an intro book, and is also over 1000 pages, so you want this alongside a more introductory book like Fundamentals of Computer Graphics. Truth be told, both books are used in university courses at the third- and fourth-year levels, so keep that in mind before diving in.

Clean code. Yeah yeah it has a java expectation, but I love it. It's small. Read it if you understand Java, and want to listen to one of the biggest preachers on how not to write spaghetti code.

The RimWorld guy, Tynan Sylvester I believe, wrote a book called Designing Games. I enjoyed it, but IMO it doesn't hold a candle to Jesse Schell's book. Either way, the guy did write that book after working in AAA for many years, then went on to create one of the most successful sim games in years. But yeah, I enjoyed it.

Last but not least, here are some almost ENTIRELY USELESS but interesting diagrams of what some people think you should read or learn in our field:

https://github.com/miloyip/game-programmer

https://github.com/utilForever/game-developer-roadmap

https://github.com/P1xt/p1xt-guides/blob/master/game-programming.md

u/0b_101010 · 4 pointsr/learnprogramming

But if you are looking for an algorithms and data structures book, I found The Algorithm Design Manual the most useful book of all I read in the field of CS. Find it used if you can, but even at full price it is very well worth it, in my opinion.

u/kharashubham · 4 pointsr/compsci

It's a great course; doing well in it would definitely give you more confidence in taking up more challenging problems in the CS or software engineering domain in general. I would recommend going through the Data Structures chapter in The Algorithm Design Manual by Steven S. Skiena, which is also a great book for studying algorithms. It is an amazing read and I recommend it to all CS majors.

u/DashAnimal · 4 pointsr/compsci

I know this is a pretty common recommendation and you've probably already heard of it or even read it, but can I recommend the book Code: The Hidden Language of Computer Hardware and Software? I think having a history of how we got to where we are today (written in an entertaining way) is a good starting point, even though it barely scratches the surface of computer science.

u/_9_9_ · 4 pointsr/learnpython

This book is supposed to be good: https://www.amazon.com/dp/B00JDMPOK2/

I've yet to read it. I've been messing with computers for a long, long time. But, at some point I think most people agree that they are magic.

u/AnalyzeAllTheLogs · 4 pointsr/learnprogramming

Although it's more about product delivery and lifecycle management, I'd recommend:

https://www.audible.com/pd/Business/The-Phoenix-Project-Audiobook/B00VAZZY32

[No audiobook, but worth the read] The Mythical Man-Month, Anniversary Edition: Essays On Software Engineering https://www.amazon.com/dp/B00B8USS14/

[No audiobook, but about 1/3 the price at the moment for kindle and really good]
Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) https://www.amazon.com/dp/B00JDMPOK2/


https://www.amazon.com/Dreaming-Code-Programmers-Transcendent-Software/dp/B00AQ5DOCA

https://www.amazon.com/Scrum/dp/B00NHZ6PPE

u/Jigxor · 3 pointsr/gamedev

If you like books, I've been reading this one lately Programming Game AI by Example:
http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782

I'm really enjoying it and it has a tonne of useful information. Read some of the pages on the Amazon site and see if it sounds interesting for you.

u/Orthak · 3 pointsr/mylittleandysonic1

Unity is the bee's knees.
I've been messing with it casually for several years, and got serious in the last 2-ish years. I like it because I get to use C#, and that's the language I know best. Only problem is it's using some weird limbo version of .NET 2 that's not actually 2.0 but is also 3.0 in some places? I think it's because it's using Mono 2.0, which is some subset of .NET. It's weird. They're moving to 4.5 soon anyways so I'm hype for that. It's been a lot of fun regardless; I get to apply a different knowledge and tool set from my day job. Not to mention it feels great when you actually get something to build and actually work.

So anyways here's a list of resources I've found over the years to be super helpful:

Things on Reddit

u/SgtBursk · 3 pointsr/gamedev

Do the enemies ever hit the player? Perhaps you can try scaling the player's health to how often enemies land attacks. (For instance, instead of Zelda's 6+ half-heart system, use old school Mario's 'one hit you die unless powered up' system. Or Super Meat Boy's one-hit KO.)

>the player could just roll out of the way and beat the shit out of them while the enemy's attack was missing.

Sounds like fun to me! But I take it you're going for a different feel?

I'm currently reading Programming Game AI by Example. It goes from basic AI moving to pathfinding to goal hierarchies. The majority of the book might be a little too introductory for you though, at least until you get to the goals section.

u/klop2031 · 3 pointsr/HowToHack

Yeah, it can get very boring. The best thing I can recommend is to just try it out on your "personal" network.

I don't know how much you know about programming, but learn to program: learn Python, C/C++, Java. After this you should be able to pick up any language.

look at this for injecting cookies
http://dustint.com/post/12/cookie-injection-using-greasemonkey

and for learning more hacking, try Violent Python:
https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

As with everything, find the PDF of it. It's out there.

Google is your friend. So you want to look up tutorials for things like SQL injection, XSS, and Cross-Site Request Forgery (CSRF).
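
To give a taste of the first one, here is a minimal sketch (my own toy example, using Python's built-in sqlite3 module on a throwaway in-memory database) of why string-built queries are injectable and parameterized queries are not:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    db.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

    user_input = "nobody' OR '1'='1"    # attacker-controlled value

    # Vulnerable: the input is spliced straight into the SQL string.
    unsafe = f"SELECT secret FROM users WHERE name = '{user_input}'"
    print(db.execute(unsafe).fetchall())   # [('hunter2',)] -- the row leaks

    # Safe: a parameterized query treats the input as data, not SQL.
    safe = "SELECT secret FROM users WHERE name = ?"
    print(db.execute(safe, (user_input,)).fetchall())   # [] -- nothing leaks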

Here are some attacks you can read:

https://www.owasp.org/index.php/Category:Attack

u/-vandarkholme · 3 pointsr/HowToHack

https://www.amazon.com/Black-Hat-Python-Programming-Pentesters/dp/1593275900/ref=sr_1_1?ie=UTF8&qid=1468724554&sr=8-1&keywords=black+hat+python

https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579/ref=sr_1_1?ie=UTF8&qid=1468724562&sr=8-1&keywords=violent+python

Two good books that will answer what you need. They go over making different tools that can help you in a penetration test.

I think Python should be used more to automate things you'd normally do with other tools, not so much to make "hacks" themselves.

You'd probably be better off using Ruby with the Metasploit framework to actually make exploits, if that's what you mean.

u/michael0x2a · 3 pointsr/learnprogramming

I would argue that in the long run, you should probably have a good understanding of all of those languages, and more -- security is the sort of thing where having a comprehensive, deep understanding of the entire system, and of how the different pieces interact, is valuable.

That said, as a beginner, your instinct is right: you should pick just a single language, and stick with it for a while (maybe a year or so?). Since you're going to end up learning all of those languages at one point or another, it doesn't matter so much which particular one you start with, since you'll need to continuously be learning throughout your career. If you decide not to learn something today, you'll probably end up learning it a few months from now.

I would personally recommend Python as a good starting point, mainly because I happen to know one or two security-oriented introductions to Python, and am less familiar with what tutorials are available in other languages. In particular, there's a book named Violent Python which introduces Python from a security context to beginners. It does skim over some of the intro material, so you may want to supplement this with a more in-depth introductory tutorial to Python.

I think C, then C++ would then make decent second and third languages to learn, respectively. You can probably fit in Ruby and Java anywhere in between or after, as necessary -- I suspect they'll be easier to pick up.

u/_o7 · 3 pointsr/HowToHack

There is books on the subject, Violent Python comes to mind.

With that being said, you don't just write a script to hack a system. You're exploiting a vulnerability in the system.

For example, a lot of exploits are buffer overflows. The quick and dirty explanation: a program accepts a parameter and has a fixed amount of space assigned in memory for that parameter; a buffer overflow attack supplies more data than the parameter was allocated, overflowing into the stack and causing the code in the overflow to be executed.
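
A real overflow needs an unsafe language like C, but here is a toy model of the idea in Python (purely illustrative, with made-up addresses), treating a stack frame as raw bytes:

    # A 16-byte "stack frame": an 8-byte buffer followed by the return address.
    frame = bytearray(16)
    frame[8:16] = (0x401000).to_bytes(8, "little")      # legitimate return address

    def unsafe_copy(data):
        # No bounds check, like C's strcpy(): happily writes past the buffer.
        frame[0:len(data)] = data

    unsafe_copy(b"A" * 8 + (0xDEADBEEF).to_bytes(8, "little"))
    print(hex(int.from_bytes(frame[8:16], "little")))   # 0xdeadbeef -- attacker-controlled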

u/Truffl3 · 3 pointsr/HowToHack


This is asked a lot; start here. I would recommend starting with simple Notepad batch file programs; there are tutorials on YouTube. Once you're done exploring what that has to offer and think you're ready to move on to more complicated things, it's highly debated what you should start with; I prefer Python, but pick whatever suits what you want to do best. If you end up going with Python, then when you are somewhat fluent with it, get this book; it helped me a ton. If you want to communicate discreetly, look into IRCs, and obviously use a VPN.

u/ospatil · 3 pointsr/algorithms

Learning JavaScript Data Structures and Algorithms - Second Edition is a really good book with clear explanations and code examples.

Grokking Algorithms is also a wonderful book especially for non-CS people and beginners. The examples are in Python but it shouldn't be a problem given your Ruby and JavaScript background.

u/rispe · 3 pointsr/javascript

Congratulations! That's a big step. Be proud that you were able to make the switch. Not many people manage to transform ideas into results.

I think there are four areas on which you need to focus, in order to go from mediocre to great. Those areas are:

  1. Theoretical foundation.
  2. Working knowledge.
  3. Software engineering practices.
  4. Soft skills.

    Now, these areas don't include things like marketing yourself or building valuable relationships with coworkers or your local programming community. I see those as being separate from being great at what you do. However, they're at least as influential in creating a successful and long-lasting career.

    Let's take a look at what you can do to improve yourself in those four areas. I'll also suggest some resources.

    1. Theoretical foundation

    Foundational computer science. Most developers without a formal degree have some knowledge gaps here. I suggest taking a MOOC to remediate this. After that, you could potentially take a look at improving your data structures and algorithms knowledge.

  • CS50: Introduction to Computer Science
  • Grokking Algorithms
  • Algorithms by Sedgewick

    2. Working knowledge.

    I'd suggest doing a JavaScript deep-dive before focusing on your stack. I prefer screencasts and video courses for this, but there are also plenty of books available. After that, focus on the specific frameworks that you're using. While you're doing front-end work, I also suggest that you explore the back-end.

  • FunFunFunction on Youtube
  • You Don't Know JS
  • JavaScript Allonge
  • JavaScript Design Patterns

    3. Software engineering practices.

    Design patterns and development methodologies. Read up about testing, agile, XP, and other things about how good software is developed. You could do this by reading the 'Big Books' in software, like Code Complete 2 or the Pragmatic Programmer, in your downtime. Or, if you can't be bothered, just read different blog posts/Wikipedia articles.

    4. Soft skills.

  1. Actively seek to mentor and teach others (perhaps an intern at work, or someone at a local tech community, or create blog posts or videos online).
  2. Get mentorship and learn from others. Could be at work, or open source.
  3. Go to programming meetups.
  4. Try public speaking, go to a Toast Masters meetup.
  5. Learn more about and practice effective communication.
  6. Learn more about business and the domain that you're working in at your company.
  7. Read Soft Skills or Passionate Programmer for more tips.

    Some closing notes:

    - For your 'how to get started with open source' question, see FirstTimersOnly.

    - If you can't be bothered to read or do large online courses, or just want a structured path to follow, subscribe to FrontendMasters and go through their 'Learning Paths'.

    - 4, combined with building relationships and marketing yourself, is what will truly differentiate you from a lot of other programmers.

    Sorry for the long post, and good luck! :)
u/WineEh · 3 pointsr/WGU_CompSci

Since the course is in Python, you could also take a look at Grokking Algorithms. I think this is a great book that people overlook. It's easy to understand and a (fairly) quick read. https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

Common sense is also a great book. If you really want to brush up on DS&A you could check out Tim Roughgarden’s coursera courses and the related books.

I will point out, though, that at least in my experience you can access the course materials even for courses you transferred in, so once you get access to your courses, or even when you start DS&A at WGU, you can always refer back if you're struggling.

u/k4z · 3 pointsr/UCONN

Probably is. I don't have it handy.

Since you're learning data structures and algorithms in Python, any general data structures and algorithms course should work; just implement everything in Python.

it's hard to suggest [a good resource] off the top of my head that isn't a mere udemy shill or incredibly dense like stanford's algo course. grokking algorithms was okay; people might suggest introduction to algorithms, but there's a reason it's 1k pages and pure madness to "refresh" your knowledge with.

doing projects (crash course, automate) would help to refresh using python.

u/ThinqueTank · 3 pointsr/programming

I've actually been getting some interest lately because I just started attending meetups last week and let some engineers see what I've done so far.

Really appreciate the algorithms/data structures advice. I picked up this book to get a light overview of it first before I really dive into something more formal:
Grokking Algorithms: An illustrated guide for programmers and other curious people

I also have enough college credits to take a Data Structures course called Discrete Structures for Computer Science and my math up to Linear Algebra completed. Here's the description for the community college course:

> This course is an introduction to the discrete structures used in
Computer Science with an emphasis on their applications. Topics
covered include: Functions; Relations and Sets; Basic Logic; Proof
Techniques; Basics of Counting; Graphs and Trees; and Discrete
Probability. This course is compliant with the standards of the Association
for Computing Machinery (ACM).

Is the above what you're referring to more or less?

Are there any books and/or online courses you'd personally recommend for data structures and algorithms?

u/KlaytonWade · 3 pointsr/java

Grokking Algorithms.
This is the best one yet.

u/FourForYouGlennCoco · 3 pointsr/learnprogramming

I'd suggest finding a good course and a good book that you can use for reference. Then cementing your skills by doing lots of problems.

The book: this one's easy. Skiena's Algorithm Design Manual is highly regarded, surprisingly readable, and affordable by textbook standards. Don't need to read it cover to cover, but it's great reference, and I'd suggest reading the really critical chapters like graph search and sorting.

The course: You can't really do better than Stanford's course, taught by Tim Roughgarden. I took it a few years ago and have used it to brush up every time I apply for a new job.

The problems: it's important not to dive into this until after you've learned the basic concepts. But Leetcode is the standard, and for good reason -- it has a ton of problems, many of which are leaked from real companies. At first, these problems will seem very difficult -- it may take hours to solve a 'medium' level problem, if you can get it at all. If you simply can't get it, read the solutions and really understand them.

I also recommend the book Elements of Programming Interviews. You'll hear a lot of love for Cracking the Coding Interview on this sub, but it's too easy and outdated for current interview prep. It's a fine intro book, but it won't take you all the way to where you need to be.

u/BlackRaspberryChip · 3 pointsr/cscareerquestions

Wow that's awesome! Are you interested in embedded and RTOS systems at all?

If so, I'd highly recommend this and this and picking up a TI TM4C123 board to do some development on.

I also recommend this book for learning more about the practical application of algorithms.

u/munificent · 3 pointsr/learnprogramming

I've heard good things about The Algorithm Design Manual. I personally really got a lot from Algorithms in a Nutshell. As the name implies, it's a small book, but quite good.

I know you requested data structures, but the two subjects are closely intertwined: a given data structure often exists to support an algorithm and vice versa.

u/FullOfEnnui · 3 pointsr/cscareerquestions
u/ShadowWebDeveloper · 3 pointsr/ExperiencedDevs

Yes, working at a big N now after several years of working for startups. You'll need to study up on how to do successful coding interviews (which at large companies are very different from the startup interviews you've probably experienced). Cracking the Coding Interview is a good start. Leetcode is also good practice but make sure you do some whiteboard practice as well since you won't have the advantage of a compiler when doing the actual interviews. If, like me, you didn't have a formal CS background, consider watching the MIT 6.006 Introduction to Algorithms videos and possibly doing the exercises before any of that. They were really helpful when textbooks like the oft-recommended The Algorithm Design Manual came across as super-heavy (but still useful as a reference).

Good luck!

^(Disclaimer: Opinions are my own, not necessarily my employer's.)

u/empty-stack · 3 pointsr/gamedev

The Algorithm Design Manual

It's very concise and well organized. I always give this a quick skim before an interview

u/schreiberbj · 3 pointsr/compsci

This question goes beyond the scope of a reddit post. Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems.

In the meantime you can look at things like datapaths which are controlled by microcode.

This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away.

u/FearMonstro · 3 pointsr/compsci

Nand to Tetris (coursera)

the first half of the book is free. You read a chapter, then write programs that simulate hardware modules (like memory, the ALU, registers, etc). It's pretty insightful and gives you a richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses more on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.

Code: The Hidden Language of Hardware and Software

This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.

Computer Networking Top Down Approach

one of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.

Operating System Design

A great OS book. It actually shows you the C code used to design and code the Xinu operating system. It's written by a Purdue professor. It offers a top-down look, but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!)

Digital Design Computer Architecture

another good "build a computer from the ground up" book. The strength of this book is that it gives you more background into how real-life circuits are built (it uses VHDL and Verilog), and it provides a nice overview chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture that implements a MIPS processor. The diagrams used in this book are really nice.

u/nonenext · 3 pointsr/explainlikeimfive

If you want to know how a computer is made, this amazing book explains everything clearly from scratch, with each chapter building on the last.

It builds up from scratch: Morse code, to a simple electrical circuit with a battery and flashlight bulb, to telegraphy and relays with more advanced circuits, to how numbers are represented logically, to binary digits (0 and 1), to how much you can do with just binary digits (and how barcodes work), to Boolean logic and switches, to logic gates, to more advanced circuitry, to bytes and hex, to how memory works, to automation... and that's only halfway through the book.

The writing is very clear and understandable, and everything he explains lays the groundwork you need to understand what comes later in the book.

After you finish this book, you'll know EVERYTHING about electricity and what happens behind the scenes: how a computer works, how RAM works, how hard drives work, how CPUs work, how GPUs work, everything.
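To give a flavor of the logic-gates chapters, here is a minimal sketch (mine, not the book's; the book builds its gates out of relays) of how two gates combine into a half adder that adds two binary digits:

    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b

    def half_adder(a, b):
        """Add two binary digits; returns (sum_bit, carry_bit)."""
        return XOR(a, b), AND(a, b)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> carry {c}, sum {s}")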

u/throwdemawaaay · 3 pointsr/AskEngineers

Buy this book. If you find it interesting/motivating then you'll like the general field. Because it sits at the intersection of CS and EE you'll have a toolkit that would allow you to pursue a very broad spectrum of work.

u/SakishimaHabu · 3 pointsr/AskComputerScience
u/zkSNARK · 3 pointsr/ComputerEngineering

If you wanna go deeper with the hardware, this is the book my university used. It contains a lifetime of knowledge. However, it is nowhere close to the readability of Code. Where I found Code to be friendly and inviting, this book is more of a grind-through-100-pages-in-2-months-and-question-your-existence type of thing. For OS stuff, we used this one. I'd say it's a lot more friendly to read than the architecture book, but really, as you go deeper into both of these subjects, they don't get friendlier.

u/gtani · 3 pointsr/compsci
u/mfukar · 3 pointsr/askscience

> When we say dual core or quad core processor, what we really mean is a single integrated chip (CPU) with 2 (dual) or 4 (quad) processors on it. In the old days processors were single core so this confusion didn't arise as a single core processor was just a processor.
>

Many CPUs can be included in a single integrated die.

In "the old days" there were multi-chip modules includinged multiple CPUs (and/or other modules) in separate ICs.

> A processor consists of a control unit (CU) and an arithmetic logic unit (ALU).

And many other things, which it sometimes shares (MMUs, I/O controllers, memory controllers, etc). Don't be too picky over what a processing unit includes. For those that want to dive in, grab this or this book and read on.

> The combination of components is why just having more cores or more GHz doesn't always mean a faster CPU - As the onboard cache and other factors can also slow the processing down, acting as a bottleneck.

Bit of a superfluous contrast, these days. Using anything external to the CPU slows it down, by virtue of propagation delays alone. That's one of the reasons we want many cores / CPUs. The more CPUs or faster clocks question is a red herring - here's an article that explores why (the context is CAD, but the observations are valid in most areas of application).

u/Opheltes · 3 pointsr/learnprogramming

Patterson and Hennessy's textbooks, Computer Architecture and Computer Organization and Design, are pretty much the standard textbooks used in every computer architecture class, everywhere.

u/mdf356 · 3 pointsr/cscareerquestions

It's about 40 years too late for any one person to have mastery of all the different parts of a computer.

For computer architecture, Hennessy and Patterson is the classic volume. For the software algorithms that are used everywhere, CLRS is the classic guide. For Operating Systems, The Design and Implementation of FreeBSD is a good book. I'm sure there's something similar for networking.

You could read the PCI spec, and some Intel data sheets, and the RFCs for networking protocols if you want to learn those things. For most parts of computers, I strongly suspect that most material is either too high level (yet another "Introduction to") or too low level (reading an RFC doesn't tell you whether it's used that way in practice or has been superseded).

The only way I've gotten to know things is to play with them. Change the code, make it do something else. Personal research like that is very educational but time consuming, and there's way too much out there to know everything, even for a single small piece of hardware and its associated software.

u/Quinnjaminn · 3 pointsr/cscareerquestions

Copy pasting my response to a similar question:

Edited to have more resources and be easier to read.

It's hard to draw the line between "essential" and "recommended." That depends a lot on what you want to do. So, I will present a rough outline of core topics covered in the 4 year CS program at my university (UC Berkeley). This is not a strict order of topics, but prerequisites occur before topics that depend on them.

Intro CS

Topics include Environments/Scoping, abstraction, recursion, Object oriented vs functional programming models, strings, dictionaries, Interpreters. Taught in Python.

The class is based on the classic MIT text, "Structure and Interpretation of Computer Programs." Of course, that book is from 1984 and uses Scheme, which many people don't want to learn due to its rarity in industry. We shifted recently to reading materials based on SICP, but presented in python. I believe this is the reading used now. This course is almost entirely posted online. The course page is visible to public, and has the readings, discussion slides / questions and solutions, project specs, review slides, etc. You can find it here.

Data Structures and basic algorithms

DS: Arrays, Linked Lists, Trees (Binary search, B, Splay, Red-Black), Hash Tables, Stacks/Queues, Heaps, Graphs. Algorithms: Search (Breadth first vs depth first), Sorting (Bubble, radix, bucket, merge, quick, selection, insert, etc), Dijkstra's and Kruskal's, Big-O analysis.

This class uses two books: "Head First Java" and "Data Structures and Algorithms in Java" (any edition except 2). The class doesn't presupposed knowledge in any language, so the first portion is covering Object Oriented principles and Java from a java book (doesn't really matter which), then moving to the core topics of data structures and algorithms. The course page has some absolutely fantastic notes -- I skim through these before every interview to review. You can also check out the projects and homeworks if you want to follow along. The course page is available here (note that it gets updated with new semesters, and links will be removed -- download them soon if you want to use them).

Machine Structures (Intro Architecture)

Warehouse scale computing (Hadoop Map-Reduce). C language, basics of assemblers/compilers/linkers, bit manipulation, number representation. Assembly Language (MIPS). CPU Structure, pipelining, threading, virtual memory paging systems. Caching / memory hierarchy. Optimization / Performance analysis, parallelism (Open MP), SIMD (SSE Intrinsics).

This class uses two books: "The C Programming Language" and "Computer Organization and Design". This class is taught primarily in C, so the first few weeks are spent as a crash course in C, along with a discussion/project using Map-Reduce. From there in jumps into Computer Organization and Design. I personally loved the projects I did in this class. As with above, the lecture slides, discussion notes, homeworks, labs, solutions, and projects are all available on an archived course page.

Discrete Math / Probability Theory

Logic, Proofs, Induction, Modular Arithmetic (RSA / Euclid's Algorithm). Polynomials over finite fields. Probability (expectation / variance) and its applicability to hashing. Distributions, Probabilistic Inference. Graph Theory. Countability.

Time to step away from coding! This is a math class, plain and simple. As for a book, well, we really didn't have one. The class is based on a series of "Notes" developed for the class. When taken as a whole, these notes serve as the official textbook. The notes, homeworks, etc. are here.

Efficient Algorithms and Intractable Problems

Designing and analyzing algorithms. Lower bounds. Divide and Conquer problems. Search problems. Graph problems. Greedy algorithms. Linear and Dynamic programming. NP-Completeness. Parallel algorithms.

The Efficient Algorithms class stopped posting all of the resources online, but an archived version from 2009 has homeworks, reading lists, and solutions. This is the book used.

Operating Systems and System Programming

Concurrency and Synchronization. Memory and Caching. Scheduling and Queuing theory. Filesystems and databases. Security. Networking.

The Operating Systems class uses this book, and all of the lectures and materials are archived here (Spring 2013).

Math

Those are the core classes, not including about 4 (minimum) required technical upper division electives to graduate with a B.A. in CS. The math required is:

  • Calculus 1 and 2 (Calc AB/BC, most people test out, though I didn't)

  • Multivariable calculus (not strictly necessary, just recommended)

  • Linear Algebra and Differential Equations.

    Those are the core classes you can expect any graduate from my university to have taken, plus 4 CS electives related to their interests. If you could tell me more about your goals, I might be able to refine it more.
u/QuoteMe-Bot · 3 pointsr/ComputerEngineering

> We use vivado in school and they teach verilog. My impression is that VHDL is more of an industry standard, but I'm still a student so don't quote me on that. The way my university introduced digital logic was by having us start at logic gate level then use those modules to make state machines and last semester we made a MIPS processor.

> Vivado (web pack should be free)
> https://www.xilinx.com/products/design-tools/vivado.html

> Here is the book we used for the processor
https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

~ /u/laneLazerBeamz

u/morto00x · 3 pointsr/embedded

Are you familiar with logic design (multiplexers, decoders, registers, logic gates, etc)? Computer Organization and Design covers a lot of it and is relatively easy to read. But having some background in digital logic will help a lot.

u/nimblerabit · 3 pointsr/compsci

I learned mostly through reading textbooks in University, but not many of the books we were assigned stood out as being particularly great. Here are a few that I did enjoy:

u/sarpunk · 3 pointsr/learnprogramming
  • I second the other comments about practice & sticking with projects. Perfectionism can be a great thing, but if it keeps you from finishing a project, let it go. The first iterations of your projects don't have to be perfect - just getting through them will help you grow.

  • Procrastinating on homework assignments will also tank your grade (been there, done that), even if the material seems easy - some programming assignments just take loads of time.

  • It sounds like you're still in school, so you'll probably be exposed to lots of different languages and paradigms, and that's a good thing. If you're going to insist on perfection in personal projects, though, it might be easiest to focus on one area, like halfercode suggested.

  • Finally, for reading material: It sounds like you don't need any basic intros, so look for advanced tutorials to new languages you want to learn, or just read the language documentation. This is a pretty good competency matrix to rate yourself against - if something looks unfamiliar, browse through the wiki page. Other great books: Computer Systems: A Programmer's Perspective - doesn't assume a ton of prior knowledge, but gets to a fair amount of depth pretty quickly. There are also really cool systems programming labs. Matt Might's list of everything a CS major should know is really comprehensive, with lots of reading material referenced. If I were you, I would focus specifically on the Data Structures & Algorithms and Theory sections, supplementing with practical projects.

  • As for projects: Start small, no matter the final size of the project. Focus on getting out a minimal example of what you want to do before you worry about what the UI looks like or perfect functioning.

    tl;dr: Practice & perseverance are the main points. No one is really any good at programming until they've got a few years of churning out code, so don't get discouraged. Finally: don't let the breadth of the computer science/software world overwhelm you. Focus on small pieces, and in a few years you'll have learned more than you would have expected.
u/opensourcedev · 3 pointsr/programming

If you are looking to go a little deeper, I can recommend this book as well:

"Computer Systems: A Programmer's Perspective"

http://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040/ref=sr_1_1?ie=UTF8&qid=1292882697&sr=8-1

This book has a similar thread, but is much more in-depth and covers modern Intel and AMD processors, modern Operating Systems and more.

I learned all about linkers, compiler optimization, the crazy memory management structure of x86...

u/hewhomustbenamed · 3 pointsr/learnprogramming

Umm... if you want to do that, then write a simple program to, let's say, sum 10 numbers in C. Now, compile this file and "step" through the program in gdb... as you see each assembly line executed, you will get an understanding of what's going on.

However, for some sanity, please refer to the Intel manual or use this book (there might be other references as well): http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040


There's a free beta edition somewhere, and you will need Chapters 2 and 3. Take one full day to read both of them thoroughly, and you'll be golden. Let me know how it goes.

u/nabnob · 3 pointsr/AskReddit

Are you in high school or college?

C# is very similar to Java - it's object oriented, has garbage collection (meaning you can get away with not learning about memory management), and is strongly typed. I wouldn't really say it's that useful to learn if you already know Java, unless you end up working for a software company that does work in C#.

C doesn't have any of those nice features of Java and C# (strong typing, garbage collection), and all variables - pointers, integers, characters - are treated as bits stored somewhere in memory, either on the stack or the heap. Arrays and structs (similar to objects in Java, sort of) are longer blocks of memory. C++ is an object-oriented version of C, and if you already know C and Java you would be able to pick up C++ fairly quickly.

Learning C forces you to learn a lot of memory and system concepts. It's not really used in the software industry as much because, since it's missing all those nice Java and C# features, it can be difficult to write huge, complicated systems that are maintainable. If you want to be a serious developer, you DO need to learn these things before you graduate from college. Most major software companies ask systems/memory type questions in their interviews.

However, if you're in high school, I wouldn't say it's really necessary to try to learn C on your own unless you really want to. A good computer science program in college would require at least one class on C programming. If you are really interested, I would look at this to learn C, and later this for more information on how computers work.

TL;DR: Learn C in college if you want to be a software engineer and, if you're in high school, learn whatever you find interesting.

u/ideophobic · 3 pointsr/Futurology

Let's do some math.

My son was born this year. With an average life span of 76 years, he should most likely die by 2090. But I will also make the assumption that in the years between 2014 and 2090 we will find ways to extend the average life span a little, let's say by 30 years. So now his average life span is 106 years and the "death year" is extended to 2120.

But between 2090 and 2120 science will continue to advance, and we will probably have a life expectancy of 136 years by then, which now makes his death year 2150. And so forth, until science finds a way to keep him alive forever. Even if it takes the better part of a century, some of the younger people will still be beyond the cutoff.

Now, if you actually talk to real scientists who have studied this in much more detail, they are saying that this will not take a century, and should take just a few decades to achieve "escape velocity" for immortality. There is a book written about this, and about how in the next few decades we will unlock the mysteries of aging: The Singularity Is Near: When Humans Transcend Biology

u/TehGinjaNinja · 3 pointsr/confession

There are two books I recommend to everyone who is frustrated and/or saddened by the state of the world and has lost hope for a better future.

The first is The Better Angels of Our Nature by Stephen Pinker. It lays out how violence in human societies has been decreasing for centuries and is still declining.

Despite the prevalence of war and crime in our media, human beings are less likely to suffer violence today than at any point in our prior history. The west suffered an upswing in social violence from the 1970s to the 1990s, which has since been linked to lead levels, but violence in the west has been declining since the early 90s.

Put simply, the world is a better place than most media coverage would have you believe, and it's getting better year by year.

The second book I recommend is The Singularity is Near by Ray Kurzweil. It explains how technology has been improving at an accelerating rate.

Technological advances have already had major positive impacts on society, and those effects will become increasingly powerful over the next few decades. Artificial intelligence is already revolutionizing our economy. The average human life span is increasing every year. Advances in medicine are offering hope for previously untreatable diseases.

Basically, there is a lot of good tech coming which will significantly improve our quality of life, if we can just hang on long enough.

Between those two forces, decreasing violence and rapidly advancing technology, the future looks pretty bright for humanity. We just don't hear that message often, because doom-saying gets better ratings.

I don't know what disability you're struggling with, but most people have some marketable skills, i.e. they aren't "worthless". Based on your post, you clearly have good writing/communicating skills. That's a rare and valuable trait. You could look into a career leveraging those skills (e.g. as a technical writer or transcriptionist) which your disability wouldn't interfere with too badly (or which an employer would be willing to accommodate).

As for being powerless to change the world, many people feel that way because most of us are fairly powerless on an individual level. We are all in the grip of powerful forces (social, political, historical, environmental, etc.) which exert far more influence over our lives than our own desires and dreams.

The books I recommended present convincing arguments that those forces have us on a positive trend line, so a little optimism is not unreasonable. We may just be dust on the wind, but the wind is blowing in the right direction. That means the best move may simply be to relax and enjoy the ride as best we can.

u/bombula · 3 pointsr/Futurology

Any futurist or regular reader of /r/futurology can rehearse all of the arguments for why uploading is likely to be feasible by 2100, including the incremental replacement of biological neurons by artificial ones which avoids the "copy" issue. If you're not already familiar with these, the easiest single reference is probably The Singularity is Near.

u/lukeprog · 3 pointsr/Futurology

> This is my problem with Kurzweil, et al, who make arguments based on the availability of raw computing power, as if all that's required for the Singularity to emerge is some threshold in flops.

That's not quite what Kurzweil says; you can read his book. But you're right: the bottleneck to AI is likely to be software, not hardware.

> I don't think we're any closer to forming an AI now than medieval alchemists were to forming homunculi using preparations of menstrual blood and mandrake root

On this, I'll disagree. For a summary of recent progress made toward AI, see The Quest for AI.

u/maurice_jello · 3 pointsr/elonmusk

Read Superintelligence. Or check out Bostrom's TED talk.

u/j4nds4 · 3 pointsr/elonmusk

>Really? It's still their opinion, there's no way to prove or disprove it. Trump has an opinion that global warming is faked but it doesn't mean it's true.

From my perspective, you have that analogy flipped. Even if we run with it, it's impossible to ignore the sudden dramatic acceleration in AI capability and accuracy over just the past few years, just as it is with the climate. Even the CEO of Google was caught off-guard by the sudden acceleration within his own company. Scientists also claim that climate change is real and that it's an existential threat; should we ignore them because they can't "prove" it? You can't prove the future, so you predict based on the trends. And their trend lines have a lot of similarities.

>Also, even if it's a threat(i don't think so, but let's assume it is), how putting it in your brain will help? That's kind of ridiculous. Nowadays you can turn your PC off or even throw it away. You won't be able to do that once it's in your brain. Also, what if the chip decides to take control over your arms and legs one day? It's insane to say that AI is a threat but to plan to put it inside humans' brain. AI will change your perception input and you will be thinking you are living your life but in reality you will be sitting in a cell somewhere. Straight up some Matrix stuff. Don't want that.

The point is that, in a hypothetical world where AI becomes so intelligent and powerful that you are effectively an ant in comparison, both in intelligence and influence, a likely outcome is death just as it is for billions of ants that we step on or displace without knowing or caring; think of how many species we humans have made extinct. Or if an AI is harnessed by a single entity, those controlling it become god-like dictators because they can prevent the development of any further AIs and have unlimited resources to grow and impose. So the Neuralink "solution" is to 1) Enable ourselves to communicate with computer-like bandwidth and elevate ourselves to a level comparable to AI instead of being left in ant territory, and 2) make each person an independent AI on equal footing so that we aren't controlled by a single external force.

It sounds creepy in some ways to me too, but an existential threat sounds a lot worse. And there's a lot of potential for amazement as well. Just like with most technological leaps.

I don't know how much you've read on the trends and future of AI. I would recommend Nick Bostrom's book "Superintelligence: Paths, Dangers, Strategies", but it's quite lengthy and technical. For a shorter thought experiment, see the Paperclip Maximizer scenario.

Even if the threat is exaggerated, I see no problem with creating this if it's voluntary.

u/Mohayat · 3 pointsr/ElectricalEngineering

Read Superintelligence by Nick Bostrom; it answered pretty much all the questions I had about AI, and I learned a ton of new things from it. It's not too heavy on the math, but there is a lot of info packed into it. Highly recommend it.

u/mavelikara · 3 pointsr/programming

The book has a different approach than standard textbooks. While all other books I have read classify algorithms by the problem they solve, this book classifies algorithms by the technique used to derive them (TOC). I felt that the book tries to teach the thought process behind crafting algorithms, which, to me, is more important than memorizing details about a specific algorithm. The book is not exhaustive, and many difficult topics are ignored (deletions in Red-Black trees, for example). The book is written in an engaging style, and not the typical dry academic prose. I also liked the use of puzzles as exercises.

The only other comparable book, in terms of style and readability, is that of Dasgupta, Papadimitriou and Vazirani. But I like Levitin's book better (only slightly). These two books (+ the MIT videos) got me started in algorithms; I had to read texts like CLRS for thoroughness later. If someone is starting to study the topic my recommendation would be to read Levitin and get introduced to the breadth of the topic, and then read CLRS to drill deeper into individual problems and their details.

u/ocusoa · 3 pointsr/Physics

Do you know which fields of physics you're interested in?

If Quantum Information/Quantum Computation sounds interesting, I would look at this book. I used it when I first learned about the topic. It doesn't assume much advanced math; just basic matrix/vector multiplications will suffice.
There's a reason the book doesn't assume much prior knowledge. It has two parts, Quantum Information and Quantum Computation. Roughly speaking the former is physics and the latter is computer science. And usually physicists don't know much about computer science and computer scientists don't know much about physics.
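To show what "basic matrix/vector multiplications" means in this context, here's a minimal sketch (mine, not from the book) of applying a quantum gate to a qubit with NumPy:

    import numpy as np

    # A qubit state is a length-2 complex vector; |0> = (1, 0).
    ket0 = np.array([1, 0], dtype=complex)

    # The Hadamard gate is a 2x2 matrix; applying it is one multiplication.
    H = np.array([[1,  1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    state = H @ ket0
    print(state)               # equal superposition of |0> and |1>
    print(np.abs(state) ** 2)  # measurement probabilities: [0.5, 0.5]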


There's also another book, "Q is for Quantum", published very recently by Terry Rudolph. I haven't read the book myself (I plan to), but from what he described in an email it might be something you're looking for:


> I have finally finished a book, "Q is for Quantum", that teaches the fundamentals of quantum theory to people who start off knowing only basic arithmetic.

> I have successfully used this method in outreach with students as young as 12, but of course it is much easier when you can have a proper back-and-forth dialogue. In practice it is late-stage high school students I am most passionate about reaching with this book - I believe quantum theory can (and should) be taught quantitatively in high school, not 2 years into an undergraduate physics degree! In fact I would be delighted if the 3rd and 4th year students entering my undergraduate lecture courses already understood as much quantum theory as covered in the book.


Have fun!

u/csp256 · 3 pointsr/compsci

The Mathematical Theory of Communication is short and sweet.

Once you have had your linear algebra, you might be interested in looking at quantum computing. The canonical text there is Quantum Computation and Quantum Information.

u/Orphion · 3 pointsr/quantum

I would recommend The Feynman Lectures on Physics. They're expensive books, but the description of quantum mechanics is particularly good, albeit 50 years old. Moreover, the lectures cover all of the other things you'll need to know in physics as well.

The problem with the Feynman lectures being old is that in the 50 years since they were given, quantum information has emerged as a field entirely separate from quantum mechanics/physics. The Mike and Ike book is the best single introduction to the field, but it, too, is expensive.

Luckily, there is a huge number of articles published on the physics arxiv, some of which are quite approachable. This introduction to quantum information is written by many of the giants in the field.

u/tburke2 · 3 pointsr/Physics
u/redcalcium · 3 pointsr/webdev

I'm not really good at explaining stuff, but this book is a great resource on Turing machines and Turing completeness: Introduction to the Theory of Computation

The Turing machine is a model first proposed by Alan Turing. It's similar to a finite automaton, but it has unlimited tape memory. Basically, modern computers are implementations of the Turing machine. One characteristic of a Turing machine is that it's possible to use one Turing machine to simulate another.

A programming language is said to be Turing complete if it can be used to simulate a Turing machine. If you've written a program in a Turing-complete language, you should be able to translate that program to any other Turing-complete programming language.

So if CSS becomes a Turing-complete language, I imagine people will use it to create crazy stuff such as an x86 emulator.
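To make the definition concrete, here's a minimal Turing machine simulator (my own sketch with a made-up transition table, not from the book); this particular machine just flips every bit on its tape and halts:

    # Minimal Turing machine simulator (a sketch, not from any textbook).
    # Each rule maps (state, symbol) -> (symbol to write, head move, next state).
    rules = {
        ("run", "0"): ("1", 1, "run"),   # write 1, move right, stay in "run"
        ("run", "1"): ("0", 1, "run"),
        ("run", "_"): ("_", 0, "halt"),  # hit the blank: halt
    }

    def simulate(tape, state="run", head=0):
        tape = dict(enumerate(tape))     # unbounded tape as a dict
        while state != "halt":
            symbol = tape.get(head, "_")
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += move
        return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

    print(simulate("10110"))  # prints 01001: every bit flipped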

u/bonesingyre · 3 pointsr/coursera

Just my 2 cents: The Stanford Algorithms class is more about designing algorithms. The Princeton Algorithms class is more about implementation and real world testing.

The FAQ at the bottom:

How does Algorithms: Design and Analysis differ from the Princeton University algorithms course?

The two courses are complementary. That one emphasizes implementation and testing; this one focuses on algorithm design paradigms and relevant mathematical models for analysis. In a typical computer science curriculum, a course like this one is taken by juniors and seniors, and a course like that one is taken by first- and second-year students.


As a computer science student, I would encourage you to pick up a book on Discrete Mathematics, and pick up Robert Sedgewick's Algorithms textbook. Sedgewick's Algorithms book is more about implementing algorithms, compared to CLRS, which is another algorithms textbook written by some very smart guys. CLRS is far more in depth.

I took a Data Structures and Algorithms class recently; we used Sedgewick's textbook. I will be taking another Algorithms & Design class later using CLRS.

Books:
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_1_1?s=books&ie=UTF8&qid=1372267786&sr=1-1&keywords=discrete+mathematics
http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_1_1?s=books&ie=UTF8&qid=1372267775&sr=1-1&keywords=algorithms
http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_1_1?ie=UTF8&qid=1372267766&sr=8-1&keywords=clrs
http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X/ref=sr_1_1?s=books&ie=UTF8&qid=1372267798&sr=1-1&keywords=theory+of+computation

The last book is super important for CS students; I would read that front to back as well.

u/falafel_eater · 3 pointsr/AskComputerScience

Computer Science is a pretty big field, so "strong foundation" can mean different things to different people.
You will definitely want the following:

  1. Introduction to Algorithms and Data Structures
  2. Introduction to Computability
  3. Introduction to Operating Systems

    For algorithms and data structures, a very commonly used textbook is Cormen.
    For computability, Sipser.

    Operating Systems I don't remember off the top of my head.
    That said, you are probably much better off finding a high-quality university course that is based on these textbooks instead of trying to read them cover-to-cover yourself. Check out lecture series from places like MIT on youtube or whatever.

    After that, you can take an Intro to Artificial Intelligence, or Intro to Communication Networks, or any other intro-level course in a more specific sub-area. But if you lack a basis in computability to the point where you don't know what an NP-Complete problem is, or have no idea what a Binary Search Tree is, or do not know what an Approximation Algorithm is, then it would be hard to say you have a strong foundation in CS.
u/dionyziz · 3 pointsr/cryptography

Hi,

You should already know most of the math you need from your math major. It helps to know number theory, group theory, and algebraic curves, depending on what you do. Discrete mathematics is also important: discrete probability theory, Markov chains, graph theory, logic, proof methods, solving recurrences, etc. are all helpful tools.

In terms of computer science, it's imperative to know how to program. You can learn a programming language such as Python. Project Euler is a good place to start for a mathematician. Knowledge of algorithms is also important, and you must understand computability and complexity theory.
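As a taste, the very first Project Euler problem (sum all multiples of 3 or 5 below 1000) is a one-liner in Python (my sketch of a solution):

    # Project Euler, problem 1: sum of all multiples of 3 or 5 below 1000.
    print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))  # 233168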

Stanford's Cryptography I is a good place to start learning cryptography. I think you can go ahead and start the course without attempting prerequisites, see where you get stuck, and then go and learn what is required of you.

u/maruahm · 3 pointsr/math

I think learning proofs-based calculus and linear algebra are solid places to start. To complete the trifecta, look into Arnold for a more proofy differential equations course.

After that, my suggestions are Rudin and, to build on your CS background, Sipser. These are very standard references, though Rudin's a slightly controversial suggestion because he's notorious for being terse. I say, go ahead and try it, you might find you like it.

As for names of fields to look into: Real Analysis, Complex Analysis, Abstract Algebra, Topology, and Differential Geometry mostly partition the field of mathematics with corresponding undergraduate courses. As for computer science, look into Algorithmic Analysis and Computational Complexity (sometimes sold as a single course called Theory of Computation).

u/walker6168 · 3 pointsr/AskHistorians

You should organize it by tech. Remember that numbers are tools. You're going to end up talking about what they were for more than the maths. Navigation, architecture, calendars: all of those would develop at different rates in a civilization and have math supporting them.

Just as two examples:

The beginning of basic numbers as a system would have been merchants and people accounting for debt/business transactions. There were a lot of mechanisms for doing this in early civilization via the temples, tribes, or governments. For details on that you should check out Debt: The First 5000 Years.

Computers are a different branch that deal with information transmission. James Gleick's outstanding book The Information: A History, A Theory, A Flood is a concise history of how we go from drum codes in the Congo to the Difference Engine to Turing's computer language.

u/kaki024 · 3 pointsr/suggestmeabook

The Information: A History, A Theory, A Flood

https://www.amazon.com/dp/1400096235/ref=cm_sw_r_cp_apa_dZ3ozb100CF8D

u/scarletham · 3 pointsr/finance

Love stuff like this and this.

u/truancy-bot · 3 pointsr/math

In a more general sense (and in my view), information is surprise. Anything else is just already-known data presented in a certain way. I recommend The Information by James Gleick for a detailed (and philosophical) analysis of this question.

u/waxxxd · 3 pointsr/gamedev

Hmm was just looking at this book today, can't vouch for it but might be worthwhile.

Mathematics for 3D Game Programming

u/gunnar_osk · 3 pointsr/gamedev

"I've never tried graphics programming (OpenGL or otherwise), but sure... this post looks intriguing"

[Opens the link]

"Looks like a well written and informative tutorial, but I don't know most of the stuff he's writing about"

[Goes down the rabbit hole of OpenGL information]

"Damn it, now I HAVE to learn OpenGL from the start. Been looking for an excuse to brush up on my C++ skills anyways."

[Bookmarks the webpage]

"I wonder I need to brush up on my math skills also? Oh well, guess I'll cross that bridge when I come to it"

[Thinks of that book I bought that's collecting dust on the bookshelf]

:)

u/TurkishSquirrel · 3 pointsr/learnprogramming

It depends a bit on what areas you're interested in. For interactive graphics you'll likely do OpenGL or DirectX or such.
Non real-time graphics usually means ray tracing or some variant like photon mapping where you want to produce physically correct images, with flexibility depending on your art direction e.g. Big Hero 6. With ray tracing you're essentially simulating how light interacts in the scene.

Here's some useful books/links for real time graphics:

  • Real-Time Rendering: this is a great book covering a lot of theory/math topics behind real-time graphics techniques, so it's agnostic to whatever rendering API you use. The book's website lists more graphics-related resources and is quite good.
  • OpenGL Superbible: a good book focusing on OpenGL, written for beginners with the API.
  • open.gl: very good introductory tutorials for OpenGL, I just wish it covered some more content. Should give you a solid start though.

    Here's some for ray tracing:

  • Physically Based Rendering: this is basically the book for ray tracing, though the 3rd edition should be coming out this spring, so if you want to save some money you could wait a bit. There's also a website for this book.

    For general math topics I also recently picked up Mathematics for 3D Game Programming and Computer Graphics which looks very good, though I haven't gone through it as thoroughly.

    As mentioned already /r/GraphicsProgramming is a good subreddit, there's also /r/opengl for OpenGL questions.
u/johnnyanmac · 3 pointsr/gamedev

Personally, I used this book to refresh myself on the basic vector math and finally understand some 3D linear algebra concepts. It probably goes a bit deeper than you'd ever need if you're using an engine (how 3D transformations work at the matrix level, quaternions, polar mathematics), but the book uses extremely accessible language to explain everything, so you rarely feel confused the way you do with a typical math textbook.
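For a flavor of the "transformations at the matrix level" material, here's a minimal NumPy sketch (mine, not from the book) of rotating a 3D point with a rotation matrix:

    import numpy as np

    def rotation_z(theta):
        """3x3 matrix rotating points by theta radians around the z-axis."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0],
                         [s,  c, 0],
                         [0,  0, 1]])

    point = np.array([1.0, 0.0, 0.0])
    print(rotation_z(np.pi / 2) @ point)  # ~[0, 1, 0]: x-axis rotated onto y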

I haven't read it, but this book is the standard that people typically refer to for gamedev math. If you want to be experimental, the same author just released the first part of a series on game engine development. While it ultimately goes in a different direction, the first book there should cover the important math, and it is under half the length of the other books.

u/othellothewise · 3 pointsr/opengl

I would recommend Mathematics for 3D Game Programming and Computer Graphics. It has all the derivations plus covers a whole lot of other useful topics. It's well worth the $45.

Another approach is to go through the math manually, by taking points and projecting them and trying to understand the behavior.

u/ebonyseraphim · 3 pointsr/gamedev

That is a great book with a math primer and continued coverage as you get deeper into the specifics of collision detection. I also own this which does a better job teaching plain math relevant to game dev and is agnostic about whether it applies to collision, physics, or rendering:

http://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869/

I highly recommend it. It's well ordered and well written. It is the quality you'd expect from something you pay for and will save you time with its completeness and clarity.

u/frizzil · 3 pointsr/VoxelGameDev

Agreed 100%. Additionally, if you're trying to learn basic OpenGL, Java combined with LWJGL is actually a great choice, since the language is generally quick to iterate with. And definitely go with the advanced pipeline, as learning immediate mode isn't going to help you much if advanced is your end goal.

Also, big piece of advice -- you're really going to want a solid understanding of 3D matrix math before diving in. Particularly, you're going to want to know the difference between row-major and column-major systems, and how to perform basic manipulations in both. To this end, I highly recommend the book Mathematics for 3D Game Programming and Computer Graphics.
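On the row-major vs column-major point, here's a minimal NumPy sketch (my own illustration of one common reading of that distinction, the column-vector vs row-vector convention):

    import numpy as np

    # A 90-degree rotation about z, written for the column-vector convention.
    M = np.array([[0, -1, 0],
                  [1,  0, 0],
                  [0,  0, 1]])
    v = np.array([1, 0, 0])

    print(M @ v)    # column-vector convention: M @ v -> [0 1 0]
    print(v @ M.T)  # row-vector convention uses the transpose: same result

    # The order of composed transforms also flips between conventions:
    # (B @ A) @ v  ==  (v @ A.T) @ B.T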

u/timostrating · 3 pointsr/Unity3D

TL;DR

Take a look at spaced repetition. Study without any music and use the absence of music as a check to see if you are still motivated to do your studying.


I fucked up the first part of my education too. Luckily I realized that and got motivated again before I finished school.

I am currently 19 years old and I also always loved math (and some physics). I am from the Netherlands, so our education system does not really translate well to English, but in high school I basically only did things that interested me. I got low grades on everything else.

One moment in high school really stayed with me, and I have only now realized what was happening. In high school I had a course on the German language. I already had a low grade for that class, so I said to myself that I would study extremely hard for the next small exam. The exam was pretty simple: the task was to learn 200 to 250 German words. So I took a piece of paper and wrote down all 250 words 21 times. One or two days later I had the exam. But when I got my grade back, it said that I scored an F (3/10). I was totally confused, and it only destroyed my motivation more and more.
What I have now come to realize is that learning something is not just about smashing a book into your head as fast as possible.


So these are some tips I wish I could have given myself in the first year of high school:

Go and sit in a quiet room or in the library. This room should be in total silence. Now start with your studying. As soon as you feel the urge to put on some music, you should stop, reflect, and take a little break.

The default in nature is chaos. Learn to use this to your advantage. I sit on a bus for 2+ hours a day: 1 hour to school and 1 hour back. Nearly every student does nothing in this time, so I made a rule for myself to do something productive in that time, like reading a book. Normally I am just at my desk at home, and before I know it, it is already midnight. So this is, for me at least, a really good way to force myself to start reading a book in those otherwise wasted 2 hours.

Get to know your body and brain. I personally made a bucket list of 100 items that includes 10 items about doing something for a week, like running at 6am for a week or being vegan for a week. Fasting is also really great; just do it for 1 day. So only drink water for one day and see how you feel. And try the same with coffee, sex, fapping, and alcohol: quit for 1 day and see how you feel. Then set the goal of quitting once for a whole week straight.

Watch this video to get a new view on the difference between low and high energy. I never understood this, but I think everybody should know about the difference: https://youtu.be/G_Fy6ZJMsXs <-- sorry, it is 1 hour long, but you really should watch it.

Learn about how your brain stores information and how you can improve upon this. Spaced repetition is one of those things that really changed the way I now look at learning something. https://www.youtube.com/watch?v=cVf38y07cfk


I am currently doing my high school math again for fun. After I am done with that, I hope to start on these 3 books.

u/SpaceToaster · 3 pointsr/gamedev

http://www.amazon.com/gp/aw/d/1435458869 is always on my desk (I have the 2nd edition)

A great cookbook of useful formulas and algorithms for graphics programming. Not really a reference for game logic level algorithms though.

u/morrison539 · 3 pointsr/gamedesign

Nice rundown. Here are some other books I would recommend OP check out:

u/tblaich · 3 pointsr/truegaming

Finally home and having a chance to reply. I pulled five books off of my shelf that I would recommend, but there are doubtless more that you should read.

Raph Koster's Theory of Fun for Game Design

Janet H. Murray's Hamlet on the Holodeck: The Future of Narrative in Cyberspace

Noah Wardrip-Fruin and Pat Harrigan's First Person: New Media as Story, Performance, and Game

Noah Wardrip-Fruin and Pat Harrigan's Second Person: Role-Playing and Story in Games and Playable Media

They wrote a Third Person as well, I just haven't gotten the chance to read it yet. You might be able to find PDF copies online somewhere, but if you have the money, you should try to support the writers by buying. Show them that people are interested in critical discourse about games.

Next week I think I'm going to order a few new texts (after payday), and I'd be happy to let you know what I think once I have them in hand.

u/NoMoreBirds · 3 pointsr/tabletopgamedesign

You should check out Raph Koster's A Theory of Fun, and Keith Burgun's Clockwork Game Design.

Those were the "eye openers" for me.

u/Goliathvv · 3 pointsr/DestinyTheGame

From Theory of Fun for Game Design by Raph Koster:

> Human beings are all about progress. We like life to be easier. We’re lazy that way. We like to find ways to avoid work. We like to find ways to keep from doing something over and over. We dislike tedium, sure, but the fact is that we crave predictability. Our whole life is built on it. Unpredictable things are stuff like drive-by shootings, lightning bolts that fry us, smallpox, food poisoning—unpredictable things can kill us! We tend to avoid them. We instead prefer sensible shoes, pasteurized milk, vaccines, lightning rods, and laws. These things aren’t perfect, but they do significantly reduce the odds of unpredictable things happening to us.
>
> And since we dislike tedium, we’ll allow unpredictability, but only inside the confines of predictable boxes, like games or TV shows. Unpredictability means new patterns to learn, therefore unpredictability is fun. So we like it, for enjoyment (and therefore, for learning). But the stakes are too high for us to want that sort of unpredictability under normal circumstances. That’s what games are for in the first place—to package up the unpredictable and the learning experience into a space and time where there is no risk.
>
> The natural instinct of a game player is to make the game more predictable because then they are more likely to win.
>
> This leads to behaviors like “bottom-feeding,” where a player will intentionally take on weaker opponents under the sensible logic that a bunch of sure wins is a better strategy than gambling it all on an iffy winner-take-all battle. Players running an easy level two hundred times to build up enough lives so that they can cruise through the rest of the game with little risk is the equivalent of stockpiling food for winter: it’s just the smart thing to do.
>
> This is what games are for. They teach us things so that we can minimize risk and know what choices to make. Phrased another way, the destiny of games is to become boring, not to be fun. Those of us who want games to be fun are fighting a losing battle against the human brain because fun is a process and routine is its destination.
>
> So players often intentionally suck the fun out of a game in hopes they can learn something new (in other words, find something fun) once they complete the task. They’ll do it because they perceive it (correctly) as the optimal strategy for getting ahead. They’ll do it because they see others doing it, and it’s outright unnatural for a human being to see another human being succeeding at something and not want to compete.
>
> All of this happens because the human mind is goal driven. We make pious statements like “it’s the journey, not the destination,” but that’s mostly wishful thinking. The rainbow is pretty and all, and we may well enjoy gazing at it, but while you were gazing, lost in a reverie, someone else went and dug up the pot of gold at the end of it.
>
> Rewards are one of the key components of a successful game activity; if there isn’t a quantifiable advantage to doing something, the brain will often discard it out of hand.(...)

u/thegreatcollapse · 3 pointsr/gamedev

The suggestions from /u/random (wow that username!) are both great books, and you should also check out Raph Koster's A Theory of Fun for Game Design. Though not specific to game design, you might also be interested in Mihaly Csikszentmihalyi's Flow: The Psychology of Optimal Experience

u/adrixshadow · 3 pointsr/gamedesign
u/lemontheme · 3 pointsr/datascience

Fellow NLP'er here! Some of my favorites so far:

u/ajkn1992 · 3 pointsr/Python

This book. It contains recipes for writing idiomatic Python code.
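A tiny example of the kind of idiomatic-vs-not contrast such a book trades in (my example, not one of the book's recipes):

    names = ["ada", "grace", "alan"]

    # Non-idiomatic: manual index bookkeeping.
    for i in range(len(names)):
        print(i, names[i])

    # Idiomatic: let enumerate do the bookkeeping.
    for i, name in enumerate(names):
        print(i, name)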

u/chillysurfer · 3 pointsr/Python

While I don't think Fluent Python would give you the "nitty-gritty" parts of Python, it seems like it would be a great book for an experienced Python developer looking to polish their Python programming.

Disclaimer: I haven't read this book yet, but it is, no kidding, in the mail on the way to me. Like you, I'm looking to polish certain parts of my Python programming, and just become an all-around better Python developer.

u/JollyRogers1503 · 3 pointsr/learnpython

I recommend Fluent Python

u/mvferrer · 3 pointsr/programming

Have you ever considered reading the SICP book from MIT? You could also try an algorithms book, and brush up on your design patterns.

u/ffualo · 3 pointsr/askscience

Hi RandomNumber37,

So here's a little bit about me first; I don't want to misrepresent myself. My background is in economics and political science, where I was interested in statistical models that predict rare international events like war and state failure. It's here I became obsessed with statistics, machine learning, etc. Also, I've been programming in many languages since I was a kid, so after my undergraduate work in the social sciences and statistics, I took a job with a bioinformatics group doing coding. I thought this would be a temporary job until graduate school in economics or quantitative political science.

However, working with large-scale biological and sequencing data was way more awesome than I expected. This caused me to shift focus. I also did a fair amount of work on computational statistics, i.e. ways of trying to make R better, understanding compiler technologies, etc. So after that, I became more purely interested in statistics and computational biology, and I thought I would go to graduate school for pure statistics so I could also devote some time to computational statistics. However, now I work in a plant breeding lab (which I absolutely love). I will do this for about another 2-3 years before I transition into a graduate program. This would mean I've worked in the field about 6 years before applying to graduate programs.

So, with that out of the way here are answers to your questions and some advice I offer:

  1. How much of your time is spent working with the plants themselves vs with computer-organized data?

    Being that my background isn't in biology, I don't currently work with plants much. However, this is why I moved towards plant biology. Before getting obsessed with social science methods, I loved plants. I worked at an orchid greenhouse, and actually went to UC Davis thinking I'd study plant biology (until an awesome political science professor got me excited about science applied to political data). However, the scientists I work with are often not doing too much work with plants: many grow the plants, do the wet lab work, then spend more than half the time (sometimes up to 90%) analyzing the huge amount of data. I spend my full day in front of a computer, except when a colleague wants me to check out something cool in the lab, etc.

  2. With what kind of operations does your computer aid you?

    Everything. We get raw sequencing data, and I have to analyze it from start to finish — from the raw sequencing files to the point where the numbers behind them tell a story. I also spend a huge amount of my time writing programs that do certain things for biologists in our group. Everything — protein prediction, data quality analysis, statistical modeling, etc.

  3. Do you see a full cycle... from plant, to data, to application of knowledge to your specimens (and back to data)?

    Yes, at this current position I am starting to (which is why I sought work in plant biology). It depends on what plant you work with (Arabidopsis = short life cycle, you can do lots of stuff, vs citrus tree = long life cycle, you can't do lots of stuff). But some of the more awesome longer term projects will take 4 years to fully materialize.

    So now, what steps were more important? I will tell you the three things that have helped me the most. As a point of how much they've helped me, I'll just mention that despite not having a PhD (yet), or much of a background in biology other than what I've taught myself or learned on the job (which is actually quite a lot after 4 years in the field), I've had (and continue to receive) really nice job offers.

  4. Learn programming really, really, really well. If you want to be a step above the rest, learn python and R. Perl is huge in bioinformatics, but it's a disgusting ugly language that's dying out in my opinion. It sucks for reproducibility; no one can read anyone else's code. It was great when everyone was racing to get the human genome sequenced and had to write quick scripts constantly. Now, we have larger software platforms for that stuff, and what will count most in the future is the distribution of your scientific code. Reproducibility problems will soon be primarily dry lab, not wet lab. If you doubt that, read the "Forensic Bioinformatics" paper (http://projecteuclid.org/euclid.aoas/1267453942) which was a game changer for me. I've always been passionate about open science and reproducibility, but that made me realize that we'll have a huge problem in a few years if we're not careful.

    Anyways, I'd recommend learning:

  • Python (with BioPython). Also, with Django if you're building web apps to interface with scientific databases.
  • R (with Bioconductor).
  • Unix command line (sed/awk, bash)
  • Know your editor. I use emacs. Even if it takes you 80 hours to learn emacs or your editor well, you will regain that time over a year of work. I promise. People watch me use emacs and they say it makes them dizzy because they can't keep up. That's dozens of hours saved each week.

    Now, optionally (but highly, highly recommended):

  • C. Absolutely necessary for debugging compiled programs or writing high-usage programs that need to be fast.
  • SQL. You'll be storing biological data in databases. SQL is important. Use SQLite a lot. People like huge PostgreSQL or MySQL databases for even small things, but this is a waste of time IMO if you're just going to be the one accessing it. Bioconductor leverages SQLite heavily because it's so easy and awesome.

    Now, even more optionally:

  • Lisp. Lisp will change the way you think about programming. It's also used with AraCyc, MetaCyc, and PlantCyc data. I've used it extensively in these applications. The ratio of how Lisp has changed my thinking to how much I use it in production code is HUGE. Learn functional programming concepts; then concepts like map/reduce will fall easily into place (see the quick sketch after this list). Know object orientation too.

  • Javascript. I love JS. It's doing amazing things too. And part of being a very effective bioinformatician/statistician is being able to easily convey your data. There is no easier and more interactive medium than a browser. Check out d3.js. Even old scientists can click a link and interact with data via Javascript. In contrast, they wouldn't want to install some old dusty Java application. Of course, with this comes HTML, XML, JSON, etc, etc. so learn those too.
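
To make the functional-programming point above concrete, here's a minimal map/filter/reduce sketch in Python (the data is made up; reduce lives in functools in Python 3):

from functools import reduce

words = ["gene", "protein", "gene", "snp", "protein", "gene"]

# map: transform each element; filter: keep some; reduce: fold to one value
lengths = list(map(len, words))                        # [4, 7, 4, 3, 7, 4]
genes = list(filter(lambda w: w == "gene", words))     # ["gene", "gene", "gene"]
total = reduce(lambda acc, w: acc + len(w), words, 0)  # 29

# the same fold, written as a word count
counts = reduce(lambda acc, w: {**acc, w: acc.get(w, 0) + 1}, words, {})
print(counts)  # {'gene': 3, 'protein': 2, 'snp': 1}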

  5. Learn statistics REALLY WELL. Honestly, try to pick up a statistics minor (over a CS minor IMO). Lots of brilliant programmers buy the Cormen algorithm book and are set for data structures and algorithms. But understanding statistics at a deeper level — that takes intimate study via courses. I would recommend taking courses on probability theory and mathematical statistics. I took two courses as part of our mathematical statistics series and I cannot even begin to emphasize how helpful they were. I heard a quote once: at Google they use Bayes theorem like other programmers use "if" statements. Same thing in bioinformatics. Look at the best SNP callers, software, etc, and they're using population genetics models and Bayes approaches. Know math stats early, and it will permeate your thinking in the best ways.

    Another quick story: I had a statistics graduate student come tell me he was working for a rather well known genomics professor on campus. He asked me how to analyze RNA-seq data. He said he wanted to use ANOVA. Even though he was a statistics graduate student, he went immediately to normality-assuming models, which was definitely not appropriate for this data. So know your Poisson, negative binomial, gamma, etc distributions. A probability course should introduce them all to you. It will also mean that when you start learning more theoretical population genetics, you'll be set.

    Also, buy a book on machine learning (Elements of Statistical Learning II is good, and a free PDF is available). It is dense, though; don't let it discourage you. I also like this book. But again, this is dense stuff, so don't let that discourage you either.

  6. Learn data structures and algorithms well. I think a single course, or doing this on your own, is sufficient. However, if you want to do what Heng Li does (author of BWA, samtools, and the fermi assembler) you need much, much more. Compression-based data structures are huge in bioinformatics now. I love this stuff, but it's too removed from the biology to be very interesting to me. But if that's the direction you want to move into, hang around the CS department more.

  7. Learn to code well. This is vastly underemphasized in the sciences. Learn about test-driven development. Get in the habit of writing unit tests early, and writing good documentation. Learn Git too — this is a must.

u/mighty-byte · 3 pointsr/AskReddit

As other mentionned, your question is pretty hard to answer on a post, even if it is pretty interesting. But having done a bit of research in discrete math & algorithm design and being a math lover myself, I'll try to talk about why I find my area interesting.

Being a software developer, you should pretty much know what an algorithm is. Given a problem, an algorithm is a recipe/step-by-step set of instructions that, if followed exactly, solves your problem. Algorithms can be classified into a lot of categories: deterministic algorithms (given some input, you always get the same right answer), probabilistic algorithms (you get the right answer 'on average'), approximation algorithms (you get an answer that is within some (provable) factor of the optimal answer) and so on. The main measure (there are multiple others) for the performance of an algorithm is the time it takes to find an answer with respect to the size of the input you are giving it. For example, if your problem is to search through an unordered list of n items for a given item, then the trivial solution (look at every item) takes time n. This is what we call linear in n, but algorithms can also run logarithmic in n (time log(n), very fast), exponential in n (c^n for some c, very slow) or polynomial in n (n^k for some k, efficient).

So the fun part is to find THE best algorithm for your problem. For a particular problem, one may ask how fast can the best algorithm run. Well surprisingly, it turns out that some problems will never admit an efficient EXACT solution. I'll give an example of such problem, and then come to the main point of my discussion.

Consider a graph/network (we love these things in discrete math), which is basically a set of points/nodes/vertices that are connected by links/edges, and its size is usually the number of nodes (the maximum number of edges of a (simple) graph is n(n-1)/2, the number of pairs you can make with n elements). The Internet is the best example: nodes are webpages and edges are hyperlinks between these pages. We say that two nodes are neighbors if they are connected by an edge. A fundamental problem of discrete mathematics goes as follows: what is the minimum number k such that you can color the nodes of a graph with k colors such that no two neighboring nodes share the same color? (http://en.wikipedia.org/wiki/Graph_coloring). It turns out that this problem can be proven to be inherently hard - if we can find an efficient deterministic algorithm (we strongly believe we can't) to solve this problem, then there is an efficient (=fast) algorithm to solve many "hard" problems (ex.: proving a theorem, or solving a sudoku! - probably not going to happen). Such hard problems are said to be NP-complete. It also turns out that most real life (interesting) problems are also that kind of hard (http://en.wikipedia.org/wiki/List_of_NP-complete_problems).

This sounds quite desperate. However, here is where the research starts. It is said that these NP-complete problems cannot have efficient DETERMINISTIC and EXACT algorithms. Nothing prevents us from producing randomized and approximate solutions. So with some clever analysis, input from other areas (read algebra, geometry, probability, ...) and other tricks, some algorithms are built to find solutions that are usually within a good factor of the optimal. Hell, some NP-complete problems (e.g. the Knapsack Problem) even admit an arbitrarily precise efficient algorithm. How is this possible? Reading required!
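
To make the coloring discussion concrete, here's a minimal greedy coloring heuristic in Python. It runs fast, but unlike an exact solver it isn't guaranteed to use the minimum number of colors — which is exactly the NP-complete part (the graph below is made up for illustration):

def greedy_coloring(graph):
    # graph: dict mapping node -> set of neighboring nodes
    colors = {}
    for node in graph:
        # pick the smallest color not already used by a colored neighbor
        taken = {colors[n] for n in graph[node] if n in colors}
        color = 0
        while color in taken:
            color += 1
        colors[node] = color
    return colors

# a triangle plus a pendant vertex needs 3 colors
graph = {'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b', 'd'}, 'd': {'c'}}
print(greedy_coloring(graph))  # e.g. {'a': 0, 'b': 1, 'c': 2, 'd': 0}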

I don't know of non-scholar books that covered this subject, but if you are motivated, here are the books I suggest:

u/sindrit · 3 pointsr/compsci

Skip Calculus (not really useful unless you do fancy graphics or sound generators or scientific stuff). Discrete mathematics is what you want to look at for CS. You might want to move on to a linear algebra course from there.

Get the CS specific University textbooks. Here are some to get you started.

u/wcastello · 3 pointsr/learnprogramming

Structure and Interpretation of Computer Programs (SICP) is still one of my favorites.

u/joshi18 · 3 pointsr/computerscience

You are welcome :).
This is one of the best books for learning programming: http://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871. It's freely available, and the MIT class that uses it is here: http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-001-structure-and-interpretation-of-computer-programs-spring-2005/
Peter Norvig's advice http://norvig.com/21-days.html
Programming Pearls : http://www.cs.bell-labs.com/cm/cs/pearls/
At https://www.codeeval.com you can solve questions and get interview calls.
You may also want to brush up your design skills as few interviewers might ask those kind of questions. http://www.amazon.com/Head-First-Design-Patterns-Freeman/dp/0596007124 might be a good place to start.
I think http://www.geeksforgeeks.org is a good place to look for nicely explained solutions and you can find almost all the questions ever asked in a software engineering interview at careercup.com

u/quantifiableNonsense · 3 pointsr/AskEngineers

Self-taught professional software engineer here.

Which language you learn is not as important as learning about data structures and complexity analysis. Code organization is also very important.

Pick one high level scripting language (like Python, Ruby, Perl, etc) and one low level systems language (C, C++, Rust, etc) and learn them both inside out.

A couple of books I recommend:

  • Code Complete
  • SICP

    As far as practical skills go, you need to learn how to use git (or whatever VC system the companies you are interested in use). You need to learn how to use Unix systems. A great introduction is The UNIX Programming Environment. You need to learn how to read other people's code; open source projects are great for that.

    When you are getting ready to interview, there is no better resource than Cracking the Coding Interview.
u/simba09 · 3 pointsr/softwaredevelopment

I have started reading this book and so far I am enjoying it

Domain-Driven Design: Tackling Complexity in the Heart of Software https://www.amazon.com/dp/0321125215/ref=cm_sw_r_cp_apa_i_R60tDb2Y19V4X

It came recommended to me from a few different sources including Reddit.

u/fuzzycardboard · 3 pointsr/PHP

Do yourself a huge favor and just read this.

u/mctonka · 3 pointsr/PHP

Well, any logic which doesn't encode business rules is, by definition, not business logic. So, for example, most or all of the logic within your infrastructure layer would not be business logic. But I suspect what you're really asking is "what sort of non-business logic belongs in a VO?" The answer to that is simply: any logic which goes hand-in-hand with the value being represented, and is side-effect free. One of the examples Eric Evans gives is that of a Paint VO which has a method mixIn(Paint $otherPaint): Paint that encapsulates the logic of mixing one paint into another, and produces a new VO.
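
A rough Python rendering of that Paint example (the class and the mixing rule here are illustrative, not lifted from the book):

from dataclasses import dataclass

@dataclass(frozen=True)  # immutability is part of what makes this a value object
class Paint:
    volume: float
    red: int
    green: int
    blue: int

    def mix_in(self, other: "Paint") -> "Paint":
        # side-effect free: returns a new value instead of mutating either paint
        total = self.volume + other.volume
        blend = lambda a, b: round((a * self.volume + b * other.volume) / total)
        return Paint(total,
                     blend(self.red, other.red),
                     blend(self.green, other.green),
                     blend(self.blue, other.blue))

yellow = Paint(1.0, 255, 255, 0)
blue = Paint(1.0, 0, 0, 255)
print(yellow.mix_in(blue))  # Paint(volume=2.0, red=128, green=128, blue=128)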

u/Trinition · 3 pointsr/softwaredevelopment

Domain Driven Design by Eric Evans.

u/Sk0_756 · 3 pointsr/learnprogramming

The few comments in here aren't explaining WHY the interviewer said "very 90s". Obviously, "very 90s" is an exaggeration... "very 2000s" is a bit more accurate.

What you've got is a "typical" layered architecture, which is considered a bit out-dated these days. There's a couple major reasons people move away from that architecture now.

  • Putting the DAL at the bottom of your architecture means it and your database are effectively the foundation of your application - everything else builds on top of those. We now recognize that since the database is an infrastructural concern, it is more at home on the fringe of an application's architecture, instead of at the core. Business Logic should actually be the foundation of an application, because implementing the business logic is the real reason you're writing the application in the first place.
  • Layered architecture tends to force developers to think in terms of logic and data structure separately. All of your BLL classes tend to be "services" - classes that execute some business logic, but don't hold data. You supplement those with what's called an "anemic domain model" - a collection of objects that hold data, but don't execute any logic (you have an anemic domain model if you have a collection of classes which are just a long list of getters and setters). That setup kinda defeats the point of OOP, which is all about allowing us to create classes that hold data AND execute logic (see the sketch below).
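
Here's a minimal Python sketch of that difference (the account example and all names are made up for illustration):

# Anemic: data without behavior; the logic lives in a separate "service" class
class AnemicAccount:
    def __init__(self, balance):
        self.balance = balance  # just getters/setters in spirit

class AccountService:
    def withdraw(self, account, amount):
        if amount > account.balance:
            raise ValueError("insufficient funds")
        account.balance -= amount

# Rich domain model: the object holds data AND enforces its own business rules
class Account:
    def __init__(self, balance):
        self._balance = balance

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount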

    For a quick look at an alternative project structure, you can check out Onion Architecture, which moves business logic to the application core on CodeProject (this is similar to ports & adapters)

    For more in-depth, look into Domain Driven Design (DDD). Many an architect, including myself, would suggest Eric Evans' book on the topic.

    For a more recent DDD book, see Vaughn Vernon's Implementing Domain Driven Design.
u/banuday17 · 3 pointsr/javahelp

There's a whole book dedicated to this topic called Domain Driven Design.

A customer purchases tickets, not train services. And a train service has customers, but the train has passengers. You should keep the concepts separate. How about introducing a third concept - a ticket?

A customer has many tickets for past journeys. The ticket doesn't need to maintain a reference to the train service, but the train service should be able to identify tickets it issued. And the ticket doesn't need to keep a reference to the customer. Nice clean separation between two independent concepts, or Bounded Contexts as they are called in DDD.
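
A loose Python sketch of that separation (all names are hypothetical; the point is the direction of the references):

from dataclasses import dataclass, field

@dataclass
class Ticket:
    ticket_id: str
    service_id: str  # identifies the issuing service by id, no object reference
    origin: str
    destination: str

@dataclass
class Customer:
    name: str
    tickets: list = field(default_factory=list)  # tickets for past journeys

    def purchase(self, ticket: Ticket):
        self.tickets.append(ticket)

@dataclass
class TrainService:
    service_id: str
    issued_ticket_ids: set = field(default_factory=set)

    def issue(self, ticket_id: str, origin: str, destination: str) -> Ticket:
        # the service can identify tickets it issued, but holds no Customer refs
        self.issued_ticket_ids.add(ticket_id)
        return Ticket(ticket_id, self.service_id, origin, destination)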

u/stevewedig · 3 pointsr/ruby

We evolved an app architecture similar to the one he presents. However, it is also proprietary. It is truly a joy to work with though :)

I would say the key characteristics are roughly:

  • It is a Ruby/Python app, not a Rails/Django app. Most files are domain logic that don't import the web framework.
  • The Rails/Django ORM is hidden behind Repositories (see the sketch after this list)
  • The domain objects are plain-old-Ruby/Python-objects, not web framework DB models.
  • The Rails/Django http container is wrapped to implement a protocol independent req/rep container.
  • In unit tests, we are free to either mock the repositories, or fake them with in memory dictionaries.
  • In unit tests, we use mocks to cut the object graph, since the component under test shouldn't need to travel long distances.
  • Plugging in the real database implementation is an afterthought. We had a significant portion of the app working with memory fakes before we got around to it.
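
A bare-bones Python sketch of the repository idea described above (all names hypothetical):

class UserRepository:
    """Port: the domain code only ever sees this interface."""
    def add(self, user): raise NotImplementedError
    def by_id(self, user_id): raise NotImplementedError

class InMemoryUserRepository(UserRepository):
    """Fake used in unit tests: just a dictionary, no database involved."""
    def __init__(self):
        self._users = {}
    def add(self, user):
        self._users[user.user_id] = user
    def by_id(self, user_id):
        return self._users.get(user_id)

# A real implementation backed by the ORM would also subclass UserRepository,
# and plugging it in can genuinely be an afterthought, as described above.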
u/pitiless · 3 pointsr/PHP

The following books would be good suggestions irrespective of the language you're developing in:

Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.

Domain-Driven Design is of a similar vein & quality.

Refactoring - another fantastic Martin Fowler book.

u/Kichigai · 3 pointsr/geek

For a while there I'd use the Wikipedia Book Creator to aggregate a bunch of articles on a certain topic and then download it to my eInk e-Reader to peruse in bed until I fell asleep.

One such topic was early computing up through the Microcomputing era and the 1977 Trinity.

At that point of history I was reading Empires of Light about the AC/DC war, Where Wizards Stay Up Late about the birth of ARPANET, Dealers of Lightning, about PARC, Commodore: A Company on the Edge (about the rise of Commodore through the PET, slaying TI, and faltering after the C64), and Tubes: A Journey to the Center of the Internet, which was enlightening, even though it was written for someone who couldn't tell a modem from a hub.

u/adx · 3 pointsr/technology

The book Where Wizards Stay Up Late is a pretty good overview. It has more details from the ARPANet days and wraps up right around the point of the combination of ARPANet, NSFNet, and CSNet to create the modern Internet.

u/lotusstp · 3 pointsr/technology

Tip of the hat to the pioneers... Lawrence Roberts, Vint Cerf, Bob Taylor, Ivan Sutherland, Douglas Engelbart and J.C.R. Licklider, among many others. Well worth studying up on these dudes. Some excellent reads (available at your public library, natch): "Dealers of Lightning", an excellent book about Xerox PARC; "Where Wizards Stay Up Late", a fascinating book about MIT and DARPA; J.C.R. Licklider and the Revolution That Made Computing Personal, a turgid yet compelling book about J.C.R. Licklider and his contemporaries.

u/ghostmrchicken · 3 pointsr/HaltAndCatchFire

You may like "Where Wizards Stay Up Late" as well:

https://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674/ref=sr_1_1?ie=UTF8&qid=1506457291&sr=8-1&keywords=when+wizards+stay+up+late

Description from Amazon:

Twenty five years ago, it didn't exist. Today, twenty million people worldwide are surfing the Net. Where Wizards Stay Up Late is the exciting story of the pioneers responsible for creating the most talked about, most influential, and most far-reaching communications breakthrough since the invention of the telephone.

In the 1960's, when computers were regarded as mere giant calculators, J.C.R. Licklider at MIT saw them as the ultimate communications devices. With Defense Department funds, he and a band of visionary computer whizzes began work on a nationwide, interlocking network of computers. Taking readers behind the scenes, Where Wizards Stay Up Late captures the hard work, genius, and happy accidents of their daring, stunningly successful venture.

u/caphector · 3 pointsr/sysadmin

I'm not aware of any books that just like this, but here are some recommendations:

  • The Soul of a New Machine - The company is gone. The machine forgotten. What remains, 30 years later, is the story of building and debugging a 32 bit computer. Spends time on hardware and software development and has some excellent descriptions of how the computer works.
  • Where Wizards Stay Up Late - This is about the people who put the Internet together. Goes into the work that was needed to build the initial networks.
  • Hackers: Heroes of the Computer Revolution - A lovely history of hackers, in the initial sense of the term. People that were enthralled by computers and wanted to do interesting things with them. Starts off with the MIT Tech Model Railroad Club and moves forward from there.
u/magus42 · 3 pointsr/computerscience

What's your education level? I can't speak for the one that you linked but the 'standard' textbook in the field is Nielsen & Chuang's Quantum Computation and Quantum Information.

u/mctuking · 3 pointsr/quantum

If you're looking for something that's an actual text book, there's no better than Nielsen and Chuang.

https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176

u/potatotub · 2 pointsr/NoStupidQuestions

You want to start with an intro to algorithms book.

https://www.amazon.com/dp/0262032937/?tag=stackoverfl08-20

u/fyndor · 2 pointsr/learnprogramming

In my EE degree we had an algorithms class, and I happen to have a copy of this book. It is massive, but I probably should consider cracking that thing open and going over it again. As with many things, if you don't use it you tend to lose it, and I bet I'm pretty rusty on most of the things covered in that book.

I will check out "Cracking the Coding Interview". Thanks for the suggestion.

u/dearmash · 2 pointsr/AskReddit

If you want to save money on books for college here's better advice:

  1. Don't buy the books
  2. Organize study sessions for your classes
  3. Get free homework help along with free book usage
  4. At the end of the semester if the book is worth keeping, poach it from people looking to sell it back

    I can honestly say it saved me at least a few hundred dollars. The only books I ended up keeping were the literary books (looks good on bookshelves) and the reference-type books (any c.s. grads out there?). Then save the extra $200 and use it on a mini-fridge.
u/tiedtoatree · 2 pointsr/IAmA

If you are enjoying your Calc 3 book, I highly recommend reading Topology, which provides the foundations of analysis and calculus. Two other books I would highly recommend to you would be Abstract Algebra and Introduction to Algorithms, though I suspect you're well aware of the latter.

u/phleet · 2 pointsr/programming

Project Euler problems won't help you as much as some other places.
I recommend the following:

u/POGtastic · 2 pointsr/learnprogramming

SICP is definitely useful for pretty much everyone. It's also free, although you can get the paper copy with the famous cover if you really want to.

Note that all of the examples are in Lisp, which is weird and scary to newbies. Since it's a functional programming book, you'll find that it approaches programming very, very differently than other tutorials and books, which cover imperative programming.

---

Sipser and its ilk aren't really useful outside of the classroom. I think that Theory of Computation is important for people to learn, but it's not really relevant to programming in general.

u/LiterallyCarlSagan · 2 pointsr/argentina

Not OP, but I'll take the chance to ask: what's your opinion of SICP? I started with Elisp a few days ago and found it interesting, but I don't know if it's worth reading.

u/melancholiclabs · 2 pointsr/Drugs

Read a lot of books. Everything is usually available as a pdf on the internet and the ones that aren't are $10 to rent on Amazon. Here's the ones that I've read that relate to this project.

Java

u/swan--ronson · 2 pointsr/javascript

I'm in a similar boat to you. I studied Computer Science at university but it was a really watered down curriculum (we even did Excel as one of our first year modules for fuck's sake; I was so desperate to code that I made a VBA frontend haha), so there are many gaps in my theoretical knowledge.

I'm currently reading MIT's Structure and Interpretation of Computer Programs. It's based upon Scheme, a dialect of Lisp, but covers many important topics, such as:

  • Tree recursion
  • Hierarchical sequences
  • State, mutability, and immutability
  • Streams
  • Compiler design

    Now I just need to find more time to read it!
u/ProtossIsBroken · 2 pointsr/cscareerquestions

Introduction to Algorithms

Structure and Interpretation of Computer Programs

I linked to Amazon but obviously these can be easily found as .pdfs.

u/defialpro · 2 pointsr/MrRobot

Depends on the subject. There are a lot of foundational subjects in CS and programming that are still relevant decades later. Like this book https://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871

which was on teachyourselfcs.com

u/RainbowHearts · 2 pointsr/AskComputerScience

If you only read one work on the topic, it should be The Art of Computer Programming by Don Knuth: https://www.amazon.com/dp/0321751043/

The textbook for MIT's 6.001 (introduction to computer science) is the much loved Structure and Interpretation of Computer Programs by Abelson, Sussman, and Sussman: https://www.amazon.com/dp/0262510871/ . It uses Scheme, a dialect of Lisp (MIT's newer introductory courses have since moved to Python).

Finally, because people asking about computer science are often asking about something a bit broader than pure computer science, I recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It is a thorough tour of computing in practice at every level, top to bottom. https://www.amazon.com/dp/073560505X/

u/fff1891 · 2 pointsr/computerscience

Some schools don't cover much in the way of discrete math, formal languages, automata, or proofs... at least not very rigorously. My opinion here is colored by my own experience (and subsequent disappointment, but thats another story), and I'm sure most schools sort of exist on a spectrum. YMMV.

Some books that come to mind (might not be to surprising if you spend a lot of time on CS forums):

[Introduction to the Theory of Computation](
https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser-ebook/dp/B00B63LSA6)

CLRS Introduction to Algorithms

SICP

I think it's interesting to look at the history of computer science-- read about Bertrand Russel, David Hilbert, the Vienna Circle, Alonzo Church and his students (Alan Turing was one). Computer Science as an academic discipline was kind of born from the questions mathematicians and philosophers were trying to ask in the early 20th century. It's just as much about language as it is about mathematics. I could probably write a wall of text on the topic, but I'll just leave it at that. :)

u/ItsAConspiracy · 2 pointsr/INTP

C++ has more meticulous detail than just about any other language.

This book would be a good place to try again. Or if you're really ambitious, this one. Both will teach you more of what programming is really about, using a language that has a minimum of syntax to deal with.

u/SomeLuckyDeveloper · 2 pointsr/cscareerquestions

Quick note about number 6: those terms sound a lot more intimidating than they are. Some of them take a little while to really grok (what they are and why they are useful), but once you understand them they are extremely useful (and huge selling points for yourself as a developer).

About the interview: from what I've heard, the questions they ask differ based on your background. They are less likely to ask software development methodology questions to a new graduate, since that's not really in the realm of computer science.

I did get asked some questions regarding data structures and algorithms. The ones that I was asked, and I feel I did really well on, were more related to problem solving and software architecture. Think of it this way: if they hire you, having memorized how to traverse a graph isn't going to come up that often in your day-to-day job. Your day-to-day is going to consist of coming up with solutions to problems presented to you.

So when they ask a question like how to traverse a graph, they aren't looking for you to be able to spit out Dijkstra's algorithm exactly. Instead, they are looking to see how your brain works and what kind of solution you can come to by logically breaking down the steps to solve the problem.

They are looking for how you take a problem, break it down in pieces, and what your solution would look like.

    Example: Implement a scale that has two sides and lets you know, based on the objects on the two sides, which side is heavier and by how much.

My answer to this question would be something like:

  • Create a WeighableInterface that requires the getWeight method.
  • Create a few random classes that implement that interface. Maybe a HorseShoe class.
  • Create a Scale class and ScaleInterface that has the methods addToLeft(WeighableInterface $object), addToRight, getWeightDifference, getHeavierSide.

    How do you store the set of items on each side of the scale internally in the scale class? Do you need to be able to remove items from the scale, and why would this affect how you store the items internally? Should the scale have a weight limit?

    These are all great questions to ask. Think out loud, talk out loud. They want to see that when confronted with a problem you don't know the answer to, or don't know the best solution to that you don't freeze up but instead chunk it up and try to reason your way through it.

    "Should the scale have a weight limit?" Asking the interviewer this question is a huge win in your favor. It shows that not only are you trying to solve the problem, but you're constantly thinking about issues that might have been overlooked in the initial assessment.

    Back on number 6: I learned these by googling a fuckton. Watching a lot of videos, reading a lot of tutorials, and just asking a lot of questions.

    Here's some resources I still have bookmarked from the last year/18 months. Some of them are for targeted for php, but the concepts are universal. But if any of these don't do it for you, google a bunch.

    Solid: 1 2

    Inversion of Control & Dependency Injection: 1 2 3 4

    Domain Driven Design: This is actually a ton of concepts, and you don't necessarily need to learn them all at once. This book is the only software architecture book I've read cover to cover, it's that good. If you can afford to, do buy it. Also another helpful link: Intro to DDD.

    Test Driven Development and UnitTesting: 1 2

    Also, I've found many of the top answers from stackoverflow user tereško to be great sources of wisdom regarding software development.
u/guifroes · 2 pointsr/learnprogramming

Interesting!

Looks to me like you can "feel" what good code looks like, but you're not able to rationalise it enough to write it on your own.

Couple of suggestions:

When you see elegant code, ask yourself: why is it elegant? Is it because it's simple? Easy to understand? Try to recognise the desired attributes so you can reproduce them in your own code.

Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.
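
For example, here's a quick before/after sketch of the Single Responsibility Principle in Python (a made-up report example):

# Before: one class formats AND saves, so it has two reasons to change
class Report:
    def __init__(self, data):
        self.data = data
    def to_html(self):
        return f"<p>{self.data}</p>"
    def save(self, path):
        with open(path, "w") as f:
            f.write(self.to_html())

# After: each class has exactly one responsibility
class HtmlFormatter:
    def format(self, data):
        return f"<p>{data}</p>"

class FileSaver:
    def save(self, text, path):
        with open(path, "w") as f:
            f.write(text)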

How familiar are you with unit testing and TDD? It should help you a lot to write better designed code.

Some other resources:

u/vinnyvicious · 2 pointsr/gamedev

Have you ever heard of the open/closed principle? Or the single responsibility principle? Or the Liskov substitution principle? All three are being violated. It drastically reduces the maintainability and extensibility of your code. I can't swap serializers easily, I can't tweak or extend them without touching that huge class, and it's definitely not the responsibility of that class to know how to serialize A, B, C, D, E and the whole alphabet.

I highly recommend some literature on the subject if you're curious about it, it would drastically improve your approach to software architecture:

https://www.amazon.com/dp/0132350882

https://www.amazon.com/dp/0201485672

https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

http://cc2e.com/

https://www.amazon.com/dp/0321127420

u/ThereKanBOnly1 · 2 pointsr/csharp

You didn't go into too many details, so I just wanted to make sure you weren't missing the things that make for a solid foundation.

I think Domain Driven Design is a good path for you, mainly because it doesn't necessarily overprescribe a technology stack (those are in constant flux) and focuses on the fundamentals of designing for a domain and understanding how to deal with the complexities that come with it.

Eric Evans' book is a great place to start. Personally, I have two other books that are incredibly helpful for DDD. The first is Implementing DDD by Vaughn Vernon, and the second is Patterns, Principles, and Practices of DDD. I think the second is a little more along the lines of what you're looking for, but it doesn't go into as much of the ideas and structures of DDD. What it does do is go into more of the patterns that you're far more likely to see in an enterprise application.

Two patterns that you should pay attention to are Command Query Responsibility Segregation (CQRS) and Event Sourcing. CQRS can be used in a number of ways and incarnations. Event Sourcing may or may not be applicable in your application, but it's useful to think of an application's state and interactions as an immutable ledger of events.
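
A toy Python sketch of the Event Sourcing idea: current state is never stored directly, only derived by replaying the ledger of events (all names are made up):

events = []  # the append-only ledger; in practice this lives in an event store

def deposit(amount):
    events.append(("deposited", amount))

def withdraw(amount):
    events.append(("withdrew", amount))

def balance():
    # state is derived by replaying every event from the beginning
    total = 0
    for kind, amount in events:
        total += amount if kind == "deposited" else -amount
    return total

deposit(100)
withdraw(30)
print(balance())  # 70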

If you go down the road of microservices, then I strongly suggest you also explore message queues as well. Although they certainly are useful, microservices are seen as somewhat of a fad or buzzword. Message queues, and the patterns that go with them, are quite useful in this context and also applicable in more "traditional" service oriented architectures as well. A great book for diving into those messaging patterns is Enterprise Integration Patterns.

NoSQL and Big Data solutions will likely come up along the way, and although they are worth exploring in their own right, I think it's important that they are implemented for the right reasons. I'll leave this here for now; if this is something that you want to explore further, I'd suggest reading through Fowler's Polyglot Persistence as a good jumping off point.

u/_dban_ · 2 pointsr/java

I'd start with a classic: Patterns of Enterprise Application Architecture by Martin Fowler. He blogs a lot on topics related to enterprise architecture.

I'd follow up with another classic: Domain Driven Design by Eric Evans. Martin Fowler has a lot to say about DDD as well

u/recycledcoder · 2 pointsr/ruby

No worries, it's always good to make the implicit explicit. I am (and I suspect others would be as well) referring to Domain Driven Design.

"Domain", in this case, can be defined as the problem space that the software you're writing addresses. The business entities, actors, processes, activities, etc. that your software manipulates.

u/dont_mod_your_rhodes · 2 pointsr/learnprogramming

It will help; to me it was the first book that came to mind. The second would be this - the pinnacle of software design.

u/Alarinth · 2 pointsr/AskProgramming

The book Domain Driven Design: Tackling Complexity in the Heart of Software sort of revolves around this topic. The author speaks at great length about projects where spending way more time on modeling/researching than coding let them solve problems in ways that scaled way better - and about what they did during that time.

The book is a bit old, but if you take it for the modeling aspect it's still a great read.

u/timmyotc · 2 pointsr/cscareerquestions
u/SofaAssassin · 2 pointsr/cscareerquestions

For more foundational stuff, I'd look at:

  • GitFlow - a successful Git branching model - This is pretty much the prototypical Git usage model. I have seen it used pretty much everywhere I've worked that used Git, and a lot of software supports the model by default (Atlassian Stash, SmartGit, Git Tower, and more).

    • Also take note of alternative Git usage models like GitLab Flow and GitHub Flow. They fit rather different use cases, and GitFlow is typically used for software that follows a more old-school release model where releases happen periodically, or where you need to support multiple production releases at any given time. GitLab and GitHub flow are geared more toward software that only sees a single version in production at any given time, with very frequent release cycles (e.g. daily or even shorter).

      Getting familiar with these branching models will also expose you to many commonly-used Git features like branching, squash merging, rebasing, tagging, and history rewriting.

  • How to write a commit message

    No one's really gonna ask you about this, but you should develop a habit of writing great, clear, and concise commit messages.

  • Continuous Delivery and Continuous Integration

    All the rage right now - having real/near-real time building/unit-testing/packaging/deployment of your software once you've made a code commit. Read the articles I linked, play with services like CircleCI or Travis-CI or CodeShip and integrate them with your projects in GitLab.

  • Test-Driven Development and Behavior-Driven Development

    Probably the two most commonly used overarching test-based software development processes. I'm a strong proponent of TDD (when done right), and many teams you work on will probably employ TDD or BDD.

  • Stemming from the last point, know how to write good unit tests and how they differ from integration tests or acceptance tests.

  • Code organization - a lot of this will likely be influenced by the language/toolset you're working in, but you'll be interested in learning about Layered Architecture and software packaging metrics.

  • Generic software design - all sorts of acronyms to read about and absorb, like YAGNI, KISS, DRY, and SOLID. Also, the Unix philosophy, which guided a lot of development of software for Unix and Linux these days. There will also be patterns that apply to specific languages or types of software, but the stuff above is rather generically applicable.

    Beyond those links, some books that cover a lot of general material are:

  • Clean Code
  • Pragmatic Programmer
  • Mythical Man-Month
  • Software Estimation - Okay, software estimation is really gonna be complex and difficult until you get a lot of experience, and even experienced developers get it wrong. I don't think it's particularly necessary to read this book when you're starting out.
  • Domain Driven Design - I love this book - it's about breaking down complex software designs.
  • Release It! - Nygard is a pretty battle-tested developer, so this book is about approaching software design practically rather than in a vacuum separated from real-world application.
u/tomthecool · 2 pointsr/ruby

I presume it's Domain Driven Design -- this was a popular, well-received book. DHH in particular has blogged, talked and tweeted about it on numerous occasions.

u/catwiesel · 2 pointsr/technology

The internet was built on the backs of a few very dedicated nerds with a very limited social life. And of course, quite a few engineers who left because they didn't want to dedicate their lives.

There is a great book about that time and the people...

Where Wizards Stay Up Late

https://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674

u/LegendaryPatMan · 2 pointsr/videos

They are jokes in my experience... I don't think there is an actual chart where if you can do x you fall into y category. Like from what I've always known, level 100 is n00b and level 900 is internet god, so at 400 I'm like sysadmin. But I've never heard of a level 900 or level 100...

I know a guy who could be a level 900, but he's known as a Wizard, which is a reference to a book called Where Wizards Stay Up Late. I put myself at 400 because it's about average. I use it daily for work and for some stuff at home, but I'm not an absolute god; I know what I need for work and some more, and I know Stack Overflow for everything else.

u/jmct · 2 pointsr/askscience

Sorry for the delay,

Do you mean Quantum Computing or DNA computing more specifically?

Quantum Computing has a plethora of good material out there.

this and this

are two REALLY good books on the subjects that don't require very much physics knowledge.



It's harder to point you towards good materials for DNA computing as it's a newer field, but I can probably find you some good stuff by asking the lecturer who taught it.

Cheers

u/eightOrchard · 2 pointsr/QuantumComputing

This page has some decent resources https://codeforces.com/blog/entry/65063

Also there is a free QC MIT course https://ocw.mit.edu/courses/mathematics/18-435j-quantum-computation-fall-2003/

Last but not least, I am trying to put together a QC learning resource https://stevefroehlich.github.io/ I have a graduate degree in CS, so I'm trying to make it a resource for people like us that come from a CS background. I picked up the standard textbook https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176?SubscriptionId=AKIAILSHYYTFIVPWUY6Q&tag=duckduckgo-ffab-20&linkCode=xm2&camp=2025&creative=165953&creativeASIN=1107002176 and realized I am missing some of the core Linear Algebra concepts (basis, vector space, Hamiltonian matrix, etc.), so that is where my site starts. It's a work in progress and should get better/more helpful as I add more to it.

u/ilmmad · 2 pointsr/science

Mike&Ike for the uninitiated.

u/tibblf · 2 pointsr/QuantumComputing

Full disclosure: I'm a software engineer at Microsoft

Here's a few resources I found useful. I just started learning quantum computing recently too:

u/Crankenterran · 2 pointsr/DebateReligion

I do not believe that is true. Classically you are correct, but if we look at qubits instead of bits (quoting from wikipedia because I am too lazy to flick through my quantum computing textbook):

"However, the computational basis of 500 qubits, for example, would already be too large to be represented on a classical computer because it would require 2^500 complex values to be stored.[10] (For comparison, a terabyte of digital information stores only 243 discrete on/off values.)"

u/shivstroll · 2 pointsr/AskScienceDiscussion

For trapped ion quantum computation if just looking at books:

Atomic Physics by Budker, Kimball, and Demille

Laser Trapping and Cooling by Metcalf and van der Straten

Quantum Computation and Quantum Information by Nielsen and Chuang

u/tempforfather · 2 pointsr/atheism
u/Thanks-Osama · 2 pointsr/learnprogramming

If you're not afraid of math, then I would recommend the books by Robert Sedgewick. His Java book really shows off Java. His Algorithms book is a religious experience. And if you're feeling masochistic, the Sipser book is well suited.

u/cbarrick · 2 pointsr/computing

Sipser's Introduction to the Theory of Computation is the standard textbook. The book is fairly small and quite well written, though it can be pretty dense at times. (Sipser is Dean of Science at MIT.)

You may need an introduction to discrete math before you get started. In my undergrad, I used Rosen's Discrete Mathematics and Its Applications. That book is very comprehensive, but that also means it's quite big.

Rosen is a great reference, while Sipser is more focused.

u/shared_tango_ · 2 pointsr/AskReddit

Here, if you can find it somewhere on the internet (cough), this book covers it nicely and is widely used (at least in German universities)

https://www.amazon.de/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/nerga · 2 pointsr/math

This is the book you are talking about, I am assuming?

I think I am looking for a more advanced book than this. I have already learned most of these topics in my own classes in school. I suppose I would be looking for the books that would be recommended after having learned this book.

u/mightcommentsometime · 2 pointsr/math

My favorite relaxing math book was Chaos: Making a New Science by James Gleick.

And The Information by James Gleick was pretty good too.

u/colo90 · 2 pointsr/compsci

You must be referring to this book; you seem to have forgotten to include a link (or the name) to the book you're referring to.

u/lisbonant · 2 pointsr/iamverysmart

For a historical and theoretical overview that doesn't get too technical but is still comprehensive and fun to read, I highly recommend The Information by James Gleick. If you dig Zero, I think you'd dig this.

u/ReinH · 2 pointsr/AskComputerScience

The Annotated Turing is fantastic! Also check out Turing's Cathedral for some insight into how his 1936 paper influenced computing into the next few decades and The Essential Turing to read Turing in his own words.

For a look at how Turing influenced information theory (and a fascinating general introduction to its history), check out The Information.

u/Blindocide · 2 pointsr/DebateReligion

You should check out this book called The Information. It talks about information theory and how all material interactions are really just transferring information in the form of momentum and spin.

While I was hallucinating on 2C-E, after reading about Schroedinger's cat, I had actually theorized that quantum interactions are, at a base level, information transfer. It was interesting to see that come up in a book way after I had thought of it.

/boast

u/miketuritzin · 2 pointsr/opengl

This is surprisingly tricky to get right. I used the "polyboards" technique described in chapter 9 of this book, and it works well: https://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869

u/CodyDuncan1260 · 2 pointsr/gamedev

Game Engine:

Game Engine Architecture by Jason Gregory, best you can get.

Game Coding Complete by Mike McShaffry. The book goes over the whole of making a game from start to finish, so it's a great way to learn the interaction the engine has with the gameplay code. I admit I'm not a particular fan of his coding style, but I have found ways around it. The boost library adds some complexity that makes the code more terse. The 4th edition made a point of not using it after many met with some difficulty with it in the 3rd edition. The book also uses DXUT to abstract the DirectX functionality necessary to render things on screen. Although that is one approach, I found that getting DXUT set up properly can be somewhat of a pain, and the abstraction hides really interesting details about the whole task of 3D rendering. You have a strong background in graphics, so you will probably be better served by more direct access to the DirectX API calls. This leads into my suggestion for Introduction to 3D Game Programming with DirectX10 (or DirectX11).



C++:

C++ Pocket Reference by Kyle Loudon
I remember reading that it takes years if not decades to become a master at C++. You have a lot of C++ experience, so you might be better served by a small reference book than a large textbook. I like having this around to reference the features that I use less often. Example:

namespace
{
    // names declared here have internal linkage: they are only
    // visible within this file (translation unit)
}

is an unnamed namespace, which is a preferred method for declaring functions or variables with file scope. You don't see this too often in sample textbook code, but it will crop up from time to time in samples from other programmers on the web. It's $10 or so, and I find it faster and handier than standard online documentation.



Math:

You have a solid graphics background, but just in case you need good references for math:
3D Math Primer
Mathematics for 3D Game Programming

Also, really advanced lighting techniques stretch into the field of Multivariate Calculus. Calculus: Early Transcendentals Chapters >= 11 fall in that field.



Rendering:

Introduction to 3D Game Programming with DirectX10 by Frank. D. Luna.
You should probably get the DirectX11 version when it is available, not because it's newer, not because DirectX10 is obsolete (it's not yet), but because the new DirectX11 book has a chapter on animation. The DirectX 10 book sorely lacks it. But your solid graphics background may make this obsolete for you.

3D Game Engine Architecture (with Wild Magic) by David H. Eberly is a good book with a lot of parallels to Game Engine Architecture, but focuses much more on the 3D rendering portion of the engine, so you get a better depth of knowledge for rendering in the context of a game engine. I haven't had a chance to read much of this one, so I can't be sure of how useful it is just yet. I also haven't had the pleasure of obtaining its sister book 3D Game Engine Design.

Given your strong graphics background, you will probably want to go past the basics and get to the really nifty stuff. Real-Time Rendering, Third Edition by Tomas Akenine-Moller, Eric Haines, Naty Hoffman is a good book of the more advanced techniques, so you might look there for material to push your graphics knowledge boundaries.



Software Engineering:

I don't have a good book to suggest for this topic, so hopefully another redditor will follow up on this.

If you haven't already, be sure to read about software engineering. It teaches you how to design a process for development, the stages involved, effective methodologies for making and tracking progress, and all sorts of information on things that make programming and software development easier. Not all of it will be useful if you are a one man team, because software engineering is a discipline created around teams, but much of it still applies and will help you stay on track, know when you've been derailed, and help you make decisions that get you back on. Also, patterns. Patterns are great.

Note: I would not suggest Software Engineering for Game Developers. It's an ok book, but I've seen better, the structure doesn't seem to flow well (for me at least), and it seems to be missing some important topics, like user stories, Rational Unified Process, or Feature-Driven Development (I think Mojang does this, but I don't know for sure). Maybe those topics aren't very important for game development directly, but I've always found user stories to be useful.

Software Engineering in general will prove to be a useful field when you are developing your engine, and even more so if you have a team. Take a look at This article to get small taste of what Software Engineering is about.


Why so many books?
Game Engines are a collection of different systems and subsystems used in making games. Each system has its own background, perspective, concepts, and can be referred to from multiple angles. I like Game Engine Architecture's structure for showing an engine as a whole. Luna's DirectX10 book has a better Timer class. The DirectX book also has better explanations of the low-level rendering processes than Coding Complete or Engine Architecture. Engine Architecture and Game Coding Complete touch on Software Engineering, but not in great depth, which is important for team development. So I find that Game Coding Complete and Game Engine Architecture are your go to books, but in some cases only provide a surface layer understanding of some system, which isn't enough to implement your own engine on. The other books are listed here because I feel they provide a valuable supplement and more in depth explanations that will be useful when developing your engine.

tldr: What Valken and SpooderW said.

On the topic of XNA, anyone know a good XNA book? I have XNA Unleashed 3.0, but it's somewhat out of date to the new XNA 4.0. The best looking up-to-date one seems to be Learning XNA 4.0: Game Development for the PC, Xbox 360, and Windows Phone 7 . I have the 3.0 version of this book, and it's well done.

*****
Source: Doing an Independent Study in Game Engine Development. I asked this same question months ago, did my research, got most of the books listed here, and omitted ones that didn't have much usefulness. Thought I would share my research, hope you find it useful.

u/Waitwhatwtf · 2 pointsr/learnprogramming

It's going to sound really far outside of the topic, but it really helped with my logical mathematical reasoning: Mathematics for 3d Game Programming and Computer Graphics.

I'll also preface this by saying you're probably going to need a primer to get into this book if you're not sure how to reason your way to something like a greatest common factor. But being able to tackle that book is a great goal, and it can help you step through mathematical logic.
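
For reference, reasoning your way to a greatest common factor is just Euclid's algorithm, a few lines in Python:

def gcd(a, b):
    # Euclid's algorithm: the GCD is unchanged by replacing (a, b) with (b, a mod b)
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12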

Also, graphics is awesome.

u/pyromuffin · 2 pointsr/Games

I read this book and it gave me great power:
https://www.amazon.com/Mathematics-Programming-Computer-Graphics-Third/dp/1435458869

it wasn't easy, but if you try really hard it'll be worth it.

u/naranjas · 2 pointsr/funny

> Can you give me any more info on what types of things you simulate

There are so many different things. One example that involves physical simulation is rendering. Rendering, turning a 3d description of a scene into a 2d image, is all about simulating the physics of light transport. Given a set of lights and surfaces you simulate how light bounces around and what a virtual observer placed somewhere in the scene would see. Another example is explosions. Cool/realistic looking explosions for movies involve simulating burning materials, fluid/gas movement, sound propagation, fracture, plastic/non-plastic deformation, the list goes on and on.

Here are some books that might get you started in the right direction

  • Fundamentals of Computer Graphics: This is an entry level book that surveys a number of different areas of computer graphics. It covers a lot of different topics but it doesn't really treat anything in depth. It's good to look through to get a hold of the basics.

  • Mathematics for 3D Game Programming and Computer Graphics: Pretty decent book that surveys a lot of the different math topics you'll need.

  • Fluid Simulation for Computer Graphics: Really, really awesome book on fluid simulation.

  • Do a google/youtube search for Siggraph. You'll find a lot of really awesome demonstration videos, technical papers, and introductory courses.

    As for programming languages, you're definitely going to need to learn C/C++. Graphics applications are very resource intensive, so it's important to use a fast language. You'll probably also want to learn a couple of scripting languages like Python or Perl. You'll also need to learn some graphics APIs like OpenGL, or DirectX if you're on Windows.

    I hope this helped!
u/HarvestorOfPuppets · 2 pointsr/learnmath

Algebra - Required

Trigonometry - Required

Linear Algebra - Required

Calculus - Required for advanced graphics

After these there are bits and pieces depending on what you are doing.

Differential Geometry

Numerical Methods

Sampling Theory

Probability

Computational Geometry

As for discrete math, there are parts you might need. You don't necessarily need to learn whole topics; for example, quaternions are used for rotations in 3D (see the sketch below). This is a good book that takes the parts of topics that are important to graphics specifically.

https://www.amazon.com/dp/1435458869/?tag=terathon-20
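
To make the quaternion point concrete, here's a small hand-rolled Python sketch of rotating a vector with a quaternion (in real code you'd reach for a math library):

import math

def quat_from_axis_angle(axis, angle):
    # unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax*s, ay*s, az*s)

def quat_mul(a, b):
    # Hamilton product of two quaternions
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(v, q):
    # rotate vector v by unit quaternion q: q * (0, v) * q_conjugate
    qw, qx, qy, qz = q
    conj = (qw, -qx, -qy, -qz)
    w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), conj)
    return (x, y, z)

# a 90-degree rotation about the z-axis sends (1, 0, 0) to roughly (0, 1, 0)
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate((1, 0, 0), q))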

u/TexturelessIdea · 2 pointsr/gamedev

If you want to know how to deal with all the negative comments people give you, the only real answer is to ignore them. If you want to know how to convince people that being a indie dev is a worthwhile pursuit, then you have to release a game that sells really well.

This may sound like useless advice, but the truth is that getting into indie game development is not a good idea. Most likely you will never finish a game, or you will release a game that nobody cares about and that doesn't make you much (if any) money. Some people spend years making a game, and still end up releasing a bad game. The simple truth is that no amount of hard work, dedication, or love of game development is going to guarantee your success.

Most aspiring gamedevs like to talk about Minecraft, Stardew Valley, or Dwarf Fortress, as if the existence of those games guarantees their success. Most people don't realize that for every Notch, there are 1,000 people who make games nobody even knows about. Most likely you, me, and most of the other people here will fail to make a game that earns enough money to live off.

If you can't afford to release a game or two (or 5) without turning a profit, then game development just isn't for you. If my post upsets you, keep in mind that you will hear much worse from loads of people no matter how good your game is. I would never recommend indie game development to anybody as a career choice; it is very hard work that will most likely earn you less money than working part time at minimum wage. You should think of game development as a fun hobby, because until you make a big hit, game development isn't a career any more than buying lottery tickets is.

If you've made it to the end of my post and you still want to be a game developer, well that's the kind of attitude you're going to need, so you have that going for you. I do also have some practical advice for improving your gamedev skills. When you're talking about your knowledge of programming, you seem hung up on the language itself. Knowing a programming language makes you about as much of a programmer as knowing a human language makes you a writer. I'm not saying this to be mean (you may find that hard to believe at this point); I'm just trying to point out that there are other aspects of programming for you to learn. Some good things to read up on are programming(or design) patterns, algorithm design, and general (language agnostic) programming topics. There are also game design topics that don't relate to the programming aspects. I'll leave a quick list of resources below.

Project Euler

Theory of Fun for Game Design

Game Programming Patterns

Coursera's Software Development Category

MIT Open CourseWare Computer Science Category

u/raydenuni · 2 pointsr/boardgames

> I also like these discussions! This is actually a subject of some interest to me, because people have been complaining about a lack of board game content as true critique rather than just more "consumer oriented" reviews.

If you're interested in the theory of games (not to be confused with game theory, which is an interesting type of math), then look into "ludology".

> Have you played My First Orchard?

I've never heard of it. But it sounds like you make choices and can get better at the game, much like tic-tac-toe, so I would call it a game. Interestingly enough, from a mathematical, complexity-tree point of view, tic-tac-toe, checkers, and chess are also essentially equivalent. Some are more complex than others, but at the end of the day there's a branching tree of moves, you take turns moving through this tree, and at the end of some branches, a player wins. We consider tic-tac-toe to be trivial and not worth our time because our brains are able to solve it. But checkers and chess are just as theoretically solvable; we're just not smart enough.
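
To make "solvable" concrete, here is a minimal Python sketch of minimax applied to tic-tac-toe; the same exhaustive walk would in principle solve checkers or chess, just over an astronomically larger tree:

    def winner(board):
        """Return 'X' or 'O' if someone has three in a row on a 9-char board."""
        lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
                 (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]
        for a, b, c in lines:
            if board[a] != ' ' and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def solve(board, player):
        """Walk the whole game tree: +1 means X wins, -1 O wins, 0 a draw."""
        w = winner(board)
        if w:
            return 1 if w == 'X' else -1
        moves = [i for i, cell in enumerate(board) if cell == ' ']
        if not moves:
            return 0  # board full with no winner: draw
        nxt = 'O' if player == 'X' else 'X'
        scores = [solve(board[:i] + player + board[i + 1:], nxt) for i in moves]
        return max(scores) if player == 'X' else min(scores)

    print(solve(' ' * 9, 'X'))  # 0: tic-tac-toe is a draw under perfect play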

> What about a game where you roll 100d6 and your opponent rolls 100d6.

I guess. I'm a fan of extreme examples proving stuff. It's an easy way to see if your theory holds up or not. I'd argue that's a pretty poor game, but it's not technically any different than any number of games. Take MTG for example, given a specific shuffle of each player's deck, you could say one person has a 100% chance to win. For most shuffles though, it's probably a lot closer to 50%. In those cases, your choices matter.

> Also, as for a game needing "goals," doesn't this eliminate many "sandbox" style games (whether video games or sandboxy narrative games like RPGs)?

It does. Sandbox style games are often considered toys and not games. Sim City has been famously described by Will Wright as a toy and not a game. If you're looking at the dressing instead of the content, a lot of not-games become games. Toys are fine. Toys are good. Nothing wrong with toys. There are a lot of cool toys where you learn a lot of really useful stuff and given self-imposed goals, you can learn stuff about them and reality. But they're not games.

> Also, I'm not sure if I agree that activities that aren't about learning are not fun? Can't something be fun because it's physical (e.g., thrill rides)? Because it's nostalgic?

This is potentially a weakness of the argument and might be enough to prove my radical stance false. But the idea is that all of those things involve learning of some sort. It starts to blur the lines between learning and experiencing new things though for sure.

> Finally, is there nothing to say about the fact that S&L is routinely referred to as a game?

There is. And words are used differently in different contexts, with different people, to mean different things. If we're just speaking colloquially, then yeah, anything on the same shelf at Target can be considered a game. But if we're using these terms to actually mean something so we can have an intellectual, academic discussion about games, it's useful to differentiate between toys and competitions and games. If our terminology can't distinguish between Chess and Lego in some definitive manner, we're going to have trouble coming up with any interesting conclusions. I don't mean to assign any quality to the term game; there are a lot of really great non-games out there. We already differentiate between different types of board games. People will often refer to something as multiplayer solitaire. Could we not refer to these as competitions? And of course, when speaking colloquially, it doesn't really do to categorize Dominion as a card-based competition instead of a card game.

Raph Koster's blog has a bunch of good content: https://www.raphkoster.com/ But I would start with his book A Theory of Fun. Apparently there are some PDFs here and here, 10 years later. It's a super easy book to read, but really insightful. I highly recommend it. It looks like maybe the PDFs are a subset of the book. Let me know what you think.

u/MyJimmies · 2 pointsr/truegaming

It's been on my mind again, so I'm happy to see it here on Truegaming. But there's this video that might help out a bit, or at least be entertainingly interesting.

It might be a while until we are at the point where we can have entire schools based around this kind of discussion. But hopefully someday. There are plenty of interesting books out there that have already been suggested here. There are some books based around game design, like Raph Koster's "A Theory of Fun". There are YouTubers like the aforementioned MrBTongue and Satchbag who talk fondly about games, or about themes in games and how they affect them and those around them. Then there's /r/truegaming that talks about these things as well, albeit a bit more fanatically.

But sadly I got nothing to fit exactly your category that you want to see, though I'd love to see it myself. Perhaps a start for finding some stories of interesting user interactions in MMOs can start with Eve Online. Check out The Mittani. Although I haven't read it in a long while I do remember its launch when I still flew with Goonswarm/Goonwaffe and the cool pilots and writers of the site. Some great stories and unintentionally interesting insight into the mindset of players interacting in an MMO space.

u/dindenver · 2 pointsr/RPGdesign

Thanks for sharing!

These are resources that helped me better understand game design:

This is about the gamification of non-game designs. But it really expounds on what makes something a game as opposed to other activities (play or work, for instance):
https://www.microsoft.com/en-us/research/video/9-5-theses-on-the-power-and-efficacy-of-gamification/

RPG Design Patterns:
http://rpg-design-patterns.notimetoplay.org/

The Theory of Fun for Game Design book:
https://smile.amazon.com/Theory-Game-Design-Raph-Koster/dp/1449363210/ref=sr_1_1

u/SharpSides · 2 pointsr/pcmasterrace

Our very own E-Book HERE has a lot of helpful stuff on getting started!

I'd also recommend the following:

http://www.amazon.com/Art-Game-Design-book-lenses/dp/0123694965/

http://www.amazon.com/Theory-Game-Design-Raph-Koster/dp/1449363210/

u/jseego · 2 pointsr/Parenting

Well, it's true that any well-designed game does that. Here is a wonderful book on that and other topics from the world of game design.

I guess the question is whether it's exploited or not.

And a further question is: should we consider it exploitation if the goal is just to keep the user playing the game for long periods of time? I think the answer to this is yes, but you may not, and that's okay.

It sounds like neither of us are okay letting our kids play unlimited amounts of video games with no supervision, so we're both doing the right thing.

u/Chowderman · 2 pointsr/gamedesign

I agree with others that you should just start trying to make games, even if they're clones of other games to get you started. Stay small. Smaller than you think you can handle even. Don't make your first game your massive 100 hour JRPG epic.

​

A great book is A Theory of Fun, also. Good luck! And don't get discouraged when it gets tough!

u/biochromatic · 2 pointsr/gamedev

Theory of Fun is a pretty standard book to read to answer your questions. It's full of comics and quite fun to read.

u/enalios · 2 pointsr/gamedesign

If you want to be a game designer, just first accept that you're training for a marathon not a sprint.

Start with small exercises, not a full game just, like, quick sketches of game mechanics or ideas.

Do lots of tutorials, like "how to make a shmup in [whatever game engine]" and then when you finish the tutorial just add one or two things to make it your own, then move on to another tutorial.

After a few of those, start participating in 48 hour game jams.

There's a site I participated in for a bit called 1 Game a Month in which the idea was simply to finish one game a month. Not a masterpiece every month, just something finished every month.

It really is worth it to invest time in learning how to actually finish a project as opposed to always thinking about finishing it.

I recommend reading the following short articles:

The Chemistry of Game Design

Understanding Challenge

And I recommend the following books, not necessarily to read cover to cover but to read until the content doesn't seem to interest you, then just kinda skip around to the interesting bits:

Challenges for Game Designers by Brenda Brathwaite

The Art of Game Design by Jesse Schell

And finally I recommend reading this book from cover to cover:

A Theory of Fun by Raph Koster

u/iAbortedFetus · 2 pointsr/learnpython

There has been many sources. I'll just copy and paste my comment above regarding some of the more helpful sources I've come across.

> I'd say Corey Schafer's YouTube channel has been a huge help with visualizing and understanding the basics.

> Fluent Python by Luciano Ramalho has had an impact on me as well. There are a lot of advanced topics in that book that are beyond my level, which I try my best to understand, but there are also tons of good examples for beginners to pick up on.

> I've watched hours of CS lectures regarding python. This lecture by Raymond Hettinger helped me break out of some bad practices.

u/iznk · 2 pointsr/Python

Not a bible, but one of the best books on Python I've read. A lot of real-life examples: https://www.amazon.com/Fluent-Python-Luciano-Ramalho/dp/1491946008

u/invictus08 · 2 pointsr/flask

First of all, applause for the great start.

Here are some criticisms/suggestions I would like to offer. Keep in mind, I am not assuming your level/experience as a software developer:

  1. Functions with smaller size. You see, most of the functions you have written are lengthy because of the SQL statements. Here comes my second point.

  2. Separate business logic, application code, data-storage-related stuff, etc. Keep things modular. That separation is important because you want things to be maintainable and reusable. Your code should be open for extension, but closed for modification. If that does not make sense to you, that's perfectly fine, just start from this

  3. On that note, since you are using flask, might I suggest using flask-sqlalchemy instead of sqlalchemy? You may like it better. I know you have mentioned

    > I force myself to write raw SQL Request to get better with SQL

    while that is commendable, it is not really a good idea to write raw SQL in production code if there are ORM library alternatives available. Remember, it's not always you that is going to read/modify the code. While ORM syntax will be fairly universal, your style of writing SQL may vary starkly from other people's, which is what creates confusion and lets errors sneak in. Even if you want to do that, maybe keep the raw SQL in separate modules (point 2). (A minimal ORM sketch follows at the end of this comment.)

  4. Instead of computing everything and then sending the result along with the page, maybe create api endpoints for specific sections; render page with bare minimum info and from the webpage make multiple calls to update the page sections when required. This way, it will be far more responsive, user will not be waiting for you to finish all the computation and if you detect any change in any section of the page, you can just update that particular section with an appropriate api call, thereby avoiding a whole page reload. Design choices.

  5. PEP8. You don't have to blindly follow every rule - just make sure you understand why those rules are there, and that if you are breaking any, you know that it is absolutely necessary for accomplishing what you want. Again, what you want may not always be what you actually need - so be really careful.

  6. This is something I wish I knew earlier - Design Patterns. Without going into much details, I would recommend reading these books to start with and really understand instead of memorizing:
  7. Documentation is also important. Follow the good practices there. A remarkable reference would be Ken Reitz's Requests library.

    Finally, remember that all these are just suggestions, and you may already know them. You will decide which ones to take and which ones to leave behind based on your situation.

    Again, great job (I also learnt something from this). Just make sure you keep running.
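
To make point 3 concrete, here is a minimal flask-sqlalchemy sketch (assuming the flask and flask-sqlalchemy packages are installed; the User model, its fields, and the sqlite URI are invented for illustration):

    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///demo.db"
    db = SQLAlchemy(app)

    class User(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        name = db.Column(db.String(80), nullable=False)

    with app.app_context():
        db.create_all()
        db.session.add(User(name="alice"))
        db.session.commit()
        # ORM equivalent of: SELECT * FROM user WHERE name = 'alice';
        print(User.query.filter_by(name="alice").all())
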
u/xPolydeuces · 2 pointsr/learnpython

My friend who was getting into Python recently asked me about the same thing, I've made some research and this was what I came with. Just for the record, I'm personally a book dude - can't really focus much when learning from Udemy courses and stuff like that, so I will only cover books:


First book:


Python Crash Course by Eric Matthes
Very solid position for beginners. The first part of the book covers Python's basics: data types, lists, functions, classes, pretty much everything you need to get a good grasp of Python itself. The second part of the book includes three practical projects: a mini-game, data visualization, and an introduction to making web apps with Django. From what I saw, it's a pretty unusual approach among beginner-friendly books, since most of them avoid using additional libraries. On the other hand, it's harder to get bored with this book; it really makes you want to learn more and more once you can actually see the effects of all your work.


Automate the Boring Stuff with Python by Al Sweigart
Best alternative if you want to spend 0 bucks or want to dive right into projects. Even though it covers the basics as well, I still recommend reading it, even if you have done Python Crash Course before, if only for the sake of all those projects you can make to practice your Python. He also has a YouTube channel where he has a loooot of Python content and sometimes does cool things like streaming and helping people make their code better. Really cool guy, be sure to check out his channel!


Second book:


Writing Idiomatic Python by Jeff Knupp

Very solid book, filled with examples of what you, as a Python developer, should do and what you shouldn't (and why not). Sounds like not much, but it is actually a lot of useful knowledge that will make your code shorter, cleaner, and better.


Effective Python by Brett Slatkin

A bit easier to understand and approach than the previous book, but it still has a load of knowledge to share.


Third book:


Fluent Python by Luciano Ramalho

One of the best Python books overall, covers all of the things that previous books could have missed or didn't have time to introduce. My personal favorite when it comes to books for advanced Python developers.


All of those recommendations are my personal opinion, so if anyone has anything to add, I will gladly listen to any comments!

u/chra94 · 2 pointsr/learnpython

How about https://learnxinyminutes.com/docs/python3/ ? It's a webpage which covers the basics at least. Might be smaller than you want but it's something you can start reading right now at least.

E: Learnxinyminutes has some book suggestions at the bottom which might interest you.

E2: You might be interested in Fluent Python. It goes into the nitty gritty parts of Python and shows (to my knowledge) idioms in Python. Book here

u/Kadoba · 2 pointsr/gamedev

I personally love Programming Game AI By Example. It gives lots of very usable examples in an entertaining and understandable way. It's pretty friendly for beginners and even offers a game-math primer at the start of the book. However, the examples still have a lot of meat to them, and the book thoroughly explains some important AI concepts like state machines and pathfinding.

u/Greystache · 2 pointsr/gamedev

Steering behaviors are explained in depth (with C++ code) in the book Programming Game AI by Example

It's mentioned in the article, but I think it's worth pointing it out again as it's a very good book on the topic.
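
For a taste of what a steering behavior looks like, here is a minimal Python sketch of the classic "seek" behavior (the book's own examples are in C++; vector helpers are inlined for brevity):

    import math

    def seek(position, velocity, target, max_speed, max_force):
        """One tick of the classic 'seek' steering behavior (2D tuples).

        Returns the steering force: desired velocity toward the target
        minus the current velocity, clipped to max_force.
        """
        dx, dy = target[0] - position[0], target[1] - position[1]
        dist = math.hypot(dx, dy) or 1e-9  # avoid dividing by zero at the target
        desired = (dx / dist * max_speed, dy / dist * max_speed)
        steer = (desired[0] - velocity[0], desired[1] - velocity[1])
        mag = math.hypot(*steer)
        if mag > max_force:
            steer = (steer[0] / mag * max_force, steer[1] / mag * max_force)
        return steer

    # Agent at the origin moving right, target straight above: the force
    # damps the sideways motion and pulls the velocity upward.
    print(seek((0, 0), (1, 0), (0, 10), max_speed=2.0, max_force=0.5))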

u/BlindPaintByNumbers · 2 pointsr/unrealengine

Sorry, no, Goal-Oriented Action Planning (GOAP) is the name of an AI strategy, like Finite State Machines. There is a wonderful AI book called Programming Game AI by Example if you're interested in delving into all the possible mechanics. There are plenty of free online resources too. Read up on both types of AI and I bet you'll have an AHA! moment. Other people have thought about these problems for a long time.
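
For a flavor of the finite state machine side, a minimal Python sketch (the states and the transition condition are invented for illustration):

    class Guard:
        def __init__(self):
            self.state = self.patrol  # current state is just a bound method
            self.alert = False

        def patrol(self):
            print("walking the route")
            if self.alert:
                self.state = self.attack  # transition when the condition flips

        def attack(self):
            print("engaging the player")
            if not self.alert:
                self.state = self.patrol

        def update(self):
            self.state()  # run one tick of whichever state is current

    g = Guard()
    g.update()       # walking the route
    g.alert = True
    g.update()       # patrols one last tick, then switches state
    g.update()       # engaging the player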

u/linuxlass · 2 pointsr/learnprogramming

I started by showing my son Scratch when he was 9.5yo and helping him make a couple of arcade games with it. He was never all that interested in Logo, but got really turned on by Scratch. After a couple of months he was frustrated by Scratch's limitations, and so I installed Ubuntu on an old computer, showed him pygame/python and worked through a couple of online tutorials with him, and let him loose.

He learned to use Audacity to edit files from Newgrounds, and Gimp to edit downloaded graphics as well as create his own. He made a walk around, rpg-like adventure game, a 2D platformer, and then decided he wanted to learn pyggel and has been working on a 3D fps since last summer.

Soon, I'm going to get him started on C++ so we can work through a book on game AI (which uses C++ for all its examples). He's 13.5 now, and thinks programming is great and wants to grow up to be a programmer like his mom :)

I highly recommend a simple language like python for a beginner, but Scratch is wonderful for learning all the basic concepts like flow control, variables, objects, events, etc, in a very fun and easy way. The Scratch web site also makes it simple to share or show off because when you upload your program it gets turned into a Java applet, so anyone with a browser can see what you've done.

u/gamya · 2 pointsr/Unity3D

Very good response.

This is explained very well in the book Programming Game AI by Example (in C++, though).

Moreover, you can have more than one state machine. One global, one for the weapon, one for the player... etc. Depending on your game needs.

u/apieceoffruit · 2 pointsr/learnprogramming

You are looking for a mix of "neural networks" and "fuzzy logic"

Personally I'd suggest Game AI by Example, a great AI book.

u/efofecks · 2 pointsr/gamedev

This book should have everything you need. It's accessible, quite funny at times, and has a good introduction to everything from finite state machines and communication (what boxhacker described) to pathfinding and goal-seeking behavior. What's best is that they have sample C++ code on their website you can look through.

u/workaccountthrowaway · 2 pointsr/gamedev

I've found working my way through this book: Programming Game AI By Example really helped me figure out how to do all you're asking. I would recommend it if you are serious in learning how to make basic to more advanced AI.

u/duckyss · 2 pointsr/gamedev

Look at Programming Game AI by Example. It has a lot of collision detection information and how it is used in various behaviors.

u/venesectrixzero · 2 pointsr/gamedev

Programming Game AI, http://www.amazon.com/gp/aw/d/1556220782/ref=redir_mdp_mobile, has great example code for a soccer game, including pathfinding and steering.

u/alex47ka · 2 pointsr/ItalyInformatica

English ones are fine too, no problem there. Is this the one you're referring to? From the description, it seems like a book aimed more at computer security. I was looking for a book just to brush up on the syntax concepts and so on, since I haven't used Python in a while, and given my sky-high budget /s I was deciding between those two.

u/callmedoge · 2 pointsr/HowToHack

This book might be a good start.

u/observantguy · 2 pointsr/AskNetsec

In that case, Violent Python may be helpful--not a tutorial on kali/netsec, but it'll help you learn about netsec aspects through coding your own "exploits"...

u/ardtus · 2 pointsr/hacking
u/sharplikeginsu · 2 pointsr/Python

OP mentioned books or tutorials, so I wasn't limiting it to only things that there are books about, but re: just books:

u/kimchi_station · 2 pointsr/netsecstudents

So this is aimed at people in a cyber security degree? What kind of knowledge do they have?

> using all the tools of kali

Pleaseeee no. There are hundreds of programs and scripts in Kali; it would not be feasible to learn and remember them all. Off the top of my head, what I would do is:

  • Have people do some of the starter wargames at overthewire so they are familiar with the linux command line. Maybe even make this a requirement to participate so you know that people are committed and have a base level of knowledge.

  • Read write-ups on attacks and attackers, here is a good one by Mandiant<--(PDF link)

  • Culture. I feel like this is one of the most neglected fields in cyber security. Read some phrack.

  • Split people into teams to work on projects so that they have experience working together.

  • Find some old CTFs or images on Vulnhub. See if you can register for some CTFs, looks great on a resume.

  • Learn about sql and sql injection.

  • Learn Python; take a look at Violent Python, or Grey Hat Python and Black Hat Python for more advanced stuff. There is also Hacking Secret Ciphers with Python for more of a crypto angle.

  • linux, linux, linux. feel at home in the terminal and be able to script bash.

  • Going over basic tools like nmap, aircrack-ng (airmon-ng, etc.), sqlmap, hydra, hashcat, metasploit, etc. Make whole-day labs that use just one tool; you could maybe find an easy Vulnhub image or use Metasploitable to practice these.

  • Make sure everyone has a github and populates it with stuff they create in this class. Incorporate it into your class so you got people forking and contributing to other members/teams projects.

  • Look over books like The Hacker Playbook, Hacking, the Art of Exploitation, and so on for more ideas.

  • Maybe most importantly, have the students teach. I'm sure there are people in there who specialize in one tool or subject. Have them design and lead a lesson/lab/activity. The best way to solidify and expand on what you know is to teach it.
u/webauteur · 2 pointsr/ProgrammerHumor

I recently came across the book Violent Python: A Cookbook for Hackers, Forensic Analysts, Penetration Testers and Security Engineers for when it is no longer mister nice snake.

u/CounterSanity · 2 pointsr/LiveOverflow

Violent Python: A Cookbook for Hackers, Forensic Analysts, Penetration Testers and Security Engineers https://www.amazon.com/dp/1597499579/ref=cm_sw_r_cp_api_nZOMAbWE1K8Y9

u/FertileLionfish · 2 pointsr/learnprogramming

I personally love Python and try to get a lot of my college classmates to try it. Python is very simple, but powerful and, in my opinion, intuitive. While it is dynamically typed, which some view as a plus and others as a negative, I couldn't care less. The biggest reason I'll recommend Python to somebody new and interested in programming is how it enforces style, so later on down the road, coding in other languages just feels natural and your code will generally make more sense. If you're also interested in security/pentesting, look into Violent Python. I wish you the best of luck getting into programming; it's frustrating at times, but very rewarding in the long run.

u/AcadianMan · 2 pointsr/cybersecurity

You will definitely want to learn Python.

Something like this book would give you a solid foundation.

http://www.amazon.ca/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579/ref=sr_1_1?ie=UTF8&qid=1452120799&sr=8-1&keywords=violent+python

You might want to look into CISM and CISSP for certifications.

http://www.tomsitpro.com/articles/information-security-certifications,2-205.html


You can also look into a CyberSecurity meetup group in your area, or you could start your own and learn off of other people.

u/RagingSantas · 2 pointsr/HowToHack

Get Violent Python; it will show you everything you need in Python for pentesting.

u/rdguez · 2 pointsr/algorithms

There’s a book called Grokking Algorithms, which illustrates some of those concepts quickly. I also liked another book called Cracking the Coding Interview.

u/hilduff5 · 2 pointsr/OSUOnlineCS

I took 325 last winter and it was a bit rough. Join the Slack group for the class and the class will be a tad bit easier and less frustrating. Also, supplement the class with the Grokking Algorithms book (link below). A guy who graduated from the program a year ago wrote about it in his blog. Like him, the book helped me out tremendously.

Grokking Algorithms

u/jeebusfeist · 2 pointsr/learnprogramming

I'd recommend Grokking Algorithms

u/Mydrax · 2 pointsr/learnprogramming

Inside the Machine visually illustrates concepts within a computer system so beautifully that it will make you cry.
Also, TeachYourselfCS, not really a book but a list of links of videos and books that will help you grasp various sections of CS.

Grokking Algorithms is not a CS textbook, but it's a really good book on algorithms, with funny illustrations that will help you through.

The other books that have been mentioned like Clean Code for example are a must read!

u/MrMoustach3 · 2 pointsr/portugal

This one is more suitable.

u/BookOfFamousAmos · 2 pointsr/IWantToLearn

I would suggest Grokking Algorithms in Python:

https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

It explains them in an accessible, easy-to-understand format. Whether or not you like or know Python, it's a good learning resource. I highly recommend it.

u/fernly · 2 pointsr/learnpython

On Amazon you can preview quite a bit, although not the recursion chapter which looks pretty short:

https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

u/Cracklings · 2 pointsr/csharp

This is just my 2 cents, but the first thing you should be asking when learning any language is: what problem are you trying to solve with it?

C# as a language doesn't amount to much on its own; its real potential comes from the frameworks it is associated with.

​

If you're wanting to:

web develop then you would need to look into .NET Core + WebApi + MVC or a front-end framework (Angular, React, Vue).

This is a great course to get you started with. It'll create a basic web application you can modify and play around with from the database to the front-end:

https://www.pluralsight.com/courses/aspdotnetcore-efcore-bootstrap-angular-web-app

​

desktop development then look at WPF (Windows Presentation Foundation) or use Electron and C#

mobile development then take a look at Xamarin

​

from your use of Unity though, it seems as if you are more into game development, in which case I would advise going a bit lower-level and really learning algorithms and good implementations. For this, there are some great books you can use to help you get started:

Cracking the Coding Interview: 189 Programming Questions and Solutions by Gayle Laakmann McDowell - Even though this is an interview book, it's a great intermediate book on algorithms. The book does assume you have a basic understanding of elementary data structures.

The Algorithm Design Manual by Steven Skiena - This is definitely more advanced and heavy but it's a great book to really dig down into the nitty gritty

A great website for practicing writing algorithms in C# is leetcode.com. It's a site that basically lists a bunch of small questions you can solve with an in-browser compiler that supports C#. This way you wouldn't need to download Visual Studio to practice coding.

If you're up for the challenge, you can download a framework like SFML.Net and try to develop a game from the ground up without using an engine like Unity. But this is obviously a lot of work ;)

​

Overall it's hard to give really specific advice without knowing where you're trying to head. But it's a good time to get into C# and, in general, Microsoft's development stack. In the past, people were shoehorned into using Microsoft's technology stack from top to bottom, but recently Microsoft has made a lot of strides in making its tech more open, which is turning a lot of heads.


If you are also looking for a more lightweight IDE, then I recommend Visual Studio Code, or VSCodium, which is the same but without the trackers :)

u/last_useful_man · 2 pointsr/learnprogramming

You don't need to start off with 'Cormen' (CLRS), imo. It has way more complexity than you need in a first class, and it is so heavy to lug around, and so painstakingly, verbosely written, that you'll dread picking it up. Really, whatever the cheaper class text is, is OK. Skiena's "Algorithm Design Manual" is pretty complete for beginners in the first part, is less heavy on math (but has some) - the book has a practical bent - and it weighs far less. Plus the guy is a witty writer. It's a bit more terse, so there's room for an algorithm survey in its 2nd half. For this reason the book is a keeper. It's especially great for learning dynamic programming (which you may not do).

u/Vitate · 2 pointsr/cscareerquestions

My Story

Hey pal, I was in a similar boat about 8 months ago. It was my senior year as an Economics major, and after taking a programming class, I instantly fell in love with it. I crammed a few more programming classes in before graduating, but in the end, I sure as hell wasn't employable as a software engineer.

​

I had a choice: become a data analyst (the path I was currently on) or follow the software engineering dream. I chose the latter.

​

I decided to go to a (remote) coding bootcamp after college. The program was 6 months. It taught web development (Node, React) and some very basic CS fundamentals. I spent my free time outside the bootcamp inhaling all the computer science and industry information I could. I did this because I wanted to be a competent programmer. I found it fun.

​

This week I had my second onsite. I expect to receive a full-time software engineer offer (my second offer so far) later today, and I have 4 other onsites in the near future (a big 4 + a few startups). It has been a heck of a lot of work to get here, but if you want it badly enough, it's possible.

​

My Tips

  • Try not to be intimidated by these tips. Software engineering is something that you take little bites out of. You cannot become an employable developer in one bite, and sometimes the field can be intimidating.
  • Your options right now are self-teaching, a coding bootcamp, or a CS master's degree (might be hard to get into a good program without a bit more relevant experience, tbh.).
  • It's going to be pretty difficult to break into anything other than web development for your first programming job without a CS degree. Titles like Front-end Engineer, Full Stack Engineer, Backend Engineer, and Software Engineer (at a web company) are within reach. More specialized titles probably aren't very realistic.
  • Basic toy projects (i.e., simple HTML/CSS or similar) probably aren't enough to get significant attention. You need things more complex, like full-stack applications built from scratch. This means a working backend, a working database, a modern front-end (using a framework like React, etc.). Here's my portfolio if you're curious about the type of apps I mean.
  • Other types of programming applications outside of web dev are also fine, as long as they are sufficiently complex and interesting.
  • Put your projects on your GitHub no matter what. Learning how to commit code to GitHub is an important industry practice. Having a green GitHub history makes you look better.
  • Try and build a portfolio once you get better at coding. Don't kill yourself making it look amazing, but do try and make it look good. Not everyone will care about your portfolio, but some people will. I got an interview just based on having a nice portfolio.
  • Your university course sounds like a great primer, but you need to go deeper to be competent enough to pass interviews. I took similar courses at my university, but what really helped me was going through a few textbooks (1, 2, 3 -- some suggestions) and watching MIT 6.006 lectures. You will still have gaps in things like web security, scaling systems, networks, and operating systems, but I wouldn't spend a ton of time learning those topics as a new grad. Knowing the basics can be helpful though, because these things do definitely come up in interviews.

    ​

    Happy to answer any other questions you may have. I'm not an expert or an experienced software engineer yet, but I've walked the path you're considering, so hopefully my tips are helpful.
u/IRLeif · 2 pointsr/learnprogramming

Thanks! Just to make sure, do you mean these two books?

  • Introduction to Algorithms
  • Algorithms

    If so, I'm glad you mentioned those. Both of them are already on my to-read list, but they were further down than Knuth's work. Since, as you say, they might better for starters, I'll check those out first.

    By the way, I have also heard some good things about this one:

  • The Algorithm Design Manual

    Have you any experience with that book? It's also on my to-read list.
u/chernn · 2 pointsr/webdev

If you are a beginner and you have wifi on the plane: https://www.freecodecamp.org/.

For deeper reads:

u/Nihili · 2 pointsr/compsci

I'd recommend

u/TheFakeNoob · 2 pointsr/cscareerquestions

If you plan on self teaching I find these materials to be quite sufficient:

Data Structures and Algorithm Analysis This book also has a C++ version and can be found online for free (legally, from the author). I prefer reading a real book over PDFs, so I opt to buy it, but your opinion may differ.

The Algorithm Design Manual. This is pretty much the 'go to' book for self learning DS/Algo since it covers a lot of material but does not go that deep into the details beyond a working knowledge.

Introduction to Algorithms. This is the standard text on Algorithms and is used in most undergrad/graduate level courses on the subject. It is very detailed and goes deep into the theory and mathematical proofs of algorithms. It's a much more academic text but still worth mentioning and being aware of.


Edit:
Out of the 3, I think the first is the easiest to read, but the second is the best at covering relevant material quickly and sufficiently. The last one is only for those who want mastery of the topic or intend to use it for a course.

u/sessamekesh · 2 pointsr/learnprogramming

In almost every field, you're going to end up dealing with groups of things - lists of contacts, groups of users, categories of billing items, whatever. Having a deep understanding of arrays vs. (hash) sets vs. (hash) maps is huge; by far, that's the most common decision I make on a day-to-day basis.
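
A quick sketch of why that choice matters, using Python's built-in list and set:

    import timeit

    n = 100_000
    as_list = list(range(n))
    as_set = set(as_list)

    # Membership in a list scans elements one by one (O(n)); a hash set
    # answers in roughly constant time regardless of size.
    print(timeit.timeit(lambda: n - 1 in as_list, number=100))  # slow: full scan
    print(timeit.timeit(lambda: n - 1 in as_set, number=100))   # fast: hash lookup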

I think it's worthwhile to expose yourself to a bunch of specialized algorithms and data structures. You don't need to study them all super in-depth, but reading about them and maybe hacking them out in Python/Java/C++/whatever is a great learning exercise. I personally like "The Algorithm Design Manual" by Steven Skiena, (EDU PDF, Amazon). It's on the dry side, but it categorizes algorithms into a handful of types (sorting, graphs, combinatoric search, etc) which makes it great reference material for learning.

A handful of exercises, possibly useful:

  • Quicksort (and why is it faster than a trivial sort like selection sort? Explain it like I'm five)
  • Caching! Also, how it makes pure functions preferable to impure ones for expensive code.
  • Implement a Fibonacci function, and describe why the recursive solution is terrible (hint: draw out each function call for a number like 10). This is a great exercise in runtime analysis. Implement it better and know why your second shot is better (see the sketch after this list).
  • Graphs manage to sneak into all sorts of places - knowing Dijkstra's algorithm by heart probably isn't important, but being comfortable with graphs is valuable. Many problems can be thought of as transforming data from group A and B into C, and thought of as information travelling through a graph, being changed at each node. The graphics pipeline used for 3D graphics is a fun example of an application of this idea.
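
Here is the Fibonacci exercise from above, sketched in Python; the naive version re-solves the same subproblems exponentially often, while a cached version computes each value once:

    from functools import lru_cache

    def fib_naive(n):
        # Re-solves the same subproblems over and over: call count grows ~1.6^n.
        return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

    @lru_cache(maxsize=None)
    def fib_cached(n):
        # Each distinct n is computed once, then served from the cache: O(n).
        return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

    print(fib_naive(10), fib_cached(10))  # 55 55
    print(fib_cached(200))  # instant; fib_naive(200) would effectively never finish
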
u/kaluuee · 2 pointsr/learnprogramming

You need to read this
Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) https://www.amazon.com/dp/B00JDMPOK2/ref=cm_sw_r_cp_apa_i_v3xwDbV1177NS

u/Anaufabetico · 2 pointsr/brasil

I haven't read it, but I've already put it on my list, since I'm an avowed and enthusiastic programmer nerd. Thanks for the tip.


"Code", do Charles Petzold faz a mesma coisa e também é muito bom.

u/BitterFortuneCookie · 2 pointsr/explainlikeimfive

The above answers were really good. I recommend a look at the book Code: The Hidden Language of Computer Hardware and Software to get a sense of the history of how computer languages evolved into how we build applications today.

https://www.amazon.com/dp/B00JDMPOK2/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1

u/urnlint · 2 pointsr/computerscience

I do not read textbooks as a hobby like some people seem to, but this book seems to pack a large chunk of my 5 years of college (yeah, for a bachelor's) into a single book. Code

u/ConstantScholar · 2 pointsr/csbooks

Code: The Hidden Language of Computer Hardware and Software is a really good and very readable computer architecture book.

u/devilbunny · 2 pointsr/explainlikeimfive

The voltage represents a 1 or 0. They're not translated, they just are.

You really ought to read Charles Petzold's Code: The Hidden Language of Computer Hardware and Software. It will answer your questions.

u/fav · 2 pointsr/argentina

There is nothing like university. If you want to learn on your own, follow a similar plan. Start with the most basic theory and work your way up in complexity. Once you have the concepts down, learning a programming language is simple.

If you have no idea, or you think the computable is anything more than a bunch of sets of natural numbers, start with something like Code.

On algorithms and data structures there are tons of books and courses (Coursera, Khan, etc.; start with the theoretical ones and stay away from any that teach you a particular language).

Then programming language paradigms and language theory, and you're all set. :)

u/jasonwatkinspdx · 2 pointsr/AskComputerScience

It varies in industry. I think it's a great idea to have a general understanding of how processors execute out of order and speculate, how caches and the cache consistency protocols between cores work, and how the language implementation transforms and executes the source you write.

The Hennessy and Patterson book covers almost everything hardware-wise. Skim the areas that seem interesting to you. For language internals I like Programming Language Pragmatics. Compared to other "compiler course" textbooks like the famous dragon book, it's got a lot more of the real-world engineering details. It does cover quite a bit of theory as well, though, and is written in a really straightforward way.

Skimming these two books will give people a pretty accurate mental model of what's going on when code executes.

u/trsohmers · 2 pointsr/Steam

Very long and unusual story which I'll have to write about some day. I'm actually a high school dropout who was already working in the technology research industry. I was working on high performance computing (supercomputer) systems, and thought of everything that was wrong with the current systems that were out there, which led me to learning the history and why systems were built that way. I then got deeply interested in computer architecture design, and after reading this (and many other) books, I decided I wanted to build my own processor. I got the opportunity to start a company, and am currently working on building that business (and goofing off on reddit on the side). As for what it is like, it involves 10 to 16 hours a day in front of a computer and a series of whiteboards, and frequently wanting to throw that computer out the window.

I would not recommend dropping out of school in most circumstances, but the information needed to do something like this is readily available, and isn't all that complicated/expensive to do. The cheapest way to start is buying a book on Verilog/FPGAs ($20-$60... there are also ones online for free and classes on Udacity/Coursera/edX/etc), and looking into the history. Modern processor design is only 50 years old, and they were doing roughly the same thing 50 years ago with equipment that was significantly worse. Recreating a simple 8 bit ALU should take a couple of hours if you understand basic circuitry, and recreating an old 80's (16bit) processor would take maybe a week of half-assed effort while learning and making mistakes.

u/DMRv2 · 2 pointsr/emulation

I don't know of any resources on emulation in general, sorry. What aspect of emulation are you interested in? Dynamic recompilation? Binary translation? Cycle-accurate emulation?

If you're interested in cycle accurate emulation, it helps to have a background in computer architecture. H&P is a great textbook:
https://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

u/dotslashzero · 2 pointsr/Cplusplus

> Thanks for the comprehensive reply.

No problem.

> I'm already familiar with the basics of computer architecture, although I could certainly stand to know more. If you know of any good medium-level textbooks on that, I'd appreciate it.

I learned computer architecture using this book (but with an earlier edition): http://www.amazon.com/Computer-Architecture-Fifth-Edition-Quantitative/dp/012383872X. I think this is the same book they use at MIT.

> I realize that this is the case notionally, but I've come across posts in various places on the internet that claim compiler optimizers will sometimes inline recursive functions up to some arbitrary compiler-dependent depth, so that a single stack frame can handle a recursion depth up to that arbitrary count, making the performance difference between recursion and iteration much less significant at relatively small scales. Then there's also tail recursion optimization, where the recursion can go arbitrarily deep without needing to grow the stack.

You said it yourself twice:

  • up to that arbitrary count
  • can go arbitrarily deep

    That is the thing: because it is arbitrary, the only way you will be able to tell is to look at the compiled code and check whether the recursion/iterative loop was optimized as you expect. Compilers have many levels of optimization. There is compile-time optimization, which we are all familiar with. There is also link-time optimization, where optimization happens during the link stage. There is also a technique performed by LLVM where the compiler creates intermediate-language bitcode, then performs optimization at link time based on the generated bitcode. I am sure there are other optimization stages/techniques that exist on other compilers. These different levels of optimization stages use varying criteria to judge how code will be optimized. You will need to check the resulting binary, using the optimization switches of your choice, to see whether the compiler's technique is used or not.

    > The problem is that I don't have any authoritative sources on any of these optimizer techniques, so there might be lots of useful tricks that I'm not exploiting.

    To be honest, many of these things are learned more through experience. You are on the right track in being curious about what the compiler does (or specifically, how it optimizes code), but in most cases, you will probably learn these things through your own experiments (e.g. getting curious about whether solution y works better than solution x, trying it, and then profiling and verifying through the internet forums).

    > So I guess compiler documentation is where I need to look to find these things out?

    Cannot say that it isn't, but I doubt compiler manuals/documentation will talk about these things in detail. It is your best bet, but most likely, you will have to go through the source code of the compiler (if it is open source), or find some white papers talking about how such optimizations are achieved by the said compiler.
u/YoloSwag9000 · 2 pointsr/computerarchitecture

Typically companies do not publish full details about their IP, because then it would be easy to copy them and they would lose any competitive advantage they have. However, there is a remarkable amount of detail about how processors work, as many of the old techniques for branch prediction, caching and so forth are still around. There is a good (and free!) Udacity course called "High-Performance Computer Architecture" where some of these things can be learned. I can also recommend the books "Advanced Computer Architecture: A Design Space Approach" (Sima) and "Computer Architecture: A Quantitative Approach" (Hennessy & Patterson). The website Real World Tech posts some very informative articles where they dive deep into the microarchitecture of Intel processors (such as this Haswell writeup) and others. Another port of call is the ecosystem of RISC-V, an open-source instruction set. They have a partial list of core and SoC implementations that you could pick through. If you fancy looking into GPUs, the book "Real-Time Rendering" (Akenine-Moller et al.) will start you off with the basics of the graphics pipeline. Both AMD and NVIDIA publish varying amounts of information about how their GPUs work. The Broadcom VideoCore-IV has had full microarchitecture specs published, which you can find easily with Google.

​

If you really want to learn this stuff in detail, I would highly recommend designing a CPU/GPU and writing a simulator of it. Start by designing an instruction set, then building a very simple scalar in-order processor. Then add features such as branch prediction, register renaming, out-of-order execution and so forth. At University I wrote a CPU simulator for my Advanced Architecture class, then a cutdown GPU simulator for my Master's Thesis project. From these I managed to land an awesome job writing GPU simulators, so if computer architecture is something you want to pursue as a career I can strongly recommend completing a project like this. You will learn plenty and have something to talk about with potential employers.

Good luck!

u/ctcampbell · 2 pointsr/netsec

Or "Computer Organization and Design, Fifth Edition: The Hardware/Software Interface"

http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269

u/ziptofaf · 2 pointsr/learnprogramming

>is book could have been useful also for C++ real-time programmers of course because i would include also HW information used in that field.. probably I'm asking too much now..

It wouldn't be. You misunderstand how that field of programming works. Differences can be HUGE and you would end up using at least something like https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269.

Why? Because hardware used there can be fundamentally different from your typical computer. How much? Well... some CPUs don't support recursion. No, really. Do more than 2-3 recursive calls and the CPU runs out of memory. You also end up using FPGAs and ASICs. To explain all that takes way more than a book.

You seem to want a hypothetical book on "current PC hardware and its performance". Which frankly does not exist in the form of a book, but comes from visiting places like Guru3d and AnandTech. Actual low-level differences that WILL matter for a programmer are hidden in CPU spec sheets, and to read those you need resources that target computer architectures and your problem domain specifically. Well, that and practice really - someone working in game engine development is likely to know the graphics pipeline like the back of their hand and can easily talk about performance on several GPUs and pinpoint what makes one better than the other. But that comes from experimenting and plenty of articles, not a singular resource. There are too many different requirements, really, to create a single resource stating that "X is good for Y branch of programming but bad for Z".

I mean, even within desktop CPUs themselves: would you rather have a 14-core i9-9980XE or a 32-core Threadripper 2990WX? The answer is: it depends. One has far superior single-threaded performance due to higher clocks; the latter will eat it alive in heavily multithreaded and independent processes (the 2990WX has 32 cores, but only 16 are connected to the rest of your computer, which can cause very visible delays, so there are multithreaded scenarios where it will underperform). And in some cases you will find that an 8-core 9900K is #1 in the world. It ALL depends on a specific application and its profile.

u/0x5345414E · 2 pointsr/webdev

You shouldn't worry so much about different programming languages. They all more or less work the same. I would recommend you learn how a computer works at a low level and work your way up. You could start here, then move on to this. This kind of knowledge makes a lot of things clearer.

u/Nullsrc · 2 pointsr/unexpectedfactorial

Find it here on Amazon. It's actually a pretty good textbook and worth reading even if you're mostly a software developer.

u/rjt_gakusei · 2 pointsr/programming

This book has a pretty strong breakdown of how computers and processors work, and goes into more advanced things that modern day hacks are based off of, like address translation and virtualization with the recent Intel bugs:
https://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040
The book can be found online for free. The author's website has practice challenges that you can download, one of them being reverse engineering a "binary bomb". I did a challenge similar to it, and it felt pretty awesome when I was able to get around safeguards by working with the binaries and causing buffer overflows.

u/bluebathysphere · 2 pointsr/compsci

The two starting books that gave me a great deal of understanding of systems (which I think is one of the toughest things to grasp; CLRS and The Art of Computer Programming have already been mentioned):

[Computer Systems: A Programmer's Perspective] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=sr_1_2?ie=UTF8&qid=1407529949&sr=8-2&keywords=systems+computer)

This along with its labs served as a crash course in how the system works, particularly a lot about assembly and low-level networking.

The Elements of Computing Systems: Building a Modern Computer from First Principles

I've mostly only done the low-level stuff but it is the most fun way I have found to learn starting all the way at gate architecture. It pairs well if you have read Petzold's Code. A great introduction to the way computers work from the ground up.

u/ac1d8urn · 2 pointsr/asm

Maybe you're not looking for this sort of thing and it's a bit more advanced (it expects you to know C or Java, though any programming experience will do), but it's a goldmine of information and covers a broad range of topics.

http://www.amazon.com/dp/0136108040/

From a quick Googling you can find a pdf of the most recent version of it on this guy's Github:
https://github.com/largetalk/datum/blob/master/others/Computer%20Systems%20-%20A%20Programmer's%20Perspective%20(2nd%20Edition).pdf

You can view the raw file to download the PDF.

u/ultimatt42 · 2 pointsr/linux

You're looking for a clear dividing line, and there isn't one. The term "emulator" is more descriptive of the problem you're trying to solve (I have a program for X but I only have Y, how can I get it to run on Y?) than any particular implementation. It's all in the name, "emulate" means to copy or imitate. If that's the goal of the software, or even just how you're using software that was designed for another purpose, it could be considered an emulator.

> So, [...] it has to be called "emulating"?

No, but if it fits the definition you shouldn't complain if someone says it is.

> I really think that "computer system" refers to hardware, not software.

Maybe you'll trust the textbook I was taught from.

> A computer system consists of hardware and systems software that work together to run application programs.

u/Intrexa · 2 pointsr/shittyprogramming

Because unlike what the OP said, IPs that are actively being used to route information are ints in memory. When I ping Facebook.com, the program doesn't get 173.252.120.6 back, it gets 2919004166 back (which is identical to getting 0xADFC7806 back, because it's just a bit pattern; there's no way to differentiate the decimal and the hex, it's just a display thing). It has to convert that number to the 4 octets that make sense to humans.

Same thing in reverse. When I go to http://173.252.120.6, it can't just start sending data out to 173.252.120.6; it has to convert it to 0xADFC7806 first. This is true of any program that does networking. Why is that? It saves space in an area where space is most important. So you see, it's not like the program is doing an extra step to allow this, it's actually doing one less step. What's more, I bet Mozilla (I use Firefox, but this is probably true of all browsers) didn't implement a specific "convert IP to int" function; this is default functionality of networking libraries, libraries that would routinely be handling both octet notation and ints, because most programs dealing with this would already be dealing with low-level ints. So not only are they doing one less step, Mozilla would have to go out of their way to specifically disallow this.

And again, for completeness' sake: an IP address traversing a network is stored as a big-endian int (network byte order), so the bit order you see here (1010 1101 1111 1100 0111 1000 0000 0110) isn't necessarily the byte order your machine keeps it in while handling the packet. Also, it's technically not a bare int; it's a structure that contains a single int:

/* Internet address. */
struct in_addr {
    uint32_t s_addr; /* address in network byte order */
};
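
You can watch the same int/octet round trip from a high-level language; a quick sketch with Python's standard library:

    import socket
    import struct

    packed = socket.inet_aton("173.252.120.6")  # 4 bytes in network (big-endian) order
    as_int = struct.unpack("!I", packed)[0]

    print(as_int)       # 2919004166
    print(hex(as_int))  # 0xadfc7806
    print(socket.inet_ntoa(struct.pack("!I", as_int)))  # back to '173.252.120.6'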

If you want to learn about low-level hardware and what is actually happening behind the scenes, I strongly recommend Computer Systems: A Programmer's Perspective. It's a hard book, no doubt, but it will show you everything that happens that you don't see when you compile a program and then run it, including memory management, cache fetching, how hard drives store data, and how processor pipelining works, all to the level of detail in my posts.

u/dataCRABS · 2 pointsr/DestinyTheGame

If you guys enjoy this stuff and the idea of AI, I implore you to check out Ray Kurzweil's revolutionary book on "The Singularity", aka the point in time in the near future when technological evolution surpasses the rate of biological evolution.

u/linuxjava · 2 pointsr/Futurology

While all his books are great. He talks a lot about exponential growth in "The Age of Spiritual Machines: When Computers Exceed Human Intelligence" and "The Singularity Is Near: When Humans Transcend Biology"

His most recent book, "How to Create a Mind" is also a must read.

u/Supervisor194 · 2 pointsr/exjw

I know this is probably going to be considered a weird, nonstandard answer but I'm going to give it a go anyway. One of the things I have done personally is pursue optimistic things rather than pessimistic things. The fact that not a day goes by that you "don't read something or study something that emphasizes how fucking doomed we are" is a symptom that you are perhaps subconsciously seeking out those things which reinforce your previous negative worldview.

In point of actual fact there are a lot of interesting positive things happening in the world. I highly recommend that you check out /r/futurology and in particular, there is a book I read that really blew me away about what some people think is going to happen in our lifetimes. It's called The Singularity is Near. Whether or not you accept the conclusions of the book it can really open your eyes to how much is going to change in the next few decades and it will make you think.

And I think that's the rub. You need to think about different things and seek out the positive. Because the simple fact of the matter is we are not necessarily doomed. We have very big problems, but in reality, they are all eclipsed by the fact that the sun will swallow up the Earth in 5 billion years leaving no trace of anything behind. So honestly, we are either going to transcend our biological inheritance in a big way or none of it really matters anyway. A lot of people are working towards the former, myself included.

It helped me, is all I'm saying. :)

u/FantasticBastard · 2 pointsr/AskReddit

If you're interested in the subject, I would recommend that you read The Singularity is Near by Ray Kurzweil.

u/admorobo · 2 pointsr/suggestmeabook

It's a bit dated now, but Ray Kurzweil's The Age of Spiritual Machines is a fascinating look at where Kurzweil believes the future of AI is going. He makes some predictions for 2009 that ended up being a little generous, but a lot of what he postulated has come to pass. His book The Singularity is Near builds on those concepts if you're still looking for further insight!

u/florinandrei · 2 pointsr/baduk

> AI is too dangerous and the humanity is doomed.

There are ongoing efforts to mitigate that.

https://openai.com/about/

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/

u/flaz · 2 pointsr/DebateEvolution

Okay, so that makes sense with Mormons I've met then. The "bible talking" Mormons, as I call them, seemed to me to be of the creation viewpoint. That's why I was confused about your view on it. I didn't know the church had no official position.

I read some of your blog posts. Very nice! It is interesting and intelligent. Your post about the genetic 1% is good. Incidentally, that is also why many folks are hypothesizing about the extreme danger of artificial intelligence -- the singularity, they call it, when AI becomes just a tiny bit smarter than humans, and potentially wipes out humanity for its own good. That is, if we are merely 1% more intelligent than some primates, then if we create an AI a mere 1% more intelligent than us, would we just be creating our own master? We'd make great pets, as the saying goes. I somehow doubt it, but Nick Bostrom goes on and on about it in his book, Superintelligence, if you haven't already read it.

Continuing with the "genetic 1%", it is possible we may be alone in our galaxy. That is, while abiogenesis may be a simple occurrence, if we think about the fact that in the 4.5 billion years of earth's existence there is only one known strain of life that began, it might be extremely rare for life to evolve to our level of intelligence. Some have speculated that we may be alone because we developed early. The idea is that the universe was cooling down for the first few billion years, which completely rules out life anywhere. Then another few billion years to create elements heavy enough for complex compounds and new star systems to emerge from the debris. Then the final few billion years when we came to be. Who knows?

u/mossyskeleton · 2 pointsr/Showerthoughts

If you haven't read Superintelligence by Nick Bostrom yet, you should probably read it. (Or don't, if you don't want your supercomputer/AI fears to be amplified a hundred-fold.)

u/antiharmonic · 2 pointsr/rickandmorty

He also wrote the wonderful book Superintelligence that explores routes and concerns with the possible creation of AGI.

u/TrumpRobots · 2 pointsr/artificial

If you haven't already, read this book. It really made me realize that there are very few paths forward for humanity if an AI comes online.

u/subdep · 2 pointsr/Futurology

Anybody who is seriously interested in this subject, you must read Nick Bostrom’s book: Superintelligence: Paths, Dangers, Strategies https://www.amazon.com/dp/0198739834/ref=cm_sw_r_cp_api_1PpMAbJBDD4T0

u/funkypunkydrummer · 2 pointsr/intj

Yes, I believe it is very possible.

After reading [Superintelligence](https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/ref=sr_1_1?s=books&ie=UTF8&qid=1479779790&sr=1-1&keywords=superintelligence), it is very likely that we may have whole brain emulation as a precursor to true AI. If we are on that path, it makes sense that we would be running tests in order to remove AI as an existential threat to humanity. We would need to run these tests in real, life-like simulations, that can run continuously and without detection by the emulations themselves in order to be sure we will have effective AI controls.

Not only could humans run these emulations in the future (past? present?), but the Superintelligent agent itself may run emulations that would enable it to test scenarios that would help it achieve its goals. By definition, a Superintelligent agent would be smarter than humans and we would not be able to detect or possibly even understand the level of thinking such an agent would have. It would essentially be our God with as much intellectual capacity beyond us as we have above ants. Time itself could run at nanosecond speeds for the AI given enough computational resources while we experience it as billions of years.

So who created the AI?
Idk, but that was not the question here...

u/Havelok · 2 pointsr/teslamotors

Read this and you'll understand why he's so sober about it: https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834

u/wufnu · 2 pointsr/news

I think the problem is that AIs will not think like us in any way imaginable, and what is reasonable to us may be irrelevant to an AI. There are literally hundreds of books about this, describing in excruciating detail the many, many thousands of ways we can fuck it up, with nobody even getting close to anything approaching a foolproof way of getting it right. The problem is, any screw-up will be catastrophically bad for us. Here's a cheap, easy to understand (if a bit dry) book that will describe the basics, if you're really interested.

u/Gars0n · 2 pointsr/suggestmeabook

Superintelligence by Nick Bostrom seems to be just what you are looking for. It straddles the line of being too technical for someone with no background knowledge but accessible enough for the kind of people who are already interested in this kind of thing. The book is quite thorough in its analysis, providing a clear map of potential futures and reasons to worry, but also hope.

u/skepticalspectacle1 · 2 pointsr/worldnews

I'd HIGHLY recommend reading Superintelligence. http://www.amazon.com/gp/aw/d/0198739834/ref=tmm_pap_title_0?ie=UTF8&qid=&sr= The approaching singularity event is maybe the worst thing to ever happen to mankind... Fascinating read!

u/Liface · 2 pointsr/ultimate

You're strawmanning. I am not insinuating that we should not protest or report human rights violations and social injustice — simply that identity politics is being used as a distraction by, well, both parties, but annoyingly by the left, and is disproportionately represented in present minds and the mainstream media due to human cognitive biases.

Also, your use of scare quotes around artificial intelligence risk suggests to me that you lack information and context. Not surprising, given that the issue is often treated as a joke in the public discourse.

I recommend informing yourself with at least a basic overview, and then you're free to form your own opinions. Nick Bostrom's Superintelligence is a good primer.

u/odd84 · 2 pointsr/AskReddit

Is intro to algorithms really considered one of the hardest across all undergrad?? I've taken it at another university, using the MIT text and exams, and there were many harder courses even in my major...

u/projectshave · 2 pointsr/programming

Buy this book. Code and test every algorithm in any language. Learn discrete math.

IT and web dev are trades, so you need to learn DBs and networking and sysadmin stuff. CS is engineering, so you need to learn abstract concepts and how to apply them to real problems. Lots of faux-CS programs are really trade schools. If that's your situation, then you've got decide which path you want.

u/teraflop · 2 pointsr/programming

Dynamic programming is all about optimal substructure. Wikipedia's explanation isn't that great, so if you want to really grok it, go get your hands on a copy of Introduction to Algorithms aka CLRS.
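
For a concrete illustration of optimal substructure (a toy example of mine, not from CLRS): fib(n) is assembled from optimal answers to the same, smaller problem, so caching each subproblem once collapses an exponential recursion into linear time:

    from functools import lru_cache

    # fib(n) is built from solutions to fib(n-1) and fib(n-2): the optimal
    # substructure that makes memoization (top-down DP) work.
    @lru_cache(maxsize=None)
    def fib(n: int) -> int:
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)

    print(fib(90))  # instant; naive recursion would take exponentially many steps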

u/Notlambda · 2 pointsr/dataisbeautiful

Sure. Without anything to go on, I'll just recommend some of my favorites. :)

  • Godel Escher Bach - Mindbending book that delves into connections between art, music, math, linguistics and even spirituality.
  • Code - The Hidden Language of Computer Hardware and Software - Ever wondered how the black box you're using to read this comment works? "Code" goes from transistor to a fully functioning computer in a sequential way that even a child could grasp. It's not at the "How to build your own computer from Newegg.com parts" level. It's more at the "How to build your own computer if you were trapped on a desert island" level, which is more theoretical and interesting. You get strong intuition for what's actually happening.
  • The Origin of Consciousness in the Breakdown of the Bicameral Mind - An intriguing looking into the theory that men of past ages actually hallucinated the voices of their gods, and how that led to the development of modern civilization and consciousness.
u/NothingWasDelivered · 1 pointr/computerscience

If you want a good, understandable explanation of this, read [Code](http://www.amazon.com/dp/B00JDMPOK2/) by Charles Petzold. He basically walks you through building a CPU from the ground up.

It's an excellent layperson's explanation of how computers work at a very fundamental level: how you can use relays (and transistors, their analog) to read 1s and 0s and make decisions based on that. From there you get to machine language (physically encoded into the chip), and everything above that is basically abstraction.
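
In that spirit, here's a tiny sketch (mine, not Petzold's code) of the abstraction ladder the book climbs: everything below is composed from a single NAND primitive, the same way the book composes relays into gates and gates into arithmetic:

    # One primitive, standing in for a relay/transistor circuit.
    def NAND(a, b): return 0 if (a and b) else 1

    # Every other gate built only from NAND.
    def NOT(a):    return NAND(a, a)
    def AND(a, b): return NOT(NAND(a, b))
    def OR(a, b):  return NAND(NOT(a), NOT(b))
    def XOR(a, b): return AND(OR(a, b), NAND(a, b))

    # A half adder: the first rung of "machines that do arithmetic".
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))  # 1 + 1 -> sum 0, carry 1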

u/wnoise · 1 pointr/QuantumComputing

Mike and Ike is still a great intro. Restricting to finite-dimensional systems really does let you work with concrete representations that cover most of the unintuitive bits of quantum mechanics with mere linear algebra.
https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/
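
As a taste of what "mere linear algebra" means here, this is a minimal sketch (my example, not from the book) of a single qubit as a unit vector in C^2 and a gate as a unitary matrix, using numpy:

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)        # the |0> state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate (unitary)

    psi = H @ ket0                 # apply the gate: an equal superposition
    probs = np.abs(psi) ** 2       # Born rule: measurement probabilities
    print(probs)                   # [0.5 0.5]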

u/cmonnats · 1 pointr/OMSCS

I presume this is the book you are referring to, correct? this

It seems pretty old, considering they have 10th anniversary editions out. Is it still regarded as one of the better textbooks out there for this subject matter today?

u/nullcone · 1 pointr/QuantumComputing

...also I recommend to you the textbook by Kaye, Laflamme, and Mosca. Alternatively, if you're feeling daring you can pick up the QC bible. And if you're feeling too cheap to buy books, you can find lecture notes from a lot of QC courses posted online. Check out John Preskill's website, or maybe MIT open courseware.

u/n4r9 · 1 pointr/AskReddit

This is the standard text for many quantum information and foundation courses. It introduces most of the mathematical requirements but it's very useful to have a handle on linear algebra.

For more foundational issues, this is a good start, although a little too outdated to cover the stuff talked about here.

u/DevFRus · 1 pointr/neurophilosophy

My reply was uncivil for the specific purpose of trying to motivate you to either learn the basics of quantum computing or stop talking about it.

What you were doing is equivalent to what Descartes did when he said that consciousness resided in the pineal gland, or others do by dropping the word 'emergent'. You are taking something you don't understand and can't define well scientifically (in this case consciousness) and saying it must depend on something you don't understand but is currently popular and studied by scientists (in your case: quantum computing).

However, in no way are showing an understanding of quantum computing or consciousness. Do you know what it means to "calculate in super-position states"? Have you bothered to look it up (apart from popular news articles)? Say by grabbing Nielsen & Chuang? Or are you using the word based on some 'intuitive' understanding of it which is not grounded in science?

Of course, it was very rude of me to take out my anger at the ignorance of many on just you. However, your comment embodied the worst of philosophy and popular science and that is why I reacted as such.

u/semperlol · 1 pointr/web_design

Oh my god you're a fucking moron. Did you even read my comment? If you are discussing theory and this is your reply to my comment, you have a fundamental misunderstanding of the theory. The other explanation is you read something incorrectly, which wouldn't be such a problem but then you adopt such a cunt tone in your reply.

In theory

>Anything that can be done with a regex can be done with a finite automaton, and vice versa

Where did I state that recognising an email is impossible with finite automata? If something can be recognised by a finite automaton, it can be done with a regex.

Your original comment said that you cannot do this with regex but can with finite automata, but in theory

>They are equivalent in their expressive power, they both recognise the set of regular languages.

Anybody who has a semblance of an idea of what they're talking about will agree that they are in theory equivalent. So you can do it with regex, in theory.

The article you linked but didn't read carefully states this same fact.

>And can you fully implement the complex grammars in the RFCs in your regex parser in a readable way?

It talks about the practical issues, e.g. being able to do it in a readable way with regex, because in fucking theory they are equivalent in their expressive power.
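
For what it's worth, the equivalence is easy to see concretely. Here's a minimal sketch (a toy example of mine) of a regex and a hand-built DFA accepting exactly the same regular language, (ab)*:

    import re

    regex = re.compile(r"^(ab)*$")

    # Equivalent DFA: state 0 is start/accept, 1 means "saw a", 2 is a dead state.
    TRANS = {
        (0, "a"): 1, (0, "b"): 2,
        (1, "a"): 2, (1, "b"): 0,
        (2, "a"): 2, (2, "b"): 2,
    }

    def dfa_accepts(s):
        state = 0
        for ch in s:
            state = TRANS.get((state, ch), 2)
        return state == 0

    for s in ["", "ab", "abab", "aab", "ba"]:
        assert bool(regex.match(s)) == dfa_accepts(s)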

You may find the below useful:

https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

Alternatively:

https://www.amazon.com/gp/product/B00DKA3S6A/ref=s9_acsd_top_hd_bw_b292I_c_x_5_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-3&pf_rd_r=DQJA7YYF6XRPQ9DCCW1S&pf_rd_t=101&pf_rd_p=b949820f-ff03-5be8-b745-f0a5e56b98c9&pf_rd_i=511394

https://www.amazon.com/gp/product/B001E95R3G/ref=s9_acsd_top_hd_bw_bFfLP_c_x_1_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-4&pf_rd_r=MXQ2SVBM01QEAAET2X18&pf_rd_t=101&pf_rd_p=c842552a-f9c9-5abd-8c7d-f1340c84cb6d&pf_rd_i=3733851

u/eigenheckler · 1 pointr/compsci

>If you want to go more theoretical, look into set theory, regular languages, automata and state machines.

Sipser covers these with a theoretical-leaning book: https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/IAmNotFromKorea · 1 pointr/learnmath

Then you could take Linear Algebra, Real Analysis or Abstract Algebra.

Or you could read books like Introduction to the Theory of Computation by Michael Sipser

u/umaro900 · 1 pointr/math

Can I say Sipser's Introduction to the Theory of Computation? I know there are issues people have with the book, but in terms of accessibility and ease of reading, I think this text is second to none. I mean, though it says in the preface it's designed for upper-level undergraduates or fresh grad students, it makes no assumptions on the reader's level of knowledge, and as such I would feel comfortable recommending it even to some high school students.

u/cunttard · 1 pointr/C_Programming

Start with getting the XCode developer tools and the command-line package.

C is an important language in Computer Science because it is pretty much the language for heavy duty Operating Systems, the type you see in Desktop OSes, Network OSes (the type that runs on a networking router/switch), Server OSes (Linux, BSD, Windows, etc.).

I think C is a hard language to learn, but it is a great first serious language while also simultaneously learning an easier language like shell or Python to make yourself more efficient/productive.

However, fundamental to CS is the theory of computation, not really languages. Languages are just a way to express computation. Some languages are better than others for expressing computation to solve certain problems. I would highly encourage also looking into understanding computation from first principles; a great introduction is Theory of Computation (the 2nd edition is really, really cheap used). The only background knowledge you need is high-school mathematics.

u/Holy_City · 1 pointr/learnprogramming

You'd like this book a lot.

Another one that's more hardcore EE than computer science is The Mathematical Theory of Communication. Shannon's 1948 paper, which this book highlights, is the foundation of information theory.

u/metaobject · 1 pointr/csbooks

Introduction to the Theory of Computation by Michael Sipser: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

Edit: wow, that book is more expensive than I remember. I have the 2nd edition, which can be found for a fraction of the price of the latest new edition. I'm not sure how they compare in content, though.

u/white_nerdy · 1 pointr/learnprogramming

> I want to be able to create functional programs with a bucket transistors, a cup of magnets, pen, and paper

I've heard good things about nand2tetris which goes from logic gates to a complete system with simple assembler, compiler and OS.

One good exercise might be to create an emulator for a simple system, like CHIP-8 or DCPU-16.
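
To give a flavor of what such an emulator boils down to, here's a minimal fetch-decode-execute loop in Python; the three opcodes are invented for illustration and are not the real CHIP-8 or DCPU-16 instruction set:

    # Toy "ROM": each instruction is an opcode byte plus an operand byte.
    memory = bytes([0x10, 0x05,   # LOAD 5 into the accumulator
                    0x20, 0x03,   # ADD 3
                    0xFF, 0x00])  # HALT

    acc, pc, running = 0, 0, True
    while running:
        opcode, operand = memory[pc], memory[pc + 1]  # fetch
        pc += 2
        if opcode == 0x10:    # decode + execute
            acc = operand
        elif opcode == 0x20:
            acc += operand
        elif opcode == 0xFF:
            running = False

    print(acc)  # 8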

If you want to go deeper:

  • If you want to build compilers, the dragon book is the go-to resource.

  • If you want to start learning about theory, I recommend Sipser.
u/_--__ · 1 pointr/compsci

Wow. That's quite a range - normally those would be covered in two, three, or even four courses. Any wonder the pass rate is so low.

It's difficult to recommend anything sensible for the whole course - I'd have suggested Rosen - I have found it somewhat useful as a reference text later on - but it seems to be unpopular with amazon reviewers. For the latter part (finite-state machines etc), Sipser is a common course textbook - but it may be too advanced for a starting point (though might be worth getting so you can get ahead before you get to that part of the course, which will almost certainly be the trickiest).

u/bitcoinagile · 1 pointr/Bitcoin

From deleted twitter account

Dr_Craig_Wright twitted at 2015-05-10 02:06:51.000:

For those who wonder just how far you can "push" the scripting language in Bitcoin... http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/guiraldelli · 1 pointr/compsci

Excellent! I'm glad to know the concept is clear to you.

I would recommend using the Lewis & Papadimitriou book as well as the Sipser one: for me, the former is more formal than the latter, which is more didactic (especially for undergraduate students); however, both use simple language and are very didactic.

My advice is: take both books and keep studying from them. I learned theoretical computer science from the Lewis & Papadimitriou book, but whenever I couldn't get a concept, I went to Sipser. And vice versa.

At last, the 2012 (3rd) edition of the Sipser is so beautiful, with good automata drawings to understand Pushdown Automata. :)

u/gnuvince · 1 pointr/programming

The classic Introduction to the Theory of Computation is quite excellent, though very pricey. Also, I had the pleasure of having a university class that used this book, so having a professor that can clear things up from the book was a big help.

u/TroyHallewell · 1 pointr/todayilearned

I don't remember the details. But I do remember that this book goes into a bit of detail on how communication through the drums occurs.

https://www.amazon.com/Information-History-Theory-Flood/dp/1400096235

u/steelypip · 1 pointr/DebateReligion

> What I don't see is why this distinction is particularly relevant to the point that information is a particular arrangement of matter and energy, as opposed to a third fundamental component of the universe...

You need matter and/or energy for information to be encoded, but it is something separate from any particular arrangement of matter and energy. The same information can be encoded in many different ways, and it is still the same information. Beethoven's 9th symphony is still Beethoven's 9th symphony whether it is encoded in pits burnt on a CD with a laser, an MP3 file stored on my hard drive, or a book of sheet music.

I recommend reading The Information: A History, A Theory, A Flood by James Gleick for an introduction to history of information theory as a science.

u/SchurThing · 1 pointr/math

It's a good time for history of computer science books. We had Gleick's The Information last year, and George Dyson (Freeman's son) just published Turing's Cathedral.

u/HasFiveVowels · 1 pointr/IAmA

To anyone interested in reading about this stuff, I'd very highly recommend "The Information" by James Gleick. It's probably my second favorite book and discusses information starting with African tribal villages sending messages with drums, going through to the telegraph, Shannon's creation of Information Theory, etc. A decent amount of the book is dedicated to Shannon and it's generally a great read.

u/Tamatebako · 1 pointr/suggestmeabook

I really enjoyed Richard Holmes' The Age of Wonder and also Chaos and The Information by James Gleick.

u/Navichandran · 1 pointr/ABCDesis

Yeah it was great. Alan Turing was a genius.

One of my favorite books about information theory and entropy for all the nerds out there:

http://www.amazon.com/The-Information-History-Theory-Flood/dp/1400096235

u/mike_bolt · 1 pointr/math
u/mrdevlar · 1 pointr/StonerPhilosophy

Please read this:

The Information

It will resolve your question.

u/EGKW · 1 pointr/Art

Thanks for the upvote. Have one from my side too. :-)
But that certainly isn't a semaphore station but indeed a windmill.
Semaphores and semaphore stations became somewhat popular starting from the end of the 18th century. The Ruisdael painting dates from the mid-17th century.
Furthermore, (Chappe) semaphores had 2 arms, with 1 articulation each. The Ruisdael windmill clearly has 4 blades or sails.
If you want to learn some more about semaphores and semaphore stations, there's an insightful chapter on that subject in James Gleick's book The Information.

u/k_Reign · 1 pointr/gamedev

Alright, so bookmark on the first one hahah.

Looks like I'll check that one out some more - do you know anything about Mathematics for 3D Game Programming and Computer Graphics? Thanks!

u/asleepinthetrees · 1 pointr/Unity3D

Did you ever take a look at the book Mathematics for 3D Game Programming and Computer Graphics? I've been thinking of giving it a read this summer.

u/stainedpickle · 1 pointr/gamedesign

It really depends on what you plan on working on, but a book I am currently working through is Mathematics for 3D Game Programming and Computer Graphics. It covers a lot of what tjgrant mentioned, but in the context of developing a 3D graphics engine. It also includes exercises in each chapter, so it should help as a textbook of sorts. I'm not sure how much use it would be to you without some prior exposure to the mathematical topics it contains, but it does go through the very basics of each in a pretty concise manner.

u/horsepie · 1 pointr/gamedev

Mathematics for 3D Game Programming and Computer Graphics is supposed to be the best, from a few recommendations.

I can't tell you how good it is myself, it's still on my "to read" list. I have had a look at the contents pages and a quick flick through though, and I can say that it contains pretty much everything I learnt in my Computer Graphics MSc, and a ton of other stuff.

u/Driamer · 1 pointr/learnpython

Mostly just old school googling and researching. When you have a clear problem that you're trying to solve the direction is pretty clear. I always tried to just solve the problem with just trying different stuff first, and after finding a way to solve it I researched how it could be done the correct way. That way I would always get the wow-effect, because I really appreciate the genius in the algorithms some people have come up with way back in history to solve the problem I'm having now making a stupid monster game.

Other than that there are some cool books like https://www.amazon.co.uk/dp/1435458869/ and of course Khan academy for the basics. But the main point that worked for me was trying to solve it myself first so I can really appreciate the math when I research the solutions.

u/Aeiorg · 1 pointr/GraphicsProgramming

1 ) Real-time rendering is the go-to bible for every graphic programmer. It's starting to get pretty old tho but still relevant. Then you have a lot of different books for different topics, like Real-time shadows, VR, etc. In fact what you are looking for is a Computer Graphics course/book, not an OpenGL one (or you can try OpenGL SuperBible). OpenGL is a tool (you can learn this one, or DirectX, or Vulkan ...), Computer Graphics is the science ( you learn it one time, and you use an API to apply it). You have tons of books for computer graphics, some are more into mathematics , some are more into specifics techniques, etc

2 ) OpenGL is just an API, it doesn't do anything in itself. Therefore, "the rest" is just a superset of functions that passes data differently on GPU and give you more control of your app. If you want to begin to understand why/how OpenGL is evolving, you can have a look at this supa great talk

Have fun learning !

u/invicticide · 1 pointr/gamedev

I study this book (I actually own the first edition, but the third edition is current).

math.stackexchange.com is sometimes useful as well.

You do need at least a basic understanding of algebra and trig in order for any of this to make sense. If you don't even have that, you need to learn it post-haste. Those skills are so fundamental in game dev, I cannot even express how important they are.

u/gavinb · 1 pointr/opengl

Well if you want to be the next Carmack, get cracking! :) You have a lot of ground to cover, such as: mathematics (matrices, linear algebra, etc), physics, artificial intelligence, real-time processing, multithreading, architecture, networking and protocols, rendering, sound, and much more!

It is certainly possible with enough time and dedication to develop your own engine. It's just that there are so many excellent engines already out there, that you would be competing with projects that have already invested many thousands of hours and have loads of titles already developed for them. Why not get involved with an existing project to start?

BTW I really like your idea of creating a FPS with one room and focusing on making that environment the richest possible, exploiting a wide variety of techniques. Do it!!

Is your ultimate goal to create an engine? Or to create a game? Remember, the engine is in many ways a means to an end - it's not much use without a game that uses it!

Either way, I think you would be well advised to get involved with one of the open source game engine projects, and start contributing. Once you've learned how they work, you will be in a much better position to design your own. And realistically, you can't really just design an engine without a game - you need to know how games work in the first place, and what features and architectural decisions and designs make for a good engine.

Consider joining:

u/pjsdev · 1 pointr/gamedesign

Okay, here are 4 suggestions about theory. There are plenty more, but these are a few of my favourites.

Rules of Play: Game Design Fundamentals

  • Chunky theory book and one of my favourites. Also has a companion book of essays

    Characteristics of Games

  • Really nice combination of chapters from various designers (including Richard Garfield of MtG) looking into different aspects of design.

    Game Mechanics: Advanced Game Design

  • All about systems and how resources move through them in games, and the effect that has.

    Theory of Fun for Game Design

  • Easy to read, nicely illustrated and conveys a powerful fundamental idea for game design.

    Good luck and happy reading.
u/swivelmaster · 1 pointr/gamedesign

https://www.amazon.com/Theory-Game-Design-Raph-Koster/dp/1449363210

A much faster read, with pictures, but will give you a good framework for thinking about design.

u/SebastianSolidwork · 1 pointr/gamedesign

It's hard for me to grasp what you are looking for. I'm not even sure if you are (in extension of Frasca).

I have worked on an improved differentiation: Ludonarrative Synthesis

Additionally I prefer the differentiation (on the ludo side) into the four interactive forms. Paidia matches to toy.

Simulation is to me anything that tries to be as realistic as it can be. Which is mostly boring. An interesting system or narration is never realistic.

On boredom, I recommend Raph Koster's Theory of Fun.


Most words I used here are meant in a very specific way, not in colloquial language.

u/YouAreSalty · 1 pointr/xboxone

>Feel free to lay out your case for this in great detail so I can tear it apart.

No need. Educate yourself. Start here.

Random internet user, feel free to tear apart Raph Koster's book.

>However, Rare communicated and marketed heavily. I took in all that communication and marketing and they convinced me to purchase the game. A large number of people feel that their communication and marketing did not align with what they delivered. I could get a refund, but I believe in the future of the game so I choose not to and that is my choice.

That is not the impression I get at all, that they lied. Rather, people were hoping there was more and got disappointed... This game was well covered in alpha/beta and streamed.

> I could get a refund, but I believe in the future of the game so I choose not to and that is my choice.

That is your choice, but if you are unhappy with it I suggest you do get a refund. Vote with your wallet, because voting with your voice makes a lot of us others tired of hearing the same thing over and over. It is also likely more effective.

> If I have an accurate or inaccurate opinion, it is often that the business did something to contribute to that opinion even if it was inaccurate. So no matter what it is I think, I have in fact earned the ability to express it.

As I said, there is no "earning". Everybody has a right to their opinion. Earning suggests that someone else hasn't earned it.

>Rare is better than this, so something is going on.

The one thing I was disappointed about is that they seemed to have indicated some sort of surprise with the Kraken, and then it turned out to just be a set of bodiless tentacles. I also wish the combat were a little tighter and less casual, but I think that is their design goal.

u/Krudflinger · 1 pointr/learnpython

Try out python koans for the basics.

Once I got that out of the way, I grokked through Fluent Python.

u/Drago0909 · 1 pointr/learnpython
u/danysdragons · 1 pointr/learnprogramming

As others have said, Automate the Boring Stuff is a good place to start.

Down the road if you really want to go deep into Python and want an advanced book, [Fluent Python](https://www.amazon.com/Fluent-Python-Concise-Effective-Programming/dp/1491946008/) is fantastic.

u/autisticpig · 1 pointr/Python

If you were serious about wanting some deep as-you-go knowledge of software development but from a Pythonic point of view, you cannot go wrong with following a setup such as this:

  • learning python by mark lutz
  • programming python by mark lutz
  • fluent python by luciano ramalho

    Mark Lutz writes books about how and why Python does what it does. He goes into amazing detail about the nuts and bolts all while teaching you how to leverage all of this. It is not light reading and most of the complaints you will find about his books are valid if what you are after is not an intimate understanding of the language.

    Fluent Python is just a great read and will teach you some wonderful things. It is also a great follow-up once you have finally made it through Lutz's attempt at out-doing Ayn Rand :P

    My recommendation is to find some mini projecting sites that focus on what you are reading about in the books above.

  • coding bat is a great place to work out the basics and play with small problems that increase in difficulty
  • code eval is set up in challenges starting with the classic fizzbuzz.
  • codewars single problems to solve that start basic and increase in difficulty. there is a fun community here and you have to pass a simple series of questions to sign up (knowledge baseline)
  • new coder walkthroughs on building some fun stuff that has a very gentle and friendly learning curve. some real-world projects are tackled.

    Of course this does not answer your question about generic books. But you are in /r/Python and I figured I would offer up a very rough but very rewarding learning approach if Python is something you enjoy working with.

    Here are three more worth adding to your ever-increasing library :)

  • the pragmatic programmer
  • design patterns
  • clean code

u/mitchell271 · 1 pointr/Python

Been using python for 5 years, professionally for 1. I learn new stuff every day that makes my python code more pythonic, more readable, faster, and better designed.

Your first mistake is thinking that you know the "things that every python programmer should know." Everyone has different opinions of what you should know. For example, I work with million+ entry datasets. I need to know about generators, the multiprocessing library, and the fastest way to read a file if it's not stored in a database. A web dev might think that it's more important to know SQLAlchemy, the best way to write UI tests with selenium, and some sysadmin security stuff on the side.
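
As a sketch of the generator point above (the filename and filter here are invented, not their code), a generator streams a huge file one line at a time instead of loading it all into memory:

    def error_lines(path):
        with open(path) as f:
            for line in f:            # only one line in memory at a time
                if "ERROR" in line:
                    yield line.rstrip()

    # Consumes the file lazily, even if it has millions of lines.
    for line in error_lines("app.log"):
        print(line)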

If you're stuck with what to learn, I recommend Effective Python and Fluent Python, as well as some Raymond Hettinger talks. You'll want to go back through old code and make it more pythonic and efficient.

u/KennedyRichard · 1 pointr/learnpython

Python Cookbook 3rd ed., by David Beazley and Brian K. Jones, has a dedicated chapter about metaprogramming. The book is so good that the other chapters may also give some insight on metaprogramming or alternatives. I already read it and it gave me insight into my own code on metaprogramming and other topics, so it is pretty useful. You can also find a lecture from Beazley about the topic with a quick Google search for his name and the word "metaprogramming".

There's also Fluent Python by Luciano Ramalho, which has three dedicated chapters about metaprogramming. I didn't read those chapters myself, but I'm halfway into the book and it is awesome, so I have high expectations for them.

Don't mind the metaprogramming "chapter count", it is just a piece of information. Quality is what you should be looking for. And I believe those books have it. Even though I admit an entire book about metaprogramming would be neat.
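
For a quick taste of the topic those chapters cover, here's a minimal metaprogramming sketch (my own, not from either book): a metaclass that automatically registers every subclass, a common plugin pattern:

    class PluginMeta(type):
        registry = {}

        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            if bases:  # skip the abstract base itself
                PluginMeta.registry[name] = cls
            return cls

    class Plugin(metaclass=PluginMeta):
        pass

    class JsonExporter(Plugin): pass
    class CsvExporter(Plugin): pass

    print(PluginMeta.registry)  # both subclasses, registered automatically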

u/Rhemm · 1 pointr/learnpython

Start with idiomatic python. That's a very good, short book of important Pythonic features. Also make sure to learn the standard library. You can refer to the docs for exact arguments, but you have to know all of the functions and what they do. After that you can read fluent python. It will show you how Python works, the Python data model, and I think overall it's the best Python book. Then practice, and read others' code. When you use some library or whatever, don't be afraid to dive into the source code. Good luck and happy programming

u/Acidictadpole · 1 pointr/gamedev

I highly suggest this book (Programming Game AI by Example), second chapter goes over the finite state machine with some good examples.
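
For a flavor of that chapter, here's a minimal state machine sketch in Python, in the spirit of the book's gold-miner example (the states and thresholds here are my own):

    class Miner:
        def __init__(self):
            self.state = "resting"
            self.fatigue = 0

        def update(self):
            if self.state == "resting":
                self.fatigue = max(0, self.fatigue - 2)
                if self.fatigue == 0:
                    self.state = "digging"   # rested -> back to work
            elif self.state == "digging":
                self.fatigue += 1
                if self.fatigue > 5:
                    self.state = "resting"   # tired -> go rest

    bob = Miner()
    for _ in range(10):
        bob.update()
        print(bob.state)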

u/GloryFish · 1 pointr/Unity3D

I highly recommend reading Programing Game AI by Example. It includes a chapter on modeling an AI controlled soccer team. The book itself is a fantastic primer on a variety of workhorse AI techniques that every game dev should be familiar with.

That being said, oftentimes fun gameplay can arise from simple rules + player interaction. So, read the book, but don't be afraid to just throw down some simple triggered behavior and see what's fun.


u/FrogBoiling · 1 pointr/AskReddit

I would recommend this book and this book

u/Bdee · 1 pointr/Unity3D

I had the same problem. I ride the subway every day and have a ton of time to read, so I've been trying to collect similar resources.

Here are some resources I found really helpful:

  1. Beginners book on Unity - http://www.amazon.com/Development-Essentials-Community-Experience-Distilled/dp/1849691444

    This is a VERY basic (think: learn how to code!) introduction to Unity. I personally found it too elementary, having coded in a few different languages, but it might be a good place to start as it explains basic Unity design concepts like Components, Materials, Colliders, etc.

  2. Your first Unity project (helps to have Unity accessible to follow alone) - Building a 2D RPG in Unity: http://pixelnest.io/tutorials/2d-game-unity/

    This is by far the best 'getting started' tutorial I've found. It walks you through creating a really basic project from scratch using Unity basics and scripts. This is what I based most of my code off of when I first started my project.

  3. REALLY great book on game design/physics and AI - http://www.amazon.com/Programming-Example-Wordware-Developers-Library/dp/1556220782

    This has been the most helpful resource for me. It's not Unity specific but will teach you A TON of great fundamentals for things like how to move a character, common patterns like StateMachines, how to handle AI, etc.... All of these concepts will be relevant and many are already in place in Unity so you'll recognize them right away.

    Advanced: Game Programming Patterns - http://gameprogrammingpatterns.com/

    This is a book (online/pdf/epub) that teaches the more advanced patterns you'll be applying in your code. I'd suggest this once you finish with the above resources and have been working through your game for a bit.
u/glial · 1 pointr/learnpython

They're not free but you might check out these two books.

u/Rybis · 1 pointr/gamedev

Sounds like you want something like Goal Oriented Behaviour and planning. I'm working on something that uses this.

I highly, highly recommend this book.

u/rhiever · 1 pointr/artificial

Programming Game AI by Example has a great, easy-to-understand explanation and walkthrough for learning ANNs: http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782
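
To give an idea of the starting point such a walkthrough builds from, here's a minimal single-neuron (perceptron) sketch; the task (learning AND) and the learning rate are my own choices, not from the book:

    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # learn AND

    w1 = w2 = b = 0.0
    lr = 0.1

    for _ in range(20):                       # a few training epochs
        for (x1, x2), target in data:
            out = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = target - out                # perceptron update rule
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err

    for (x1, x2), _ in data:
        print((x1, x2), 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0)  # matches AND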

Once you've learned at least ANNs, you can delve into the popular approaches to GAI:

u/z4srh · 1 pointr/gamedev

You know, a fantastic book is Programming Game AI By Example. It's fantastic for learning about AI, but the author also put a lot of effort into the code, so you can learn a lot about general game design from it as well. Well worth the price. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782 . You can download the code samples from the author's website to see what I mean. It is only 2D, so it won't help you with collision detection and some of the more 3D specific topics, but the core of the engine can be applied to anything.

One thing that is really important is to realize that there's no silver bullet. Every design decision has its benefits and its trade offs. It's easy to fall into the trap of overthinking your design, especially for personal projects - it is more beneficial for you to try to write it and have to rewrite it because of a bad design decision than to spend months trying to come up with the perfect architecture. That's not to say that you should ignore design, but rather that once you think you have a good idea, try it out, experiment, see what works and what doesn't. If you focus on having a modular design to your software, you'll find that rewrites get easier and easier.

u/HardlineStudios · 1 pointr/gamedev

This book dives nicely into exactly what you want to do and even suggests some ways to combine steering behaviors:
http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782

u/Elador · 1 pointr/cpp

It sounds to me like you need to read a book on game dev and a bit about general programming. I can recommend

u/AngryAlt1 · 1 pointr/gamedev

Programming Game AI by Example

Is really helpful for implementing agent AI, which is a big component for those types of games.

u/--aceOfSpades-- · 1 pointr/HowToHack

Continue to learn Python and C outside of school and go into more depth. It may not be what you're looking for, but try reading Hacking: The Art of Exploitation, and depending on your current knowledge of Python, Violent Python may be good for you.

u/AtomicWedgy · 1 pointr/learnpython

I did a quick search and found 2 books specifically in your current field of interest. Gray Hat Python and Violent Python

I've never read either of them, but they look interesting.

u/moomoocow · 1 pointr/AskNetsec

I recommend reading the following to get an overview:

The Basics Hacking Penetration Testing

If you want to do some programming specific (i.e. Python) try

Violent Python

u/b4xt3r · 1 pointr/Python

I enjoyed Violent Python quite a bit.

u/Miro360 · 1 pointr/hacking

You can't "Hack" something with python, python is great as a scripting language and can be used to automate some processes that would take rather a long time doing it by hand ie: "Fuzzing" and writing exploits. if you wanna start "hacking with python" you need to have more than basic knowledge and you need knowledge about what you're going to be using python on.
If the terms "Fuzzing" and exploit writing doesn't sound familiar to you then i suggest you go back and do some more research.
There's a great book on that topic though called Violent Python that should give you an idea of what you're dealing with.

u/Yukanojo · 1 pointr/cybersecurity

Google has a free python course that is great as an introductory: https://developers.google.com/edu/python/

I'd also recommend a book called Violent Python: https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579 (ISBN-13: 978-1597499576 )

Violent Python is written with cyber security in mind and has plenty of code samples where Python is applied with cyber security in mind. I'd also highly recommend following Mark Baggett on Twitter (I believe he was the technical editor for the book) as the man is an absolute Python genius. He always shares insightful info related to cyber security and usually goes into a very technical analysis of what he shares.

u/d0cc0m · 1 pointr/cybersecurity

It's never too late. I didn't get into the field until my mid 20s. It really just takes an interest and a desire to learn. Cyber security is a pretty large field so play around in the different sub-fields and find the one(s) that interest you.

Here are some resources to get you started:

Books:

u/ItalyInformatica_SS · 1 pointr/italy_SS

What you're turning to here is the wrong answer to a false problem.
Python has a steep learning curve; versions 1 and 2 are the ones that produce better software.

u/xzieus · 1 pointr/uvic

At UVic, I think there are security specializations for degrees such as the MTIS or the Computer Science Options (such as Network Security -- although I did the Software Engineering option for C.Sc. in my undergrad)

I focused on taking classes, but I did a LOT of my own (legal) research/projects. That "legal" caveat is IMPORTANT. Don't get arrested for a hobby, it doesn't achieve your goal, and it's not worth it. Do things the right way, don't trespass or break the law.

Most of the government cyber defense jobs are in Ontario -- so expect to have to move there if you want to work with them. I hear there are ... "sites" ... elsewhere, but realistically you would have to "do your time" there before anything like that became available.

Business and Finance classes are always a good idea -- not just for business but personal benefit. My wife is an accountant and those skills are really helpful to have for our daily/monthly/etc finances.

Advice

  • You have to "shoot straight" when it comes to security. Gone are the days where someone hacks the FBI and they offer him a job. Now they just arrest you and you stay there. It makes sense, why incentivise it. Don't do something that might even be construed as illegal. (With that being said, there is an argument to be made for making security education too "academic" and forgetting that people actually have to work on practical aspects -- this is outside the scope of this conversation though)
  • There are plenty of projects such as OWASP Broken Web App, classes like Elec 567 at UVic, or just learn how to make your own VMs and attack them locally (the best route -- then you can control what's installed, with a fine-tooth comb) -- this also helps test new patches, etc to see if the software is vulnerable.
  • Read. Lots. Subscribe to blogs, order books (I am partial to books such as Hacking: The Art of Exploitation (Pretty low level, but helps you understand what is going on under the hood), and Violent Python (more of a cookbook / handbook)), and read up on security news. Rule of thumb: Read at least 2 new security books every year (at a minimum) -- It gets easy when you have a dedicated app for security podcasts, RSS feeds, and you keep a book or two with you all the time.
  • When interviewing for government security jobs, don't lie to them. If they asked you if you have smoked pot, tell them if you did. They are looking for truthfulness.
  • Look at open source projects where you can contribute (general coding advice, but it helps). It doesn't have to be the Linux kernel, just work on something that isn't an assignment/project from school.
  • Learn who the big players are in security -- Like everything on the internet, there is lots of talk. Find the people who actually know what they are talking about and listen to them. Take EVERYTHING (including this post) with a grain of salt! The classic motto is "Trust but verify". This applies to everything. The security industry is ... interesting ... Think of it as a cross between the mafia (Pay us for protection ... or else), "tinfoil hattiness" (Comes with the territory -- you see a lot more than the average person, so it skews your view on certain subjects... not all of which you can even talk about), and the classic balance between privacy and security (ranges from surveillance state and anarchy) ... Politics play a HUGE part.
  • Always be learning. Show this to prospective employers. Don't just talk, do.


    Sorry, this turned into a bit of an essay. I'm just one opinion out there, but hopefully you get something out of this. As always, "trust but verify".

    [edit: a word]
u/nicmakaveli · 1 pointr/learnprogramming

Hmm, why don't you do both? I mean, Think Python is free, and I did both too. I'm sure you saw the link, but you can read the HTML here http://www.greenteapress.com/thinkpython/html/index.html and just donate. I found this very good too, it's the first one I read: http://beastie.cs.ua.edu/cs150/book/index.html but it's probably too basic for you already.

if you want to go into security

read violent python one of the best http://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

u/bombsa · 1 pointr/learnprogramming

The UK Amazon doesn't have any versions directly from you, just resellers: https://www.amazon.co.uk/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230/ref=sr_1_1?ie=UTF8&qid=1464778753&sr=8-1&keywords=Grokking+algorithms
Any chance you will be adding new versions available through Amazon?

u/Viginti · 1 pointr/learnprogramming

Some have mentioned in replies to my post that the algorithms presented in the book are either too abstract or too simple. To that I say read this: Grokking Algorithms: An illustrated guide for programmers and other curious people https://www.amazon.com/dp/1617292230/ref=cm_sw_r_cp_apa_KG7CAbE9E4243. It's written well and explains some more complex ideas in an easy-to-understand manner.

u/Ogi010 · 1 pointr/Python

can confirm, using this book in my MSCS program (my Uni doesn't have undergrad programs); I'm implementing a lot of the algorithms using custom objects in Python, so it's doing wonders for my object oriented game...

The book is also relatively straightforward to read (again, keep in mind I'm a graduate student), but other books such as the Algorithm Design Manual have been known to have easier-to-understand explanations and not be as theoretically deep.

Also, lastly, when I was dipping my toes in the water, I flipped through Grokking Algorithms, which, as a short summary with explanations of how things work, is a great place to start (but of course you won't see this book in academic environments).

u/frompdx · 1 pointr/gamedev

You bet! Although it uses Python rather than C/C++, I think Grokking Algorithms is a good read that covers both linked lists and hash tables: amazon link.
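
For a taste of the linked-list material, here's a minimal sketch of my own (in Python, like the book):

    class Node:
        def __init__(self, value, next=None):
            self.value, self.next = value, next

    head = Node(1, Node(2, Node(3)))  # 1 -> 2 -> 3

    node = head
    while node:          # O(n) traversal: follow the chain of next pointers
        print(node.value)
        node = node.next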

u/redditfend · 1 pointr/learnprogramming

There is this awesome book called Grokking Algorithms.

This should help you really well. Every concept is taught in a clear, concise manner with lots of illustrations.

u/Tiramelacoma · 1 pointr/learnprogramming

I would recommend Wengrow's or Bhargava's to learn the basics in a more pleasant way, and then continue with others that dig deeper (Cormen, Sedgewick, etc.).

I'm just following that plan actually.


^((Sorry for my bad english))

u/opaz · 1 pointr/learnjava

Thank you for suggesting this! The style of learning the book serves is right up my alley. Do you have any other suggestions of books that are just like this? One book I enjoyed reading in a similar fashion was Grokking Algorithms

u/Haatveit88 · 1 pointr/learnprogramming

I understand how you feel, honestly - as someone who did poorly in school, and I am somewhat dyscalculic, I really feel like I can relate!

The important thing for you, in my opinion, based on your explanation there, is to look for learning materials that suit you. Some people learn easily from really academic materials, some (like me) don't - and fortunately, there are lots of materials out there trying different approaches to teaching this kind of stuff. It gets easier as you go, as well - once the ball starts rolling you find it much easier to grasp future concepts. I got a massive 1300 page book called "An introduction to Algorithms" many years ago... Introduction my arse. It might as well have been alien language to me. But now, years later, I can actually understand its contents. It definitely was not an introduction (but it is a great book, both physically and literally).

A few recommendations for actual introductory books on these subjects:

"A Common-Sense Guide to Data Structures and Algorithms" by Jay Wengrow (2nd Edition coming 2020)

This book says the following in the opening chapter:

>"The problem with most resources on these subjects is that they're...well...obtuse. Most texts go heavy on the math jargon, and if you're not a mathematician, it's really difficult to grasp" . . . "Because of this, many people shy away from these concepts, feeling like they are simply not 'smart' enough to understand them."

It's not a perfect book, but it goes into a lot of basic data structures and explains them in a not-insane way. It helped me a lot! Understanding not just how they work, but why they are useful, is so helpful.

"Grokking Algorithms: An illustrated guide for Programmers and other curious people" by Aditya Y. Bhargava

A similar book, however, more algorithm and less data structure focused, and it goes into somewhat more depth, although usually the extra material is considered optional. The author here expresses a similar concern that books and learning materials on these concepts are often very hard to understand, and it need not be that way!

You can learn these things, you just need to find the right book/method that works for you! It can take some searching to find it. I know from experience!

Read the books, try to implement some of their concepts, and then try applying those things to real problems (i.e. from HackerRank or similar sites, try more than just HR). Read the book again. Repeat. You will understand a bit more each time. That was what worked for me, at least.

u/ruski550 · 1 pointr/iastate

I highly recommend this book. It takes a lot of key concepts from 228 and 311 and dumbs them down into pictures with pseudocode. I would definitely use this along with the help room if you can.

https://smile.amazon.com/dp/1617292230/ref=cm_sw_r_sms_apa_i_6KATDbSJF44PD

u/ShenmeNamaeSollich · 1 pointr/cscareerquestions

Why stay at a school where you're not studying what you want, and which doesn't even offer what you want nor one of the most popular in-demand majors??

Anyway, there are any number of online courses/tutorials about Data Structures, and how to build/use them in various languages. You can use C++ for them, or try to learn something else too. For speed & simplicity in interviews, a lot of people seem to prefer Python for discussing DS&A, but by their nature the concepts are fairly language-agnostic.

Try visualalgo for one ... there are plenty of others.

Since a lot of algorithms require/suggest the use of specific data structures to make them work, it's probably better to learn what those are first, and then try to tackle the algorithms that rely on them.

Grokking Algorithms - illustrated & pretty basic intro to concepts

Common Sense Guide to Data Structures and Algorithms - slightly less so, but still pretty basic intro to concepts

CTCI - problems covering arrays, linked lists, stacks & queues, trees, graphs ... Actually kind of useless if you don't already know what those are though.

Introduction to Algorithms (CLRS) - 1 of 2 standard U.S. college-level algorithms textbooks

Algorithms, 4th Ed. - the other standard U.S. college-level textbook, w/free online "book site", code, and a free Coursera course to go along with it.

u/ell0wurld · 1 pointr/cscareerquestions

Grokking Algorithms: An illustrated guide for programmers and other curious people https://www.amazon.com/dp/1617292230/ref=cm_sw_r_cp_api_i_LPkTCb09XVDTR

u/loops_____ · 1 pointr/cscareerquestions

>The Algorithm Design Manual

Is this another algorithms book? How does it compare to Cormen's Introduction to Algorithms or Grokking Algorithms? I tried Cormen's and Grokking's, but they were hard reads and I generally prefer YouTube videos (mycodeschool) and so on. Is The Algorithm Design Manual similar?

>EPI, and PIE

What are these?

u/plbogen · 1 pointr/compsci

CLRS is an awful book for learning or even gauging interest. Despite the love it gets in AoA classes, it is really more of a reference book than an instructional one. Not to mention not all of computer science is about algorithm analysis. I'd suggest reading Skiena instead; it is well presented and interesting. It will do much more for gauging his interest in AoA than wading through CLRS will.

In a way CLRS is like telling a kid thinking about going to grad school for Roman History to go read all of "The Decline and Fall of the Roman Empire" first. You are setting him up to be turned off.

u/SnailHunter · 1 pointr/compsci

>I highly, highly, highly recommend reading Skiena:

>http://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202/ref=sr_1_5?ie=UTF8&qid=1320821047&sr=8-5


>It's really readable, and is a really good refresher. I've actually reread it a couple times now (like, actually read like a novel, unlike what you do with CLRS), and each time I've been glad I've done so.

Just wanted to let you know that I checked this out online thanks to you and it looked good so I just bought it. I'm gonna read through it during my break.

u/hawkbearpig · 1 pointr/cscareerquestions

No internships at any point in my college or post-college career. I had a modest amount of projects, mostly relating to my coursework in school. My senior year I designed and programmed a video game with a team of fellow classmates, which was my main project on my resume.

As far as independent study goes, mainly hunting down UC Berkeley / MIT comp sci lecture series videos on YouTube and watching them, and using resources like HackerRank to work through coding exercises. I also read through The Algorithm Design Manual https://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202

u/abcininin · 1 pointr/learnprogramming

You gotta start somewhere! I'm glad that you are taking the programming classes. Programming is fun, and challenging, you'll see. For starters, just go through your course and pay close attention to (a) algorithms and (b) data structures. If you find it hard to understand, just come back here or go to r/learnpython. We are here to support you. Also, if you prefer books, I recommend this one - he talks through the concepts from problem solving and steps through pseudocode before writing a functioning program. If you prefer an online experience, try all the easy problems on LeetCode; don't get intimidated if you don't get the solutions, and don't be afraid to peek at hints and solutions.

u/mawattdev · 1 pointr/C_Programming

Sounds like you could use a good foundation in data structures and algorithms. I'd find a book on the subject and dive in.

My recommendation: The Algorithm Design Manual by Skiena.

Link: https://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1849967202

u/tsuru · 1 pointr/PHP
u/ApokatastasisPanton · 1 pointr/compsci

It's incredibly expensive as well.

For more affordable resources:

u/aagamezl · 1 pointr/devco

Jeison Higuita recommends this book for algorithms and competitive programming: Competitive Programming

Esteban Foronda recommends this book for algorithms: The Algorithm Design Manual

u/Arubadoo · 1 pointr/learnprogramming
u/infinitelyExplosive · 1 pointr/pcmasterrace

Here are some different sources for different aspects of computers.

The book Code: The Hidden Language of Computer Hardware and Software is an excellent introduction into the low-level concepts which modern CPUs are built on.

Link hopping on Wikipedia is a totally viable method to learn many aspects of computers. Start at some page you know about, like Graphics Cards or Internet, and just keep reading and clicking links.

Hacking challenges are a great way to learn about how computers work since they require you to have enough knowledge to be able to deliberately break programs. https://picoctf.com/ is an excellent choice for beginner- to intermediate-level challenges. https://overthewire.org/wargames/ also has some good challenges, but they start off harder and progress quickly. Note that these challenges will often require some programming, so learning a powerful language like Python will be very helpful.

This site is not very active anymore, but the old posts are excellent. It's very complex and advanced though, so it's not a good place to start. https://www.realworldtech.com/

In general, google will be your best friend. If you run into a word or program or concept you don't know, google it. If the explanations have more words you don't know, google them. It takes time, but it's the best way to learn on your own.

u/GilgamEnkidu · 1 pointr/explainlikeimfive

I HIGHLY recommend the book ["CODE" by Charles Petzold](http://www.amazon.com/Code-Developer-Practices-Charles-Petzold-ebook/dp/B00JDMPOK2/ref=sr_1_1?ie=UTF8&qid=1412051282&sr=8-1&keywords=code). It explains how computers and programming languages are built, starting with the simplest pieces (circuits, telegraph relays, transistors, binary, assembly, functional languages) up through almost the modern day. He also puts it all into historical context. I'm in the middle of it now and it is thoroughly interesting and elucidating.

u/sacredsnowhawk · 1 pointr/explainlikeimfive

If anyone is interested in learning about this stuff in-depth, I really recommend the book 'Code' by Charles Petzold. You won't feel like a computer moron again.

http://www.amazon.com/Code-Developer-Practices-Charles-Petzold-ebook/dp/B00JDMPOK2/ref=sr_1_1

u/toddspotters · 1 pointr/askscience

I strongly recommend that you read Petzold's book, Code: The Hidden Language of Computer Hardware and Software. It walks through the process of building circuits, using those circuits to represent information, and then gradually building those up into more complex systems, including programming languages.

u/maholeycow · 1 pointr/SoftwareEngineering

Alright man, let's do this. Sorry, had a bit of a distraction last night so didn't get around to this. By the way, if you look hard enough, you can find PDF versions of a lot of these books for free.

Classic computer science principle books that are actually fun and a great read (this is the kind of fundamental teaching you would learn in school, but I think these books teach it better):

  1. https://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2 - this one will teach you at a low level about 1's and 0's and logic and all sorts of good stuff. The interoperation of hardware and software. This was a fun book to read.
  2. https://www.nand2tetris.org/book - This book is a must in my opinion. It touches on so many things, such as Boolean logic, machine language, architecture, compiling code, etc. And it is f*cking fun to work through.

    Then, if you want to get into frontend web development for example, I would suggest the following two books for the fundamentals of HTML, CSS, and JavaScript. What I like about these books is they have little challenges in them:

  3. https://www.amazon.com/Murachs-HTML5-CSS3-Boehm-Ruvalcaba/dp/1943872260/ref=sr_1_2_sspa?keywords=murach%27s+html5+and+css3&qid=1557323871&s=books&sr=1-2-spons&psc=1
  4. https://www.amazon.com/Murachs-JavaScript-jQuery-3rd-Ruvalcaba/dp/1943872058/ref=sr_1_1_sspa?keywords=murach%27s+javascript&qid=1557323886&s=books&sr=1-1-spons&psc=1

    Another great book, which teaches the fundamentals of coding and how to think like a programmer using an extremely flexible programming language, Python, is this one (disclaimer: I haven't read this one, but have read other Head First books, and they rock. My roommate read this one and loved it though):

  5. https://www.amazon.com/Head-First-Learn-Code-Computational/dp/1491958863

    Let me know if you want any other recommendations when it comes to books on certain areas of software development. I do full stack web app development using .NET technology on the backend (C# and T-SQL) and React in the frontend. For my personal blog, I use vanilla HTML, CSS, and Javascript in the frontend and power backend content management with Piranha CMS (.NET Core based). I often times do things like pick up a shorter course or book on mobile development, IoT, etc. (Basically other areas from what I get paid to do at work that interest me).

    If I recommended the very first book to read on this list, it would be the Head First book. Then I would move over to the first book listed in the classic computer science section if you wanted to go towards understanding low-level details, but if that's not the case, move towards implementing something with Python, or taking a Python web dev course on Udemy.

    Other really cool languages IMO: Go, C#, Ruby, Javascript, amongst many more

    P.S. Another book from someone that was in a similar situation to you: https://www.amazon.com/Self-Taught-Programmer-Definitive-Programming-Professionally-ebook/dp/B01M01YDQA/ref=sr_1_2?keywords=self+taught+programmer&qid=1557324500&s=books&sr=1-2
u/terraneng · 1 pointr/learnprogramming

Code by Charles Petzold

Pretty low level but a nice read and very informative. I read it on a plane last year.

u/ceciltech · 1 pointr/AskElectronics

If you really want to understand how a processor works, I recommend Code: The Hidden Language of Computer Hardware and Software. This is a great book that works its way through the history of the development of processors and the code that runs them; an easy read and so well written!

u/di0spyr0s · 1 pointr/resumes

Thanks so much!

Where do hobbies and interests go? Below Education somewhere? Sample stuff I could add:

  • I started sewing this year and have achieved my goal to knit and sew all my own clothes for 2015.
  • I play guitar, drums, and piano, and I'm learning to play bass. A friend and I started a band called OCDC, because we're n00bs and play the same thing over and over a lot.
  • I read insatiably. Most recently Code: The Hidden Language of Computer Hardware And Software and A Guide to the Good Life: The Ancient Art of Stoic Joy, but also the backs of cereal packages and the "In case of fire" escape instructions on doors if there's nothing else.
  • I'm from New Zealand and can, if necessary, butcher a sheep/pig/deer/rabbit, build a fence, milk a cow by hand (or milk several hundred, given a decent sized milking shed), TB test deer, fell trees, and use the word "munted" in a sentence.
  • I've ridden horses all my life and still volunteer occasionally as an equine masseuse for some of the carriage horses in Central Park.
  • I love automating stuff and am working on fully automating my home aquaponics set up: a combination of an aquarium and a grow bed which currently produces great quantities of grass for my cats to puke up.

    I had sort of planned to put all this stuff in my personal website - write ups of personal projects, a good reads feed, an "About me" section, and maybe a page of my sewing/knitting creations.

    I'll certainly look into adding some more personality into the resume design, it is currently the result of a google template, which is pretty blah.

    Again, Thanks so much for your feedback! It's been really helpful!
u/awj · 1 pointr/programming

Seriously, that book is awesome. It seems like, for a while, my CS professors were able to pick whatever books they wanted in teaching their classes. So we had theory of computation with Sipser, algorithms and complexity with this book (not sure if it has a shorthand reference), and compilers with the Appel book by preference, with several others acceptable.

Then, I would guess, someone complained about "difficulty". The administration got involved and things are more "approachable" now. Not so wonderful anymore.

Also, for me this book handled abstract algebra in much the same way that Sipser handles the theory of computation.

u/fatso784 · 1 pointr/compsci

Anything can be self-taught if you're determined enough. Buy a good book on algorithms (http://www.amazon.com/Introduction-Algorithms-Second-Thomas-Cormen/dp/0262032937) and watch http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/. That book is definitive, so if you finish it and learn it, it's all you'll really need to get ahead of most programmers.

In terms of math, you'll pick it up. Recursion and Big-O notation were surprisingly easy to understand after watching a few lectures, not even looking at the book.

u/mycall · 1 pointr/programming

Why is that version $113.29 yet this one is $65?

If you can't afford either, try this.

u/Jaco__ · 1 pointr/AskProgramming

Haskell Programming from first principles
Excellent choice if you are kinda new to programming. Really thorough.

Programming in Haskell

SICP

u/MrWhis · 1 pointr/AskProgramming

On the internet I'm seeing an unofficial textbook version that is free, while on Amazon they sell the original. Are they the same book or should I grab one in particular?

u/letrec · 1 pointr/java
u/lingual_panda · 1 pointr/cscareerquestions

Thank you! I'm actually realizing that I learn more from my textbook than I do from the lectures for my intro class, so after my midterm tomorrow I think I'm going to focus on reading ahead for the next couple weeks and spending the second half of summer getting a handle on data structures and algorithms.

Aaand I just looked it up, the data structures course just has a C reference book and the algorithms course doesn't have a required textbook. I'll have to look into which textbooks are commonly used for those classes elsewhere.

I'm also planning to read MIT's Structure and Interpretation of Computer Programs at some point, cause I heard it was a really good intro book. Should I try to get through that first and then hit up data structures and algorithms?

u/spaghetti_slinger · 1 pointr/cscareerquestions

Books:

The Pragmatic Programmer ($30)

Clean Code ($40)

Structure and Interpretation of Computer Programs ($40)

T Shirts ($20-30):

Works on my machine

Hello world

Other:

A nice coffee mug or teapot for the office

Nerdy desk top calendar

u/eat_those_lemons · 1 pointr/haskell

That makes sense. I guess I am slightly despairing because it is such a huge project and I am the sole developer, and because of how much it is all tied together it takes a full day just to add a simple button and propagate the changes all through the code base.

Thanks for the a) reassurance and b) the recommendation

When you say SICP do you mean this book? https://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871/ref=asc_df_0262510871/

u/Hdandhf · 1 pointr/cscareerquestions

Check out Structure and Interpretation of Computer Programs.

https://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871

I'm pretty sure you can find a PDF for free, if that's too much.

u/NicolasGuacamole · 1 pointr/learnprogramming

Buy this book and read them a chapter every night as a bedtime story. It has wizards and suchlike so is very workable for children.

SICP

u/all_reddits_are_mine · 1 pointr/college

Exactly. There's a low market for course books in India, so the prices are really low (as with all international editions). Almost all of my friends haven't heard of any books like TAOCP, and SICP. Everything's so cheap! They're missing out on so much!

u/ForeverAlot · 1 pointr/PHP

These are insightful questions, and the answers are less obvious than it might seem. There are subtleties that for various reasons I can't and won't cover here, and the best advice I can give you is to read about the subject(s). The second best advice I can give you is to search Google like mad, because there's a lot of stuff out there and I don't have time to find other books.

  • Scott Meyers' Effective C++ series are aimed at C++ but contain some general purpose advice w.r.t. API design and code structuring (in contrast, Herb Sutter's stuff is almost purely C++, so don't bother with that).
  • Eric Evans' Domain-Driven Design is a little more high-level than your questions but a good read.

    > but setters should be protected, for extensibility

How often do you extend your classes? If it's often, it's probably too much -- favour composition ("has-a") over inheritance ("is-a"). I declare all my classes final to start with and I rarely have to change that. Moreover, your private members are the only ones you have complete control over -- as soon as something is protected, a deriving class can use it, and then the member is effectively part of the public API after all. For this reason, protected is nearly as bad as public. This is one of the things Scott Meyers talks about.

    > Furthermore, do I even need setters if my class is going to be the only thing setting its variables?!

    Automatic code generation is one of the least helpful things IDEs have given us and, I contend, a direct source of much shitty code. Make no setter or getter until it is absolutely needed, and provide neither for public fields (that will cause confusion).

    > Should I ever be setting object variables outside the class?!

    "It depends". There are valid reasons for it, and of course I can't think of any off the top of my head. What you should absolutely avoid is the ability to construct a Foo without a Bar when Foo cannot work without Bar. In this case, it's a constructor dependency, and a Foo::setBar() implies that Foo's Bar instance can be in an invalid state.
u/ThePsion5 · 1 pointr/laravel

No problem! I've been reading this one, and it's an awesome resource: http://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

Aside from that, Domain-driven design gets occasionally discussed over at http://laracasts.com, and there's a google group specifically for using DDD in php: https://groups.google.com/forum/#!forum/dddinphp

u/kubr0t0 · 1 pointr/programming

> That's exactly a multi-agent simulation.

Well yeah, that was kind of my point. At least in biology, this isn't unusual.

> For most processes this is a too low level representation to be
> useful.

I find it very useful, and I've modeled pretty much all business processes that I've implemented this way. Eric Evans's excellent book Domain Driven Design describes some very useful ways to reason about business processes, in effect, as multi-agent simulations (if you call an agent an entity in DDD terms).

u/K60d56 · 1 pointr/programming

> I don't know what the fuck you're actually advocating.

Try to get some more experience, kid. Especially with things like Dependency Inversion and Domain Driven Design. Then maybe you'll understand.

> Put adaptors on all the things so you don't depend on them, but
> you're not passing interfaces to everything.

You don't create such fine grained interfaces that wrap around a single library in particular. That would make the adapter nothing more than a call translator. You would end up with a bunch of interfaces around every little library you need, and that would be dumb.

The whole point is that your code represents some business functionality, and the architecture should be dependent on the rules of the business. At some point you'll need help with some aspect of the business or another. You write an interface to describe that need. The adapter simply fulfills that need.

The adapter will probably have functionality of its own, and depending on the system, perhaps not trivial. But that functionality is isolated from the rest of the system. It can be replaced at some cost that is isolated.

J2EE application servers do a lot of work, and the EJBs leveraged that capability (like clustering, distributed transactions, etc.). We had to find replacements for those capabilities. The replacement wasn't trivial. But the application itself couldn't care less about most of that. So the risk to the application was substantially reduced, which allowed us to focus on specific, isolated areas without risking the entire application.

> Then you talk about the one time that architecture saved you
> time and not the five hundred times it didn't.

Prove it.

> If you're putting all libraries in an adaptor then you're interfacing
> all the things and your constructors will be huge or nested.

The number of interfaces depends on how fine grained your abstractions are. You don't want your abstractions to be too fine grained, or you'll have a ton of dependencies. If you end up creating 15 interfaces in your constructor, that means you have way too many dependencies. That's a definite code smell and a definite sign you need to rethink your abstractions.

> If you're transforming your data

And what data would that be? For the most part, we are talking about parameters.

> you've got application logic in your adaptor which means
> replacing it is not low risk.

Not application logic, which is another domain, but logic related to the domain of the library. Because that logic is isolated and independent of the business rules of the application, it is a lot easier to replace.

> If you're encapsulating your errors you've broken your stack
> trace.

Nope. The error is simply being recast into the application's domain. That is, if it needs to be recast into the application's domain at all. The adapter can do much of the error handling itself, if that error handling falls into its domain.

> Separating your business logic from your presentation layer has
> nothing to do with being dependent on libraries.

It has everything to do with it. The separation of the presentation layer from business logic is small potatoes compared to what I'm talking about. The Domain Driven Design book has a great section about the Anti-Corruption Layer, which describes how to protect an application's architecture from the architecture of other libraries and systems.

Libraries that represent an entire domain separate from the business domain shouldn't affect its architecture. If the library is substantial enough that it could affect the architecture of the application and, more importantly, is a risk to the application, it belongs to a different domain, and the domains should be separated to reduce risk. We're not talking about apache-commons-lang, which just does string manipulation.

u/toffeescaf · 1 pointr/javascript

I would recommend looking into Domain Driven Design (DDD). A book a lot of people recommend is Domain Driven Design by Eric Evans but sometimes it's hard to translate the examples in the book to something you can apply in JavaScript. However you can probably find articles about DDD in JavaScript if you google for it.

u/signalling · 1 pointr/ExperiencedDevs

I’m fairly certain (sorry if I’m wrong) he means the book by Eric Evans:
https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

u/zem · 1 pointr/learnprogramming

Non-programming, but for the history of computers, my two favourite books are Where Wizards Stay Up Late and Dealers of Lightning. As a programmer, you will definitely appreciate and be inspired by both of them.

For programming, Sandi Metz's Practical Object-Oriented Design in Ruby is a good read, and possibly worth it even if you're not a Ruby programmer.

u/PLanPLan · 1 pointr/ukpolitics

Get a copy of "Where Wizards Stay Up Late", great history of where the Internet came from and the key people involved.

u/ancap_throwaway0424 · 1 pointr/CapitalismVSocialism
u/emilvikstrom · 1 pointr/todayilearned

I can highly recommend the book Where wizards stay up late to anyone who wants a more complete picture of the history of the Internet. It can be a bit dry at times but the story in itself is interesting enough that I could hardly put down the book.

u/EtherMan · 1 pointr/KotakuInAction

> For your reading pleasure: http://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674 Knowledge is power.

Thanks. But I work in that field and am well versed in what the internet is, what it is not, and its history. ARPANET is not the internet. It never was, and never will be. As I said, it's a completely different network for a completely different purpose. The TECH used for the development of ARPANET was repurposed to create the internet. But saying that it therefore is the internet is like saying a car is a horse cart. It's simply not true.

> The OP did, when contending that the rough-and-tumble days of Usenet justified, even necessitated, a similar treatment of the ideology of "safe spaces" today.

No he didn't. You trying to cram that interpretation into it does not make it one. Nor was that what he said. Why are you lying?

> Our legal system disagrees with you, vehemently. Here is a children's primer on the subject, which seems about your speed:

No it does not. There are plenty of rulings from multiple courts at multiple levels regarding this. Free speech has nothing to do with what a company can and cannot do with their platforms. Free speech is about free speech, nothing else. That's not to say they're not allowed to control their platform; it simply has nothing to do with free speech. That a company has no obligation to let you use its platform for your free speech is COMPLETELY different from the company having free speech rights. Is the education on the difference between free speech and the first amendment REALLY so bad in the US that even people in Europe know it better? You're just being silly now... Seriously...

u/tealeg · 1 pointr/compsci

Generally speaking there are very few books with the effortless, readable style that "Hackers" has, and those that do tend to be more specific in focus.

On Free Software this is OK: http://oreilly.com/openbook/freedom/

On Apple, avoid the Steve Jobs love-in and get the older exploration of what went wrong in the first place: http://www.amazon.com/Infinite-Loop-Michael-Malone/dp/0385486847

Microsoft was dull as dishwater after the early years (already covered in "hackers").

For the roots of the Internet you want this:
http://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674/ref=sr_1_1?s=books&ie=UTF8&qid=1333743593&sr=1-1

u/Hurkamur · 1 pointr/worldpolitics

Read a book for once in your miserable, ignorant life you pathetic fuckwit.

u/ericGraves · 1 pointr/askscience

I wasn't aware that we had a standardized protocol on quantum key distribution.

It is surprising that they would standardize an abort for no reason.

Edit-- Cannot find anything on this standard. By "they" do you refer to the authors of BB84? Because, as I linked, they did later work showing that by privacy amplification you can handle a good amount of error from adversary observations. If you have access to Mike and Ike, they explicitly discuss privacy amplification as part of BB84. And it handles a large fraction of errors.

edit-- I need to relax.

u/Quadra_Slam · 1 pointr/IAmA

Just Googled the textbooks, as I do not really know the field so well, but are you referring to this book?

u/-nirai- · 1 pointr/philosophy

As far as we know quantum computation does not transcend Turing computability. The following is from Nielsen's book on Quantum Computation:

> quantum computers also obey the Church–Turing thesis. That is, quantum computers can compute the same class of functions as is computable by a Turing machine. The difference between quantum computers and Turing machines turns out to lie in the efficiency with which the computation of the function may be performed

u/HelloAnnyong · 1 pointr/askscience

You're either a troll or you're just exceptionally bad at what you do. Pick up a copy of Nielsen and Chuang if you actually want to learn this stuff. Otherwise I'm done responding to you.

Scott Aaronson also has an excellent series of lectures available online which may help you.

u/SoSweetAndTasty · 1 pointr/AskPhysics

Books like Griffiths' quantum or Nielsen and Chuang's quantum information? From the sound of your post, you have some large gaps in your understanding.

u/c3261d3b8d1565dda639 · 1 pointr/programming

> I know basically nothing about x86 internals to make an accurate statement

If you're interested in learning about the internals, check out some Real World Technologies articles. For instance, Intel's Haswell CPU Microarchitecture. On page 3, Haswell Out-of-Order Scheduling, it talks about the register renaming that goes on to support out-of-order execution.

It's more detail than most people really need to know, but it's interesting to understand what modern microprocessors are doing under the hood during program execution.

For anyone else reading, an even easier introduction to the topic is in the awesome Computer Systems: A Programmer's Perspective. It'll get you comfortable with the machine language representations of programs first, and then move on to basic architecture for sequential execution, and finally pipelined architecture. It's a solid base to move forward from to modern architecture articles like on real world technologies. There are more detailed treatments if you're really interested, e.g. Computer Architecture, A Quantitative Approach, but I have never read it so can't say much about it.
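
If it helps to see the renaming idea in miniature, here's a toy sketch in C (my own illustration, nothing like real hardware): every write to an architectural register is given a fresh physical register, which is exactly what removes false write-after-write and write-after-read dependencies between instructions.

```c
#include <stdio.h>

#define NUM_ARCH 4   /* architectural registers r0..r3 */

static int rename_table[NUM_ARCH]; /* arch register -> current phys register */
static int next_phys = NUM_ARCH;   /* naive allocator: never recycles */

/* Rename one instruction "rd = rs1 op rs2" and print the mapping. */
static void rename(int rd, int rs1, int rs2) {
    int p1 = rename_table[rs1];    /* sources read the current mappings */
    int p2 = rename_table[rs2];
    int pd = next_phys++;          /* destination gets a fresh register */
    rename_table[rd] = pd;
    printf("r%d = r%d op r%d   ->   p%d = p%d op p%d\n",
           rd, rs1, rs2, pd, p1, p2);
}

int main(void) {
    for (int i = 0; i < NUM_ARCH; i++) rename_table[i] = i;
    rename(0, 1, 2);  /* r0 = r1 op r2 */
    rename(0, 3, 1);  /* reuses r0: a WAW hazard without renaming */
    rename(2, 0, 0);  /* correctly reads the newest r0 mapping */
    return 0;
}
```

Because the second write to r0 lands in a different physical register, the first two instructions could execute in either order, which is the freedom an out-of-order core exploits.
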

u/bobj33 · 1 pointr/hardware

Computer Architecture: A Quantitative Approach

This is THE classic textbook on CPU design. It was written by John Hennessy, the creator of the MIPS CPU, from Stanford, and David Patterson, the creator of the SPARC CPU, from Berkeley. I've probably talked to 50 engineers over the years and everyone used the same book. I interviewed new grads last year and they are still using the same book.

https://en.wikipedia.org/wiki/John_L._Hennessy

https://en.wikipedia.org/wiki/David_Patterson_%28computer_scientist%29


This is the current 5th edition.

http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/

I used the 2nd edition back in 1997. You can get it for 1 penny used. Considering that 3 of the students I interviewed last year did the exact same 3 projects I did 18 years before, I'm not sure that you need the new edition!

http://www.amazon.com/Computer-Architecture-Quantitative-Approach-Second/dp/1558603298/


There are other books on analog circuits, logic design, layout etc. I linked to a very basic intro to layout book elsewhere in the thread.

The sophomore-level intro to digital logic design class used this book. This explains AND/OR gates and flip-flops, which are the foundation of any CPU, GPU, or ASIC.


http://www.amazon.com/Contemporary-Logic-Design-Randy-Katz/dp/0201308576/

u/invalid_dictorian · 1 pointr/hardware

Unless you're trying to squeeze every last ounce of performance out of a fixed system (e.g. say you're reprogramming the firmware for Voyager I and II), writing software to only work on a specific piece of hardware is generally a bad idea. Yeah, it's nice and interesting to know how Sandy Bridge's branch prediction or reorder buffer works, but if your code depends on that, then you're signing yourself up for a software-maintenance nightmare.

As for where to learn about CPUs, go with textbooks. Go with Hennessy & Patterson. That edition is from 2011. I graduated more than a decade ago and have the 2nd edition, which talks a lot about the MIPS architecture. I don't know if this latest edition has recent updates. However, as my professor liked to say, most of the ideas for speeding up CPUs were invented in the '60s. It's just that fabrication technology has improved enough in recent times to allow those ideas to actually be implemented. The more recent trends are all about power savings: clock scaling, turning parts of the processor on and off when not in use, and a lot of micromanaging of all the different parts of the CPU.

Wikipedia has lots of details too, but you have to know the terms to search to find the articles. Here's a few terms off the top of my head that might help you get started: Branch Prediction, mis-prediction penalty, instruction bubbles, Cache, Memory Subsystem, TLB, Register Renaming, Reorder Buffer, ALUs, execution units. Vector processors vs. Scalar processors, Super-scalar processors. SIMD (single instruction multiple data), data hazards, instruction prefetching, simultaneous multithreading (SMT) aka hyperthreading, IO virtualization.

u/remember_khitomer · 1 pointr/askscience

A college textbook. I used an earlier edition of this book and thought it was excellent:

http://www.amazon.com/Computer-Architecture-Fifth-Edition-Quantitative/dp/012383872X

u/happyscrappy · 1 pointr/SubredditDrama

Just because people use other words most of the time doesn't mean one word doesn't cover the other.

Storage is memory.

If you read Hennessy & Patterson (and you should), a big feature of the book is the memory hierarchy, which includes mass storage. It even includes off-line storage.

u/jmknsd · 1 pointr/hardware

I learned mostly from:

http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

But this has a lot of information in it, and was the book for the prerequisite of the class I took that used the above book:


http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938

u/MrGreggle · 1 pointr/AskMen

This book will explain how you go from a transistor to all of the basic components of a CPU: https://www.amazon.com/Digital-Design-RTL-VHDL-Verilog/dp/0470531088

This book will explain all kinds of advanced processor design and largely explain how the vast majority of modern processors work: https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269/ref=sr_1_1?s=books&ie=UTF8&qid=1484689485&sr=1-1&keywords=computer+organization+and+design

There really aren't any prerequisites; you can just read those 2 books.

u/hell_0n_wheel · 1 pointr/Python

All I can recommend are the texts I picked up in college. I haven't looked for any other resources. That being said:

http://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

This text is really only useful after learning some discrete math, but is THE book to learn algorithms:

http://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844

This one wasn't given to me in college, but at my first job. Really opened my eyes to OOD & software architecture:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

u/AsteriskTheServer · 1 pointr/learnprogramming

IMO the best book that gives a general overview of computer architecture is Computer Organization and Design by David A. Patterson and John L. Hennessy. That being said, this is a difficult book. However, it covers everything from how the memory hierarchy works to virtual memory to the datapath by which instructions are executed. Is this going to tell you everything you need to do pen testing and so forth? Not a chance.

u/ravenorl · 1 pointr/ucf

I had to look it up. It's the intro to computer architecture class.

Here's the textbook on Amazon without an affiliate link -- https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

Not because I think you're going to buy it on Amazon, but because you can "Look Inside" from that link.

It looks like they're still teaching from an old edition of that textbook. I would have converted to the RISC-V edition by now.

Sad.

That's a good textbook, though. The authors are legends (literally, look them up) in computer architecture. I actually prefer their other textbook (Computer Architecture: A Quantitative Approach), but it's not suited for an intro class.

Have fun.

u/zweischeisse · 1 pointr/jhu

These are what I had. Different professors might use different books, obviously.

u/laneLazerBeamz · 1 pointr/ComputerEngineering

We use Vivado in school and they teach Verilog. My impression is that VHDL is more of an industry standard, but I'm still a student, so don't quote me on that. The way my university introduced digital logic was by having us start at the logic-gate level, then use those modules to make state machines; last semester we made a MIPS processor.


Vivado (web pack should be free)
https://www.xilinx.com/products/design-tools/vivado.html

Here is the book we used for the processor
https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

u/dnabre · 1 pointr/asm

Apparently the sidebar needs additions; there is nothing there for MIPS.

edit Asked a colleague about online courses; Programmed Introduction to MIPS Assembly Language was recommended. Looks well paced, and even has little quizzes you take to test your understanding as you go.

Books though, I can help with:

For high level theory and general architecture, the go-to book is Computer Organization and Design by Patterson & Hennessy. It uses MIPS for the examples and the circuit diagrams throughout. I think it's in its 5th edition. There are a few chapters at the end about where computer architecture is going and such (unrelated to MIPS) that change between editions. University libraries will definitely have this, possibly even public ones. This text is the standard for college-level computer science courses on computer architecture, and has been for something in the ballpark of 20 years.

For practical coding, I'd recommend [See MIPS Run by Dominic Sweetman](https://smile.amazon.com/Morgan-Kaufmann-Computer-Architecture-Design/dp/0120884216). It's in its 2nd edition, which I haven't read, so I don't know if it offers anything more than the first. The [first edition](https://smile.amazon.com/Morgan-Kaufmann-Computer-Architecture-Design/dp/1558604103) can be had used for next to nothing. It's especially good if you're writing real MIPS assembly on Linux as opposed to writing it on a simulator.

u/empleadoEstatalBot · 1 pointr/argentina

> # Teach Yourself Computer Science
>
>
>
> If you’re a self-taught engineer or bootcamp grad, you owe it to yourself to learn computer science. Thankfully, you can give yourself a world-class CS education without investing years and a small fortune in a degree program 💸.
>
> There are plenty of resources out there, but some are better than others. You don’t need yet another “200+ Free Online Courses” listicle. You need answers to these questions:
>
> - Which subjects should you learn, and why?
> - What is the best book or video lecture series for each subject?
>
> This guide is our attempt to definitively answer these questions.
>
> ## TL;DR:
>
> Study all nine subjects below, in roughly the presented order, using either the suggested textbook or video lecture series, but ideally both. Aim for 100-200 hours of study of each topic, then revisit favorites throughout your career 🚀.
>
>
>
>
>
> | Subject | Why study? | Best book | Best videos |
> | --- | --- | --- | --- |
> | Programming | Don’t be the person who “never quite understood” something like recursion. | Structure and Interpretation of Computer Programs | Brian Harvey’s Berkeley CS 61A |
> | Computer Architecture | If you don’t have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle. | Computer Organization and Design | Berkeley CS 61C |
> | Algorithms and Data Structures | If you don’t know how to use ubiquitous data structures like stacks, queues, trees, and graphs, you won’t be able to solve hard problems. | The Algorithm Design Manual | Steven Skiena’s lectures |
> | Math for CS | CS is basically a runaway branch of applied math, so learning math will give you a competitive advantage. | Mathematics for Computer Science | Tom Leighton’s MIT 6.042J |
> | Operating Systems | Most of the code you write is run by an operating system, so you should know how those interact. | Operating Systems: Three Easy Pieces | Berkeley CS 162 |
> | Computer Networking | The Internet turned out to be a big deal: understand how it works to unlock its full potential. | Computer Networking: A Top-Down Approach | Stanford CS 144 |
> | Databases | Data is at the heart of most significant programs, but few understand how database systems actually work. | Readings in Database Systems | Joe Hellerstein’s Berkeley CS 186 |
> | Languages and Compilers | If you understand how languages and compilers actually work, you’ll write better code and learn new languages more easily. | Compilers: Principles, Techniques and Tools | Alex Aiken’s course on Lagunita |
> | Distributed Systems | These days, most systems are distributed systems. | Distributed Systems, 3rd Edition by Maarten van Steen | 🤷 |
>
> ## Why learn computer science?
>
> There are 2 types of software engineer: those who understand computer science well enough to do challenging, innovative work, and those who just get by because they’re familiar with a few high level tools.
>
> Both call themselves software engineers, and both tend to earn similar salaries in their early careers. But Type 1 engineers grow into more fulfilling and well-remunerated work over time, whether that’s valuable commercial work or breakthrough open-source projects, technical leadership or high-quality individual contributions.
>
>
>
> Type 1 engineers find ways to learn computer science in depth, whether through conventional means or by relentlessly learning throughout their careers. Type 2 engineers typically stay at the surface, learning specific tools and technologies rather than their underlying foundations, only picking up new skills when the winds of technical fashion change.
>
> Currently, the number of people entering the industry is rapidly increasing, while the number of CS grads is essentially static. This oversupply of Type 2 engineers is starting to reduce their employment opportunities and keep them out of the industry’s more fulfilling work. Whether you’re striving to become a Type 1 engineer or simply looking for more job security, learning computer science is the only reliable path.
>
>
>
>
>
> ## Subject guides
>
> ### Programming
>
> Most undergraduate CS programs start with an “introduction” to computer programming. The best versions of these courses cater not just to novices, but also to those who missed beneficial concepts and programming models while first learning to code.
>
> Our standard recommendation for this content is the classic Structure and Interpretation of Computer Programs, which is available online for free both as a book, and as a set of MIT video lectures. While those lectures are great, our video suggestion is actually Brian Harvey’s SICP lectures (for the 61A course at Berkeley) instead. These are more refined and better targeted at new students than are the MIT lectures.
>
> We recommend working through at least the first three chapters of SICP and doing the exercises. For additional practice, work through a set of small programming problems like those on exercism.
>
> For those who find SICP too challenging, we recommend How to Design Programs. For those who find it too easy, we recommend Concepts, Techniques, and Models of Computer Programming.
>
>
>
> [Structure and Interpretation of Computer Programs](https://teachyourselfcs.com//sicp.jpg)
>
>
>
> ### Computer Architecture
>
> Computer Architecture—sometimes called “computer systems” or “computer organization”—is an important first look at computing below the surface of software. In our experience, it’s the most neglected area among self-taught software engineers.
>
> The Elements of Computing Systems, also known as “Nand2Tetris” is an ambitious book attempting to give you a cohesive understanding of how everything in a computer works. Each chapter involves building a small piece of the overall system, from writing elementary logic gates in HDL, through a CPU and assembler, all the way to an application the size of a Tetris game.
>
> We recommend reading through the first six chapters of the book and completing the associated projects. This will develop your understanding of the relationship between the architecture of the machine and the software that runs on it.
>
> The first half of the book (and all of its projects), are available for free from the Nand2Tetris website. It’s also available as a Coursera course with accompanying videos.
>
> In seeking simplicity and cohesiveness, Nand2Tetris trades off depth. In particular, two very important concepts in modern computer architectures are pipelining and memory hierarchy, but both are mostly absent from the text.
>
> Once you feel comfortable with the content of Nand2Tetris, our next suggestion is Patterson and Hennessy’s Computer Organization and Design, an excellent and now classic text. Not every section in the book is essential; we suggest following Berkeley’s CS61C course “Great Ideas in Computer Architecture” for specific readings. The lecture notes and labs are available online, and past lectures are on the Internet Archive.
>
>
>
>
>
> ### Algorithms and Data Structures
>
> We agree with decades of common wisdom that familiarity with common algorithms and data structures is one of the most empowering aspects of a computer science education. This is also a great place to train one’s general problem-solving abilities, which will pay off in every other area of study.
>
> There are hundreds of books available, but our favorite is The Algorithm Design Manual by Steven Skiena. He clearly loves this stuff and can’t wait to help you understand it. This is a refreshing change, in our opinion, from the more commonly recommended Cormen, Leiserson, Rivest & Stein, or Sedgewick books. These last two texts tend to be too proof-heavy for those learning the material primarily to help them solve problems.
>

> (continues in next comment)

u/interiorcrocodile666 · 1 pointr/learnprogramming

Sounds like you need to read a book on Computer Organization and Design. This book will teach you how to build a computer from basic logic gates. I can't recommend it highly enough.

u/LogBaseE · 1 pointr/ECE

It's Verilog-based, but I like Ciletti, Mano, and Patterson:

https://www.amazon.com/Advanced-Digital-Design-Verilog-HDL/dp/0136019285

https://www.amazon.com/Digital-Design-Introduction-Verilog-HDL/dp/0132774208

https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269


I just went through a project course and here were some good project ideas:
Conway's game of life with VGA/LEDPanel
Single Cycle CPU
2D convolution with Systolic Arrays (really cool)


u/stillinmotionmusic · 1 pointr/utdallas

The assignments varied a lot in difficulty; some were very nitpicky things from the textbook, others were writing general assembly programs.

The topics in general were what was difficult. Writing assembly itself wasn't that hard, but understanding how everything fits together with the diagrams was the hard part.

We used Computer Organization and Design as our textbook; however, the book is not designed very well and contains errors. It has a lot of information, but making sense of that information is what was difficult for me.

That's why I am trying to learn digital logic: I feel being grounded in it would make reading the diagrams and everything else easier. Appendix B in that textbook tries to cover digital logic, but it isn't explained very well, and the actual material in the book assumes you already know it.

u/Flynzo · 1 pointr/RPI

https://www.amazon.com/Computer-Organization-Design-MIPS-Fifth/dp/0124077269

I believe this is the one. I had just rented it for the semester from Amazon, but they also have it in the bookstore. I don't doubt there are PDFs of it you could find online too.

u/lordyod · 1 pointr/UCSC

That was a typo; it's CE12. The past three quarters it has focused on digital logic structures, binary/hex math, the basics of building a processor, and the MIPS assembly language. If you want to get a head start on the book, pick up Computer Organization and Design.

CS101 will depend on the instructor. If you are assigned to Tantalo's class then you will be doing a mix of programming assignments and proof stuff. I'm not super familiar with the details, but luckily his materials are all posted on his course websites; just google UCSC CMPS 101 and find it. If on the other hand you are assigned to Sesh's class then (at least based on this last quarter) you won't be doing coding, you'll be doing very thorough proofs about algorithms. Both of these classes use CLRS which, if you're serious about CS, you'll probably want to have as a desk reference regardless.

u/Merad · 1 pointr/askscience

> I wanted to write a program to emulate a CPU so I could fully understand how it's operation actually worked, but I have no idea of what the "start up" process is, or how we determine when to get a new instruction

The CPU begins loading instructions at a fixed address known as its reset vector. On AVR microcontrollers the reset vector is always at address 0, and it is always a jump instruction with the address where startup code actually begins. On x86 processors the reset vector is at address 0xFFFFFFF0. For a CPU emulator, which presumably doesn't need to deal with interrupt vectors or anything like that, I would just start loading instructions from address 0.

Also, you should should look at some of the simplified CPU designs that have been made for teaching. In my classes we used the LC-3 (a very simple and straightforward design) then moved to the y86 (a simplified x86 design mainly used to teach pipelining). It will be much more realistic to make an emulator for one of them rather than an extremely complex real-world design. I've linked below to textbooks that are based around each of those designs.

http://highered.mheducation.com/sites/0072467509/index.html

http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040
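
If it's useful as a starting point, here's a bare-bones fetch-decode-execute loop in C for a made-up toy ISA (my own invention, not LC-3 or y86): two-byte instructions, one accumulator, and, as suggested above, execution simply starts at address 0.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical toy ISA: 1-byte opcode followed by a 1-byte operand. */
enum { HALT = 0x00, LOAD = 0x01, ADD = 0x02, PRINT = 0x03 };

int main(void) {
    uint8_t mem[256] = {
        LOAD, 5,    /* acc = 5   */
        ADD,  7,    /* acc += 7  */
        PRINT, 0,   /* print acc */
        HALT, 0,
    };
    uint8_t pc = 0;  /* "reset vector": start fetching at address 0 */
    int acc = 0;     /* single accumulator register */

    for (;;) {
        uint8_t op  = mem[pc];     /* fetch */
        uint8_t arg = mem[pc + 1];
        pc += 2;                   /* advance to the next instruction */
        switch (op) {              /* decode and execute */
        case LOAD:  acc = arg;           break;
        case ADD:   acc += arg;          break;
        case PRINT: printf("%d\n", acc); break;
        case HALT:  return 0;
        }
    }
}
```

A real emulator adds a register file, memory-access instructions, jumps, and interrupt vectors, but they all hang off this same loop.
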

u/zifyoip · 1 pointr/learnprogramming

This kind of stuff is commonly taught in university CS courses called something like "Computer Architecture." The book that was used in my computer architecture course was Computer Systems: A Programmer's Perspective, by Bryant and O'Hallaron (book home page). This book uses C and IA32 assembly (or rather something the authors call "Y86," which is a simplified version of IA32).

I cannot support copyright violations, so I will not say anything that might lead you to believe you might be able to find a PDF of this book on the Web if you Google for the title.

u/modernzen · 1 pointr/cuboulder

We used this one when I took the course two years ago. I don't think it's changed, but you might want to double check that.

u/wgunther · 1 pointr/learnprogramming

The question you want to ask is how dynamic memory allocation is implemented. You can probably find something online that walks through an implementation of malloc. Essentially, it has to take a big block of memory, and when a request is received it has to find space for it in this big block. A naive implementation might have a header for each block of memory, and that header points to the next block of memory and records the current state of the block (free or allocated). When a memory allocation request is received, it finds a block of free memory large enough by "jumping" from header to header, alters that header to say allocated, and then adds a header at the end of the allocated bit of memory which says how much is left over. When memory is freed, the header is changed to reflect its status and the block is merged with surrounding free blocks.

Clearly, I'm brushing over a lot of details. And a challenge is finding strategies for avoiding memory fragmentation. You should read the chapter on memory allocation from Computer Systems.
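
To make the header-walking idea concrete, here's a deliberately naive first-fit sketch in C. The names (naive_alloc, naive_free, POOL_SIZE) are mine, it ignores alignment, and it skips the merging of neighbouring free blocks described above, so treat it as an illustration rather than a usable allocator:

```c
#include <stddef.h>

#define POOL_SIZE 4096

/* Every block starts with a header recording its size and state. */
typedef struct {
    size_t size;  /* payload bytes that follow this header */
    int    free;  /* 1 = free, 0 = allocated */
} header_t;

static unsigned char pool[POOL_SIZE];
static int initialized = 0;

void *naive_alloc(size_t want) {
    if (!initialized) {               /* start with one big free block */
        header_t *h = (header_t *)pool;
        h->size = POOL_SIZE - sizeof(header_t);
        h->free = 1;
        initialized = 1;
    }
    /* "Jump" from header to header looking for a big enough free block. */
    for (unsigned char *p = pool; p < pool + POOL_SIZE;
         p += sizeof(header_t) + ((header_t *)p)->size) {
        header_t *h = (header_t *)p;
        if (h->free && h->size >= want + sizeof(header_t)) {
            /* Split: put the leftover space under a fresh header. */
            header_t *rest = (header_t *)(p + sizeof(header_t) + want);
            rest->size = h->size - want - sizeof(header_t);
            rest->free = 1;
            h->size = want;
            h->free = 0;
            return h + 1;             /* payload begins after the header */
        }
    }
    return NULL;                      /* no free block is big enough */
}

void naive_free(void *ptr) {
    if (ptr) ((header_t *)ptr - 1)->free = 1;  /* merging is left out */
}
```

Allocating a few blocks and freeing them in a different order shows exactly how the holes that cause fragmentation arise.
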

u/PicklesInParadise · 1 pointr/learnprogramming

I haven't read it in years, but I remember The C Programming Language being very useful.

If you want to learn more about the low level details of how computers work in general, I own the following books and recommend them:

---

u/__mp · 1 pointr/sysadmin

For this case I usually recommend Computer Systems: A Programmer's Perspective. However, it's a layer deeper than the OP is looking for:

> This book focuses on the key concepts of basic network programming, program structure and execution, running programs on a system, and interaction and communication between programs. (Amazon)

However, I think that this should give a better understanding of system and networking internals and will allow her to pick up any other topic. Additionally, I think that it's one of those books that contains timeless knowledge, which stands in contrast to a lot of other IT books out there.
My physical copy stands directly next to Physically-Based Rendering.

u/huck_cussler · 1 pointr/learnprogramming

Computer Systems: A Programmer's Perspective is the one we use at my school, and it is pretty awesome. It's engaging and entertaining inasmuch as a book on systems programming can be. There are tons of exercises and there is a website where you can work on lab assignments that the authors created.

u/Sean-Der · 1 pointr/uCinci

Just for my own clarification, is that just applied programming? I am a freshman too, and I have to make a decision about this as well.

On one side, it really is not that hard to learn how to program. Anyone can make it through LPTHW or hell, even K&R... but being able to grapple with SICP is a whole other story.

I really enjoy the whole spectrum, but what I am really looking for is the traditional theoretical courses. These sort of lessons are what really make me a better programmer I have found. I was a crappy PHP dev until I learned C, then I was a crappy C dev until I picked up Computer Systems: A Programmers Perspective

The one thing I want to avoid is sitting through garbage I am never gonna use, like C#. First off, it's non-standard (no ISO or ECMA standard for the later versions). Secondly, non-free software doesn't teach you anything; it merely makes you memorize what buttons and knobs to press.

So any upper classmen want to give advice to clueless youngins :D

u/Watabou90 · 1 pointr/learnprogramming

If you want x86 assembly, this book is very good: http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=dp_ob_title_bk/180-6741587-3105245

I'm taking an assembly class this semester that involves writing assembly from scratch, and this book (which is required for the class) is a lifesaver because the professor isn't that great at summarizing the important points.

I think it's a good book. It starts easy and it has a lot of exercises that have answers at the back of the chapter, so you can check your answers pretty easily.

u/ironcrown9 · 1 pointr/learnprogramming

There's a course at my school that covers exactly that (216 at UMD).
The book that's recommended is Computer Systems: A Programmer's Perspective; it's exactly what you're looking for. Code is used only for examples (C and, more often, assembly). Mostly it covers details on CPU instructions, hardware implementation, and the creation of Unix.

u/zhay · 1 pointr/compsci
u/megabreath · 1 pointr/videos

Not covered in this video: Peak Oil and the End of Cheap Abundant Energy.

All the bots in this video (and our whole society, in fact) are fueled by cheap abundant energy from fossil fuels. Reddit loves to pin its hopes on vaporware sources of cheap energy that are always JUuuuuST about to be figured out, but the reality is that we are NOT going to find a working replacement for our energy needs.

Bots may be here now, but they are not here to stay. The future will look more like The Long Descent and less like the Singularity.

Horses and human labor are poised to make a come back. Learn a trade craft. Grow food in your back yard. Develop a skill that will have value in the post-collapse economy. Become a beekeeper. Become a homebrewer. Make soap. Collapse now and avoid the rush.

EDIT: For a much more level-headed analysis, read this article right now: The End of Employment by John Michael Greer

u/Im_just_saying · 1 pointr/Christianity

The Singularity Is Near. Not sure why you're asking it in this forum, but it wouldn't mess with my theology at all, and I would welcome it as a blessing.

u/WordUpvote · 1 pointr/AskReddit

I suggest this book by Ray Kurzweil. Simply amazing.

u/Leninmb · 1 pointr/Futurology

I was actually thinking this a few days ago about my dog. Having read The Singularity Is Near by Ray Kurzweil, there are a few sections devoted to uploading the brain and using technology to augment brain capabilities. What it boils down to is that the truly unique things about our brain are 'past memories', 'emotions', and 'personality'. Everything else in the brain is just stuff that regulates our bodies and processes information.

If we take the personality, memories, and emotions of my dog, and improve on the other parts of the brain by adding better memory, speech recognition, etc., then we might just be able to create another biological species that rivals our intelligence.

We already are making the assumption that technology will make humans more advanced; the same thing should eventually apply to all other biological animals as well. (Except mosquitoes, of course)

u/dk124497 · 1 pointr/ethtrader

Did the book happen to be The Singularity Is Near?

u/peppaz · 1 pointr/Futurology

How can you be right or wrong about something that doesn't exist yet? I recommend this book.

https://www.amazon.com/Singularity-Near-Humans-Transcend-Biology/dp/0143037889

u/apantek · 1 pointr/askscience

Ray Kurzweil has been scarily accurate at predicting technological trends. You should check out The Singularity is Near, in which he makes some very specific estimates about many of these types of questions. You will have a completely different outlook on the way technology progresses after reading this... I certainly did.

u/ChemicallyCastrated · 1 pointr/AskReddit

Please read The Singularity Is Near by Ray Kurzweil. It will tell you what's next in terms of technology+biology.

u/BroGinoGGibroni · 1 pointr/Futurology

wow, yeah, 10 years is closer than 50, that's for sure. If you are right, that is something to be very excited about. Just think of the possibilities. Can I ask where you get the estimate of 10 years? I am fairly uneducated on the subject, and admittedly I haven't even read the book about it, so I am hesitant to even mention it, but I am familiar with Ray Kurzweil and his theories about The Singularity (basically when man and machine combine, hence "redefining" what it means to be human). I found his recent comments on nano-bots in our brains making us "God-like" intriguing to say the least, and if we are ever able to lie back, close our eyes, and experience some sort of virtual reality, it just makes sense to me that the most likely time for that is when we have super-intelligent nano-bots inside our brains manipulating the way they work. I, personally, can't wait for this to happen, but I also know that I will be very apprehensive when it comes down to willfully injecting into my body millions and millions of nano-bots that have been specially designed to 'hijack' my brain and make it work better. I think I will probably wait 10 years or so after people start doing it, maybe longer.

Here is Ray Kurzweil's book I was referring to that I really want to read: The Singularity Is Near: When Humans Transcend Biology

EDIT: I forgot to mention why I really brought up the singularity: Mr. Kurzweil initially predicted that the singularity would occur sometime before 2030 (i.e., in the 2020s), but I believe he has now modified that to say it will occur in the 2030s. Either way, that is not far away, and, being a pretty tech-savvy person myself (I pay attention to a thing or two), I think the 2030s is a reasonable estimate for something like this. But, as I mentioned earlier, I think it is the ethics of such a thing that will slow down true VR development (see: how the world responded to cloning).

double EDIT: just another thought (albeit quite a tangent): once a true singularity has been achieved (if ever?), 'transplanting' our consciousnesses into another body all of a sudden becomes quite a bit less sci-fi and altogether a more realistic possibility...

u/Snoutmol · 1 pointr/bestof

The Singularity is Near by Ray Kurzweil

"Based on current resources and estimated usage patterns based on historical information," shipping from the moon could be cost effective by 2045.

u/ArchangelleOPisAfag · 1 pointr/AskReddit

Oh, I saw the context; that's how I ran into this rant.

First of all, "Industrialization didn't improve things but for a small subset of society" isn't true. I can see where you're coming from, but even the poorest nations and people have benefited from industrialization.

Industrialization has caused murder rates to drop roughly 30-100 fold. We are living in the most peaceful times in human history. See, thousands of years ago you were lucky to see the age of 25. Now even the poorest nations have average lifespans of 40+.

The reason you're so cynical is because news is more widespread nowadays. Again, I know where you're coming from because I have parents who think like you do.

Read this book:

http://www.amazon.com/Singularity-Near-Humans-Transcend-Biology/dp/0143037889

if you want a look at what is coming and why we are, in theory, living in a third industrial revolution. Once you read this, you will learn to love the future. I guarantee it.

u/ZucriyAmsuna · 1 pointr/Random_Acts_Of_Amazon

Thanks!

Have you read The Singularity is Near? It's quite an interesting read.

Dr. Steel has a great view of things; I agree to everything in this video!

u/lfancypantsl · 1 pointr/Futurology

Give this a read. This isn't some crackpot; this is Google's director of engineering. I'm not saying it contradicts what you are saying.

>I doubt we'd anything like a true AI in 20 or so years

Is pretty close to his timetable too, but honestly even getting close to that computational power is well over what is needed to drive a car.

u/yonkeltron · 1 pointr/Judaism

have you read TSIN yet?

u/the_medicine · 1 pointr/CatholicPhilosophy

I had dismissed artificial consciousness for a long time because I believe it to be impossible, and in fact I still do. But I realized my outright dismissal was really a defense against the reality of superintelligence: it's not that machine consciousness has major implications, but that machines becoming competent does. I think for many (perhaps this is unfair) who assert that consciousness is purely material and therefore can be reproduced, artificial consciousness is just a big score for the naturalistic or material-reductionist worldview. Then there are experts who are only interested in taking machine intelligence as far as it can possibly go, whatever that means. Significantly, there is a smaller group calling for caution and prudence in that endeavor. Have you seen the Story of the Sparrows? I can't find a link to it, but it's at the beginning of Superintelligence: Paths, Dangers, Strategies.

u/ZodiacBrave98 · 1 pointr/PurplePillDebate

>no work from humans

On that day, the machines will cut out the humans, not implement Basic Income.

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834/

u/entropywins8 · 1 pointr/nyc

I'm just going on the opinions of experts in the field:

https://en.wikipedia.org/wiki/The_Second_Machine_Age

Superintelligence: Paths, Dangers, Strategies https://www.amazon.com/dp/0198739834/

Yes, we've had cotton gins and such, but Artificial General Intelligence and Superintelligence are a game changer.

u/_infavol · 1 pointr/sociology

Superintelligence by Nick Bostrom is supposed to be good (I've been meaning to read it). There's also the YouTube video Humans Need Not Apply by CGP Grey, which sounds like exactly what you need; the description has links to most of his sources.

u/skmz · 1 pointr/artificial

Re. Nick Bostrom: You should have a look at Superintelligence: Paths, Dangers, Strategies. It's definitely not about Terminators ending all of humanity. If anything, he outlines why even an indifference to human life can cause an AI (in the general or super-intelligent sense, not ML/prediction sense) to either subdue or circumvent humans' attempts to stop it from completing its goals.

If you believe that artificial general intelligence is possible, then he writes about things worth considering.

u/_immute_ · 1 pointr/WormFanfic

Maybe not exactly what you're asking for, but here's one by Bostrom: https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0198739834

u/Parsias · 1 pointr/videos

Anyone interested in AI should read Nick Bostrom's book, Superintelligence. Fair warning, it is very dense but rewarding.

One takeaway here is that he did a survey of leading AI researchers who were asked to predict when general AI might arrive: the majority (~67%) believe it will take more than 25 years, and interestingly, 25% believe it might never happen. Source

Also, there's a really great panel discussion about AI with Elon Musk, Bostrom, and others.

u/neoneye · 1 pointr/scifi
u/hahanawmsayin · 1 pointr/gadgets

For sure - it's the old saw about technology being a force for good or evil, depending entirely on how it's used.

(I'm actually reading "Superintelligence" right now)

Where does that leave you in terms of caring about privacy, though? You said you're not swayed by the argument about giving up your email password... is there another argument you find compelling? Do you think it's pretty much irrelevant if there's no oppressive regime to abuse the citizenry?

u/sjforman · 1 pointr/singularity

There's an old saying that you don't really understand something until you can make it yourself. So I think the biggest and most interesting considerations are the meta-ethical questions. Any responsible attempt to create an AGI has to grapple not only with the fundamental question of what constitutes ethical behavior, but with the immense challenge of implementing it in software. As a species, we're either going to need to understand ethics much more deeply, soon, or we're going to be doomed.

Must-read book on this subject: Superintelligence (http://amzn.to/24USaWX).

u/RedHotChiliRocket · 1 pointr/technology

https://www.amazon.com/gp/aw/d/0198739834/

Consciousness is a hard-to-define word, but he talks about what it would mean if you had an artificial general intelligence significantly smarter than humans, possible paths to creating one, and the dangers of doing so. I haven't looked into any of his other stuff (talks or whatever).

u/xiongchiamiov · 1 pointr/programming

When I took my algorithms class using Cormen's Introduction to Algorithms, the first thing we did in class when looking at a new algorithm was to rewrite its terrible pseudocode with more descriptive names (and more sensible syntax; double-nesting for loops is silly), along the lines of the sketch below.
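
For instance, here's my own Scala rendering (not from the book) of CLRS's INSERTION-SORT(A), with the A, i, j, and key of the original replaced by names that say what they do:

def insertionSort(values: Array[Int]): Unit = {
  for (sortedBoundary <- 1 until values.length) {
    val itemToInsert = values(sortedBoundary)
    var scanIndex = sortedBoundary - 1
    // Shift everything larger than itemToInsert one slot to the right.
    while (scanIndex >= 0 && values(scanIndex) > itemToInsert) {
      values(scanIndex + 1) = values(scanIndex)
      scanIndex -= 1
    }
    values(scanIndex + 1) = itemToInsert
  }
}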

u/nictheman · 1 pointr/learnprogramming

Look up Introduction to Algorithms by Cormen et al. It is pretty much a computing major's bible; after doing the ACM for 2 years, it became my reference guide for almost everything. Really, every CS student should have this on their bookshelf.

Also, it's worth considering doing problems on TopCoder, UVa Online Judge, etc. These are generally "practical" applications of the algorithms, where you are given a scenario that can be solved by some computational algorithm (normally implemented in a tricky way). The key is that they test you on all sorts of data, and almost always include those exceptional/boundary cases. The questions on UVa are generally past ACM programming contest problems.

The reason I recommend this is that you said you enjoy implementing algorithms. You may find that getting "real-world" applications (even if they are a bit... crazy sometimes) makes the work even more enjoyable, and exposes you to using the algorithms in ways you may not have even thought of.

u/benihana · 1 pointr/programming

>So what did you do? Anyone else have a formal CS education and feel like they came out of it with nothing?

I graduated in 2006 and I've been doing web development professionally for almost four years now. Until about two weeks ago, I felt like I could have skipped my entire five years at school, because most of the stuff just doesn't apply to web development since it's so far abstracted from the hardware. I was reading my algorithms book on the toilet the other day when I realized that I learned a shitton at school, and it gave me an incredible advantage over the guy who learned web development on the fly. It helps to go back and re-read things after you have a context to put them into, so you can apply the theory to what you've learned.

It took me a long time to start getting designs down. You have to make a lot of mistakes before you can learn from them. It's as simple as that. Don't get discouraged. If you haven't read Head First Design Patterns, buy that book right now and read it cover to cover. I had read design pattern catalogs, but none of them conveyed the 'why' as well as HFDP did. They don't give you abstract "car has wheels, Ford is a car" examples. They have real code examples, and they show you why you should favor composition over inheritance (see the sketch below) and why you should follow the Law of Demeter.
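
To give a flavor of the composition-over-inheritance point, here's a minimal Scala sketch in the spirit of the book's duck example (the book itself uses Java, and the names here are my own):

// Behavior is a separate type, so it can vary independently of Duck.
trait FlyBehavior { def fly(): String }
object FlyWithWings extends FlyBehavior { def fly() = "flapping wings" }
object CannotFly extends FlyBehavior { def fly() = "staying put" }

// Duck is composed with a behavior instead of inheriting one;
// a decoy duck doesn't need to override an inherited fly().
class Duck(flyBehavior: FlyBehavior) {
  def performFly(): String = flyBehavior.fly()
}

new Duck(FlyWithWings).performFly() // "flapping wings"
new Duck(CannotFly).performFly()    // "staying put"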

I've entertained the notion of starting over several times. Don't quit, and don't get discouraged. If you ever get to the point where you think you've learned all you need to learn and you're writing code that can't be improved, start over.

u/slaystation25 · 0 pointsr/learnprogramming

Sorry, I just didn't put that right. I meant how data is stored in memory, and how it can be manipulated using C. I'm using this book for the course, if that helps clarify what I'm trying to say.

u/secret_bitcoin_login · 0 pointsr/CryptoCurrency

Here is the text that describes Kurzweil's vision. It is the only book I've ever read twice (besides the Bible, but I was confused then). I really suggest you pick it up - if money is an obstacle feel free to pm me your address and I'll send you a copy.

u/Flofinator · 0 pointsr/learnprogramming

Yikes! Well it's going to be pretty hard for you to really understand how to do Python without actually coding in it.

The one thing you could do though is get a book with examples and write them down and try to modify the examples to do something a little extra while at work.

I find the Head First books (http://www.headfirstlabs.com/books/hfpython/) the absolute best for almost anything if you are just starting out. The Java book is especially fun!

I know this isn't exactly what you are asking but it might be a good resource for you to start using.

Another great book that will teach you parts of the theory, and has really good examples of how computers work, is http://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2/ref=sr_1_1?s=digital-text&ie=UTF8&qid=1457746705&sr=1-1&keywords=code+charles+petzold .

That really helped me think about computers in a more intuitive way when I was first starting. It goes through the history and builds up to what an adder is, and more. I highly recommend that book if you want to understand how computers work.
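
If it helps, the adder idea boils down to two logic gates; here's a rough Scala sketch of a half adder (my own illustration, not code from the book):

// A half adder: XOR produces the sum bit, AND produces the carry bit.
def halfAdder(a: Boolean, b: Boolean): (Boolean, Boolean) = {
  val sum = a ^ b    // true when exactly one input is true
  val carry = a && b // true only when both inputs are true
  (sum, carry)
}

halfAdder(true, true) // (false, true): 1 + 1 = 10 in binary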

u/Aeelorty · 0 pointsr/freemasonry

But you did memorize something to do all those actions. Verbatim is not always necessary for proficiency, but we should strive for it whenever ritual is concerned, to avoid unintentional changes. The language we use does hint at all kinds of things; changing a preposition could change meaning and disrupt a series of connections. If you want a more in-depth rationale for why I say memorization is important, I suggest The Information by Gleick.

u/hiyosilver64 · 0 pointsr/truegaming

She might be interested in this:

http://www.amazon.com/The-Art-Video-Games-Pac-Man/dp/159962110X/ref=sr_1_1?ie=UTF8&qid=1381262117&sr=8-1&keywords=video+games+are+art

Or even this:

http://www.amazon.com/The-Art-Game-Design-lenses/dp/0123694965/ref=sr_1_6?ie=UTF8&qid=1381262117&sr=8-6&keywords=video+games+are+art

Possibly even this:

http://www.amazon.com/Theory-Game-Design-Raph-Koster/dp/1449363210/ref=sr_1_4?s=books&ie=UTF8&qid=1381262296&sr=1-4&keywords=games+are+fun

I am a 65F gamer - let her know she's missing out if she ignores video games. They're not only fun but use the mind in ways older people tend to use rarely or stop using at times. The challenge of video games keeps the brain firing on all circuits: puzzles, quests, challenges, etc. all combine not only to entertain but also to teach and to broaden thinking in general :)

u/mrmoreawesome · 0 pointsr/compsci

I would highly recommend this book if you are at all interested in algorithm design and analysis.

[EDIT] Just finished a very difficult algorithms course; this was my bible.

u/kurashu89 · 0 pointsr/scala

Breaking away from the Rails (or in my case, Django and SQLAlchemy) mindset, in which the model is both the business model and the persistence model, can be hard to do.

However, why not this? This might not be good Scala (I mostly just tool around in it), and you'd probably want status to be a private attribute:

abstract class TaskStatus
case class TaskCompleted() extends TaskStatus
case class TaskPending() extends TaskStatus

class Task(var status: TaskStatus) {
  // Pattern match on the current status; the wildcard covers every
  // non-completed case.
  def isCompleted: Boolean = status match {
    case TaskCompleted() => true
    case _ => false
  }

  def markCompleted(): Unit = status = TaskCompleted()
}

Then Task is a regular object with no ties to a repository or service.

new Task(TaskCompleted()).isCompleted // true
new Task(TaskPending()).isCompleted // false
val t = new Task(TaskPending())
t.isCompleted // false
t.markCompleted()
t.isCompleted // true

This makes Task an entity: it has business logic and state that persists from session to session (the other state is probably a title, who it's assigned to, and an ID as a persistence allowance, but I've omitted these for brevity).

You can then stuff these tasks into a data store (pick your poison on how you do that) and reconstitute them from that data store -- essentially that's what a repository is: take in some entities, shove them into a data store; take in a query, pull some entities out of the data store.
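
A minimal sketch of that repository contract might look like the following (the trait and the in-memory store are my own assumptions, just to make the shape concrete):

// A repository hides the data store behind save/find operations.
trait TaskRepo {
  def save(id: Int, task: Task): Unit
  def findId(id: Int): Task
}

// An in-memory implementation, enough for tests.
class InMemoryTaskRepo extends TaskRepo {
  private val store = scala.collection.mutable.Map.empty[Int, Task]
  def save(id: Int, task: Task): Unit = store(id) = task
  def findId(id: Int): Task = store(id) // assumes the id exists; guard in real code
}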

With that in mind, your TaskService.isTaskCompleted can be a simple wrapper around a repository query that pulls that one task out and calls Task#isCompleted on it (note: you'll want to guard against unfound Tasks):

class TaskService(val repo: TaskRepo) {
  def isTaskCompleted(id: Int): Boolean =
    repo.findId(id).isCompleted
}


Expanding this out to your TaskAssignments: your task now holds a reference to many TaskAssignments that probably have an isCompleted on them as well. In that case, Task#isCompleted can end up just being `assignments.forall(_.isCompleted)`.

Still no reference to a repository. The trick here is that Task is now an aggregate. When you ask the repository "Will you find me task 12345?", it not only pulls the literal Task out of the data store, but also all the parts of the Task as well. So you get the Task and the TaskAssignment objects.

You might want to check out Domain Driven Design which helped cement these sorts of ideas in my head.

u/needz · 0 pointsr/Birmingham

Been reading that and this

u/Franko_ricardo · 0 pointsr/learnprogramming
u/emefluence · 0 pointsr/HighQualityGifs

Read a book; this one is good:
https://www.amazon.co.uk/Where-Wizards-Stay-Up-Late/dp/0684832674

There were early experiments in packet-switched networking throughout the 70s and a growth of packet-switched networks in industry and academia throughout the 80s, BUT they weren't federated into "The Internet" until 1991, and even then there was pretty much zero public awareness of it until 1994/1995. So it IS absolutely crazy to suggest a scientist would be talking about it, especially the way this clip talks about it, in 1978/79, which is when Cosmos was made.

u/GaussianReset · -1 pointsr/KotakuInAction

&gt;ARPANET, which while it has some relation to the internet, is a completely different thing entirely...

For your reading pleasure:
http://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674
Knowledge is power.


&gt;No one gave an argument from tradition.

The OP did, when contending that the rough-and-tumble days of Usenet justified, even necessitated, a similar treatment of the ideology of "safe spaces" today.

&gt;free speech does not have anything to do with the right to manage any platform

Our legal system disagrees with you, vehemently. Here is a children's primer on the subject, which seems about your speed:

http://1forall.us/teach-the-first-amendment/the-first-amendment/#a4

If you manage to choke that down, see Marsh v Alabama (1946), Hudgens v NLRB (1976) and Pruneyard v Robins (1980). It's why you're not allowed to wear pornographic t-shirts at Disneyland.

Twitter's EULA states, in part: "We reserve the right at all times (but will not have an obligation) to remove or refuse to distribute any Content on the Services and to terminate users or reclaim usernames.”

According to the law, they're in the right. Your precious fee-fees have no claim.

u/ChineseCracker · -1 pointsr/AndroidMasterRace

honest question: are you trolling?

I have no idea how you could've read the text that you've just quoted and still think that flash storage somehow works "like paper".


let me break it down for you, so you can understand........ELI5:

flash storage is maaaaany different small disks
small disks can write veeeery fast
but small disks also get broken after writing a looooot of times... that hurts
cheap disks get broken faaaaast
expensive disks get broken sloooow
but there is good news:
if one disk dies, others are still there to work
yaaay
but they are not as fast as before :'-(



And now ELI17 (which is probably your actual age):

You've just quoted that there are different types of flash storage that have different PE cycles (according to the manufacturer), so you understand that there is a quality difference in flash storage.

I think your problem is that you think 100,000 PE cycles lasts for 1,000 years or something...

First of all: the flash storage that you buy from the store (in any shape or form) isn't the same as what device manufacturers use as components for their devices. Especially when we're talking about embedded systems (you think they put a Samsung SSD inside their smartphones? They don't!), manufacturers want to save money and therefore usually go for the rock-bottom cheapest option they can get (just like they use the shittiest reject fruit to put into jelly).

And secondly: PE cycles aren't consumed only when you uninstall an app or delete a music file. An operating system that runs entirely on flash storage is constantly writing and deleting files! Every time you leave an app on Android (and the onStop() method of the activity is called), the app gets cached on the storage to free up RAM (DRAM).

Therefore you have hundreds or even thousands of write and delete operations every day, especially with TRIM enabled.
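
To put rough numbers on that (a back-of-the-envelope sketch in Scala; every figure below is a made-up assumption, not a measurement), a drive's lifetime is roughly capacity times PE cycles divided by daily writes, with write amplification working against you:

// lifespan in days = (capacity * PE cycles) / (daily writes * write amplification)
def lifespanDays(capacityGB: Double, peCycles: Int,
                 dailyWritesGB: Double, writeAmplification: Double): Double =
  (capacityGB * peCycles) / (dailyWritesGB * writeAmplification)

// Hypothetical cheap 32 GB eMMC rated for 3000 cycles,
// with 10 GB written per day and a write amplification of 4:
lifespanDays(32, 3000, 10, 4) / 365 // about 6.6 years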


-----

Don't be a retard. Either fully read the links I've provided, or just stop arguing about something you know absolutely nothing about. Don't just Ctrl+F through an article to find things that vaguely sound like they'd support your argument. And stop claiming that flash storage is like paper. I'm usually a big fan of analogies, but I have no idea where you pulled this one from, especially because it doesn't make any sense.

If you're actually interested in this subject matter... I purchased this book when I was studying computer science:

http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269/ref=dp_ob_title_bk (this is the newest edition, I have an older one)

It's a pretty good point of reference for understanding how computer hardware works. It is very dry, though.

u/Katastic_Voyage · -2 pointsr/Games

With all due respect (I don't own either, by the way), while you can see a correlation with "speed" by using numbers like texture fill rates, pipelines, memory throughput, and so on, they are not the whole story, and can easily mislead a prediction.

For a fun bedtime read, check out a cheap used copy/old edition of Computer Architecture: A Quantitative Approach. The book is 10% benchmarks, and 90% reasons why benchmarks can mislead you because you failed to consider X, Y, and Z. It is the bible of computer architecture.

So I'm not saying you're wrong, nor am I saying you're right; I'm saying there's not enough information presented to really say either way.

u/ckcollab · -2 pointsr/Futurology

What about the 5 years part? That the next 5 years of technological advancements will be like the last 25? I didn't say that in the next 5 years unemployment will be out of control... check out the book The Singularity Is Near to learn more about that. Every technological advancement builds off of the last and helps the rest, e.g. self-driving cars in the future making everything cheaper, which makes it cheaper to do all kinds of scientific/medical research.

I can see how that was confusing; probably more like 20 years from now we'll absolutely have to have a basic income or negative income tax. We'll have more people than ever with fewer jobs than ever!