Best computer hardware & DIY books according to redditors

We found 1,470 Reddit comments discussing the best computer hardware & DIY books. We ranked the 449 resulting products by the number of redditors who mentioned them. Here are the top 20.


Subcategories:

Microprocessor & system design books
Mainframes & minicomputers books
Computer hardware upgrade & repair books
Robotics books
Computer hardware peripherals books
Computer design & architecture books
Internet & networking books
Personal computer books
Single board computers books

Top Reddit comments about Computer Hardware & DIY:

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/moeburn · 73 pointsr/aviation

The thing on the bottom right of the screen says "This copy of Windows is not genuine".

More:

http://i.imgur.com/FIym0Vs.png

Buy this book so I don't feel bad for putting up images from it:

http://www.amazon.com/gp/product/0544668251/

u/arnar · 71 pointsr/AskReddit

A program in any language goes through either a compiler or an interpreter, or a mixture of both. A compiler turns the program into a series of machine instructions, low-level codes that the processor knows how to "execute". An interpreter executes a program indirectly: rather than translating it into machine instructions ahead of time, it reads the program itself and performs the operations it describes.

The first step of both compiling and interpreting is called parsing. This step takes the program text and converts it into an internal representation called an AST (Abstract Syntax Tree). For example, this converts "if" and its attached test and statement blocks into one kind of object, and "or" and its attached expressions into another kind. The compiler or interpreter knows that these two objects should be handled differently.
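
To make that concrete, here is a minimal C sketch of one way a parser might represent those two kinds of objects. It is illustrative only; the names and layout are not taken from any real compiler:

    /* A tagged union: each AST node records which construct it represents. */
    #include <stdlib.h>

    typedef enum { NODE_IF, NODE_OR, NODE_VAR } NodeKind;

    typedef struct AstNode {
        NodeKind kind;
        struct AstNode *left;   /* NODE_IF: the test; NODE_OR: left operand  */
        struct AstNode *right;  /* NODE_IF: the body; NODE_OR: right operand */
        const char *name;       /* NODE_VAR: the identifier's text           */
    } AstNode;

    static AstNode *mk(NodeKind kind, AstNode *l, AstNode *r, const char *name) {
        AstNode *n = malloc(sizeof *n);
        n->kind = kind; n->left = l; n->right = r; n->name = name;
        return n;
    }

    /* Parsing "if (a or b) body" would build something like:
     *   mk(NODE_IF, mk(NODE_OR, mk(NODE_VAR, 0, 0, "a"),
     *                           mk(NODE_VAR, 0, 0, "b")), body, 0);
     * and the compiler or interpreter then dispatches on node->kind. */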

Programming languages are often programmed in themselves. However, a bootstrapping process is required to create the first compiler or interpreter, i.e. the first version needs to be written in a different language. For low level languages like C, this bootstrapping language is usually assembler. For other languages, more often than not the bootstrapping language is C. Some compilers or interpreters are written in different languages, e.g. the most popular version of Python (which is a mix of a compiler and an interpreter) is written in C.

Going from "rand()" to a string of bytes has more to do with calling a library than with a specific feature of a programming language. From the PL point of view, "rand()" is a function call. This function resides in a library that may or may not be written in the same language; in any case, it contains the logic to generate a random number. There are several ways to do this; search Google or Wikipedia for Pseudo-Random Number Generator (PRNG).
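
To illustrate one such way, here is a sketch of a linear congruential generator, a classic PRNG technique. The constants are the ones from the ISO C standard's sample rand() implementation; real libraries vary:

    #include <stdio.h>

    static unsigned long state = 1;  /* the hidden seed */

    static int my_rand(void) {
        /* Scramble the state, then derive the output from its high bits. */
        state = state * 1103515245UL + 12345UL;
        return (int)((state / 65536UL) % 32768UL);  /* range 0..32767 */
    }

    int main(void) {
        for (int i = 0; i < 3; i++)
            printf("%d\n", my_rand());  /* same sequence every run until re-seeded */
        return 0;
    }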

The standard reference on compilers and interpreters is the so-called Dragon Book, I'll leave it up to others to suggest more material.

u/rolfr · 57 pointsr/ReverseEngineering

I started from scratch on the formal CS side, with an emphasis on program analysis, and taught myself the following starting from 2007. If you're in the United States, I recommend BookFinder to save money buying these things used.

On the CS side:

  • Basic automata/formal languages/Turing machines; Sipser is recommended here.
  • Basic programming language theory; I used University of Washington CSE P505 online video lectures and materials and can recommend it.
  • Formal semantics; Semantics with Applications is good.
  • Compilers. You'll need several resources for this; my personal favorites for an introductory text are Appel's ML book or Programming Language Pragmatics, and Muchnick is mandatory for an advanced understanding. All of the graph theory that you need for this type of work should be covered in books such as these.
  • Algorithms. I used several books; for a beginner's treatment I recommend Dasgupta, Papadimitriou, and Vazirani; for an intermediate treatment I recommend MIT's 6.046J on Open CourseWare; for an advanced treatment, I liked Algorithmics for Hard Problems.

    On the math side, I was advantaged in that I did my undergraduate degree in the subject. Here's what I can recommend, given five years' worth of hindsight studying program analysis:

  • You run into abstract algebra a lot in program analysis as well as in cryptography, so it's best to begin with a solid foundation along those lines. There's a lot of debate as to what the best text is. If you've never touched the subject before, Gallian is very approachable, if not as deep and rigorous as something like Dummit and Foote.
  • Order theory is everywhere in program analysis. Introduction to Lattices and Order is the standard (read at least the first two chapters; the more you read, the better), but I recently picked up Lattices and Ordered Algebraic Structures and am enjoying it.
  • Complexity theory. Arora and Barak is recommended.
  • Formal logic is also everywhere. For this, I recommend the first few chapters in The Calculus of Computation (this is an excellent book; read the whole thing).
  • Computability, undecidability, etc. Not entirely separate from previous entries, but read something that treats e.g. Goedel's theorems, for instance The Undecidable.
  • Decision procedures. Read Decision Procedures.
  • Program analysis, the "accessible" variety. Read the BitBlaze publications starting from the beginning, followed by the BAP publications. Start with these two: TaintCheck and All You Ever Wanted to Know About Dynamic Taint Analysis and Forward Symbolic Execution. (BitBlaze and BAP are available in source code form, too -- in OCaml though, so you'll want to learn that as well.) David Brumley's Ph.D. thesis is an excellent read, as is David Molnar's and Sean Heelan's. This paper is a nice introduction to software model checking. After that, look through the archives of the RE reddit for papers on the "more applied" side of things.
  • Program analysis, the "serious" variety. Principles of Program Analysis is an excellent book, but you'll find it very difficult even if you understand all of the above. Similarly, Cousot's MIT lecture course is great but largely unapproachable to the beginner. I highly recommend Value-Range Analysis of C Programs, which is a rare and thorough glimpse into the development of an extremely sophisticated static analyzer. Although this book is heavily mathematical, it's substantially less insane than Principles of Program Analysis. I also found Gogul Balakrishnan's Ph.D. thesis, Johannes Kinder's Ph.D. thesis, Mila Dalla Preda's Ph.D. thesis, Antoine Mine's Ph.D. thesis, and Davidson Rodrigo Boccardo's Ph.D. thesis useful.
  • If you've gotten to this point, you'll probably begin to develop a very selective taste for program analysis literature: in particular, if it does not have a lot of mathematics (actual math, not just simple concepts formalized), you might decide that it is unlikely to contain a lasting and valuable contribution. At this point, read papers from CAV, SAS, and VMCAI. Some of my favorite researchers are the Z3 team, Mila Dalla Preda, Joerg Brauer, Andy King, Axel Simon, Roberto Giacobazzi, and Patrick Cousot. Although I've tried to lay out a reasonable course of study hereinbefore regarding the mathematics you need to understand this kind of material, around this point in the course you'll find that the creature we're dealing with here is an octopus whose tentacles spread in every direction. In particular, you can expect to encounter topology, category theory, tropical geometry, numerical mathematics, and many other disciplines. Program analysis is multi-disciplinary and has a hard time keeping itself shoehorned in one or two corners of mathematics.
  • After several years of wading through program analysis, you start to understand that there must be some connection between theorem-prover based methods and abstract interpretation, since after all, they both can be applied statically and can potentially produce similar information. But what is the connection? Recent publications by Vijay D'Silva et al (1, 2, 3, 4, 5) and a few others (1 2 3 4) have begun to plough this territory.
  • I'm not an expert at cryptography, so my advice is basically worthless on the subject. However, I've been enjoying the Stanford online cryptography class, and I liked Understanding Cryptography too. Handbook of Applied Cryptography is often recommended by people who are smarter than I am, and I recently picked up Introduction to Modern Cryptography but haven't yet read it.

    Final bit of advice: you'll notice that I heavily stuck to textbooks and Ph.D. theses in the above list. I find that jumping straight into the research literature without a foundational grounding is perhaps the most ill-advised mistake one can make intellectually. To whatever extent what you're interested in is systematized -- that is, covered in a textbook or thesis already -- you should read that before digging into the research literature. Otherwise, you'll be the proverbial blind man with the elephant, groping around in the dark, getting bits and pieces of the picture without understanding how it all forms a cohesive whole. I made that mistake and it cost me a lot of time; don't do the same.
u/rtanaka6 · 48 pointsr/programming

But the Dragon Book has cool dragons on it!

u/nostrademons · 41 pointsr/programming

You missed out. Compiler design was my favorite course in college.

Anyway, if you don't mind reading textbooks and digging deep into them, you can teach yourself compilers. The seminal work is the Dragon Book. I'd highly recommend supplementing is with Appel's Modern Compiler Implementation in ML (if you're a wimp, you can get the Java or C versions, but I wouldn't recommend it). SICP can also give you some useful perspectives. If you want to go further, there's Advanced Compiler Design and Implementation by Steven Muchnik.

There are also a couple of web tutorials that give you a really quick, whirlwind overview. Graydon Hoare's One Day Compilers uses Ocaml to write a native-code compiler for a Make-like language in 40 slides. My Write yourself a scheme in 48 hours has you write a Scheme interpreter in Haskell, though it sorta assumes you're already familiar with the basics of parsing/lexing/interpreting.

u/Hakawatha · 27 pointsr/electronics

That's because it is RF design. Have you read the handbook of black magic? Excellent book, I'm told.

u/Bluegorilla101 · 26 pointsr/funny

Should have gotten her this dragon book so she can get a head start on writing compilers.

u/Lericsui · 26 pointsr/learnprogramming

"Introduction to Algorithms"by Cormen et.al. Is for me the most important one.

The "Dragon" book is maybe antoher one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good.


Concrete Mathematics by Graham, Knuth, and Patashnik (you should know these names) is good for mathematical basics.


Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.


SICP (although married to a language) teaches very, very good fundamentals.


Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate links - Amazon was picked because it provides the most uniform way to compare books.

edit: Updated up to redline6561


u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/Ispamm · 21 pointsr/androiddev

Don't give up just yet, keep looking.
Do you have a portfolio? If not, try to work on a project of your own so you can have something to show.
And if you are considering improving your Java skills, try working with libraries like:

u/t3h_Inox · 20 pointsr/csharp

There are some good C# project examples implementing clean architecture in ASP.NET Core (although the architecture itself could be applied to WPF and other kinds of apps).

Have a look at this talk: https://www.youtube.com/watch?v=_lwCVE_XgqI

GitHub repository from the talk here: https://github.com/JasonGT/NorthwindTraders (I'd prefer this one)


Another example: https://github.com/ardalis/CleanArchitecture


Additionally, if you really want to understand the subject this is an obligatory read: https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164

u/the_omega99 · 18 pointsr/learnprogramming

>I do have a textbook called "C: A modern approach" by King, but like I said before, I think it focuses more on the coding aspect.

Most books that focus on C are going to be about learning the language. If you want to learn low-level stuff, you need to find books that focus on it (and they'll usually incidentally use C). The language itself is quite small and minimalistic in what it can do. Most heavy-duty things like networking and GUIs require interaction with the OS.

Eg, if you wanted to do networking, you could use the Windows API or the POSIX socket API (POSIX being the standards that *nix systems follow -- and certain versions of Windows). Or you could use a higher level library like curl for cross platform support (and a wealth of nicer features).
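
To give a flavor of the POSIX socket API mentioned above, here is a minimal TCP client sketch; error handling is mostly omitted, and the host and request are placeholders:

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    int main(void) {
        struct addrinfo hints = {0}, *res;
        hints.ai_family = AF_UNSPEC;      /* IPv4 or IPv6 */
        hints.ai_socktype = SOCK_STREAM;  /* TCP */
        if (getaddrinfo("example.com", "80", &hints, &res) != 0) return 1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) return 1;

        const char *req = "HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n";
        if (write(fd, req, strlen(req)) < 0) return 1;

        char buf[512];
        ssize_t n = read(fd, buf, sizeof buf - 1);
        if (n > 0) { buf[n] = '\0'; printf("%s", buf); }

        close(fd);
        freeaddrinfo(res);
        return 0;
    }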

>Can somebody please guide me on where to start?

Firstly, as much of a Linux fanboy as I am, I do want to make sure you know that you don't need to use Linux for any of the other things you wanted to learn (low-level programming, command lines, networking, etc.). In fact, my OS class mostly used Linux, but we started out with a project using Windows threads (I guess the prof wanted us to see the difference from POSIX threading).

All that said, I do think Linux is something you'd want to learn and that a lot of low level things just seem more natural in Linux. But I'm biased. Linux fanboy, remember?

I'd start with downloading a Linux OS. Doesn't really matter which. I'd recommend going with Ubuntu. It's the most popular, easiest to find help with, and seems to be what most web servers are running, to boot. You can play around with the GUI for a bit if you want. It won't feel that different. Modern OSes sort of converged into the same high level ideas.

My favourite book for getting into the command line, ever so slightly touching the low-level aspects of OSes, is Mark Sobell's A Practical Guide to Linux Commands, Editors, and Shell Programming. It will include some basic knowledge of Linux, but mostly focuses on the command line. But this is very useful because not only is the command line very practical to learn, but you'll end up learning a lot about Linux in the process (eg, by learning how everything is a file, how pipes work, etc). And arguably the command line is a super big part of Linux, anyway. It makes sense as the first step.

Now, for the next step, you need to know C very well. So finish with your class, first. Read ahead if you have to. Yes, you already know if statements and functions and all, but do you understand pointers well? How about function pointers and void pointers? Do you understand how C's arrays work and the usage of pointer arithmetic? How about how arguments are passed to functions and when you'd want to pass a pointer to a function instead? As a rough skill testing question, you should implement a linked list for arbitrary data types with functions such as prepending, appending, concatenating lists, searching, removing, and iterating through the list. Make sure that your list can be allocated and freed correctly (no memory leaks).
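
For reference, a skeleton of that exercise might start out something like the following. This is just a sketch showing a few of the suggested operations, using void pointers for the "arbitrary data types" part:

    #include <stdlib.h>

    typedef struct Node {
        void *data;              /* points at caller-owned payload of any type */
        struct Node *next;
    } Node;

    typedef struct {
        Node *head;
    } List;

    void list_prepend(List *list, void *data) {
        Node *n = malloc(sizeof *n);
        n->data = data;
        n->next = list->head;
        list->head = n;
    }

    void list_append(List *list, void *data) {
        Node *n = malloc(sizeof *n);
        n->data = data;
        n->next = NULL;
        Node **p = &list->head;  /* walk to the tail's next pointer */
        while (*p) p = &(*p)->next;
        *p = n;
    }

    /* Pass NULL for free_data if the caller owns the payloads. */
    void list_free(List *list, void (*free_data)(void *)) {
        for (Node *cur = list->head; cur; ) {
            Node *next = cur->next;
            if (free_data) free_data(cur->data);
            free(cur);
            cur = next;
        }
        list->head = NULL;
    }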

Anyway, the next step is to learn OSes. Now, I said OSes and not Linux, because the Linux OS is a bit constrained if you want to learn low level programming (which would include a knowledge of what OSes in general do, and alternatives to OSes like Linux). But never fear, pretty much any OS book will heavily use Linux as an example of how things work and consequently explain a great deal of Linux internals. I can't recommend a class because mine was a regular university class, but Tanenbaum's Modern Operating Systems is a good book on the subject.

In particular, you can expect an OS class to not merely be focused on building an OS yourself (my class worked on aspects of OS101 to implement portions of our own OS), but also on utilizing low level aspects of existing OSes. Eg, as mentioned, my class involved working with Linux threading, as well as processes. We later implemented the syscalls for fork, join, etc ourselves, which was a fascinating exercise. Nothing gets you to understand how Linux creates processes like doing it yourself.

Do note, however, that I had taken a class on computer architecture (I found Computer Organization and Design a good book there, although note that I never did any of the exercises in the book, which seem to be heavily criticized in the reviews). It certainly helps in understanding OSes. It's basically as low as you can go with programming (and a bit lower, entering the domain of computer engineering). I cannot say for sure if it's absolutely necessary. I would recommend it first, but it's probably skippable if you're not interested (personally, I found it phenomenally interesting).

For learning networking, Beej's book is well written. You don't need to know OSes before this or anything.

u/realjoeydood · 18 pointsr/csharp

Uncle Bob's 'Clean Architecture' is the book you want. Examples are in Java, but you shouldn't have any problems understanding them.


This book is a must-read, imo.


Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series) https://www.amazon.com/dp/0134494164/ref=cm_sw_r_cp_apa_i_FyzqDbV9KJ25Y

u/antonivs · 18 pointsr/badcode

The code you posted was generated from a grammar definition, here's a copy of it:

http://www.opensource.apple.com/source/bc/bc-21/bc/bc/bc.y

As such, to answer the question in your title, this is the best code you've ever seen, in the sense that it embodies some very powerful computer science concepts.

It [edit: the Bison parser generator] takes a definition of a language grammar in a high-level, domain-specific language (the link above) and converts it to a custom state machine (the generated code that you linked) that can extremely efficiently parse source code that conforms to the defined grammar.

This is actually a very deep topic, and what you are looking at here is the output of decades of computer science research, which all modern programming language compilers rely on. For more, the classic book on the subject is the so-called Dragon Book, Compilers: Principles, Techniques, and Tools.

u/MrPopoGod · 18 pointsr/BABYMETAL

Yeah, Su's not just reading off a script. Her English has come really far; she's at the point of having enough vocabulary to feel like she can express what she wants to express once she picks the right words out of her dictionary. So she still has to do a translation of concepts into a smaller set of words (sort of like the book Thing Explainer) but she's got the confidence to do so.

u/viddy · 18 pointsr/Astronomy

Check out What If? and Thing Explainer.

u/Xenocide321 · 18 pointsr/EverythingScience

He was working on his new book that came out recently.

Thing Explainer

u/Cohesionless · 17 pointsr/cscareerquestions

The resource seems extensive enough that it should serve you plenty well on the way to becoming a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay close attention to the important theoretical topics.

If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend to read the following:

  • Introduction to Algorithms, 3rd Edition: https://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844. A lot of people do not like this classic book because it is very theoretical, very mathematical, and very abstract, but I think that is its greatest strength. I find a lot of algorithms books either focus too much about how to implement an algorithm in a certain language or it underplays the theoretical foundation of the algorithm such that their readers can only recite the algorithms to their interviewers. This book forced me to think algorithmically to be able to design my own algorithms from all the techniques and concepts learned to solve very diverse problems.

  • Design Patterns: Elements of Reusable Object-Oriented Software, 1st Edition: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/. This is the original book on object-oriented design patterns. There are other more accessible books to read for this topic, but this is a classic. I don't mind if you replace this book with another.

  • Clean Code: A Handbook of Agile Software Craftsmanship, 1st Edition: https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882. This is the classic book that teaches software engineers how to write clean code. A lot of the best practices in software engineering are derived from this book.

  • Java Concurrency in Practice, 1st Edition: https://www.amazon.com/Java-Concurrency-Practice-Brian-Goetz/dp/0321349601. As a software engineer, you need to understand concurrent programming. These days there are various great concurrency abstractions, but I believe everyone should know how to use low-level threads and locks.

  • The Architecture of Open Source Applications: http://aosabook.org/en/index.html. This website features 4 volumes of books available to purchase or to read online for free. Its content focuses on over 75 case studies of widely used open-source projects, often written by the creators of said projects, about the design decisions and the like that went into creating their popular projects. It is inspired by this statement: "Architects look at thousands of buildings during their training, and study critiques of those buildings written by masters."

  • Patterns of Enterprise Application Architecture, 1st Edition: https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/. This is a good read to start learning how to architect large applications.

    The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers have strong problem solving skills, trained in computer science theory, to be the person who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.

    P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
u/ReasonableDrunk · 14 pointsr/marvelstudios

The definitive work on high speed digital electrical engineering is, in fact, called the Handbook of Black Magic.

https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/pop-pop-pop-pop-pop · 14 pointsr/javascript

Not a magic bullet but these helped me:

  • Study the structure of big popular open source projects like lodash or jQuery that have been around for a while; learn how they structure and organize their code and borrow from it.

  • A book like Clean Architecture might help too.

  • Understand how JavaScript works under the hood, and computers in general, so you have a better understanding of the whole system; this involves reading low-level documentation.

  • Get really good with OOP.

  • Code->Refactor->Code->Refactor, apply and reiterate all the stuff you've learned and see if it works.

    Disclaimer: I'm a pretty terrible programmer, but I used to be a lot worse.
u/mcur · 14 pointsr/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tannenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&ie=UTF8&qid=1377402221&sr=1-1

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&ie=UTF8&qid=1377402371&sr=1-1

Edit: Got the wrong Hennessy/Patterson book...

u/TheEdgeOfRage · 14 pointsr/BSD

If you want something really hardcore, check out The Design and Implementation of the FreeBSD Operating System

Edit for 2nd edition

u/syntheticproduct · 13 pointsr/node

There's a misconception that microservices are 'simple'. That's not always the case: they can be complex, efficient beasts that include caching, handle millions of concurrent requests, etc.

However, architecturally, as seen from the outside, microservices do one thing and do it well.

https://youtu.be/CZ3wIuvmHeM

https://martinfowler.com/articles/microservices.html

https://martinfowler.com/microservices/

First of all, you have to ask yourself what your service will do. That will drive the architecture. Your question is like asking 'hey, I wanna build an app, how should I architect it'. It all depends on the constraints of your problem.

There are some books on the topic that might help.

https://www.amazon.com/Building-Microservices-Designing-Fine-Grained-Systems/dp/1491950358

u/teresko · 12 pointsr/PHP

Actually I would suggest you start by learning OOP and maybe investigating the MVC design pattern, since those are both subjects in which the average CodeIgniter user will be quite inexperienced. While you might keep on "learning" frameworks, it is much more important to actually learn programming.

Here are a few lectures that might help you with it:

u/PubliusPontifex · 11 pointsr/compsci

The Dragon Book by somebody. A bit out of date now, but really helped me with my parser/tree implementation.

u/cyrusol · 11 pointsr/PHP

You might want to look at Patterns of Enterprise Application Architecture by Martin Fowler. It has 3 full examples of exactly what you are looking for right now:

  • table gateway
  • active record style ORMs (like Eloquent)
  • data mapper style ORMs (like Doctrine which is exactly like Java's Hibernate)

    The examples are in Java but you probably won't have any difficulties understanding the code. He builds those from the ground up and finally compares them and says when to use what.

    -----

    Beyond that, the abstraction endgame is to completely separate your persistence model from your domain model. If you have a user with an email address and other fields, you don't want just one class that deals with loading the user row from a SQL database based on his email address; you want the latter part to be isolated into its own class. (So that the user could in theory also be requested from an API or Memcached or a text file.)

    class User { ... }

    interface UserByEmailQuery {
        public function findByEmail(string $email): ?User;
    }

    class SqlUserByEmailQuery implements UserByEmailQuery { ... }

    But sometimes it's simply not worth it (economically) to go that far.
u/Authenticity3 · 10 pointsr/ECE

Old (1993) but classic fundamentals that are still relevant today:
High Speed Digital Design: A Handbook of Black Magic https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90

u/correctsbadalts · 10 pointsr/funny

Was expecting this dragon book

u/poincareDuality · 10 pointsr/compsci

For designing programming languages, my favorites are

u/NAMOS · 10 pointsr/onions

Basically any SRE advice for a normal service applies, but replace/complement HAproxy / nginx / ingress controller / ELB with the Tor daemon / OnionBalance.

I run Ablative Hosting and we have a few people who value uptime over anonymity etc and so we follow the usual processes for keeping stuff online.

Have multiples of everything (especially stuff that doesn't keep state), ensure you have monitoring of everything from connections, memory pressure, open files, free RAM etc etc.

Just think of the Tor daemon onion service as a TCP reverse proxy with load-balancing capability, and then follow any other advice when it comes to building reliable infrastructure;

u/chopsuwe · 9 pointsr/arduino

I thought Programming Arduino: Getting Started with Sketches by Simon Monk was very good. It starts from the very basics of what a microcontroller is and the concepts of how it works. Then it steps you through the example sketches in the Arduino IDE, explaining how and why they work. It's written in a way that's very easy to understand, even for the absolute beginner.

Once you've gone through those you'll have a good understanding of what is and isn't possible and how to make your own projects around it. After that Google.

u/humanmanguy · 9 pointsr/AmazonTopRated
  • Fire TV Stick, which is a lower-cost alternative to the awesome Fire TV. (think Apple TV, but actually good)

  • Raspberry Pi which is a tiny fully-functional/fully-featured ARM computer.

  • Arduino, which is an easy-to-use electronics prototyping platform, great if you're interested in learning how to make your own electronics and whatnot. (you might also want this, this, this, this, and this. Should be less than $40 altogether, though you could also probably find like a starter kit that comes with an arduino, book, and components.)

  • Huion drawing tablet, great for if you want to do digital art. I haven't used this model specifically, but I do have the (bigger/more expensive) Huion 610 Pro, which I love.

  • Amazon Prime student was like $40 IIRC, not sure if that has changed though.
u/kainolophobia · 9 pointsr/programming

Look into computer engineering. If you're interested in hardware meets software, you'll explore computer architecture.

See: Computer Architecture: A Quantitative Approach

and
Computer Organization and Design

u/whydna1 · 9 pointsr/AskComputerScience

Firstly, you should pick up a text on Operating Systems. There's a few classics:

  • Silberschatz's Operating System Concepts (aka "The Dinosaur Book")
  • Andrew Tanenbaum's Operating Systems: Design and Implementation (aka "The Minix Book")

    That being said, the summarized version (assuming we're talking about pre-emptive multitasking) is that shortly after the OS boots, it initializes the hardware timer with some delay value (1 ms is common in modern OS's, but others aren't unheard of). When the hardware timer expires, it issues an interrupt. When a hardware interrupt occurs, the processor pushes the instruction pointer, stack pointer, flag registers (and perhaps others) onto the stack and begins executing the code at the address configured for the interrupt.

    An OS will have installed an interrupt handler for the timer that will run the task scheduler. In a really simple model, the task scheduler will suspend the currently running task by copying the various registers (flags, instruction pointer, stack pointer, etc) from the stack into memory. It will then select the next task that it wishes to execute and copy its registers onto the stack (replacing what was pushed there initially by the timer interrupt). Finally, it will reset the timer for another delay and issue an "interrupt return" instruction, which will pop the saved registers off the stack and back into their corresponding locations in the CPU registers. At this point, the previous task is suspended and the new task is running.

    If you're feeling adventurous and have access to a microprocessor which has a timer and hardware interrupt support, you can write a fairly simple/naive scheduler yourself with relative ease (I did this on a Motorola HC12 in university for fun).

    In practice, a modern OS has a bit more to deal with than just the CPU registers. There might also be a virtual memory sub-system which needs to be updated, many other registers beyond just the IP, SP and flags which need to be saved and restored, etc. Additionally, many OS's have potentially complex algorithms to select which task to run based on a variety of considerations: is the task interactive or doing batch work, are we trying to meet real-time scheduling requirements, is there an affinity to a particular CPU core that we want to consider (perhaps there's L2 cache that we'd like the application to stay close to), etc.

    In short: the PC isn't nulled out, it's saved away and then eventually restored.
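
    To illustrate the idea, here is a heavily simplified sketch in C; the names are made up and no real OS is this naive:

        /* The register snapshot the interrupt entry code pushed onto the stack. */
        typedef struct {
            unsigned long ip, sp, flags;  /* instruction ptr, stack ptr, flags */
            unsigned long gpr[8];         /* general-purpose registers */
        } Context;

        typedef struct Task {
            Context ctx;                  /* saved registers while suspended */
            struct Task *next;            /* circular run queue */
        } Task;

        static Task *current;

        void reset_timer_ms(unsigned ms); /* hypothetical: re-arms the hardware timer */

        /* Called from the timer interrupt handler. */
        void schedule(Context *saved_on_stack) {
            current->ctx = *saved_on_stack;  /* suspend: copy registers into memory */
            current = current->next;         /* naive round-robin selection */
            *saved_on_stack = current->ctx;  /* the interrupt-return will pop these */
            reset_timer_ms(1);               /* next tick in 1 ms */
        }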

    There's a whole other discussion which could be had about other types of multi-tasking. Co-operative multitasking was common until the mid-90s (System 7 on the Mac was co-operative, for example). In this model, applications need to periodically release control to the operating system. This is done through a system-call (and thus a software interrupt); in the case of the mac, this was WaitNextEvent.
u/[deleted] · 9 pointsr/programming

You need to show that you know your stuff. Just because you're doing something more applied like Network Security in grad school doesn't mean there won't be a base level of knowledge you're expected to understand. In that case, you need to learn some basic stuff a CS student at a good school would know. I'm not "dumbing down" anything on my list here, so if it seems hard, don't get discouraged. I'm just trying to cut the bullshit and help you. (:

  • Redo your introduction to Computer Science. If you finish this, picking up a new language is cake.

  • Discrete Mathematics, a.k.a. "Math for Computer Scientists". This is the standard text for this, but this is pretty good for a cheap book.

  • Algorithms

  • Compilers

  • Operating Systems

  • Networking

  • For basic CS theory, "Introduction to Theory of Computation by Michael Sipser" is what I used to recommend, but Amazon doesn't seem to have a sanely priced copy. Either buy that used, or get the classic "Cinderella Book". Get an older edition if you can!

    Again, don't be discouraged, but you'll need to work hard to catch up. If you were trying for something like mathematics or physics while doing this, I'd call you batshit insane. You may be able to pull it off with CS though (at least for what you want to study). Make no mistake: getting through all these books I posted on your own is hard. Even if you do, it might be the case that still no one will admit you! But if you do it, and you can retain and flaunt your knowledge to a sympathetic professor, you might be surprised.

    Best of luck, and post if you need more clarification. As a side note, follow along here as well.

    Netsec people feel free to give suggestions as well.
u/Beagles_are_da_best · 9 pointsr/PrintedCircuitBoard

I did learn all of this stuff from experience. Honestly, I had a little bit of a tough time right out of college because I didn't have much practical circuit design experience. I now feel like I have a very good foundation for that, and it came through experience, learning from my peers, and lots of research. I have no affiliation with Henry Ott, but I treat his book like a bible. I refer to it just about every time I do a board design. Why? Because it's packed with this type of practical information. Here's his book. I bought mine used as cheap as I could. At my previous job, they just had one in the library. Either way, it was good to have around.

So why should you care about electromagnetic compatibility (EMC)? A couple reasons:

  1. EMC compliance is often regulated by industry and becomes a product requirement. The types of tests that your product has to pass depend on the industry, typically, but in general there are tests where bad things are injected into your board and tests where they measure how noisy your board is. You have to pass both.
  2. EMC compliance, in my opinion, is very well correlated with the reliability and quality of a product. If a product is destroyed "randomly" or stops working when the microwave is on, you're not likely to have a good opinion of that product. Following guidelines like the one I did above is the path to avoiding problems like that.
  3. EMC design is usually not taught in schools, and yet it is the most important part of the design (besides making the product perform its required function in the first place). It is also very hard to understand, because many of the techniques for improving your design do not necessarily show up on your schematics. Often, it's about how well you lay out your board, how the mechanical design for the enclosure of your board is considered, etc.

    Anyways, it's definitely worth looking at and is a huge asset if you can follow those guidelines. Be prepared to enter the workforce and see rampant disregard for EMC best practices as well as rampant EMC problems in existing products. This is common because, as I said, it's not taught and engineers often don't know what tools to use to fix it. It often leads to expensive solutions where a few extra caps and a better layout would have sufficed.

    A couple more books I personally like and use:

    Howard Johnson, High Speed Digital Design (it's from 1993, but still works well)

    Horowitz and Hill, The Art of Electronics (good for understanding just about anything, good for finding tricks and ideas to help you for problems you haven't solved before but someone probably has)

    Last thing since I'm sitting here typing anyways:

    When I first got out of college, I really didn't trust myself, even when I had done extensive research on a particular part of a design. I was surrounded by engineers who also didn't have the experience or knowledge to say whether I was on the right path or not. It's important to use whatever resources you have to gain experience, even if those resources are books alone. It's unlikely that you will be lucky and get a job working with the world's best EE who will teach you everything you need to know. When I moved on from my first job after college, I found out that I was on the right path on many things thanks to my research and hard work. This was in opposition to my thinking before then, as my colleagues at my first job and I were never confident in our own ability to "do EE the right way" - as in, the way that engineers at storied, big companies like Texas Instruments and Google had done. Hope that anecdotal story pushes you to keep going and learning more!
u/greenlambda · 9 pointsr/ECE

I'm mostly self-taught, so I've learned to lean heavily on App Notes, simulations, and experience, but I also like these books:
The Howard Johnson Books:
High Speed Digital Design: A Handbook of Black Magic
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_api_I0Iwyb99K9XCV
High Speed Signal Propagation: Advanced Black Magic
https://www.amazon.com/dp/013084408X/ref=cm_sw_r_cp_api_c3IwybKSBFYVA

Signal and Power Integrity - Simplified (2nd Edition)
https://www.amazon.com/dp/0132349795/ref=cm_sw_r_cp_api_J3IwybAAG9BWV

Also, another thing that can be overlooked is PCB manufacturability. It's vitally important to understand exactly what can and can't be manufactured so that you can make design trade offs, and in order to do that you need to know how they are made. As a fairly accurate intro, I like the Eurocircuits videos:
http://www.eurocircuits.com/making-a-pcb-pcb-manufacture-step-by-step

u/Atkrista · 9 pointsr/ECE

Personally, I found Oppenheim's text very dry and difficult to get through. I would recommend Lyons' textbook.

u/Dhekke · 8 pointsr/programming

This book, Structured Computer Organization, is also very good at explaining in detail how the computer works; it's the one I used in college... Pretty expensive, I know, but at least the cover has nice drawings!

u/dbuckley · 8 pointsr/technology

> Why does it transmit anything at all

Electronics that have fast switching transitions generate significant amounts of radio frequency energy. In the modern world, it is a major part of the designers job to reduce or shield these emissions so that equipment doesn't interfere with other equipment.

There is a lot of skill and art and not a little black magic involved in getting high speed electronics to work at all. In fact, one of the first books to seriously tackle the subject, a book that many designers still have on their bookshelves, is High Speed Digital Design: A Handbook of Black Magic. The challenge once it works is to make it less like a transmitter.

To prove the thing is compliant with the standards of where it is being sold, it is traditional to take the kit to an EMC test house, where the Device Under Test (the DUT) is placed in a screened room and set up in representative conditions (i.e. power cables, Ethernet cables, etc.), and the amount of radio frequency junk spewed into the air is measured. This costs money: according to here, about $1-10K. If you fail, you have to fix the design and spend money on testing again, until it passes. And of course, fixing the design takes time.

Many countries are happy to sell kit across international boundaries with none of this stuff done at all.

u/grules · 8 pointsr/programming

Just read the table of contents of a couple of advanced compiler books:

Advanced Compiler Design and Implementation

Optimizing Compilers for Modern Architectures: A Dependence-based Approach

It is a different ball game.

u/DonaldPShimoda · 8 pointsr/ProgrammingLanguages

I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference with more discussion of these sorts of things!

Another option is a "real" textbook.

My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.

But the dragon book is the go-to reference on compilers; it's slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.

Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.

u/Quintic · 8 pointsr/learnprogramming

Here are some standard textbooks on the subject. When I am looking for a book on a particular subject, I like to look at the class schedules for local universities and see what they are using. A class on programming languages is a standard part of a CS program I believe.

Compilers: Principles, Techniques, and Tools
http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811/ref=sr_1_3?ie=UTF8&qid=1343095509&sr=8-3&keywords=Dragon+Book

Concepts of Programming Languages
http://www.amazon.com/Concepts-Programming-Languages-Robert-Sebesta/dp/0136073476/ref=sr_1_5?s=books&ie=UTF8&qid=1343095607&sr=1-5&keywords=Programming+languages

Programming Language Pragmatics
http://www.amazon.com/Programming-Language-Pragmatics-Second-Michael/dp/0126339511/ref=sr_1_2?s=books&ie=UTF8&qid=1343095647&sr=1-2&keywords=Programming+language+pragmatics

u/TreeFitThee · 7 pointsr/freebsd

If, after reading the handbook, you find you still want a deeper dive, check out The Design and Implementation of the FreeBSD Operating System

u/KaiserTom · 7 pointsr/learnprogramming

Might I recommend the book Upgrading and Repairing Computers?

Don't let the name fool you: this book is thick and goes pretty deep, not only into the low-level stuff but also into the history of why we do and have certain things in a computer. It is honestly one of the best IT books out there for fundamental IT knowledge.

If you already know much of what this book has, then you'll have more than enough fundamental base to really do anything.

u/Anthr0p0m0rphic · 7 pointsr/raspberry_pi

Great questions, although it would be helpful to know more about your background. I'm guessing that you are totally new to the Pi, so let's start off with the standard beginner resources:

  • Raspberry Pi User Guide by Eben Upton and Gareth Halfacree - This is the unofficial user manual for the Raspberry Pi.
  • Learning Python with Raspberry Pi by Alex Bradbury and Ben Everard - You didn't list any programming background, so I'm not sure if this will be too simple or too advanced. I know of simpler resources, not more advanced ones.

    My favorite learning resource is YouTube. There are a number of people putting out occasional videos on their Pi projects. Personally I like to systematically learn from the same teachers for a few hours. You could try out Raspberry Pi IV Beginners

    Now, it's time to move on to the assignment at hand. Do you have a background in electronics or DIY? I have some resources with background on these topics as well. If you're following the basics, Instructables is going to give you the templates that you need to create your end project. Here's my favorite example of using a GSM module for SMS.
u/If_you_just_lookatit · 7 pointsr/embedded

I started early on with Arduino and moved into lower-level embedded work with the stm32 discovery line of development boards. Here is a link to a good starting board that has tons of example code from ST:

https://www.mouser.com/ProductDetail/STMicroelectronics/STM32F407G-DISC1?qs=mKNKSX85ZJejxc9JOGT45A%3D%3D

If you want a decent intro book into embedded topics, this book does a decent job of introducing the different parts of working on an embedded project:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149


Between the Arduino and the Pi, the Arduino is more representative of an embedded device. It introduces you to a resource-constrained system that can be a good starting point for working with digital and analog I/Os, and it lets you hook up to communication-bus peripherals that use UART, I2C, and SPI. The biggest problem is that it will not immediately introduce you to debugging and standard compilation tools. However, Arduino has been a starting point for many developers. Good luck!

u/aiCikey · 7 pointsr/learnprogramming

This and this by Tanenbaum are the best books on the topic I have read so far.

u/fatangaboo · 7 pointsr/ECE

For your job? Spend the money or get your boss to spend the money on the books written by Howard Johnson.

(book 1)

(book 2)

Trivialized and unsatisfying answer to the question in the title of this thread: Vbounce = Lground * dI/dt. You think Lground equals zero, but you are mistaken.

u/YuleTideCamel · 7 pointsr/learnprogramming

Pick up these books:

u/waaaaaahhhhh · 7 pointsr/ECE

There seem to be two approaches to learning DSP: the mathematically rigorous approach and the conceptual approach. I think most university textbooks are the former. While I'm not going to understate the importance of understanding the mathematics behind DSP, it's less helpful if you don't have a general understanding of the concepts.

There are two books I can recommend that take a conceptual approach: The Scientist and Engineer's Guide to Digital Signal Processing, which is free. There's also Understanding Digital Signal Processing, which I've never seen a bad word about. It recently got its third edition.

u/masklinn · 7 pointsr/programming

Yet another long-winded and mostly useless BeautifulCode blog post that could've been avoided by suggesting that readers go buy Fowler's Refactoring (and read page 260 in this case, Introduce Null Object) as well as the same Fowler's Patterns of Enterprise Application Architecture, page 496, Special Case: a subclass that provides special behavior for particular cases.

Also, note that while the whole texts and explanations and examples and whatnot are not provided, you can find the refactorings and patterns of these books in The Refactoring Catalog and Catalog of Patterns, the respective websites of Refactoring and PoEAA

u/frenchy_999 · 6 pointsr/learnprogramming

Not sure if it's quite what you're looking for but Computer Organization & Design - The HW/SW Interface is a fantastic book on processor architecture and uses the MIPS design as an example through the entire text including good stuff on MIPS assembly programming. The link is for the latest edition (fourth) but if you want to go cheaper the third edition is still available. I used (and still use, about to tutor a course on CompArch) the third edition and it's one of the most useful texts I have ever owned.

u/yoda17 · 6 pointsr/programming

You might check out some books by Hennessy and Patterson, eg,
http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=pd_bbs_sr_2?ie=UTF8&s=books&qid=1228140896&sr=8-2

So far the best books I have read on computers and operating systems. They talk about the same stuff as this article, but with more detail. They're grad-level texts, but I didn't have a problem having no CS background.

u/pdq · 6 pointsr/programming

The good news is that MIPS is possibly the most straightforward assembly language to code or read. Compared to x86, it's a dream.

If your class is computer architecture related, you will probably be learning from Hennessy and Patterson, which is an excellent book covering hardware and software.

If you want to learn MIPS from a software perspective, check out the classic See MIPS Run.

u/reveazure · 6 pointsr/programming

If you want to learn about OS, check out MINIX. It's a UNIX-compatible OS specifically written for educational purposes. What makes it really great is that there's a book, Operating Systems: Design and Implementation, which describes the OS in detail and comes with the annotated source.

u/hwillis · 6 pointsr/electronics

Can't use free Eagle (too big) for this, but KiCad or probably other things would work. With a few good books you can lay out a big board without advanced tools, although it can take longer. With cheap/free tools you'll usually have to use some finicky or kludgy methods to do really complex routing (blind/buried vias, free vias, heat transfer, trace length), but that usually isn't too big a deal. Here's a timelapse of a guy using Altium to route a high-speed, large (a bit smaller than OP's) data board for a high-speed camera. The description has rough steps with timestamps - 38 hours total to lay out.

u/bmarkovic · 6 pointsr/webdev

POSA books by Buschmann are considered the textbooky, Knuth/SICP-level stuff from when I studied, and are the architecture equivalent of GoF's design patterns book, but as a consequence they're also huge on OOP. The Fowler Book is even more preachy and OOPy, but many things are still relevant. The third would be Uncle Bob's Clean Architecture (sorry for the cryptic refs, am on mobile; just Google the refs and you'll find them).

On the systems design front one should learn ESB and SOA, as they are patterns still relevant in this microservices world, but most books on the subject are tied to particular tech (and it's often the wrong tech: IBM or Oracle or MS proprietary, untranslatable/untransferable stuff). I've heard good things about Thomas Erl's books [1].

I've recently read Sam Newman's book on microservices, and while it does have a lot of zeitgeist in it, at least it's current zeitgeist, and the book is decent.

Edits:

  • On keyboard now, added links.
  • [1] Arcitura, Thomas Erl's company, has informative (if a bit ugly) websites on both classical SOA patterns and microservice patterns. Again, it's buzzwordy, preachy, enterprisey CIO-talk, but if you cut through it there are some good overviews of various systems design patterns there, and they are a quick way to ingest the concepts before dedicating your time to these huge tomes.
  • The mentioned books have aged well in general but some of the ideas they propose haven't aged that well so I'd like to dedicate a few bullet points to those:
    • MVC in particular has lately fallen out of grace as a UI pattern and has become delegated to the backend as a data-presentation pattern (i.e. how you design, say, JSON API backends wrt DB access and transforming to JSON), whereas front-end UI has migrated to the MOVE pattern which, in GoF-speak, mostly relates to MVC by replacing MVC's core Observer pattern with a Reactive-programming Observable.
    • Active Record ORMs (think Hibernate) have fallen from grace and are being replaced with SQL-building DSLs like LINQ or ORMs with SQL builders below them. DTOs have also either given way to monadic data access objects or swung back to the pre-AR/pre-DTO concept of Data Gateways (more common with LINQ-style DSLs).
    • Reactive design in combination with Message Queuing has become more and more the method of choice for managing distributed state in SOAs.
u/sonorangoose · 6 pointsr/androiddev
u/sunriseandsunset · 6 pointsr/BCIT

If you're in the downtown campus you'll do Python for COMP 1510. The Burnaby campus will be doing Java for COMP 1510.

You should spend time going over HTML, CSS, and JavaScript. You'll need Java for certain. You should use w3schools.com to get a good grasp of the basic concepts.

One other thing that may help is getting some design books, from Amazon or as a PDF online, to learn coding conventions and ways to think about coding.

You'll find that you'll need to test your code in the coding classes, and writing testable code is a critical skill both in school and when you work in the industry.



u/Jazzy_Josh · 6 pointsr/cscareerquestions

The dragon book if you're into compilers

There's a second edition, but I think this one has a cooler cover ;)

u/FattyBurgerBoy · 6 pointsr/webdev

The book, Head First Design Patterns, is actually pretty good.

You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.

Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.

If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.

My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.

u/llimllib · 6 pointsr/compsci

Sipser (I have the first edition, which you can get on the cheap; it's very good.)

AIMA

Dragon

Naturally, TAOCP.

Many will also recommend SICP, though I'm not quite sure that's what you're angling at here, it's probably worth browsing online to see.

u/dhdfdh · 6 pointsr/freebsd

The Design and Implementation of the FreeBSD Operating System

Also, the devs hang out on the mailing lists and some on the FreeBSD forum.

u/svec · 5 pointsr/embedded

Here's a few books I highly recommend:

Making Embedded Systems by Elecia White

An Embedded Software Primer by David Simon

Programming Embedded Systems in C and C++ by Michael Barr - out of print, but still a decent book.

Also, embedded guru Jack Ganssle has a long list of embedded books he's reviewed here: http://www.ganssle.com/bkreviews.htm - lots of good stuff on there

u/_sasan · 5 pointsr/csharp

These are my recommendations:

u/reddilada · 5 pointsr/learnprogramming

OS Design. Covers the creation of MINIX, the jumping-off point for Linux. This is the 3rd edition of the book that came out way back when.

....just noticed they want 100 bucks for it.

edit: Link to minix3 site.

u/neatuovi · 5 pointsr/cscareerquestions

I think you're talking about his new book Clean Architecture.

https://www.amazon.de/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164

u/healydorf · 5 pointsr/cscareerquestions

As far as engineering practices are concerned: Clean Code, Clean Architecture. A secure app/arch is one that is well understood long after you've stopped working on it.

DefCon has a reading list:

https://www.defcon.org/html/links/book-list.html

If you're looking for a starting point, I'd suggest The Tangled Web. Web/browser security tends to be a good high-level starting point.

You asked for books, but I'd highly suggest participating in some CTFs.

u/superflygt · 5 pointsr/DSP

https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419

There's probably a free pdf floating around somewhere on the net.

u/NoahFect · 5 pointsr/ECE

Oppenheim & Schafer is the usual standard text, as others have said. However, it's pretty theory-intensive and may not be that much of an improvement over your current book, if you are looking for alternative explanations.

I'd say you should look at Lyons' Understanding Digital Signal Processing instead of O&S. Also the Steven Smith guide that mostly_complaints mentioned is very accessible. Between Smith and Lyons you will get most of the knowledge that you need to actually do useful DSP work, if not pass a test in it.

u/ElectricRebel · 5 pointsr/compsci

For compilers:

u/boredcircuits · 5 pointsr/learnprogramming

Start with the Dragon Book.

When it actually comes time to implement the language, I would recommend just writing the frontend and reusing the backend from another compiler. LLVM is a good option (it's becoming a popular backend; it now has frontends for C, C++, Objective-C, Java, D, Pure, Hydra, Scheme, Rust, etc.). See here for a case study on how to write a compiler using LLVM as the backend.

u/fbhc · 5 pointsr/AskComputerScience

My compilers course in college used the Dragon Book, which is one of the more quintessential books on the subject.

​

But you might also consider Basics of Compiler Design which is a good and freely available resource.

​

I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/moyix · 5 pointsr/ReverseEngineering

Have you worked through the loop detection in the Dragon Book? There are some slides on it here:

http://www.cs.cmu.edu/afs/cs/academic/class/15745-s03/public/lectures/L7_handouts.pdf

u/blexim · 5 pointsr/REMath

The object you're interested in is the call graph of the program. As you've observed, this is a DAG iff there is no recursion in the program. If function A calls B and B calls A, this is called mutual recursion and still counts as recursion :)
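
To make that concrete, checking whether a program is recursive (mutual recursion included) is just a cycle check on its call graph; a sketch in C++, with a made-up two-function graph:

    #include <string>
    #include <unordered_map>
    #include <unordered_set>
    #include <vector>

    // Call graph as adjacency lists: caller -> callees (example data is invented).
    using CallGraph = std::unordered_map<std::string, std::vector<std::string>>;

    // DFS cycle check: a cycle in the call graph means (mutual) recursion.
    bool hasCycleFrom(const CallGraph& g, const std::string& fn,
                      std::unordered_set<std::string>& onStack,
                      std::unordered_set<std::string>& done) {
        if (onStack.count(fn)) return true;   // found a back edge: recursion
        if (done.count(fn)) return false;     // already fully explored
        onStack.insert(fn);
        auto it = g.find(fn);
        if (it != g.end())
            for (const auto& callee : it->second)
                if (hasCycleFrom(g, callee, onStack, done)) return true;
        onStack.erase(fn);
        done.insert(fn);
        return false;
    }

    bool isRecursive(const CallGraph& g) {
        std::unordered_set<std::string> onStack, done;
        for (const auto& [fn, callees] : g)
            if (hasCycleFrom(g, fn, onStack, done)) return true;
        return false;
    }

    int main() {
        // A calls B and B calls A: mutual recursion, so the graph is not a DAG.
        CallGraph g{{"A", {"B"}}, {"B", {"A"}}};
        return isRecursive(g) ? 0 : 1;
    }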

A related graph is the control flow graph (CFG) of a function. Again, the CFG is a DAG iff the function doesn't contain loops.

An execution trace of a program can certainly be represented as a DAG. In fact, since an execution trace does not have any branching, it is just a straight line! However you are very rarely interested in a single trace through a program -- you usually want to reason about all the traces. This is more difficult because if you have any looping structure in the global CFG, there is no (obvious) upper bound on the size of a trace, and so you can't capture them all with a finite structure that you can map into SMT.

Every program can be put into SSA form. The trick is that when you have joins in the control flow graph (such as at the head of a loop), you need a phi node to fix up the SSA indices. If you don't have it already, the dragon book is pretty much required reading if you're interested in any kind of program analysis.
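
A tiny hand-worked illustration of the idea (the x0/x1/x2 numbering is for exposition, and the ternary stands in for the phi node at the join):

    int before(bool c) {
        int x = 0;
        if (c) x = 1;
        return x + 2;
    }

    // The same function with SSA applied by hand: every assignment gets
    // a fresh name, and the phi at the join picks whichever definition
    // actually reached it.
    int afterSSA(bool c) {
        const int x0 = 0;
        const int x1 = 1;
        const int x2 = c ? x1 : x0;  // plays the role of x2 = phi(x1, x0)
        return x2 + 2;               // y0 = x2 + 2
    }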

In general, if you have a loop free control flow graph of any kind (a regular CFG or a call graph), then you can translate that graph directly into SAT or SMT in a fairly obvious way. If you have loops in the graph then you can't do this (because of the halting problem). To reason about programs containing loops, you're going to need some more advanced techniques than just symbolic execution. The big names in verification algorithms are:

  • Bounded model checking
  • Abstract interpretation
  • Predicate abstraction
  • Interpolation based methods

    A good overview of the field is this survey paper. To give an even briefer idea of the flavour of each of these techniques:

    Bounded model checking involves unwinding all the loops in the program a fixed number of times k. This gives you a DAG representing all of the traces of length up to k. You bitblast this DAG (i.e. convert it to SAT/SMT) and hand off the resulting problem to an SMT solver. If the problem is SAT, you've found a concrete bug in the program. If it's UNSAT, all you know is that there is no bug within the first k steps of the program.
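
    For flavour, here is what unwinding the loop "while (x < n) x = f(x);" three times looks like as straight-line code, with f a placeholder transition function; this loop-free version is what gets bitblasted:

        // The loop  while (x < n) x = f(x);  unwound k = 3 times.
        int f(int x) { return x + 1; }  // placeholder transition function

        int unwound3(int x, int n) {
            if (x < n) x = f(x);  // iteration 1
            if (x < n) x = f(x);  // iteration 2
            if (x < n) x = f(x);  // iteration 3
            // Unwinding check: if (x < n) can still hold here, k was too
            // small, and the run tells you nothing past this point.
            return x;
        }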

    Abstract interpretation is about picking an abstract domain to execute your program on, then running the program until you reach a fixed point. This fixed point tells you some invariants of your program (i.e. things which are always true in all runs of the program). The hope is that one of these invariants will be strong enough to prove the property you're interested in.

    Predicate abstraction is just a particular type of abstract interpretation where your abstract domain is a bunch of predicates over the variables of the program. The idea is that you get to keep refining your abstraction until it's good enough to prove your property using counterexample guided abstraction refinement.

    Interpolation can be viewed as a fancy way of doing predicate refinement. It uses some cool logic tricks to do your refinement lazily. The downside is that we don't have good methods for interpolating bitvector arithmetic, which is pretty crucial for analyzing real programs (otherwise you don't take into account integer overflow, which is a problem).

    A final wildcard technique that I'm just going to throw out there is loop acceleration. The idea here is that you can sometimes figure out a closed form for a loop and replace the loop with that. This means that you can sometimes remove a loop altogether from the CFG without losing any information or any program traces. You can't always compute these closed forms, but when you can you're in real good shape.
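
    A minimal example of the kind of loop acceleration can remove, with its closed form (both functions are hypothetical):

        // A loop whose net effect has a closed form.
        long sumBelowLoop(long n) {
            long sum = 0;
            for (long i = 0; i < n; ++i) sum += i;  // n iterations
            return sum;
        }

        // The accelerated replacement: same result, no loop left in the CFG.
        long sumBelowClosed(long n) {
            return n > 0 ? n * (n - 1) / 2 : 0;  // sum of 0..n-1
        }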

    Drop me a message if you want to know anything else. I'm doing a PhD in this exact area & would be happy to answer any questions you have.
u/adamnemecek · 5 pointsr/freebsd

you can check out the table of contents on amazon http://www.amazon.com/Design-Implementation-FreeBSD-Operating-Edition/dp/0321968972/ref=dp_ob_title_bk
or the book's website http://ptgmedia.pearsoncmg.com/images/9780321968975/samplepages/9780321968975.pdf
but the answer to all your questions is basically yes, this is the book that fits your criteria.

u/deaddodo · 5 pointsr/osdev

The source in the littleosbook builds on itself each chapter. However, it's important to know that the littleosbook, the osdev wiki, and most online resources aren't necessarily "tutorials" after the bootloader and bare-bones stages. Any later information is going to be more abstract guidance. If you need in-depth assistance with osdev, you'll want to invest in one (or more) of the following:

u/yberreby · 4 pointsr/programming

When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel for the subject.

As for 'normal' resources, I've personally found these helpful:

  • This list of talks about compilers in general.
  • The LLVM Kaleidoscope tutorial, which walks you through the creation of a compiler for a simple language, written in C++.
  • The Super Tiny Compiler. A really, really simple compiler, written in Go. It helps with understanding how a compilation pipeline can be structured and what it roughly looks like.
  • Anders Hejlsberg's talk on Modern Compiler Construction. Helps you understand the difference between the traditional approach to compilation and new approaches, with regards to incremental recompilation, analysis of incomplete code, etc. It's a bit more advanced, but very interesting nevertheless.

    In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.

    For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.

    I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.

    You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.

    EDIT: grammar
u/fluicpana · 4 pointsr/italy

To test the waters quickly you can use https://rubymonk.com/ (it introduces Ruby at a basic level). Coursera, Khan, Udacity and the like also have introductory programming courses.

If you want to actually learn to program, though, the path should hit at least all of these stops, in order:

  1. [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938)

  2. The Structure and Interpretation of Computer Programs

  3. A good book on Assembly

  4. The C Programming Language

  5. Compilers

  6. Code Complete, The Practice of Programming

  7. Pretend you've read all of The Art of Computer Programming

  8. An object-oriented language, maybe Programming Ruby

  9. And/or Python, Dive into Python

  10. Design patterns

  11. Learn a functional language.


    From here you can branch out and specialize in whatever interests you

u/ewood87 · 4 pointsr/BSD
u/icantthinkofone · 4 pointsr/BSD

I noticed you linked to the first edition. Here is the second edition

u/poorbowelcontrol · 4 pointsr/cscareerquestions

-How long after completing the camp did it take for you to get hired?
Within 10 days.
-Who do you work for?
~16 person consulting company in the bay.
-Did you have any prior coding experience before enrolling at the camp?
Yes full year of self study and some classes in high school and college.
-Are you happy with your current earnings?
I was until I realized the cost of living where I am and how much Uncle Sam takes.
-Do employers consider the camps as sufficient to warrant upward mobility potential?
There is another person in my company who also went to my code camp. Our camp (App Academy) discouraged revealing our participation in the camp till late in the hiring process.
-Best strategy to get accepted?
Apply.
What kind of students are they looking for? Can I, with my limited background, become successful?
In my experience, you either have the ability to think in that way or you don't.

What sort of students are most successful both during the camp and then in the job search following the camp?
The ones you would expect.
-Recommendations for pre-study?
Keep trying different tools until you really find something that works.

A great book is http://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765.
If i was gonna put forward one online resource it would be http://www.tutorialspoint.com/.

If you have a little time try some of the assembler stuff.

One final tip. There will be a time (or thousands) when you will be staring at some concept and drawing a blank. It may feel like nothing is happening. It may well be that lots of things are, and you just gotta process the concepts.

Good luck.

u/Grayson_the · 4 pointsr/hardware

I highly recommend this book. I read it cover to cover and learned a lot. It will at least let you know what you want to study in more depth.

u/bigdeddu · 4 pointsr/programming

I agree with OP. If you are looking for good architecture books, besides Fowler's, I've enjoyed

u/BinaryLust · 4 pointsr/Compilers

Here are a few more advanced books that I highly recommend after you learn the basics and want to really implement something.

​

- Modern Compiler Implementation in Java 2nd Edition

- Advanced Compiler Design and Implementation 1st Edition

​

Also here is a link to a GitHub page with tons of compiler related resources. awesome-compilers

u/CSMastermind · 4 pointsr/learnprogramming

I've posted this before but I'll repost it here:

Now in terms of the question that you ask in the title - this is what I recommend:

Job Interview Prep


  1. Cracking the Coding Interview: 189 Programming Questions and Solutions
  2. Programming Interviews Exposed: Coding Your Way Through the Interview
  3. Introduction to Algorithms
  4. The Algorithm Design Manual
  5. Effective Java
  6. Concurrent Programming in Java™: Design Principles and Patterns
  7. Modern Operating Systems
  8. Programming Pearls
  9. Discrete Mathematics for Computer Scientists

    Junior Software Engineer Reading List


    Read This First


  10. Pragmatic Thinking and Learning: Refactor Your Wetware

    Fundamentals


  11. Code Complete: A Practical Handbook of Software Construction
  12. Software Estimation: Demystifying the Black Art
  13. Software Engineering: A Practitioner's Approach
  14. Refactoring: Improving the Design of Existing Code
  15. Coder to Developer: Tools and Strategies for Delivering Your Software
  16. Perfect Software: And Other Illusions about Testing
  17. Getting Real: The Smarter, Faster, Easier Way to Build a Successful Web Application

    Understanding Professional Software Environments


  18. Agile Software Development: The Cooperative Game
  19. Software Project Survival Guide
  20. The Best Software Writing I: Selected and Introduced by Joel Spolsky
  21. Debugging the Development Process: Practical Strategies for Staying Focused, Hitting Ship Dates, and Building Solid Teams
  22. Rapid Development: Taming Wild Software Schedules
  23. Peopleware: Productive Projects and Teams

    Mentality


  24. Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency
  25. Against Method
  26. The Passionate Programmer: Creating a Remarkable Career in Software Development

    History


  27. The Mythical Man-Month: Essays on Software Engineering
  28. Computing Calamities: Lessons Learned from Products, Projects, and Companies That Failed
  29. The Deadline: A Novel About Project Management

    Mid Level Software Engineer Reading List


    Read This First


  30. Personal Development for Smart People: The Conscious Pursuit of Personal Growth

    Fundamentals


  31. The Clean Coder: A Code of Conduct for Professional Programmers
  32. Clean Code: A Handbook of Agile Software Craftsmanship
  33. Solid Code
  34. Code Craft: The Practice of Writing Excellent Code
  35. Software Craftsmanship: The New Imperative
  36. Writing Solid Code

    Software Design


  37. Head First Design Patterns: A Brain-Friendly Guide
  38. Design Patterns: Elements of Reusable Object-Oriented Software
  39. Domain-Driven Design: Tackling Complexity in the Heart of Software
  40. Domain-Driven Design Distilled
  41. Design Patterns Explained: A New Perspective on Object-Oriented Design
  42. Design Patterns in C# - Even though this is specific to C# the pattern can be used in any OO language.
  43. Refactoring to Patterns

    Software Engineering Skill Sets


  44. Building Microservices: Designing Fine-Grained Systems
  45. Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools
  46. NoEstimates: How To Measure Project Progress Without Estimating
  47. Object-Oriented Software Construction
  48. The Art of Software Testing
  49. Release It!: Design and Deploy Production-Ready Software
  50. Working Effectively with Legacy Code
  51. Test Driven Development: By Example

    Databases


  52. Database System Concepts
  53. Database Management Systems
  54. Foundation for Object / Relational Databases: The Third Manifesto
  55. Refactoring Databases: Evolutionary Database Design
  56. Data Access Patterns: Database Interactions in Object-Oriented Applications

    User Experience


  57. Don't Make Me Think: A Common Sense Approach to Web Usability
  58. The Design of Everyday Things
  59. Programming Collective Intelligence: Building Smart Web 2.0 Applications
  60. User Interface Design for Programmers
  61. GUI Bloopers 2.0: Common User Interface Design Don'ts and Dos

    Mentality


  62. The Productive Programmer
  63. Extreme Programming Explained: Embrace Change
  64. Coders at Work: Reflections on the Craft of Programming
  65. Facts and Fallacies of Software Engineering

    History


  66. Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
  67. The New Turing Omnibus: 66 Excursions in Computer Science
  68. Hacker's Delight
  69. The Alchemist
  70. Masterminds of Programming: Conversations with the Creators of Major Programming Languages
  71. The Information: A History, A Theory, A Flood

    Specialist Skills


    In spite of the fact that many of these won't apply to your specific job, I still recommend reading them for the insight they'll give you into programming language and technology design.

  72. Peter Norton's Assembly Language Book for the IBM PC
  73. Expert C Programming: Deep C Secrets
  74. Enough Rope to Shoot Yourself in the Foot: Rules for C and C++ Programming
  75. The C++ Programming Language
  76. Effective C++: 55 Specific Ways to Improve Your Programs and Designs
  77. More Effective C++: 35 New Ways to Improve Your Programs and Designs
  78. More Effective C#: 50 Specific Ways to Improve Your C#
  79. CLR via C#
  80. Mr. Bunny's Big Cup o' Java
  81. Thinking in Java
  82. JUnit in Action
  83. Functional Programming in Scala
  84. The Art of Prolog: Advanced Programming Techniques
  85. The Craft of Prolog
  86. Programming Perl: Unmatched Power for Text Processing and Scripting
  87. Dive into Python 3
  88. why's (poignant) guide to Ruby
u/NateRudolph · 4 pointsr/arduino

Here's my advice, as a recent grad who was first exposed to arduino in school two years ago.

Get a starter kit that has a nice amount of sensors, jumpers, resistors. Nothing worse than seeing a project online and realizing you'd have to make a trip to radio shack just for some 30 cent resistor.

Amazon - $125

Sparkfun - $60

Jameco - $99

These are all a little pricey, but if you have a decent amount of confidence that you'll stick with things, I think this is a good way to get started. You could get one of the cheaper starter kits, but pushing a button to light an LED is only impressive for like a second. After that you're going to want to start moving and sensing things and it's nice to already have that at your fingertips.

Word of advice on tutorials. If you're anything like me, the internet can be your best friend and worst enemy. There are so many tutorials for stuff like arduino with varying levels of quality. It can be super distracting to look through a long tutorial and then see 100 other things you might want to do. At this point, that's bad because you're just chasing after a cool project, not actually learning. I'd encourage you to commit to buying a book, plugging away through every single tutorial in it, and then looking online. You'll start to see quicker which projects you actually want to dive into when you know a bit more about the process.

That first kit from Amazon comes with a book that I'm sure is great. Here's the one we went through at school: Programming Arduino - $12

That said, I'd very strongly encourage you to do it. Save up some money, get one of those kits, and start learning! It's incredibly rewarding, and after even a few months you'll have projects lying around that will impress pretty much anyone who doesn't know what Arduino is. I really wish I had started at your age. Good luck!

u/tramast · 4 pointsr/ECE

Sounds like what you're interested in is computer architecture. This is the study of how a computer system (whether it's chip level or system level) is organized and designed from a higher-level abstraction (usually at the register-transfer level or above). There are plenty of good resources on this, including many books (this one comes to mind). Not knowing your background, I can't say if this would be much of a stretch for you. I would say prior to jumping to this level you should have an idea of basic MOS logic design, sequential and combinational logic as well as some background in delays and timing.

Your best bet is probably to find a good old book on amazon or ebay and read to your hearts content. Feel free to PM me if you have any questions (I design microprocessors for a living).

u/jmunozar · 4 pointsr/programming

Operating Systems, Design and Implementation http://www.amazon.com/Operating-Systems-Implementation-Prentice-Software/dp/0131429388/ref=pd_sim_b_5. The book has full examples; the full OS source is printed at the end of the book, and it also comes with a CD that has the OS compiled and ready to run.

u/yamamushi · 4 pointsr/0x10c

It goes without saying, I'm sure you already have it if you're working on a project like this already...

Anyone interested in building an OS for the dcpu16 should own a copy of this book.

u/RoundTripRadio · 4 pointsr/programming

I like Operating Systems: Design and Implementation. It walks through the MINIX3 microkernel. It is not a tutorial or a step-by-step guide, but I think it gives a very nice foundation of what it takes to make a functioning kernel.

u/saranagati · 4 pointsr/cscareerquestions

two of mine were The Design of the UNIX Operating System and Advanced Programming in the UNIX Environment

One of the other big things was some original Unix documentation. I can't recall the name, but there were small paperback books released early on for Unix describing how to use and program for it. I remember someone linking to them either on reddit or HN a couple months ago.

u/naerbnic · 4 pointsr/learnprogramming

You may want to find a book on Operating System basics, like you'd find in a college course. Many of them describe most of the common concepts that someone who is learning low-level Linux programming would need to know, like threads, processes, filesystems, IPC, etc. Perhaps something like this would be appropriate. FYI, this is just a random book I found on Amazon.

Which interfaces are you talking about, by the way? To my knowledge, there aren't that many useless or deprecated interfaces in Linux.

u/spoonraker · 4 pointsr/personalfinance

Self-taught software engineer checking in to add on to this.

Everything u/TOM_BRADYS_PET_GOAT said is true.

I'll add a few specific resources:

Computer science fundamentals are really scary and overwhelming if you're self-taught. I'd highly recommend reading The Imposter's Handbook to get started with this topic. You'll want more in-depth material afterwards on each of the various subtopics, but this book is absolutely fantastic as a (surprisingly deep) introduction to all the concepts that's framed specifically to get self-taught programmers up to speed.

After you're familiar with the concepts at a conceptual level, and it's time to just get down to dedicated practice, Cracking the Coding Interview will be an invaluable resource. This book exists for the sole purpose of helping people become better at the types of questions most commonly asked during coding interviews. It's not just a list of a bunch of questions with solutions, it actually explains the theory in-depth, provides drill and smaller practice questions, as well as questions designed to emulate specific interview scenarios at real tech companies like Google, Microsoft, Amazon, etc. It'll even talk about the interview process at those companies outside of just the questions and theory behind them.

As a more general resource that you'll reach for repeatedly throughout your career, I'd recommend The Complete Software Developer's Career Guide. This book covers everything: how to learn, how to interview, how to negotiate salary, how to ask for raises, how to network, how to speak at conferences and prepare talks, how to build your personal brand, how to go into business for yourself if you want, etc., and that's just scratching the surface of what's covered in that book. I didn't even buy this book until I was 10 years into my career and it's still very insightful.

And lets not forget, being a good developer isn't just a matter of making things that work, it's a matter of writing code that readable, extensible, and a pleasure for other developers to work on. So to this end, I'd recommend any developer read both Clean Code and Clean Architecture: A Craftsman's Guide to Software Structure and Design

u/crushendo · 3 pointsr/arduino

Nothing special, but I got my very first Arduino (uno)!

Also pictured:

  • breadboard

  • RGB LED Strip

  • Motion sensor

  • photoresistor

  • wire

  • battery and snap

  • USB cord.

    Not pictured: Programming Arduino. I'm really excited to get started on my first electronics projects!
u/EngineerBill · 3 pointsr/arduino

I was pretty happy with "Programming Arduino - Getting Started with Sketches" by Simon Monk. It provides a good overview of C and the various steps needed to get to working code. I had a lot of coding experience but knew nothing about Arduino when I started, so it brought me up to speed quickly, but I think it would be useful for beginners as well.

It's available on Amazon for under $9, and he has a site with a fair amount of errata, etc.

u/PenMount · 3 pointsr/programming

Instead of learning assembly for its own sake, I would learn about computer architecture (e.g. Computer Organization and Design by Patterson and Hennessy); after that you will have no problem picking up any normal assembly language in a couple of hours.

I took it in year 2 of my computer science B.Sc.

u/maredsous10 · 3 pointsr/ECE

Get a good handle on digital electronics (state machines, combinatorial logic, and registered logic). They'll give you a good basis for building up an understanding of how a computer works.

Digital Design and Computer Architecture
http://www.amazon.com/Digital-Design-Computer-Architecture-Harris/dp/0123704979

That book will take you through basic digital design, Verilog HDL, and then show you how to use Verilog to build a MIPS-like microprocessor (DLX http://en.wikipedia.org/wiki/DLX).

Much of the later material is derived from this book.
http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=pd_sim_b_1

----------------------------------------------------------

Steve Gibson did a series of podcasts called How Computers Work.
http://www.grc.com/securitynow.htm
Search for HOW COMPUTERS WORK

----------------------------------------------------------
If you have any questions or need any other resources in the future, reply to my post.

u/BlackDeath3 · 3 pointsr/csMajors

Have you seen the MIPS WikiBooks site?

It's not complete, but after a cursory inspection it looks as though it may be useful for beginners.

What textbook is your class using? I believe that the textbook my course assigned (though I may or may not be able to recommend it as I may or may not have ever read it...) was this one.

u/pythonicus · 3 pointsr/compsci

I enjoyed writing a small MIPS simulator with this book:

https://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938

u/Celdecea · 3 pointsr/ProgrammerHumor

An excellent and surprisingly complete book describing many lower-level concepts would be Operating Systems: Design and Implementation (3rd ed. 2006) by Andrew Tanenbaum. It is about writing your own Unix-type OS called Minix, complete with memory management, locks, scheduling, quantums, and much more. Even if you're not creating an OS, the algorithms and hardware details it can teach you are bricks of gold for any programmer.

u/traal · 3 pointsr/software

This is the knowledge required to create the kernel. It probably doesn't go into the UI.

u/scrottie · 3 pointsr/openbsd

An old, old Unix thing (if you're interested in that stuff, the book _Lions' Commentary on Unix_ is awesome, and so is the Minix book, _Operating System Design Principles_ or something like that... _Operating Systems Design and Implementation_ ... Google knows exactly what "the Minix book" is =P)...

Unix is a pre-emptively multitasking OS. Cooperative multitasking OSes are thankfully extinct, or nearly so. Those would wait until a process decided that it was done with the CPU and gave the CPU back to the OS before changing tasks. That includes switching between the running database app and the user interface that handles mouse movements, keystrokes, and clicks, such as clicks to just close the damn thing. Pretty clearly that would allow any badly written program (or really, any program not extremely well written) to wedge the whole computer. It's doomed to fail.

So Unix fires off an interrupt several times per second, such as 100 times per second (that's the tick, and 100 Hz the tick frequency), which interrupts the program and runs the OS task scheduler. The OS task scheduler looks to see if there is anything else of the same or higher priority that also needs to run, and if so, pauses the currently running task and fires up the other one. That's called a context switch. That keeps the system responsive and multitasking. You could have 100 tasks waiting for CPU and all of them will get some CPU every second.
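
A toy sketch of that tick-driven round-robin in C++ (task names and the bare queue rotation are invented for illustration; a real scheduler also weighs priorities, I/O wait, and much more):

    #include <deque>
    #include <iostream>
    #include <string>

    struct Task { std::string name; };

    std::deque<Task> runQueue = {{"db"}, {"ui"}, {"net"}};

    // Called by the (simulated) timer interrupt on every tick.
    void onTick() {
        if (runQueue.size() < 2) return;      // nothing else wants the CPU
        Task current = runQueue.front();      // pause the running task...
        runQueue.pop_front();
        runQueue.push_back(current);          // ...and put it at the back
        std::cout << "context switch to " << runQueue.front().name << '\n';
    }

    int main() {
        for (int tick = 0; tick < 5; ++tick)  // a real kernel would get ~100/s
            onTick();
    }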

But that wasn't designed for laptops running on battery that only have a few tasks. This interrupt runs 100 times a second even if no tasks are waiting for CPU (maybe X is waiting for input events... and waiting for input/output is part of this system I've ignored). The CPU would ideally be in a snooze state taking very little power, but this keeps waking it up.

Ticks also help the OS keep time, compute internal stats such as how much data is being transferred in a time period, how many processes are waiting for the CPU, etc. Stats are updated each time this timer interrupt goes off.

"Tickless" systems on the other hand don't interrupt or wake up the CPU at any fixed interval. If it wakes up the CPU once and discovered that no tasks are waiting for the CPU, it won't wake it up again for a long time. Input events including the touchpad and network and disk also generate interrupts which wake up the CPU so that is okay.

Also, the timer to change context (move from one waiting process to the next) can be higher or lower. If only two tasks want CPU, it can change context less often, or if a thousand tasks are waiting and processing would otherwise be locking out the UI, it can change tasks more often by deciding that the next task change will be in 0.001 seconds instead of 0.01.

u/llFLAWLESSll · 3 pointsr/learnprogramming

Since you know Java I would suggest that you read one of the best programming books ever written: [K&R The C Programming Language] (http://www.amazon.com/The-Programming-Language-Brian-Kernighan/dp/0131103628/). It was written by the people who made the C language, and it's a must-read for every C programmer. [Computer Systems: A Programmer's Perspective (3rd Edition)] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/013409266X/) is a great book to learn about computer systems. But I would recommend [Operating Systems Design and Implementation (3rd Edition)] (http://www.amazon.com/Operating-Systems-Design-Implementation-Edition/dp/0131429388) because it has some Minix source code which will go really well with learning C.

Best of luck buddy :)

u/jstampe · 3 pointsr/compsci

I found Tanenbaum's Structured Computer Organization to be very good. It gives a complete overview of the whole stack.

u/Wil_Code_For_Bitcoin · 3 pointsr/PrintedCircuitBoard

Not entirely sure what you're looking for, but I've heard a lot of praise for this book: https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/dietfig · 3 pointsr/electronics

High Speed Digital Design: A Handbook of Black Magic is supposed to be a great book on the subject but the frequencies you're working at don't really qualify as anything approaching "high speed". I really don't think you'll have any issues. The wavelength at 100 kHz is 3 kilometers so you're nowhere near having to worry about transmission line effects.

Make sure to adequately decouple every power pin at the chip to deal with the switching transients from the FETs, otherwise you'll see a lot of ripple on your supply lines which can cause problems. ADI generally uses a 1 uF and 100 nF capacitor in parallel (IIRC) in their application circuits, and I tend to think they know what they're doing.

Is your copper pour grounded? I wouldn't be very worried about coupling noise into your logic traces because 400 Hz is such a low frequency but I suppose it's possible.

ADI publishes a guide called "PCB Board Layout and Design Techniques" that goes through things like proper grounding but I didn't have any luck trying to find it on Google. The Circuit Designer's Companion is an excellent book that also covers the same material with a lot more depth.

u/Skipper_Jos · 3 pointsr/engineering

I will also recommend the 'High Speed Digital Design: A Handbook of Black Magic' book; it definitely has some good stuff!
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90#

u/erasmus42 · 3 pointsr/rfelectronics
u/doodle77 · 3 pointsr/electronics

this book.

OP's board is clearly not high speed so it doesn't matter.

u/WilburJames93 · 3 pointsr/SoftwareEngineering
u/big-ookie · 3 pointsr/csharp

I would strongly recommend reading this book

https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164

It should be mandatory reading in all CS and SE degrees IMO.

It will not answer your specific question, but it will provide you with the tools and knowledge to understand how best to approach the problem and ensure your architect and design is well though through and draws on the learnings of those who have come before us.

u/lerpanerp · 3 pointsr/DSP

I found Rick Lyon's book a much easier read.

u/apcragg · 3 pointsr/RTLSDR

The chapter on quadrature signals in this book is really good. It has some of the best illustrations of the concept that I have come across. The amazon link also lets you browse that chapter for free.

u/jamesonchampagne · 3 pointsr/DSP

Understanding Digital Signal Processing by Richard Lyons is the best intro in my opinion:

http://www.amazon.com/gp/aw/d/0137027419/ref=mp_s_a_2?pi=54x75&qid=1344996249&sr=8-2

Teaches concepts without getting bogged down in the math details. Once you understand the concepts, get Oppenheim and Schafer to learn the dirty details.

u/ntr0p3 · 3 pointsr/AskReddit

By biology I don't mean what they teach you in college or med-school, I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of organs or parts of a cat is completely worthless, understanding the process of gene-activation, and how that enables living organisms to better adapt to their environments, especially, for instance, for stress factors activating responses due to new stimuli, can be very valuable, especially as a function of applied neurology.

Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.

Economics and politics are controversial, but what matters is seeing the underlying forces at work, similar to not understanding how gravity works but still knowing that a dropped lead ball will accelerate downward at 9.78 m/s^2. This is a field that can wait till later though, and probably should.

For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.

I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.

For CS I highly recommend the dragon book, and design patterns, and if you need ASM The worst designed website ever.

For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(

Best of luck in your travels however! :)

edit: For physics, if your math is bad get both of his books. They break it down well. If your math is better try one of Witten's books, but they are kinda tough; guy is a fucking genius.

also, Feynman QED is great, but his other book is awesome just as a happy intellectual read

also try to avoid both Kaku and Hawking for anything more complicated than primers.

edit no. 9: mit's ocw is win itself.

edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields)

u/sindrit · 3 pointsr/compsci

Skip Calculus (not really useful unless you do fancy graphics or sound generators or scientific stuff). Discrete mathematics is what you want to look at for CS. You might want to move on to a linear algebra course from there.

Get the CS specific University textbooks. Here are some to get you started.

u/dohpaz42 · 3 pointsr/PHP

Agreed. There are plenty of resources out there that will help you understand design patterns. If you're new to the concept, I would recommend Head First: Design Patterns; it might be based on Java, but the examples are simple to understand and mostly apply to PHP as well. When you feel like you've grasped the basic concepts of design patterns, you can move on to more advanced texts, like Martin Fowler's Patterns of Enterprise Application Architecture - this is a great reference for a lot of the more common patterns. There is also Refactoring: Improving the Design of Existing Code. These are great investments that will help you with any project you work on, and will help you if you decide to use a framework like Zend which uses design patterns very heavily.

u/smugglerFlynn · 3 pointsr/compsci
  1. Read any book on Java Patterns (probably the one by GoF) - they are applicable across different domains
  2. Read Patterns of Enterprise Application Architecture by Martin Fowler - this is the book that influenced many architects
  3. Study the field: The Architecture of Open Source Applications
  4. Study fundamentals: A Methodology for Systems Engineering by Arthur D. Hall - this one is hard to find
  5. Study as much different frameworks/architectures/languages as possible. You'll start to notice similarities yourself.
  6. Solve every problem you meet from an architectural point of view. You will achieve nothing by just reading the books. Refactor your old projects to use patterns and new methodologies, write down designs for your wild random ideas, teach others about the stuff you know.

    Also, take note: patterns are only a small part of systems engineering applied to CS. Think REST and SOAP; think of how to better integrate two different applications, not just how to code their insides more efficiently. Start considering business logic and requirements, and you will find yourself facing a whole new level of challenging architectural tasks.
u/pitiless · 3 pointsr/PHP

The following books would be good suggestions irrespective of the language you're developing in:

Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.

Domain-Driven Design is of a similar vein & quality.

Refactoring - another fantastic Martin Fowler book.

u/cmgg · 3 pointsr/funny

Reminds me of a joke:

a: "Alright the book is done, all that is left is to choose a cover".

b: "Dragons"

a: "B-but the book is about..."

b: "DRAGONS I SAID"

Book

u/HotRodLincoln · 3 pointsr/IWantToLearn

There are books specifically on language design, syntax trees, and unambiguous grammars.

The classic book on compiler design is "The Dragon Book". Designing a compiler is important because a statement in the language should mean exactly one thing, and a language should be able to be compiled efficiently. This is more difficult than it sounds.

Second, you need to understand language design, variable binding, etc. This is a topic of Programming Language Paradigms. I'll figure out a good book for this and edit to add it. The best book probably covers languages like Ada, Haskell, C, and Java and gives an overview of their design and reasons.

edit: The book for design is Concepts of Programming Languages 9th ed, by Robert W. Sebesta.

u/bobappleyard · 3 pointsr/programming

I don't know what he'd recommend, but I found the Dragon Book and Modern Compiler Design to be decent treatments of the subject. There are lots of interesting texts out there though.

Sorry for the cheeky reply.

u/evaned · 3 pointsr/programming

> And which coder uses physical books any more?

I do for things beyond actual language references; e.g. maybe everything in The Dragon Book has a good description somewhere, but grabbing that off my desk and having a decent chance of it having what I want (it has some problems and omissions, but it's reasonably good) will save wading through a bunch of crap online until I find something complete and accurate enough.

u/sfrank · 3 pointsr/programming

But make sure to get the 2nd edition of the Compiler book. It has been enhanced quite a bit.

u/OmegaNaughtEquals1 · 3 pointsr/cpp_questions

This is a great question! It's also one that every serious CS person will ask at some point. As others here have noted, to really understand this question you must understand how compilers work. However, it isn't necessary to understand the gory details of compiler internals to see what a compiler does for you. Let's say you have a file called hello.cpp that contains the quintessential C++ program

    #include <iostream>

    int main() {
        std::cout << "Hello, world!\n";
    }


The first thing the compiler does is called preprocessing. Part of this process includes expanding the #include statements into their proper text. Assuming you are using gcc, you can have it show you the output of this step

gcc -E -o hello.pp hello.cpp

For me, the hello.cpp file explodes from 4 lines to nearly 18,000! The important thing to note here is that the contents of the iostream library header occur before the int main lines in the output.

The next several step for the compiler are what you will learn about in compiler design courses. You can take a peek at gcc-specific representations using some flags as discussed on SO. However, I pray you give heed. For there be dragons!

Now let's take a look at the compiler's output. To do this, I am going to not #include anything so the output is very simple. Let's use a file called test.cpp for the rest of the tests.

    int main() {
        int i = 3, j = 5;
        float f = 13.6 / i;
        long k = i << j;
    }

To see the compiler's output, you can use

g++ -S -masm=intel test.cpp

The -S flag asks gcc to just output the generated assembly code and -masm=intel requests the Intel dialect (by default, gcc uses the AT&T dialect, but everyone knows the Intel one is superior. :) ) The output on my machine (ignoring setup and teardown code) is outlined below.

push rbp
mov rbp, rsp

/* int i = 3, j = 5; */
mov DWORD PTR [rbp-20], 3
mov DWORD PTR [rbp-16], 5

/* float f = 13.6 / i; */
pxor xmm0, xmm0
cvtsi2sd xmm0, DWORD PTR [rbp-20]
movsd xmm1, QWORD PTR .LC0[rip]
divsd xmm1, xmm0
movapd xmm0, xmm1
cvtsd2ss xmm2, xmm0
movss DWORD PTR [rbp-12], xmm2

/* long k = i << j; */
mov eax, DWORD PTR [rbp-16]
mov edx, DWORD PTR [rbp-20]
mov ecx, eax
sal edx, cl
mov eax, edx
cdqe
mov QWORD PTR [rbp-8], rax

/* implicit return 0; */
mov eax, 0
pop rbp
ret

There are lots of details to learn in here, but you can generally see how each simple C++ statement translates into many assembly instructions. For fun, try compiling that program with the optimizer turned on (with g++, you can use -O3). What is the output?

There is still much to see from the binary that is assembled. You can use nm and objdump to see symbols or ldd to see what other libraries were (dynamically) linked into the executable. I will leave that as an exercise for the reader. :)

u/Ohthere530 · 3 pointsr/explainlikeimfive

Disk drives are like post office boxes. They are numbered from one to whatever, and each one holds just a small amount of information. Each of these "boxes" is called a block.

Users like to look at documents organized into folders. Each document has its own name and can have any amount of data.

A filesystem is the software that converts from documents and folders the way users like to see them to the blocks that disk drives hold. For every file, there is some information about who owns it, when they last changed it, which blocks hold the actual data, and so on. (This is an inode.)
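
A rough C++ sketch of that document-to-blocks mapping (field names and the fixed 12-entry block table are simplifications; real inodes add permissions, indirect blocks, and more):

    #include <array>
    #include <cstdint>
    #include <ctime>

    // Rough shape of an inode: ownership and timestamps, plus the mapping
    // from a file's logical blocks to the numbered blocks on disk.
    struct Inode {
        uint32_t owner;                      // who owns the file
        time_t   mtime;                      // when it was last changed
        uint64_t sizeBytes;                  // how much data it holds
        std::array<uint64_t, 12> blocks;     // which disk blocks hold the data
    };

    // "Which post-office box holds byte N of this document?"
    uint64_t blockForOffset(const Inode& ino, uint64_t offset,
                            uint64_t blockSize = 4096) {
        return ino.blocks[offset / blockSize];
    }

    int main() {
        Inode ino{};
        ino.blocks[0] = 1234;  // logical block 0 lives in disk block 1234
        return blockForOffset(ino, 100) == 1234 ? 0 : 1;
    }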

For details, I'd recommend The Design and Implementation of the FreeBSD Operating System.

u/a-schaefers · 3 pointsr/unixporn

Initially I watched an episode of lunduke hour that featured a freeBSD dev https://www.youtube.com/watch?v=cofKxtIO3Is

I like the documentation available to help me learn. I got my hands on the FreeBSD Handbook and can't wait to get into the design and implementation textbook (Addison-Wesley, 928 pages). https://www.amazon.com/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=pd_lpo_sbs_14_t_0/143-0574353-4482766?_encoding=UTF8&psc=1

I appreciate the focus on servers and research computing that is BSD's strong suit.

u/mdf356 · 3 pointsr/cscareerquestions

It's about 40 years too late for any one person to have mastery of all the different parts of a computer.

For computer architecture, Hennessy and Patterson is the classic volume. For the software algorithms that are used everywhere, CLRS is the classic guide. For Operating Systems, The Design and Implementation of FreeBSD is a good book. I'm sure there's something similar for networking.

You could read the PCI spec, and some Intel data sheets, and the RFCs for networking protocols if you want to learn those things. For most parts of computers, I strongly suspect that most material is either too high level (yet another "Introduction to") or too low level (reading an RFC doesn't tell you whether it's used that way in practice or has been superseded).

The only way I've gotten to know things is to play with them. Change the code, make it do something else. Personal research like that is very educational but time consuming, and there's way too much out there to know everything, even for a single small piece of hardware and its associated software.

u/HarmlessSnack · 3 pointsr/TheCulture

Thing Explainer by Randall Munroe is my favorite coffee table book. (He’s the guy that makes XKCD comics.)

Giant detailed drawings of complex things explained using common language, and a candy coating of humor. Really fun book!

u/theo_sontag · 3 pointsr/newreddits

A relevant book with the same concept: Thing Explainer by Randall Munroe https://www.amazon.com/Thing-Explainer-Complicated-Stuff-Simple/dp/0544668251

u/lifelongintent · 3 pointsr/suggestmeabook

This is so thoughtful! Very similar to Hyperbole and a Half is The Oatmeal, which is another sardonic blog with funny cartoons that has a book of its best content. I also highly recommend XKCD's book "Thing Explainer", which is a highly informative and entertaining read. Wishing your friend the best!

u/PickaxePete · 3 pointsr/space

A Galileoscope and books. I currently like Thing Explainer, which seems really good for that age. Any space book will do though.

Update with Amazon link to book

u/old_dog_new_trick · 3 pointsr/learnprogramming

Also recommend But How Do It Know? which is a shorter and (IMO) an even easier read than Code.

u/eitauisunity · 3 pointsr/learnpython

Here is an interesting video where they build a cpu up from the transistor level.

The CPU is only a theoretical one called a "Scott CPU", which was designed by John Scott, who is the author of the book, But How Do It Know?, which is an amazingly straight-forward, easy-to-digest book about computing.

I would recommend it as it was the first thing I read that gave me a deep understanding of computers on an abstract level. It completely demystified them and got me well on my way to programming.

Edit: The video doesn't go down to the transistor level, just goes over each component of a CPU. The book does go down to the transistor level, however, and again, I would highly recommend it.

u/untissuntiss13 · 3 pointsr/computers

Firstly I would like to say thank you for your curiosity about computers, and thanks for asking, colinw45. My first suggestion would be to go to your local library to look at books about basic computer repair, or take a class at your high school career program or local junior college (aka 2-year), and just learn over time. I've learned about computers my entire life, like 18 years, and I am still not a master. This book at Barnes and Noble or Amazon is great; it's better if you get the paper version because it is a good reference tool. https://www.amazon.ca/Upgrading-Repairing-22nd-Scott-Mueller/dp/0789756102

http://www.barnesandnoble.com/w/upgrading-and-repairing-pcs-scott-mueller/1116693453

u/LittleHelperRobot · 3 pointsr/test

Non-mobile: http://www.amazon.com/dp/1118717058

^That's ^why ^I'm ^here, ^I ^don't ^judge ^you. ^PM ^/u/xl0 ^if ^I'm ^causing ^any ^trouble.

u/case-o-nuts · 3 pointsr/compsci

It seems that most introductory texts focus on parsing. However, in my experience, the dragon book does a good job on introductory code generation. Appel's Tiger Book had good information as well. As a heads up, the C and Java versions of the same book are done as an afterthought, and you can tell that the code was translated after the fact. Stick with the ML version.

For optimization algorithms, I've heard good (and bad) things about Muchnick: Advanced Compiler Design and Implementation.

However, I've had better luck just reading various papers. If there's a specific part of code generation and emission, I can point you to plenty of good papers.

u/triv_burt · 3 pointsr/csharp

https://www.amazon.co.uk/Pro-ASP-NET-Core-MVC-2/dp/148423149X

I'm currently using this book. The author prefers not to use templates, meaning you actually learn to read the code properly rather than just following mouse clicks.

Because he doesn't use templates, he writes everything in a way that you can use Visual Studio Code as well as Visual Studio. It's great if you have an older computer/laptop or plan to develop on a Mac or on Linux.

u/amaleemur · 3 pointsr/compsci
u/key_lime_pie · 3 pointsr/nfl

Just in general, or on a specific topic?

Books I'm reading right now:

u/RadioactiveAardvark · 2 pointsr/embedded

There aren't any that I'd recommend, unfortunately.

This book is not specifically about embedded C, but about embedded in general:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149

Anything by Jack Ganssle is good as well.

u/oridb · 2 pointsr/learnprogramming

I've been playing around with writing a programming language and compiler in my spare time for a while now (shameless plug: http://eigenstate.org/myrddin.html; source: http://git.eigenstate.org/git/ori/mc.git). Lots of fun, and it can be as shallow or as deep as you want it to be.

Where are you with the calculator? Have you got a handle on tokenizing and parsing? Are you intending to use tools like lex and yacc, or do you want to do a recursive descent parser by hand? (Neither option is too hard; hand written is far easier to comprehend, but it doesn't give you any correctness guarantees)
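
If it helps, here is roughly what the hand-written recursive descent route looks like for a calculator, boiled down to addition and subtraction over integers (a sketch only: names invented, no error handling):

    #include <cctype>
    #include <cstdio>
    #include <string>

    // Hand-written recursive descent for:  expr := number (('+'|'-') number)*
    struct Parser {
        std::string src;
        size_t pos = 0;

        char peek() { return pos < src.size() ? src[pos] : '\0'; }

        long number() {                       // tokenize-and-parse in one step
            long v = 0;
            while (std::isdigit(static_cast<unsigned char>(peek())))
                v = v * 10 + (src[pos++] - '0');
            return v;
        }

        long expr() {                         // one function per grammar rule
            long v = number();
            while (peek() == '+' || peek() == '-') {
                char op = src[pos++];
                long rhs = number();
                v = (op == '+') ? v + rhs : v - rhs;
            }
            return v;
        }
    };

    int main() {
        Parser p{"12+30-2"};
        std::printf("%ld\n", p.expr());       // prints 40
    }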

The tutorials I'd suggest depend on exactly where you are and what you're trying to do. As far as books, the three that I would go with are, in order:

For basic recursive descent parsing:

u/wllmsaccnt · 2 pointsr/csharp

This would be the updated version of that book (as long as you are OK focusing on Core). Adam does a good job introducing ASP.NET, but he also covers a broad spectrum of cross cutting concerns and OOP concepts. I would highly recommend his books for anyone new that wants to go down an MVC path.

u/PedroMutter · 2 pointsr/typescript

Thanks for your advice. Is the O'Reilly book you mentioned this one? (Building-Microservices-Sam-Newman). And could you send me some material that you like, please? (Blog posts included.)

u/Etnos · 2 pointsr/webdev

I enjoyed Building Microservices very much.

My other advice would be to start building a REST API and figure out how you can recycle services for multiple sites and whatnot. As with anything in software: practice makes perfect.

u/prasp · 2 pointsr/microservices

Two books that will serve you well

  1. Building Microservices by Sam Newman
  2. Production Ready Microservices

    It is basically SOA done right, as @theplannacleman mentioned in the comments.
    As a primer, read Martin Fowler's and Chris Richardson's articles on the sites mentioned below, as well as Susan Fowler's post on the layers of microservice architecture here
u/SereneDoge001 · 2 pointsr/webdev

I read Building Microservices by Sam Newman recently. I think it's primarily targeted at people already working in the industry wondering "what the heck is this microservice thing everyone's talking about, and how can I go about migrating a monolith-style application to a microservice architecture?"

It approaches the topic in a very pragmatic, practical way, focusing on avoiding common pitfalls when creating a microservice-based application, which I found made it very easy to read and relate to real-life situations I encountered in the past.

I don't know how suitable it is for a student with little/no work experience, but anyone already familiar with monolithic applications can pick up this book and take something from it.

u/MrTCSmith · 2 pointsr/microservices

I'm reading this right now, it's highly recommended.

Building Microservices: Designing Fine-Grained Systems https://www.amazon.com/dp/1491950358/ref=cm_sw_r_cp_apa_i_.TcmDbE37JJPK

u/doggydid · 2 pointsr/arduino

I'm an older gentleman too and I just found this book: http://www.amazon.com/Programming-Arduino-Getting-Started-Sketches/dp/0071784225/ref=sr_1_1?ie=UTF8&qid=1412535749&sr=8-1&keywords=Programming+arduino. So far it's been very helpful answering questions about code that the "Getting Started with Arduino" book didn't cover. I would recommend it for sure.

u/LiquidLogic · 2 pointsr/arduino

Check out Simon Monk's book: Programming Arduino, Getting Started with Sketches . I found it a great starter book, and was easy to understand and follow.

As for your keyboard interface... it sounds like you will need the serial monitor running and waiting for a key-press; a sketch follows below. Arduino: SerialAvailable
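
A minimal Arduino sketch along those lines, using the standard Serial API (the pin number and the '0'/'1' command scheme are just an example):

    // Wait for a key-press on the serial monitor, then act on it.
    const int LED_PIN = 13;  // the usual on-board LED; adjust for your board

    void setup() {
        Serial.begin(9600);          // match the serial monitor's baud rate
        pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
        if (Serial.available() > 0) {        // a key-press has arrived
            char c = Serial.read();
            if (c == '1') digitalWrite(LED_PIN, HIGH);
            if (c == '0') digitalWrite(LED_PIN, LOW);
        }
    }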

Hope that gets you moving in the right direction! GL!

u/loubs001 · 2 pointsr/hardware

Agree. It depends on what you want to know and how much you're willing to commit to learning. It's a big world. Code is a nice book if you want a very, very simple explanation of the basics of bits and bytes and logic gates. It might be a good place to start, though it's intended for a non-technical audience and you may find it a little TOO simple. A proper digital systems book will go into much more detail about digital logic (AND gates, flip-flops, etc.). You might be surprised just how easy the fundamentals are to learn. I learned from Tocci, which I found to be excellent, but that was a long time ago and I'm sure there are many other good ones around.

That's pretty low-level digital circuits, though. If you are really serious about learning computer architecture, I'd highly recommend Patterson and Hennessy. It covers the guts of how processors execute instructions, pipelining, caches, virtual memory and more.

If you're more interested in specific, modern technologies... then obviously Wikipedia, or good tech review sites. Especially reviews that focus on major new architectures. I remember reading lots of good in-depth stuff about Intel's Nehalem architecture back when it was new, or nvidia's Fermi. There's a wealth of information out there about CUDA and GPU computing which may give you a sense of how GPUs are so different to CPUs. Also when I first started learning many years ago, I loved my copy of Upgrading and Repairing PCs, great for a less technical, more hobbyist perspective.

Lastly, ask questions! For example, you ask about DDR vs GDDR. Deep inside the memory chips themselves, actually not a great deal of difference. But the interfaces between the memory and the processor are quite different; they're designed for very different purposes. I'm simplifying here, but CPUs have relatively low levels of parallelism, they tend to operate on small units of memory (say a single value) at a time, they have quite unpredictable access patterns so low latency is essential, and the cores often work tightly together so coherency has to be maintained. With GPUs, there's a very predictable access pattern, so you can load much larger chunks at a time; latency is less important since you can easily keep your processors busy while memory is streamed in; and a GPU's many, many tiny processors for the most part all work on separate words of memory, so coherence usually does not need to be maintained and they have much less need for caches.

The "L" (Level) naming for caches is quite simple. Memory that is closer to the core is faster to access. Generally each core has it's own L1 and L2, with L2 being slightly slower but there's more of it, and all cores share an L3, slower still but way more of it. Memory on the cpu is made out of transistors and is super fast but also takes up alot of space. Look how big the L3 is (here)[http://www.anandtech.com/show/8426/the-intel-haswell-e-cpu-review-core-i7-5960x-i7-5930k-i7-5820k-tested] and that's just 20MB. external ram is obviously much slower, but it is made out of capacitors and has much higher densities.

u/derpage · 2 pointsr/programming

&gt;Start at the bottom. Some books I liked...
&gt;
&gt;Learn what a computer does: Computer Organization &amp; design - Patterson &amp; Hennessy
&gt;
&gt;Learn C: Programming in C - Stephen Kochan
&gt;
&gt;VERY IMPORTANT learn your data structures: Introduction to Algorithms
&gt;
&gt;You will have to learn Java in university, I found this book good: Absolute Java 4th ed.
&gt;
&gt;This is just scratching the surface, a lot more to learn afterward.

Don't worry, FTFH

u/MatrixManAtYrService · 2 pointsr/IWantToLearn

Just going from a bunch of hardware to the point where you can input machine code to be executed is a vast topic in itself (and something I don't have knowledge of). Once you can input machine language and have it execute, though, I at least have an idea of the path.

You can use machine code to write an assembler, which is a lot of work but not particularly complex.

You can use an assembler to write a compiler (good luck with this one, I'm in compiler design right now and it's a mind blow).

You can use a compiler to write pong.

There are many topics that you can really get acquainted with by just wandering the web. I don't think this is one of them. Once you get it you can really go some complex places, so what you're likely to find online is either too simple, or too complex for the understanding you seek. With dedication a book can probably help you, but if you can make nice with a teacher--auditing a computer organization/assembly language class will really open your eyes to what is going on in there.

Take a look at the course listing at a local college and e-mail the teacher, see if they'll let you audit their class.

This was my textbook for that class, it's decent. Maybe you can find an early edition for cheap:
http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=sr_1_2?ie=UTF8&amp;amp;qid=1302110540&amp;amp;sr=8-2-spell

u/Caret · 2 pointsr/hardware

As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented so to speak).

They are not EE textbooks (as far as I know) but they are textbooks nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture, which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.

Hope that helps!

u/sketerpot · 2 pointsr/learnprogramming

I found this book to be pretty readable. It uses a simplified MIPS-style processor for everything.

u/jbplaya · 2 pointsr/learnprogramming

A few books that helped me understand low level details of computers was

Assembly Language Step By Step: Programming With Linux

and

Operating Systems Design and Implementation

u/kidmoe · 2 pointsr/learnprogramming

Just read Tanenbaum's Operating Systems Design and Implementation, and Lions' Commentary on UNIX, and you should be good to go.

u/GPSMcAwesomeville · 2 pointsr/compsci

Hey, last year I followed a course in Operating Systems where we used MINIX as an example OS, which is one of the most understandable OS's out there, and great for learning.

This is a good (and quite pricey, unfortunately) book for MINIX and Operating Systems in general.

u/mrjaguar1 · 2 pointsr/learnprogramming

It's a little old, but Operating Systems Design and Implementation by Andrew Tanenbaum is a pretty good book, and it includes a lot of the MINIX source code in the book.

u/jeebusroxors · 2 pointsr/freebsd

Mess around with MINIX for a bit. The MINIX book (http://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388) is wonderful. I only made it about halfway through, but it was one of the few "textbooks" I was able to actually sit and read. You may also want to drop back to MINIX 2, as 3 is leaning more towards usability than education.

There is also Linux 0.01 (try http://www.oldlinux.org/Linux.old/).

The main idea here is to stick to "early" code as it is clean, basic and without frills. Get the basics down then expand.

u/IjonTichy85 · 2 pointsr/compsci

I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture&lt;- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy &lt;-Only scratches on the surface but is a good starting point. Also it's extremely informal for a math book. The MIT-channel offers many more courses and are a great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of CS. The materials I've posted roughly cover the first year of studying CS. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these things BEFORE starting to learn the 'useful' parts of CS like SQL, XML, design patterns, etc.
Another great book that will broaden your understanding is Bertrand Russell's Introduction to Mathematical Philosophy.
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing, and 20 years from now we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in his life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/elder_george · 2 pointsr/programming

There's a very nice (although expensive) book on computer architecture called 'Structured Computer Organization' by Tanenbaum.

u/eldigg · 2 pointsr/webdev

In most cases your program's performance is limited by memory access speed rather than raw CPU power. Your CPU will sit there 99% of the time twiddling its thumbs waiting for memory access.

This is a pretty good book, imo, that talks about this (among other things):

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210

u/stonebit · 2 pointsr/linux

The Design of the UNIX Operating System https://www.amazon.com/dp/0132017997

u/frankenbeans · 2 pointsr/ECE

Amazing? These look like they were swiped from an overview lecture; there isn't any really good explanation in here. If this is all new to you, they might be a good starting point for learning some basic concepts and vocabulary of signal integrity.

Johnson's Black Magic book is the general reference for this. There are many other (well written) white papers out there. Ott and Bogatin have good books as well.

u/jayknow05 · 2 pointsr/AskElectronics

This is a good book on the subject. I would personally work with a 4-layer board with a GND and VCC layer. It sounds like you already have a bunch of layers as it is so yes I would recommend a VCC layer.

u/drtwist · 2 pointsr/AskReddit

Eric Bogatin's book "Signal Integrity - Simplified", Howard Johnson's High Speed Digital Design, and Mike Peng Li's Jitter, Noise, and Signal Integrity at High-Speed are all fantastic reads if you are looking for dead-tree material. If you have a Safari subscription, you can read Bogatin's and Li's books for "free".

u/PlatinumX · 2 pointsr/AskElectronics

&gt; Where did you take the formula for wire impedance from? Where could I read more about it?

This is a classic parallel-conductor transmission line; there are calculators online. As I mentioned before, the twists do not affect impedance.

You can read more about transmission lines, characteristic impedance, twisted pair, and signal integrity all over the web (and of course check Wikipedia). These are very large topics with a lot of details to learn.

If you want a book, I recommend High Speed Digital Design: A Handbook of Black Magic.

u/tweakingforjesus · 2 pointsr/electronics

In addition to the per-IC decoupling cap already mentioned, I'd add a large electrolytic across VCC and GND near the connector on the right. You also might want to beef up the power and ground traces to reduce resistance to the individual ICs. Remember that your high-speed signal traces are going to induce the opposite current in parallel traces. A ground plane will help with this effect.

If you are really interested in digital PCB design, you might check out this book.

u/m85476585 · 2 pointsr/AskEngineers

I literally have a book called "A Handbook of Black Magic". It's a little old, but it's still one of the best books on the subject.

u/dangerbirds · 2 pointsr/ECE

High Speed Digital Design by Graham and Johnson is more focused on high-speed digital signals, but most of it applies to low speed as well. It has a ton of good "engineering rules of thumb" when it comes to doing PCB design.

u/velocicar · 2 pointsr/EngineeringStudents

Here's a book I use at work.

u/CodeTamarin · 2 pointsr/computerscience

The Stanford algorithms book is complete overkill in my opinion; do NOT read that book. That's insane. Read it when you've been doing programming for a while and have a grasp of how it even applies.

Here's my list, it's a "wanna be a decent junior" list:

  • Computer Science Distilled
  • Java/ C# / PHP/ JS (pick one)
  • Do some Programming Challenges
  • SQL
  • Maybe build a small web app. Don't worry about structure so much, just build something simple.
  • Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  • Head First Design Patterns
  • Clean Architecture
  • Refactoring: Improving the Design of Existing Code
  • If you're interested in Web
  • Soft Skills: Power of Habit, A Mind for Numbers, Productivity Project


    Reasoning: So, the first book is to give you a sense of all that's out there. It's short and sweet and primes you for what's ahead. It helps you understand most of the basic industry buzz words and whatnot. It answers a lot of unknown unknowns for a newbie.

    Next is just a list of languages off the top of my head. But you can pick anything; seriously, it's not a big deal. I did put Java first because that's the most popular and you'll likely find a mountain of resources.

    Then after some focused practice, I suggest grabbing some SQL. You don't need to be an expert but you gotta know about DBs to some degree.

    Then I put an analysis book that's OOP-focused. The nifty thing about that book is that it leads into design patterns nicely, using some very simple patterns to introduce you to design patterns and GRASP.

    Then I put in a legit Design Patterns book that explains and explores design patterns and principles associated with many of them.

    Now that you know how code is structured, you're ready for a conversation about Architecture. Clean architecture is a simple primer on the topic. Nothing too crazy, just preps you for the idea of architecture and dealing with it.

    Finally, refactoring is great for working devs. Often your early work will be focused on working with legacy code, so knowing how to deal with those problems can be helpful.

    FINAL NOTE: Read the soft skills books first.

    The reason for reading the soft skills books first is it helps develop a mental framework for learning all the stuff.

    Good luck! I get this isn't strictly computer science and it's likely focused more toward Software Development. But I hope it helps. If it doesn't. My apologies.
u/KoleCasule1 · 2 pointsr/csharp

This is the third book suggested that was written by Uncle Bob.

Can you maybe explain the difference between:

Clean Code, Clean Architecture and Clean Coder, all books by Uncle Bob.

u/CaffinatedSquirrel · 2 pointsr/learnpython

Clean Code: Clean Code

Clean Architecture: Clean Arch


Just picked these two up myself.. not sure if it's what you are looking for, but they seem to be very valuable for software design as a whole.

u/turtlepot · 2 pointsr/AskComputerScience

Highly endorsed, first book I read out of school:

Code Complete - Steve McConnell


Bonus, engineers at my office were just given this book as recommended reading:

Clean Architecture - Robert C. Martin

u/e7hz3r0 · 2 pointsr/learnprogramming

Heh, that's a loaded phrase because people haven't agreed on what it means.

So I agree with both the other posters in that it can include the stack but usually implies a deeper design understanding.

To me, it doesn't make much sense to ask about a Rails app's architecture without going into the tech stack, precisely because 1) Rails apps have the same basic architecture (MVC), and 2) the rest of the stack is actually part of the application. Do you use MySQL or Postgres or something else? How many Rails servers do you have? How many database servers are there, and how are they replicated? Etc., etc.

However, when you're talking about apps that don't have a given, accepted base design then it's really important to know how it's designed.

I'm going to use the phrase design and architecture interchangeably here, but one could argue they're slightly different.

The architecture of an app influences the "non-functional" characteristics it embodies (also called quality attributes). Furthermore, and more importantly, the architecture itself is (or should be) influenced by the desired non-functional characteristics.

What do I mean by non-functional characteristics? Stuff like:

  • Performance
  • Security
  • Modifiability
  • Modularity
  • Testability
  • etc.

    If you think about it, these things are difficult and expensive to change down the road. If you want to add security to an app that's highly modular, you will have a lot of work due to the high amount of decoupling throughout the app. Or imagine trying to add performance to a highly modifiable app. Modifiability usually implies low coupling between parts which also, usually, impacts performance.

    So when you think about the architecture of an app, it's how the larger parts are put together to express these non-functionals. This can get down to the level of design patterns like MVC (modifiability) and dependency injection (testability), but it starts at a higher level where you look at things like Java packages instead of classes, as an example.
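As a concrete illustration of the testability point, here is a minimal dependency-injection sketch; all the names (Mailer, SignupService, etc.) are invented for the example:

```cpp
#include <iostream>
#include <string>

struct Mailer {                        // the seam: callers depend on this, not on SMTP
    virtual void send(const std::string& to, const std::string& body) = 0;
    virtual ~Mailer() = default;
};

struct SmtpMailer : Mailer {           // production implementation
    void send(const std::string& to, const std::string& body) override {
        std::cout << "SMTP -> " << to << ": " << body << "\n";
    }
};

struct FakeMailer : Mailer {           // test double: records instead of sending
    int sent = 0;
    void send(const std::string&, const std::string&) override { ++sent; }
};

class SignupService {
    Mailer& mailer;                    // injected dependency
public:
    explicit SignupService(Mailer& m) : mailer(m) {}
    void signUp(const std::string& email) { mailer.send(email, "welcome!"); }
};

int main() {
    FakeMailer fake;                   // a test injects the fake...
    SignupService svc(fake);
    svc.signUp("a@example.com");
    std::cout << "mails sent in test: " << fake.sent << "\n"; // ...and asserts on it
}
```

Because SignupService only sees the Mailer interface, the test runs without a mail server; that's the quality attribute (testability) falling out of an architectural decision.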

    There are a number of books on amazon about this but here are 2 (I've read the first, but not the second):
  • Software Architecture in Practice
  • Clean Architecture
u/FroggyWizard · 2 pointsr/learnprogramming

System design is probably the most important thing to learn next. Stuff like SOLID principles, dependency injection etc.
This is a book we use a lot at my company: https://www.amazon.co.uk/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164/

u/obliviousonions · 2 pointsr/ProgrammerHumor

There's a lot of books written about good programming practices, some of which are more advanced than others. Search around on amazon and you'll find a few.

https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164/ref=sr_1_3?ie=UTF8&qid=1520137256&sr=8-3&keywords=clean+code

I cannot vouch for this specific book, but I read another book in the same series, which taught me a lot of things about software design that I had never even thought about.

u/ShadowWebDeveloper · 2 pointsr/cscareerquestions

Once you've read Clean Code, I'd also recommend Clean Architecture which is in the same series. It's at a higher level that talks about how to organize software projects.

u/that_makes_sense · 2 pointsr/learnprogramming

I highly recommend this:

Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series) https://www.amazon.com/dp/0134494164/ref=cm_sw_r_cp_api_i_weBTCbMNPQ2B1

Also check out Clean Code also by Robert Martin.

u/rmurias · 2 pointsr/DSP
u/Sean_Michael · 2 pointsr/EngineeringStudents

Understanding Digital Signal Processing EET400,401 Electronics and Computer Engineering Bachelors Degree

u/washerdreier · 2 pointsr/DSP

Understanding DSP by Lyons, hands down. Get it and never look back. AWESOME book. http://www.amazon.com/Understanding-Digital-Signal-Processing-Edition/dp/0137027419

u/cbrpnk · 2 pointsr/AskProgramming

The fact that you mentioned that it'd be cool to work on a DAW tells me that you want to go low level. What you want to study is digital signal processing, or DSP. I recommend Understanding Digital Signal Processing. Also watch this talk by Timur Doumler, or anything by him. I recommend that you pick a programming language and try to output a sine wave to the speakers, then go on from there.
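To make that first step concrete, here is a minimal C++ sketch that writes one second of a 440 Hz sine as raw 16-bit PCM to stdout (the frequency, sample rate, and playback command are just assumptions for the example):

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>

int main() {
    const int    rate = 44100;                 // samples per second
    const double freq = 440.0;                 // concert A
    const double pi   = 3.14159265358979323846;
    for (int n = 0; n < rate; ++n) {           // one second of audio
        double t = static_cast<double>(n) / rate;
        double x = 0.5 * std::sin(2.0 * pi * freq * t); // half amplitude
        int16_t s = static_cast<int16_t>(x * 32767);    // scale to 16-bit PCM
        std::fwrite(&s, sizeof s, 1, stdout);
    }
}
```

On Linux you could pipe it into a raw-PCM player, e.g. `./sine | aplay -f S16_LE -r 44100 -c 1`; from there the next steps are envelopes, mixing, and filters.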

Also check those out:

https://theaudioprogrammer.com/

https://jackschaedler.github.io/circles-sines-signals/

https://blog.demofox.org/#Audio


Good luck.

u/110100100_Blaze_It · 2 pointsr/learnprogramming

It's on my to-do list, but this is something that I want to get right. I don't think I could fully appreciate it without a more formal approach. I'm currently working through this, and will try my hand at the subject afterward. I will definitely check out Professor Might's insight on the subject, and I would gladly take up any other resources you might have to offer!

u/lordvadr · 2 pointsr/AskComputerScience

We wrote a compiler for one of my CS classes in college. The language was called YAPL (yet another programming language).

First things first: as others have mentioned, a compiler translates from one language to another...typically assembly...but it could be any other language. Our compiler compiled YAPL, which was a lot like Pascal, into C, which we then fed to the C compiler...which in turn was fed to the assembler. We actually wrote working programs in YAPL. For my final project, I wrote a functional--albeit VERY basic--web server.

With that said, it's quite a bit different for an interpreted language, but the biggest part for each is still the same. By far, the most complicated part of a compiler is the parser.

The parser is what reads a source code file and does whatever it's going to do with it. Entire bookshelves have been written on this subject, and PhDs given out on the matter, so parsing can be extremely complicated.

In a theoretical sense, higher-level languages abstract common or more complicated tasks away from the lower-level languages. For example, to a CPU, variables don't have sizes or names, and neither do functions, etc. On one hand, this greatly speeds up development because the code is far more understandable. On the other hand, certain tricks you can pull off in the lower-level languages (that can vastly improve performance) get abstracted away. This trade-off is mostly considered acceptable. An extra $500 web server (or 100 for that matter) to handle some of the load is far less expensive than 10 extra $100,000-a-year x86 assembly developers to develop, optimize, and debug lower-level code.

So generally speaking, the parser looks for what are called tokens, which is why there are reserved words in languages. You can't name a variable int in C because int is a reserved word for a type. So when you name a variable, you're simply telling the compiler "when I reference this name again, I'm talking about the same variable." The compiler knows an int is 4 bytes, and so does the developer. When it makes it into assembly, it's just some 4 bytes somewhere in memory.

So the parser starts looking for keywords or symbols. When it sees int, the next thing it's going to expect is a label; if that label is followed by (, it knows it's a function, and if it's followed by ;, it's a variable--it's more complicated than this but you get the idea.

The parser builds a big structure in memory of what's what and, essentially, the functionality. From there, either the interpreter goes through and interprets the language, or, for a compiler, that structure gets handed to what's called the emitter. The emitter is the function that spits out the assembly (or whatever other language) equivalent of whatever a = b + c; happens to be.
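To make the function-vs-variable example concrete, here is a toy sketch of that token-level decision (nothing like a real parser, and not from the YAPL compiler described above; it just shows the flavor):

```cpp
#include <cctype>
#include <iostream>
#include <sstream>
#include <string>

int main() {
    std::string line = "int foo(";             // try "int bar;" as well
    std::istringstream in(line);

    std::string type, name;
    in >> type;                                 // the reserved word, e.g. "int"

    char c = '\0';
    while (in.get(c) && std::isspace(static_cast<unsigned char>(c))) {} // skip blanks
    while (in && (std::isalnum(static_cast<unsigned char>(c)) || c == '_')) {
        name += c;                              // accumulate the label
        in.get(c);
    }

    if (c == '(')      std::cout << name << " is a function\n";
    else if (c == ';') std::cout << name << " is a variable\n";
    else               std::cout << "syntax error\n";
}
```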

This is complicated, but if you take it in steps, it's not really that hard. This is the book we used. There's a much newer version out now. If I can find my copy, I'll give it to you if you pay shipping. PM me.

u/jhartwell · 2 pointsr/AskComputerScience

Compilers: Principles, Techniques and Tools which is also referred to as The Dragon Book is what I'm currently reading and am enjoying it.

u/13ren · 2 pointsr/programming
u/johnweeder · 2 pointsr/learnprogramming

Yes. Do it. It's great to know. Useful occasionally - especially grammars. The dragon book is the only college text I've kept.

https://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886/ref=pd_lpo_sbs_14_img_0?_encoding=UTF8&psc=1&refRID=6GT8HPHEKPGJX9GGVMNR

u/Scaliwag · 2 pointsr/gamedev

Regarding sandboxing, at least in Lua, from what I know you can have minute control over what libraries scripts have access to, and users can only import other libraries if you allow them to (by including a "library" that imports other libraries :-).

Perhaps you should look into formal languages and parser generators, so you can create more complex languages if you feel like it. Even if you build the parsers yourself, having the language specified, factored, and so on helps a lot. The dragon book is a good choice, although it presupposes you know a bit about specifying a formal language, IIRC. If you're a student (I know how it is!) then even the old dragon book is an excellent read, and it's very cheap.

u/BaconWraith · 2 pointsr/compsci

Cheers man! The Dragon Book is a great place to start, and there's always this, but mainly it's about facing each problem as you come to it and hoping for the best :P

u/vineetk · 2 pointsr/programming

Looks like it can be had for about $7 on amazon, including shipping. Surely you can scrounge up that much.

u/phao · 2 pointsr/java

Well, some books can help:

  • There is the design patterns book (gang of four) =&gt; http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/
  • Another book on patterns, but targetting enterprise applications (i.e. information systems) =&gt; http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

    These books on patterns tend to be good at teaching a lot of what you're asking, largely because you've named some patterns in your question, but also because many patterns are about:

  • Identifying the need for something X to be changed without affecting another thing Y with which X is coupled; and
  • separating X from Y in a way that allows X to change independently from Y.

    There are several ways to say what I just did in there. You're allowing X to vary independently from Y. This makes X a parameter of Y, which is yet another way to say it. You're separating what is likely to change often (X) from what doesn't need to be affected by that change (Y).

    One benefit is that a reason to change X now doesn't affect Y, because X can be changed independently of Y. Another is that X can largely be understood without looking at Y. This is a core guiding rule in the separation of concerns principle: the concern X is separated from the concern Y. Now, a lot of activities you want to do with X can be performed independently of Y.

    You probably know all of this, so I'm sorry if this isn't much help. But just to finish, a classic example of this is a sorting function (the Y) and the comparison criteria (the X). Many people, in many projects, would like a change in the comparison criteria not to force a change in the sorting function. They're two separate concerns we'd like to deal with separately. Therefore, the comparison criteria, as commonly done today, is a parameter of sorting. In this case, the word "parameter" is being used both in the sense of a function parameter in the source code, and also in the more general sense of something being a parameter of something else, in which case something can be one of many, and may change over time.
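In C++ terms, a minimal version of that classic example looks like this (std::sort is the Y that never changes; each comparator is a new X):

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> v{"pear", "fig", "banana"};

    // Criterion 1: the default, alphabetical order.
    std::sort(v.begin(), v.end());

    // Criterion 2: order by length; the sorting function itself is untouched.
    std::sort(v.begin(), v.end(),
              [](const std::string& a, const std::string& b) {
                  return a.size() < b.size();
              });

    for (const auto& s : v) std::cout << s << ' ';
    std::cout << '\n';   // prints: fig pear banana
}
```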
u/materialdesigner · 2 pointsr/webdev

Patterns of Enterprise Application Architecture sounds like a must read for you.

u/guifroes · 2 pointsr/learnprogramming

Interesting!

It looks to me like you can "feel" what good code looks like, but you're not able to rationalise it enough to write it on your own.

Couple of suggestions:

When you see elegant code, ask yourself: why is it elegant? Is it because it's simple? Easy to understand? Try to recognise the desired attributes so you can try to reproduce them in your code.

Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.
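A tiny before/after sketch of that principle (names invented for the example): imagine one ReportManager class that both formats a report and writes it to disk; splitting those two responsibilities might look like this:

```cpp
#include <fstream>
#include <string>
#include <vector>

class ReportFormatter {                 // one job: turn data into text
public:
    std::string format(const std::vector<double>& data) const {
        std::string out;
        for (double d : data) out += std::to_string(d) + "\n";
        return out;
    }
};

class ReportWriter {                    // one job: persistence
public:
    void write(const std::string& path, const std::string& report) const {
        std::ofstream(path) << report;
    }
};

int main() {
    ReportFormatter fmt;
    ReportWriter out;
    out.write("report.txt", fmt.format({1.0, 2.5}));  // composed at the edge
}
```

Now a formatting change and a storage change each touch exactly one class.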

How familiar are you with unit testing and TDD? It should help you a lot to write better designed code.

Some other resources:

u/SlobberGoat · 2 pointsr/java
u/st4rdr0id · 2 pointsr/androiddev

Hey, that is the million-dollar question. But because software isn't a true engineering discipline, there is actually no definitive reference book on software architecture. Certainly there are books talking about this, but they usually cover only some aspects and lack real application examples.

Notice that in iOS programming the system imposes a great part of the architecture, so those developers are usually less concerned. But in Android we have more freedom, and the API actually encourages really bad practices (thanks, Google). Because of this we are all a bit lost. Nowadays layered architecture and MVP seem to be the most popular approach, but then again everybody produces a different implementation...

Specifically for Clean Architecture you should read its author, Robert C. Martin. AFAIK this is not covered in detail in his books. You can read this blog post and watch this video. Other designs usually coming up in conferences are the Onion Architecture and the Hexagonal Architecture. But make no mistake: there's no route map on how to implement any of those, and examples claiming to follow this or that approach are usually not written by the authors of the architecture.


For DDD there is a very good book by Scott Millett with actual examples. But this style is meant for large enterprise backend apps, and the author himself advises against using it in small apps. So I'd say it is overkill for Android, but of course you could reuse some concepts successfully.


There's also Software Architecture in Practice, 3rd edition, but having read the 2nd edition I can tell you it is just smoke.


Probably the best book to date is Fowler's, but it is more a patterns compilation than an architecture guide.

u/vinnyvicious · 2 pointsr/gamedev

Have you ever heard of the open/closed principle? Or the single responsibility principle? Or the Liskov substitution principle? All three are being violated. That drastically reduces the maintainability and extensibility of your code. I can't swap serializers easily, I can't tweak or extend them without touching that huge class, and it's definitely not the responsibility of that class to know how to serialize A, B, C, D, E and the whole alphabet.
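For illustration, the open/closed fix being hinted at usually looks something like this sketch (all names hypothetical, not the poster's actual code):

```cpp
#include <iostream>
#include <string>

struct Document { std::string title; };

struct Serializer {                         // closed for modification
    virtual std::string serialize(const Document&) const = 0;
    virtual ~Serializer() = default;
};

struct JsonSerializer : Serializer {        // one extension...
    std::string serialize(const Document& d) const override {
        return "{\"title\": \"" + d.title + "\"}";
    }
};

struct XmlSerializer : Serializer {         // ...and another, added without
    std::string serialize(const Document& d) const override { // editing existing code
        return "<doc title=\"" + d.title + "\"/>";
    }
};

void save(const Document& d, const Serializer& s) {  // swappable at the call site
    std::cout << s.serialize(d) << "\n";
}

int main() {
    Document d{"hello"};
    save(d, JsonSerializer{});
    save(d, XmlSerializer{});
}
```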

I highly recommend some literature on the subject if you're curious about it, it would drastically improve your approach to software architecture:

https://www.amazon.com/dp/0132350882

https://www.amazon.com/dp/0201485672

https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

http://cc2e.com/

https://www.amazon.com/dp/0321127420

u/vladmihalceacom · 2 pointsr/java

&gt; Yes, the Native Query and access to Connection is always THE Hibernate's answer to all the lacking support of basic SQL features like Window Functions or being able to count aggregated results.

That's a very common misconception. Hibernate is not a replacement for SQL. It's an alternative to the JDBC API that implements the enterprise patterns described by Martin Fowler in his book.

There are many alternatives to JPA or Hibernate. In fact, I'm also using jOOQ, and I like it a lot. I wrote about it. I'm using it in my training and workshops as well.

There are things you can do in jOOQ that you can't do with Hibernate, and there are also things you can do with Hibernate that you can't do with jOOQ.

u/LXXXVI · 2 pointsr/learnprogramming

Thanks, I'm sure you will. It's just a question of getting that first success. Afterwards, it gets much easier, once you can point at a company and say "Their customers are using my code every day."

As for the interviews, I don't know, I'm honestly not the type to get nervous at interviews, either because I know my skill level is most likely too low and I take it as a learning experience, or because I know I can do it. I'd say that you should always write down all the interview questions you couldn't answer properly and afterwards google them extensively.

Besides, if you're from the US, you have a virtually unlimited pool of jobs to interview for. I live in a tiny European country that has 2 million people and probably somewhere in the range of 20 actual IT companies, so I had to be careful not to exhaust the pool too soon.

Funnily enough, right now, my CTO would kill for another even halfway competent nodejs developer with potential, but we literally can't find anyone.

Anyway, I'm nowhere near senior level, but I can already tell you that the architecture-over-language emphasis is something your bootcamp got right. To that I would add a book my CTO gave me to read (I'm not finished yet myself, but it is a great book) - Patterns of Enterprise Application Architecture. Give it a look. I suspect, without ever having tried to implement a piece of architecture like that, it won't make much sense beyond the theoretical, but I promise you, it's worth its weight in gold once you start building something more complex and have to decide how to actually do it.

u/slowfly1st · 2 pointsr/learnprogramming

Your foes are kids in their twenties with a degree that takes years to achieve; this will be tough! But I think your age and your willingness to learn will help you a lot.


Other things to learn:

  • JDK - you should be at least aware of what APIs the JDK provides; better, have used them (https://docs.oracle.com/javase/8/docs/). I think (personal preference / experience) these are the minimum: JDBC, Serialization, Security, Date and Time, I/O, Networking, (Internationalization - I'm from a country with more than one official language), Math, Collections, Concurrency.
  • DBMS: How to create databases and how to access them via JDBC. (I like PostgreSQL.) Learn SQL.
  • Learn how to use an ORM Mapper. (I like jOOQ, I dislike JPA/hibernate)
  • Requirements engineering. I think without someone who has the requirements you can't really practice it, but the theory should be present. It's an essential part of software development: get the customer's requirements and bring them to paper. Bad RE can lead to tears.
  • Writing Unit Tests / TDD. Having working code means the work is 50% done - book recommendation: Growing Object-Oriented Software, Guided by Tests
  • CI/CD (Continuous Integration / Delivery) - book recommendation: Continuous Delivery.
  • Read Clean Code (mandatory!)
  • Read Design Patterns (also mandatory!)
  • (Read Patterns of Enterprise Application Architecture (bit outdated, I think it's probably a thing you should read later, but I still love it!))
  • Get familiar with a build tool, such as maven or gradle.


    If there's one framework to look at, it would be Spring: spring.io provides dozens of frameworks, for web services, backends, websites, and so on, but mainly their core technology for dependency injection.


    (edit: other important things)
u/ladywanking · 2 pointsr/learnprogramming

You are asking a good question.

Wouldn't doing a separate class for each of the use cases you described be repeating yourself?

I would read about DTO and see how it goes.
A good book about this is this.

u/cderwin15 · 2 pointsr/compsci

For what it's worth, I'm in the midst of working through the dragon book and would highly recommend it. Unfortunately I don't know of any online courses you could take for credit.

u/0xf3e · 2 pointsr/programming

For anyone who wants to learn more about compilers and loves reading books, the so-called dragon book is highly recommended reading on this topic: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/echelonIV · 2 pointsr/gamedev

I ordered these for our company library, based on recommendations for/from other programmers (of all levels).

ISBN | Title
---|---
978-1568814247 | Real-time Rendering
0321486811 | Compilers: Principles, Techniques, and Tools (2nd Edition)
1482250926 or 0123742978 | Essential Mathematics for Games and Interactive Applications, Third Edition
978-1482264616 | GPU Pro 6: Advanced Rendering Techniques
1466560010 | Game Engine Architecture, Second Edition
978-1482243567 | Multithreading for Visual Effects
978-0123750792 | Physically Based Rendering: From Theory To Implementation

u/Yulfy · 2 pointsr/AskProgramming

If you mean writing an interpreter or compiler, then yes. The iconic book for learning how to build languages is called Compilers, Principles, Techniques and Tools. It's often referred to as 'The Dragon Book'. It's pretty heavy reading but contains everything you need to know about building a language!

If you're looking for something more implementation driven, I recently read a book about building a programming language in Go. The principles are the same, just with a different language. The book was called Writing an Interpreter in Go. It's a much lighter read and details the construction of an interpreter from scratch!

u/cantstopthemoonlight · 2 pointsr/learnprogramming

Compilers-Principles-Techniques-Tools is considered THE definitive book on the subject. It's old, but in a fine wine kind of way.

u/WannabeDijkstra · 2 pointsr/linux

The Design and Implementation of the FreeBSD Operating System by Marshall Kirk McKusick

Though it uses FreeBSD-specific details, its breadth and level of detail are really high, with most of the concepts being applicable to nearly all Unix-likes, and much of it to operating systems in general.

The second edition came out just a few days ago, which I link to.

u/InfinitelyManic · 2 pointsr/openbsd
u/melp · 2 pointsr/zfs

That is not true. Sync writes go to memory (in a transaction group) and the SLOG at the same time.

The only reference I know of for this is in this book: https://www.amazon.com/dp/0321968972/

Section 10.4, logging. You can see it in the Amazon preview, starting on page 538.

u/chalbersma · 2 pointsr/linux
u/spamky23 · 2 pointsr/iamverysmart
u/dishayu · 2 pointsr/NoStupidQuestions

I also recommend the book Thing Explainer, which uses the 1000 most common words to explain complicated things. It has a lot of simple, funny pictures and is fun to read overall (the tone is not serious at all). I'd be happy to ship you a copy if you like. Just PM me your address.

u/RockaDelicato · 2 pointsr/AskReddit

Everything can be explained in layman's terms. If you can't, you haven't fully understood the topic. It might take a bit longer, though. 😅

Not convinced? Check out the book "Thing Explainer" to get an idea of how to do it (:

Thing Explainer: Complicated Stuff in Simple Words https://www.amazon.de/dp/0544668251/ref=cm_sw_r_cp_apa_i_31QDCbXQ7V9FS

u/shittyNaturalist · 2 pointsr/engineering

I really would recommend Randall Munroe's Thing Explainer. When I started doing propulsion work, I actually used it as a reference because it's easy to reference and it gives a pretty strong foundation on a number of things at a very accessible level. As u/zaures mentioned, The Way Things Work (any edition) is excellent and in much the same vein.

u/dkuhry · 2 pointsr/ThingsCutInHalfPorn

I was going for an XKCD / Thing Explainer reference. The author explains technical blueprints using only the 10-hundred (1000) most commonly used English words.

u/FallingStar7669 · 2 pointsr/KerbalSpaceProgram

Science mode limits the available parts until you do the science to unlock more, without having to deal with restrictions like funding. You're almost literally forced to start simple, which is very useful given the steep curve of this game.

I'm no education expert, but I've been playing games since the NES came out. And from what I've seen of this coming generation, they're pretty sharp, even if their reading skills are limited. Don't expect a 4-year-old to understand delta-v, but fully expect them, after a few weeks of play, to not need to worry about it. If they can survive the steep learning curve, they'll know which engine they want by the picture (most of us do anyway) and they'll know what it does because they tried it and saw for themselves. It might be useful at the very least to explain "this one makes you go fast but uses up all your fuel, this one makes you go slow but uses less fuel" and stuff like that. Basically, talk to them as if you're quoting this book.

A child's mind is a very wondrous machine. If nothing else, trust that, if their interest is strong enough to overcome their failures, they will blow you away sooner than you could ever realize.

u/PM_me_about_jobs · 2 pointsr/engineering

I got this book for my niece for her 6th birthday.

u/StarOriole · 2 pointsr/CasualConversation

Books can be good. How about Thing Explainer?

u/killver · 2 pointsr/videos

If you are looking for a very easy-to-read introduction to how computers work, I can recommend the book "But How Do It Know?". Strange title, but the book is great. https://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765

u/5p458d28 · 2 pointsr/gpumining

Great post, wish I had looked into it myself.
I am again going to use Upgrading and Repairing PCs by Scott Mueller as a reference book.

I quote:
&gt; Peripheral Power Connectors
Perhaps the most common additional power connector seen on virtually all power supplies is the peripheral power connector, also called the disk drive power connector. What we know as the peripheral power connector was originally created by AMP as part of the commercial MATE-N-LOK series, although because it is also manufactured and sold by Molex, it is often incorrectly called a Molex connector.
To determine the location of pin 1, carefully look at the connector. It is usually embossed in the plastic connector body; however, it is often tiny and difficult to read. Fortunately, these connectors are keyed and therefore difficult to insert incorrectly. Figure 17.30 shows the keying with respect to pin numbers on the larger drive power connector.

This is Figure 17.30

&gt; this is the one connector type that has been on all PC power supplies from the original IBM PC to the latest systems built today. It is most commonly known as a disk drive connector, but it is also used in some systems to provide additional power to the motherboard, video card, cooling fans, or just about anything that can use +5V or +12V power.
A peripheral power connector is a 4-pin connector with round terminals spaced 0.200 inches apart, rated to carry up to 11 amps per pin. Because there is one +12V pin and one +5V pin (the other two are grounds), the maximum power-handling capability of the peripheral connector is 187 watts. The plug is 0.830 inches wide, making it suitable for larger drives and devices.

I think that the misconception about the power handling of a peripheral power connector is due to the 187W figure, which is the combined power handling of the +12V/11A pin and the +5V/11A pin.

If we look only at the +12V pin, then the top power is only 12V * 11A, which is 132W. Considering that a GPU can draw up to 75W from a riser, 2 risers could potentially consume 150W, which is more than the peripheral power connector's +12V pin is able to provide.

u/Reptilian_Overlords · 2 pointsr/sysadmin

&gt;But basically after that I have to decide soon whether or not to focus on a Cisco, or Microsoft track at my college.

Sounds like your "college" is a joke. You should be learning the fundamentals that are responsible for the underpinnings of these technologies, not vendor recommendations that can easily almost be called propaganda. Especially at your beginner level, you wouldn't even touch technologies as part of your responsibility at the level taught by an MCSE or CCNA unless you work for an absolute moron.

The world is larger than Cisco and Microsoft. I suggest you look for actual academic books on Networking and Server Architecture to learn more useful things.

Computer Networking: A Top-Down Approach (6th Edition) https://www.amazon.com/dp/0132856204/ref=cm_sw_r_cp_awd_4Ev3wbE0EVGDH

Understanding and Deploying LDAP Directory Services, 2nd Edition https://www.amazon.com/dp/0672323168/ref=cm_sw_r_cp_awd_KFv3wbW3QNAGF

For future tracks:

Databases:

SQL Queries for Mere Mortals: A Hands-On Guide to Data Manipulation in SQL (3rd Edition) https://www.amazon.com/dp/0321992474/ref=cm_sw_r_cp_awd_SGv3wbGCZ24FA

Fundamentals of Database Systems (7th Edition) https://www.amazon.com/dp/0133970779/ref=cm_sw_r_cp_awd_qHv3wb1YC95NS

Security:

Computer Security: Principles and Practice (3rd Edition) https://www.amazon.com/dp/0133773922/ref=cm_sw_r_cp_awd_ZHv3wb7J1YJKC

Blue Team Handbook: Incident Response Edition: A condensed field guide for the Cyber Security Incident Responder. https://www.amazon.com/dp/1500734756/ref=cm_sw_r_cp_awd_uIv3wbK1361D2

Hardware:

Upgrading and Repairing PCs (22nd Edition) https://www.amazon.com/dp/0789756102/ref=cm_sw_r_cp_awd_gJv3wbCKGA502

Problem Solving:

The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving https://www.amazon.com/dp/0812928083/ref=cm_sw_r_cp_awd_XKv3wbKQFJK6Q

Best of luck. I recommend learning Shell languages and the basics of shell navigation and data manipulation techniques for various operating systems as well.

u/TMITectonic · 2 pointsr/ECE

Yeah, back when I worked for a local PC Builder/Repair shop, I used to know Mueller's book cover to cover, which covered all hardware types and their histories. Nowadays, I'm more Networking, Security, and Programming-centric, so I'm getting fuzzier on hardware by the day!

u/bengineering101 · 2 pointsr/raspberry_pi

The problem with talking about "software" on the Pi is that there are so many different things you can do, so that question is too broad. e.g. do you want to learn Linux? Python? The various media-center or retro-gaming software options? You said you already know C++? etc. There isn't really a single tutorial for "Raspberry Pi software".

I'm more of a hardware guy, but in my opinion it's silly to use the Pi just to learn a programming language. You could learn Python on a regular PC. Instead, I'd look for a software-based project that the Pi is good for, and use that as motivation. That being said, it looks like there are resources specifically for learning Python with the Pi:

http://www.amazon.com/Learning-Python-Raspberry-Alex-Bradbury/dp/1118717058

u/trump_pushes_mongo · 2 pointsr/neoliberal

O'Reilly Publishing is a reputable source for programming in general. Here is an embedded systems book.

Edit; stupid formatting

u/adi123456789 · 2 pointsr/cpp

I'm an embedded software developer who used to use C and now primarily works with C++.

Learning C is relatively easier when you start off and gives you a better appreciation of memory handling and its complexities than C++ does, in my opinion. The C knowledge will also transfer well to C++.

C++ is definitely a much more powerful language and you can get your tasks done quicker with it. There are a lot of things to learn in C++, but you can get them with time. A lot of embedded processors, particularly the ARM based ones, support C++ as well, so that is not a problem

Like someone else mentioned though, embedded development relies on a good knowledge of programming as well as a good understanding of computer architecture.

Here's a nice book I've read which is useful for new embedded developers - Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_MuFhDb1WWXK3W

u/MrAureliusR · 2 pointsr/ElectricalEngineering

Okay, you're definitely at the beginning. I'll clarify a few things and then recommend some resources.

  1. Places to buy components: Depending on where you live in the world, the large component suppliers are almost always the way to go, with smaller suppliers like Adafruit/Sparkfun if you need development boards or specialised things. I buy almost exclusively from Digikey -- they have $8 flat shipping to Canada, which typically arrives the next day, with no customs fees. They have some sort of agreement in place where they cover these costs. This *always* saves money over going to my local stores where the prices are inflated. It's crazy how cheap some things are. If I need a few 2.2K 1206 resistors for a project, I just buy a reel of 1000 because they are so cheap.
  2. "Steer a joystick with an app" Do you mean connect motors to it and have them move the joystick for you? You're going to want some sort of microcontroller platform, along with a motor controller and way to communicate with a smartphone app. You mention you know C++ so it will be easy to switch to C. This is both true and false. Programming for microcontrollers is not the same as programming for computers. You are much closer to the hardware, typically manipulating many registers directly instead of abstracting it away. Each microcontroller vendor has their own tools and compilers, although *some* do support GCC or alternatives. You mentioned PIC, which is a line of microcontrollers by a large company called Microchip. There are 8-bit, 16-bit, and 32-bit PICs, all at different price points and with hugely differing capabilities. Selecting the microcontroller for a project can be half the battle sometimes. Or, like me, you can just go with whatever you have on hand (which is usually MSP430s or PIC32MX's)
  3. A lot of people will recommend the book The Art of Electronics. It's decent, but it's not for everyone. Some really like the conversational style, others don't. Many people who want to get into microcontroller programming and embedded development want to skip over the fundamentals and just get something working. For those, I point them to Arduino and let them on their merry way. However, if you actually want to learn something, I highly recommend buying an actual microcontroller development board, learning the fundamentals about electrical circuits, and programming in actual C with actual IDEs.
  4. As far as resources go, again it depends on your actual goal. Whenever I want to learn a new tool (like a PCB layout software, or a new IDE) I always start with a simple project. Having an end point to reach will keep you motivated when things seem complicated. Your goal of controlling a joystick with motors is a great starting point. I would buy a development board; Microchip PICs are popular, as are STM32s and MSP430s. It doesn't really matter that much in the long run. Just don't tie yourself too hard to one brand. Then pick up some stepper motors, and a stepper motor control board (grab one from Sparkfun/Adafruit, etc). Get yourself a breadboard, and some breadboard jumpers, a cheap power supply (there are tons available now for cheap that are pretty decent), and then jump in head first!
  5. I highly recommend the book Making Embedded Systems by Elecia White, once you've covered the basics. It's a great way to learn more about how professionals actually design things. For the basics, you can watch *EARLY* EEVBlog videos (anything past around video 600/650 he gets progressively more annoying and set in his ways, another topic entirely, but the early stuff is decent). I'd also recommend picking up your choice of books about the fundamentals -- Electronics for Dummies, the aforementioned Art of Electronics, Making Embedded Systems, The Art of Designing Embedded Systems, and even stuff like Design Patterns for Embedded Systems in C. Again, it all depends on what your goal is. If you want to do embedded design, then you'll need to focus on that. If you're more into analog circuits, then maybe check out The Art and Science of Analog Circuit Design. Either way, grounding yourself in the fundamentals will help a LOT later on. It will make reading schematics way easier.

    I feel like I've gone off on a few tangents, but just ask for clarification if you want. I'd be happy to point you towards other resources.
u/returnofthecityboy · 2 pointsr/programming

May I ask you the reverse question? I'd be interested what it takes to move from embedded to backend.

Embedded is a wide field. It kind of depends on whether you're developing firmware or the application layer for an embedded target. Principally it's not much different than programming normal C, but for a lot of targets there's no malloc (because it can have disastrous timing behavior) and you'll need to learn about hard real-time scheduling and working with peripherals.

If you work closely with the hardware guys it might be useful to know the basics of handling an oscilloscope.

For the basics I can recommend [this book here](https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149).

u/Yelneerg · 2 pointsr/embedded

Check out the embedded podcast and blog, and the book "Making Embedded Systems"
https://www.embedded.fm/

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149

u/Iwishiknewwhatiknew · 2 pointsr/embedded


Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_BbryCbHZDAV0T

u/postmodern · 1 pointr/programming
  • ActiveRecord is a pattern according to Patterns of Enterprise Application Architecture. It's definitely the easiest pattern to implement, and thus the most popular amongst ORMs. Therefore, the simple misconception that ORM == ActiveRecord.
  • Your own description of the DataMapper pattern would imply that DataMapper does in fact differ from ActiveRecord. :) ActiveRecord has no concept of mapping, nor separation between the schema representation and the model. ActiveRecord simply instantiates models from rows (see the sketch below).
  • Note: I am not quoting Martin Fowler as an appeal to authority, but simply because Martin Fowler wrote Patterns of Enterprise Application Architecture (PoEAA), in which the ActiveRecord and DataMapper patterns are explained. :) See also the Wikipedia entry for ActiveRecord, which lists Martin Fowler as the creator.
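A compact sketch of the contrast (invented names, persistence stubbed out with prints):

```cpp
#include <iostream>
#include <string>

// ActiveRecord style: the model knows how to persist itself.
struct UserAR {
    int id = 0;
    std::string name;
    void save() { std::cout << "UPDATE users SET ... WHERE id=" << id << "\n"; }
    static UserAR find(int id) { return {id, "row " + std::to_string(id)}; } // row -> model
};

// DataMapper style: the model is plain data; the mapper owns all persistence.
struct User {
    int id = 0;
    std::string name;           // no SQL knowledge at all
};

struct UserMapper {
    User find(int id) { return {id, "row " + std::to_string(id)}; }
    void insert(const User& u) { std::cout << "INSERT INTO users ... " << u.name << "\n"; }
};

int main() {
    UserAR a = UserAR::find(1);   // persistence fused into the model
    a.save();
    UserMapper m;                 // persistence isolated behind a mapper
    User u = m.find(2);
    m.insert(u);
}
```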
u/cythrawll · 1 pointr/PHP

Honestly, I haven't read a "PHP book" in ages, so I am a very bad source for critiquing; the majority of the ones I have come across are painfully outdated and some outright inaccurate. My suggestion for learning PHP programming better is to try books that have to do with programming in general. Books on object orientation and patterns, like GOF http://en.wikipedia.org/wiki/Design_Patterns or PoEAA http://www.amazon.com/Enterprise-Application-Architecture-Addison-Wesley-Signature/dp/0321127420, are great for learning object-oriented principles.

But those will only help somewhat. What really helped me become a better PHP programmer was to study other languages, then study their web frameworks, then take what I learned back to PHP. Find out why one aspect is pretty common in language X's frameworks but not PHP frameworks. How do other languages' frameworks solve the dependency issue, etc.? Some languages I suggest learning are actually other ones that are pretty mainstream in web programming: Python for its typing and how it compares to PHP. Ruby for how mixins relate to PHP traits. And Java is great, as there are quite a few aspects that PHP stole off Java in its OO design.

u/ndanger · 1 pointr/compsci

Upvote for Domain-Driven Design, it's a great book. Depending on the size of the system, Martin Fowler's PoEAA might also be helpful.

Also what dethswatch said: what's the audience & scope; i.e., what's in the previous document? If you're presenting three architectures you probably need enough detail that people can choose between them. That means knowing how well each will address the goals, some estimate of implementation effort (time & cost), limitations, future-proofing, etc.

Finally, IMHO, this really isn't computer science. You might have better luck asking in /r/programming/ or the new r/SWArchitecture/

u/mrferos · 1 pointr/PHP

When you get to a project larger than a small business or personal website, it's less about the language and more about imposing a structure that makes sense, backed by a data model that's thought out and performant.

I recommend these books:

http://www.amazon.com/High-Performance-MySQL-Optimization-Replication/dp/1449314287/ref=sr_1_1?ie=UTF8&qid=1425831227&sr=8-1&keywords=high+performance+mysql
http://www.amazon.com/gp/product/0321127420/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1

u/jasonlotito · 1 pointr/PHP

First, ignore the code examples on the page. They are fairly bad. The reason you don't grok OOP is because of examples like these. What you want is a good solid book.

Patterns of Enterprise Application Architecture

PoEAA sounds imposing, but it's really good. I highly recommend it.

I also recently wrote up an article on Data Mapping that you might be interested in reading. If you have any questions, you can ask. Basically, OOP finally clicked for me when I was reading an article on observers and how that pattern works; I started to get it from there. I'm still learning. I'm only now getting over my need for getters and setters (you don't need them, they are bad, stop using them).
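
To make the getter/setter point concrete, here's a minimal sketch (the Account example is mine, not from the article):

```python
# Getter/setter style: callers pull state out and apply the rules themselves,
# so every caller must remember to check the balance.
class AccountData:
    def __init__(self, balance):
        self._balance = balance

    def get_balance(self):
        return self._balance

    def set_balance(self, balance):
        self._balance = balance

# "Tell, don't ask" style: the object owns its invariant, so callers
# simply tell it what to do and cannot bypass the rule.
class Account:
    def __init__(self, balance):
        self._balance = balance

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

acct = Account(100)
acct.withdraw(30)     # fine
# acct.withdraw(500)  # would raise instead of silently corrupting state
```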

But yeah, the reason most OOP articles are bad is because most people using objects are merely using them as namespaces for functions and variables. If you have any questions, let me know.

u/pjmlp · 1 pointr/programming

> Those are interpreted (JIT - just in time compiled)

You are contradicting yourself.

Plus there are also native compilers for Java and .NET languages.

Just a few examples:

  • OS/400 compiles to native code at installation time

  • Android ART compiles to native code at installation time

  • Websphere Real Time JVM has an AOT compiler

  • Aonix JVM has an AOT compiler

  • Mono -aot, NGEN and .NET Native are all compilers generating native code

Given that C and C++:

  • have the CINT interpreter

  • on OS/400 compile to bytecode

  • on Windows CE could be compiled to bytecode

  • were compiled to bytecode with the TenDRA compilers

does that make C and C++ interpreted?

Of course not, because as any compiler design student knows, language != implementation.

Bytecodes are just a form of architecture-independent machine instructions; by no means do they require interpretation rather than further compilation to native code.

Since the Summer is still not over, here is some reading:

Compilers: Principles, Techniques, and Tools (2nd Edition)
u/OhYourFuckingGod · 1 pointr/funny

She's trying to hide her disappointment. I'm pretty sure this is the one she wanted.

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my bookshelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Computer Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/cmtedouglas · 1 pointr/learnprogramming

Well, the common resource out there that I can't avoid recommending is the Dragon Book:

http://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/johannes1971 · 1 pointr/cpp

European dragons have a heritage that stretches back to at least the time of the Greek civilisation; calling them "pale imitations" does them a grave disservice. The oldest known sources for dragon myths are from the Middle East, not the Far East.

If you feel like arguing this point, please don't use the Dresden Files. Just stick with authoritative sources instead, ok?

u/blahdom · 1 pointr/learnpython

No problem. Good luck finding a class; compilers are a really fun subject!

If you are just generally interested (I don't know your experience level), the Dragon Book is still highly regarded, and it might be a good entryway into the theory of it all.

u/hou32hou · 1 pointr/ProgrammingLanguages

I suggest you to read the Dragon Book.

u/dnew · 1 pointr/worldnews

> Is this a realistic goal

Yes, quite. The bits you are going to be missing are some of the mathematical underpinnings. Depending on what you're programming, you'll also want to grab books on the particular topic at hand that don't try to teach you programming at the same time.

For example, if you want to learn why C# is object-oriented and what that means and how to use it, grab a copy of this book: http://en.wikipedia.org/wiki/Object-Oriented_Software_Construction

If you want to learn how relational databases work, read this one http://www.amazon.com/Introduction-Database-Systems-8th-Edition/dp/0321197844 (You can easily find online versions, but I didn't investigate whether they were legally released or not.)

You want to write a compiler? Grab the "dragon book": http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

None of those teach you how to program. They teach you the math and background behind major inventions in programming. Keep up with those, find a local mentor who enjoys talking about this stuff, and you'll do fine.

u/empleadoEstatalBot · 1 pointr/argentina

> It’s hard to consolidate databases theory without writing a good amount of code. CS 186 students add features to Spark, which is a reasonable project, but we suggest just writing a simple relational database management system from scratch. It will not be feature rich, of course, but even writing the most rudimentary version of every aspect of a typical RDBMS will be illuminating.
>
> Finally, data modeling is a neglected and poorly taught aspect of working with databases. Our suggested book on the topic is Data and Reality: A Timeless Perspective on Perceiving and Managing Information in Our Imprecise World.
>
> ### Languages and Compilers
>
> Most programmers learn languages, whereas most computer scientists learn about languages. This gives the computer scientist a distinct advantage over the programmer, even in the domain of programming! Their knowledge generalizes; they are able to understand the operation of a new language more deeply and quickly than those who have merely learnt specific languages.
>
> The canonical introductory text is Compilers: Principles, Techniques & Tools, commonly called “the Dragon Book”. Unfortunately, it’s not designed for self-study, but rather for instructors to pick out 1-2 semesters worth of topics for their courses. It’s almost essential then, that you cherrypick the topics, ideally with the help of a mentor.
>
> If you choose to use the Dragon Book for self-study, we recommend following a video lecture series for structure, then dipping into the Dragon Book as needed for more depth. Our recommended online course is Alex Aiken’s, available from Stanford’s MOOC platform Lagunita.
>
> As a potential alternative to the Dragon Book we suggest Language Implementation Patterns by Terence Parr. It is written more directly for the practicing software engineer who intends to work on small language projects like DSLs, which may make it more practical for your purposes. Of course, it sacrifices some valuable theory to do so.
>
> For project work, we suggest writing a compiler either for a simple teaching language like COOL, or for a subset of a language that interests you. Those who find such a project daunting could start with Make a Lisp, which steps you through the project.
>
> [Compilers: Principles, Techniques & Tools](https://teachyourselfcs.com//dragon.jpg) [Language Implementation Patterns](https://teachyourselfcs.com//parr.jpg)
>
> > Don’t be a boilerplate programmer. Instead, build tools for users and other programmers. Take historical note of textile and steel industries: do you want to build machines and tools, or do you want to operate those machines?
> >
> > — Ras Bodik at the start of his compilers course
>
> ### Distributed Systems
>
> As computers have increased in number, they have also spread. Whereas businesses would previously purchase larger and larger mainframes, it’s typical now for even very small applications to run across multiple machines. Distributed systems is the study of how to reason about the tradeoffs involved in doing so, an increasingly important skill.
>
> Our suggested textbook for self-study is Maarten van Steen and Andrew Tanenbaum’s Distributed Systems, 3rd Edition. It’s a great improvement over the previous edition, and is available for free online thanks to the generosity of its authors. Given that distributed systems is a rapidly changing field, no textbook will serve as a trail guide, but Maarten van Steen’s is the best overview we’ve seen of well-established foundations.
>
> A good course for which some videos are online is MIT’s 6.824 (a graduate course), but unfortunately the audio quality in the recordings is poor, and it’s not clear if the recordings were authorized.
>
> No matter the choice of textbook or other secondary resources, study of distributed systems absolutely mandates reading papers. A good list is here, and we would highly encourage attending your local Papers We Love chapter.
>
> [Distributed Systems 3rd edition](https://teachyourselfcs.com//distsys.png)
>
> ## Frequently asked questions
>
> #### What about AI/graphics/pet-topic-X?
>
> We’ve tried to limit our list to computer science topics that we feel every practicing software engineer should know, irrespective of specialty or industry. With this foundation, you’ll be in a much better position to pick up textbooks or papers and learn the core concepts without much guidance. Here are our suggested starting points for a couple of common “electives”:
>
> - For artificial intelligence: do Berkeley’s intro to AI course by watching the videos and completing the excellent Pacman projects. As a textbook, use Russell and Norvig’s Artificial Intelligence: A Modern Approach.
> - For machine learning: do Andrew Ng’s Coursera course. Be patient, and make sure you understand the fundamentals before racing off to shiny new topics like deep learning.
> - For computer graphics: work through Berkeley’s CS 184 material, and use Computer Graphics: Principles and Practice as a textbook.
>
> #### How strict is the suggested sequencing?
>
> Realistically, all of these subjects have a significant amount of overlap, and refer to one another cyclically. Take for instance the relationship between discrete math and algorithms: learning math first would help you analyze and understand your algorithms in greater depth, but learning algorithms first would provide greater motivation and context for discrete math. Ideally, you’d revisit both of these topics many times throughout your career.
>
> As such, our suggested sequencing is mostly there to help you just get started… if you have a compelling reason to prefer a different sequence, then go for it. The most significant “pre-requisites” in our opinion are: computer architecture before operating systems or databases, and networking and operating systems before distributed systems.
>
> #### Who is the target audience for this guide?
>
> We have in mind that you are a self-taught software engineer, bootcamp grad or precocious high school student, or a college student looking to supplement your formal education with some self-study. The question of when to embark upon this journey is an entirely personal one, but most people tend to benefit from having some professional experience before diving too deep into CS theory. For instance, we notice that students love learning about database systems if they have already worked with databases professionally, or about computer networking if they’ve worked on a web project or two.
>
> #### How does this compare to Open Source Society or freeCodeCamp curricula?
>
> The OSS guide has too many subjects, suggests inferior resources for many of them, and provides no rationale or guidance around why or what aspects of particular courses are valuable. We strove to limit our list of courses to those which you really should know as a software engineer, irrespective of your specialty, and to help you understand why each course is included.
>
> freeCodeCamp is focused mostly on programming, not computer science. For why you might want to learn computer science, see above.
>
> #### What about language X?
>
> Learning a particular programming language is on a totally different plane to learning about an area of computer science — learning a language is much easier and much less valuable. If you already know a couple of languages, we strongly suggest simply following our guide and fitting language acquisition in the gaps, or leaving it for afterwards. If you’ve learned programming well (such as through Structure and Interpretation of Computer Programs), and especially if you have learned compilers, it should take you little more than a weekend to learn the essentials of a new language.
>
> #### What about trendy technology X?

> (continues in next comment)

u/balefrost · 1 pointr/AskProgramming

OK, a few things:

It looks like you're trying to build a shift/reduce parser, which is a form of an LR parser, for your language. LR parsers try to reduce symbols into more abstract terms as soon as possible. To do this, an LR parser "remembers" all the possible reductions that it's pursuing, and as soon as it sees the input symbols that correspond to a specific reduction, it will perform that reduction. This is called "handle finding".

> If I am correct, my Automaton is a DFA?

When the parser is pursuing a reduction, it's looking for sequences of symbols that match the right-hand sides of the relevant (to our current parse state) productions in our grammar. Since the right-hand sides of all the productions in a grammar are simple sequences, all the handle finding work can be done by a DFA. Yes, the handle recognizer of your parser is a DFA. But keep in mind that it needs to be combined with other parts to make a full parser, and your actual grammar can't be recognized with just a DFA.

In particular, you've shown the ACTION table for a shift/reduce parser. It determines what to do when you encounter a symbol in the input stream. But a shift/reduce parser typically needs a second table as well - the GOTO table - that determines what to do after a reduction has taken place.

One other thing that's worth mentioning: you've expressed your ACTION table as a plain DFA transition table. That's not necessarily wrong, but it's not commonly done that way. Instead of reducing when you reach a certain state, it's common to instead attach an action - 'shift', 'reduce', or 'accept' - to each transition itself. So in a shift/reduce parser, your table might look more like this:

    | [   | ]   | <   | >   | id  | /   | attr
----+-----+-----+-----+-----+-----+-----+------
  0 | S1  |     | S4  |     |     |     |
  1 |     |     |     |     | S2  |     |        R3  : Reduce Tag -> [ id ]
  2 |     | R3  |     |     |     |     |        R7  : Reduce Tag -> < id ??? / >
  4 |     |     |     |     | S5  | S10 |        R9  : Reduce Tag -> < id ??? >
  5 |     |     |     | R9  |     | S6  | S8     R12 : Reduce Tag -> < / id >
  6 |     |     |     | R7  |     |     |
  8 |     |     |     | R9  |     | S6  | S8
 10 |     |     |     |     | S11 |     |
 11 |     |     |     | R12 |     |     |

Note that R7 and R9 aren't well-formed, since multiple sequences of input tokens might cause you to reach these actions. While it would be possible to construct a shift/reduce parser this way, it's not commonly done. Typically, the DFA to recognize handles is an acyclic graph, but yours has a self-transition in state 8.

> What would be the best way of implementing this automaton in C++? Do I really have to make a huge array?

In general, yes, you need a big array (or, as suggested before, two big arrays). But you can use any space-saving technique you want. For example, since most entries in the ACTION table are invalid, one could represent that data with a sparse array data structure. Also, both The Dragon Book and Cooper and Torczon briefly cover parser-specific ways to compress those tables. For example, notice that rows 5 and 8 in your example have the same entries. Most real grammars have multiple instances of identical rows, so factoring out this commonality can save enough space that the extra complexity is worth it.
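
To make the shapes concrete, here's a minimal sketch of a table-driven shift/reduce recognizer in Python. It uses sparse dicts for ACTION and GOTO, and a toy grammar (S -> ( S ) | x) rather than the tag grammar above, so the tables are small enough to check by hand:

```python
# LR tables for the toy grammar  S -> ( S ) | x  (augmented with S' -> S).
# ACTION maps (state, lookahead) to ("shift", state), ("reduce", lhs, rhs_len),
# or ("accept",); GOTO maps (state, nonterminal) to a state after a reduce.
ACTION = {
    (0, "("): ("shift", 2), (0, "x"): ("shift", 3),
    (1, "$"): ("accept",),
    (2, "("): ("shift", 2), (2, "x"): ("shift", 3),
    (3, ")"): ("reduce", "S", 1), (3, "$"): ("reduce", "S", 1),
    (4, ")"): ("shift", 5),
    (5, ")"): ("reduce", "S", 3), (5, "$"): ("reduce", "S", 3),
}
GOTO = {(0, "S"): 1, (2, "S"): 4}

def parse(text):
    tokens = list(text) + ["$"]
    stack = [0]                  # a stack of states is enough to recognize
    pos = 0
    while True:
        action = ACTION.get((stack[-1], tokens[pos]))
        if action is None:
            return False         # no table entry: syntax error
        if action[0] == "accept":
            return True
        if action[0] == "shift":
            stack.append(action[1])
            pos += 1
        else:                    # reduce: pop |rhs| states, then consult GOTO
            _, lhs, rhs_len = action
            del stack[-rhs_len:]
            stack.append(GOTO[(stack[-1], lhs)])

print(parse("((x))"))  # True
print(parse("((x)"))   # False
```

The sparse dict layout also makes the row-sharing compression trick easy to picture: two states with identical rows can simply share one row object.
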

---

I'm a little surprised that you're building a parser like this by hand, though. Typically people do one of two things:

  1. Build, by hand, a modified LL(1) recursive descent parser (or a variant, like a packrat parser)
  2. Build, using a tool like YACC or Bison, an LR(1) shift/reduce parser

You're sort of doing a mix of the two, which means you have the downsides of both approaches. You need to track all the states and transitions by hand, instead of relying on tools to automate that process, yet you don't get the flexibility of a hand-coded recursive descent parser.

If you're doing this for education's sake, then by all means proceed. I'd highly encourage you to pick up a book on parsing; I think Cooper and Torczon is a great source. But if you just want a parser that works, I'd definitely recommend using a tool or a more direct approach, like recursive descent.
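
For contrast, a minimal sketch of option 1 for the same toy grammar (S -> ( S ) | x): each nonterminal becomes a function, and the state tables dissolve into ordinary control flow:

```python
# Hand-written recursive-descent recognizer for  S -> ( S ) | x.
def parse(tokens):
    pos = 0

    def expect(tok):
        nonlocal pos
        if pos < len(tokens) and tokens[pos] == tok:
            pos += 1
            return True
        return False

    def parse_s():
        # FIRST sets are disjoint, so one token of lookahead decides the rule.
        if expect("x"):
            return True
        return expect("(") and parse_s() and expect(")")

    return parse_s() and pos == len(tokens)

print(parse("((x))"))  # True
print(parse("(x"))     # False
```
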
u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and a unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you are obviously successful at running and maintaining unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low-Level Computer Function

Knowing how a computer actually works, from electricity and EE principles through assembly to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/Yunath_ · 1 pointr/uwaterloo

LOL it seems interesting to me. I'm reading https://www.amazon.ca/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=dp_ob_title_bk right now.


Maybe it's good in theory, and not in practice.

u/a4qbfb · 1 pointr/C_Programming

>> You are confusing sections with segments

> and you're behaving the way C++ usually behave

I am not a C++ anything. I have 25 years of experience with C and 20 years of experience with operating system design and development. The odds are better than even that both the computer you used to post this abusive rant and the computers which reddit use to store and serve it run code that I have written.

(Yes, I've been writing C++ on and off for almost as long as I've been writing C, but I vastly prefer the latter.)

> code section or code segment or text section or test segment are same in this context

Absolutely not. Segments are a feature of some computer architectures, most prominently the Intel 8086 and 80386 (but not their 64-bit descendants), used to organize and access code and data in memory. Sections are a feature of most executable file formats (such as ELF and COFF) used on modern non-embedded platforms. The OS and/or the run-time linker read code and data from sections in the executable file and store them somewhere in memory.

> simple googling gives you the result: https://stackoverflow.com/questions/2589949/string-literals-where-do-they-go

Stack Overflow is not a reliable source of *correct* information (see for instance this article, or this one about how SO's karma system encourages a race to the bottom). I would suggest reading Tanenbaum or McKusick et al. instead.

This specific answer is correct only in the sense that the literals are included in the same file as the code. Setting aside embedded platforms and their idiosyncrasies, they are stored in different sections of the file and loaded into different memory locations at run-time.

> Where do they go? the same segment where the .text section of the object file gets dumped, which has Read and Exec permissions, but not Write

Your information is 20 to 30 years out of date. No modern OS uses segments the way Intel envisioned when they designed the 8086 or the 80386. The former needed segments to escape pointer size limitations when the address bus outgrew the registers, and the latter retained them for backward compatibility and added memory protection features on a per-segment basis. However, modern OSes for the 80386 and its descendants implement memory protection at the page table level rather than at the segment level and use segment registers purely as conveniences for array operations, which is why they were left out when the 80386 architecture was expanded to 64 bits.

> But go ahead c++ pleb continue **** your *** like your kind always does.

I understand that it can be hard to admit that you are wrong, but that's not a reason to resort to that kind of language.

u/jeremiahs_bullfrog · 1 pointr/linux

Well, it is copyrighted by Kirk McKusick, who was a core contributor in the early days of FreeBSD, and he has a restriction that it only be used tastefully (so there's some subjectivity to it). I don't know if he still works on it as a core contributor, but he did recently release v2 of The Design and Implementation of the FreeBSD Operating System, so he's involved in some capacity.

I'm not sure of the legal restrictions on the new FreeBSD logo, Beastie, or Tux, so you may very well be right that they don't need to be defended.

u/powerclaw1 · 1 pointr/mildlyinteresting

I should point out that this (I'm pretty sure, at least) comes from Randall Munroe's book Thing Explainer, which uses the xkcd art style etc. to explain complicated concepts using the 1,000 most common English words. It's pretty great, check it out if you can!

u/CricketPinata · 1 pointr/milliondollarextreme

If you want to just know buzzwords to throw around, spend a bunch of time clicking around on Wikipedia and watch stuff like Crash Course on YouTube. It's easy to absorb, and you'll learn stuff; even if it's biased, at least you'll be learning.

If you want to become SMARTER, one of my biggest pieces of advice is to either carry a notebook with you, or find a good note-taking app you like on your phone. When someone makes a statement you don't understand, write it down and break it down later.

So for instance, write down "Social Democracy", and write down "The New Deal", and go look them up on simple.wikipedia.com (it puts all of it in the simplest language possible); it's a great starting point for learning about any topic, and provides a jumping-off point to look more deeply into it.

If you are really curious about starting an education, and you absolutely aren't a reader, some good books to start on are probably:

"Thing Explainer: Complicated Stuff in Simple Words" by Randall Munroe

"A Short History of Nearly Everything" by Bill Bryson

"Philosophy 101" by Paul Kleinman, in fact the ____ 101 books are all pretty good "starter" books for people that want an overview of a topic they are unfamiliar with.

"The World's Religions" by Huston Smith

"An Incomplete Education" by Judy Jones and Will Wilson

Those are all good jumping off points, but great books that I think everyone should read... "A History of Western Philosophy" by Bertrand Russell, "Western Canon" by Harold Bloom, "Education For Freedom" by Robert Hutchins, The Norton Anthology of English Literature; The Major Authors, The Bible.

Read anything you find critically, don't just swallow what someone else says, read into it and find out what their sources were, otherwise you'll find yourself quoting from Howard Zinn verbatim and thinking you're clever and original when you're just an asshole.

u/ewk · 1 pointr/zen
u/asdf-user · 1 pointr/harrypotter

Thing Explainer, because he loves knowing about muggle stuff.

u/SpiderFnJerusalem · 1 pointr/NoStupidQuestions

Even complicated stuff can be easy.

u/Spunki · 1 pointr/nostalgia

So you like cross-section books that explain things. Check out Thing Explainer.

u/frankenduke · 1 pointr/pics

This is why I bought
Thing Explainer
Cannot wait until the toddler is old enough to have them.

u/JimWibble · 1 pointr/Gifts

He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love - it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.

You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me, something informative that fits his photography style, like a guide to local wildflowers or bugs, would go down well. I don't know much about parkour, but I do rock climb, and a beginner's bouldering or climbing session might also be fun and something you can do together.

For a more traditional gift, Randall Munroe of the webcomic XKCD has a couple of cool books that might be of interest - Thing Explainer and What If. Also, the book CODE is pretty good for an inquisitive programmer, and it isn't tied to any particular language, skillset, or programming level.

u/dsmelser68 · 1 pointr/dndnext

If the DM insists on the raven only being able to repeat sounds it has heard before, then you could take the time to expose it to a limited vocabulary.
For an example of what can be accomplished with a limited vocabulary, see https://www.amazon.com/Thing-Explainer-Complicated-Stuff-Simple/dp/0544668251, which explains complicated subjects using the ten hundred most common words in the English language. Here is an example: https://xkcd.com/1133/


u/anyones_ghost27 · 1 pointr/funny

Yeah, he ate a lot of the front cover and destroyed the first 20-30 pages of my hardback HP and the Deathly Hallows. But he removed the dust jacket first without damaging it, so at least I can put that on and cover the damage.

He also destroyed the Thing Explainer by Randall Munroe, which I highly recommend as a gift for anyone, including kids, who likes cool drawings and nerdy things. Or maybe for dogs who eat hardback books. My dog found it extra tasty and super chewy.

u/nvincent · 1 pointr/GiftIdeas

So, I think I am the kind of person you are describing. I have a pretty great job, so I usually just buy my own technology stuff. Not only that, but I am rather picky with technology, so even if someone did get me something like that, I would act excited and happy, but in the back of my mind I would secretly wish they had done more research before buying the thing that they did.

That said! If I were buying for me, I would go with something like the Hyperbole and a Half book (http://www.amazon.com/Hyperbole-Half-Unfortunate-Situations-Mechanisms/dp/1451666179), or something by the creator of the XKCD comics (http://www.amazon.com/Thing-Explainer-Complicated-Stuff-Simple/dp/0544668251/ref=sr_1_1?s=books&ie=UTF8&qid=1449202837&sr=1-1&keywords=xkcd).

If it has to be tech related, there is always http://www.thinkgeek.com - they have tons of fun, nerdy gifts that I would like. All of these things combined are probably way less than $1,000. That is just a lot of money.

Another random suggestion - if they were ever into pokemon, this is a dream come true: Gym Badges! https://www.etsy.com/listing/128753018/pokemon-kanto-gym-badges-gen-1?utm_source=google&utm_medium=cpc&utm_campaign=shopping_us_b-accessories-patches_and_pins-pins_and_pinback_buttons&utm_custom1=a91c90fb-48c1-4024-87f9-fb14aadac033&gclid=CjwKEAiA7f-yBRDAgdv4jZ-78TwSJAA_WdMaz_NXsXrFH_0f-Mb6ovmqqcCHto-b7S6zm1DplssHQhoCNuvw_wcB

u/tehzephyrsong · 1 pointr/AskWomen

I like books - this year I've asked for Thing Explainer and Welcome to Night Vale, along with some linguistics books, but I'm a language nerd.

My boyfriend's family asks each other what they want for gift-giving holidays. It does kind of spoil the surprise, but I'll take knowing what I'm getting over getting a bunch of ugly-ass clothes or shitty soap gift sets. Again.

u/Redditor_1138 · 1 pointr/gifs

See the page on "Tiny bags of water you're made of" in this book's preview

u/nibarius · 1 pointr/LearnJapanese

This is a good example of what you get with the 1,000 most common words in English:

https://xkcd.com/1133/
http://www.amazon.com/gp/product/0544668251/ref=as_li_qf_sp_asin_il_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0544668251&linkCode=as2&tag=thekcs-20&linkId=S4XRZJJKSMHOWEQU

It's probably something similar in Japanese.

That being said, by knowing 1000 words you'll understand a lot more than by knowing 0. Keep trying to learn 500 words and if you succeed try to learn 500 more. If you can keep it up over many years you'll learn a lot.

u/chernobyl169 · 1 pointr/atheism

http://www.amazon.com/gp/product/0544668251

The author's provided link was banned by AutoModerator bot. Please visit XKCD and use his amazon link so he gets an extra few cents.

u/cats_n_cake · 1 pointr/explainlikeimfive

You might want to take a look at the book But How Do It Know. It does a great job walking you through how circuits can be used to add, all the way up to how those circuits are combined to create a computer. It's geared towards those who have little to no experience with programming/electrical engineering.
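
In the same bottom-up spirit, here is a minimal sketch (mine, not taken from the book) of how addition falls out of a single gate primitive:

```python
# Everything below is built from one primitive, NAND.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s = xor(xor(a, b), carry_in)
    carry = or_(and_(a, b), and_(carry_in, xor(a, b)))
    return s, carry

def add_8bit(x, y):
    """Ripple-carry addition of two 8-bit numbers, one full adder per bit."""
    result, carry = 0, 0
    for bit in range(8):
        s, carry = full_adder((x >> bit) & 1, (y >> bit) & 1, carry)
        result |= s << bit
    return result

print(add_8bit(100, 55))  # 155
```
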

u/Storsjon · 1 pointr/pcmasterrace

How Do It Know?

An easy-to-read introduction to the basic principles of computers.

u/JaiBhavaniJaiShivaji · 1 pointr/emulation

Its this one no?

u/Idoiocracy · 1 pointr/computers

I had never heard of But How Do It Know?, thank you for bringing it to my attention. From a related link, another title The Elements of Computing Systems: Building a Modern Computer from First Principles got good reviews as well.

u/FastEddieTheG · 1 pointr/explainlikeimfive

If you're interested in learning a surprising amount about this without needing heavy technical background, might I recommend a fantastic book, But How Do It Know?

u/shitzafit · 1 pointr/tis100

I'm kind of new to programming; I started at one time and have just started getting back into it, so I lack the jargon and knowledge to be able to communicate with you folks. I bought https://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765/ref=sr_1_2?s=books&ie=UTF8&qid=1482956438&sr=1-2&keywords=how+do+it+do+it hoping it might help. I'm halfway through it, but the fact that the book doesn't even mention what a node is was disappointing. I'm guessing the term is specific to the game and not really in the language the author is using.

u/l19ar · 1 pointr/argentina
u/Mr-Ultimate · 1 pointr/techsupport

I know why I got downvoted: it's not advanced enough. Here, this should make up for it: https://www.amazon.com/gp/aw/d/0789756102?vs=1

u/Repa · 1 pointr/pcmasterrace

Every PC enthusiast should have a copy of Scott Mueller's Upgrading and Repairing PCs.

It's relatively cheap compared to the amount of knowledge it contains. It is dry in some portions, but explains a lot of history and why PCs work the way they do. Ever wanted to know the difference between L2 and L3 cache? What about memory timings and ranks? It's in there.

u/MajorHavok · 1 pointr/Python

A more perfect book?

Learning Python with Raspberry Pi
Alex Bradbury (Author), Ben Everard (Author)

http://www.amazon.com/Learning-Python-Raspberry-Alex-Bradbury/dp/1118717058/ref=sr_1_1?ie=UTF8&amp;amp;qid=1395812735&amp;amp;sr=8-1

u/asb · 1 pointr/Python

Learning Python with Raspberry Pi, by myself and Ben Everard, came out about a month and a half ago.

u/benev · 1 pointr/linux

I've written a book on Python and the Raspberry Pi (http://www.amazon.co.uk/Learning-Python-Raspberry-Alex-Bradbury/dp/1118717058/).

It was written about a year ago, and published (IIRC) in February this year.

Personally, I really believe in the book as a form of learning. A few people mention things like Stack Overflow as an alternative, but I think they're really not. SO is great for looking up solutions to problems, but it doesn't (usually) teach you much. It's a short hit of knowledge on one problem you're facing, not a systematic overview of the whole area. A well-written technical book should take you from one level of expertise up to another entirely, and give you a broad knowledge of the area.

The issue of staying up to date is a huge challenge, although it's more of a problem in some areas than others (a book on RHEL 7 should be fine for years, for example). Some people have mentioned the possibility of pushing updates to e-books, but I'm deeply skeptical that there's enough money in most books to make this worthwhile from the author's perspective.

(let me know if you've got any questions about the book-publishing business, and I'll try to answer them).

u/cr4cken · 1 pointr/learnprogramming

Got that for Christmas with a Pi:
http://www.amazon.com/Learning-Python-Raspberry-Alex-Bradbury/dp/1118717058

So far it's a lot of fun, but I am a beginner programmer. I don't know if this fits your situation.

Edit: it's also available as a PDF for download. Just google it.

u/Suppafly · 1 pointr/AskMen

shitty mobile:

http://www.amazon.com/gp/aw/d/1118717058

regular:

http://www.amazon.com/dp/1118717058

although amazon tends to pack the url with a bunch of extra referral and search term stuff too.

u/ss0317 · 1 pointr/ECE

I wouldn't exactly say the programming is easy... there are a lot of new ideas and vocabulary to become familiar with.

I am fairly new to microcontrollers and am still confused about a lot of things. Writing bootloaders/brickloaders, watchdog timers, configuring fuse bits, handling interrupts, prescalers, timers, adjusting PWM frequencies, I2C/SPI/UART/1-wire/USB, Ethernet, WiFi... the list goes on...

Not to mention the techniques for optimization/memory handling/reduction of power consumption...

There are a lot of concepts related to hardware programming that you just won't encounter when, say, writing console applications.

With that being said, I haven't found a complete tutorial series on youtube, but Human Hard Drive has a decent intro to AVR programming and I found this book to be a helpful introduction to the topic.

u/welfare_pvm · 1 pointr/SoftwareEngineering

What field do you want to specialize in? Embedded? Web? Mobile?

The best way to learn is by practicing, but if you want more of an abstract, design level read, there are lots of options.

I have a web background, so here are three that I've read recently as examples.

I enjoyed this book on microservice design and I think everyone who uses OOP should at least familiarize themselves with the common OOP design patterns.

If you are into JavaScript, Eloquent JavaScript is my go-to for a good mix of summary/detail of the language. It's well written, and comes with fun exercises at the end of each chapter to help solidify your understanding of each concept.

I'm sure there are other great books, but these are some of my favorites so far.

u/gxhxoxsxtxfxm · 1 pointr/csharp

Oh! These are indeed very useful tips. Thank you for the points. I am currently learning ASP.NET Core MVC. I have been a C# developer for a few years but I have never developed Web applications with ASP and have always resorted to what I already knew (Java and PHP). My current work laptop as well as the home software ecosystem is now Apple-based and I would rather not split work and switch between operating systems. That's why I was trying to utilise VS for Mac. As of now, my aim is to learn ASP.NET, but at some point I would also need to build .DLL files and I may have to build REST APIs and host apps on Azure. I doubt if I will go back to building native/desktop apps for now. But if I will someday, I will probably start learning something like Electron.NET. So, any further tips are appreciated.


P.S. The book I am currently reading is Pro ASP.NET MVC 2 by Adam Freeman, which looks comprehensive thus far, even though the examples are built in the Windows version of Visual Studio, for which he takes no blame.

u/UpNorthMark · 1 pointr/csharp

https://www.amazon.ca/Pro-ASP-NET-Core-MVC-2/dp/148423149X

https://www.amazon.ca/Pro-ASP-NET-MVC-Adam-Freeman/dp/1430265299

Just about to pull the trigger on one of these.
I'm not going to be applying for jobs for a couple of years because of college. Should I bother with MVC 5, or try to jump straight into Core?

u/NickTheFirstOne · 1 pointr/dotnet

Hello,
Based on the comments so far, I understand that you're trying to learn ASP.NET Core 2.
When I started my journey on ASP.NET, I started with Professional ASP.NET MVC 5 - a great book.

For ASP.NET Core I started with Pro ASP.NET Core MVC; it's a nice book for ASP.NET Core 1.

For ASP.NET Core 2 I would suggest Pro ASP.NET Core MVC 2, but with slight hesitation, because ASP.NET Core 2 was still new at the time of publishing.

Also, this MVA course could help you.

If you need more info on tutorials and courses, comment below and I will try to help you find the best ones for you.


Thanks.

u/OSHWjunkie · 1 pointr/ECE

I would suggest also picking up this one
http://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149/ref=sr_1_1?ie=UTF8&qid=1417370827&sr=8-1&keywords=making+embedded+systems

As for AVR versus PIC, really doesn't matter. I like to use PICs just because I prefer the way their documentation is handled and they seem to have more specialized silicon for specific projects.

HOWEVER, I would play the devil's advocate and say neither. Start looking at development boards made by TI or Renesas. They ship more units, and if you can get an embedded job you'll more likely be working with those toolchains.

u/edwardkmett · 1 pointr/programming

As for how folks feel about derivative notation? Maybe a little. ;) Sure, Leibniz had a way better notation than Newton, but folks still use Newton's notation in some small physics examples, so depending on what someone was struggling with... As for SICP, well, it depends; I'd rather someone swallowed a sugar-coated pill than spit it out and never get better.

From a compiler design perspective there are really 2 very different paths you can take (they do start to converge at the end, but you still wind up with ANF vs. SSA, etc.). You can explore the Scheme/Haskell/etc. functional programming path, or you can go down the Steven Muchnick Advanced Compiler Design and Implementation path, which is much more traditional in terms of the kinds of languages it seems to consider.

http://www.amazon.com/Advanced-Compiler-Design-Implementation-Muchnick/dp/1558603204

Both offer very different-seeming designs. So, if Lisp was a non-starter, I'd probably send them over to the Dragon Book and then down towards Muchnick from there.

u/jodraws · 1 pointr/hearthstone

Make use of that intelligence and get her an Arduino Uno.

She'll be able to make anything from simple robots to a light-up dress that changes colors. A simple guide to get started will help as well. Guide.

u/uptocode · 1 pointr/arduino

And heck! If you don't have an Arduino just yet, you can try one out virtually first! I like 123D Circuits by AutoDesk; however, there are many other simulators with Arduinos built in. Google them! :-)

Like it but don't like the $$$? You can make your own! There are many tutorials online for making a bare-bones Arduino with cheap electronics components.

I really like @schorhr's book suggestions. To add on, the following books are great for Arduino beginners: Programming Arduino: Getting Started with Sketches & Make: Getting Started with Arduino. Also, great tutorials can be found here: tronixstuff Arduino Tutorials & Ladyada's Arduino Tutorials.

Good luck!

u/The_Masked_Lurker · 1 pointr/talesfromtechsupport

Going to a private but non-profit institution, it's cool.

(As a matter of fact, a friend has friends that go to bent state university, and after comparing physics hw they found that, well, our curriculum is much harder; I guess their intro final had a "draw a line to match the term to its definition" type thing.)

Anywho, one of our compsci upper-level courses is based on this book: http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=la_B000APBUAE_1_2?s=books&ie=UTF8&qid=1406428553&sr=1-2 It goes through and explains computer architecture for an actual CPU; I don't recall how easy it is to read, however. (If you buy it and it makes no sense, the intro book we use was called "An Invitation to Computer Science", but get an edition or two back from current if you buy.)

Finally, you can find a bunch of info here: http://ocw.mit.edu/index.htm

u/tc655 · 1 pointr/ECE

See if your library has this book:

http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938

It's what we used in my computer organization course and I found it to be quite helpful. If you are desperate, a PDF version of your book comes up as the second result on Google...

u/njoubert · 1 pointr/compsci

I would suggest that the carlh programming guides are not a bad idea then!

I would heavily suggest learning C well - this is a language that was designed to stay close to the hardware while being portable, and it is a very small language. So, buy a copy of the K&R book; every C programmer has one.

Then, Patterson's book is a tome for computer engineering. It'll show you assembly, all the way down to NAND gates.

I would suggest you start by watching and working through Berkeley's CS61C course. It's logically the second course in CS, and after a quick overview of C it dives into the machine itself. Website here, videos here. Also, Dan Garcia is an excellent lecturer.

Once you have all the machine details down, you'll probably feel hampered by your actual programming wizardry. This is where you start looking into algorithms and data structures. Your go-to guide here is probably Cormen's Introduction to Algorithms, since it handles both data structures and algorithms. It's definitely more of a theoretical/CS-ey book, so if this is not what you want, then Head First Java will teach you a new language (and learning more languages is one of the best ways to grow as a programmer!) and also cover many data structures. In fact, you can get both of those books and have both the light side and the serious side of programming books.

At this point you should be well equipped to go off in whatever direction you want with programming. Start contributing to open source projects! Find things that interest you and try to solve problems! Being a part of the programming community will be your biggest aid in both learning programming and starting to make money through it. People pay for programmers that they know can deliver, and success in the open source world means a lot, and you don't need to go to school for it to get to this point!

Lastly, many CS/programming folks hang out on IRC. If you have questions, find the appropriate IRC channels and go talk to people. Good luck and welcome to programming!

u/jmknsd · 1 pointr/hardware

I learned mostly from:

http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

But this has a lot of information in it, and was the book for the prerequisite of the class I took that used the above book:


http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938

u/ItsNotMineISwear · 1 pointr/programming

Here's the lab course website: https://engineering.purdue.edu/~ece437l/materials.shtml

I'm not sure how helpful that will be since the course was very hands-on and most of the value I got was out of interaction with the TAs.

The lecture notes weren't posted online iirc. The book we used is Patterson and Hennessy

u/wicker0 · 1 pointr/osdev

The Design of the Unix Operating System is a classic. It's from the 80's, but still plenty relevant. It's very well written, with plenty of diagrams to help you along.

It doesn't quite start from the very beginning. If you're looking for information on how to start with absolutely nothing (i.e., write a bootloader, implement basic device drivers, etc.), then you'll need to supplement with other sources. It does, however, do a really great job of explaining things like processes, threads, memory management, and other basic concepts. It doesn't give you source code (though it contains a bit of pseudocode), but it explains, in succinct, legible prose, the data structures and algorithms that drive core functionality. Again, it's an old book - $6.00 plus shipping used. Can't really go wrong.

Operating Systems Design and Implementation covers basically the same ground. I prefer the former, as it treats you a little more like an adult and skips straight to explaining how concepts are implemented (and the cover art is just so undeniably classic).

u/JonasY · 1 pointr/raspberry_pi

I've tried doing something similar for x86 about a decade ago.

I wrote this GUI a while back from scratch; it could work from DOS (for things like loading images into memory). All those controls that you see in the picture worked well and how they should. I believe it only took like 3000-5000 lines of code to write. How I started was, I already knew x86 assembly (not really needed for the GUI part, unless you want to optimize) and C. I found some sort of Linux bootloader that had a GUI, looked through its code, and got a basic idea of how to write one. Its author used C++ (just for classes and inheritance), which is what I used. So for a GUI I recommend learning C and a bit of C++, then finding this bootloader (I don't remember the name) or some other relatively small project that has its own GUI and seeing how it's made.

For the OS part, I recommend a book called "Operating Systems Design and Implementation (3rd Edition)". You will need to know C and x86 assembly to understand it. It discusses how a particular OS named Minix was made. Linus Torvalds used the first edition of this book to write the first version of Linux.

You could google for something like "osdev", "osdev gui".

u/tluyben2 · 1 pointr/programming

I would very much recommend http://www.amazon.com/Black-Video-Game-Console-Design/dp/0672328208; it goes really far in explaining everything from the quantum level up to a working game console. And after that: http://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388/ref=sr_1_1?s=books&ie=UTF8&qid=1405346881&sr=1-1&keywords=minix. Then you'll be set.

Edit: Although the Black Art book is about game consoles, if you work through it you can build your own computer in the end, or, what I really like, you can pick up old machines (80s/early 90s) from eBay for < $5, open them up, understand them, and change them. As they are not 'one chip' with some power supply stuff - almost everything is held in separate ICs - you can follow the PCB and actually see what it is doing and how. Great fun. And it scales: I have no issue making digital things with FPGAs etc. because I know it at this level.

u/netdroid9 · 1 pointr/programming
u/just-an0ther-guy · 1 pointr/software

Most operating systems are written in a combination of ASM (assembly, the human-readable form of machine code for x86/x86-64 processors) and C or C++.

If you're really serious about it, there is a book that walks through a basic operating system called MINIX (a minimal *nix OS). See: https://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388/

Nowadays, modern operating systems are much more complex, though.

u/mobileagent · 1 pointr/IWantToLearn

Operating System Design and Implementation, a classic (which... I hope is still relevant, but it's the first thing I thought of). Also, good stuff in the recommendations.

u/idontchooseanid · 1 pointr/linux

Do you want to know which parts make up an OS, or how it actually runs at runtime? The former is easy: just install Arch, Gentoo, or Linux From Scratch. The latter is a lot more complicated nowadays, but https://www.amazon.de/Operating-Systems-Implementation-Prentice-Software/dp/0131429388/ref=tmm_hrd_swatch_0?_encoding=UTF8&qid=&sr= is a great start, or there's https://wiki.osdev.org/Tutorials if you want to go deep.

If you do both and combine the knowledge, your beard will grow 100x.

Source: I did, but I am hairless; 0*100 = 0 :/.

u/exeter · 1 pointr/programming

I'd recommend Minix; in particular, version 2. Not only is the kernel quite small, there's also a book that walks you through its internal workings in detail, and a newsgroup to go to for help or questions.

u/saitt04 · 1 pointr/compsci

[This one?](http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210)

I just started my computer architecture class and this is one of the books they recommended. I think I'll try to get this one if it is that good.

u/cschaef66 · 1 pointr/learnprogramming

Structured Computer Organization by Andrew Tanenbaum:
https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0131485210
One of the best books I've read.

u/smith7018 · 1 pointr/jailbreak

It's hard to say, because that's a more advanced area of computer science called computer architecture, and I learned it in a college setting. With that being said, I've heard good things about this textbook. Hopefully whichever book you pick up on ARM assembly will have a few chapters going over how the processor functions.

Good luck! :)

u/oldsecondhand · 1 pointr/technology

I'd check out these two books from the local library and read the first 2-3 chapters. They might contain more than what you need, but they are pretty well-written books and don't assume a lot of previous knowledge.

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210

http://www.amazon.com/Computer-Networks-5th-Andrew-Tanenbaum/dp/0132126958/ref=la_B000AQ1UBW_sp-atf_title_1_1?s=books&ie=UTF8&qid=1376126566&sr=1-1

Or you could just check out your network settings and search for the terms that you encounter (IP address, DNS, DHCP, gateway, proxy, router, firewall)

u/zendruid · 1 pointr/learnprogramming

I study in South Africa and we did this book in our first year, with chapters 1-2 in the first semester and 3-4 in the second. It had very little on algorithms; we did basic recursion and for the most part that was it.

We then did this book in the 3rd semester, which covered more algorithms in detail (DFS/BFS/MergeSort/QuickSort etc). And I'm now in the 4th semester where we have a ton of different books including this, and this.

u/dickcoins · 1 pointr/sre

My mentor suggested these two books to me about 5 years ago. Even though the books are old, the info is bizarrely not all that outdated. I find them easier to read than newer books because back then no one knew anything, so everything is written in plainer English and builds up.

Unix:

https://www.amazon.com/Design-UNIX-Operating-System/dp/0132017997

TCP/IP (e.g. the internet)

https://www.barnesandnoble.com/p/tcp-ip-illustrated-volume-1-w-richard-stevens/1111620679/2660484049025?st=PLA&sid=BNB_DRS_New+Marketplace+Shopping+Textbooks_00000000&2sid=Google_&sourceId=PLGoP164998&gclid=EAIaIQobChMIr63AiI-b3gIVj_5kCh0HpA3LEAQYASABEgL5RfD_BwE


I have nailed most of my interviews from the knowledge in those books alone.

u/0theus · 1 pointr/linux

> Is there some sort of golden rule book whose laws must not be violated?

Yes: The Design of the UNIX Operating System, and there's The UNIX Programming Environment :^)

u/hawkinsw2005 · 1 pointr/linuxquestions

Understanding the Linux Kernel is great and, obviously, very specific to Linux.


Linus has cited Bach's book about the design and implementation of UNIX as inspiration for the development of Linux.


I read both and really enjoyed them! I hope you like them!

u/stuartcw · 1 pointr/programming

Worth buying a used copy of Design of the UNIX Operating System

u/sreguera · 1 pointr/programming

Interesting talk. The only thing that bothers me (slightly) is that he keeps talking of the Unix kernel as it was in the eighties.

u/squaganaga · 1 pointr/ECE

I haven't yet designed boards with EMC and RF in mind. I've seen recommendations for the high-speed digital design books thrown around, though.

u/VectorPotential · 1 pointr/AskElectronics

What are you trying to measure? Your pulse rise time will do some funny things to your results if you're not careful.

If you can obtain a copy of High Speed Digital Design, the author describes several test jigs for such tests.

u/somekindofdevil · 1 pointr/AskElectronics

Almost every PCB/EDA package does length matching automatically, so you don't need to worry about that. If you wanna know how the software does it, it's more of a mathematical problem. I think they use parametric curves like Bezier curves. You can calculate the length of a Bezier curve easily, so you can match trace lengths (a sketch follows below).

https://en.wikipedia.org/wiki/B%C3%A9zier_curve
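
As a rough sketch of the idea (the control points and step count here are made up; real tools may instead integrate the curve's speed |B'(t)|), the length can be approximated by summing many small chords:

```python
# Cubic Bezier: B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3
def bezier_point(p0, p1, p2, p3, t):
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

def bezier_length(p0, p1, p2, p3, steps=1000):
    # Sample the curve densely and sum the straight-line distances.
    length = 0.0
    prev = bezier_point(p0, p1, p2, p3, 0.0)
    for i in range(1, steps + 1):
        cur = bezier_point(p0, p1, p2, p3, i / steps)
        length += ((cur[0] - prev[0])**2 + (cur[1] - prev[1])**2) ** 0.5
        prev = cur
    return length

# One "bump" of a serpentine meander: how much length the detour adds.
print(bezier_length((0, 0), (0, 5), (10, 5), (10, 0)))
```
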

If you wanna know more about high-speed PCB design, I recommend this book:
https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/Beggar876 · 1 pointr/AskElectronics

Find/download/buy this book: High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson. https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241


Scan it cover to cover. It will pay for itself the first time you save a PCB re-spin because of something you saw in it. It has for me.

u/kevlarcoated · 1 pointr/PrintedCircuitBoard

The books most referenced by presenters at PCB design conferences are:
Right the First Time by Lee Ritchey: http://www.thehighspeeddesignbook.com/
High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson: https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241

Note that A Handbook of Black Magic reads like a textbook; it is very long and very boring.
The subject of PCB design is complicated and requires an in-depth understanding of the physics, because just knowing the rules isn't enough to convince other engineers that it's the right way to do something. More importantly, in my experience PCB design is always about the least bad solution: you have to understand when you can break the rules, what the implications will be, and whether the trade-off is acceptable.

u/reddit_user33 · 1 pointr/Electrical_Engineers

Different applications require different rules to abide by, to a certain extent.

High Speed Digital Design by Howard Johnson has given me great insight into the world of high-speed electronics.

u/EvasiveBeaver · 1 pointr/learnprogramming

OOP is a very general concept, and it doesn't go much further than the SOLID principles. As for how this actually gets done, that's somewhat a matter of opinion and open to interpretation. I'm a fan of this book....

Clean Architecture A Craftsman's Guide to Software Structure and Design

It has very strong opinions, and because of that it gives a consistent message and direction. It should by no means be the only opinion or book you take into account (learn from as many people as you can), but it's a very good start.

u/gin_and_toxic · 1 pointr/webdev

Clean Architecture: https://www.amazon.com/dp/0134494164/ (also read Clean Code if you haven't).

Designing Data-Intensive Applications: https://www.amazon.com/dp/1449373321/

u/Ayakalam · 1 pointr/DSP

Hands down, no question, I would recommend Richard Lyons' book FIRST.

u/kwaddle · 1 pointr/DSP

I think The Scientist and Engineer's Guide to Digital Signal Processing and Understanding Digital Signal Processing are generally considered the most accessible introductions. I've gotten more mileage out of Understanding DSP; I feel like it goes into a little more detail and really works to walk you through concepts, step by step.

http://www.dspguide.com/pdfbook.htm

https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419


Aside from searching out good learning resources, IMO nothing is more helpful for learning than setting up your environment with Matlab, Jupyter notebooks, or whatever you're going to use, and getting comfortable with the tools you'll be using to explore these topics.

u/fr3nch13 · 1 pointr/FPGA

Although not specifically targeting FPGAs, “Understanding DSP” by Richard Lyons is very good. Very readable.

https://www.amazon.com/dp/0137027419/ref=cm_sw_r_cp_awdb_t1_FQ4ZCbSRHV7QQ

u/necr0tik · 1 pointr/amateurradio

Thanks for the great reply!

The Lessons In Electric Circuits was already on my radar, and I believe it will be the first electronics resource I go through, after having it drummed into my head yet again!

That DSP book I have not seen. I just grabbed a copy, and it looks like a great text. I mentioned this post to a fellow electronics enthusiast, and he loaned me a copy of a book he said was exceptional as an entry into the world of DSP: http://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419/ DSP is pretty complex; more than likely I will go through both to fully absorb the topic.

EMRFD sounds like a cookbook. Given that it's by the ARRL, I expect its quality to be superb. I am not against that type of text (I have a few already); however, I'd rather have more of the theory at this point. I imagine it will be great once I am satisfied with the basics and want to build an actual radio and understand its operation.

u/khafra · 1 pointr/DebateReligion

Much of your thinking seems to be based on a confusion of levels. If you knew more specifically how neurons that fire together strengthen the probability that they'll fire together in the future, or if you'd examined a program that simulates physics, you wouldn't be using confusion as a building block for arguments.

For instance, you would not be as confused right here if you were a systems developer instead of a philosopher; one read-through of the Dragon Book would clear everything right up. I'll try to summarize, but please understand this is not rigorous:

Your mind is running the algorithm "Step 1: Move to front of house. Step 2: Move to back of house. Step 3: Go to Step 1." Your mind is something your brain does. Your brain is implemented on physics. Exactly like the boulder.

The most legitimate question related to this post is that of substrate. Note: I do not agree with everything in this essay, but it presents the problem better than writings on "dust theory" (unless you're willing to read the whole Greg Egan novel Permutation City).

u/name_censored_ · 1 pointr/learnprogramming

> Do you know of a book or a website that teach useful optimization techniques?

I'm only an enthusiast, and I've never needed really optimised code (truth be told, most of what I do day to day is quick-and-dirty, appallingly inefficient scripts, because it "needs to be done yesterday"), so I can't give you a canonical list, but here's what I do know:

For books, there's this /r/compsci reddit thread from a while ago. Something on compilers like The Dragon Book might be your best bet, especially the optimisation chapter. And obviously jotux's "How Computers Do Maths" - though never having even flicked through it, I can't say if it's any good.

You could try your luck in /r/ReverseEngineering (or the quieter /r/asm and /r/compilers), there are a lot of low-level guys there who'd know a lot more than me. You could also try /r/compsci or /r/algorithms, although they'd be more useful for algorithms than for optimisation. And of course, /r/quantfinance.

u/Zonr_0 · 1 pointr/news

Really? Unless you're in a specific special topics course, the principles for most computer science haven't changed much. This book was the assigned reading for my compilers course and it was written in the mid 80s. Similarly, the core algorithms and data structures in the standard CS education haven't changed much (except for a move away from teaching bubble sort as an intro sort and into insertion sort).

But maybe that was just my education.

u/alanwj · 1 pointr/learnprogramming

It sort of depends on what you are trying to do.

I can't really tell from the name what that book is going to cover, but I expect that most books on programming language theory are going to start with things like lambda calculus, and go into type theory, etc, etc. If you are trying to learn the theoretical underpinnings of programming languages then this is great!

However, in my opinion a more practical place to start is with learning how to implement a programming language. That is, how to write a compiler. For that there is a whole separate set of theory (regular expressions, grammars, automata, etc) that you need to learn. The standard text for this is "the dragon book".
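
To make the regular-expressions part of that concrete, here is a minimal tokenizer sketch (my own toy example, not from the dragon book): a lexer uses regexes to chop source text into tokens, which is the first stage of a compiler front end.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal lexer sketch: regular expressions classify the raw character
// stream into tokens. Token classes here are arbitrary demo choices.
public class TinyLexer {
    // Each alternative is one token class; named groups tell us which matched.
    private static final Pattern TOKEN = Pattern.compile(
        "(?<NUMBER>\\d+)|(?<IDENT>[A-Za-z_]\\w*)|(?<OP>[+\\-*/=()])|(?<WS>\\s+)");

    public static List<String> lex(String src) {
        List<String> tokens = new ArrayList<>();
        Matcher m = TOKEN.matcher(src);
        int pos = 0;
        while (m.lookingAt()) {
            if (m.group("NUMBER") != null) tokens.add("NUMBER:" + m.group());
            else if (m.group("IDENT") != null) tokens.add("IDENT:" + m.group());
            else if (m.group("OP") != null) tokens.add("OP:" + m.group());
            // Whitespace is matched but not emitted as a token.
            pos = m.end();
            if (pos == src.length()) return tokens;
            m.region(pos, src.length());
        }
        throw new IllegalArgumentException("Unexpected character at " + pos);
    }

    public static void main(String[] args) {
        // Prints [IDENT:x1, OP:=, NUMBER:42, OP:+, IDENT:y, OP:*, NUMBER:7]
        System.out.println(lex("x1 = 42 + y * 7"));
    }
}
```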

u/vplatt · 1 pointr/java

I have this as well, but don't really have any remarks for you. That said, maybe you should look through some of the reviews for it on Amazon or the like. The reviews there seem pretty authentic.

https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

u/pmjones · 1 pointr/PHP

PoEAA is fantastic and I recommend it to everyone. I reference it regularly.

However, those books don't come with step-by-step instructions on how to work with a legacy PHP codebase in order to modernize it. Modernizing Legacy Applications in PHP does.

u/SamHennessy · 1 pointr/PHP

When I said "I can see how maybe they could be useless to you.", that's because I instantly knew what kind of programmer you were: you're a low-level guy.

I have a copy of "Algorithms in a Nutshell" (http://www.amazon.com/Algorithms-Nutshell-In-OReilly/dp/059651624X) but I never finished it. My favorite programming book may be "Patterns of Enterprise Application Architecture" (http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420). Neither book is language-specific, but in every other way I don't think they could be further apart. Both are very valuable, and I appreciate that they both exist.

There are a good number of reasons to maximize your use of the built-in PHP functions (http://webandphp.com/5reasonstomaximizeyouruseofPHP%E2%80%99sbuiltinfeatures). My book is an attempt to come up with a system that will help you learn all of the built-in PHP functions by giving a realistic use case for each that could be applied in your everyday work.

For a PHP programmer, it is much more useful to know which functions PHP provides for array sorting than to know how to implement array sorting in PHP code yourself.

u/xnoise · 1 pointr/PHP

There are a ton of books, but I guess the main question is: what are you interested in, concepts or examples? Many strong conceptual books use examples from Java, C++, and other languages; very few of them use PHP. If you can read other languages, then:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 is definitely a must-read. Beware not to try to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in the book.

http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 has already been mentioned and is directly related to the one above, so it should be easier to grasp.

http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 is one of the most amazing books I have read; I read it some time ago. It needs a lot of time and good prior knowledge.

http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 is another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.

u/bitcycle · 1 pointr/Database
  1. Use a relational data store first: MySQL/PostgreSQL/M$ SQL Server.
  2. Put CRUD operations behind a web service.
  3. Instead of arbitrary key/value pairs, try to get as specific as possible about the data (instead of storing everything as strings). Being more specific will help things (usually) to be more performant.
  4. Build the application on top of the service.
  5. Scale to the # of users you want to be able to support
  6. At this point, if you need to move part of the data into a non-relational data store, you can.

I also recommend reading this book: Patterns of Enterprise Application Architecture

u/celtric · 1 pointr/PHP

You can read books with code examples written in Java or C#; the syntax is very similar, and the principles apply in the same way (PoEAA comes to mind).

u/DocAtDuq · 0 pointsr/todayilearned

I'd suggest an Arduino Uno to start out. I don't know what the kits include, but I'd suggest against them; you'll do better ordering the parts for projects separately, so you only buy what you need. I'd suggest starting with an LED cube: it's easy to solder, and there are code sequences already written for patterns. You'll need a soldering iron. I'd also suggest this book:
http://www.amazon.com/gp/aw/d/0071784225/ref=pd_aw_sims_5?pi=SL500_SY115&simLd=1

u/SyrianRefugeeRefugee · 0 pointsr/linux

"The" book CS students read, and would be good for you, is:

http://www.amazon.com/Operating-Systems-Design-Implementation-Edition/dp/0131429388

It goes deep into OS design, but if you read it, then all the mysterious tips, explanations, and pointers you find online will make sense.

u/theevilsharpie · 0 pointsr/linux

> I want to learn how linux (and computers) work.

If you want to learn how Linux (and computers) work, take a course on operating system design and development. It's offered at any university that has a respectable computer science program, and you can probably find online courses that teach it for free. If you're more of a self-starter, grab a textbook and work your way through it. A book on the internal workings of Linux in particular might also be helpful, but IMO the development of the Linux kernel is too rapid for a book to provide a useful up-to-date reference.

If you want to learn Linux as a day-to-day user (which is what I suspect you're looking for), pick Ubuntu or one of its derivatives. They are easy to get up and running, while still allowing you to "spread your wings" when you're ready.

u/kmafb · 0 pointsr/IAmA

For lexing and parsing, you should just pick up a compiler book. You could bang your head against it your whole life without figuring it out, and the Right Answer is not that hard if you have someone to show it to you. There are lots of good ones; the classic is the "dragon book" (http://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886).

Beyond that, VMs are a big topic. They include all of compilers, and almost all of systems programming. The Smith and Nair book (http://www.amazon.com/Virtual-Machines-Versatile-Platforms-Architecture/dp/1558609105) is a great jumping off point. But so is playing around with a project that means something to you. It depends what you find more rewarding.

u/levu-webworks · 0 pointsr/learnprogramming
  • The "Red Dragon Book of Compiler Design"
  • Compiler Design in C

    I've read both books; the latter sits on my bookshelf (it was a gift from my girlfriend). Please don't waste your time trying to implement a compiler: it's a PhD-level endeavor that will take years of dedicated 60-hour work weeks.

    Here are the same links linked from my Amazon affiliates account:

  • The Red Dragon Book of Compiler Design
  • Compiler Design in C


    You are better off implementing an algebraic calculator using an LR parser. Start with Tom Torf's Programmer's Calculator (PCalc). It's written in C and pretty simple. You can fork it from my GitHub account if you have trouble finding Tom's source archive. Tom (may he rest in peace) also wrote several programming tutorials and contributed to comp.lang.c, alt.lang.c, and the comp.lang.c FAQ.
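
For a taste of what such a calculator involves, here is a minimal sketch (my own toy example). One hedge: it uses recursive descent, which is easier to hand-write than the table-driven LR parser mentioned above, but it handles the same kind of expr/term/factor grammar with precedence and parentheses.

```java
// Minimal recursive-descent calculator for + - * / and parentheses.
public class Calc {
    private final String s;
    private int pos = 0;

    Calc(String s) { this.s = s.replaceAll("\\s+", ""); }

    // expr := term (('+' | '-') term)*
    double expr() {
        double v = term();
        while (pos < s.length() && (s.charAt(pos) == '+' || s.charAt(pos) == '-')) {
            char op = s.charAt(pos++);
            v = (op == '+') ? v + term() : v - term();
        }
        return v;
    }

    // term := factor (('*' | '/') factor)*
    double term() {
        double v = factor();
        while (pos < s.length() && (s.charAt(pos) == '*' || s.charAt(pos) == '/')) {
            char op = s.charAt(pos++);
            v = (op == '*') ? v * factor() : v / factor();
        }
        return v;
    }

    // factor := NUMBER | '(' expr ')'
    double factor() {
        if (s.charAt(pos) == '(') {
            pos++;                       // consume '('
            double v = expr();
            pos++;                       // consume ')'
            return v;
        }
        int start = pos;
        while (pos < s.length() && (Character.isDigit(s.charAt(pos)) || s.charAt(pos) == '.')) pos++;
        return Double.parseDouble(s.substring(start, pos));
    }

    public static void main(String[] args) {
        System.out.println(new Calc("2 * (3 + 4) - 5").expr()); // prints 9.0
    }
}
```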
u/CodeShaman · 0 pointsr/java

Once you've mastered basic design patterns, this book will take you to a heightened state of consciousness.

u/henk53 · 0 pointsr/java

No, I'm not getting anything confused!

I started in the '70s with assembler, worked my way through C, then C++, and lately Java. At some point along the way I implemented a Pascal compiler, and I've read the dragon book cover to cover, more than once.

I think I've got a pretty firm grasp of what call by reference is, and let me tell you: Java DOES NOT have call by reference, ONLY call by value.

If there's anyone who's confused it's you. Sorry.

Things get a LOT easier mentally when you call it a pointer in Java. This is just terminology; even the Java designers knew it was a pointer (hence the name NullPointerException).

In Java, objects are only accessible via pointers. To be more consistent with C++, every object variable in Java should actually be written as:

MyObject* obj = new MyObject();

But since there's no concept of a stack-allocated object in Java, you would always have to put that * there, and so they left it out.

And indeed, for pass-by-reference, a reference to whatever you have (a pointer in this case) has to be passed, so that's a double indirection. In C++ you can even express this explicitly, since there's not only call-by-reference semantics but also the address-of operator &. Applying the & operator to a pointer (e.g. foo*) yields a pointer to a pointer (foo**). This is what happens behind the scenes when you use call-by-reference in C++ and various other languages.

In Java, everything but primitives is a pointer, and that pointer is passed BY VALUE. In C++ terms there is only foo*; foo** does not exist. Hence, you can follow that pointer from within a method and thus modify the object being pointed to, but you CANNOT re-assign the caller's pointer, as I showed above.

Pass-by-reference semantics DEMAND that a pointer can be re-assigned.

Modifying what is being pointed to does not count. You are confusing this with modifying a stack-allocated object: if you pass the object itself into a method and can modify it there, then that's pass-by-reference, since the object itself is passed (no pointer involved). But in Java there is no stack-allocated object, only a pointer to an object, and that pointer is passed by value.

If you're still confused, please read the dragon book. Even if not for this argument, it will greatly enhance your understanding of how languages and computers work.
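
For anyone still unsure, here is a minimal demo of the distinction (my own example, with a hypothetical Box class): mutating the object a parameter points to is visible to the caller, while re-assigning the parameter itself is not.

```java
// Demo: Java passes object pointers BY VALUE. Mutating the pointed-to
// object is visible to the caller; re-assigning the parameter is not.
public class PassByValueDemo {
    static class Box { int value; Box(int v) { value = v; } }

    static void mutate(Box b)   { b.value = 99; }     // follows the pointer: caller sees this
    static void reassign(Box b) { b = new Box(42); }  // re-points the local copy only

    public static void main(String[] args) {
        Box box = new Box(1);
        mutate(box);
        System.out.println(box.value);  // 99 -- the pointed-to object was modified
        reassign(box);
        System.out.println(box.value);  // still 99 -- the caller's pointer was not re-assigned
    }
}
```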

u/r2p42 · 0 pointsr/AskProgramming

I tried to come up with a simple explanation of how a compiler works, but I wasn't able to find a wording that is both simple and still correct.
I guess there is a reason this book has 1,000 pages: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

The simplest explanation would be that someone wrote a program, which the computer understands, that is able to recognize your language and convert it into something your computer can also understand.

u/eclectro · 0 pointsr/programming

> We are talking about a high-level language compiler, remember?

I still consider C a high-level language. Some people don't, for various reasons.

> You were complaining that it compiles to C rather than emit instructions.

You simply read/took my post wrong. Nowhere am I complaining. Merely making an observation. It is not an unusual feat for a compiler to generate assembly instructions or machine code. Nor would I call it super difficult to write a compiler, but rather straightforward.

> If you are going to emit instructions, it's up to you to write your own optimizer.

Or buy/obtain a compiler that already is capable of doing this step.

u/read_code · -5 pointsr/programming

I talk about services and microservices

u/VasiliyZukanov · -7 pointsr/androiddev

I find it funny that this guy suggests his audience read a book that hasn't been published yet.

And then he continues:

&gt; That guy is one of the main guys behind clean architecture.

When a speaker puts references in his slides that he knows nothing about, it says a lot.