Best computer hardware design books according to redditors

We found 294 Reddit comments discussing the best computer hardware design books. We ranked the 54 resulting products by number of redditors who mentioned them. Here are the top 20.

Top Reddit comments about Computer Hardware Design:

u/samort7 · 257 points · r/learnprogramming

Here's my list of the classics:

General Computing

u/arnar · 71 points · r/AskReddit

A program in any language goes through either a compiler or an interpreter, or a mixture of both. A compiler turns the program into a series of machine instructions, low level codes that the processor knows how to "execute". An interpreter executes a program indirectly.

The first step of both compiling and interpreting is called parsing. This step takes the program text and converts it into an internal representation called an AST (Abstract Syntax Tree). For example, this converts "if" and its attached test and statement blocks into one kind of object, and "or" and its attached expressions into another kind. The compiler or interpreter knows that these two objects should be handled differently.
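
You can watch this step happen in Python, whose standard library exposes the interpreter's own parser (a minimal sketch; ast.dump's indent argument needs Python 3.9+):

    import ast

    # Parse a tiny program into an AST: the "if" becomes an If node,
    # the "or" becomes a BoolOp node with an Or operator.
    tree = ast.parse("if x or y:\n    f()")
    print(ast.dump(tree, indent=4))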

Programming languages are often implemented in themselves. However, a bootstrapping process is required to create the first compiler or interpreter, i.e. the first version needs to be written in a different language. For low level languages like C, this bootstrapping language is usually assembler. For other languages, more often than not the bootstrapping language is C. Some compilers or interpreters are written in different languages; e.g. the most popular version of Python (which is a mix of a compiler and an interpreter) is written in C.

Going from "rand()" to a string of bytes has more to do with calling a library than with a specific feature of a programming language. From the PL point of view, "rand()" is a function call. This function resides in a library that may or may not be written in the same language; in any case, it contains the logic to generate a random number. There are several ways to do this; Google or Wikipedia for Pseudo Random Number Generator (PRNG).
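
For a flavor of what's inside such a library call, here is a minimal sketch of a linear congruential generator, one of the simplest PRNG families (the multiplier and increment are the classic ANSI C example constants, used here purely for illustration):

    class Lcg:
        """Linear congruential generator: state -> (a*state + c) mod m."""
        def __init__(self, seed=1):
            self.state = seed

        def rand(self):
            # fully deterministic given the seed; it only *looks* random
            self.state = (1103515245 * self.state + 12345) % 2**31
            return self.state

    rng = Lcg(seed=42)
    print([rng.rand() % 100 for _ in range(5)])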

The standard reference on compilers and interpreters is the so-called Dragon Book, I'll leave it up to others to suggest more material.

u/rtanaka6 · 48 points · r/programming

But the Dragon Book has cool dragons on it!

u/Hakawatha · 27 points · r/electronics

That's because it is RF design. Have you read the handbook of black magic? Excellent book, I'm told.

u/Bluegorilla101 · 26 points · r/funny

Should have gotten her this dragon book so she can get a head start on writing compilers.

u/Lericsui · 26 points · r/learnprogramming

"Introduction to Algorithms"by Cormen et.al. Is for me the most important one.

The "Dragon" book is maybe antoher one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good.


Concrete Mathematics by Graham, Knuth, and Patashnik (you should know these names) is good for mathematical basics.


Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.


SICP (although married to a language) teaches very, very good fundamentals.


Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.

u/cronin1024 · 25 points · r/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/abstractifier · 22 points · r/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/Yehosua · 22 points · r/programming

Whenever I get an email about a new programming Humble Bundle, I hop over to Reddit to see if anyone else thinks it's worth buying. In this case, Reddit has failed me, because no one has shared their opinions. All is not lost, however, for I can share mine!

These are probably the most commonly recommended DevOps books:

  • The Phoenix Project - written in the form of a novel to teach DevOps principles. (I've read it. It's awesome.)
  • The DevOps Handbook - a non-fiction book from the authors of The Phoenix Project. (I've started it.)
  • Site Reliability Engineering - "SRE" is more or less Google's term for DevOps. This book is more or less how Google does DevOps. (I've not read it. It's available online for free.)
  • Accelerate: The Science of Lean Software and DevOps - Martin Fowler calls this the best programming book of this year. (I've not read it.)
  • The Site Reliability Workbook - a sequel of sorts to Site Reliability Engineering. Probably less popular than the others I just listed. (I've not read it.)

    The Site Reliability Workbook is the only one of these that's included in this bundle. So the first question I ask myself regarding this bundle is, "Do I want to spend the time and money on this bundle's books, or should I spend that on one of the highly recommended books instead?" (Personally, I'm going with the latter.)

    Otherwise, most of the books here are technology-specific, so the second question is, "Do I want to learn any of these specific technologies now, and are e-books a good way of doing it?" (Personally, my answer is no.)

    Depending on how you answer the first two questions, the last question is, "Are the non-technology-specific books worth getting?" To answer that, here are Amazon links to the non-technology-specific books, for reviews and sales rankings:

  • The Site Reliability Workbook
  • Designing Distributed Systems
  • Database Reliability Engineering
  • Seeking SRE
  • Cloud Native Infrastructure
  • Practical Monitoring
  • Effective DevOps
u/antonivs · 18 points · r/badcode

The code you posted was generated from a grammar definition, here's a copy of it:

http://www.opensource.apple.com/source/bc/bc-21/bc/bc/bc.y

As such, to answer the question in your title, this is the best code you've ever seen, in the sense that it embodies some very powerful computer science concepts.

It [edit: the Bison parser generator] takes a definition of a language grammar in a high-level, domain-specific language (the link above) and converts it to a custom state machine (the generated code that you linked) that can extremely efficiently parse source code that conforms to the defined grammar.

This is actually a very deep topic, and what you are looking at here is the output of decades of computer science research, which all modern programming language compilers rely on. For more, the classic book on the subject is the so-called Dragon Book, Compilers: Principles, Techniques, and Tools.

u/Cohesionless · 17 points · r/cscareerquestions

The resource seems very extensive, such that it should be plenty to make you a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay great attention to the important theoretical topics.

If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend to read the following:

  • Introduction to Algorithms, 3rd Edition: https://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844. A lot of people do not like this classic book because it is very theoretical, very mathematical, and very abstract, but I think that is its greatest strength. I find a lot of algorithms books either focus too much on how to implement an algorithm in a certain language or underplay the theoretical foundation of the algorithm, such that their readers can only recite the algorithms to their interviewers. This book forced me to think algorithmically to be able to design my own algorithms from all the techniques and concepts learned to solve very diverse problems.

  • Design Patterns: Elements of Reusable Object-Oriented Software, 1st Edition: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/. This is the original book on object-oriented design patterns. There are other more accessible books to read for this topic, but this is a classic. I don't mind if you replace this book with another.

  • Clean Code: A Handbook of Agile Software Craftsmanship, 1st Edition: https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882. This is the classic book that teaches software engineers how to write clean code. A lot of the best practices in software engineering are derived from this book.

  • Java Concurrency in Practice, 1st Edition: https://www.amazon.com/Java-Concurrency-Practice-Brian-Goetz/dp/0321349601. As a software engineer, you need to understand concurrent programming. These days there are various great concurrency abstractions, but I believe everyone should know how to use low-level threads and locks.

  • The Architecture of Open Source Applications: http://aosabook.org/en/index.html. This website features 4 volumes of books available to purchase or to read online for free. Its content focuses on over 75 case studies of widely used open-source projects, often written by the creators of said projects, about the design decisions and the like that went into creating their popular projects. It is inspired by this statement: "Architects look at thousands of buildings during their training, and study critiques of those buildings written by masters."

  • Patterns of Enterprise Application Architecture, 1st Edition: https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/. This is a good read to start learning how to architect large applications.

    The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers have strong problem solving skills, trained in computer science theory, to be the person who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.

    P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
u/quixotidian · 15 points · r/compsci

Here are some books I've heard are good (I haven't read them myself, but I provide commentary based on what I've heard about them):

u/ReasonableDrunk · 14 points · r/marvelstudios

The definitive work on high speed digital electrical engineering is, in fact, called the Handbook of Black Magic.

https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/mcur · 14 points · r/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tanenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&ie=UTF8&qid=1377402221&sr=1-1

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&ie=UTF8&qid=1377402371&sr=1-1

Edit: Got the wrong Hennessy/Patterson book...

u/pmjones · 13 points · r/PHP

Scalable Internet Architectures by Theo Schlossnagle. He was my boss at OmniTI and knows his stuff.

u/[deleted] · 12 points · r/programming

So I've been doing a bit more ASM programming etc. lately. I liked this book when I read it, but these days I've gotten interested in really doing fast programming, i.e. taking advantage of the processor's design in your code.

So if you liked this book and want to take it to the next step in writing superfast code, I suggest these resources:

  • Agner Fog's optimization page
  • Jon Stokes' book Inside the Machine is AMAZING and covers the dawn of advanced x86 processor design up until recently - all the way from the Pentium to the Core 2 line, and covers PPC design too!

    And if you're on Linux, you NEED to install perf and check if your CPU has any performance counters it can take advantage of. They give tons of insight and may upset some assumptions you have about how your code really performs. valgrind's cachegrind tool is wonderful in the same vein, but only for simulated L1-L2 cache usage.

    Also, if you have one of those fancy new phones with a modern processor, ARM assembly is wonderful and fun (I do it on my iPhone.) Shit, some of them are dual core now. Throw your C code in gcc -S or whatever and look at the generated assembly. I'll try and find my resources for that later.
u/teresko · 12 points · r/PHP

Actually I would suggest you start learning OOP and maybe investigate the MVC design pattern, since those are both subjects which the average CodeIgniter user will be quite inexperienced in. While you might keep on "learning" frameworks, it is much more important to actually learn programming.

Here are a few lectures that might help you with it:

u/cyrusol · 11 points · r/PHP

You might want to look at Patterns of Enterprise Application Architecture by Martin Fowler. There are 3 full examples that you are looking for right now:

  • table gateway
  • active record style ORMs (like Eloquent)
  • data mapper style ORMs (like Doctrine which is exactly like Java's Hibernate)

    The examples are in Java but you probably won't have any difficulties understanding the code. He builds those from the ground up and finally compares them and says when to use what.

    -----

    Beyond that, the abstraction endgame is to completely separate your persistence model from your domain model. If you have a user with an email address and other fields, you don't want just one class that deals with loading the user row from a SQL database based on his email address; you want the latter part isolated into its own class. (So that the user could in theory also be requested from an API or Memcached or a text file.)

    class User { /* ... */ }

    interface UserByEmailQuery {
        // note: PHP spells a nullable return type "?User", not "User?"
        public function findByEmail(string $email): ?User;
    }

    class SqlUserByEmailQuery implements UserByEmailQuery { /* ... */ }

    But it's sometimes simply not worth it (economically) to go that far.
u/PubliusPontifex · 11 points · r/compsci

The Dragon Book by somebody. A bit out of date now, but really helped me with my parser/tree implementation.

u/Authenticity3 · 10 points · r/ECE

Old (1993) but classic fundamentals that are still relevant today:
High Speed Digital Design: A Handbook of Black Magic https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90

u/correctsbadalts · 10 points · r/funny

Was expecting this dragon book

u/poincareDuality · 10 points · r/compsci

For designing programming languages, my favorites are

u/NAMOS · 10 points · r/onions

Basically any SRE advice for a normal service applies, but replace/complement HAProxy / nginx / ingress controller / ELB with the Tor daemon / OnionBalance.

I run Ablative Hosting and we have a few people who value uptime over anonymity etc and so we follow the usual processes for keeping stuff online.

Have multiples of everything (especially stuff that doesn't keep state), ensure you have monitoring of everything from connections, memory pressure, open files, free RAM etc etc.

Just think of the Tor daemon's onion service as a TCP reverse proxy with load-balancing capability, and then follow any other advice when it comes to building reliable infrastructure;

u/Beagles_are_da_best · 9 points · r/PrintedCircuitBoard

I did learn all of this stuff from experience. Honestly, I had a little bit of a tough time right out of college because I didn't have much practical circuit design experience. I now feel like I have a very good foundation for that, and it came through experience, learning from my peers, and lots of research. I have no affiliation with Henry Ott, but I treat his book like a bible. I refer to it just about every time I do a board design. Why? Because it's packed with this type of practical information. Here's his book. I bought mine used as cheap as I could. At my previous job, they just had one in the library. Either way, it was good to have around.

So why should you care about electromagnetic compatibility (EMC)? A couple reasons:

  1. EMC compliance is often regulated by industry and becomes a product requirement. The types of tests that your product has to pass depend on the industry, but in general there are tests where bad things are injected into your board and tests where they measure how noisy your board is. You have to pass both.
  2. EMC compliance, in my opinion, is very well correlated with the reliability and quality of a product. If a product is destroyed "randomly" or stops working when the microwave is on, you're not likely to have a good opinion of that product. Following guidelines like the one I did above is the path to avoiding problems like that.
  3. EMC design is usually not taught in schools and yet it is the most important part of the design (besides making it perform the required product function in the first place). It also is very hard to understand because many of the techniques for improving your design do not necessarily show up on your schematics. Often, it's about how well you lay out your board, how the mechanical design for the enclosure of your board is considered, etc.

    Anyways, it's definitely worth looking at and is a huge asset if you can follow those guidelines. Be prepared to enter the workforce and see rampant disregard for EMC best practices as well as rampant EMC problems in existing products. This is common because, as I said, it's not taught and engineers often don't know what tools to use to fix it. It often leads to expensive solutions where a few extra caps and a better layout would have sufficed.

    A couple more books I personally like and use:

    Howard Johnson, High Speed Digital Design (it's from 1993, but still works well)

    Horowitz and Hill, The Art of Electronics (good for understanding just about anything, good for finding tricks and ideas to help you for problems you haven't solved before but someone probably has)

    Last thing since I'm sitting here typing anyways:

    When I first got out of college, I really didn't trust myself even when I had done extensive research on a particular part of design. I was surrounded by engineers who also didn't have the experience or knowledge to say whether I was on the right path or not. It's important to use whatever resources you have to gain experience, even if those resources are books alone. It's unlikely that you will be lucky and get a job working with the world's best EE who will teach you everything you need to know. When I moved on from my first job after college, I found out that I was on the right path on many things thanks to my research and hard work. This was in opposition to my thinking before then as my colleagues at my first job were never confident in our own ability to "do EE the right way" - as in, the way that engineers at storied, big companies like Texas Instruments and Google had done. Hope that anecdotal story pushes you to keep going and learning more!
u/greenlambda · 9 points · r/ECE

I'm mostly self-taught, so I've learned to lean heavily on App Notes, simulations, and experience, but I also like these books:
The Howard Johnson Books:
High Speed Digital Design: A Handbook of Black Magic
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_api_I0Iwyb99K9XCV
High Speed Signal Propagation: Advanced Black Magic
https://www.amazon.com/dp/013084408X/ref=cm_sw_r_cp_api_c3IwybKSBFYVA

Signal and Power Integrity - Simplified (2nd Edition)
https://www.amazon.com/dp/0132349795/ref=cm_sw_r_cp_api_J3IwybAAG9BWV

Also, another thing that can be overlooked is PCB manufacturability. It's vitally important to understand exactly what can and can't be manufactured so that you can make design trade offs, and in order to do that you need to know how they are made. As a fairly accurate intro, I like the Eurocircuits videos:
http://www.eurocircuits.com/making-a-pcb-pcb-manufacture-step-by-step

u/Dhekke · 8 points · r/programming

This book, Structured Computer Organization, is also very good at explaining in detail how the computer works; it's the one I used in college... Pretty expensive, I know, but at least the cover has nice drawings!

u/dbuckley · 8 points · r/technology

> Why does it transmit anything at all

Electronics that have fast switching transitions generate significant amounts of radio frequency energy. In the modern world, it is a major part of the designer's job to reduce or shield these emissions so that equipment doesn't interfere with other equipment.

There is a lot of skill and art and not a little black magic involved in getting high speed electronics to work at all. In fact, one of the first books to seriously tackle the subject, a book that many designers still have on their bookshelves, is High Speed Digital Design: A Handbook of Black Magic. The challenge once it works is to make it less like a transmitter.

To prove the thing is compliant with the standards of wherever it is being sold, it is traditional to take the kit to an EMC test house, where the Device Under Test (DUT) is placed in a screened room and set up in representative conditions (i.e. power cables, Ethernet cables, etc.), and the amount of radio frequency junk spewed into the air is measured. This costs money; according to here, about $1-10K. If you fail, you have to fix the design and spend money for testing again, until it passes. And of course, fixing the design takes time.

Many countries are happy to sell kit across international boundaries with none of this stuff done at all.

u/DonaldPShimoda · 8 points · r/ProgrammingLanguages

I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference with more discussion of these sorts of things!

Another option is a "real" textbook.

My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.

But the dragon book is the go-to reference on compilers that is slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.

Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.

u/HikoboshiSama · 8 points · r/compsci
u/Kingizzardthelizard · 8 points · r/linuxquestions

I've got a bookmark folder filled with resources just for the day I choose to stop being a lazy slob. Here's some:

https://www.kernel.org/doc/html/latest/ - Official docs

http://www.dit.upm.es/~jmseyas/linux/kernel/hackers-docs.html - Resource list including books and webpages

Some books i got from libgen:

Professional Linux Kernel Architecture

Understanding the Linux Kernel, Third Edition

Linux Kernel Development (3rd Edition)

u/LinuxStreetFighter · 8 points · r/linuxmasterrace

Intro

Command Line

You mentioned MIS, which is fine, but from a Computer Science perspective, I can't recommend this book enough:

Computer Architecture


Raspbian has basic programs written in Scratch and Java if you go to /home/user/Documents

You'll have to search under BlueJ and the other IDEs they use. Replace user with the user name.

Have fun :)

u/Quintic · 8 points · r/learnprogramming

Here are some standard textbooks on the subject. When I am looking for a book on a particular subject, I like to look at the class schedules for local universities and see what they are using. A class on programming languages is a standard part of a CS program I believe.

Compilers: Principles, Techniques, and Tools
http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811/ref=sr_1_3?ie=UTF8&qid=1343095509&sr=8-3&keywords=Dragon+Book

Concepts of Programming Languages
http://www.amazon.com/Concepts-Programming-Languages-Robert-Sebesta/dp/0136073476/ref=sr_1_5?s=books&ie=UTF8&qid=1343095607&sr=1-5&keywords=Programming+languages

Programming Language Pragmatics
http://www.amazon.com/Programming-Language-Pragmatics-Second-Michael/dp/0126339511/ref=sr_1_2?s=books&ie=UTF8&qid=1343095647&sr=1-2&keywords=Programming+language+pragmatics

u/DVWLD · 7 points · r/node

You should start by learning Go, Erlang, Rust and C.

/trolololololololol

Seriously, though, if you're talking about cramming as many users onto a single machine as possible, then Node is not your runtime.

Node is great at building things that scale horizontally. It makes it really easy to write realtime, event based code. Node is really good at doing things that involve a lot of network IO since it's easy to do that in a non-blocking way. It's not a great choice for a high scale game server where memory usage is key.

If you want to know more about horizontal scaling patterns (which Eve only qualifies for if you squint a bit), I'd recommend starting here:

http://www.amazon.com/Scalable-Internet-Architectures-Theo-Schlossnagle/dp/067232699X

And looking at distributed consensus approaches, message queues, and bumming around http://highscalability.com/ a bit.

u/theoldwizard1 · 7 points · r/hardware

>ISAs are commonly categorized by their complexity, i.e., the size of their instruction space: large ISAs such as x86-64 are called Complex Instruction Set Architectures (CISC), while the chips powering smartphones and other portable, low-power devices are based on a Reduced Instruction Set Architecture (RISC). The huge instructions space of the typical CISC ISA necessitates equally complex and powerful chips while RISC designs tend to be simpler and therefore less power hungry.

I do understand that this is a "once over lightly", but, based on spending a few years of my career working with a team to select a "next gen" embedded processor for a Fortune 50 company (that would purchase millions), I do feel qualified to make these comments. (I am also the proud owner of a well worn 1st Edition of the Hennessy and Patterson Computer Architecture: A Quantitative Approach.)

The lines between RISC and CISC keep getting muddier every year that goes by. While it is probably no longer true, the biggest differentiator between RISC and CISC was that RISC used fixed-length instructions. This made decoding the instructions MUCH simpler. The decode portion of a CISC CPU had to grab a few bytes, partially decode them, and then decide how many more bytes to grab.

The old Digital Equipment Corporation VAX architecture was (and probably still is) the MOST complex instruction set architecture. Most arithmetic and logical operations could have 3 operands, and each operand could have any combination of multiple addressing modes. Worse, the VAX architecture dedicated 3 of the only 16 registers for "context" (SP, FP and AP).

RISC machines had more registers than CISC machines and, over time, compiler writers figured out how to do the equivalent of the FP and AP from deltas off the SP. With the larger number of registers, typically one register was a dedicated constant zero register, necessary because all memory was accessed via indirect addressing. For embedded processors that had no loader to do "fix up" at load time, 1 or 2 more registers became dedicated pointers to specific types of memory (perhaps RAM vs ROM, or "short" data vs "complex" data, i.e. arrays, strings, etc.)

With smaller die sizes, RISC machines could have more cache on chip. More cache meant "more faster"!

u/fatangaboo · 7 points · r/ECE

For your job? Spend the money or get your boss to spend the money on the books written by Howard Johnson.

(book 1)

(book 2)

Trivialized and unsatisfying answer to the question in the title of this thread: Vbounce = Lground * dI/dt. You think Lground equals zero, but you are mistaken.
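
To put rough numbers on that formula (hypothetical values, chosen just for scale): 5 nH of ground-lead inductance and a 100 mA swing in 1 ns already gives half a volt of bounce:

    # Vbounce = Lground * dI/dt, with assumed example values
    L_ground = 5e-9   # ground-lead inductance in henries (assumed)
    dI = 100e-3       # current swing in amps (assumed)
    dt = 1e-9         # switching edge time in seconds (assumed)
    print(f"Vbounce = {L_ground * dI / dt:.2f} V")  # 0.50 V -- not zero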

u/YuleTideCamel · 7 points · r/learnprogramming

Pick up these books:

u/masklinn · 7 points · r/programming

Yet another long-winded and mostly useless BeautifulCode blog post that could've been avoided by suggesting that readers go buy Fowler's Refactoring (and read page 260 in this case, Introduce Null Object) as well as the same Fowler's Patterns of Enterprise Application Architecture, page 496, Special Case: a subclass that provides special behavior for particular cases.

Also, note that while the whole texts and explanations and examples and whatnot are not provided, you can find the refactorings and patterns of these books in The Refactoring Catalog and Catalog of Patterns, the respective websites of Refactoring and PoEAA

u/hwillis · 6 points · r/electronics

Can't use the free version of Eagle for this (board too big), but KiCad or probably other things would work. With a few good books you can lay out a big board without advanced tools, although it can take longer. With cheap/free tools you'll usually have to use some finicky or kludgy methods to do really complex routing (blind/buried vias, free vias, heat transfer, trace length), but that usually isn't too big a deal. Here's a timelapse of a guy using Altium to route a high speed, large (a bit smaller than OP's) data board for a high speed camera. The description has rough steps with timestamps - 38 hours total to lay out.

u/Jazzy_Josh · 6 points · r/cscareerquestions

The dragon book if you're into compilers

There's a second edition, but I think this one has a cooler cover ;)

u/FattyBurgerBoy · 6 points · r/webdev

The book, Head First Design Patterns, is actually pretty good.

You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.

Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.

If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.

My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.

u/llimllib · 6 points · r/compsci

Sipser (I have the first edition, which you can get on the cheap; it's very good.)

AIMA

Dragon

Naturally, TAOCP.

Many will also recommend SICP, though I'm not quite sure that's what you're angling at here, it's probably worth browsing online to see.

u/blexim · 5 points · r/REMath

The object you're interested in is the call graph of the program. As you've observed, this is a DAG iff there is no recursion in the program. If function A calls B and B calls A, this is called mutual recursion and still counts as recursion :)
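
As a concrete aside (a toy sketch, not from the original comment): a program is recursion-free iff a depth-first search of its call graph, given here as an adjacency dict, finds no back edge. Mutual recursion shows up as a cycle just like direct recursion does:

    def has_recursion(call_graph):
        WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / finished
        color = {f: WHITE for f in call_graph}

        def dfs(f):
            color[f] = GRAY
            for callee in call_graph.get(f, ()):
                if color.get(callee, WHITE) == GRAY:  # back edge: cycle found
                    return True
                if color.get(callee, WHITE) == WHITE and dfs(callee):
                    return True
            color[f] = BLACK
            return False

        return any(color[f] == WHITE and dfs(f) for f in call_graph)

    print(has_recursion({"A": ["B"], "B": ["A"]}))  # True: mutual recursion
    print(has_recursion({"A": ["B"], "B": []}))     # False: the graph is a DAG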

A related graph is the control flow graph (CFG) of a function. Again, the CFG is a DAG iff the function doesn't contain loops.

An execution trace of a program can certainly be represented as a DAG. In fact, since an execution trace does not have any branching, it is just a straight line! However you are very rarely interested in a single trace through a program -- you usually want to reason about all the traces. This is more difficult because if you have any looping structure in the global CFG, there is no (obvious) upper bound on the size of a trace, and so you can't capture them all with a finite structure that you can map into SMT.

Every program can be put into SSA form. The trick is that when you have joins in the control flow graph (such as at the head of a loop), you need a phi node to fix up the SSA indices. If you don't have it already, the dragon book is pretty much required reading if you're interested in any kind of program analysis.
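
To make the renaming part concrete, here is a toy SSA renamer for straight-line code only (no joins, so no phi nodes are needed; variables never assigned are treated as inputs with index 0):

    def to_ssa(stmts):
        # stmts: list of (target, vars_used_on_rhs); returns renamed pairs
        version = {}
        out = []
        for target, rhs_vars in stmts:
            # uses refer to the current version of each variable
            uses = [f"{v}{version.get(v, 0)}" for v in rhs_vars]
            # the definition gets a fresh SSA index
            version[target] = version.get(target, 0) + 1
            out.append((f"{target}{version[target]}", uses))
        return out

    # x = 1; y = x + 2; x = x + y
    print(to_ssa([("x", []), ("y", ["x"]), ("x", ["x", "y"])]))
    # [('x1', []), ('y1', ['x1']), ('x2', ['x1', 'y1'])]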

In general, if you have a loop free control flow graph of any kind (a regular CFG or a call graph), then you can translate that graph directly into SAT or SMT in a fairly obvious way. If you have loops in the graph then you can't do this (because of the halting problem). To reason about programs containing loops, you're going to need some more advanced techniques than just symbolic execution. The big names in verification algorithms are:

  • Bounded model checking
  • Abstract interpretation
  • Predicate abstraction
  • Interpolation based methods

    A good overview of the field is this survey paper. To give an even briefer idea of the flavour of each of these techniques:

    Bounded model checking involves unwinding all the loops in the program a fixed number of times k. This gives you a DAG representing all of the traces of length up to k. You bitblast this DAG (i.e. convert it to SAT/SMT) and hand off the resulting problem to a SMT solver. If the problem is SAT, you've found a concrete bug in the program. If it's UNSAT, all you know is that there is no bug within the first k steps of the program.
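
    As a toy illustration (assuming the z3 solver's Python bindings, installable with pip install z3-solver): the program "x = 0; while (x < 3) x++" unwound k = 3 times, checked against the property x <= 3:

        from z3 import Int, If, Solver

        k = 3
        x = [Int(f"x{i}") for i in range(k + 1)]
        s = Solver()
        s.add(x[0] == 0)
        for i in range(k):
            # one copy of the loop body per unwinding, guarded by the loop test
            s.add(x[i + 1] == If(x[i] < 3, x[i] + 1, x[i]))
        s.add(x[k] > 3)   # try to violate the property x <= 3
        print(s.check())  # unsat: no violation within the first k steps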

    Abstract interpretation is about picking an abstract domain to execute your program on, then running the program until you reach a fixed point. This fixed point tells you some invariants of your program (i.e. things which are always true in all runs of the program). The hope is that one of these invariants will be strong enough to prove the property you're interested in.
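
    For a tiny sketch of the idea, take a sign domain ("zero", "pos", "nonneg", "top") and run "x = 0; while (...) x = x + 1" abstractly until the value at the loop head stops changing:

        def join(a, b):
            # least upper bound in the little sign lattice
            if a == b:
                return a
            if {a, b} <= {"zero", "pos", "nonneg"}:
                return "nonneg"
            return "top"

        def plus_one(a):
            # abstract transfer function for "x = x + 1"
            return {"zero": "pos", "pos": "pos", "nonneg": "pos"}.get(a, "top")

        x = "zero"                      # abstract value of x at the loop head
        while True:
            new = join(x, plus_one(x))  # merge loop entry with the back-edge
            if new == x:
                break                   # fixed point reached
            x = new
        print(x)  # "nonneg": the invariant x >= 0 holds in every run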

    Predicate abstraction is just a particular type of abstract interpretation where your abstract domain is a bunch of predicates over the variables of the program. The idea is that you get to keep refining your abstraction until it's good enough to prove your property using counterexample guided abstraction refinement.

    Interpolation can be viewed as a fancy way of doing predicate refinement. It uses some cool logic tricks to do your refinement lazily. The downside is that we don't have good methods for interpolating bitvector arithmetic, which is pretty crucial for analyzing real programs (otherwise you don't take into account integer overflow, which is a problem).

    A final wildcard technique that I'm just going to throw out there is loop acceleration. The idea here is that you can sometimes figure out a closed form for a loop and replace the loop with that. This means that you can sometimes remove a loop altogether from the CFG without losing any information or any program traces. You can't always compute these closed forms, but when you can you're in real good shape.
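
    For instance (a toy example, not from the original comment), an accelerator that recognizes a summing loop can replace it with its closed form and delete the loop from the CFG entirely:

        def summed(n):
            # the original loop: 0 + 1 + ... + (n - 1)
            total = 0
            for i in range(n):
                total += i
            return total

        def accelerated(n):
            # the closed form that can replace the loop
            return n * (n - 1) // 2

        assert all(summed(n) == accelerated(n) for n in range(100))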

    Drop me a message if you want to know anything else. I'm doing a PhD in this exact area & would be happy to answer any questions you have.
u/ElectricRebel · 5 points · r/compsci

For compilers:

u/boredcircuits · 5 points · r/learnprogramming

Start with the Dragon Book.

When it actually comes time to implement the language, I would recommend just writing the frontend and reusing the backend from another compiler. LLVM is a good option (it's becoming popular to use as a backend, it now has frontends for C, C++, Objective C, Java, D, Pure, Hydra, Scheme, Rust, etc). See here for a case study on how to write a compiler using LLVM as the backend.

u/fbhc · 5 points · r/AskComputerScience

My compilers course in college used the Dragon Book, which is one of the more quintessential books on the subject.

​

But you might also consider Basics of Compiler Design which is a good and freely available resource.

​

I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/moyix · 5 points · r/ReverseEngineering

Have you worked through the loop detection in the Dragon Book? There are some slides on it here:

http://www.cs.cmu.edu/afs/cs/academic/class/15745-s03/public/lectures/L7_handouts.pdf

u/nwndarkness · 4 points · r/FPGA

Computer Organization and Design RISC-V Edition: The Hardware Software Interface (ISSN) https://www.amazon.com/dp/B0714LM21Z/ref=cm_sw_r_cp_api_i_Wn3xDbMYHH61S

Computer Architecture: A Quantitative Approach (The Morgan Kaufmann Series in Computer Architecture and Design) https://www.amazon.com/dp/0128119055/ref=cm_sw_r_cp_api_i_jp3xDbRYQ12GA

u/motus · 4 points · r/programming

The classic book on CORBA is

http://www.amazon.com/Advanced-Programming-Addison-Wesley-Professional-Computing/dp/0201379279

Worth reading even if you don't plan to use C++ on your project.

In addition to that, the OMG site

http://www.omg.org/gettingstarted/corbafaq.htm

has a lot of downloadable PDFs, and your favorite ORB probably has some good docs as well.

For decent C++ and Python mappings, check out omniORB:

http://omniorb.sourceforge.net/docs.html

You may also want to take a look at ICE, which is very similar to CORBA, but has somewhat better language mappings (esp. for C++):

http://www.zeroc.com/

Hope this helps!

u/DamnLogins · 4 points · r/WTF

Charlie Stross (author of the blog) has just gone up even further in my estimation. Not only is he a great SF author but a great Perl programmer as well.

u/yberreby · 4 points · r/programming

When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel of the subject.

As for 'normal' resources, I've personally found these helpful:

  • This list of talks about compilers in general.
  • The LLVM Kaleidoscope tutorial, which walks you through the creation of a compiler for a simple language, written in C++.
  • The Super Tiny Compiler. A really, really simple compiler, written in Go. It helps with understanding how a compilation pipeline can be structured and what it roughly looks like.
  • Anders Hejlsberg's talk on Modern Compiler Construction. Helps you understand the difference between the traditional approach to compilation and new approaches, with regards to incremental recompilation, analysis of incomplete code, etc. It's a bit more advanced, but very interesting nevertheless.

    In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.

    For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.
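
    To give a feel for the pattern (a toy sketch in Python, not code from either compiler), here is a hand-written recursive descent parser for arithmetic expressions:

        import re

        # grammar: expr   -> term (('+'|'-') term)*
        #          term   -> factor (('*'|'/') factor)*
        #          factor -> NUMBER | '(' expr ')'
        TOKEN = re.compile(r"\d+|[()+\-*/]")

        class Parser:
            def __init__(self, src):
                self.tokens = TOKEN.findall(src)
                self.pos = 0

            def peek(self):
                return self.tokens[self.pos] if self.pos < len(self.tokens) else None

            def advance(self):
                tok = self.peek()
                self.pos += 1
                return tok

            def expr(self):
                # start at the 'top' and decide what to parse as we scan
                node = self.term()
                while self.peek() in ("+", "-"):
                    node = (self.advance(), node, self.term())
                return node

            def term(self):
                node = self.factor()
                while self.peek() in ("*", "/"):
                    node = (self.advance(), node, self.factor())
                return node

            def factor(self):
                tok = self.advance()
                if tok == "(":
                    node = self.expr()
                    self.advance()  # consume the closing ')'
                    return node
                return int(tok)

        print(Parser("1 + 2 * (3 - 4)").expr())
        # ('+', 1, ('*', 2, ('-', 3, 4)))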

    I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.

    You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.

    EDIT: grammar
u/fluicpana · 4 points · r/italy

To quickly test the waters you can use https://rubymonk.com/ (a basic introduction to Ruby). Coursera, Khan Academy, Udacity and the like also have introductory programming courses.

If you actually want to learn to program, though, the path should hit at least all of these milestones, in order:

  1. [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938)

  2. The Structure and Interpretation of Computer Programs

  3. A good book on Assembly

  4. The C Programming Language

  5. Compilers

  6. Code Complete, The Practice of Programming

  7. Pretend you have read all of The Art of Computer Programming

  8. An object-oriented language, maybe Programming Ruby

  9. And/or Python, Dive Into Python

  10. Design Patterns

  11. Learn a functional language.


    From here you can start out and specialize in whatever interests you

u/xamino · 3 points · r/learnprogramming

I don't think we need to go that deep. An excellent book on how CPUs work at the assembly level is Inside the Machine. I highly recommend it, even for programmers.

u/bluecav · 3 points · r/raspberry_pi

I'm an ECE that got into Raspberry Pi about a month ago. I work in microelectronics (chip design), and wanted to use it to get back into larger scale electronics hacking and to do some more hardware oriented programming and projects.

As such, I had to basically reform my electronics gadget supply at the same time since I ditched my college collection a while back when moving to a new house.

Here's some of the key things I bought to go with my Pi that I felt I needed. I'm assuming you're like me and want to work on electronics hardware (lights, switches, etc).

  • Raspberry Pi B+: I wanted the larger sized one with more memory and USB ports as the prototype environment. As I get stuff fully working, I plan to buy an A+ for the implementation environment. I bought the Canakit Ultimate Starter Kit on Amazon
  • You'll want a good microSD card. I swapped out the 8GB one from the Canakit for a 16GB one since I want to store some data on the card for a project I have in mind
  • A case: I used the one from the Canakit
  • A USB keyboard. The Logitech K400 is nice (just make sure to pair it on a PC first), or the Rii i8 Mini works nicely (I have both)
  • If you want an onboard display, look at the PiTFT from Adafruit. I used that for my initial setup, then set up my Pi to autostart a VNC server on boot and now I work without a display. If you don't want the PiTFT, you can use a TV or a monitor if it supports HDMI (or a regular monitor with an HDMI to DVI adapter).
  • If hardware hacking, a breadboard and cobbler board: You'll want a breadboard for prototyping electronics projects before soldering to a PCB, and a cobbler board to connect the pin header of the Raspberry to your breadboard. I used the one from the Canakit but there are various ones out there you can buy
  • To go with a breadboard, I suggest flexible breadboard wire. These or these would work.
  • If hardware hacking, you'll want LEDs, switches, and resistors/capacitors. I really like these resistors (they came bagged and labelled), the LEDs I started out with from Radio Shack, and for switches I really like these ones. They snap right into a breadboard. The caps I just got at RadioShack.
  • You may want to grab a multimeter as well. I have two myself with different functions (one for logic probing mainly).

    Beyond those basic starter components, the rest is up to your imagination and what you want to do next. In my case, I plan to drive higher current components, so I'll be using optocouplers and relays eventually. And I plan to make my own PCBs to snap onto the Raspberry, so I have PCBs, headers, and soldering stuff.

    If you're new to the Raspberry, there's online resources out there. I also got this book off Amazon as a starter as well, which I've been coupling with online resources.

    On the Arduino side, that's my next purchase since I may find it easier to have the software and server side of one of my projects on a Pi, and the hardware interface on an Arduino. I'm just going to get an R3 board to start since I have the rest of the stuff they usually include in a starter pack listed above.

    This blog did a nice writeup comparing some Arduino R3 starter kits:
    https://www.pretzellogix.net/2014/10/09/three-arduino-starter-kits-compared-and-reviewed/
u/wgren · 3 points · r/dcpu_16_programming

Code: The Hidden Language of Computer Hardware and Software,The Elements of Computing Systems and Inside the Machine were recommended on Hacker News.

I have the last one, I will re-read it over Easter holidays...

u/GrayDonkey · 3 points · r/java

You need to understand there are a couple of ways to do Java web development.

  • Servlets & JSPs. - Check out Core Servlets and JavaServer Pages or the Java EE Tutorial. Note that I link to an older EE tutorial because the newer versions try to switch to JSF and not much changed in Servlets and JSPs between Java EE 5 and 6. I recommend learning Servlets and JSPs before anything else.
  • JSF - A framework that is layered on top of Servlets and JSPs. Works well for some tasks, like making highly form-centric business web apps. Most of the JSF 2 books are okay. JSF is covered in the Java EE 6 Tutorial
  • Spring - Spring is actually a bunch of things. You'd want to learn Spring MVC. If you learn any server-side Java web tech besides Servlets and JSPs you'd probably want to learn Spring MVC. I wouldn't bother with GWT or any other server-side Java web tech.
  • JAX-RS - After you get Servlets and JSPs down, this is the most essential thing for you to learn. More and more, you don't use server-side Java (Servlets & JSPs) to generate your client's HTML; instead you use client-side JavaScript to make AJAX calls to a Java backend via HTTP/JSON. You'll probably spend more time with JavaScript: The Good Parts and JavaScript: The Definitive Guide than anything else. Also, the JAX-RS API isn't that hard, but designing a good RESTful API can be, so be on the lookout for language-agnostic REST books.

    Definitely learn Hibernate. You can start with the JPA material in the Java EE tutorial.

    As for design patterns, Design Patterns: Elements of Reusable Object-Oriented Software is a classic. I also like Patterns of Enterprise Application Architecture for more of an enterprise system pattern view of things. Probably avoid most J2EE pattern books. Most of the Java EE patterns came about because of deficiencies in the J2EE/Java EE platform. As each new version of Java EE comes out, you see that the patterns that have arisen become part of the platform. For example, you don't create a lot of database DAOs because JPA/Hibernate handles your database integration layer. You also don't write a lot of service locators now because of CDI. So books like Core J2EE Patterns can be interesting, but if you are learning a modern Java web stack you'll be amazed at how archaic things used to be if you look at old J2EE pattern books.

    p.s. Don't buy anything that says J2EE, it'll be seven years out of date.
u/OmegaNaughtEquals1 · 3 points · r/cpp_questions

This is a great question! It's also one that every serious CS person will ask at some point. As others here have noted, to really understand this question you must understand how compilers work. However, it isn't necessary to understand the gory details of compiler internals to see what a compiler does for you. Let's say you have a file called hello.cpp that contains the quintessential C++ program

include <iostream>

int main() {<br />
    std::cout &amp;lt;&amp;lt; &quot;Hello, world!\n&quot;;<br />
}<br />


The first thing the compiler does is called preprocessing. Part of this process includes expanding the #include statements into their proper text. Assuming you are using gcc, you can have it show you the output of this step

gcc -E -o hello.pp hello.cpp

For me, the hello.cpp file explodes from 4 lines to nearly 18000! The important thing to note here is that the contents of the iostream library header occur before the int main lines in the output.

The next several steps for the compiler are what you will learn about in compiler design courses. You can take a peek at gcc-specific representations using some flags as discussed on SO. However, I pray you give heed. For there be dragons!

Now let's take a look at the compiler's output. To do this, I am going to not #include anything so the output is very simple. Let's use a file called test.cpp for the rest of the tests.

    int main() {
        int i = 3, j = 5;
        float f = 13.6 / i;
        long k = i << j;
    }

To see the compiler's output, you can use

g++ -S -masm=intel test.cpp

The -S flag asks gcc to just output the generated assembly code and -masm=intel requests the intel dialect (by default, gcc uses the AT&T dialect, but everyone knows the intel one is superior. :) ) The output on my machine (ignoring setup and teardown code) is outlined below.

push rbp
mov rbp, rsp

/* int i = 3, j = 5; */
mov DWORD PTR [rbp-20], 3
mov DWORD PTR [rbp-16], 5

/* float f = 13.6 / i; */
pxor xmm0, xmm0
cvtsi2sd xmm0, DWORD PTR [rbp-20]
movsd xmm1, QWORD PTR .LC0[rip]
divsd xmm1, xmm0
movapd xmm0, xmm1
cvtsd2ss xmm2, xmm0
movss DWORD PTR [rbp-12], xmm2

/* long k = i << j; */
mov eax, DWORD PTR [rbp-16]
mov edx, DWORD PTR [rbp-20]
mov ecx, eax
sal edx, cl
mov eax, edx
cdqe
mov QWORD PTR [rbp-8], rax

/* implicit return 0; */
mov eax, 0
pop rbp
ret

There are lots of details to learn in here, but you can generally see how each simple C++ statement translates into many assembly instructions. For fun, try compiling that program with the optimizer turned on (with g++, you can use -O3). What is the output?

There is still much to see from the binary that is assembled. You can use nm and objdump to see symbols or ldd to see what other libraries were (dynamically) linked into the executable. I will leave that as an exercise for the reader. :)

u/sfrank · 3 points · r/programming

But make sure to get the 2nd edition of the Compiler book. It has been enhanced quite a bit.

u/evaned · 3 points · r/programming

&gt; And which coder uses physical books any more?

I do for things beyond actual language references; e.g. maybe everything in The Dragon Book has a good description somewhere, but grabbing that off my desk and having a decent chance of it having what I want (it has some problems and omissions, but it's reasonably good) will save wading through a bunch of crap online until I find something complete and accurate enough.

u/bobappleyard · 3 points · r/programming

I don't know what he'd recommend, but I found the Dragon Book and Modern Compiler Design to be decent treatments of the subject. There are lots of interesting texts out there though.

Sorry for the cheeky reply.

u/jhillatwork · 3 points · r/compsci

In addition to these, check out Computer Architecture: A Quantitative Approach by Hennessy & Patterson. I had this as a textbook as an undergrad and still throw it at folks when they are doing very low-level optimizations that require intimate understanding of modern computers.

u/jstampe · 3 points · r/compsci

I found Tanenbaum's Structured Computer Organization to be very good. Giving a complete overview of the whole stack.

u/Wil_Code_For_Bitcoin · 3 points · r/PrintedCircuitBoard

Not entirely sure what you're looking for, but I've heard a lot of praise for this book: https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/dietfig · 3 points · r/electronics

High Speed Digital Design: A Handbook of Black Magic is supposed to be a great book on the subject but the frequencies you're working at don't really qualify as anything approaching "high speed". I really don't think you'll have any issues. The wavelength at 100 kHz is 3 kilometers so you're nowhere near having to worry about transmission line effects.

Make sure to adequately decouple every power pin at the chip to deal with the switching transients from the FETs otherwise you'll see a lot of ripple on your supply lines which can cause problems. ADI generally uses a 1 uF and 100 nF capacitor in parallel (IIRC) in their application circuits and I tend to think they know what they're doing.

Is your copper pour grounded? I wouldn't be very worried about coupling noise into your logic traces because 400 Hz is such a low frequency but I suppose it's possible.

ADI publishes a guide called "PCB Board Layout and Design Techniques" that goes through things like proper grounding but I didn't have any luck trying to find it on Google. The Circuit Designer's Companion is an excellent book that also covers the same material with a lot more depth.

u/Skipper_Jos · 3 points · r/engineering

I will also recommend the 'High Speed Digital Design: A Handbook of Black Magic' book; it definitely has some good stuff!
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90#

u/erasmus42 · 3 points · r/rfelectronics
u/doodle77 · 3 points · r/electronics

this book.

OP's board is clearly not high speed so it doesn't matter.

u/ntr0p3 · 3 points · r/AskReddit

By biology I don't mean what they teach you in college or med school; I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of the organs or parts of a cat is completely worthless; understanding the process of gene activation, and how that enables living organisms to better adapt to their environments (for instance, stress factors activating responses to new stimuli), can be very valuable, especially as a function of applied neurology.

Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.

Economics and politics are controversial, but it is seeing the underlying forces that matters, similar to not understanding how gravity works but still knowing that a dropped lead ball will accelerate downwards at 9.78 m/s^2. This is a field that can wait till later though, and probably should.

For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.

I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.

For CS I highly recommend the dragon book, and design patterns, and if you need ASM The worst designed website ever.

For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(

Best of luck in your travels however! :)

edit: For physics, if your math is bad get both of his books. They break it down well. If your math is better, try one of Witten's books, but they are kinda tough; guy is a fucking genius.

also, Feynman QED is great, but his other book is awesome just as a happy intellectual read

also try to avoid both Kaku and Hawking for anything more complicated than primers.

edit no. 9: MIT's OCW is win itself.

edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields)

u/sindrit · 3 points · r/compsci

Skip Calculus (not really useful unless you do fancy graphics or sound generators or scientific stuff). Discrete mathematics is what you want to look at for CS. You might want to move on to a linear algebra course from there.

Get the CS specific University textbooks. Here are some to get you started.

u/dohpaz42 · 3 pointsr/PHP

Agreed. There are plenty of resources out there that will help you understand design patterns. If you're new to the concept, I would recommend Head First Design Patterns; it might be based on Java, but the examples are simple to understand and mostly apply to PHP as well. When you feel like you've grasped the basic concepts of design patterns, you can move on to more advanced texts, like Martin Fowler's Patterns of Enterprise Application Architecture - this is a great reference for a lot of the more common patterns. There is also Refactoring: Improving the Design of Existing Code. These are great investments that will help you with any project you work on, and will help you if you decide to use a framework like Zend, which uses design patterns very heavily.

u/smugglerFlynn · 3 pointsr/compsci
  1. Read any book on Java Patterns (probably the one by GoF) - they are applicable across different domains
  2. Read Patterns of Enterprise Application Architecture by Martin Fowler - this is the book that influenced many architects
  3. Study the field: The Architecture of Open Source Applications
  4. Study fundamentals: A Methodology for Systems Engineering by Arthur D. Hall - this one is hard to find
  5. Study as much different frameworks/architectures/languages as possible. You'll start to notice similarities yourself.
  6. Solve every problem you meet from architectural point of view. You will achieve nothing just reading the books. Refactor your old projects in terms of using patterns and new methodologies, write down designs for your wild random ideas, teach others about the stuff you know.

    Also, take note: patterns are only a small part of systems engineering applied to CS. Think REST and SOAP; think of how to better integrate two different applications together, not just how to code their insides more efficiently. Start considering business logic and requirements, and you will find yourself a whole new level of challenging architectural tasks.
u/pitiless · 3 pointsr/PHP

The following books would be good suggestions irrespective of the language you're developing in:

Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.

Domain-Driven Design is of a similar vein & quality.

Refactoring - another fantastic Martin Fowler book.

u/cmgg · 3 pointsr/funny

Reminds me of a joke:

a: "Alright the book is done, all that is left is to choose a cover".

b: "Dragons"

a: "B-but the book is about..."

b: "DRAGONS I SAID"

Book

u/HotRodLincoln · 3 pointsr/IWantToLearn

There are books specifically on language design, syntax trees, and unambiguous grammars.

The classic book on compiler design is "The Dragon Book". Compiler design matters because a statement in the language should mean exactly one thing, and a language should be able to be compiled efficiently. This is more difficult than it sounds.

Second, you need to understand language design, variable binding, etc. This is a topic of Programming Language Paradigms. I'll figure out a good book for this and edit to add it. The best book probably covers languages like Ada, Haskell, C, and Java and gives an overview of their design and reasons.

edit: The book for design is Concepts of Programming Languages 9th ed, by Robert W. Sebesta.

u/IjonTichy85 · 2 pointsr/compsci

I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture <- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy <- Only scratches the surface but is a good starting point. Also it's extremely informal for a math book. The MIT channel offers many more courses and is great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of CS. The materials I've posted roughly cover the first year of studying CS. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these thing BEFORE starting to learn the 'useful' parts of CS like sql,xml, design pattern etc.
Another great book that will broaden your understanding is this Bertrand Russell: Introduction to mathematical philosophy
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing, and 20 years from now we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in his life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/elder_george · 2 pointsr/programming

There's a very nice (although expensive) book on computer architecture called 'Structured Computer Organization' by Tanenbaum.

u/eldigg · 2 pointsr/webdev

In most cases your program's performance is limited by memory access speed rather than raw CPU power. Your CPU will sit there 99% of the time twiddling its thumbs waiting for memory access.

This is a pretty good book, imo, that talks about this (among other things):

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210
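
To make the "twiddling its thumbs" point concrete, here's a rough sketch (plain Python with made-up sizes, so only the relative gap matters, not the absolute times): both loops do identical work, but the random access pattern defeats the caches and prefetcher and typically runs noticeably slower. In a language closer to the metal the gap is even more dramatic.

```python
import random
import time

# Hypothetical sizes; the point is the relative gap, not the absolute numbers.
N = 5_000_000
data = list(range(N))

seq_idx = list(range(N))
rand_idx = seq_idx[:]
random.shuffle(rand_idx)

def walk(indices):
    start = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]  # same work either way; only the access pattern differs
    return time.perf_counter() - start

t_seq = walk(seq_idx)
t_rand = walk(rand_idx)
print(f"sequential: {t_seq:.2f}s   random: {t_rand:.2f}s")
```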

u/the_other_other_matt · 2 pointsr/linuxadmin

Are you using NRPE? If so, it needs to run on every machine you want to monitor. I recommend the book Building a Monitoring Infrastructure with Nagios (older, but still a good reference) https://www.amazon.com/dp/0132236931/ref=cm_sw_r_cp_apa_c8S6ybVPBSMDJ

u/frankenbeans · 2 pointsr/ECE

Amazing? These look like they were swiped from an overview lecture; there isn't any really good explanation in here. If this is all new to you, they might be a good starting point for learning some basic concepts and vocabulary of signal integrity.

Johnson's Black Magic book is the general reference for this. There are many other (well written) white papers out there. Ott and Bogatin have good books as well.

u/jayknow05 · 2 pointsr/AskElectronics

This is a good book on the subject. I would personally work with a 4-layer board with a GND and VCC layer. It sounds like you already have a bunch of layers as it is, so yes, I would recommend a VCC layer.

u/drtwist · 2 pointsr/AskReddit

Eric Bogatin's "Signal Integrity - Simplified", Howard Johnson's High Speed Digital Design, and Mike Peng Li's Jitter, Noise, and Signal Integrity at High-Speed are all fantastic reads if you are looking for dead-tree material. If you have a Safari subscription, you can read Bogatin's and Li's books for "free".

u/PlatinumX · 2 pointsr/AskElectronics

> Where did you take the formula for wire impedance from? Where could I read more about it?

This is a classic parallel-conductor transmission line; there are calculators online. As I mentioned before, the twists do not affect impedance.

You can read more about transmission lines, characteristic impedance, twisted pair, and signal integrity all over the web (and of course check Wikipedia). These are very large topics with a lot of details to learn.

If you want a book, I recommend High Speed Digital Design: A Handbook of Black Magic.
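
As a rough sketch of what those calculators compute, here's the textbook parallel-wire (twin-lead) formula; the dimensions below are made-up example values, and the formula ignores insulation geometry, so treat it as an approximation only:

```python
import math

def parallel_wire_z0(spacing, diameter, er=1.0):
    """Characteristic impedance (ohms) of a parallel-wire line.

    spacing:  center-to-center conductor spacing (any unit, must match diameter)
    diameter: conductor diameter
    er:       effective relative permittivity of the dielectric
    """
    # Z0 = (120 / sqrt(er)) * acosh(s / d); for s >> d this approaches 120 * ln(2s/d)
    return (120.0 / math.sqrt(er)) * math.acosh(spacing / diameter)

# Example: 24 AWG conductors (d ~ 0.51 mm) spaced 1 mm apart in air
print(round(parallel_wire_z0(1.0, 0.51), 1), "ohms")  # ~155 ohms
```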

u/tweakingforjesus · 2 pointsr/electronics

In addition to the per-IC decoupling cap already mentioned, I'd add a large electrolytic across VCC and GND near the connector on the right. You also might want to beef up the power and ground traces to reduce resistance to the individual ICs. Remember that your high-speed signal traces are going to induce the opposite current in parallel traces. A ground plane will help with this effect.

If you are really interested in digital PCB design, you might check out this book.

u/m85476585 · 2 pointsr/AskEngineers

I literally have a book called "A Handbook of Black Magic". It's a little old, but it's still one of the best books on the subject.

u/dangerbirds · 2 pointsr/ECE

High Speed Digital Design by Johnson and Graham is more focused on high-speed digital signals, but most of it applies to low speed as well. It has a ton of good "engineering rules of thumb" when it comes to doing PCB design.

u/velocicar · 2 pointsr/EngineeringStudents

Here's a book I use at work.

u/110100100_Blaze_It · 2 pointsr/learnprogramming

It's on my to-do list, but this is something that I want to get right. I don't think I could fully appreciate it without a more formal approach. I'm currently working through this, and will try my hand at the subject afterward. I will definitely check out Professor Might's insight on the subject, and I would gladly take up any other resources you might have to offer!

u/lordvadr · 2 pointsr/AskComputerScience

We wrote a compiler for one of my CS classes in college. The language was called YAPL (yet another programming language).

First things first: as others have mentioned, a compiler translates from one language to another... typically assembly... but it could be any other language. Our compiler compiled YAPL, which was a lot like Pascal, into C, which we then fed to the C compiler... which in turn was fed to the assembler. We actually wrote working programs in YAPL. For my final project, I wrote a functional--albeit VERY basic--web server.

With that said, it's quite a bit different for an interpreted language, but the biggest part for each is still the same. By far, the most complicated part of a compiler is the parser.

The parser is what reads a source code file and does whatever it's going to do with it. Entire bookshelves have been written on this subject, and PhDs given out on the matter, so parsing can be extremely complicated.

In a theoretical sense, higher-level languages abstract common or more complicated tasks away from the lower-level languages. For example, to a CPU, variables don't have sizes or names; neither do functions. On one hand, this greatly speeds up development because the code is far more understandable. On the other hand, certain tricks you can pull off in the lower-level languages (that can vastly improve performance) can be abstracted away. This trade-off is mostly considered acceptable. An extra $500 web server (or 100 for that matter) to handle some of the load is far less expensive than 10 extra $100,000-a-year x86 assembly developers to develop, optimize, and debug lower-level code.

So generally speaking, the parser looks for what are called tokens, which is why there are reserved words in languages. You can't name a variable int in C because int is a reserved word for a type. So when you name a variable, you're simply telling the compiler "when I reference this name again, I'm talking about the same variable." The compiler knows an int is 4 bytes, and so does the developer. When it makes it into assembly, it's just some 4 bytes somewhere in memory.

So the parser starts looking for keywords or symbols. When it sees int, the next thing it's going to expect is a label; if that label is followed by (, it knows it's a function, and if it's followed by ;, it's a variable--it's more complicated than this but you get the idea.

The parser builds a big structure in memory describing what's what and, essentially, the functionality. From there, either the interpreter goes through and interprets the language, or, for a compiler, that structure gets handed to what's called the emitter. The emitter is the function that spits out the assembly (or whatever other language) equivalent of whatever a = b + c; happens to be.
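
To illustrate the "followed by ( means function, followed by ; means variable" decision described above, here's a toy sketch (Python, deliberately oversimplified, and in no way YAPL's real parser):

```python
import re

# Tokenize a declaration into keywords, identifiers, and punctuation.
TOKEN = re.compile(r"int|[A-Za-z_]\w*|[();]")

def classify(decl):
    toks = TOKEN.findall(decl)
    if len(toks) >= 3 and toks[0] == "int":
        name, after = toks[1], toks[2]
        if after == "(":
            return f"function '{name}'"
        if after == ";":
            return f"variable '{name}' (an int: 4 bytes)"
    return "parse error"

print(classify("int count;"))   # variable 'count' (an int: 4 bytes)
print(classify("int main()"))   # function 'main'
```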

This is complicated, but if you take it in steps, it's not really that hard. This is the book we used. There's a much newer version out now. If I can find my copy, I'll give it to you if you pay shipping. PM me.

u/jhartwell · 2 pointsr/AskComputerScience

Compilers: Principles, Techniques and Tools, which is also referred to as The Dragon Book, is what I'm currently reading, and I'm enjoying it.

u/13ren · 2 pointsr/programming
u/johnweeder · 2 pointsr/learnprogramming

Yes. Do it. It's great to know, and useful occasionally - especially grammars. The dragon book is the only college text I've kept.

https://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886/ref=pd_lpo_sbs_14_img_0?_encoding=UTF8&psc=1&refRID=6GT8HPHEKPGJX9GGVMNR

u/Scaliwag · 2 pointsr/gamedev

Regarding sandboxing, at least in Lua, from what I know you can have minute control over what libs users have access to, and users can only import other libraries if you allow them to (by including a "library" that imports other libraries :-).

Perhaps you should look into formal languages and parser generators, so you can create more complex languages if you feel like it. Even if you build the parsers yourself, having the language specified, factorized, and so on helps a lot. The dragon book is a good choice, although it presupposes you know a bit about specifying a formal language IIRC. If you're a student (I know how it is!) then even the old dragon book is an excellent read, and it's very cheap.

u/BaconWraith · 2 pointsr/compsci

Cheers man! The Dragon Book is a great place to start, and there's always this, but mainly it's about facing each problem as you come to it and hoping for the best :P

u/oridb · 2 pointsr/learnprogramming

I've been playing around with writing a programming language and compiler in my spare time for a while now (shameless plug: http://eigenstate.org/myrddin.html; source: http://git.eigenstate.org/git/ori/mc.git). Lots of fun, and it can be as shallow or as deep as you want it to be.

Where are you with the calculator? Have you got a handle on tokenizing and parsing? Are you intending to use tools like lex and yacc, or do you want to do a recursive descent parser by hand? (Neither option is too hard; hand written is far easier to comprehend, but it doesn't give you any correctness guarantees)

The tutorials I'd suggest depend on exactly where you are and what you're trying to do. As far as books, the three that I would go with are, in order:

For basic recursive descent parsing:

u/vineetk · 2 pointsr/programming

Looks like it can be had for about $7 on amazon, including shipping. Surely you can scrounge up that much.

u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex & yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/phao · 2 pointsr/java

Well, some books can help:

  • There is the design patterns book (gang of four) => http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/
  • Another book on patterns, but targeting enterprise applications (i.e. information systems) => http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

    These books on patterns tend to be good at teaching a lot of what you're asking - largely because you've named some patterns in your question, but also because many patterns are about:

  • Identifying the need for something X to be changed without affecting another thing Y with which X is coupled; and
  • separating X from Y in a way that allows X to change independently from Y.

    There are several ways to say what I just did in there. You're allowing X to vary independently from Y. This makes X a parameter of Y, which is yet another way to say it. You're separating what is likely to change often (X) from what doesn't need to be affected by that change (Y).

    One benefit of this is that a reason to change X no longer affects Y, because X can be changed independently of Y. Another is that X can largely be understood without looking at Y. This is a core guiding rule in the separation of concerns principle: the concern X is separated from the concern Y. Now, a lot of activities you want to do with X can be performed independently of Y.

    You probably know all of this, so I'm sorry if this isn't very helpful. But just to finish, a classic example of this is a sorting function (the Y) and the comparison criterion (the X). Many people, in many projects, would like a change in the comparison criterion not to force a change in the sorting function. They're 2 separate concerns we'd like to deal with separately. Therefore, the comparison criterion, as commonly done today, is a parameter of sorting. Here the word "parameter" is being used both in the sense of a function parameter in the source code and in the more general sense of something being a parameter of something else, something that can be one of many and may change over time.
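
A minimal sketch of that separation (Python, hypothetical data): the sort routine (Y) is never touched, while the comparison criterion (X) is passed in as a parameter.

```python
records = [("alice", 34), ("bob", 28), ("carol", 41)]

# The sorting function stays fixed; only the criterion (the key parameter) changes.
by_name = sorted(records, key=lambda person: person[0])
by_age  = sorted(records, key=lambda person: person[1])

print(by_name)  # [('alice', 34), ('bob', 28), ('carol', 41)]
print(by_age)   # [('bob', 28), ('alice', 34), ('carol', 41)]
```
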
u/CodeTamarin · 2 pointsr/learnprogramming

Umm this might become specific...

So Clean Architecture got me thinking a lot about code structure at a macro level, which is really important for development. It's a good beginning to understanding architecture but definitely not a definitive reference.

I would likely suggest any book that helps you understand your tech stack at a deeper level. If your goal is to, say, be a lead developer: often the role of a lead is to support team members who are stuck.

So understanding the tech stack is important. For me, I got C# in a Nutshell. (I would suggest getting the Nutshell equivalent of your tech stack.) It's important and it lets you understand what's happening under the hood.

Learn Algorithms in 24 Hours was a nice little primer on data structures and algorithms. While by no means a "revolutionary" book, it was useful in understanding what structures solved which problems.

So here's my answer:

If you're trying to get better at your stack: get the Nutshell version of your language, or buy a book on the framework you're on.

If you're trying to just be a better computer scientist: I would learn Design Patterns (you noted this), then architecture (Clean Architecture and then Patterns of Enterprise Application Architecture), and then you're going to need to understand how to solve scaling problems - so Data Structures and Algorithms. This is hard because which book you want depends on your comfort with math.

For me, the biggest impact book, was design patterns. Then it was all the tech stack stuff that helped. Then finally, architecture books. The Martin Fowler architecture book was useful with design and thinking about how to handle saving data.

But it's really going to boil down to what you want to do with your career long term.

u/materialdesigner · 2 pointsr/webdev

Patterns of Enterprise Application Architecture sounds like a must read for you.

u/guifroes · 2 pointsr/learnprogramming

Interesting!

Looks to me like you can "feel" what good code looks like but you're not able to rationalise it enough to write it on your own.

Couple of suggestions:

When you see elegant code, ask yourself: why is it elegant? Is it because it's simple? Easy to understand? Try to recognise the desired attributes so you can reproduce them in your own code.

Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.

How familiar are you with unit testing and TDD? They should help you a lot to write better-designed code.

Some other resources:

u/SlobberGoat · 2 pointsr/java
u/st4rdr0id · 2 pointsr/androiddev

Hey, that is the million dollar question. But because software development is not a true engineering discipline, there is actually no definitive reference book on SW architecture. Certainly there are books talking about this, but they usually cover only some aspects and lack real application examples.

Notice that in iOS programming the system imposes a great part of the architecture, so these guys are usually less concerned. But in Android we have more freedom, and the API actually encourages really bad practices (thanks Google). Because of this we are all a bit lost. Nowadays layered architecture and MVP seems to be the most popular approach, but then again everybody produces a different implementation...

Specifically for Clean Architecture you should read its author, Robert C. Martin. AFAIK this is not covered in detail in his books. You can read this blog post and watch this video. Other designs usually coming up in conferences are the Onion Architecture and the Hexagonal Architecture. But make no mistake: there's no route map on how to implement any of those, and examples claiming to follow this or that approach are usually not written by the authors of the architecture.


For DDD there is a very good book by Scott Millet with actual examples. But this style is meant for large enterprise backend apps, and the author himself advises against using it in small apps. So I'd say it is overkill for Android, but of course you could reuse some concepts successfully.


There's also Software Architecture in Practice (3rd edition), but having read the 2nd edition I can tell you it is just smoke.


Probably the best book to date is Fowler's, but it is more a patterns compilation than an architecture guide.

u/vinnyvicious · 2 pointsr/gamedev

Have you ever heard of the open/closed principle? Or the single responsibility principle? Or the Liskov substitution principle? All three are being violated. That drastically reduces the maintainability and extensibility of your code. I can't swap serializers easily, I can't tweak or extend them without touching that huge class, and it's definitely not the responsibility of that class to know how to serialize A, B, C, D, E and the whole alphabet.

I highly recommend some literature on the subject if you're curious about it; it would drastically improve your approach to software architecture:

https://www.amazon.com/dp/0132350882

https://www.amazon.com/dp/0201485672

https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

http://cc2e.com/

https://www.amazon.com/dp/0321127420

u/vladmihalceacom · 2 pointsr/java

> Yes, the Native Query and access to Connection is always THE Hibernate's answer to all the lacking support of basic SQL features like Window Functions or being able to count aggregated results.

That's a very common misconception. Hibernate is not a replacement for SQL. It's an alternative to the JDBC API that implements the enterprise patterns stated by Martin Fowler in his book.

There are many alternatives to JPA or Hibernate. In fact, I'm also using jOOQ, and I like it a lot. I wrote about it, and I'm using it in my training and workshops as well.

There are things you can do in jOOQ that you can't do with Hibernate, and there are also things you can do with Hibernate that you can't do with jOOQ.

u/LXXXVI · 2 pointsr/learnprogramming

Thanks, I'm sure you will. It's just a question of getting that first success. Afterwards, it gets much easier, once you can point at a company and say "Their customers are using my code every day."

As for the interviews, I don't know, I'm honestly not the type to get nervous at interviews, either because I know my skill level is most likely too low and I take it as a learning experience, or because I know I can do it. I'd say that you should always write down all the interview questions you couldn't answer properly and afterwards google them extensively.

Besides, if you're from the US, you have a virtually unlimited pool of jobs to interview for. I live in a tiny European country that has 2 million people and probably somewhere in the range of 20 actual IT companies, so I had to be careful not to exhaust the pool too soon.

Funnily enough, right now, my CTO would kill for another even halfway competent nodejs developer with potential, but we literally can't find anyone.

Anyway, I'm nowhere near senior level, but I can already tell you that the architecture:language part is something your bootcamp got right. To that I would add a book my CTO gave me to read (I'm not finished yet myself, but it is a great book) - Patterns of Enterprise Application Architecture. Give it a look. I suspect that, without ever having tried to implement a piece of architecture like that, it won't make much sense beyond the theoretical, but I promise you, it's worth its weight in gold once you start building something more complex and have to decide how to actually do it.

u/slowfly1st · 2 pointsr/learnprogramming

Your foes are kids in their twenties with a degree that takes years to achieve, so this will be tough! But I think your age and your willingness to learn will help you a lot.


Other things to learn:

  • JDK - you should at least be aware of what APIs the JDK provides; better, have used them (https://docs.oracle.com/javase/8/docs/). I think (personal preference / experience) these are the minimum: JDBC, Serialization, Security, Date and Time, I/O, Networking, (Internationalization - I'm from a country with more than one official language), Math, Collections, Concurrency.
  • DBMS: How to create databases and how to access them via JDBC. (I like postgreSQL). Learn SQL.
  • Learn how to use an ORM Mapper. (I like jOOQ, I dislike JPA/hibernate)
  • Requirements Engineering. I think without someone who has the requirements you can't really practice it, but the theory should be present. It's an essential part of software development: get the customer's requirements and bring them to paper. Bad RE can lead to tears.
  • Writing Unit Tests / TDD. Having working code means the work is 50% done - book recommendation: Growing Object-Oriented Software, Guided by Tests
  • CI/CD (Continuous Integration / Delivery) - book recommendation: Continuous Delivery.
  • Read Clean Code (mandatory!)
  • Read Design Patterns (also mandatory!)
  • (Read Patterns of Enterprise Application Architecture (bit outdated, I think it's probably a thing you should read later, but I still love it!))
  • Get familiar with a build tool, such as maven or gradle.


    If there's one framework to look at, it would be Spring: spring.io provides dozens of frameworks - for webservices, backends, websites, and so on - but mainly their core technology for dependency injection.


    (edit: other important things)
u/ladywanking · 2 pointsr/learnprogramming

You are asking a good question.

Wouldn't doing a separate class for each of the use cases you described be repeating yourself?

I would read about DTO and see how it goes.
A good book about this is this.

u/cderwin15 · 2 pointsr/compsci

For what it's worth, I'm in the midst of working through the dragon book and would highly recommend it. Unfortunately I don't know of any online courses you could take for credit.

u/0xf3e · 2 pointsr/programming

For anyone who wants to learn more about compilers and loves reading books, the so-called Dragon Book is highly recommended reading on this topic: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/Thrawed · 2 pointsr/raspberry_pi

GPIO. There are plenty of user guides floating around, including this one co-written by the creator.

u/case-o-nuts · 2 pointsr/programming

That's a good question, actually. I picked it up in bits and pieces over the years. I probably started to pick it up when I tried to implement an object-oriented programming system in C. The dragon book was also a great help in figuring this sort of stuff out.

Another great way to learn is to write simple test programs in C or C++, and see what they compile down to with GCC. Using '-O' I find gives me the most readable "direct" assembly.

http://asm.sourceforge.net/howto/Assembly-HOWTO.html

Also, if you have any specific questions, possibly a tutorial or two... well, it's time that I started putting together a website.

u/echelonIV · 2 pointsr/gamedev

I ordered these for our company library, based on recommendations for/from other programmers (of all levels).

ISBN | Title
---|---
978-1568814247 | Real-time Rendering
0321486811 | Compilers: Principles, Techniques, and Tools (2nd Edition)
1482250926 or 0123742978 | Essential Mathematics for Games and Interactive Applications, Third Edition
978-1482264616 | GPU Pro 6: Advanced Rendering Techniques
1466560010 | Game Engine Architecture, Second Edition
978-1482243567 | Multithreading for Visual Effects
978-0123750792 | Physically Based Rendering: From Theory To Implementation

u/Caret · 2 pointsr/hardware

As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented so to speak).

They are not EE textbooks (as far as I know) but they are text books nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.

Hope that helps!

u/Yulfy · 2 pointsr/AskProgramming

If you mean writing an interpreter or compiler, then yes. The iconic book for learning how to build languages is called Compilers, Principles, Techniques and Tools. It's often referred to as 'The Dragon Book'. It's pretty heavy reading but contains everything you need to know about building a language!

If you're looking for something more implementation driven, I recently read a book about building a programming language in Go. The principles are the same, just with a different language. The book was called Writing an Interpreter in Go. It's a much lighter read and details the construction of an interpreter from scratch!

u/EngrKeith · 2 pointsr/FPGA

Pong P. Chu's "Verilog/VHDL by Example":

http://www.amazon.com/FPGA-Prototyping-Verilog-Examples-Spartan-3/dp/0470185325/ref=sr_1_2?ie=UTF8&qid=1412004641&sr=8-2&keywords=verilog+by+example

Really good, easy read:

http://www.amazon.com/Bebop-Boolean-Boogie-Third-Unconventional/dp/1856175073/ref=sr_1_3?ie=UTF8&qid=1412004683&sr=8-3&keywords=maxfield+clive

This is an older book, but it's pretty cool because it has Verilog on one page and VHDL on the other, in addition to showing you how some tools might synthesize it, meaning you get a schematic of how it was implemented. Very good for learning what hardware gets instantiated by your HDL:

http://www.amazon.com/Hdl-Chip-Design-Synthesizing-Simulating/dp/0965193438/ref=cm_cr_pr_product_top

u/FPFan · 2 pointsr/FPGA

Actually, I worded it so there isn't an assumption about the languages being discussed, just using Language A and Language B as examples to show the flaw in the methodology used. It may or may not be reality, I don't know, but the article drawing the conclusions doesn't show that the data it used for the conclusion is valid either. It could be just the opposite, and Verilog requires more googling; in that case, the data would be just as flawed.

You are right, they may require similar amounts of googling; Verilog may require much, much more, or it may be VHDL that requires more. However, to draw a conclusion from the data (the one in the article being that Google search results represent the number of users and can be counted on to make a fairly significant decision), you need to show that the data represents what is claimed.

With respect to the decision on what language to learn first, either will work; I learned VHDL before Verilog. The person should use real data based on their needs to make the decision. Look at the industries, schools, etc. that you are aiming to work with/in. If you are learning FPGAs as a hobby, what sources are you using to learn - are they VHDL or Verilog oriented? I would also recommend a good book that gives examples in both languages like this; it is an older book, so I would look for a newer one, but it gives examples in both languages side by side. This gives a person exposure to the language they are not using while learning one, making it much easier to learn the second later. Like most programming problems, the language really doesn't matter if you get good fundamentals; switching languages becomes a minor issue.

TL;DR: Learn the language that makes sense for your situation first, but don't trust the data from the article to make this decision.

u/DCoder1337 · 2 pointsr/PHP

If you are working on a small project with three tables, you don't need to worry too much. The problems arise when you have a large system where good architecture and appropriate patterns become really important.

With AR, the model class is responsible for both a) modelling a domain object and b) saving it to the DB and loading it back, which means the model class gets mixed with methods like beforeSave, afterSave and afterFind to turn DB column values into your domain values and back (e.g. timestamps and datetimes converted to DateTime, complete with some timezone). This breaks the Single Responsibility Principle. Note: this is not a big deal for a small solution where your business logic is simple, but it tends to get worse as the system grows.

If an attribute holds something more complex than a string/number (e.g. a DateTime, a custom Money class, etc.), you have to manually (un)stringify it using beforeSave and afterSave (or stow the original value away somewhere in an additional private property).

  • Are you sure those conversions are lossless (of course they should be, but are you certain, especially with datetimes/timezones)? I've been bitten by (my own fault) fun bugs before where a DateTime was constructed with a local timezone, then (correctly, by design) saved and unstringified into a DateTime with UTC, and suddenly that entity's publication date was different than it was before... (see the sketch after this list)
  • What happens if there's an exception thrown between beforeSave and afterSave - what state is your model left in? Personally, I am not fond of a property that can be both a string and an object at different times.

    See also: Martin Fowler - Patterns of Enterprise Application Architecture.
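
A minimal sketch of that round-trip bug (Python rather than PHP; the save/load steps are hypothetical stand-ins for beforeSave/afterFind hooks):

```python
from datetime import datetime, timezone, timedelta

# Publication date created in local time (UTC+2)...
local = datetime(2020, 6, 1, 9, 0, tzinfo=timezone(timedelta(hours=2)))

# ...stringified for the DB column (timezone silently dropped), then read back
# by an afterFind-style hook that assumes the stored value was UTC.
stored = local.strftime("%Y-%m-%d %H:%M:%S")
loaded = datetime.strptime(stored, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)

print(local.isoformat())    # 2020-06-01T09:00:00+02:00
print(loaded.isoformat())   # 2020-06-01T09:00:00+00:00 -- a different instant in time
print(local == loaded)      # False: the "lossless" round trip shifted it by two hours
```
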
u/cantstopthemoonlight · 2 pointsr/learnprogramming

Compilers-Principles-Techniques-Tools is considered THE definitive book on the subject. It's old, but in a fine wine kind of way.

u/PinPinIre · 1 pointr/learnprogramming

It largely depends on which Computer Science degree you are going to do. There can be some that focus heavily on software and very little on hardware, and some that strike a nice balance between the two. If the degree is going to focus on hardware, I would recommend reading up on the underlying logic of a computer and then reading this book (Inside the Machine). ITM isn't a very technical book (I would label it as the computer science equivalent of popular science), but it gives a nice, clear overview of what happens in a processor.

When it comes to programming, I would recommend starting with Java and Eclipse. Java gets quite a bit of hate but for a newcomer, I think Java would be easier to grasp than the likes of C/C++. C/C++ are nice languages but a newcomer may find their error messages a little bit obscure and may get confused with the nitty-gritty nuances of the languages.

Though the one thing you should realise is that programming is a skill that isn't confined to one language. If you understand the basic concepts of recursion, arrays, classes, generics/templates, inheritance, etc., you can apply this knowledge to almost any language. Ideally I would recommend two books on programming: Algorithmics and Introduction to Algorithms. Algorithmics is another book I would label as the CS equivalent of popular science, but the early chapters give a nice overview of exactly what algorithms actually are. Introduction to Algorithms is a more technical book that I would recommend to someone once they know how to program and want a deeper understanding of algorithms.

The rest is personal preference, personally I prefer to use a Unix machine with Sublime Text 2 and the command line. Some will try to convince you to use Vim or Emacs but you should just find whichever you are most comfortable with.

u/ry_binaris · 1 pointr/programming

Inside the machine

Probably my favorite book of all time.

u/lawanda123 · 1 pointr/india

Hi,

Can somebody recommend good books/sources for the following:

1. Advanced Design Patterns - OOP + Functional
2. Refactoring
3. Big data analytics and ML algorithms
4. Any fast-track course/refresher for JS + Angular (I'm looking for something that has the finer details; I've done JS in the past but I've forgotten most of it)

Also, I've picked up some of Martin Fowler's books for now, but would like more perspective:

https://www.csie.ntu.edu.tw/~r95004/Refactoring_improving_the_design_of_existing_code.pdf

http://www.amazon.in/Enterprise-Application-Architecture-Addison-Wesley-Signature-ebook/dp/B008OHVDFM

Would highly recommend these for anyone interested..

u/farmvilleduck · 1 pointr/electronics

For understanding how computers work, there's a good book by the co-founder of Ars Technica.

http://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593271042

u/AntiCompositeNumber · 1 pointr/raspberry_pi

The Raspberry Pi User Guide is pretty good. Not too technical, but provides good info. 11 might be at the low end, but the book should be useful.

u/Airules · 1 pointr/raspberry_pi

I really like the user guide book:

Raspberry Pi User Guide https://www.amazon.co.uk/dp/1119264367/ref=cm_sw_r_cp_tai_x1CDzbAXAC2TN

It offers some basic info to get you up and running and helps a lot with your first steps. Making and breaking the Pi is part of the fun! I'd recommend having a backup system (I use pibaker on my Mac to create backups whenever my tinkering leads to a success, so if/when I break it again I can roll back to my last working version!)

u/balefrost · 1 pointr/AskProgramming

OK, a few things:

It looks like you're trying to build a shift/reduce parser, which is a form of an LR parser, for your language. LR parsers try to reduce symbols into more abstract terms as soon as possible. To do this, an LR parser "remembers" all the possible reductions that it's pursuing, and as soon as it sees the input symbols that correspond to a specific reduction, it will perform that reduction. This is called "handle finding".

> If I am correct, my Automaton is a DFA?

When the parser is pursuing a reduction, it's looking for sequences of symbols that match the right-hand sides of the relevant (to our current parse state) productions in our grammar. Since the right-hand sides of all the productions in a grammar are simple sequences, all the handle finding work can be done by a DFA. Yes, the handle recognizer of your parser is a DFA. But keep in mind that it needs to be combined with other parts to make a full parser, and your actual grammar can't be recognized with just a DFA.

In particular, you've shown the ACTION table for a shift/reduce parser. It determines what to do when you encounter a symbol in the input stream. But a shift/reduce parser typically needs a second table as well - the GOTO table - that determines what to do after a reduction has taken place.

One other thing that's worth mentioning: you've expressed your ACTION table as a plain DFA transition table. That's not necessarily wrong, but it's not commonly done that way. Instead of reducing when you reach a certain state, it's common to instead attach an action - either 'shift' or 'reduce' ('accept') - to each transition itself. So in a shift/reduce parser, your table might look more like this:

| [ | ] | < | > | id | / | attr
----+-----+-----+-----+-----+------+-----+--------
0 | S1 | | S4 | | | |
1 | | | | | S2 | | R3 : Reduce Tag -> [ id ]
2 | | R3 | | | | | R7 : Reduce Tag -> < id ??? / >
4 | | | | | S5 | S10 | R9 : Reduce Tag -> < id ??? >
5 | | | | R9 | | S6 | S8 R12 : Reduce Tag -> < / id >
6 | | | | R7 | | |
8 | | | | R9 | | S6 | S8
10 | | | | | S11 | |
11 | | | | R12 | | |

Note that R7 and R9 aren't well-formed, since multiple sequences of input tokens might cause you to reach these actions. While it would be possible to construct a shift/reduce parser this way, it's not commonly done. Typically, the DFA to recognize handles is an acyclic graph, but yours has a self-transition in state 8.

> What would be the best way of implementing this automaton in C++? Do I really have to make a huge array?

In general, yes, you need a big array (or, as suggested before, two big arrays). But you can use any space-saving technique you want. For example, since most entries in the ACTION table are invalid, one could represent that data with a sparse array data structure. Also, both The Dragon Book and Cooper and Torczon briefly cover parser-specific ways to compress those tables. For example, notice that rows 5 and 8 in your example have the same entries. Most real grammars have multiple instances of identical rows, so factoring out this commonality can save enough space that the extra complexity is worth it.
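
As a quick illustration of the sparse-array idea (sketched in Python for brevity, though the question was about C++, where a hash map keyed on a packed state/symbol pair would play the same role), only the valid entries are stored:

```python
# Sparse ACTION table: store only the valid (state, symbol) entries.
# The three entries below are taken from the table above.
ACTION = {
    (0, "["):  ("shift", 1),
    (1, "id"): ("shift", 2),
    (2, "]"):  ("reduce", "Tag -> [ id ]"),
}

def action(state, symbol):
    # Anything not present is a syntax error and costs no storage.
    return ACTION.get((state, symbol), ("error", None))

print(action(0, "["))   # ('shift', 1)
print(action(0, "]"))   # ('error', None)
```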

---

I'm a little surprised that you're building a parser like this by hand, though. Typically people do one of two things:

  1. Build, by hand, a modified LL(1) recursive descent parser (or variant, like a packrat parser)
  2. Build, using a tool like YACC or Bison, a LR(1) shift/reduce parser

    You're sort of doing a mix of the two, which means you have the downsides of both approaches. You need to track all the states and transitions by hand, instead of relying on tools to automate that process, yet you don't get the flexibility of a hand-coded recursive descent parser.

    If you're doing this for education's sake, then by all means proceed. I'd highly encourage you to pick up a book on parsing; I think Cooper and Torczon is a great source. But if you just want a parser that works, I'd definitely recommend using a tool or using a more direct approach, like recursive-descent.
u/xiongchiamiov · 1 pointr/webdev

The ones I see most often are The Art of Scalability, Building Scalable Websites, and Scalable Internet Architectures, although I can't say anything personally about them. There's a question on SO that would be useful if it wasn't closed. And there's often good stuff on the High Scalability blog.

I'm not aware of any recent books, though, no. I've started a book about that and some other stuff, but besides not being finished, it's probably targeting lower traffic than you're looking for.

u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares - it's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and a Unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you're obviously successful at running and maintaining Unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning how networking actually functions at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works - from electricity, to EE principles, through assembly to compilers - may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/brucehoult · 1 pointr/ComputerEngineering

Welcome!

You need two books:

https://www.amazon.com/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757

Get the original MIPS or the later ARM version if you prefer - they're absolutely fine, and the principles you learn on one apply to everything - but the RISC-V one is the newest and is the only one that you're actually legally allowed to make an implementation of at home and distribute, put on GitHub, etc.

But of course designing and making your own 16 bit ISA is huge fun, so I definitely recommend that too!

Once you've digested all that, their other book is more advanced. But the first one will get you a long way. This next one is the absolute bible of real computer architects and hardware designers.

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

That's by these guys, who originally invented the RISC-I and MIPS processors in the early 80s, invented the term "RISC" (and also RAID, btw). They recently received the Turing award for their lifetime efforts:

https://www.youtube.com/watch?v=3LVeEjsn8Ts

Join comp.arch on usenet / google groups. There are lots of actual working or retired computer architects there, and they're helpful to energetic students and amateurs designing their own toy stuff.

u/throwdemawaaay · 1 pointr/AskComputerScience

https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

After that:

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

These authors are the foremost authorities in the field. The second book is *the* textbook for computer architecture. These are the people that invented RISC.

u/exp11235 · 1 pointr/buildapc

The formal name for this field is "computer architecture." The most popular textbook by far is Patterson and Hennessy, and it's pretty easy to find materials from college courses posted online, e.g. MIT OpenCourseWare, UC Berkeley.

For something less likely to put you to sleep, Ben Eater has a great YouTube channel where he explains computer architecture from a practical angle. He's got a great series where he builds a simple 8-bit computer from scratch, explaining all the pieces along the way.

u/saitt04 · 1 pointr/compsci

[This one?](http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210)

I just started my computer architecture class and this is one of the books they recommended. I think I'll try to get this one if it is that good.

u/cschaef66 · 1 pointr/learnprogramming

Structured Computer Organization by Andrew Tanenbaum:
https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0131485210
One of the best books I've read.

u/smith7018 · 1 pointr/jailbreak

It's hard to say because that's a more advanced section of computer science called architecture and I learned it in a college setting. With that being said, I've heard good things about this textbook. Hopefully whichever book you pick up on ARM assembly will have a few chapters going over how the processor functions.

Good luck! :)

u/oldsecondhand · 1 pointr/technology

I'd check out these two books from the local library and read the first 2-3 chapters. They might contain more than what you need, but they are pretty well-written books and don't assume a lot of previous knowledge.

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210

http://www.amazon.com/Computer-Networks-5th-Andrew-Tanenbaum/dp/0132126958/ref=la_B000AQ1UBW_sp-atf_title_1_1?s=books&ie=UTF8&qid=1376126566&sr=1-1

Or you could just check out your network settings and search for the terms that you encounter (IP address, DNS, DHCP, gateway, proxy, router, firewall)

u/zendruid · 1 pointr/learnprogramming

I study in South Africa and we did this book in our first year, with Chapters 1-2 in the first semester and 3-4 in the second. It had very little on algorithms; we did basic recursion, and for the most part that was it.

We then did this book in the 3rd semester, which covered more algorithms in detail (DFS/BFS/MergeSort/QuickSort etc). And I'm now in the 4th semester where we have a ton of different books including this, and this.

u/J_J_Rousseau0 · 1 pointr/learnprogramming

No, not that I know of. A lot of our assembly programming will be done using a MIPS virtual machine. [This](http://www.amazon.com/Introduction-RISC-Assembly-Language-Programming/dp/0201398281) is one of the books we need for the class. The [other](http://www.amazon.com/Essentials-Computer-Architecture-Douglas-Comer/dp/0131491792) is just a general computer architecture textbook.

u/clvx · 1 pointr/sysadmin

Nagios is really easy to manage. However, if you find yourself struggling with it, you should read at least the official documentation[1] or any of these books[2,3]... in fact, I encourage you to read the books.

[1] http://nagios.sourceforge.net/docs/nagioscore/4/en/toc.html
[2] http://www.amazon.com/Nagios-Network-Monitoring-Wolfgang-Barth/dp/1593271794
[3] http://www.amazon.com/Building-Monitoring-Infrastructure-Nagios-Josephsen/dp/0132236931

u/squaganaga · 1 pointr/ECE

I haven't yet designed boards with EMC and RF in mind. I've seen recommendations for the high-speed digital design books thrown around, though.

u/VectorPotential · 1 pointr/AskElectronics

What are you trying to measure? Your pulse rise time will do some funny things to your results if you're not careful.

If you can obtain a copy of High Speed Digital Design, the author describes several test jigs for such tests.

u/somekindofdevil · 1 pointr/AskElectronics

Almost every PCB/EDA package does length matching automatically, so you don't need to worry about that. If you wanna know how the software does it, it's more of a mathematical problem. I think they are using parametric curves like Bezier curves. You can calculate the length of a Bezier curve easily, so you can match them.

https://en.wikipedia.org/wiki/B%C3%A9zier_curve
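
A rough sketch of one way to do it (Python, made-up control points): sample the cubic Bezier at many parameter values and sum the chord lengths; more samples give a tighter approximation of the true arc length.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u * u * t * b + 3 * u * t * t * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def bezier_length(p0, p1, p2, p3, steps=1000):
    """Approximate arc length by summing small chords along the curve."""
    length, prev = 0.0, cubic_bezier(p0, p1, p2, p3, 0.0)
    for i in range(1, steps + 1):
        cur = cubic_bezier(p0, p1, p2, p3, i / steps)
        length += sum((a - b) ** 2 for a, b in zip(cur, prev)) ** 0.5
        prev = cur
    return length

# One serpentine "bump" of a length-matched trace, dimensions in mm (made up)
print(round(bezier_length((0, 0), (0, 2), (4, 2), (4, 0)), 3))
```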

If you wanna know more about high speed pcb design, I recommend this book.
https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/Beggar876 · 1 pointr/AskElectronics

Find/download/buy this book: High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson: https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241


Scan it cover to cover. It will pay for itself the first time you save a PCB re-spin because of something you saw in it. It has for me.

u/kevlarcoated · 1 pointr/PrintedCircuitBoard

The books most often referenced by presenters at PCB design conferences are:
Right the First Time by Lee Ritchie: http://www.thehighspeeddesignbook.com/
High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson: https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241

Note that A Handbook of Black Magic reads like a textbook: it is very long and very boring. The subject of PCB design is complicated and requires an in-depth understanding of the physics, because just knowing the rules isn't enough to convince other engineers that something is the right way to do it. More importantly, in my experience a PCB design is always the least bad solution: you have to understand when you can break the rules, what the implications will be, and whether the trade-off is acceptable.

u/reddit_user33 · 1 pointr/Electrical_Engineers

Different applications require different rules to abide by to a certain extent.

High-Speed Digital Design by Howard Johnson has given me great insight into the world of high-speed electronics.

u/khafra · 1 pointr/DebateReligion

Much of your thinking seems to be based on a confusion of levels. If you knew more specifically how the firing together of neurons strengthens the probability that they'll fire together in the future, or if you'd examined a program simulating physics, you wouldn't be using confusion as a building block for arguments.

For instance, you would not be as confused right here if you were a systems developer instead of a philosopher; one read-through of the Dragon Book would clear everything right up. I'll try to summarize, but please understand this is not rigorous:

Your mind is running the algorithm "Step 1: Move to front of house. Step 2: Move to back of house. Step 3: Go to Step 1." Your mind is something your brain does. Your brain is implemented on physics. Exactly like the boulder.

The most legitimate question related to this post is that of substrate. Note: I do not agree with everything in this essay, but it presents the problem better than writings on "dust theory" (unless you're willing to read the whole Greg Egan novel Permutation City).

u/name_censored_ · 1 pointr/learnprogramming

> Do you know of a book or a website that teaches useful optimization techniques?

I'm only an enthusiast; I've never needed really optimised code (truth be told, most of what I do day to day is quick-and-dirty, appallingly inefficient scripts, because it "needs to be done yesterday"), so I can't give you a canonical list, but here's what I do know:

For books, there's this /r/compsci reddit thread from a while ago. Something on compilers like The Dragon Book might be your best bet, especially the optimisation chapter. And obviously jotux's "How Computers Do Maths" - though, never having even flicked through it, I can't say whether it's any good.

You could try your luck in /r/ReverseEngineering (or the quieter /r/asm and /r/compilers), there are a lot of low-level guys there who'd know a lot more than me. You could also try /r/compsci or /r/algorithms, although they'd be more useful for algorithms than for optimisation. And of course, /r/quantfinance.

u/Zonr_0 · 1 pointr/news

Really? Unless you're in a specific special topics course, the principles for most computer science haven't changed much. This book was the assigned reading for my compilers course and it was written in the mid 80s. Similarly, the core algorithms and data structures in the standard CS education haven't changed much (except for a move away from teaching bubble sort as an intro sort and into insertion sort).

But maybe that was just my education.

u/alanwj · 1 pointr/learnprogramming

It sort of depends on what you are trying to do.

I can't really tell from the name what that book is going to cover, but I expect that most books on programming language theory are going to start with things like lambda calculus, and go into type theory, etc, etc. If you are trying to learn the theoretical underpinnings of programming languages then this is great!

However, in my opinion a more practical place to start is with learning how to implement a programming language. That is, how to write a compiler. For that there is a whole separate set of theory (regular expressions, grammars, automata, etc) that you need to learn. The standard text for this is "the dragon book".

u/ErstwhileRockstar · 1 pointr/programming

Steve Vinoski - heard that name before.

u/robertcrowther · 1 pointr/linux
u/vplatt · 1 pointr/java

I have this as well, but don't really have any remarks for you. That said, maybe you should look through some of the reviews for it on Amazon or the like. The reviews there seem pretty authentic.

https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

u/SamHennessy · 1 pointr/PHP

When I said "I can see how maybe they could be useless to you.", that's because I instantly know what kind of programmer you were. You're a low level guy.

I have a copy of "Algorithms in a Nutshell" (http://www.amazon.com/Algorithms-Nutshell-In-OReilly/dp/059651624X) but I never finished it. My favorite programming book may be "Patterns of Enterprise Application Architecture" (http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420). Neither of these books is language-specific, but I don't think they could be further apart in every way. Both are very valuable, and I appreciate that they both exist.

There are a good number of reasons to maximize your use of the built-in PHP functions (http://webandphp.com/5reasonstomaximizeyouruseofPHP%E2%80%99sbuiltinfeatures). My book is an attempt to come up with a system that will help you learn all of the built-in PHP functions by giving a realistic use case that could be applied in your everyday work.

For a PHP programmer, it is much more useful to know what functions PHP has for array sorting than it is to know how to implement array sorting in PHP code.

u/xnoise · 1 pointr/PHP

There are a ton of books, but I guess the main question is: what are you interested in, concepts or examples? Many strong conceptual books use examples from Java, C++ and other languages; very few of them use PHP. If you have the ability to comprehend other languages, then:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 - definitely a must-read. Beware not to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in that book.

http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 - has already been mentioned, is related directly to the above mentioned one, so should be easier to grasp.

http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 - one of the most amazing books I read some time ago. Needs a lot of time and good prior knowledge.

http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 - another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.

u/bitcycle · 1 pointr/Database
  1. Use a relational data store first: MySQL/PostgreSQL/M$ SQL Server.
  2. Put CRUD operations behind a web service.
  3. Instead of arbitrary key/value pairs, try to get as specific as possible about the data (instead of storing everything as strings). Being more specific will usually help things be more performant (see the sketch below).
  4. Build the application on top of the service.
  5. Scale to the # of users you want to be able to support
  6. At this point, if you need to move part of the data into a non-relational data store, you can.

    I also recommend reading this book: Patterns of Enterprise Application Architecture
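
As a minimal sketch of point 3 above (all names and fields are hypothetical), compare arbitrary key/value strings with a specifically typed record. The typed form lets the code, and ultimately the schema, validate ranges, index numeric columns, and compare timestamps natively:

    import java.time.Instant;
    import java.util.Map;

    public class TypedVsKeyValue {
        // Specific types instead of strings-for-everything.
        record Reading(int sensorId, double celsius, Instant takenAt) {}

        public static void main(String[] args) {
            // Arbitrary key/value pairs: nothing can be validated or
            // compared without parsing the strings first.
            Map<String, String> kv = Map.of(
                    "sensor_id", "42",
                    "celsius", "21.5",
                    "taken_at", "2021-06-01T12:00:00Z");

            Reading typed = new Reading(42, 21.5,
                    Instant.parse("2021-06-01T12:00:00Z"));

            System.out.println(kv);
            System.out.println(typed);
        }
    }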
u/celtric · 1 pointr/PHP

You can read books with code examples written in Java or C#; the syntax is very similar and the principles apply in the same way (PoEAA comes to mind).

u/postmodern · 1 pointr/programming
  • ActiveRecord is a pattern according to Patterns of Enterprise Application Architecture. It's definitely the easiest pattern to implement, and thus the most popular amongst ORMs. Hence the common misconception that ORM == ActiveRecord.
  • Your own description of the DataMapper pattern would imply that DataMapper does in fact differ from ActiveRecord. :) ActiveRecord has no concept of mapping, nor separation between the schema representation and the model. ActiveRecord simply instantiates Models from rows (a minimal sketch of the difference follows after this list).
  • Note: I am not quoting Martin Fowler as an Appeal to Authority, but simply because Martin Fowler wrote Patterns of Enterprise Application Architecture (PoEAA), in which the ActiveRecord and DataMapper patterns are explained. :) See also the Wikipedia entry for ActiveRecord, which lists Martin Fowler as the creator.
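
A minimal Java sketch of that difference (class names hypothetical, SQL stubbed out as comments): in ActiveRecord the model persists itself, while in DataMapper a separate mapper owns the translation between rows and a plain model.

    // ActiveRecord style: the model knows its own persistence.
    class UserRecord {
        long id;
        String name;

        void save() { /* UPDATE users SET name = ? WHERE id = ? */ }
        static UserRecord find(long id) { /* SELECT ... */ return new UserRecord(); }
    }

    // DataMapper style: the model is plain data...
    class User {
        long id;
        String name; // no persistence methods here
    }

    // ...and a separate mapper owns the row-to-object translation.
    class UserMapper {
        User find(long id) { /* SELECT, then map columns to fields */ return new User(); }
        void insert(User u) { /* map fields to columns, then INSERT */ }
        void update(User u) { /* map fields to columns, then UPDATE */ }
    }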
u/cythrawll · 1 pointr/PHP

Honestly, I haven't read a "PHP book" in ages, so I am a very bad source for critiques; the majority of the ones I have come across are painfully outdated and some are outright inaccurate. My suggestion for learning PHP programming better is to try books about programming in general. Books on object orientation and patterns, like GoF (http://en.wikipedia.org/wiki/Design_Patterns) or PoEAA (http://www.amazon.com/Enterprise-Application-Architecture-Addison-Wesley-Signature/dp/0321127420), are great for learning object-oriented principles.

But those will only help somewhat. What really helped me become a better PHP programmer was studying other languages, then studying their web frameworks, then taking what I learned back to PHP. Find out why one aspect is pretty common in language X's frameworks but not in PHP frameworks. How do other languages' frameworks solve the dependency issue, etc.? Some languages I suggest learning are other mainstream web languages: Python for its typing and how it compares to PHP, Ruby for how mixins relate to PHP traits, and Java, since there are quite a few aspects that PHP borrowed from Java in its OO design.

u/ndanger · 1 pointr/compsci

Upvote for Domain-Driven Design, it's a great book. Depending on the size of the system, Martin Fowler's PoEAA might also be helpful.

Also what dethswatch said: what's the audience & scope; i.e. what's in the previous document? If you're presenting three architectures, you probably need enough detail that people can choose between them. That means knowing how well each will address the goals, some estimate of implementation effort (time & cost), limitations, future-proofing, etc.

Finally, IMHO, this really isn't computer science. You might have better luck asking in /r/programming/ or the new r/SWArchitecture/

u/mrferos · 1 pointr/PHP

When you get to a project larger than a small business or personal website, it's less about the language and more about imposing a structure that makes sense, backed by a data model that's thought out and performant.

I recommend these books:

http://www.amazon.com/High-Performance-MySQL-Optimization-Replication/dp/1449314287/ref=sr_1_1?ie=UTF8&qid=1425831227&sr=8-1&keywords=high+performance+mysql
http://www.amazon.com/gp/product/0321127420/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1

u/jasonlotito · 1 pointr/PHP

First, ignore the code examples on the page. They are fairly bad. The reason you don't grok OOP is because of examples like these. What you want is a good solid book.

Patterns of Enterprise Application Architecture

PoEAA sounds imposing, but it's really good. I highly recommend it.

I also recently wrote up an article on Data Mapping that you might be interested in reading. If you have any questions, you can ask. Basically, for me, the moment I finally grokked OOP was when I was reading an article on observers and how that pattern works, and I started to get it. I'm still learning. I'm only now getting over my need for getters and setters (you don't need them, they are bad, stop using them).
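
For readers in the same spot, a minimal Java sketch of the observer pattern mentioned above (names are hypothetical): the subject publishes events to whoever subscribed, without knowing anything concrete about them.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    public class ObserverDemo {
        // The subject keeps a list of observers and notifies them all.
        static class OrderEvents {
            private final List<Consumer<String>> observers = new ArrayList<>();
            void subscribe(Consumer<String> observer) { observers.add(observer); }
            void publish(String event) { observers.forEach(o -> o.accept(event)); }
        }

        public static void main(String[] args) {
            OrderEvents events = new OrderEvents();
            events.subscribe(e -> System.out.println("mailer: " + e));
            events.subscribe(e -> System.out.println("audit log: " + e));
            // Both observers react; neither knows the other exists.
            events.publish("order #7 shipped");
        }
    }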

But yeah, the reason most OOP articles are bad is that most people using objects are merely using them as namespaces for functions and variables. If you have any questions, let me know.

u/pjmlp · 1 pointr/programming

> Those are interpreted (JIT - just in time compiled)

You are contradicting yourself.

Plus there are also native compilers for Java and .NET languages.

Just a few examples:

  • OS/400 compiles to native code at installation time

  • Android ART compiles to native code at installation time

  • Websphere Real Time JVM has an AOT compiler

  • Aonix JVM has an AOT compiler

  • Mono -aot, NGEN and .NET Native are all compilers generating native code

    Given that C and C++:

  • have the CINT interpreter

  • on OS/400 they compile to bytecode

  • on Windows CE they could be compiled to bytecode

  • were compiled to bytecode with the TenDRA compilers

    Does that make C and C++ interpreted?

    Of course not, because as any compiler design student knows, language != implementation.

    Bytecodes are just a form of architecture independent machine instructions, by no means do they require interpretation vs further compilation to native code.

    Since the Summer is still not over, here is some reading:

    Compilers: Principles, Techniques, and Tools (2nd Edition)
u/OhYourFuckingGod · 1 pointr/funny

She's trying to hide her disappointment. I'm pretty sure this is the one she wanted.

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my bookshelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Computer Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/cmtedouglas · 1 pointr/learnprogramming

Well, the common source out there that I can't avoid recommending is the Dragon Book:

http://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/johannes1971 · 1 pointr/cpp

European dragons have a heritage that stretches back to at least the time of the Greek civilisation; calling them "pale imitations" does them a grave disservice. The oldest known sources for dragon myths are from the Middle East, not the Far East.

If you feel like arguing this point, please don't use the Dresden Files. Just stick with authoritative sources instead, ok?

u/blahdom · 1 pointr/learnpython

No problem. Good luck finding a class; compilers are a really fun subject!

If you are just generally interested (I don't know your experience level), the dragon book is still highly regarded, and might be a good entryway into the theory of it all.

u/hou32hou · 1 pointr/ProgrammingLanguages

I suggest you read the Dragon Book.

u/dnew · 1 pointr/worldnews

> Is this a realistic goal

Yes, quite. The bits you are going to be missing are some of the mathematical underpinnings. Depending on what you're programming, you'll also want to grab books on the particular topic at hand that don't try to teach you programming at the same time.

For example, if you want to learn why C# is object-oriented and what that means and how to use it, grab a copy of this book: http://en.wikipedia.org/wiki/Object-Oriented_Software_Construction

If you want to learn how relational databases work, read this one http://www.amazon.com/Introduction-Database-Systems-8th-Edition/dp/0321197844 (You can easily find online versions, but I didn't investigate whether they were legally released or not.)

You want to write a compiler? Grab the "dragon book": http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

None of those teach you how to program. They teach you the math and background behind major inventions in programming. Keep up with those, find a local mentor who enjoys talking about this stuff, and you'll do fine.

u/empleadoEstatalBot · 1 pointr/argentina

> It’s hard to consolidate databases theory without writing a good amount of code. CS 186 students add features to Spark, which is a reasonable project, but we suggest just writing a simple relational database management system from scratch. It will not be feature rich, of course, but even writing the most rudimentary version of every aspect of a typical RDBMS will be illuminating.
>
> Finally, data modeling is a neglected and poorly taught aspect of working with databases. Our suggested book on the topic is Data and Reality: A Timeless Perspective on Perceiving and Managing Information in Our Imprecise World.
>
> ### Languages and Compilers
>
> Most programmers learn languages, whereas most computer scientists learn about languages. This gives the computer scientist a distinct advantage over the programmer, even in the domain of programming! Their knowledge generalizes; they are able to understand the operation of a new language more deeply and quickly than those who have merely learnt specific languages.
>
> The canonical introductory text is Compilers: Principles, Techniques & Tools, commonly called “the Dragon Book”. Unfortunately, it’s not designed for self-study, but rather for instructors to pick out 1-2 semesters worth of topics for their courses. It’s almost essential then, that you cherrypick the topics, ideally with the help of a mentor.
>
> If you choose to use the Dragon Book for self-study, we recommend following a video lecture series for structure, then dipping into the Dragon Book as needed for more depth. Our recommended online course is Alex Aiken’s, available from Stanford’s MOOC platform Lagunita.
>
> As a potential alternative to the Dragon Book we suggest Language Implementation Patterns by Terence Parr. It is written more directly for the practicing software engineer who intends to work on small language projects like DSLs, which may make it more practical for your purposes. Of course, it sacrifices some valuable theory to do so.
>
> For project work, we suggest writing a compiler either for a simple teaching language like COOL, or for a subset of a language that interests you. Those who find such a project daunting could start with Make a Lisp, which steps you through the project.
>
> [Compilers: Principles, Techniques & Tools](https://teachyourselfcs.com//dragon.jpg) [Language Implementation Patterns](https://teachyourselfcs.com//parr.jpg)
>
> Don’t be a boilerplate programmer. Instead, build tools for users and other programmers. Take historical note of textile and steel industries: do you want to build machines and tools, or do you want to operate those machines?
>
> — Ras Bodik at the start of his compilers course
>
> ### Distributed Systems
>
> As computers have increased in number, they have also spread. Whereas businesses would previously purchase larger and larger mainframes, it’s typical now for even very small applications to run across multiple machines. Distributed systems is the study of how to reason about the tradeoffs involved in doing so, an increasingly important skill.
>
> Our suggested textbook for self-study is Maarten van Steen and Andrew Tanenbaum’s Distributed Systems, 3rd Edition. It’s a great improvement over the previous edition, and is available for free online thanks to the generosity of its authors. Given that distributed systems is a rapidly changing field, no textbook will serve as a trail guide, but Maarten van Steen’s is the best overview we’ve seen of well-established foundations.
>
> A good course for which some videos are online is MIT’s 6.824 (a graduate course), but unfortunately the audio quality in the recordings is poor, and it’s not clear if the recordings were authorized.
>
> No matter the choice of textbook or other secondary resources, study of distributed systems absolutely mandates reading papers. A good list is here, and we would highly encourage attending your local Papers We Love chapter.
>
> [Distributed Systems 3rd edition](https://teachyourselfcs.com//distsys.png)
>
> ## Frequently asked questions
>
> #### What about AI/graphics/pet-topic-X?
>
> We’ve tried to limit our list to computer science topics that we feel every practicing software engineer should know, irrespective of specialty or industry. With this foundation, you’ll be in a much better position to pick up textbooks or papers and learn the core concepts without much guidance. Here are our suggested starting points for a couple of common “electives”:
>
> - For artificial intelligence: do Berkeley’s intro to AI course by watching the videos and completing the excellent Pacman projects. As a textbook, use Russell and Norvig’s Artificial Intelligence: A Modern Approach.
> - For machine learning: do Andrew Ng’s Coursera course. Be patient, and make sure you understand the fundamentals before racing off to shiny new topics like deep learning.
> - For computer graphics: work through Berkeley’s CS 184 material, and use Computer Graphics: Principles and Practice as a textbook.
>
> #### How strict is the suggested sequencing?
>
> Realistically, all of these subjects have a significant amount of overlap, and refer to one another cyclically. Take for instance the relationship between discrete math and algorithms: learning math first would help you analyze and understand your algorithms in greater depth, but learning algorithms first would provide greater motivation and context for discrete math. Ideally, you’d revisit both of these topics many times throughout your career.
>
> As such, our suggested sequencing is mostly there to help you just get started… if you have a compelling reason to prefer a different sequence, then go for it. The most significant “pre-requisites” in our opinion are: computer architecture before operating systems or databases, and networking and operating systems before distributed systems.
>
> #### Who is the target audience for this guide?
>
> We have in mind that you are a self-taught software engineer, bootcamp grad or precocious high school student, or a college student looking to supplement your formal education with some self-study. The question of when to embark upon this journey is an entirely personal one, but most people tend to benefit from having some professional experience before diving too deep into CS theory. For instance, we notice that students love learning about database systems if they have already worked with databases professionally, or about computer networking if they’ve worked on a web project or two.
>
> #### How does this compare to Open Source Society or freeCodeCamp curricula?
>
> The OSS guide has too many subjects, suggests inferior resources for many of them, and provides no rationale or guidance around why or what aspects of particular courses are valuable. We strove to limit our list of courses to those which you really should know as a software engineer, irrespective of your specialty, and to help you understand why each course is included.
>
> freeCodeCamp is focused mostly on programming, not computer science. For why you might want to learn computer science, see above.
>
> #### What about language X?
>
> Learning a particular programming language is on a totally different plane to learning about an area of computer science — learning a language is much easier and much less valuable. If you already know a couple of languages, we strongly suggest simply following our guide and fitting language acquisition in the gaps, or leaving it for afterwards. If you’ve learned programming well (such as through Structure and Interpretation of Computer Programs), and especially if you have learned compilers, it should take you little more than a weekend to learn the essentials of a new language.
>
> #### What about trendy technology X?

> (continues in next comment)

u/kmafb · 0 pointsr/IAmA

For lexing and parsing, you should just pick up a compiler book. You could bang your head against it your whole life without figuring it out, and the Right Answer is not that hard if you have someone to show it to you. There are lots of good ones; the classic is the "dragon book" (http://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886).
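
To give a taste of how small the lexing half can be, here is a minimal Java sketch (the token set is invented for the example) that scans an arithmetic string into tokens:

    import java.util.ArrayList;
    import java.util.List;

    public class Lexer {
        record Token(String kind, String text) {}

        static List<Token> lex(String src) {
            List<Token> tokens = new ArrayList<>();
            int i = 0;
            while (i < src.length()) {
                char c = src.charAt(i);
                if (Character.isWhitespace(c)) {
                    i++; // skip spaces
                } else if (Character.isDigit(c)) {
                    int start = i; // consume a run of digits as one number
                    while (i < src.length() && Character.isDigit(src.charAt(i))) i++;
                    tokens.add(new Token("NUM", src.substring(start, i)));
                } else if ("+-*/()".indexOf(c) >= 0) {
                    tokens.add(new Token("OP", String.valueOf(c)));
                    i++;
                } else {
                    throw new IllegalArgumentException("unexpected character: " + c);
                }
            }
            return tokens;
        }

        public static void main(String[] args) {
            System.out.println(lex("12 + 34 * (5 - 6)")); // prints the token list
        }
    }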

Beyond that, VMs are a big topic. They include all of compilers, and almost all of systems programming. The Smith and Nair book (http://www.amazon.com/Virtual-Machines-Versatile-Platforms-Architecture/dp/1558609105) is a great jumping off point. But so is playing around with a project that means something to you. It depends what you find more rewarding.

u/levu-webworks · 0 pointsr/learnprogramming
  • The "Red Dragon Book of Compiler Design"
  • Compiler Design in C

    Both books I've read; the latter sits on my bookshelf. It was a gift from my girlfriend. Please don't waste your time trying to implement a compiler. It's a PhD-level endeavor that will take years of dedicated 60-hour work weeks.

    Here are the same links linked from my Amazon affiliates account:

  • The Red Dragon Book of Compiler Design
  • Compiler Design in C


    You are better off implementing an algebraic calculator using an LR parser. Start with Tom Torf's Programmer's Calculator (PCalc). It's written in C and pretty simple. You can fork it from my GitHub account if you have trouble finding Tom's source archive. Tom (may he rest in peace) also wrote several programming tutorials and contributed to comp.lang.c, alt.lang.c and the comp.lang.c FAQ.
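
Full LR parsers are normally produced by a generator such as yacc or bison; as a simpler hand-rolled stand-in for the calculator exercise (precedence climbing, a different technique than LR), here is a minimal Java sketch:

    public class Calc {
        private final String s;
        private int pos;

        Calc(String expr) { this.s = expr.replaceAll("\\s+", ""); }

        // Parse and evaluate operators of at least minPrec precedence.
        long parseExpr(int minPrec) {
            long lhs = parseAtom();
            while (pos < s.length()) {
                char op = s.charAt(pos);
                int prec = (op == '+' || op == '-') ? 1
                         : (op == '*' || op == '/') ? 2 : -1;
                if (prec < minPrec) break; // also stops at ')'
                pos++;
                long rhs = parseExpr(prec + 1); // left-associative operators
                lhs = switch (op) {
                    case '+' -> lhs + rhs;
                    case '-' -> lhs - rhs;
                    case '*' -> lhs * rhs;
                    default  -> lhs / rhs;
                };
            }
            return lhs;
        }

        long parseAtom() {
            if (s.charAt(pos) == '(') {
                pos++;                 // consume '('
                long v = parseExpr(1);
                pos++;                 // consume ')'
                return v;
            }
            int start = pos;
            while (pos < s.length() && Character.isDigit(s.charAt(pos))) pos++;
            return Long.parseLong(s.substring(start, pos));
        }

        public static void main(String[] args) {
            System.out.println(new Calc("2 + 3 * (4 - 1)").parseExpr(1)); // 11
        }
    }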
u/CodeShaman · 0 pointsr/java

Once you've mastered basic design patterns, this book will take you to a heightened state of consciousness.

u/henk53 · 0 pointsr/java

No, I'm not getting anything confused!

I started in the seventies with asm, worked my way through C, then C++, and lately Java. At some point during that time I implemented a Pascal compiler, and I've read the dragon book cover to cover and back again.

I think I've got a pretty firm grasp of what call by reference is, and let me tell you: Java DOES NOT have call by reference, ONLY call by value.

If there's anyone who's confused it's you. Sorry.

Things get a LOT easier mentally when you call it a pointer in Java. This is just terminology, even the Java designers knew it was a pointer (hence the name NullPointerException).

In Java, objects are only accessible via pointers. To be more consistent with C++, every variable in Java should actually be written as:

MyObject* obj = new MyObject();

But since there's no concept of a stack-allocated object in Java, you would always have to put that * there, and so they left it out.

And indeed, for pass-by-reference, a reference to whatever you have (a pointer in this case) has to be passed. So that's indeed a double indirection. In C++ you can even express this explicitly, since there's not only call-by-reference semantics, but also the address-of operator &. Applying the & operator to a pointer (e.g. foo*) yields a double pointer (foo**). This is what happens behind the scenes when you use call-by-reference in C++ and various other languages.

In Java, everything but primitives is a pointer, and that pointer is passed, BY VALUE. In C++ terminology, there's only foo*; foo** does not exist. Hence, you can follow that pointer from within a method and thus modify the object that is being pointed to, but you CANNOT re-assign the pointer as I showed above.

Pass-by-reference semantics DEMAND that a pointer can be re-assigned.

Modifying what is being pointed to does not count. You are CONFUSING that with modifying a stack-allocated object. If you pass such an object into a method and can modify it, then it's pass by reference, since the object itself is passed (no pointer involved), but in Java there is no stack-allocated object, only a pointer to an object, and that pointer is passed by value.

If you're still confused, please read the dragon book. Even if not for this argument, it will greatly enhance your understanding of how languages and computers work.
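
A minimal Java sketch of the argument above (class names hypothetical): mutating through the copied pointer reaches the caller's object, but re-assigning the copy does not, which is exactly the pass-by-value-of-a-pointer behavior described.

    public class PassByValueDemo {
        static class MyObject {
            String label = "original";
        }

        // The pointer is copied into 'obj'; re-assigning the copy
        // has no effect on the caller's variable.
        static void reassign(MyObject obj) {
            obj = new MyObject();
            obj.label = "reassigned";
        }

        // Following the copied pointer does reach the caller's object.
        static void mutate(MyObject obj) {
            obj.label = "mutated";
        }

        public static void main(String[] args) {
            MyObject o = new MyObject();
            reassign(o);
            System.out.println(o.label); // "original" -- no call-by-reference
            mutate(o);
            System.out.println(o.label); // "mutated" -- the pointer was followed
        }
    }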

u/r2p42 · 0 pointsr/AskProgramming

I tried to come up with a simple explanation of how a compiler works, but I wasn't able to find a wording that is both simple and still correct.
I guess there is a reason why this book has 1000 pages: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

The simplest explanation would be that someone wrote a program, which the computer already understands, that is able to recognize your language and convert it into something your computer can understand.

u/eclectro · 0 pointsr/programming

> We are talking about a high-level language compiler, remember?

I still consider C a high-level language. Some people don't, for various reasons.

> You were complaining that it compiles to C rather than emit instructions.

You simply read my post wrong. Nowhere am I complaining; I'm merely making an observation. It is not an unusual feat for a compiler to generate assembly instructions or machine code. Nor would I call it super difficult to write a compiler, but rather straightforward.

> If you are going to emit instructions, it's up to you to write your own optimizer.

Or buy/obtain a compiler that is already capable of doing this step.