Best microprocessor & system design books according to redditors

We found 568 Reddit comments discussing the best microprocessor & system design books. We ranked the 161 resulting products by number of redditors who mentioned them. Here are the top 20.


Subcategories:

Computer hardware design books
Computer hardware control systems books
Computer hardware DSPs books
Computer hardware embedded systems books
Microprocessor design books
PIC microcontrollers books

Top Reddit comments about Microprocessor & System Design:

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/arnar · 71 pointsr/AskReddit

A program in any language goes through either a compiler or an interpreter, or a mixture of both. A compiler turns the program into a series of machine instructions, low level codes that the processor knows how to "execute". An interpreter executes a program indirectly.

The first step of both compiling and interpreting is called parsing. This step takes the program text and converts it into an internal representation called an AST (Abstract Syntax Tree). For example, this converts "if" and its attached test and statement blocks into one kind of object, and "or" and its attached expressions into another kind. The compiler or interpreter knows that these two objects should be handled differently.
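
For a rough idea of what an AST node can look like, here is a minimal C sketch (the type and field names are illustrative, not taken from any real compiler):

    #include <stdlib.h>

    /* One node kind per language construct: an "if" becomes NODE_IF, etc. */
    enum node_kind { NODE_IF, NODE_OR, NODE_LITERAL };

    struct ast_node {
        enum node_kind kind;
        struct ast_node *child[3];  /* for NODE_IF: test, then-block, else-block */
        int value;                  /* only used by NODE_LITERAL */
    };

    /* The parser allocates one of these each time it recognizes a construct. */
    struct ast_node *ast_new(enum node_kind kind) {
        struct ast_node *n = calloc(1, sizeof *n);
        n->kind = kind;
        return n;
    }

The compiler or interpreter then walks this tree, dispatching on the kind field to handle each construct appropriately.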

Programming languages are often programmed in themselves. However, a bootstrapping process is required to create the first compiler or interpreter, i.e. the first version needs to be written in a different language. For low level languages like C, this bootstrapping language is usually assembler. For other languages, more often than not the bootstrapping language is C. Some compilers or interpreters are written in different languages, e.g. the most popular version of Python (which is a mix of a compiler and an interpreter) is written in C.

Going from "rand()" to a string of bytes has more to do with calling a library than a specific feature of a programming language. From the PL point of view, "rand()" is a function call. This function resides in a library that may or may not be written in the same language, in any case, it contains the logic to generate a random number. There are several ways to do this, google or wikipedia for Pseudo Random Number Generator (PRNG).
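
To make that concrete, here is a minimal sketch of one classic PRNG, a linear congruential generator, in C (the multiplier and increment are the well-known Numerical Recipes constants):

    #include <stdio.h>

    static unsigned int lcg_state = 12345;  /* the seed */

    /* Linear congruential generator: state = a*state + c (mod 2^32). */
    unsigned int lcg_rand(void) {
        lcg_state = 1664525u * lcg_state + 1013904223u;
        return lcg_state;
    }

    int main(void) {
        for (int i = 0; i < 3; i++)
            printf("%u\n", lcg_rand());  /* same seed, same sequence every run */
        return 0;
    }

Real library implementations of rand() are more sophisticated, but the principle is the same: a deterministic sequence that merely looks random.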

The standard reference on compilers and interpreters is the so-called Dragon Book, I'll leave it up to others to suggest more material.

u/rtanaka6 · 48 pointsr/programming

But the Dragon Book has cool dragons on it!

u/Enlightenment777 · 42 pointsr/ECE

-----

BOOKS


Children's Electronics and Electricity books:

u/Hakawatha · 27 pointsr/electronics

That's because it is RF design. Have you read the handbook of black magic? Excellent book, I'm told.

u/Bluegorilla101 · 26 pointsr/funny

Should have gotten her this dragon book so she can get a head start on writing compilers.

u/Lericsui · 26 pointsr/learnprogramming

"Introduction to Algorithms"by Cormen et.al. Is for me the most important one.

The "Dragon" book is maybe antoher one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good.


Concrete Mathematics by Graham, Knuth, and Patashnik (you should know these names) is good for mathematical basics.


Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.


SICP (although married to a language) teaches very, very good fundamentals.


Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/wheeman · 25 pointsr/ECE

The Scientist & Engineer's Guide to Digital Signal Processing is a pretty decent book as a crash course. It covers the high level concepts in the first half and the hard core math at the very end.

In the middle there’s a chunk of stuff that’s very practical if you don’t have the time to learn all the math behind it. This is the stuff that I found most useful. It covered the various filters, why you would use one over the other, and basic implementations.
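
As a taste of that practical middle section, here is the kind of minimal implementation the book covers: a single-pole low-pass IIR filter, sketched in C (the coefficient 0.25 is an arbitrary illustrative value):

    #include <stdio.h>

    /* Single-pole low-pass IIR: y[n] = y[n-1] + a * (x[n] - y[n-1]).
       Larger 'a' tracks the input faster; smaller 'a' smooths harder. */
    float lowpass(float x, float *y, float a) {
        *y += a * (x - *y);
        return *y;
    }

    int main(void) {
        float y = 0.0f;  /* filter state */
        for (int i = 0; i < 5; i++)
            printf("%f\n", lowpass(1.0f, &y, 0.25f));  /* step response rising toward 1.0 */
        return 0;
    }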

If you really want to learn DSP, a course might be useful but it all depends on what you want from it.

u/DeepMusing · 23 pointsr/engineering

Any engineering job is going to have a significant amount of domain knowledge that is specific to that company's products, services, or research. Getting an engineering degree is just the beginning. Once you get a job at a company, you will need to learn a shit load of new terms, IP, history, and procedures that are specific to that company. It's the next level of your education, and will take years to fully assimilate. School doesn't teach you anywhere near enough to walk into most engineering jobs and be independently productive. You are there to learn as much as to do. The senior engineers are your teachers, and gaining their knowledge and experience is the key to building a successful career. You need to look at them as a valuable resource that you should be taking every opportunity to learn from. If you don't understand what they are saying, then ask, take notes, and do independent research to fill in your knowledge gaps. Don't just dismiss what they say as techno-babble.

!!!!!! TAKE THIS TO HEART !!!!! - The single biggest challenge you will have in your engineering career is learning how to work well with your peers, seniors, and managers. Interpersonal skills are ABSOLUTELY critical. Engineering is easy: math, science, physics, chemistry, software, electronics... all of that is logical and learnable, and a piece of cake compared to dealing with the numerous and often quirky personalities of the other engineers and managers. Your success will be determined by your creativity, productivity, initiative, and intelligence. Your failure will be determined by everyone else around you. If they don't like you, no amount of cleverness or effort on your part will get you ahead. Piss off your peers or managers, and you will be stepped on, marginalized, criticized, and sabotaged. It's the hard truth about the work world that they don't teach you in school. You aren't going anywhere without the support of the people around you. You are much more likely to be successful as a moron that everyone loves than a genius that everyone hates. It sucks, but that's the truth.

You are the new guy, you have lots to learn, and that is normal and expected. It's going to be hard and frustrating for a while, but you will get the hang of it and find your footing. Learn as much as you can, and be appreciative for any help or information that you can get.

As for digitizing a signal, it is correct that you should stick with powers of 2 for a number of technical reasons. At the heart of the radix-2 FFT algorithm, the bookkeeping is done in binary: by sticking with power-of-2 transform lengths, the algorithm can simply shift bits to multiply or divide by 2, which is lightning fast in hardware. That is part of the "Fast" in Fast Fourier Transform. If you use a non-power-of-2 length, the algorithm has to fall back on slower mixed-radix math, which can be much more expensive for DSPs, embedded CPUs, and FPGAs with fixed-point ALUs. It's about the efficiency of the calculations on a given platform, not what is theoretically possible. Power-of-2 sample counts are much more efficient to process with integer math for almost all digital signal processing.
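
Here is a quick C sketch of that shift trick (any compiler will emit single shift instructions for these, with no multiply or divide hardware involved):

    #include <stdio.h>

    int main(void) {
        unsigned n = 4096;         /* a power-of-2 FFT length */
        unsigned half  = n >> 1;   /* n / 2 as a one-bit right shift */
        unsigned twice = n << 1;   /* n * 2 as a one-bit left shift */
        printf("%u %u\n", half, twice);  /* prints: 2048 8192 */
        return 0;
    }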

I highly recommend reading the book "The Scientist and Engineer's Guide to Digital Signal Processing" by Steven W. Smith. It is by far the best hand-holding, clearly-explained, straight-to-the-point, introductory book for learning the basics of digital signal processing, including the FFT.

You can buy the book from Amazon [here](https://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/ref=sr_1_1?ie=UTF8&qid=1492940980&sr=8-1&keywords=The+Scientist+and+Engineer%27s+Guide+to+Digital+Signal+Processing). If you can afford it, the physical book is great for flipping through and learning tons about different signal processing techniques.

Or you can download the entire book in PDF form legally for free here. The author is actually giving the book away for free in electronic form (chapter by chapter).

Chapter 12 covers FFTs.



u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared them to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/antonivs · 18 pointsr/badcode

The code you posted was generated from a grammar definition, here's a copy of it:

http://www.opensource.apple.com/source/bc/bc-21/bc/bc/bc.y

As such, to answer the question in your title, this is the best code you've ever seen, in the sense that it embodies some very powerful computer science concepts.

It [edit: the Bison parser generator] takes a definition of a language grammar in a high-level, domain-specific language (the link above) and converts it to a custom state machine (the generated code that you linked) that can extremely efficiently parse source code that conforms to the defined grammar.

This is actually a very deep topic, and what you are looking at here is the output of decades of computer science research, which all modern programming language compilers rely on. For more, the classic book on the subject is the so-called Dragon Book, Compilers: Principles, Techniques, and Tools.

u/Cohesionless · 17 pointsr/cscareerquestions

The resource seems extensive enough that it should be plenty to make you a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay great attention to the important theoretical topics.

If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend to read the following:

  • Introduction to Algorithms, 3rd Edition: https://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844. A lot of people do not like this classic book because it is very theoretical, very mathematical, and very abstract, but I think that is its greatest strength. I find a lot of algorithms books either focus too much on how to implement an algorithm in a certain language or underplay the theoretical foundation of the algorithm, such that their readers can only recite algorithms to their interviewers. This book forced me to think algorithmically, so that I could design my own algorithms from all the techniques and concepts learned, to solve very diverse problems.

  • Design Patterns: Elements of Reusable Object-Oriented Software, 1st Edition: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/. This is the original book on object-oriented design patterns. There are other more accessible books to read for this topic, but this is a classic. I don't mind if you replace this book with another.

  • Clean Code: A Handbook of Agile Software Craftsmanship, 1st Edition: https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882. This is the classic book that teaches software engineers how to write clean code. A lot of the best practices in software engineering are derived from this book.

  • Java Concurrency in Practice, 1st Edition: https://www.amazon.com/Java-Concurrency-Practice-Brian-Goetz/dp/0321349601. As a software engineer, you need to understand concurrent programming. These days there are various great concurrency abstractions, but I believe everyone should know how to use low-level threads and locks.

  • The Architecture of Open Source Applications: http://aosabook.org/en/index.html. This website features 4 volumes of books available to purchase or to read online for free. Its content focuses on over 75 case studies of widely used open-source projects, often written by the creators of said projects, about the design decisions and the like that went into creating their popular projects. It is inspired by this statement: "Architects look at thousands of buildings during their training, and study critiques of those buildings written by masters."

  • Patterns of Enterprise Application Architecture, 1st Edition: https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/. This is a good read to start learning how to architect large applications.

    The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers have strong problem solving skills, trained in computer science theory, to be the person who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.

    P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
u/quixotidian · 15 pointsr/compsci

Here are some books I've heard are good (I haven't read them myself, but I provide commentary based on what I've heard about them):

u/mcur · 14 pointsr/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tanenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&ie=UTF8&qid=1377402221&sr=1-1

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&ie=UTF8&qid=1377402371&sr=1-1

Edit: Got the wrong Hennessy/Patterson book...

u/TheEdgeOfRage · 14 pointsr/BSD

If you want something really hardcore, check out The Design and Implementation of the FreeBSD Operating System

Edit for 2nd edition

u/ReasonableDrunk · 14 pointsr/marvelstudios

The definitive work on high speed digital electrical engineering is, in fact, called the Handbook of Black Magic.

https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/pmjones · 13 pointsr/PHP

Scalable Internet Architectures by Theo Schlossnagle. He was my boss at OmniTI and knows his stuff.

u/[deleted] · 12 pointsr/programming

So I've been doing a bit more ASM programming etc lately. I liked this book when I read it, but these days I've gotten interested in really doing fast programming, i.e. taking advantage of the processor's design in your code.

So if you liked this book and wanted to take it to the next step in superfast, I suggest these resources:

  • Agner Fog's optimization page
  • Jon Stokes' book Inside the Machine is AMAZING and covers the dawn of advanced x86 processor design up until recently - all the way from the Pentium to the Core 2 line, and covers PPC design too!

    And if you're on Linux, you NEED to install perf and check if your CPU has any performance counters it can take advantage of. They give tons of insight and may upset some assumptions you have about how your code really performs. valgrind's cachegrind tool is wonderful in the same vein, but only for simulated L1-L2 cache usage.

    Also, if you have one of those fancy new phones with a modern processor, ARM assembly is wonderful and fun (I do it on my iPhone.) Shit, some of them are dual core now. Throw your C code in gcc -S or whatever and look at the generated assembly. I'll try and find my resources for that later.
u/teresko · 12 pointsr/PHP

Actually I would suggest you start learning OOP and maybe investigate the MVC design pattern, since those are both subjects the average CodeIgniter user will be quite inexperienced in. While you might keep on "learning" frameworks, it is much more important to actually learn programming.

Here are a few lectures that might help you with it:

u/cyrusol · 11 pointsr/PHP

You might want to look at Patterns of Enterprise Application Architecture by Martin Fowler. There are 3 full examples that you are looking for right now:

  • table gateway
  • active record style ORMs (like Eloquent)
  • data mapper style ORMs (like Doctrine which is exactly like Java's Hibernate)

    The examples are in Java but you probably won't have any difficulties understanding the code. He builds those from the ground up and finally compares them and says when to use what.

    -----

    Beyond that, the abstraction endgame is to completely separate your persistence model from your domain model. If you have a user with an email address and other fields, you don't want just one class that deals with loading the user row from a SQL database based on his email address; you want the latter part isolated into its own class. (So that the user could in theory also be requested from an API or Memcached or a text file.)

    class User { ... }

    interface UserByEmailQuery {
        public function findByEmail(string $email): ?User;
    }

    class SqlUserByEmailQuery implements UserByEmailQuery { ... }

    But it's sometimes simply not worth it (economically) to go that far.
u/PubliusPontifex · 11 pointsr/compsci

The Dragon Book by somebody. A bit out of date now, but really helped me with my parser/tree implementation.

u/Authenticity3 · 10 pointsr/ECE

Old (1993) but classic fundamentals that are still relevant today:
High Speed Digital Design: A Handbook of Black Magic https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90

u/correctsbadalts · 10 pointsr/funny

Was expecting this dragon book

u/poincareDuality · 10 pointsr/compsci

For designing programming languages, my favorites are

u/VK2DDS · 9 pointsr/DSP

+1 for Cortex-M (with FPUs). I'm building a guitar pedal with an STM32F407 and it handles 12x oversampled distortion and a bunch of biquads at 48kHz (mono). It is paired with a CS4272 audio codec with DMA handling the I2S data.

It won't handle any reasonable FIR filter and the RAM limits it to ~500ms delay. There is a discovery board with external RAM but I haven't tried using it.

The F7 series are clocked a bit faster and some come with a double-precision FPU instead of single, but they have the same instruction set as the F4s. The Cortex-M7 has a longer pipeline (6 vs 3 stages, probably to support the higher clock rate), so branching is probably less of a penalty on the M4.

This book is an excellent guide to the low level guts of the Cortex-M3 & M4 chips and contains a chapter dedicated to DSP on the M4. Long story short, it contains a bunch of DSP instructions such as saturating integer arithmetic, integer SIMD, floating point fused multiply-accumulate, etc., which makes it semi-competitive against "true" DSP cores. The book compares the M4 and SHARC DSP to show that there's a big jump between them, but the M4 wins hands down for ease of learning & development (strong community, free (GNU) tools, etc).
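
To see what one of those instructions buys you, here is saturating 16-bit addition written out in portable C; on the M4 the entire function collapses into a single instruction (QADD16 even performs two such adds in parallel). A sketch for illustration only:

    #include <stdint.h>
    #include <stdio.h>

    /* Saturating add: instead of wrapping on overflow (an ugly click in
       audio), clamp the result to the 16-bit rails. */
    int16_t sat_add16(int16_t a, int16_t b) {
        int32_t s = (int32_t)a + (int32_t)b;
        if (s > INT16_MAX) s = INT16_MAX;
        if (s < INT16_MIN) s = INT16_MIN;
        return (int16_t)s;
    }

    int main(void) {
        printf("%d\n", sat_add16(30000, 10000));  /* 32767, not -25536 */
        return 0;
    }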

Edit: If you want hardware, this audio codec can be paired with this STM32F7 kit, or this motherboard paired with this STM32F4Discovery board can take it as well.

u/Beagles_are_da_best · 9 pointsr/PrintedCircuitBoard

I did learn all of this stuff from experience. Honestly, I had a little bit of a tough time right out of college because I didn't have much practical circuit design experience. I now feel like I have a very good foundation for that, and it came through experience, learning from my peers, and lots of research. I have no affiliation with Henry Ott, but I treat his book like a bible. I refer to it just about every time I do a board design. Why? Because it's packed with this type of practical information. Here's his book. I bought mine used as cheap as I could. At my previous job, they just had one in the library. Either way, it was good to have around.

So why should you care about electromagnetic compatibility (EMC)? A couple reasons:

  1. EMC compliance is often regulated by industry and becomes a product requirement. The types of tests that your product has to pass depend on the industry, typically, but in general there are tests where bad things are injected into your board and tests where they measure how noisy your board is. You have to pass both.
  2. EMC compliance, in my opinion, is very well correlated with the reliability and quality of a product. If a product is destroyed "randomly" or stops working when the microwave is on, you're not likely to have a good opinion of that product. Following guidelines like the one I did above is the path to avoiding problems like that.
  3. EMC design is usually not taught in schools, and yet it is the most important part of the design (besides making it perform the required product function in the first place). It is also very hard to understand, because many of the techniques for improving your design do not necessarily show up on your schematics. Often, it's about how well you lay out your board, how the mechanical design for the enclosure of your board is considered, etc.

    Anyways, it's definitely worth looking at and is a huge asset if you can follow those guidelines. Be prepared to enter the workforce and see rampant disregard for EMC best practices as well as rampant EMC problems in existing products. This is common because, as I said, it's not taught and engineers often don't know what tools to use to fix it. It often leads to expensive solutions where a few extra caps and a better layout would have sufficed.

    A couple more books I personally like and use:

    Howard Johnson, High Speed Digital Design (it's from 1993, but still works well)

    Horowitz and Hill, The Art of Electronics (good for understanding just about anything, good for finding tricks and ideas to help you for problems you haven't solved before but someone probably has)

    Last thing since I'm sitting here typing anyways:

    When I first got out of college, I really didn't trust myself even when I had done extensive research on a particular part of design. I was surrounded by engineers who also didn't have the experience or knowledge to say whether I was on the right path or not. It's important to use whatever resources you have to gain experience, even if those resources are books alone. It's unlikely that you will be lucky and get a job working with the world's best EE who will teach you everything you need to know. When I moved on from my first job after college, I found out that I was on the right path on many things thanks to my research and hard work. This was in opposition to my thinking before then as my colleagues at my first job were never confident in our own ability to "do EE the right way" - as in, the way that engineers at storied, big companies like Texas Instruments and Google had done. Hope that anecdotal story pushes you to keep going and learning more!
u/greenlambda · 9 pointsr/ECE

I'm mostly self-taught, so I've learned to lean heavily on App Notes, simulations, and experience, but I also like these books:
The Howard Johnson Books:
High Speed Digital Design: A Handbook of Black Magic
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_api_I0Iwyb99K9XCV
High Speed Signal Propagation: Advanced Black Magic
https://www.amazon.com/dp/013084408X/ref=cm_sw_r_cp_api_c3IwybKSBFYVA

Signal and Power Integrity - Simplified (2nd Edition)
https://www.amazon.com/dp/0132349795/ref=cm_sw_r_cp_api_J3IwybAAG9BWV

Also, another thing that can be overlooked is PCB manufacturability. It's vitally important to understand exactly what can and can't be manufactured so that you can make design trade offs, and in order to do that you need to know how they are made. As a fairly accurate intro, I like the Eurocircuits videos:
http://www.eurocircuits.com/making-a-pcb-pcb-manufacture-step-by-step

u/Atkrista · 9 pointsr/ECE

Personally, I found Oppenheim's text very dry and difficult to get through. I would recommend Lyons' textbook.

u/Dhekke · 8 pointsr/programming

This book Structured Computer Organization is also very good at explaining in detail how the computer works, it's the one I used in college... Pretty expensive, I know, but at least the cover has nice drawings!

u/dbuckley · 8 pointsr/technology

> Why does it transmit anything at all

Electronics that have fast switching transitions generate significant amounts of radio frequency energy. In the modern world, it is a major part of the designer's job to reduce or shield these emissions so that equipment doesn't interfere with other equipment.

There is a lot of skill and art and not a little black magic involved in getting high speed electronics to work at all. In fact, one of the first books to seriously tackle the subject, a book that many designers still have on their bookshelves, is High Speed Digital Design: A Handbook of Black Magic. The challenge once it works is to make it less like a transmitter.

To prove the thing is compliant with the standards of where the thing is being sold, it is traditional to take the kit to an EMC test house, where the Device Under Test (the DUT) is placed in a screened room and set up in representative conditions (i.e. power cables, Ethernet cables, etc.), and the amount of radio frequency junk spewed into the air is measured. This costs money, according to here, about $1-10K. If you fail, you have to fix the design and spend money for testing again, until it passes. And of course, fixing the design takes time.

Many countries are happy to sell kit across international boundaries with none of this stuff done at all.

u/DonaldPShimoda · 8 pointsr/ProgrammingLanguages

I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference with more discussion of these sorts of things!

Another option is a "real" textbook.

My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.

But the dragon book is the go-to reference on compilers that is slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.

Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.

u/HikoboshiSama · 8 pointsr/compsci
u/Quintic · 8 pointsr/learnprogramming

Here are some standard textbooks on the subject. When I am looking for a book on a particular subject, I like to look at the class schedules for local universities and see what they are using. A class on programming languages is a standard part of a CS program I believe.

Compilers: Principles, Techniques, and Tools
http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811/ref=sr_1_3?ie=UTF8&qid=1343095509&sr=8-3&keywords=Dragon+Book

Concepts of Programming Languages
http://www.amazon.com/Concepts-Programming-Languages-Robert-Sebesta/dp/0136073476/ref=sr_1_5?s=books&ie=UTF8&qid=1343095607&sr=1-5&keywords=Programming+languages

Programming Language Pragmatics
http://www.amazon.com/Programming-Language-Pragmatics-Second-Michael/dp/0126339511/ref=sr_1_2?s=books&ie=UTF8&qid=1343095647&sr=1-2&keywords=Programming+language+pragmatics

u/TreeFitThee · 7 pointsr/freebsd

If, after reading the handbook, you find you still want a deeper dive check out The Design and Implementation of the FreeBSD Operating System

u/DVWLD · 7 pointsr/node

You should start by learning Go, Erlang, Rust and C.

/trolololololololol

Seriously, though, if you're talking about cramming as many users onto a single machine as possible, then node is not your runtime.

Node is great at building things that scale horizontally. It makes it really easy to write realtime, event based code. Node is really good at doing things that involve a lot of network IO since it's easy to do that in a non-blocking way. It's not a great choice for a high scale game server where memory usage is key.

If you want to know more about horizontal scaling patterns (which Eve only qualifies for if you squint a bit), I'd recommend starting here:

http://www.amazon.com/Scalable-Internet-Architectures-Theo-Schlossnagle/dp/067232699X

And looking at distributed consensus approaches, message queues, and bumming around http://highscalability.com/ a bit.

u/If_you_just_lookatit · 7 pointsr/embedded

I started early on with Arduino and moved into lower level embedded with the STM32 Discovery line of development boards. Here's a link to a good starting board that has tons of example code from ST:

https://www.mouser.com/ProductDetail/STMicroelectronics/STM32F407G-DISC1?qs=mKNKSX85ZJejxc9JOGT45A%3D%3D

If you want a decent intro book into embedded topics, this book does a decent job of introducing the different parts of working on an embedded project:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149


Between the Arduino and Pi, the Arduino is more representative of an embedded device. It introduces you to a resource-constrained system that can be a good starting point for working with digital and analog I/Os, and lets you hook up to communication-bus-enabled peripherals that use UART, I2C, and SPI. The biggest problem is that it will not introduce you immediately to debugging and standard compilation tools. However, Arduino has been a starting point for many developers. Good luck!

u/theoldwizard1 · 7 pointsr/hardware

>ISAs are commonly categorized by their complexity, i.e., the size of their instruction space: large ISAs such as x86-64 are called Complex Instruction Set Architectures (CISC), while the chips powering smartphones and other portable, low-power devices are based on a Reduced Instruction Set Architecture (RISC). The huge instructions space of the typical CISC ISA necessitates equally complex and powerful chips while RISC designs tend to be simpler and therefore less power hungry.

I do understand that this is a "once over lightly", but, based on spending a few years of my career working with a team to select a "next gen" embedded processor for a Fortune 50 company (that would purchase millions), I do feel qualified to make these comments. (I am also the proud owner of a well worn 1st Edition of the Hennessy and Patterson Computer Architecture: A Quantitative Approach.)

The lines between RISC and CISC keep getting muddier every year that goes by. While it's probably no longer true, the biggest differentiator between RISC and CISC was that RISC used fixed-length instructions. This made decoding the instructions MUCH simpler. The decode portion of a CISC CPU had to grab a few bytes, partially decode them, and then decide how many more bytes to grab.

The old Digital Equipment Corporation VAX architecture was (and probably still is) the MOST complex instruction set architecture. Most arithmetic and logical operations could have 3 operands, and each operand could have any combination of multiple addressing modes. Worse, the VAX architecture dedicated 3 of the only 16 registers to "context" (SP, FP and AP).

RISC machines had more registers than CISC machines and, over time, compiler writers figured out how to do the equivalent of the FP and AP from deltas off the SP. With the larger number of registers, typically one register was a dedicated constant zero register, necessary because all memory was accessed via indirect addressing. For embedded processors that had no loader to do "fix up" at load time, 1 or 2 more registers became dedicated pointers to specific types of memory (perhaps RAM vs ROM, or "short" data vs "complex" data, i.e. arrays, strings, etc).

With smaller die sizes, RISC machines could have more cache on chip. More cache meant "more faster" !

u/EngrToday · 7 pointsr/CUDA

As far as I know this is the go-to for most people learning CUDA programming. For CUDA 9+ specific features, your best bet is probably looking at the programming guide on NVIDIA's site for the 9 or 10 release. I don't believe there's much in terms of published books on specific releases like there is for C++ standards.

u/fatangaboo · 7 pointsr/ECE

For your job? Spend the money or get your boss to spend the money on the books written by Howard Johnson.

(book 1)

(book 2)

Trivialized and unsatisfying answer to the question in the title of this thread: Vbounce = Lground * dI/dt. You think Lground equals zero, but you are mistaken.

u/waaaaaahhhhh · 7 pointsr/ECE

There seems to be two approaches to learning DSP: the mathematically rigorous approach, and the conceptual approach. I think most university textbooks are the former. While I'm not going to understate the importance of understanding the mathematics behind DSP, it's less helpful if you don't have a general understanding of the concepts.

There are two books I can recommend that take a conceptual approach: The Scientist and Engineer's Guide to Digital Signal Processing, which is free. There's also Understanding Digital Signal Processing, which I've never seen a bad word about. It recently got its third edition.

u/YuleTideCamel · 7 pointsr/learnprogramming

Pick up these books:

u/masklinn · 7 pointsr/programming

Yet another long-winded and mostly useless BeautifulCode blog post that could've been avoided by suggesting that readers go buy Fowler's Refactoring (and read page 260 in this case, Introduce Null Object) as well as the same Fowler's Patterns of Enterprise Application Architecture, page 496, Special Case: a subclass that provides special behavior for particular cases.

Also, note that while the whole texts and explanations and examples and whatnot are not provided, you can find the refactorings and patterns of these books in The Refactoring Catalog and Catalog of Patterns, the respective websites of Refactoring and PoEAA

u/hwillis · 6 pointsr/electronics

Can't use free eagle (too big) for this, but kicad or probably other things would work. With a few good books you can lay out a big board without advanced tools, although it can take longer. With cheap/free tools you'll usually have to use some finicky or kludgy methods to do really complex routing (blind/buried vias, free vias, heat transfer, trace length), but that usually isn't too big a deal. Here's a timelapse of a guy using Altium to route a high speed, large (a bit smaller than op's) data board for a high speed camera. The description has rough steps with timestamps- 38 hours total to lay out.

u/Jazzy_Josh · 6 pointsr/cscareerquestions

The dragon book if you're into compilers

There's a second edition, but I think this one has a cooler cover ;)

u/FattyBurgerBoy · 6 pointsr/webdev

The book, Head First Design Patterns, is actually pretty good.

You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.

Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.

If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.

My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.

u/llimllib · 6 pointsr/compsci

sipser (I have the first edition which you can get on the cheap, it's very good.)

AIMA

Dragon

Naturally, TAOCP.

Many will also recommend SICP, though I'm not quite sure that's what you're angling at here, it's probably worth browsing online to see.

u/dhdfdh · 6 pointsr/freebsd

The Design and Implementation of the FreeBSD Operating System

Also, the devs hang out on the mailing lists and some on the FreeBSD forum.

u/moyix · 5 pointsr/ReverseEngineering

Have you worked through the loop detection in the Dragon Book? There are some slides on it here:

http://www.cs.cmu.edu/afs/cs/academic/class/15745-s03/public/lectures/L7_handouts.pdf

u/blexim · 5 pointsr/REMath

The object you're interested in is the call graph of the program. As you've observed, this is a DAG iff there is no recursion in the program. If function A calls B and B calls A, this is called mutual recursion and still counts as recursion :)

A related graph is the control flow graph (CFG) of a function. Again, the CFG is a DAG iff the function doesn't contain loops.

An execution trace of a program can certainly be represented as a DAG. In fact, since an execution trace does not have any branching, it is just a straight line! However you are very rarely interested in a single trace through a program -- you usually want to reason about all the traces. This is more difficult because if you have any looping structure in the global CFG, there is no (obvious) upper bound on the size of a trace, and so you can't capture them all with a finite structure that you can map into SMT.

Every program can be put into SSA form. The trick is that when you have joins in the control flow graph (such as at the head of a loop), you need a phi node to fix up the SSA indices. If you don't have it already, the dragon book is pretty much required reading if you're interested in any kind of program analysis.
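
To make the phi-node idea concrete, here is a small C loop together with a rough sketch of its SSA form — the phi at the loop head merges the value arriving from the entry edge with the value from the back edge:

    int count_to(int n) {
        int x = 0;
        while (x < n)   /* the loop head is a join point in the CFG */
            x = x + 1;
        return x;
    }

    /* SSA form, roughly:
         x0 = 0
       head:
         x1 = phi(x0 from entry, x2 from body)
         if x1 >= n goto exit
       body:
         x2 = x1 + 1
         goto head
       exit:
         return x1
    */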

In general, if you have a loop free control flow graph of any kind (a regular CFG or a call graph), then you can translate that graph directly into SAT or SMT in a fairly obvious way. If you have loops in the graph then you can't do this (because of the halting problem). To reason about programs containing loops, you're going to need some more advanced techniques than just symbolic execution. The big names in verification algorithms are:

  • Bounded model checking
  • Abstract interpretation
  • Predicate abstraction
  • Interpolation based methods

    A good overview of the field is this survey paper. To give an even briefer idea of the flavour of each of these techniques:

    Bounded model checking involves unwinding all the loops in the program a fixed number of times k. This gives you a DAG representing all of the traces of length up to k. You bitblast this DAG (i.e. convert it to SAT/SMT) and hand off the resulting problem to an SMT solver. If the problem is SAT, you've found a concrete bug in the program. If it's UNSAT, all you know is that there is no bug within the first k steps of the program.

    Abstract interpretation is about picking an abstract domain to execute your program on, then running the program until you reach a fixed point. This fixed point tells you some invariants of your program (i.e. things which are always true in all runs of the program). The hope is that one of these invariants will be strong enough to prove the property you're interested in.

    Predicate abstraction is just a particular type of abstract interpretation where your abstract domain is a bunch of predicates over the variables of the program. The idea is that you get to keep refining your abstraction until it's good enough to prove your property using counterexample guided abstraction refinement.

    Interpolation can be viewed as a fancy way of doing predicate refinement. It uses some cool logic tricks to do your refinement lazily. The downside is that we don't have good methods for interpolating bitvector arithmetic, which is pretty crucial for analyzing real programs (otherwise you don't take into account integer overflow, which is a problem).

    A final wildcard technique that I'm just going to throw out there is loop acceleration. The idea here is that you can sometimes figure out a closed form for a loop and replace the loop with that. This means that you can sometimes remove a loop altogether from the CFG without losing any information or any program traces. You can't always compute these closed forms, but when you can you're in real good shape.
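
    A tiny illustration of loop acceleration in C: the loop below has a known closed form, so an analyzer that can derive it gets to drop the loop — and its unbounded set of traces — entirely:

        /* Loop version: n iterations. */
        int sum_loop(int n) {
            int sum = 0;
            for (int i = 0; i < n; i++)
                sum += i;
            return sum;
        }

        /* Accelerated closed form: equal for n >= 0 (ignoring overflow). */
        int sum_closed(int n) {
            return n * (n - 1) / 2;
        }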

    Drop me a message if you want to know anything else. I'm doing a PhD in this exact area & would be happy to answer any questions you have.
u/adamnemecek · 5 pointsr/freebsd

you can check out the table of contents on amazon http://www.amazon.com/Design-Implementation-FreeBSD-Operating-Edition/dp/0321968972/ref=dp_ob_title_bk
or the book's website http://ptgmedia.pearsoncmg.com/images/9780321968975/samplepages/9780321968975.pdf
but the answer to all your questions is basically yes, this is the book that fits your criteria.

u/deaddodo · 5 pointsr/osdev

The source in the littleosbook builds on itself each chapter. However, it's important to know that the littleosbook, osdev wiki and most online resources aren't necessarily "tutorials" after the bootloader and bare-bones stages. Any later information is going to be more abstract and guidance. If you need in depth assistance with osdev, you'll want to invest in one (or more) of the following:

u/svec · 5 pointsr/embedded

Here's a few books I highly recommend:

Making Embedded Systems by Elecia White

An Embedded Software Primer by David Simon

Programming Embedded Systems in C and C++ by Michael Barr - out of print, but still a decent book.

Also, embedded guru Jack Ganssle has a long list of embedded books he's reviewed here: http://www.ganssle.com/bkreviews.htm - lots of good stuff on there

u/elchief99 · 5 pointsr/raspberry_pi

Buy the Maker's Guide to the Zombie Apocalypse and make some of the contraptions in it. They use RPis.

u/superflygt · 5 pointsr/DSP

https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419

There's probably a free pdf floating around somewhere on the net.

u/NoahFect · 5 pointsr/ECE

Oppenheim & Schafer is the usual standard text, as others have said. However, it's pretty theory-intensive and may not be that much of an improvement over your current book, if you are looking for alternative explanations.

I'd say you should look at Lyons' Understanding Digital Signal Processing instead of O&S. Also the Steven Smith guide that mostly_complaints mentioned is very accessible. Between Smith and Lyons you will get most of the knowledge that you need to actually do useful DSP work, if not pass a test in it.

u/ElectricRebel · 5 pointsr/compsci

For compilers:

u/boredcircuits · 5 pointsr/learnprogramming

Start with the Dragon Book.

When it actually comes time to implement the language, I would recommend just writing the frontend and reusing the backend from another compiler. LLVM is a good option (it's becoming popular to use as a backend, it now has frontends for C, C++, Objective C, Java, D, Pure, Hydra, Scheme, Rust, etc). See here for a case study on how to write a compiler using LLVM as the backend.

u/fbhc · 5 pointsr/AskComputerScience

My compilers course in college used the Dragon Book, which is one of the more quintessential books on the subject.


But you might also consider Basics of Compiler Design which is a good and freely available resource.


I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/nwndarkness · 4 pointsr/FPGA

Computer Organization and Design RISC-V Edition: The Hardware Software Interface (ISSN) https://www.amazon.com/dp/B0714LM21Z/ref=cm_sw_r_cp_api_i_Wn3xDbMYHH61S

Computer Architecture: A Quantitative Approach (The Morgan Kaufmann Series in Computer Architecture and Design) https://www.amazon.com/dp/0128119055/ref=cm_sw_r_cp_api_i_jp3xDbRYQ12GA

u/yberreby · 4 pointsr/programming

When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel of the subject.

As for 'normal' resources, I've personally found these helpful:

  • This list of talks about compilers in general.
  • The LLVM Kaleidoscope tutorial, which walks you through the creation of a compiler for a simple language, written in C++.
  • The Super Tiny Compiler. A really, really simple compiler, written in Go. It helps with understanding how a compilation pipeline can be structured and what it roughly looks like.
  • Anders Hejlsberg's talk on Modern Compiler Construction. Helps you understand the difference between the traditional approach to compilation and new approaches, with regards to incremental recompilation, analysis of incomplete code, etc. It's a bit more advanced, but very interesting nevertheless.

    In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.

    For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.
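
    As a minimal illustration of that top-down structure (not how Rust's or Go's parsers actually look), here is a recursive descent parser for chains like "1+2+3", sketched in C with one function per grammar rule:

        #include <ctype.h>
        #include <stdio.h>

        /* Grammar:  expr := term ('+' term)*   term := digit */
        static const char *p;

        static int term(void) {
            if (isdigit((unsigned char)*p))
                return *p++ - '0';
            return 0;  /* a real parser would report a syntax error here */
        }

        static int expr(void) {
            int v = term();
            while (*p == '+') {  /* peek at the next token to decide what to parse */
                p++;
                v += term();
            }
            return v;
        }

        int main(void) {
            p = "1+2+3";
            printf("%d\n", expr());  /* prints 6 */
            return 0;
        }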

    I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.

    You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.

    EDIT: grammar
u/fluicpana · 4 pointsr/italy

To test the waters quickly you can use https://rubymonk.com/ (a basic introduction to Ruby). Coursera, Khan Academy, Udacity and the like also have introductory programming courses.

Whereas if you want to learn how to program, the path should touch at least all of these milestones, in order:

  1. [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123744938)

  2. The Structure and Interpretation of Computer Programs

  3. A good book on Assembly

  4. The C programming language

  5. Compilers

  6. Code complete, The practice of programming

  7. Pretend you've read all of The Art of Computer Programming

  8. An object-oriented language, maybe Programming Ruby

  9. And/or Python, Dive into Python

  10. Design patterns

  11. Learn a functional language.


    From here you can branch out and specialize in whatever interests you

u/ewood87 · 4 pointsr/BSD
u/icantthinkofone · 4 pointsr/BSD

I noticed you linked to the first edition. Here is the second edition

u/yumology · 4 pointsr/arduino

Might want to check out Makers Guide to Zombie Apocalypse. It has lots of home defense things you can do with an arduino.

u/bobappleyard · 3 pointsr/programming

I don't know what he'd recommend, but I found the Dragon Book and Modern Compiler Design to be decent treatments of the subject. There are lots of interesting texts out there though.

Sorry for the cheeky reply.

u/evaned · 3 pointsr/programming

> And which coder uses physical books any more?

I do for things beyond actual language references; e.g. maybe everything in The Dragon Book has a good description somewhere, but grabbing that off my desk and having a decent chance of it having what I want (it has some problems and omissions, but it's reasonably good) will save wading through a bunch of crap online until I find something complete and accurate enough.

u/sfrank · 3 pointsr/programming

But make sure to get the 2nd edition of the Compiler book. It has been enhanced quite a bit.

u/OmegaNaughtEquals1 · 3 pointsr/cpp_questions

This is a great question! It's also one that every serious CS person will ask at some point. As others here have noted, to really understand this question you must understand how compilers work. However, it isn't necessary to understand the gory details of compiler internals to see what a compiler does for you. Let's say you have a file called hello.cpp that contains the quintessential C++ program

#include <iostream>

int main() {
    std::cout << "Hello, world!\n";
}


The first thing the compiler does is called preprocessing. Part of this process includes expanding the #include statements into their proper text. Assuming you are using gcc, you can have it show you the output of this step

gcc -E -o hello.pp hello.cpp

For me, the hello.cpp file explodes from 4 lines to nearly 18000! The important thing to note here is that the contents of the iostream library header occur before the int main lines in the output.

The next several step for the compiler are what you will learn about in compiler design courses. You can take a peek at gcc-specific representations using some flags as discussed on SO. However, I pray you give heed. For there be dragons!

Now let's take a look at the compiler's output. To do this, I am going to not #include anything so the output is very simple. Let's use a file called test.cpp for the rest of the tests.

int main() {
    int i = 3, j = 5;
    float f = 13.6 / i;
    long k = i << j;
}

To see the compiler's output, you can use

g++ -S -masm=intel test.cpp

The -S flag asks gcc to just output the generated assembly code and -masm=intel requests the intel dialect (by default, gcc uses the AT&T dialect, but everyone knows the intel one is superior. :) ) The output on my machine (ignoring setup and teardown code) is outlined below.

push rbp
mov rbp, rsp

/* int i = 3, j = 5; */
mov DWORD PTR [rbp-20], 3
mov DWORD PTR [rbp-16], 5

/* float f = 13.6 / i; */
pxor xmm0, xmm0
cvtsi2sd xmm0, DWORD PTR [rbp-20]
movsd xmm1, QWORD PTR .LC0[rip]
divsd xmm1, xmm0
movapd xmm0, xmm1
cvtsd2ss xmm2, xmm0
movss DWORD PTR [rbp-12], xmm2

/* long k = i << j; */
mov eax, DWORD PTR [rbp-16]
mov edx, DWORD PTR [rbp-20]
mov ecx, eax
sal edx, cl
mov eax, edx
cdqe
mov QWORD PTR [rbp-8], rax

/* implicit return 0; */
mov eax, 0
pop rbp
ret

There are lots of details to learn in here, but you can generally see how each simple C++ statement translates into many assembly instructions. For fun, try compiling that program with the optimizer turned on (with g++, you can use -O3). What is the output?
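
(Spoiler, in case you don't have a compiler handy: every local in main is dead code, so a recent g++ at -O3 typically emits little more than the following — the exact output varies by compiler and version.)

    main:
        xor eax, eax    /* the whole body was optimized away; only "return 0" remains */
        ret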

There is still much to see from the binary that is assembled. You can use nm and objdump to see symbols or ldd to see what other libraries were (dynamically) linked into the executable. I will leave that as an exercise for the reader. :)

u/Ohthere530 · 3 pointsr/explainlikeimfive

Disk drives are like post office boxes. They are numbered from one to whatever, and each one holds just a small amount of information. Each of these "boxes" is called a block.

Users like to look at documents organized into folders. Each document has its own name and can have any amount of data.

A filesystem is the software that converts from documents and folders the way users like to see them to the blocks that disk drives hold. For every file, there is some information about who owns it, when they last changed it, which blocks hold the actual data, and so on. (This is an inode.)
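
As a toy illustration of that bookkeeping, here is an inode-like struct in C — the field names are made up for clarity and do not match FreeBSD's actual on-disk layout:

    #include <stdint.h>
    #include <time.h>

    #define NBLOCK_PTRS 12

    /* Per-file metadata plus pointers to the data blocks. */
    struct toy_inode {
        uint32_t owner_uid;            /* who owns the file */
        uint32_t mode;                 /* permission bits */
        uint64_t size_bytes;           /* how much data it holds */
        time_t   mtime;                /* when it was last changed */
        uint64_t blocks[NBLOCK_PTRS];  /* which disk blocks hold the data */
        uint64_t indirect_block;       /* a block of further block numbers,
                                          for files too big for the list above */
    };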

For details, I'd recommend The Design and Implementation of the FreeBSD Operating System.

u/a-schaefers · 3 pointsr/unixporn

Initially I watched an episode of lunduke hour that featured a freeBSD dev https://www.youtube.com/watch?v=cofKxtIO3Is

I like the documentation available to help me learn. I got my hands on the FreeBSD handbook and can't wait to get into the design and implementation textbook, Addison Wesley, 928 pages. https://www.amazon.com/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=pd_lpo_sbs_14_t_0/143-0574353-4482766?_encoding=UTF8&psc=1

I appreciate the focus on servers and research computing that is BSD's strong suit.

u/mdf356 · 3 pointsr/cscareerquestions

It's about 40 years too late for any one person to have mastery of all the different parts of a computer.

For computer architecture, Hennessy and Patterson is the classic volume. For the software algorithms that are used everywhere, CLRS is the classic guide. For Operating Systems, The Design and Implementation of FreeBSD is a good book. I'm sure there's something similar for networking.

You could read the PCI spec, and some Intel data sheets, and the RFCs for networking protocols if you want to learn those things. For most parts of computers, I strongly suspect that most material is either too high level (yet another "Introduction to") or too low level (reading an RFC doesn't tell you whether it's used that way in practice or has been superseded).

The only way I've gotten to know things is to play with them. Change the code, make it do something else. Personal research like that is very educational but time consuming, and there's way too much out there to know everything, even for a single small piece of hardware and its associated software.

u/jgm27 · 3 pointsr/ECE

The second book is good. You might also want to check out: Modern Processor Design: Fundamentals of Superscalar Processors https://www.amazon.com/dp/1478607831/ref=cm_sw_r_cp_api_M8YmxbVCM44YW

u/mfukar · 3 pointsr/askscience

> When we say dual core or quad core processor, what we really mean is a single integrated chip (CPU) with 2 (dual) or 4 (quad) processors on it. In the old days processors were single core, so this confusion didn't arise, as a single core processor was just a processor.
>

Many CPUs can be included in a single integrated die.

In "the old days" there were multi-chip modules includinged multiple CPUs (and/or other modules) in separate ICs.

> A processor consists of a control unit (CU) and an arithmetic logic unit (ALU).

And many other things, which it sometimes shares (MMUs, I/O controllers, memory controllers, etc). Don't be too picky over what a processing unit includes. For those that want to dive in, grab this or this book and read on.

> The combination of components is why just having more cores or more GHz doesn't always mean a faster CPU - as the onboard cache and other factors can also slow the processing down, acting as a bottleneck.

Bit of a superfluous contrast, these days. Using anything external to the CPU slows it down, by virtue of propagation delays alone. That's one of the reasons we want many cores / CPUs. The more CPUs or faster clocks question is a red herring - here's an article that explores why (the context is CAD, but the observations are valid in most areas of application).

u/xamino · 3 pointsr/learnprogramming

I don't think we need to go that deep. An excellent book on how CPUs work at the assembly level is Inside the Machine. I can only recommend it even for programmers.

u/wgren · 3 pointsr/dcpu_16_programming

Code: The Hidden Language of Computer Hardware and Software,The Elements of Computing Systems and Inside the Machine were recommended on Hacker News.

I have the last one, I will re-read it over Easter holidays...

u/GrayDonkey · 3 pointsr/java

You need to understand there are a couple of ways to do Java web development.

  • Servlets &amp; JSPs. - Check out Core Servlets and JavaServer Pages or the Java EE Tutorial. Note that I link to an older EE tutorial because the newer versions try to switch to JSF and not much changed in Servlets and JSPs between Java EE 5 and 6. I recommend learning Servlets and JSPs before anything else.
  • JSF - A frameworks that is layered on top of Servlets and JSPs. Works well for some tasks like making highly form centric business web apps. Most of the JSF 2 books are okay. JSF is covered in the Java EE 6 Tutorial
  • Spring - Spring is actually a bunch of things. You'd want to learn Spring MVC. If you learn any server-side Java web tech besides Servlets and JSPs you'd probably want to learn Spring MVC. I wouldn't bother with GWT or any other server-side Java web tech.
  • JAX-RS - After you get Servlets and JSPs down, this is the most essential thing for you to learn. More and more you don't use server-side Java (Servlets &amp; JSPs) to generate your clients HTML and instead you use client-side JavaScript to make AJAX calls to a Java backend via HTTP/JSON. You'll probably spend more time with JavaScript:The Good Parts and JavaScript: The Definitive Guide than anything else. Also the JAX-RS api isn't that hard but designing a good RESTful api can be so be on the lookout for language agnostic REST books.

    Definitely learn Hibernate. You can start with the JPA material in the Java EE tutorial.

    As for design patterns, Design Patterns: Elements of Reusable Object-Oriented Software is a classic. I also like Patterns of Enterprise Application Architecture for more of an enterprise-system-pattern view of things. Probably avoid most J2EE pattern books. Most of the Java EE patterns came about because of deficiencies in the J2EE/Java EE platform; as each new version of Java EE comes out, you see that the patterns that have arisen become part of the platform. For example, you don't create a lot of database DAOs because JPA/Hibernate handles your database integration layer. You also don't write a lot of service locators now because of CDI. So books like Core J2EE Patterns can be interesting, but if you are learning a modern Java web stack you'll be amazed at how archaic things used to be when you look at old J2EE pattern books.

    p.s. Don't buy anything that says J2EE, it'll be seven years out of date.
u/motivated_electron · 3 pointsr/ECE

Hi,

I have two-part suggestion for you. Naturally, this is really just what I did to move from your shoes to where I am now (writing device drivers, and using real-time operating systems) on nice toolchains (sets of IDEs coupled with compilers and debugger interfaces).

The first thing you ought to do next is focus on learning C. ANSI C.

This is the book to use: Introduction to Embedded Systems by David Russell

By actually stepping through the book, you'll get to learn what embedded is all about, without having to learn a new debugging interface (because you won't have one), and without having to buy a new board.

The book uses the Arduino Uno board and the Arduino IDE to teach you how to NOT use the Arduino API and libraries. This teaches you about the "building" process of C programs - the compilation, linking, bootloaders, etc. You'll learn all the low level stuff on a platform (the ATmega328P) that is easier to understand. You'll even get into making interrupt-based programs, a buffered serial driver, pulse-width modulation, input capture on timers, etc.
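
To give a flavor of that style (a generic sketch in the same spirit, not an example from the book): blinking the Uno's onboard LED with avr-gcc and the raw registers, no Arduino API anywhere. It assumes an ATmega328P clocked at 16 MHz.

```c
/* Build with: avr-gcc -mmcu=atmega328p -DF_CPU=16000000UL -Os blink.c */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << DDB5);            /* PB5 (Arduino "pin 13") as output */
    for (;;) {
        PORTB ^= (1 << PORTB5);     /* toggle the LED */
        _delay_ms(500);
    }
}
```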

Ever since having gone through the book, I've been able to move to other platforms, knowing that the ideas are essentially the same, just more involved. If you were to jump straight into the ARM Cortex-based 32-bit processors, you would feel rather overwhelmed by the complexity of the peripherals that typically get placed on the same die. You would end up resorting to high-level hardware abstraction libraries (like STMicroelectronics' horrid HAL) and MAYBE be able to use them, but you would have no idea what was actually going on. As soon as a library stops working, you need to be able to know where it broke and why.

So do this patiently.

Start with the 8 bit micro on the Arduino using the book, using basic timers, basic interrupts, basic C types.


Only later is it wise to pick up an ARM board to start experimenting with. These devices are powerful monsters that get work done. But you won't have an appreciation for what they and their powerful peripherals can do until you've wrestled with their simpler cousin, the ATmega328P.

You'll have to pick an IDE (or none, if you really want to know and understand the build process), a set of libraries to use (or none, if you want to learn the monster peripherals the hard way), and an operating system (or none, if you want to stick to bare-metal programs - but really, a 120 MHz CPU with all that power and no OS is sort of a waste).

I would recommend starting with the TIVA C series board from Texas Instruments. It's used in the very nicely paced Intro to ARM Microcontrollers by Jonathan Valvano.

That would be another book well worth the time to go through like a tutorial, because it is.

These books have also helped me along the way: The Definitive Guide to ARM Cortex-M3 and Cortex-M4 Processors by Joseph Yiu
and Computer Organization and Embedded Systems by Zvonko Vranesic.

If you want to explore this field, please be prepared to be patient. The learning curves here can be steep. But the things you'll be capable of making are incredible if you keep it up. I hope your exploration goes well!

u/jhillatwork · 3 pointsr/compsci

In addition to these, check out Computer Architecture: A Quantitative Approach by Hennessy & Patterson. I had this as a textbook as an undergrad and still throw it at folks when they are doing very low-level optimizations that require an intimate understanding of modern computers.

u/pyroscopic24 · 3 pointsr/CUDA

I agree with this guy. Having a solid C++ background is good, but programming for CUDA specifically is something else.

The book that I used when I took CUDA programming as an undergrad was this:
Programming Massively Parallel Processors: A Hands-on Approach 3rd Edition


Here's a sample of the 1st edition of the book. It's not too far from the 3rd edition, but check out Chapter 3 to see how different it is from programming typically in C++.

u/ArithmeticIsHard · 3 pointsr/Cplusplus

When I took a High Performance Computing course, this book came in handy.

Programming Massively Parallel Processors: A Hands-on Approach https://www.amazon.com/dp/0128119861/ref=cm_sw_r_cp_api_i_Xc3SCbDS47WCP

u/jstampe · 3 pointsr/compsci

I found Tanenbaum's Structured Computer Organization to be very good. Giving a complete overview of the whole stack.

u/Wil_Code_For_Bitcoin · 3 pointsr/PrintedCircuitBoard

Not entirely sure what you're looking for but I've heard a lot of praises for this book : https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/dietfig · 3 pointsr/electronics

High Speed Digital Design: A Handbook of Black Magic is supposed to be a great book on the subject but the frequencies you're working at don't really qualify as anything approaching "high speed". I really don't think you'll have any issues. The wavelength at 100 kHz is 3 kilometers so you're nowhere near having to worry about transmission line effects.

Make sure to adequately decouple every power pin at the chip to deal with the switching transients from the FETs otherwise you'll see a lot of ripple on your supply lines which can cause problems. ADI generally uses a 1 uF and 100 nF capacitor in parallel (IIRC) in their application circuits and I tend to think they know what they're doing.

Is your copper pour grounded? I wouldn't be very worried about coupling noise into your logic traces because 400 Hz is such a low frequency but I suppose it's possible.

ADI publishes a guide called "PCB Board Layout and Design Techniques" that goes through things like proper grounding but I didn't have any luck trying to find it on Google. The Circuit Designer's Companion is an excellent book that also covers the same material with a lot more depth.

u/Skipper_Jos · 3 pointsr/engineering

I will also recommend the 'High Speed Digital Design: A Handbook of Black Magic' book - it definitely has some good stuff!
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90#

u/erasmus42 · 3 pointsr/rfelectronics
u/doodle77 · 3 pointsr/electronics

this book.

OP's board is clearly not high speed so it doesn't matter.

u/lerpanerp · 3 pointsr/DSP

I found Rick Lyons' book a much easier read.

u/apcragg · 3 pointsr/RTLSDR

The chapter on quadrature signals in this book is really good. It has some of the best illustrations of the concept that I have come across. The amazon link also lets you browse that chapter for free.

u/jamesonchampagne · 3 pointsr/DSP

Understanding Digital Signal Processing by Richard Lyons is the best intro in my opinion:

http://www.amazon.com/gp/aw/d/0137027419/ref=mp_s_a_2?pi=54x75&qid=1344996249&sr=8-2

Teaches concepts without getting bogged down in the math details. Once you understand the concepts, get Oppenheim and Schafer to learn the dirty details.

u/ntr0p3 · 3 pointsr/AskReddit

By biology I don't mean what they teach you in college or med school; I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of organs or parts of a cat is completely worthless; understanding the process of gene activation, and how it enables living organisms to better adapt to their environments (for instance, stress factors activating responses to new stimuli), can be very valuable, especially as a function of applied neurology.

Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.

Economics and politics are controversial, but it is the ability to see the underlying forces that is important - similar to not understanding how gravity works, but still knowing that a dropped lead ball will accelerate downwards at 9.78 m/s^2. This is a field that can wait till later though, and probably should.

For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.

I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.

For CS I highly recommend the dragon book, and design patterns, and if you need ASM, The worst designed website ever.

For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(

Best of luck in your travels however! :)

edit: For physics, if your math is bad get both of his books. They break it down well. If your math is better try one of Witten's books, but they are kinda tough - the guy is a fucking genius.

also, Feynman's QED is great, but his other book is awesome just as a happy intellectual read

also try to avoid both Kaku and Hawking for anything more complicated than primers.

edit no. 9: MIT's OCW is win itself.

edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields)

u/sindrit · 3 pointsr/compsci

Skip Calculus (not really useful unless you do fancy graphics or sound generators or scientific stuff). Discrete mathematics is what you want to look at for CS. You might want to move on to a linear algebra course from there.

Get the CS-specific university textbooks. Here are some to get you started.

u/dohpaz42 · 3 pointsr/PHP

Agreed. There are plenty of resources out there that will help you understand design patterns. If you're new to the concept, I would recommend Head First: Design Patterns; it might be based on Java, but the examples are simple to understand and mostly apply to PHP as well. When you feel like you've grasped the basic concepts of design patterns, you can move on to more advanced texts, like Martin Fowler's Patterns of Enterprise Application Architecture - this is a great reference for a lot of the more common patterns. There is also Refactoring: Improving the Design of Existing Code. These are great investments that will help you with any project you work on, and will help you if you decide to use a framework like Zend, which uses design patterns very heavily.

u/smugglerFlynn · 3 pointsr/compsci
  1. Read any book on Java Patterns (probably the one by GoF) - they are applicable across different domains
  2. Read Patterns of Enterprise Application Architecture by Martin Fowler - this is the book that influenced many architects
  3. Study the field: The Architecture of Open Source Applications
  4. Study fundamentals: A Methodology for Systems Engineering by Arthur D. Hall - this one is hard to find
  5. Study as many different frameworks/architectures/languages as possible. You'll start to notice similarities yourself.
  6. Solve every problem you meet from architectural point of view. You will achieve nothing just reading the books. Refactor your old projects in terms of using patterns and new methodologies, write down designs for your wild random ideas, teach others about the stuff you know.

    Also, take note: patterns are only a small part of systems engineering applied to CS. Think REST and SOAP; think of how to better integrate two different applications - not how to code their insides more efficiently. Start considering business logic and requirements, and you will find yourself on a whole new level of challenging architectural tasks.
u/pitiless · 3 pointsr/PHP

The following books would be good suggestions irrespective of the language you're developing in:

Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.

Domain-Driven Design is of a similar vein & quality.

Refactoring - another fantastic Martin Fowler book.

u/cmgg · 3 pointsr/funny

Reminds me of a joke:

a: "Alright the book is done, all that is left is to choose a cover".

b: "Dragons"

a: "B-but the book is about..."

b: "DRAGONS I SAID"

Book

u/HotRodLincoln · 3 pointsr/IWantToLearn

There are books specifically on language design, syntax trees, and unambiguous grammars.

The classic book on compiler design is "The Dragon Book". Designing a compiler carefully is important because a statement in the language should mean exactly one thing, and a language should be able to be compiled efficiently. This is more difficult than it sounds.

Second, you need to understand language design, variable binding, etc. This is a topic of Programming Language Paradigms. I'll figure out a good book for this and edit to add it. The best book probably covers languages like Ada, Haskell, C, and Java and gives an overview of their design and reasons.

edit: The book for design is Concepts of Programming Languages 9th ed, by Robert W. Sebesta.

u/PCneedsBuilding · 2 pointsr/computerscience

I enjoyed this book https://www.amazon.ca/Definitive-Guide-Cortex®-M3-Cortex®-M4-Processors/dp/0124080820; you're also going to have to get acquainted with the technical reference manuals on ARM's website.

u/creav · 2 pointsr/programming

&gt; I'm personally happy that didn't focus on any single language that much.

Programming Language Pragmatics is one of my favorite books of all time!

u/IjonTichy85 · 2 pointsr/compsci

I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture <- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy <- Only scratches the surface but is a good starting point. Also it's extremely informal for a math book. The MIT channel offers many more courses and is great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of CS. The materials I've posted roughly cover the first year of studying CS. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these thing BEFORE starting to learn the 'useful' parts of CS like sql,xml, design pattern etc.
Another great book that will broaden your understanding is this Bertrand Russell: Introduction to mathematical philosophy
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing and 20 years from now we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in his life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/elder_george · 2 pointsr/programming

There's a very nice (although expensive) book on Computer Architecture called 'Structured Computer Organization' by Tanenbaum.

u/eldigg · 2 pointsr/webdev

In most cases your program's performance is limited by memory access speed rather than raw CPU power. Your CPU will sit there 99% of the time twiddling its thumbs waiting for memory access.

This is a pretty good book, imo, that talks about this (among other things):

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210
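
A quick way to see this for yourself - a rough C sketch, not a rigorous benchmark (exact numbers depend on your cache sizes): both runs below touch all the same array elements, but the strided walk wastes most of every cache line it fetches and typically runs several times slower.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)                      /* 16M ints = 64 MB, bigger than cache */

/* Visit all N elements, STRIDE apart: same total work, different pattern. */
static long walk(const int *a, size_t stride)
{
    long sum = 0;
    for (size_t start = 0; start < stride; start++)
        for (size_t i = start; i < N; i += stride)
            sum += a[i];
    return sum;
}

int main(void)
{
    int *a = calloc(N, sizeof *a);
    if (!a) return 1;
    size_t strides[] = { 1, 16 };        /* 16 ints = one 64-byte cache line */
    for (int k = 0; k < 2; k++) {
        clock_t t0 = clock();
        long sum = walk(a, strides[k]);
        printf("stride %2zu: %.2f s (sum=%ld)\n", strides[k],
               (double)(clock() - t0) / CLOCKS_PER_SEC, sum);
    }
    free(a);
    return 0;
}
```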

u/frankenbeans · 2 pointsr/ECE

Amazing? These look like they were swiped from an overview lecture; there isn't any really good explanation in here. If this is all new to you, they might be a good starting point for learning some basic concepts and vocabulary of signal integrity.

Johnson's Black Magic book is the general reference for this. There are many other (well written) white papers out there. Ott and Bogatin have good books as well.

u/jayknow05 · 2 pointsr/AskElectronics

This is a good book on the subject. I would personally work with a 4-layer board with GND and VCC layers. It sounds like you already have a bunch of layers as it is, so yes, I would recommend a VCC layer.

u/drtwist · 2 pointsr/AskReddit

Eric Bogatin's "Signal Integrity - Simplified", Howard Johnson's High Speed Digital Design, and Mike Peng Li's Jitter, Noise, and Signal Integrity at High-Speed are all fantastic reads if you are looking for dead-tree material. If you have a Safari subscription you can read Bogatin's and Li's books for "free".

u/PlatinumX · 2 pointsr/AskElectronics

> Where did you take the formula for wire impedance from? Where could I read more about it?

This is a classic parallel conductor transmission line, there are calculators online. As I mentioned before, the twists do not affect impedance.

You can read more about transmission lines, characteristic impedance, twisted pair, and signal integrity all over the web (and of course check Wikipedia). These are very large topics with a lot of details to learn.

If you want a book, I recommend High Speed Digital Design: A Handbook of Black Magic.
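
For the formula itself, the classic lossless two-wire approximation is Z0 = (120/sqrt(er)) * acosh(D/d), with D the center-to-center spacing, d the wire diameter, and er the relative permittivity of the surrounding dielectric. A small C sketch for intuition (the example dimensions are made-up but typical; this is no substitute for a field solver):

```c
/* Build with: gcc z0.c -lm */
#include <math.h>
#include <stdio.h>

/* Characteristic impedance of a parallel two-wire line (twists ignored). */
static double z0_two_wire(double spacing_m, double diameter_m, double er)
{
    return 120.0 / sqrt(er) * acosh(spacing_m / diameter_m);
}

int main(void)
{
    /* e.g. 24 AWG pair (d ~ 0.51 mm) spaced ~0.98 mm in er ~ 2.3 */
    printf("Z0 ~ %.0f ohms\n", z0_two_wire(0.98e-3, 0.51e-3, 2.3));
    return 0;                   /* prints roughly 100 ohms */
}
```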

u/tweakingforjesus · 2 pointsr/electronics

In addition to the per-IC decoupling cap already mentioned, I'd add a large electrolytic across VCC and GND near the connector on the right. You also might want to beef up the power and ground traces to reduce resistance to the individual ICs. Remember that your high-speed signal traces are going to induce the opposite current in parallel traces. A ground plane will help with this effect.

If you are really interested in digital PCB design, you might check out this book.

u/m85476585 · 2 pointsr/AskEngineers

I literally have a book called "A Handbook of Black Magic". It's a little old, but it's still one of the best books on the subject.

u/dangerbirds · 2 pointsr/ECE

High Speed Digital Design by Graham and Johnson is more focused on high-speed digital signals, but most of it applies to low speed as well. It has a ton of good "engineering rules of thumb" when it comes to doing PCB design.

u/velocicar · 2 pointsr/EngineeringStudents

Here's a book I use at work.

u/tjlusco · 2 pointsr/ElectricalEngineering

I'm going to be frank: this is probably the worst engineering article I've ever read. I may be biased because I majored in control systems, but this article doesn't even remotely cover what would be a control systems 101 introductory lecture; it is littered with grammatical and technical inaccuracies, and is completely devoid of the technical depth that someone who would bother reading the article would be interested in. It is also obvious that the submitter is affiliated with the site - not that I have a problem with shameless self-promotion, but this is simply bad content.

For those who would like a good introduction to control systems, this is IMHO the best text on the subject: Modern Control Systems, R.H. Bishop. (Amazon, Torrent)

u/subheight640 · 2 pointsr/engineering

http://www.amazon.com/Modern-Control-Systems-12th-Edition/dp/0136024580

This is the book written by my former controls professor.

u/stupider_than_you · 2 pointsr/robotics

I learned from Modern Control Systems by Bishop and Dorf, I thought it was pretty good. Here is the newest version.

u/rmurias · 2 pointsr/DSP
u/maredsous10 · 2 pointsr/ECE

Links
www.dspguru.com

Videos
Oppenheim's MIT Lectures
(http://ocw.mit.edu/resources/res-6-008-digital-signal-processing-spring-2011/video-lectures/)
Digital Filters I through V (Hamming Learning to Learn on Youtube)
Monty's Presentations http://www.xiph.org/video/

Books
Schaum's Digital Signal Processing (<= Recommended. It's good and cheap.)
http://www.amazon.com/Schaums-Outline-Digital-Processing-Edition/dp/0071635092
Signals and System Made Easy
http://www.amazon.com/Signals-Systems-Made-Ridiculously-Simple/dp/0964375214
ftp://ftp.cefetes.br/cursos/EngenhariaEletrica/Hans/Sinais%20e%20Sistemas/ZIZI%20Press%20-%20Signals%20and%20Systems%20Made%20Ridiculously%20Simple.pdf

Discrete Time Signal Processing
http://www.amazon.com/Discrete-Time-Signal-Processing-Edition-Prentice/dp/0131988425/

Discrete Time Signal Processing (Older version I used in school)
http://www.amazon.com/Discrete-Time-Signal-Processing-Edition-Prentice-Hall/dp/0137549202/

DSP using MATLAB
http://www.amazon.com/Digital-Signal-Processing-Using-MATLAB/dp/1111427372/

Digital Signal Processing by Proakis
(Similar to Oppenheim's book, but I found it clearer in some instances, from what I remember.)
http://www.amazon.com/Digital-Signal-Processing-4th-Edition/dp/0131873741/

Books I've seen around
Understanding Digital Signal Processing
http://www.amazon.com/Understanding-Digital-Signal-Processing-Edition/dp/0137027419/

Scientist-Engineers-Digital-Signal-Processing
http://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/

http://www.dspguide.com


u/Sean_Michael · 2 pointsr/EngineeringStudents

Understanding Digital Signal Processing - EET400/401, Electronics and Computer Engineering bachelor's degree

u/washerdreier · 2 pointsr/DSP

Understanding DSP by Lyons, hands down. Get it and never look back. AWESOME book. http://www.amazon.com/Understanding-Digital-Signal-Processing-Edition/dp/0137027419

u/cbrpnk · 2 pointsr/AskProgramming

The fact that you mentioned it'd be cool to work on a DAW tells me that you want to go low level. What you want to study is digital signal processing, or DSP. I recommend Understanding Digital Signal Processing. Also watch this talk by Timur Doumler - or anything by him. I recommend that you pick a programming language and try to output a sine wave to the speakers, then go on from there.
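
If it helps, here's a minimal C sketch of that sine-wave starter: it writes two seconds of raw 16-bit mono PCM to stdout, which you can pipe into any raw-PCM player (the aplay flags below assume Linux/ALSA).

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Usage: ./sine | aplay -f S16_LE -r 44100 -c 1 */
int main(void)
{
    const double rate = 44100.0, freq = 440.0;       /* A4 */
    for (int n = 0; n < (int)(2 * rate); n++) {      /* 2 seconds of samples */
        int16_t s = (int16_t)(0.5 * 32767.0 *        /* half full scale */
                              sin(2.0 * PI * freq * n / rate));
        fwrite(&s, sizeof s, 1, stdout);
    }
    return 0;
}
```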

Also check those out:

https://theaudioprogrammer.com/

https://jackschaedler.github.io/circles-sines-signals/

https://blog.demofox.org/#Audio

&amp;#x200B;

Good luck.

u/110100100_Blaze_It · 2 pointsr/learnprogramming

It's on my to-do list, but this is something that I want to get right. I don't think I could fully appreciate it without a more formal approach. I'm currently working through this, and will try my hand at the subject afterward. I will definitely check out Professor Might's insight on the subject, and I would gladly take up any other resources you might have to offer!

u/lordvadr · 2 pointsr/AskComputerScience

We wrote a compiler for one of my CS classes in college. The language was called YAPL (yet another programming language).

First things first: as others have mentioned, a compiler translates from one language to another...typically assembly...but it could be any other language. Our compiler compiled YAPL, which was a lot like Pascal, into C, which we then fed to the C compiler...which was in turn fed to the assembler. We actually wrote working programs in YAPL. For my final project, I wrote a functional - albeit VERY basic - web server.

With that said, it's quite a bit different for an interpreted language, but the biggest part for each is still the same. By far, the most complicated part of a compiler is the parser.

The parser is what reads a source code file and does whatever it's going to do with it. Entire bookshelves have been written on this subject, and PhDs given out on the matter, so parsing can be extremely complicated.

In a theoretical sense, higher level languages abstract common or more complicated tasks away from the lower level languages. For example, to a CPU, variables don't have sizes or names, and neither do functions, etc. On one hand, this greatly speeds up development because the code is far more understandable. On the other hand, certain tricks you can pull off in the lower-level languages (that can vastly improve performance) are abstracted away. This trade-off is mostly considered acceptable. An extra $500 web server (or 100 for that matter) to handle some of the load is far less expensive than 10 extra $100,000-a-year x86 assembly developers to develop, optimize, and debug lower level code.

So generally speaking, the parser looks for what are called tokens, which is why there are reserved words in languages. You can't name a variable int in C because int is a reserved word for a type. So when you name variable, you're simply telling the compiler "when I reference this name again, I'm talking about the same variable." The compiler knows an int is 4 bytes, so does the developer. When it makes it into assembly, it's just some 4 bytes somewhere in memory.

So the parser starts looking for keywords or symbols. When it sees int, the next thing it's going to expect is a label; if that label is followed by (, it knows it's a function, and if it's followed by ; it's a variable. It's more complicated than this, but you get the idea.

The parser builds a big structure in memory describing what's what and, essentially, the functionality. From there, either the interpreter goes through and interprets the language, or, for a compiler, that structure gets handed to what's called the emitter. The emitter is the function that spits out the assembly (or whatever other language) equivalent of whatever a = b + c; happens to be.
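
To make the token-hunting step concrete, here's a toy C sketch of that first lexing pass (real lexers are table-driven and far more careful; this only shows the idea of classifying keywords, identifiers, and symbols):

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *p = "int count; int main()";
    char tok[64];

    while (*p) {
        while (isspace((unsigned char)*p)) p++;
        if (*p == '\0') break;
        if (isalpha((unsigned char)*p)) {            /* word: keyword or name */
            int n = 0;
            while (isalnum((unsigned char)*p) && n < 63) tok[n++] = *p++;
            tok[n] = '\0';
            printf("%-8s %s\n",
                   strcmp(tok, "int") == 0 ? "KEYWORD" : "IDENT", tok);
        } else {                                     /* single-char symbol */
            printf("%-8s %c\n", "SYMBOL", *p++);
        }
    }
    return 0;
}
```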

This is complicated, but if you take it in steps, it's not really that hard. This is the book we used. There's a much newer version out now. If I can find my copy, I'll give it to you if you pay shipping. PM me.

u/jhartwell · 2 pointsr/AskComputerScience

Compilers: Principles, Techniques and Tools which is also referred to as The Dragon Book is what I'm currently reading and am enjoying it.

u/13ren · 2 pointsr/programming
u/johnweeder · 2 pointsr/learnprogramming

Yes. Do it. It's great to know, and useful occasionally - especially grammars. The dragon book is the only college text I've kept.

https://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886/

u/Scaliwag · 2 pointsr/gamedev

Regarding sandboxing: in Lua, at least, from what I know you can have minute control over what libraries scripts have access to, and users can only import other libraries if you allow them to (by including a "library" that imports other libraries :-).

Perhaps you should look into formal languages and parser generators, so you can create more complex languages if you feel like it. Even if you build the parsers yourself having the language specified, factorized and so on, helps a lot. The dragon book is a good choice, although it presupposes you know a bit about specifying a formal language IIRC. If you're a student (I know how it is!) then even the old dragon book is an excellent read and it's very cheap.

u/BaconWraith · 2 pointsr/compsci

Cheers man! The Dragon Book is a great place to start, and there's always this, but mainly it's about facing each problem as you come to it and hoping for the best :P

u/oridb · 2 pointsr/learnprogramming

I've been playing around with writing a programming language and compiler in my spare time for a while now (shameless plug: http://eigenstate.org/myrddin.html; source: http://git.eigenstate.org/git/ori/mc.git). Lots of fun, and it can be as shallow or as deep as you want it to be.

Where are you with the calculator? Have you got a handle on tokenizing and parsing? Are you intending to use tools like lex and yacc, or do you want to do a recursive descent parser by hand? (Neither option is too hard; hand written is far easier to comprehend, but it doesn't give you any correctness guarantees)
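
If you go the hand-written route, the core of a recursive descent calculator fits on a page. A minimal C sketch (no error handling; one function per grammar rule, with precedence encoded by which rule calls which):

```c
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

static const char *p;                      /* cursor into the input */

static double expr(void);

static void skipws(void) { while (isspace((unsigned char)*p)) p++; }

static double factor(void)                 /* factor := NUMBER | '(' expr ')' */
{
    skipws();
    if (*p == '(') {
        p++;
        double v = expr();
        skipws();
        p++;                               /* assume the ')' is there */
        return v;
    }
    char *end;
    double v = strtod(p, &end);
    p = end;
    return v;
}

static double term(void)                   /* term := factor (('*'|'/') factor)* */
{
    double v = factor();
    for (;;) {
        skipws();
        if (*p == '*')      { p++; v *= factor(); }
        else if (*p == '/') { p++; v /= factor(); }
        else return v;
    }
}

static double expr(void)                   /* expr := term (('+'|'-') term)* */
{
    double v = term();
    for (;;) {
        skipws();
        if (*p == '+')      { p++; v += term(); }
        else if (*p == '-') { p++; v -= term(); }
        else return v;
    }
}

int main(void)
{
    const char *input = "2 + 3 * (4 - 1)";
    p = input;
    printf("%s = %g\n", input, expr());    /* prints: 2 + 3 * (4 - 1) = 11 */
    return 0;
}
```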

The tutorials I'd suggest depend on exactly where you are and what you're trying to do. As far as books, the three that I would go with are, in order:

For basic recursive descent parsing:

u/vineetk · 2 pointsr/programming

Looks like it can be had for about $7 on amazon, including shipping. Surely you can scrounge up that much.

u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex &amp; yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/phao · 2 pointsr/java

Well, some books can help:

  • There is the design patterns book (gang of four) => http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/
  • Another book on patterns, but targeting enterprise applications (i.e. information systems) => http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

    These books on patterns tend to be good at teaching a lot of what you're asking. Largely because you've named some patterns in your question, but also because many patterns are about:

  • Identifying the need for something X to be changed without affecting another thing Y with which X is coupled; and
  • separating X from Y in a way that allows X to change independently from Y.

    There are several ways to say what I just did in there. You're allowing X to vary independently from Y. This makes X a parameter of Y, which is yet another way to say it. You're separating what is likely to change often (X) from what doesn't need to be affected by that change (Y).

    One benefit of this is that a reason to change X now doesn't affect Y, because X can be changed independently of Y. Another is that X can largely be understood without looking at Y. This is a core guiding rule of the separation of concerns principle: the concern X is separated from the concern Y. Now, a lot of the activities you want to do with X can be performed independently of Y.

    You probably know all of this, so I'm sorry if this isn't very helpful. But just to finish, a classic example of this is a sorting function (the Y) and the comparison criteria (the X). Many people, in many projects, would like a change in the comparison criteria not to lead to a change in the sorting function. They're 2 separate concerns we'd like to deal with separately. Therefore, the comparison criteria, as commonly done today, is a parameter of sorting. In this case, the word "parameter" is used both in the sense of a function parameter in the source code and in the more general sense of something being a parameter of something else, in which case it can be one of many, and may change over time.
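
    C's qsort() is the textbook instance of exactly this: the sort (Y) is fixed in the library, and the comparison criteria (X) is the parameter you pass in. A small sketch:

```c
#include <stdio.h>
#include <stdlib.h>

static int ascending(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

static int descending(const void *a, const void *b)
{
    return ascending(b, a);          /* reuse Y's contract, flip X */
}

int main(void)
{
    int v[] = { 3, 1, 4, 1, 5, 9, 2, 6 };
    size_t n = sizeof v / sizeof v[0];

    /* Same sorting function, two different comparison criteria. */
    qsort(v, n, sizeof v[0], ascending);
    qsort(v, n, sizeof v[0], descending);
    for (size_t i = 0; i < n; i++) printf("%d ", v[i]);
    printf("\n");                    /* 9 6 5 4 3 2 1 1 */
    return 0;
}
```
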
u/CodeTamarin · 2 pointsr/learnprogramming

Umm this might become specific...

So Clean Architecture got me thinking a lot about code structure at a macro level, which is really important for development. It's a good beginning to understanding architecture but definitely not a definitive reference.

I would likely suggest any book that helps you understand your tech stack at a deeper level. Say your goal is to be a lead developer, for example - often the role of a lead is to support team members who are stuck.

So understanding the tech stack is important. For me, I got C# in a Nutshell (I would suggest getting the Nutshell equivalent of your tech stack). It's important and it lets you understand what's happening under the hood.

Learn Algorithms in 24 Hours was a nice little primer on data structures and algorithms. While by no means a "revolutionary" book, it was useful for understanding which structures solve which problems.

So here's me answer:

If you're trying to get better at your stack: get the Nutshell version of your language, or buy a book on the framework you're on.

If you're trying to just be a better computer scientist... I would learn design patterns (you noted this), then architecture (Clean Architecture and then Patterns of Enterprise Application Architecture), and then you're going to need to understand how to solve scaling problems, so data structures and algorithms. This is hard because which book you want depends on your comfort with math.

For me, the biggest impact book, was design patterns. Then it was all the tech stack stuff that helped. Then finally, architecture books. The Martin Fowler architecture book was useful with design and thinking about how to handle saving data.

But it's really going to boil down to what you want to do with your career long term.

u/materialdesigner · 2 pointsr/webdev

Patterns of Enterprise Application Architecture sounds like a must read for you.

u/guifroes · 2 pointsr/learnprogramming

Interesting!

Looks to me like you can "feel" what good code looks like, but you're not able to rationalise it enough to write it on your own.

Couple of suggestions:

When you see elegant code, ask yourself: why is it elegant? Is it because it's simple? Easy to understand? Try to recognise the desired attributes so you can reproduce them in your own code.

Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.

How familiar are you with unit testing and TDD? It should help you a lot to write better designed code.

Some other resources:

u/SlobberGoat · 2 pointsr/java
u/st4rdr0id · 2 pointsr/androiddev

Hey, that is the million dollar question. But because software is not an engineering discipline, there is actually no reference book on SW architecture. Certainly there are books talking about this, but usually covering only some aspects and without real application examples.

Notice that in iOS programming the system imposes a great part of the architecture, so those guys are usually less concerned. But in Android we have more freedom, and the API actually encourages really bad practices (thanks, Google). Because of this we are all a bit lost. Nowadays layered architecture and MVP seem to be the most popular approach, but then again everybody produces a different implementation...

Specifically for Clean Architecture you should read its author, Robert C. Martin. AFAIK this is not covered in detail in his books. You can read this blog post and watch this video. Other designs usually coming up in conferences are the Onion Architecture and the Hexagonal Architecture. But make no mistake: there's no route map on how to implement any of those, and examples claiming to follow this or that approach are usually not written by the authors of the architecture.


For DDD there is a very good book by Scott Millett with actual examples. But this style is meant for large enterprise backend apps, and the author himself advises against using it in small apps. So I'd say it is overkill for Android, but of course you could reuse some concepts successfully.


There's also Software Architecture in Practice 3rd, but having read the 2nd edition I can tell you it's just smoke.


Probably the best book to date is Fowler's, but this is more a patterns compilation than an architecture guide.

u/vinnyvicious · 2 pointsr/gamedev

Have you ever heard of the open/closed principle? Or the single responsibility principle? Or the Liskov substitution principle? All three are being violated. It drastically reduces the maintainability and extensibility of your code. I can't swap serializers easily, I can't tweak or extend them without touching that huge class, and it's definitely not the responsibility of that class to know how to serialize A, B, C, D, E and the whole alphabet.
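
For anyone wondering what "swap serializers easily" looks like in practice, here's a minimal sketch in C using a function-pointer table (all names are made up for illustration): the caller depends only on the interface, so adding a new format means adding a new entry, never editing existing code.

```c
#include <stdio.h>

typedef struct {
    const char *name;
    void (*serialize)(const char *key, int value);
} Serializer;

static void to_json(const char *key, int value)
{
    printf("{\"%s\": %d}\n", key, value);
}

static void to_xml(const char *key, int value)
{
    printf("<%s>%d</%s>\n", key, value, key);
}

/* save() is closed for modification: it only knows the interface. */
static void save(const Serializer *s, const char *key, int value)
{
    s->serialize(key, value);
}

int main(void)
{
    Serializer json = { "json", to_json };
    Serializer xml  = { "xml",  to_xml  };
    save(&json, "score", 42);
    save(&xml,  "score", 42);    /* swapped without touching save() */
    return 0;
}
```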

I highly recommend some literature on the subject if you're curious about it, it would drastically improve your approach to software architecture:

https://www.amazon.com/dp/0132350882

https://www.amazon.com/dp/0201485672

https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215

http://cc2e.com/

https://www.amazon.com/dp/0321127420

u/vladmihalceacom · 2 pointsr/java

> Yes, the Native Query and access to Connection is always THE Hibernate's answer to all the lacking support of basic SQL features like Window Functions or being able to count aggregated results.

That's a very common misconception. Hibernate is not a replacement for SQL. It's an alternative to the JDBC API that implements the enterprise patterns described by Martin Fowler in his book.

There are many alternatives to JPA or Hibernate. In fact, I'm also using jOOQ, and I like it a lot. I wrote about it. I'm using it in my training and workshops as well.

There are things you can do in jOOQ that you can't do with Hibernate, and there are also things you can do with Hibernate that you can't do with jOOQ.

u/LXXXVI · 2 pointsr/learnprogramming

Thanks, I'm sure you will. It's just a question of getting that first success. Afterwards, it gets much easier, once you can point at a company and say "Their customers are using my code every day."

As for the interviews, I don't know, I'm honestly not the type to get nervous at interviews, either because I know my skill level is most likely too low and I take it as a learning experience, or because I know I can do it. I'd say that you should always write down all the interview questions you couldn't answer properly and afterwards google them extensively.

Besides, if you're from the US, you have a virtually unlimited pool of jobs to interview for. I live in a tiny European country that has 2 million people and probably somewhere in the range of 20 actual IT companies, so I had to be careful not to exhaust the pool too soon.

Funnily enough, right now, my CTO would kill for another even halfway competent nodejs developer with potential, but we literally can't find anyone.

Anyway, I'm nowhere near senior level, but I can already tell you that the architecture:language part is something your bootcamp got right. To that I would add a book my CTO gave me to read (I'm not finished yet myself, but it is a great book) - Patterns of Enterprise Application Architecture. Give it a look. I suspect, without ever having tried to implement a piece of architecture like that, it won't make much sense beyond the theoretical, but I promise you, it's worth its weight in gold once you start building something more complex and have to decide how to actually do it.

u/slowfly1st · 2 pointsr/learnprogramming

Your foes are kids in their twenties with a degree that takes years to achieve, so this will be tough! But I think your age and your willingness to learn will help you a lot.

&amp;#x200B;

Other things to learn:

  • JDK - you should at least be aware of what APIs the JDK provides - better, have used them (https://docs.oracle.com/javase/8/docs/). I think (personal preference / experience) these are the minimum: JDBC, Serialization, Security, Date and Time, I/O, Networking, (Internationalization - I'm from a country with more than one official language), Math, Collections, Concurrency.
  • DBMS: How to create databases and how to access them via JDBC. (I like postgreSQL). Learn SQL.
  • Learn how to use an ORM Mapper. (I like jOOQ, I dislike JPA/hibernate)
  • Requirements Engineering. I think without someone who has the requirements you can't really practice it, but the theory should be present. It's an essential part of software development: get the customer's requirements and bring them to paper. Bad RE can lead to tears.
  • Writing Unit Tests / TDD. Having working code means the work is 50% done - book recommendation: Growing Object-Oriented Software, Guided by Tests
  • CI/CD (Continuous Integration / Delivery) - book recommendation: Continuous Delivery.
  • Read Clean Code (mandatory!)
  • Read Design Patterns (also mandatory!)
  • (Read Patterns of Enterprise Application Architecture (bit outdated, I think it's probably a thing you should read later, but I still love it!))
  • Get familiar with a build tool, such as maven or gradle.

    &amp;#x200B;

    If there's one framework to look at, it would be Spring: spring.io provides dozens of frameworks - for web services, backends, websites, and so on - but mainly their core technology for dependency injection.

    &amp;#x200B;

    (edit: other important things)
u/ladywanking · 2 pointsr/learnprogramming

You are asking a good question.

Wouldn't doing a separate class for each of the use cases you described be repeating yourself?

I would read about DTO and see how it goes.
A good book about this is this.

u/cderwin15 · 2 pointsr/compsci

For what it's worth, I'm in the midst of working through the dragon book and would highly recommend it. Unfortunately I don't know of any online courses you could take for credit.

u/0xf3e · 2 pointsr/programming

For anyone who wants to learn more about compilers and loves reading books, the so-called dragon book is highly recommended reading on this topic: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/case-o-nuts · 2 pointsr/programming

That's a good question, actually. I picked it up in bits and pieces over the years. It probably started to come together when I tried to implement an object-oriented programming system in C. The dragon book was also a great help in figuring this sort of stuff out.

Another great way to learn is to write simple test programs in C or C++, and see what they compile down to with GCC. Using '-O' I find gives me the most readable "direct" assembly.

http://asm.sourceforge.net/howto/Assembly-HOWTO.html

Also, if you have any specific questions, possibly a tutorial or two... well, it's time that I started putting together a website.

u/echelonIV · 2 pointsr/gamedev

I ordered these for our company library, based on recommendations for/from other programmers (of all levels).

ISBN | Title
---|---
978-1568814247 | Real-time Rendering
0321486811 | Compilers: Principles, Techniques, and Tools (2nd Edition)
1482250926 or 0123742978 | Essential Mathematics for Games and Interactive Applications, Third Edition 3rd Edition
978-1482264616 | GPU Pro 6: Advanced Rendering Techniques
1466560010 | Game Engine Architecture, Second Edition
978-1482243567 | Multithreading for Visual Effects
978-0123750792 | Physically Based Rendering: From Theory To Implementation

u/Yulfy · 2 pointsr/AskProgramming

If you mean writing an interpreter or compiler, then yes. The iconic book for learning how to build languages is Compilers: Principles, Techniques, and Tools. It's often referred to as 'The Dragon Book'. It's pretty heavy reading but contains everything you need to know about building a language!

If you're looking for something more implementation driven, I recently read a book about building a programming language in Go. The principles are the same, just with a different language. The book was called Writing an Interpreter in Go. It's a much lighter read and details the construction of an interpreter from scratch!

u/cantstopthemoonlight · 2 pointsr/learnprogramming

Compilers: Principles, Techniques, and Tools is considered THE definitive book on the subject. It's old, but in a fine-wine kind of way.

u/WannabeDijkstra · 2 pointsr/linux

The Design and Implementation of the FreeBSD Operating System by Marshall Kirk McKusick

Though it uses FreeBSD-specific details, its breadth and level of detail are really high, with most of the concepts being applicable to nearly all Unix-likes, and much of it to operating systems in general.

The second edition came out just a few days ago, which I link to.

u/InfinitelyManic · 2 pointsr/openbsd
u/melp · 2 pointsr/zfs

That is not true. Sync writes go to memory (in a transaction group) and the SLOG at the same time.

The only reference I know of for this is in this book: https://www.amazon.com/dp/0321968972/

Section 10.4, logging. You can see it in the Amazon preview, starting on page 538.

u/chalbersma · 2 pointsr/linux
u/albatrossy · 2 pointsr/DSP

It kind of sounds like you'd be good just getting a textbook. I think any book will be fine since you mainly just want questions (and presumably answers), but try to find one that implements code in a language that you're comfortable with, or that you want to learn.

There are a lot of different "final year" DSP courses, but it sounds like you want something covering the fundamentals rather than anything too advanced. I started off with The Scientist & Engineer's Guide to Digital Signal Processing and then used Signals and Systems for my first undergraduate course, but we used it largely because he co-authored it. I would recommend scouring the web for some free books though. There are books like ThinkDSP popping up that seem pretty neat.

Edit: Oppenheim is always mentioned also.

u/mitchell271 · 2 pointsr/audioengineering

Software dev checking in. If you want to go into plugin design, make sure you read books like The Scientist And Engineer's Guide to Digital Signal Processing, and have a heavy focus on algorithms, physics, and matrix math.

There are SDKs and APIs to help, though. The Steinberg VST SDK is how VST plugins are made, and it abstracts away a lot of the underlying math you would otherwise need to know. Writing multi-threaded C code with a library like OpenMP will also help, as your plugins will be more efficient, resulting in less latency.
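
To show the OpenMP point - a minimal sketch (a made-up gain example, built with gcc -fopenmp): one pragma parallelizes the per-sample loop across all available cores.

```c
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void)
{
    static float buf[N];                   /* pretend this is a sample buffer */
    for (int i = 0; i < N; i++) buf[i] = (float)i / N;

    float gain = 0.5f;
    #pragma omp parallel for               /* each thread processes a slice */
    for (int i = 0; i < N; i++)
        buf[i] *= gain;

    printf("buf[N-1] = %f (max threads: %d)\n",
           buf[N - 1], omp_get_max_threads());
    return 0;
}
```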

u/Hello_Dongan · 2 pointsr/DSP

I personally like The Scientist & Engineer's Guide to Digital Signal Processing. The author explains a lot of concepts very clearly in layman's terms. I think the only flaw is that it doesn't cover a ton of material, only the basics.

Other than that, I think Mitra is a good book. One thing to look out for is its errata list. It's somewhat frustrating to have to double check for errors in the book when working homework problems.

u/trump_pushes_mongo · 2 pointsr/neoliberal

O'Reilly Publishing is a reputable source for programming in general. Here is an embedded systems book.

Edit; stupid formatting

u/adi123456789 · 2 pointsr/cpp

I'm an embedded software developer who used to use C and now primarily works with C++.

Learning C is relatively easy when you start off, and it gives you a better appreciation of memory handling and its complexities than C++ does, in my opinion. The C knowledge will also transfer well to C++.

C++ is definitely a much more powerful language and you can get your tasks done quicker with it. There are a lot of things to learn in C++, but you can pick them up with time. A lot of embedded processors, particularly the ARM-based ones, support C++ as well, so that is not a problem.

Like someone else mentioned though, embedded development relies on a good knowledge of programming as well as a good understanding of computer architecture.

Here's a nice book I've read which is useful for new embedded developers - Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_MuFhDb1WWXK3W

u/MrAureliusR · 2 pointsr/ElectricalEngineering

Okay, you're definitely at the beginning. I'll clarify a few things and then recommend some resources.

  1. Places to buy components: Depending on where you live in the world, the large component suppliers are almost always the way to go, with smaller suppliers like Adafruit/Sparkfun if you need development boards or specialised things. I buy almost exclusively from Digikey -- they have $8 flat shipping to Canada, which typically arrives the next day, with no customs fees. They have some sort of agreement in place where they cover these costs. This *always* saves money over going to my local stores where the prices are inflated. It's crazy how cheap some things are. If I need a few 2.2K 1206 resistors for a project, I just buy a reel of 1000 because they are so cheap.
  2. "Steer a joystick with an app" Do you mean connect motors to it and have them move the joystick for you? You're going to want some sort of microcontroller platform, along with a motor controller and way to communicate with a smartphone app. You mention you know C++ so it will be easy to switch to C. This is both true and false. Programming for microcontrollers is not the same as programming for computers. You are much closer to the hardware, typically manipulating many registers directly instead of abstracting it away. Each microcontroller vendor has their own tools and compilers, although *some* do support GCC or alternatives. You mentioned PIC, which is a line of microcontrollers by a large company called Microchip. There are 8-bit, 16-bit, and 32-bit PICs, all at different price points and with hugely differing capabilities. Selecting the microcontroller for a project can be half the battle sometimes. Or, like me, you can just go with whatever you have on hand (which is usually MSP430s or PIC32MX's)
  3. A lot of people will recommend the book The Art of Electronics. It's decent, but it's not for everyone. Some really like the conversational style, others don't. Many people who want to get into microcontroller programming and embedded development want to skip over the fundamentals and just get something working. For those, I point them to Arduino and let them on their merry way. However, if you actually want to learn something, I highly recommend buying an actual microcontroller development board, learning the fundamentals about electrical circuits, and programming in actual C with actual IDEs.
  4. As far as resources go, again it depends on your actual goal. Whenever I want to learn a new tool (like a PCB layout software, or a new IDE) I always start with a simple project. Having an end point to reach will keep you motivated when things seem complicated. Your goal of controlling a joystick with motors is a great starting point. I would buy a development board; Microchip PICs are popular, as are STM32s and MSP430s. It doesn't really matter that much in the long run. Just don't tie yourself too hard to one brand. Then pick up some stepper motors, and a stepper motor control board (grab one from Sparkfun/Adafruit, etc). Get yourself a breadboard, and some breadboard jumpers, a cheap power supply (there are tons available now for cheap that are pretty decent), and then jump in head first!
  5. I highly recommend the book Making Embedded Systems by Elecia White, once you've covered the basics. It's a great way to learn more about how professionals actually design things. For the basics, you can watch *EARLY* EEVBlog videos (anything past around video 600/650 he gets progressively more annoying and set in his ways, another topic entirely, but the early stuff is decent). I'd also recommend picking up your choice of books about the fundamentals -- Electronics for Dummies, the aforementioned Art of Electronics, Making Embedded Systems, The Art of Designing Embedded Systems, and even stuff like Design Patterns for Embedded Systems in C. Again, it all depends on what your goal is. If you want to do embedded design, then you'll need to focus on that. If you're more into analog circuits, then maybe check out The Art and Science of Analog Circuit Design. Either way, grounding yourself in the fundamentals will help a LOT later on. It will make reading schematics way easier.

    I feel like I've gone off on a few tangents, but just ask for clarification if you want. I'd be happy to point you towards other resources.
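
As a taste of what "manipulating registers directly" (point 2 above) looks like in practice, here is a bare-metal blink sketch. The register addresses and names are invented for illustration; on real hardware they come from the chip's datasheet or vendor header:

```c
#include <stdint.h>

/* Hypothetical memory-mapped GPIO registers; real addresses
   come from the microcontroller's datasheet. */
#define GPIO_DIR   (*(volatile uint32_t *)0x40020000u)
#define GPIO_OUT   (*(volatile uint32_t *)0x40020004u)

#define LED_PIN    (1u << 5)

static void delay(volatile uint32_t n) { while (n--) { } }

int main(void)
{
    GPIO_DIR |= LED_PIN;          /* configure pin 5 as an output */
    for (;;) {
        GPIO_OUT ^= LED_PIN;      /* toggle the LED */
        delay(100000);            /* crude busy-wait */
    }
}
```
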
u/returnofthecityboy · 2 pointsr/programming

May I ask you the reverse question? I'd be interested what it takes to move from embedded to backend.

Embedded is a wide field. It kind of depends on whether you're developing firmware or the application layer for an embedded target. Principally it's not much different from programming normal C, but on a lot of targets there's no malloc (because it can have disastrous timing behavior) and you'll need to learn about hard real-time scheduling and working with peripherals.
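
To illustrate the no-malloc point, here is a minimal sketch of the kind of fixed-size pool embedded code often uses instead; the names and sizes are made up:

```c
#include <stddef.h>
#include <stdint.h>

/* A fixed pool of message buffers, sized at compile time.
   Allocation never fragments and its worst case is a short
   fixed-length scan, which keeps timing predictable -- the
   usual reason embedded code avoids malloc. */
#define POOL_SIZE 8

typedef struct {
    uint8_t data[64];
    int     in_use;
} msg_buf_t;

static msg_buf_t pool[POOL_SIZE];

msg_buf_t *msg_alloc(void)
{
    for (int i = 0; i < POOL_SIZE; i++) {
        if (!pool[i].in_use) {
            pool[i].in_use = 1;
            return &pool[i];
        }
    }
    return NULL; /* pool exhausted -- caller must handle this */
}

void msg_free(msg_buf_t *buf)
{
    buf->in_use = 0;
}
```

In real firmware you would also guard the alloc/free pair with a critical section if interrupt handlers can touch the pool.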

If you work closely with the hardware guys it might be useful to know the basics of handling an oscilloscope.

For the basics I can recommend [this book here](https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149).

u/Yelneerg · 2 pointsr/embedded

Check out the embedded podcast and blog, and the book "Making Embedded Systems"
https://www.embedded.fm/

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149

u/Iwishiknewwhatiknew · 2 pointsr/embedded


Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_BbryCbHZDAV0T

u/RadioactiveAardvark · 2 pointsr/embedded

There aren't any that I'd recommend, unfortunately.

This book is not specifically about embedded C, but about embedded in general:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149

Anything by Jack Ganssle is good as well.

u/Caret · 2 pointsr/hardware

As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented so to speak).

They are not EE textbooks (as far as I know) but they are text books nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.

Hope that helps!

u/RoombaCultist · 2 pointsr/ECE

1st year EE student here.

For circuit analysis my biggest resources have been allaboutcircuits.com, and Jim Pytel's Youtube videos. I need to send that guy a dozen beers for getting me through my first two terms of circuits.

For the math: review algebra (exponents in particular), trigonometry, and, if your program will be using calculus, review that too. When you get to studying AC, you'll get to use phasors, which means far less calculus and honestly makes more sense. Phasors were completely new to me, but were completely doable thanks to Jim Pytel's lecture. Honestly though, it's mostly algebra; Ohm's Law and KVL will get you far.
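
To make the phasor point concrete, here is a small worked example of my own (not from the lectures): a series RC circuit driven at angular frequency ω needs a differential equation in the time domain, but in phasor form it is plain complex algebra.

```latex
% Series RC circuit driven by v(t) = V_0 cos(wt):
% in the time domain this is a differential equation,
% but in phasor form it is plain complex algebra.
Z = R + \frac{1}{j\omega C}, \qquad
\tilde{I} = \frac{\tilde{V}}{Z}, \qquad
|\tilde{I}| = \frac{V_0}{\sqrt{R^2 + (1/\omega C)^2}}
```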

If/when you get interested in digital circuit design, check out Ben Eater. His explanations of what is going on in a circuit are far clearer than any other resource I've found so far.

Of course all that theory is pretty dry for a fun summer break. It might be best to bookmark those until classes actually start, then use those resources to complement your class. A more fun approach might be to get yourself a basic electronics kit, a breadboard, and a DMM (that's a digital multimeter) and build some circuits out of a book like any of Forrest Mims's books, or a more modern "maker flavored" book. Then probe the circuits you make and ask yourself "What's going on here?"

Also, the sooner you can get your hands on a good oscilloscope, the better. It's the best tool for seeing what's going on in a circuit because it shows how voltage relates to time, unlike a DMM which shows an average. A DMM is indispensable (and affordable), so it will likely be the first major tool in your kit, but don't be shy of all the knobs on the fancy expensive box.

u/DCoder1337 · 2 pointsr/PHP

If you are working on a small project with three tables, you don't need to worry too much. The problems arise when you have a large system where good architecture and appropriate patterns become really important.

With AR, the model class is responsible for both a) modelling a domain object and b) saving it to the DB and loading it back. Which means the model class gets mixed with methods like beforeSave, afterSave and afterFind to turn DB column values into your domain values and back (e.g. timestamps and datetimes converted to DateTime, complete with some timezone). This breaks the Single Responsibility Principle. Note: this is not a big deal for a small solution where your business logic is simple, but it tends to get worse as the system grows.

If an attribute holds something more complex than a string/number (e.g. a DateTime, a custom Money class, etc.), you have to manually (un)stringify it using beforeSave and afterSave (or stow the original value away somewhere in an additional private property).

  • Are you sure those conversions are lossless (of course they should be, but are you certain, especially with datetime/timezones)? I've been bitten by (my own fault) fun bugs before where a DateTime was constructed with a local timezone, then (correctly, by design) saved and unstringified into a DateTime with UTC, and suddenly that entity's publication date is different than it was before...
  • What happens if there's an exception thrown between beforeSave and afterSave - what state is your model left in? Personally, I am not fond of a property that can be both a string and an object at different times.

    See also: Martin Fowler - Patterns of Enterprise Application Architecture.
u/empleadoEstatalBot · 1 pointr/argentina

&gt; It’s hard to consolidate databases theory without writing a good amount of code. CS 186 students add features to Spark, which is a reasonable project, but we suggest just writing a simple relational database management system from scratch. It will not be feature rich, of course, but even writing the most rudimentary version of every aspect of a typical RDBMS will be illuminating.
&gt;
&gt; Finally, data modeling is a neglected and poorly taught aspect of working with databases. Our suggested book on the topic is Data and Reality: A Timeless Perspective on Perceiving and Managing Information in Our Imprecise World.
&gt;
&gt; ### Languages and Compilers
&gt;
&gt; Most programmers learn languages, whereas most computer scientists learn about languages. This gives the computer scientist a distinct advantage over the programmer, even in the domain of programming! Their knowledge generalizes; they are able to understand the operation of a new language more deeply and quickly than those who have merely learnt specific languages.
&gt;
&gt; The canonical introductory text is Compilers: Principles, Techniques & Tools, commonly called “the Dragon Book”. Unfortunately, it’s not designed for self-study, but rather for instructors to pick out 1-2 semesters worth of topics for their courses. It’s almost essential then, that you cherrypick the topics, ideally with the help of a mentor.
&gt;
&gt; If you choose to use the Dragon Book for self-study, we recommend following a video lecture series for structure, then dipping into the Dragon Book as needed for more depth. Our recommended online course is Alex Aiken’s, available from Stanford’s MOOC platform Lagunita.
&gt;
&gt; As a potential alternative to the Dragon Book we suggest Language Implementation Patterns by Terence Parr. It is written more directly for the practicing software engineer who intends to work on small language projects like DSLs, which may make it more practical for your purposes. Of course, it sacrifices some valuable theory to do so.
&gt;
&gt; For project work, we suggest writing a compiler either for a simple teaching language like COOL, or for a subset of a language that interests you. Those who find such a project daunting could start with Make a Lisp, which steps you through the project.
&gt;
&gt; [Compilers: Principles, Techniques & Tools](https://teachyourselfcs.com//dragon.jpg) [Language Implementation Patterns](https://teachyourselfcs.com//parr.jpg)
&gt;
&gt; &gt; Don’t be a boilerplate programmer. Instead, build tools for users and other programmers. Take historical note of textile and steel industries: do you want to build machines and tools, or do you want to operate those machines?
&gt; &gt;
&gt; &gt; — Ras Bodik at the start of his compilers course
&gt;
&gt; ### Distributed Systems
&gt;
&gt; As computers have increased in number, they have also spread. Whereas businesses would previously purchase larger and larger mainframes, it’s typical now for even very small applications to run across multiple machines. Distributed systems is the study of how to reason about the tradeoffs involved in doing so, an increasingly important skill.
&gt;
&gt; Our suggested textbook for self-study is Maarten van Steen and Andrew Tanenbaum’s Distributed Systems, 3rd Edition. It’s a great improvement over the previous edition, and is available for free online thanks to the generosity of its authors. Given that distributed systems is a rapidly changing field, no textbook will serve as a trail guide, but Maarten van Steen’s is the best overview we’ve seen of well-established foundations.
&gt;
&gt; A good course for which some videos are online is MIT’s 6.824 (a graduate course), but unfortunately the audio quality in the recordings is poor, and it’s not clear if the recordings were authorized.
&gt;
&gt; No matter the choice of textbook or other secondary resources, study of distributed systems absolutely mandates reading papers. A good list is here, and we would highly encourage attending your local Papers We Love chapter.
&gt;
&gt; [Distributed Systems 3rd edition](https://teachyourselfcs.com//distsys.png)
&gt;
&gt; ## Frequently asked questions
&gt;
&gt; #### What about AI/graphics/pet-topic-X?
&gt;
&gt; We’ve tried to limit our list to computer science topics that we feel every practicing software engineer should know, irrespective of specialty or industry. With this foundation, you’ll be in a much better position to pick up textbooks or papers and learn the core concepts without much guidance. Here are our suggested starting points for a couple of common “electives”:
&gt;
&gt; - For artificial intelligence: do Berkeley’s intro to AI course by watching the videos and completing the excellent Pacman projects. As a textbook, use Russell and Norvig’s Artificial Intelligence: A Modern Approach.
&gt; - For machine learning: do Andrew Ng’s Coursera course. Be patient, and make sure you understand the fundamentals before racing off to shiny new topics like deep learning.
&gt; - For computer graphics: work through Berkeley’s CS 184 material, and use Computer Graphics: Principles and Practice as a textbook.
&gt;
&gt; #### How strict is the suggested sequencing?
&gt;
&gt; Realistically, all of these subjects have a significant amount of overlap, and refer to one another cyclically. Take for instance the relationship between discrete math and algorithms: learning math first would help you analyze and understand your algorithms in greater depth, but learning algorithms first would provide greater motivation and context for discrete math. Ideally, you’d revisit both of these topics many times throughout your career.
&gt;
&gt; As such, our suggested sequencing is mostly there to help you just get started… if you have a compelling reason to prefer a different sequence, then go for it. The most significant “pre-requisites” in our opinion are: computer architecture before operating systems or databases, and networking and operating systems before distributed systems.
&gt;
&gt; #### Who is the target audience for this guide?
&gt;
&gt; We have in mind that you are a self-taught software engineer, bootcamp grad or precocious high school student, or a college student looking to supplement your formal education with some self-study. The question of when to embark upon this journey is an entirely personal one, but most people tend to benefit from having some professional experience before diving too deep into CS theory. For instance, we notice that students love learning about database systems if they have already worked with databases professionally, or about computer networking if they’ve worked on a web project or two.
&gt;
&gt; #### How does this compare to Open Source Society or freeCodeCamp curricula?
&gt;
&gt; The OSS guide has too many subjects, suggests inferior resources for many of them, and provides no rationale or guidance around why or what aspects of particular courses are valuable. We strove to limit our list of courses to those which you really should know as a software engineer, irrespective of your specialty, and to help you understand why each course is included.
&gt;
&gt; freeCodeCamp is focused mostly on programming, not computer science. For why you might want to learn computer science, see above.
&gt;
&gt; #### What about language X?
&gt;
&gt; Learning a particular programming language is on a totally different plane to learning about an area of computer science — learning a language is much easier and much less valuable. If you already know a couple of languages, we strongly suggest simply following our guide and fitting language acquisition in the gaps, or leaving it for afterwards. If you’ve learned programming well (such as through Structure and Interpretation of Computer Programs), and especially if you have learned compilers, it should take you little more than a weekend to learn the essentials of a new language.
&gt;
&gt; #### What about trendy technology X?
&gt;

&gt; (continues in next comment)

u/balefrost · 1 pointr/AskProgramming

OK, a few things:

It looks like you're trying to build a shift/reduce parser, which is a form of an LR parser, for your language. LR parsers try to reduce symbols into more abstract terms as soon as possible. To do this, an LR parser "remembers" all the possible reductions that it's pursuing, and as soon as it sees the input symbols that correspond to a specific reduction, it will perform that reduction. This is called "handle finding".

&gt; If I am correct, my Automaton is a DFA?

When the parser is pursuing a reduction, it's looking for sequences of symbols that match the right-hand sides of the relevant (to our current parse state) productions in our grammar. Since the right-hand sides of all the productions in a grammar are simple sequences, all the handle finding work can be done by a DFA. Yes, the handle recognizer of your parser is a DFA. But keep in mind that it needs to be combined with other parts to make a full parser, and your actual grammar can't be recognized with just a DFA.

In particular, you've shown the ACTION table for a shift/reduce parser. It determines what to do when you encounter a symbol in the input stream. But a shift/reduce parser typically needs a second table as well - the GOTO table - that determines what to do after a reduction has taken place.

One other thing that's worth mentioning: you've expressed your ACTION table as a plain DFA transition table. That's not necessarily wrong, but it's not commonly done that way. Instead of reducing when you reach a certain state, it's common to instead attach an action - either 'shift' or 'reduce' ('accept') - to each transition itself. So in a shift/reduce parser, your table might look more like this:

     | [   | ]   | <   | >   | id  | /   | attr
-----+-----+-----+-----+-----+-----+-----+------
  0  | S1  |     | S4  |     |     |     |
  1  |     |     |     |     | S2  |     |        R3  : Reduce Tag -> [ id ]
  2  |     | R3  |     |     |     |     |        R7  : Reduce Tag -> < id ??? / >
  4  |     |     |     |     | S5  | S10 |        R9  : Reduce Tag -> < id ??? >
  5  |     |     |     | R9  |     | S6  | S8     R12 : Reduce Tag -> < / id >
  6  |     |     |     | R7  |     |     |
  8  |     |     |     | R9  |     | S6  | S8
 10  |     |     |     |     | S11 |     |
 11  |     |     |     | R12 |     |     |

Note that R7 and R9 aren't well-formed, since multiple sequences of input tokens might cause you to reach these actions. While it would be possible to construct a shift/reduce parser this way, it's not commonly done. Typically, the DFA to recognize handles is an acyclic graph, but yours has a self-transition in state 8.

&gt; What would be the best way of implementing this automaton in C++? Do I really have to make a huge array?

In general, yes, you need a big array (or, as suggested before, two big arrays). But you can use any space-saving technique you want. For example, since most entries in the ACTION table are invalid, one could represent that data with a sparse array data structure. Also, both The Dragon Book and Cooper and Torczon briefly cover parser-specific ways to compress those tables. For example, notice that rows 5 and 8 in your example have the same entries. Most real grammars have multiple instances of identical rows, so factoring out this commonality can save enough space that the extra complexity is worth it.
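
For a sense of scale, the driver loop that consumes those two tables is tiny; all the complexity lives in the table contents. Here is a schematic sketch (the ACTION/GOTO tables and production arrays are placeholders that a generator like yacc/Bison would normally emit):

```c
/* Schematic LR driver loop. The tables below are placeholders;
   zero-initialization means "error everywhere" until filled in. */
enum { ERROR, SHIFT, REDUCE, ACCEPT };

typedef struct { int kind; int arg; } action_t;

#define NSTATES  12
#define NTOKENS   8
#define NNONTERM  4

static action_t ACTION[NSTATES][NTOKENS];  /* [state][lookahead token] */
static int      GOTO[NSTATES][NNONTERM];   /* [state][nonterminal]     */

/* For each grammar production: its left-hand side and RHS length. */
static int prod_lhs[16];
static int prod_len[16];

/* 'input' is a stream of token codes ending in an end-of-input token. */
int parse(const int *input)
{
    int stack[256];            /* stack of states; real code checks depth */
    int top = 0;
    stack[top] = 0;            /* start state */
    int tok = *input++;

    for (;;) {
        action_t a = ACTION[stack[top]][tok];
        switch (a.kind) {
        case SHIFT:            /* push the next state, consume the token */
            stack[++top] = a.arg;
            tok = *input++;
            break;
        case REDUCE:           /* pop the handle, then consult GOTO */
            top -= prod_len[a.arg];
            stack[top + 1] = GOTO[stack[top]][prod_lhs[a.arg]];
            top++;
            break;
        case ACCEPT:
            return 1;
        default:
            return 0;          /* syntax error */
        }
    }
}
```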

---

I'm a little surprised that you're building a parser like this by hand, though. Typically people do one of two things:

  1. Build, by hand, a modified LL(1) recursive descent parser (or variant, like a packrat parser)
  2. Build, using a tool like YACC or Bison, a LR(1) shift/reduce parser

    You're sort of doing a mix of the two, which means you have the downsides of both approaches. You need to track all the states and transitions by hand, instead of relying on tools to automate that process, yet you don't get the flexibility of a hand-coded recursive descent parser.

    If you're doing this for education's sake, then by all means proceed. I'd highly encourage you to pick up a book on parsing; I think Cooper and Torczon is a great source. But if you just want a parser that works, I'd definitely recommend using a tool or using a more direct approach, like recursive-descent.
u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works, from electricity, to EE principles, through assembly, to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/Yunath_ · 1 pointr/uwaterloo

LOL it seems interesting to me. I'm reading https://www.amazon.ca/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=dp_ob_title_bk right now.

Maybe it's good in theory, and not in practice.

u/a4qbfb · 1 pointr/C_Programming

&gt;&gt; You are confusing sections with segments

&gt; and you're behaving the way C++ usually behave

I am not a C++ anything. I have 25 years of experience with C and 20 years of experience with operating system design and development. The odds are better than even that both the computer you used to post this abusive rant and the computers which reddit use to store and serve it run code that I have written.

(Yes, I've been writing C++ on and off for almost as long as I've been writing C, but I vastly prefer the latter.)

&gt; code section or code segment or text section or test segment are same in this context

Absolutely not. Segments are a feature of *some* computer architectures, most prominently the Intel 8086 and 80386 (but not their 64-bit descendants), used to organize and access code and data in memory. Sections are a feature of most executable file formats (such as ELF and COFF) used on modern non-embedded platforms. The OS and / or the run-time linker read code and data from sections in the executable file and store them somewhere in memory.

&gt; simple googling gives you the result: https://stackoverflow.com/questions/2589949/string-literals-where-do-they-go

Stack Overflow is not a reliable source of *correct* information (see for instance this article or this one about how SO's karma system encourages a race to the bottom). I would suggest reading Tanenbaum or McKusick et al. instead.

This specific answer is correct only in the sense that the literals are included in the same file as the code. Setting aside embedded platforms and their idiosyncrasies, they are stored in different sections of the file and loaded into different memory locations at run-time.
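
This is easy to verify yourself with a toy translation unit and standard binutils tools (my own example, not the parent's):

```c
/* sections.c -- where do these end up in the object file?
   Build with:   cc -c sections.c
   Inspect with: objdump -h sections.o   (or readelf -S sections.o) */

const char *greeting = "hello";  /* the literal "hello" typically lands in
                                    .rodata; the pointer itself is data   */
int counter = 42;                /* initialized data    -> .data          */
int scratch;                     /* uninitialized data  -> .bss
                                    (occupies no space in the file)       */

int add(int a, int b)            /* machine code        -> .text          */
{
    return a + b;
}
```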

&gt; Where do they go? the same segment where the .text section of the object file gets dumped, which has Read and Exec permissions, but not Write

Your information is 20 to 30 years out of date. No modern OS uses segments the way Intel envisioned when they designed the 8086 or the 80386. The former needed segments to escape pointer size limitations when the address bus outgrew the registers, and the latter retained them for backward compatibility and added memory protection features on a per-segment basis. However, modern OSes for the 80386 and its descendants implement memory protection at the page table level rather than at the segment level and use segment registers purely as conveniences for array operations, which is why they were left out when the 80386 architecture was expanded to 64 bits.

&gt; But go ahead c++ pleb continue **** your *** like your kind always does.

I understand that it can be hard to admit that you are wrong, but that's not a reason to resort to that kind of language.

u/jeremiahs_bullfrog · 1 pointr/linux

Well, it is copyrighted by Kirk McKusick, who was a core contributor in the early days of FreeBSD, and he has a restriction that it only be used tastefully (so there's some subjectivity to it). I don't know if he still works on it as a core contributor, but he did recently release v2 of The Design and Implementation of the FreeBSD Operating System, so he's involved in some capacity.

I'm not sure of the legal restrictions on the new FreeBSD logo, Beastie, or Tux, so you may very well be right that they don't need to be defended.

u/xiongchiamiov · 1 pointr/cscareerquestions

Fairly frequently while I was in college; I built up about a shelf's worth then.

I find that I go to books far less frequently now. Part of it is that I'm much more invested in the current technologies I'm using. Another part is that many books are of the "learn X language!" type, and I know enough general programming that those aren't as useful.

The things I still buy are things like The Mythical Man-Month, Scalable Internet Architectures, and Design for Hackers - that is, less reference and more for reading to get ideas from.

u/ss0317 · 1 pointr/ECE

I wouldn't exactly say the programming is easy... there are a lot of new ideas and vocabulary to become familiar with.

I am fairly new in microcontrollers and am still confused about a lot of things. Writing bootloaders/brickloaders, watchdog timers, configuring fuse bits, handling interrupts, prescalers, timers, adjusting pwm frequencies, i2c/spi/uart/1-wire/usb, ethernet, wifi... the list goes on...

Not to mention the techniques for optimization/memory handling/reduction of power consumption...

There are a lot of concepts related to hardware programming that you just won't encounter when, say, writing console applications.

With that being said, I haven't found a complete tutorial series on youtube, but Human Hard Drive has a decent intro to AVR programming and I found this book to be a helpful introduction to the topic.
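
To give a flavor of one item from that list, here's a hypothetical watchdog pattern; the register names and addresses are invented, so check your part's datasheet for the real ones:

```c
#include <stdint.h>

/* Hypothetical watchdog registers -- names and addresses are
   invented for illustration; a real part's datasheet defines them. */
#define WDT_CTRL  (*(volatile uint32_t *)0x40001000u)
#define WDT_FEED  (*(volatile uint32_t *)0x40001004u)
#define WDT_KEY   0xA5u

void main_loop(void)
{
    WDT_CTRL = 1u;                /* enable the watchdog */
    for (;;) {
        /* do_work(); */
        WDT_FEED = WDT_KEY;       /* "kick" it each pass; if the loop
                                     ever hangs, the timer expires and
                                     the chip resets itself */
    }
}
```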

u/OSHWjunkie · 1 pointr/ECE

I would suggest also picking up this one
http://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149/ref=sr_1_1?ie=UTF8&qid=1417370827&sr=8-1&keywords=making+embedded+systems

As for AVR versus PIC, really doesn't matter. I like to use PICs just because I prefer the way their documentation is handled and they seem to have more specialized silicon for specific projects.

HOWEVER, I would play the devil's advocate and say neither. Start looking at development boards made by TI or Renesas. They ship more units, and if you can get an embedded job you'll more likely be working with those toolchains.

u/StrskyNHutch · 1 pointr/ECE

The research project I was working on was specifically geared towards reducing TLB misses by restructuring the cache, but over the course of being prepared for this, I read this book front to back. I learned a lot about how a microprocessor is designed and how its pipelined stages work together, and I wanted to know if that knowledge could translate to a hardware-related field. I know I'm probably not qualified to work in this field full time, but I was hoping I could at least get my foot in the door with an internship.

u/DJ027X · 1 pointr/comparch

A quick search found http://www.amazon.com/Modern-Processor-Design-Fundamentals-Superscalar/dp/1478607831 as well as http://www.ebay.com/itm/like/111520297055?lpid=82
I have no idea what their contents are though, or if they cover what you are looking for

u/farmvilleduck · 1 pointr/electronics

For understanding how computers work, there's a good book by the co-founder of Ars Technica.

http://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593271042

u/PinPinIre · 1 pointr/learnprogramming

It largely depends on which Computer Science degree you are going to do. There can be some that focus heavily on software and very little on hardware, and some that strike a nice balance between the two. If the degree is going to focus on hardware I would recommend reading up on the underlying logic of a computer and then reading this book (Inside the Machine). ITM isn't a very technical book (I would label it the computer science equivalent of popular science), but it gives a nice clear overview of what happens in a processor.

When it comes to programming, I would recommend starting with Java and Eclipse. Java gets quite a bit of hate but for a newcomer, I think Java would be easier to grasp than the likes of C/C++. C/C++ are nice languages but a newcomer may find their error messages a little bit obscure and may get confused with the nitty-gritty nuances of the languages.

Though the one thing you should realise is that programming is a skill that isn't confined to one language. If you understand the basic concepts of recursion, arrays, classes, generics/templates, inheritance, etc. you can apply this knowledge to almost any language. Ideally I would recommend two books on programming (Algorithmics) and (Introduction to Algorithms). Algorithmics is another book I would label as the CS equivalent of popular science, but the early chapters give a nice overview of exactly what algorithms actually are. Introduction to Algorithms is a more technical book that I would recommend to someone once they know how to program and want a deeper understanding of algorithms.

The rest is personal preference, personally I prefer to use a Unix machine with Sublime Text 2 and the command line. Some will try to convince you to use Vim or Emacs but you should just find whichever you are most comfortable with.

u/Hantaile12 · 1 pointr/IWantToLearn

Assuming you’re a beginner, and are starting with little to no knowledge:

I bought the 3rd edition of the book called “Practical Electronics for Inventors” by Scherz and Monk it starts from the basics and you slowly build more and more complex and practical circuits.
https://www.amazon.com/dp/1259587541/ref=cm_sw_r_cp_api_eTs2BbXN9S1DN

Another fun on by Monk is “The Maker's Guide to the Zombie Apocalypse: Defend Your Base with Simple Circuits, Arduino, and Raspberry Pi”
https://www.amazon.com/dp/1593276672/ref=cm_sw_r_cp_api_XVs2BbYMVJT5N

If you are looking for something more theory based (I wouldn’t recommend initially unless you’re just curious) there’s a whole slew of texts books depending on what exactly you’re interested in you can pick up for cheap at a used book store or on amazon.

Remember to build slowly in the beginning until you get a good grasp on the content, and have fun. Diving in too deep too quickly can overwhelm and kill morale.

Happy learning!

u/lawanda123 · 1 pointr/india

Hi,

Can somebody recommend me good books/sources for the following-:

1. Advanced Design Patterns - OOP + Functional
2. Refactoring
3. Big data analytics and ML algorithms
4. Any fast-track course/refresher for JS + Angular (I'm looking for something that has finer details; I've done JS in the past but I've forgotten most of it)

Also, I've picked up some of Martin Fowler's books for now, but would like more perspective:

https://www.csie.ntu.edu.tw/~r95004/Refactoring_improving_the_design_of_existing_code.pdf

http://www.amazon.in/Enterprise-Application-Architecture-Addison-Wesley-Signature-ebook/dp/B008OHVDFM

Would highly recommend these for anyone interested.

u/Zidanet · 1 pointr/arduino

If you have no idea where to start, try starting with this book: http://www.amazon.co.uk/30-Arduino-Projects-Evil-Genius/dp/007174133X

It's quite cheap, designed for beginners, and has awesome starter projects. I'd highly recommend it.

I was in your position, really excited about the tech, but had no idea where to start! The book has some really cool projects and is very hands on. Each project has a parts list and a tools list so you can make sure you are ordering the right thing and know what you are doing. It got me going right away, and it's very easy to understand.

u/Shnazzyone · 1 pointr/randomactsofamazon

I must become even more of a madman. I will use this book (on my handyman wishlist)

Using the knowledge of Arduino I will build a giant fighting robot.

u/blahbot90 · 1 pointr/AskEngineers

Ok, so you're into space. Go into EE. That's probably the most useful advice I can give you. Seriously, spacecraft are pretty much electrical projects and without people in CS and EE, you couldn't do ANYTHING.

Buy an Arduino (they're super cheap) and start experimenting with it. What I highly suggest you do, and it is also what I do in my spare time to learn, is the following: build a satellite!

  1. Buy this starter book

  2. Purchase one arduino (Uno or Pro Mini would be my suggestion), 2 XBEE Radios (one for on the satellite, one for the computer), maybe a couple of sensors (accelerometer, altimeter, GPS, whatever you want).

  3. Now start programming and get your computer to send signals to the arduino, or the other way around.

    It's a very long project, especially for a beginner, and it can be expensive (~$200-300), but if you do it right, it's a ton of fun, and I have something that can take measurements and send them a mile out to my computer. But for me, it's super awesome.

    I also built a rocket, and launched that sucker up, and watched it come down. Had the satellite transmit data, plotted it on Google Earth and had a 3D graph of the trajectory. Its such a great feeling to have something like this succeed.

    Edit: Forget the math/science reading side; you'll get that regardless of where you go, and I would highly recommend practical experience over reading about subjects that will likely just fly over your head. When I was 15, theory was the first thing that would make me fall asleep; actually building something with my hands would keep me up till the sun rose.
u/DuckGod · 1 pointr/norge

Arduino is a very good starting point in my opinion, if you don't find it overwhelming. There is admittedly a bit of electronics and code to learn before you can do cool projects, but it's really not that bad as long as you don't bite off too much at once. One advantage of Arduino is that you use a language very similar to C++, so the jump from Arduino to C++ is not that big.

I'm somewhat less sure about textbooks, but there are a number of hobby books with a set of projects in them that could serve as a good basis, such as this one or this one.

If he's more interested in coding than necessarily in building gadgets, I would instead point him towards Python. Code Academy has a perfectly decent (free) beginner course.

u/NLJeroen · 1 pointr/embedded

Fellow Embedded Engineer here.
You learn this from books: The Definitive Guide to ARM® Cortex®-M3 and Cortex®-M4 Processors.
And just RTFM of course: Cortex M4 technical reference manual.

And of course the chip vendors documentation, since there will be some implementation defined stuff (eg: which memory bank stuff will boot to).

Don't forget the compiler and linker documentation. Lots of stuff is there; just scrolling through once gives you so much more understanding, and an idea of what you might find there if you're solving some problem later on. Like the special instructions, and the compiler flags and conditions for it to actually use the FPU.

If you're going to try this bare metal assembly programming, I'd recommend the Cortex M0, since an M4 often comes in a complex chip.

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my book shelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Computer Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/genjipress · 1 pointr/Python

Further notes, here's some of the books I've been looking at:

Modern Compiler Implementation (there are Java, C, and ML editions; this is the C one)

https://www.amazon.com/Modern-Compiler-Implementation-Andrew-Appel/dp/052158390X

Design Concepts in Programming Languages

https://www.amazon.com/Design-Concepts-Programming-Languages-Press/dp/0262201755

Engineering A Compiler

https://www.amazon.com/Engineering-Compiler-Keith-Cooper/dp/012088478X

Programming Language Pragmatics

https://www.amazon.com/Programming-Language-Pragmatics-Michael-Scott/dp/0124104096

u/mrdrozdov · 1 pointr/cscareerquestions

I like reviewing the curriculum for the fundamental courses in undergrad/grad CS programs. One way to get ahead is to become familiar with the roots of programming language theory. I found the book Programming Language Pragmatics helpful, and it goes well with this course's curriculum although I am sure there are others. Alternatively, try building your own language/compiler using yacc and lex.

u/brucehoult · 1 pointr/ComputerEngineering

Welcome!

You need two books:

https://www.amazon.com/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757

Get the original MIPS or later ARM version if you prefer -- they're absolutely fine, and the principles you learn on one apply to everything -- but the RISC-V one is the newest and is the only one that you're actually legally allowed to make an implementation of at home and distribute, put on github etc.

But of course designing and making your own 16 bit ISA is huge fun, so I definitely recommend that too!

Once you've digested all that, their other book is more advanced. But the first one will get you a long way. This next one is the absolute bible of real computer architects and hardware designers.

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

That's by these guys, who originally invented the RISC-I and MIPS processors in the early 80s, invented the term "RISC" (and also RAID, btw). They recently received the Turing award for their lifetime efforts:

https://www.youtube.com/watch?v=3LVeEjsn8Ts

Join comp.arch on usenet / google groups. There are lots of actual working or retired computer architects there, and they're helpful to energetic students and amateurs designing their own toy stuff.

u/throwdemawaaay · 1 pointr/AskComputerScience

https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

After that:

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

These authors are the foremost authorities in the field. The second book is *the* textbook for computer architecture. These are the people that invented RISC.

u/exp11235 · 1 pointr/buildapc

The formal name for this field is "computer architecture." The most popular textbook by far is Patterson and Hennessy, and it's pretty easy to find materials from college courses posted online, e.g. MIT OpenCourseWare, UC Berkeley.

For something less likely to put you to sleep, Ben Eater has a great Youtube channel where he explains computer architecture from a practical angle. He's got a great series where he builds a simple 8-bit computer from scratch, explaining all the pieces along the way.

u/tkphd · 1 pointr/HPC

What are you trying to do with it? Programming Massively Parallel Processors was useful to me, but without more info, it's hard to make recommendations.

u/biglambda · 1 pointr/gpgpu

I started with this book. I think it's mainly CUDA-focused, but switching to OpenCL was not that hard.

u/mrchowmein · 1 pointr/csMajors
u/saitt04 · 1 pointr/compsci

[This one?](http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210)

I just started my computer architecture class and this is one of the books they recommended. I think I'll try to get this one if it is that good.

u/cschaef66 · 1 pointr/learnprogramming

Structured Computer Organization by Andrew Tanenbaum:
https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0131485210
One of the best books I've read.

u/smith7018 · 1 pointr/jailbreak

It's hard to say because that's a more advanced area of computer science called computer architecture, and I learned it in a college setting. With that being said, I've heard good things about this textbook. Hopefully whichever book you pick up on ARM assembly will have a few chapters going over how the processor functions.

Good luck! :)

u/oldsecondhand · 1 pointr/technology

I'd check out these two books from the local library and read the first 2-3 chapters. They might contain more than what you need, but they are pretty well-written books and don't assume a lot of previous knowledge.

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210

http://www.amazon.com/Computer-Networks-5th-Andrew-Tanenbaum/dp/0132126958/ref=la_B000AQ1UBW_sp-atf_title_1_1?s=books&ie=UTF8&qid=1376126566&sr=1-1

Or you could just check out your network settings and search for the terms that you encounter (IP address, DNS, DHCP, gateway, proxy, router, firewall)

u/zendruid · 1 pointr/learnprogramming

I study in South Africa and we did this book in our first year, with Chapters 1-2 in first semester and 3-4 in the second. It had very little on algorithms, we did basic recursion and for the most part that was it.

We then did this book in the 3rd semester, which covered more algorithms in detail (DFS/BFS/MergeSort/QuickSort etc). And I'm now in the 4th semester where we have a ton of different books including this, and this.

u/squaganaga · 1 pointr/ECE

I haven't yet designed boards with EMC and RF in mind. I've seen recommendations for the high-speed digital design books thrown around, though.

u/VectorPotential · 1 pointr/AskElectronics

What are you trying to measure? Your pulse rise time will do some funny things to your results if you're not careful.

If you can obtain a copy of High Speed Digital Design, the author describes several test jigs for such tests.

u/somekindofdevil · 1 pointr/AskElectronics

Almost every PCB/EDA package does length matching automatically, so you don't need to worry about that. If you want to know how the software does it, it's more of a mathematical problem. I think they use parametric curves like Beziers. You can calculate the length of a Bezier curve easily, so you can match them.

https://en.wikipedia.org/wiki/B%C3%A9zier_curve
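
There is no closed-form arc length for a cubic Bezier, so tools approximate it numerically; a minimal chord-summing sketch (my own illustration of the idea, not any particular EDA package's code):

```c
#include <math.h>

typedef struct { double x, y; } pt;

/* Evaluate a cubic Bezier at parameter t in [0,1]. */
static pt bezier(pt p0, pt p1, pt p2, pt p3, double t)
{
    double u = 1.0 - t;
    pt r;
    r.x = u*u*u*p0.x + 3*u*u*t*p1.x + 3*u*t*t*p2.x + t*t*t*p3.x;
    r.y = u*u*u*p0.y + 3*u*u*t*p1.y + 3*u*t*t*p2.y + t*t*t*p3.y;
    return r;
}

/* Approximate arc length by summing many short chords; more
   steps gives a tighter (always slightly low) estimate. */
double bezier_length(pt p0, pt p1, pt p2, pt p3, int steps)
{
    double len = 0.0;
    pt prev = p0;
    for (int i = 1; i <= steps; i++) {
        pt cur = bezier(p0, p1, p2, p3, (double)i / steps);
        len += hypot(cur.x - prev.x, cur.y - prev.y);
        prev = cur;
    }
    return len;
}
```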

If you want to know more about high-speed PCB design, I recommend this book.
https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/Beggar876 · 1 pointr/AskElectronics

Find/download/buy this book: High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241

Scan it cover to cover. It will pay for itself the first time you save a PCB re-spin because of something you saw in it. It has for me.

u/kevlarcoated · 1 pointr/PrintedCircuitBoard

The books referenced most often by presenters at PCB design conferences are
Right the First Time by Lee Ritchey http://www.thehighspeeddesignbook.com/
High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241

Note that A Handbook of Black Magic reads like a textbook; it is very long and very boring.
The subject of PCB design is complicated and requires an in-depth understanding of the physics, because just knowing the rules isn't enough to convince other engineers that something is the right way to do it. More importantly, in my experience PCB design is always the least bad solution: you have to understand when you can break the rules, what the implications will be, and whether the trade-off is acceptable.

u/reddit_user33 · 1 pointr/Electrical_Engineers

Different applications require different rules to abide by to a certain extent.

High-Speed Digital Design by Howard Johnson has given me such great insight into the world of high-speed electronics.

u/ElectricWraith · 1 pointr/AskEngineers

Control Systems Engineering, 6th Ed, Nise

Modern Control Systems, 12th Ed, Dorf &amp; Bishop

Automatic Control Systems, 9th Ed, Golnaraghi &amp; Kuo

Control Systems Design: An Introduction To State-Space Methods

Control Handbook, 2nd Ed

Those are some that I have. The Nise book is excellent, the Dorf book is as well, it was my primary text for Controls I &amp; II, supplemented by the Kuo book. The latter has more on digital controls. All of those three focus primarily on classical control theory and methods, but the Nise book goes into more depth on modern methods. I got the state-space methods book because it's more focused. The Control Handbook is a beastly collection, but it's very broad, hence not possessed of much depth. It's more of a reference than a text.

If you want to dive deeply into PID control, look no further than Åström and Hägglund's works on the subject; it doesn't get much better.

Source: I'm a degreed EE that specialized in control systems and a licensed control systems PE.

u/carbacca · 1 pointr/engineering

This is just about the only book that was prescribed in my mechatronics programme:

http://www.amazon.com/Modern-Control-Systems-12th-Edition/dp/0136024580

Can't say I actually used it apart from some last-minute cramming and highlighter abuse just before the exam - most of what I know and do just came from real work experience.

u/bytewarrior · 1 pointr/AskEngineers

You are looking at an area called Control System Engineering. If you are familiar with the Laplace transform I strongly recommend reading through this book.

http://www.amazon.ca/Modern-Control-Systems-12th-Edition/dp/0136024580

Even if you do not understand the Laplace transform this book covers the material initially using traditional Differential Equations. You can get a copy online through resourceful means.

u/Sogeking89 · 1 pointr/AskEngineers

Hmmm, well, there are a lot of books that could be recommended depending on how you want your guitar tuner to work and what sort of methods you will be using to model your system as well as control it. Do you want books on signal processing as well? Do you want discrete control? State space? Or just a book that will cover most bases? Either way, I have put down a couple of basic texts that could help.

http://www.amazon.co.uk/Modern-Control-Systems-Richard-Dorf/dp/0136024580

http://www.amazon.co.uk/Modern-Control-Engineering-International-Version/dp/0137133375/ref=sr_1_sc_1?s=books&ie=UTF8&qid=1382300545&sr=1-1-spell&keywords=control+ogatta

u/Ayakalam · 1 pointr/DSP

Hands down, no question, I would recommend Richard Lyons' book FIRST.

u/kwaddle · 1 pointr/DSP

I think The Scientist and Engineer's Guide to Digital Signal Processing and Understanding Digital Signal Processing are generally considered the most accessible introductions. I've gotten more mileage out of Understanding DSP; I feel like it goes into a little more detail and really works to walk you through concepts, step by step.

http://www.dspguide.com/pdfbook.htm

https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419


Aside from searching out good learning resources, IMO nothing is more helpful for learning than setting up your environment with Matlab, Jupyter notebooks, or whatever you're going to use, and getting comfortable with the tools you'll be using to explore these topics.

u/fr3nch13 · 1 pointr/FPGA

Although not specifically targeting FPGAs, “Understanding DSP” by Richard Lyons is very good. Very readable.

https://www.amazon.com/dp/0137027419/ref=cm_sw_r_cp_awdb_t1_FQ4ZCbSRHV7QQ

u/necr0tik · 1 pointr/amateurradio

Thanks for the great reply!

The Lessons In Electric Circuits series was already on my radar, and I believe it will be the first electronics resource I go through after having it beaten into my head yet again!

That DSP book I have not seen. I just grabbed a copy and it looks like a great text. I mentioned this post to a fellow electronics enthusiast and he loaned me a copy of a book he said was exceptional for entry into the world of DSP: http://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419/ DSP is pretty complex, More than likely I will go through both to fully absorb this topic.

EMRFD sounds like a cookbook. Given that it's by ARRL I expect its quality to be superb. I am not against this type of text, I have a few already; however, I'd rather have more of the theory at this point. I imagine this will be great once I am satisfied with the basics, and want to build an actual radio with its operation noted.

u/khafra · 1 pointr/DebateReligion

Much of your thinking seems to be based on a confusion of levels. If you knew more specifically how the firing together of neurons strengthens the probability they'll fire together in the future, or if you'd examined a program simulating physics, you wouldn't be using confusion as building blocks for arguments.

For instance, you would not be as confused right here if you were a systems developer instead of a philosopher; one read-through of the Dragon Book would clear everything right up. I'll try to summarize, but please understand this is not rigorous:

Your mind is running the algorithm "Step 1: Move to front of house. Step 2: Move to back of house. Step 3: Go to Step 1." Your mind is something your brain does. Your brain is implemented on physics. Exactly like the boulder.

The most legitimate question related to this post is that of substrate. Note: I do not agree with everything in this essay, but it presents the problem better than writings on "dust theory" (unless you're willing to read the whole Greg Egan novel Permutation City).

u/name_censored_ · 1 pointr/learnprogramming

&gt; Do you know of a book or a website that teach useful optimization techniques?

I'm only an enthusiast, I've never needed really optimised code (truth be told, most of what I do day to day is quick-and-dirty, appallingly inefficient scripts, because it "needs to be done yesterday"), so I can't give you a canonical list, but here's what I do know;

For books, there's this /r/compsci reddit thread from a while ago. Something on compilers like The Dragon Book might be your best bet, especially the optimisation chapter. And obviously jotux's "How Computers Do Maths" - though never having even flicked through it, I can't say if it's any good.

You could try your luck in /r/ReverseEngineering (or the quieter /r/asm and /r/compilers), there are a lot of low-level guys there who'd know a lot more than me. You could also try /r/compsci or /r/algorithms, although they'd be more useful for algorithms than for optimisation. And of course, /r/quantfinance.

u/Zonr_0 · 1 pointr/news

Really? Unless you're in a specific special topics course, the principles of most computer science haven't changed much. This book was the assigned reading for my compilers course and it was written in the mid 80s. Similarly, the core algorithms and data structures in the standard CS education haven't changed much (except for a move away from teaching bubble sort as an intro sort, toward insertion sort).

But maybe that was just my education.

u/alanwj · 1 pointr/learnprogramming

It sort of depends on what you are trying to do.

I can't really tell from the name what that book is going to cover, but I expect that most books on programming language theory are going to start with things like lambda calculus, and go into type theory, etc, etc. If you are trying to learn the theoretical underpinnings of programming languages then this is great!

However, in my opinion a more practical place to start is with learning how to implement a programming language. That is, how to write a compiler. For that there is a whole separate set of theory (regular expressions, grammars, automata, etc) that you need to learn. The standard text for this is "the dragon book".

u/vplatt · 1 pointr/java

I have this as well, but don't really have any remarks for you. That said, maybe you should look through some of the reviews for it on Amazon or the like. The reviews there seem pretty authentic.

https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

u/SamHennessy · 1 pointr/PHP

When I said "I can see how maybe they could be useless to you.", that's because I instantly know what kind of programmer you were. You're a low level guy.

I have a copy of "Algorithms in a Nutshell" (http://www.amazon.com/Algorithms-Nutshell-In-OReilly/dp/059651624X) but I never finished it. My favorit programming book may be "Patterns of Enterprise Application Architecture" (http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420). Neither of these books are language specific, but I don't think they could be further apart in every way. Both are very valuable and I appreciate that they both exist.

There are a good number of reasons that you should maximize your use of the built-in PHP functions (http://webandphp.com/5reasonstomaximizeyouruseofPHP%E2%80%99sbuiltinfeatures). My book is an attempt to come up with a system that will help you learn all of the built-in PHP functions by giving a realistic use case that could be applied in your everyday work.

Being a PHP programmer, it is much more useful to know what functions PHP has for array sorting, than it is to know how to implement array sorting in PHP code.

u/xnoise · 1 pointr/PHP

There are a ton of books, but I guess the main question is: what are you interested in? Concepts or examples? Many strong conceptual books use examples from Java, C++ and other languages; very few of them use PHP. If you have the ability to comprehend other languages, then:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 definitely a must-read. Beware not to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in that book.

http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 - has already been mentioned, is related directly to the above mentioned one, so should be easier to grasp.

http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 - one of the most amazing books I read some time ago. Needs a lot of time and good prior knowledge.

http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 - another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.

u/bitcycle · 1 pointr/Database
  1. Use a relational data store first: MySQL/PostgreSQL/M$ SQL Server.
  2. Put CRUD operations behind a web service.
  3. Instead of arbitrary key/value pairs, try to get as specific as possible about the data (instead of storing everything as strings). Being more specific will help things (usually) to be more performant.
  4. Build the application on top of the service.
  5. Scale to the # of users you want to be able to support
  6. At this point, if you need to move part of the data into a non-relational data store, you can.
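
As a rough Java sketch of points 2-4 (all names here are invented for illustration; a real service would sit behind HTTP):

    import java.time.LocalDate;
    import java.util.HashMap;
    import java.util.Map;

    // Point 3: specific, typed fields instead of arbitrary string key/value pairs.
    record User(long id, String name, LocalDate signupDate) {}

    // Point 2: the application talks to this interface, never to the database directly.
    interface UserService {
        User create(User user);
        User read(long id);
        void delete(long id);
    }

    // Toy in-memory implementation; a real one would sit on MySQL/PostgreSQL (point 1),
    // and swapping it for a non-relational store later (point 6) would not touch callers.
    class InMemoryUserService implements UserService {
        private final Map<Long, User> store = new HashMap<>();
        public User create(User u) { store.put(u.id(), u); return u; }
        public User read(long id) { return store.get(id); }
        public void delete(long id) { store.remove(id); }
    }

Because the application only ever sees UserService, step 6 (moving part of the data elsewhere) never touches application code.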

    I also recommend reading this book: Patterns of Enterprise Application Architecture
u/celtric · 1 pointr/PHP

You can read books with code examples written in Java or C#, the syntax is very similar and the principles apply in the same way (PoEAA comes to mind).

u/postmodern · 1 pointr/programming
  • ActiveRecord is a pattern according to Patterns of Enterprise Application Architecture. It's definitely the easiest pattern to implement, and thus the most popular amongst ORMs; hence the common misconception that ORM == ActiveRecord.
  • Your own description of the DataMapper pattern implies that DataMapper does in fact differ from ActiveRecord. :) ActiveRecord has no concept of mapping, nor any separation between the schema representation and the model; ActiveRecord simply instantiates models from rows.
  • Note: I am not quoting Martin Fowler as an Appeal to Authority, but simply because Martin Fowler wrote Patterns of Enterprise Application Architecture (PoEAA), in which the ActiveRecord and DataMapper patterns are explained. :) See also the Wikipedia entry for ActiveRecord, which lists Martin Fowler as the creator.
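
A toy Java sketch of the distinction (all names invented; PoEAA's own examples differ):

    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Stand-in for a real connection; invented purely for this sketch.
    class Database {
        static void execute(String sql, Object... args) { /* imagine JDBC here */ }
    }

    // ActiveRecord: the model knows its own table and saves itself.
    class ActiveRecordUser {
        long id;
        String name;

        void save() {
            Database.execute("UPDATE users SET name = ? WHERE id = ?", name, id);
        }
    }

    // DataMapper: the model is plain and schema-ignorant...
    class User {
        long id;
        String name;
    }

    // ...while a separate mapper owns the row-to-object translation.
    class UserMapper {
        User fromRow(ResultSet row) throws SQLException {
            User u = new User();
            u.id = row.getLong("id");
            u.name = row.getString("name");
            return u;
        }
    }

In the ActiveRecord version the model carries its own persistence; in the DataMapper version the model never sees the schema.
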
u/cythrawll · 1 pointr/PHP

Honestly, I haven't read a "PHP book" in ages, so I am a very bad source for critique; the majority of the ones I have come across are painfully outdated and some outright inaccurate. My suggestion for learning PHP programming better is to try books about programming in general. Books on object orientation and patterns, like GoF (http://en.wikipedia.org/wiki/Design_Patterns) or PoEAA (http://www.amazon.com/Enterprise-Application-Architecture-Addison-Wesley-Signature/dp/0321127420), are great for learning object-oriented principles.

Those will help somewhat. But what really helped me become a better PHP programmer was studying other languages, then studying their web frameworks, then taking what I learned back to PHP. Find out why one aspect is common in language X's frameworks but not in PHP frameworks, how other languages' frameworks solve the dependency issue, and so on. Some languages I suggest learning are the other mainstream web-programming languages: Python, for its typing and how it compares to PHP; Ruby, for how mixins relate to PHP traits; and Java, since PHP borrowed quite a few aspects of its OO design from it.

u/ndanger · 1 pointr/compsci

Upvote for Domain-Driven Design, it's a great book. Depending on the size of the system, Martin Fowler's PoEAA might also be helpful.

Also what dethswatch said: what's the audience & scope; i.e. what's in the previous document? If you're presenting three architectures you probably need enough detail that people can choose between them. That means knowing how well each will address the goals, some estimate on implementation effort (time & cost), limitations, future-proofing, etc.

Finally, IMHO, this really isn't computer science. You might have better luck asking in /r/programming/ or the new r/SWArchitecture/

u/mrferos · 1 pointr/PHP

When you get to a project larger than a small-business or personal website, it's less about the language and more about imposing a structure that makes sense, backed by a data model that is well thought out and performant.

I recommend these books:

http://www.amazon.com/High-Performance-MySQL-Optimization-Replication/dp/1449314287/ref=sr_1_1?ie=UTF8&qid=1425831227&sr=8-1&keywords=high+performance+mysql
http://www.amazon.com/gp/product/0321127420/ref=oh_aui_detailpage_o05_s00?ie=UTF8&amp;amp;psc=1

u/jasonlotito · 1 pointr/PHP

First, ignore the code examples on the page. They are fairly bad. The reason you don't grok OOP is because of examples like these. What you want is a good solid book.

Patterns of Enterprise Application Architecture

PoEAA sounds imposing, but it's really good. I highly recommend it.

I also recently wrote up an article on data mapping that you might be interested in reading; if you have any questions, you can ask. For me, OOP finally clicked when I was reading an article on observers and how that pattern works. I'm still learning. I'm only now getting over my need for getters and setters (you don't need them, they are bad, stop using them).
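
For reference, here is the shape of the observer pattern that made it click, as a minimal Java sketch with invented names:

    import java.util.ArrayList;
    import java.util.List;

    interface Observer {
        void updated(String event);
    }

    class Subject {
        private final List<Observer> observers = new ArrayList<>();

        void subscribe(Observer o) { observers.add(o); }

        void publish(String event) {
            // The subject walks its list blindly; it never knows the concrete types.
            for (Observer o : observers) {
                o.updated(event);
            }
        }
    }

    class Demo {
        public static void main(String[] args) {
            Subject subject = new Subject();
            subject.subscribe(event -> System.out.println("logger saw: " + event));
            subject.subscribe(event -> System.out.println("mailer saw: " + event));
            subject.publish("order placed");
        }
    }

The point is that Subject publishes through an interface and never knows which concrete objects are listening; that separation is what objects buy you over namespaced functions.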

But yeah, the reason most OOP articles are bad is because most people using objects are merely using them as namespaces for functions and variables. If you have any questions, let me know.

u/pjmlp · 1 pointr/programming

> Those are interpreted (JIT - just in time compiled)

You are contradicting yourself.

Plus there are also native compilers for Java and .NET languages.

Just a few examples:

  • OS/400 compiles to native code at installation time

  • Android ART compiles to native code at installation time

  • Websphere Real Time JVM has an AOT compiler

  • Aonix JVM has an AOT compiler

  • Mono -aot, NGEN and .NET Native are all compilers generating native code

    Given that C and C++:

  • have the CINT interpreter

  • on OS/400 they compile to bytecode

  • on Windows CE they could be compiled to bytecode

  • were compiled to bytecode with the TenDRA compilers

    Does that make C and C++ interpreted?

    Of course not, because as any compiler design student knows, language != implementation.

    Bytecodes are just a form of architecture-independent machine instructions; by no means do they require interpretation rather than further compilation to native code.

    Since the Summer is still not over, here is some reading:

    Compilers: Principles, Techniques, and Tools (2nd Edition)
u/OhYourFuckingGod · 1 pointr/funny

She's trying to hide her disappointment. I'm pretty sure this is the one she wanted.

u/cmtedouglas · 1 pointr/learnprogramming

Well, the common source out there that I can't avoid recommending is the dragon book:

http://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

u/johannes1971 · 1 pointr/cpp

European dragons have a heritage that stretches back to at least the time of the Greek civilisation; calling them "pale imitations" does them a grave disservice. The oldest known sources for dragon myths are from the Middle East, not the Far East.

If you feel like arguing this point, please don't use the Dresden Files. Just stick with authoritative sources instead, OK?

u/blahdom · 1 pointr/learnpython

No problem. Good luck finding a class, compilers are a really fun subject!

If you are just generally interested (I don't know your experience level), the dragon book is still highly regarded and might be a good entryway into the theory of it all.

u/hou32hou · 1 pointr/ProgrammingLanguages

I suggest you to read the Dragon Book.

u/dnew · 1 pointr/worldnews

&gt; Is this a realistic goal

Yes, quite. The bits you are going to be missing are some of the mathematical underpinnings. Depending on what you're programming, you'll also want to grab books on the particular topic at hand that don't try to teach you programming at the same time.

For example, if you want to learn why C# is object-oriented and what that means and how to use it, grab a copy of this book: http://en.wikipedia.org/wiki/Object-Oriented_Software_Construction

If you want to learn how relational databases work, read this one http://www.amazon.com/Introduction-Database-Systems-8th-Edition/dp/0321197844 (You can easily find online versions, but I didn't investigate whether they were legally released or not.)

You want to write a compiler? Grab the "dragon book": http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

None of those teach you how to program. They teach you the math and background behind major inventions in programming. Keep up with those, find a local mentor who enjoys talking about this stuff, and you'll do fine.

u/kmafb · 0 pointsr/IAmA

For lexing and parsing, you should just pick up a compiler book. You could bang your head against it your whole life without figuring it out, and the Right Answer is not that hard if you have someone to show it to you. There are lots of good ones; the classic is the "dragon book" (http://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886).

Beyond that, VMs are a big topic. They include all of compilers, and almost all of systems programming. The Smith and Nair book (http://www.amazon.com/Virtual-Machines-Versatile-Platforms-Architecture/dp/1558609105) is a great jumping off point. But so is playing around with a project that means something to you. It depends what you find more rewarding.
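
To give a taste of the VM side, here is a toy stack-machine interpreter in Java. The three opcodes are invented for illustration; real VMs differ mainly in scale, not in the shape of this dispatch loop:

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class TinyVM {
        static final int PUSH = 0, ADD = 1, PRINT = 2;

        public static void run(int[] code) {
            Deque<Integer> stack = new ArrayDeque<>();
            int pc = 0; // program counter
            while (pc < code.length) {
                switch (code[pc++]) {
                    case PUSH -> stack.push(code[pc++]); // operand follows the opcode
                    case ADD -> stack.push(stack.pop() + stack.pop());
                    case PRINT -> System.out.println(stack.pop());
                }
            }
        }

        public static void main(String[] args) {
            // Equivalent to: print(2 + 40)
            run(new int[]{PUSH, 2, PUSH, 40, ADD, PRINT});
        }
    }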

u/levu-webworks · 0 pointsr/learnprogramming
  • The "Red Dragon Book of Compiler Design"
  • Compiler Design in C

    I've read both books; the latter sits on my bookshelf (it was a gift from my girlfriend). Please don't waste your time trying to implement a compiler: it's a PhD-level endeavor that will take years of dedicated 60-hour work weeks.

    Here are the same links linked from my Amazon affiliates account:

  • The Red Dragon Book of Compiler Design
  • Compiler Design in C


    You are better off implementing an algebraic calculator using an LR parser. Start with Tom Torf's Programmer's Calculator (PCalc). It's written in C and pretty simple; you can fork it from my GitHub account if you have trouble finding Tom's source archive. Tom (may he rest in peace) also wrote several programming tutorials and contributed to comp.lang.c, alt.lang.c, and the comp.lang.c FAQ.
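
For a flavor of what such a calculator involves, here is a tiny expression evaluator in Java. Note that it uses recursive descent rather than LR parsing, and it is entirely my own sketch, unrelated to PCalc:

    // Grammar: expr   -> term (('+'|'-') term)*
    //          term   -> factor (('*'|'/') factor)*
    //          factor -> NUMBER | '(' expr ')'
    public class Calc {
        private final String src;
        private int pos;

        Calc(String expression) { this.src = expression.replace(" ", ""); }

        double expr() {
            double v = term();
            while (pos < src.length() && (src.charAt(pos) == '+' || src.charAt(pos) == '-')) {
                v = (src.charAt(pos++) == '+') ? v + term() : v - term();
            }
            return v;
        }

        double term() {
            double v = factor();
            while (pos < src.length() && (src.charAt(pos) == '*' || src.charAt(pos) == '/')) {
                v = (src.charAt(pos++) == '*') ? v * factor() : v / factor();
            }
            return v;
        }

        double factor() {
            if (src.charAt(pos) == '(') {
                pos++;              // consume '('
                double v = expr();
                pos++;              // consume ')'
                return v;
            }
            int start = pos;
            while (pos < src.length()
                    && (Character.isDigit(src.charAt(pos)) || src.charAt(pos) == '.')) {
                pos++;
            }
            return Double.parseDouble(src.substring(start, pos));
        }

        public static void main(String[] args) {
            System.out.println(new Calc("2 * (3 + 4.5)").expr()); // prints 15.0
        }
    }
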
u/CodeShaman · 0 pointsr/java

Once you've mastered basic design patterns, this book will take you to a heightened state of consciousness.

u/henk53 · 0 pointsr/java

No, I'm not getting anything confused!

I started in the '70s with asm, worked my way through C, then C++, and lately Java. At some point during that time I implemented a Pascal compiler, and I've read the dragon book cover to cover, more than once.

I think I've got a pretty firm grasp of what call by reference is, and let me tell you: Java DOES NOT have call by reference, ONLY call by value.

If there's anyone who's confused it's you. Sorry.

Things get a LOT easier mentally when you call it a pointer in Java. This is just terminology; even the Java designers knew it was a pointer (hence the name NullPointerException).

In Java, objects are only accessible via pointers. To be more consistent with C++, every variable in Java should actually be written as:

MyObject* obj = new MyObject();

But since there's no concept of a stack-allocated object in Java, you would always have to put that * there, and so they left it out.

And indeed, for pass-by-reference, a reference to whatever you have (a pointer in this case) has to be passed, so that's a double indirection. In C++ you can even express this explicitly, since there are not only call-by-reference semantics but also the address-of operator &. Applying the & operator to a pointer (e.g. foo*) yields a double pointer (foo**). This is what happens behind the scenes when you use call-by-reference in C++ and various other languages.

In Java, everything but primitives is a pointer, and that pointer is passed BY VALUE. In C++ terminology, there's only foo*; foo** does not exist. Hence, you can follow that pointer from within a method and thus modify the object being pointed to, but you CANNOT re-assign the caller's pointer, as I showed above.

Pass-by-reference semantics DEMAND that a pointer can be re-assigned.

Modifying what is being pointed to does not count. You are confusing this with modifying a stack-allocated object: if you pass such an object into a method and can modify it, then that's pass-by-reference, since the object itself is passed (no pointer involved). But in Java there is no stack-allocated object, only a pointer to an object, and that pointer is passed by value.
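
A minimal Java demonstration of the point (class names invented):

    class Box {
        int value;
        Box(int value) { this.value = value; }
    }

    public class PassByValueDemo {
        static void reassign(Box b) {
            b = new Box(99); // rebinds only the local copy of the pointer
        }

        static void mutate(Box b) {
            b.value = 99; // follows the pointer and changes the shared object
        }

        public static void main(String[] args) {
            Box box = new Box(1);
            reassign(box);
            System.out.println(box.value); // still 1: Java has no call-by-reference
            mutate(box);
            System.out.println(box.value); // 99: the pointed-to object was modified
        }
    }

If Java had call-by-reference, the first println would print 99.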

If you're still confused, please read the dragon book. Even if not for this argument, it will greatly enhance your understanding of how languages and computers work.

u/r2p42 · 0 pointsr/AskProgramming

I tried to come up with a simple explanation of how a compiler works, but I couldn't find a wording that is both simple and still correct. I guess there is a reason this book has 1000 pages: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811

The simplest explanation would be that someone wrote a program the computer understands which is able to read your language and convert it into something your computer can also understand.

u/eclectro · 0 pointsr/programming

> We are talking about a high-level language compiler, remember?

I still consider C a high-level language; some people don't, for various reasons.

> You were complaining that it compiles to C rather than emit instructions.

You simply took my post the wrong way. Nowhere am I complaining; I'm merely making an observation. It is not an unusual feat for a compiler to generate assembly instructions or machine code; nor would I call writing a compiler super difficult, but rather straightforward.

> If you are going to emit instructions, it's up to you to write your own optimizer.

Or buy/obtain a compiler that is already capable of doing that step.