Best microprocessor design books according to redditors

We found 74 Reddit comments discussing the best microprocessor design books. We ranked the 26 resulting products by number of redditors who mentioned them. Here are the top 20.

Top Reddit comments about Microprocessor Design:

u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/VK2DDS · 9 pointsr/DSP

+1 for Cortex-M (with FPUs). I'm building a guitar pedal with an STM32F407 and it handles 12x oversampled distortion and a bunch of biquads at 48kHz (mono). It is paired with a CS4272 audio codec with DMA handling the I2S data.

It won't handle any reasonable FIR filter and the RAM limits it to ~500ms delay. There is a discovery board with external RAM but I haven't tried using it.

The F7 series are clocked a bit faster and some come with a double-precision FPU instead of single, but they have the same instruction set as the F4s. The Cortex-M7 has a longer pipeline (6 vs 3 stages, probably to support the higher clock rate), so branching is probably less of a penalty on the M4.

This book is an excellent guide to the low-level guts of the Cortex-M3 & M4 chips and contains a chapter dedicated to DSP on the M4. Long story short: it contains a bunch of DSP instructions such as saturating integer arithmetic, integer SIMD, floating-point fused multiply-accumulate, etc., which make it semi-competitive against "true" DSP cores. The book compares the M4 against a SHARC DSP to show that there's a big jump between them, but the M4 wins hands down for ease of learning & development (strong community, free (GNU) tools, etc.).
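For a sense of what that workload looks like in code, a single direct form I biquad is just a handful of multiply-accumulates per sample. A minimal sketch in plain C (generic placeholder code, not from the pedal project):

```c
#include <stddef.h>

/* Direct form I biquad:
   y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2] */
typedef struct {
    float b0, b1, b2, a1, a2;   /* coefficients, a0 normalized to 1 */
    float x1, x2, y1, y2;       /* delay-line state */
} biquad_t;

static float biquad_process(biquad_t *f, float x)
{
    float y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
            - f->a1 * f->y1 - f->a2 * f->y2;
    f->x2 = f->x1; f->x1 = x;
    f->y2 = f->y1; f->y1 = y;
    return y;   /* maps well to the M4F's single-cycle FPU MAC instructions */
}

/* Filter one block of samples, e.g. a DMA half-buffer at 48 kHz. */
void filter_block(biquad_t *f, float *buf, size_t n)
{
    for (size_t i = 0; i < n; i++)
        buf[i] = biquad_process(f, buf[i]);
}
```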

Edit: If you want hardware, this audio codec can be paired with this STM32F7 kit, or this motherboard paired with this STM32F4Discovery board can take it as well.

u/maredsous10 · 7 pointsr/ECE

I'd check Amazon for older books you can get on the cheap. Schaum's outlines as mentioned are also good.

Books

u/EngrToday · 7 pointsr/CUDA

As far as I know this is the go-to for most people learning CUDA programming. For CUDA 9+ specific features, your best bet is probably looking at the programming guide on NVIDIA's site for the 9 or 10 release. I don't believe there's much in terms of published books on specific releases like there is for C++ standards.

u/asm2750 · 6 pointsr/FPGA

I don't know of a book for RISC-V creation in Verilog but there is a book on making a Z80 like processor in Verilog. https://www.amazon.com/Microprocessor-Design-Using-Verilog-HDL/dp/0963013351

u/elchief99 · 5 pointsr/raspberry_pi

Buy the Maker's Guide to the Zombie Apocalypse and make some of the contraptions in it. They use RPis

u/deaddodo · 4 pointsr/askscience

Strictly speaking, yes. However, even the most basic CPUs would be insanely slow, difficult to keep in sync, and would have fairly large footprints. The next best thing has been done, frequently in fact, as a way for EEs and hobbyists to test their mettle (much like building a CPU in Minecraft, LittleBigPlanet, etc). Here are a few examples: [1][2][3][4][5][6]

As you can see, almost unequivocally, integrated circuits are included in these projects. These are generally, at a minimum: 555 timers, transistor-transistor logic (TTL), 4000-series logic gates, etc.

u/yumology · 4 pointsr/arduino

Might want to check out The Maker's Guide to the Zombie Apocalypse. It has lots of home-defense things you can do with an Arduino.

u/mantra · 3 pointsr/ECE

Where/how do you figure this out?

By 1) getting into semiconductor processing and process design, 2) getting into device modeling, and 3) becoming an analog IC designer - and, of course, working in the semiconductor industry. In school you focus on upper division and graduate classes in these areas.

Generally you need all three to understand this area well. That's kind of how I fell into it. Leading-edge analog design quickly becomes limited by the specifics of your simulation models and your specific process implementation. Process and device parameters usually become factors in your analog circuit design, and you may even adjust the physical CAD layout to tweak them.

This is where SPICE models come in. New ones have kept getting added to CAD systems over the last 40 years because of some corner that isn't well modeled. The simplest models (like MOS 1->3 and Gummel-Poon) worked OK for very large devices 40 years ago when SPICE was invented, but process shrinks have created lots of nonidealities since (which is the nonideality: the device or the model? :-) ). Nature of the beast.
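For reference, the level-1 model mentioned above reduces the saturation-region drain current to the classic square law (the standard textbook form, not any particular foundry's extraction):

```latex
I_D = \frac{\mu C_{ox}}{2} \, \frac{W}{L} \, (V_{GS} - V_T)^2 \, (1 + \lambda V_{DS})
```

That is fine for the large devices of that era; short-channel effects (velocity saturation, DIBL, mobility degradation) violate nearly every assumption in it, which is what drove the succession of ever-larger compact models.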

The simple fact however is that you can never get a device model to actually cover all corners of operation equally well. Such a model doesn't exist and probably never will.

Instead, the reality is that you generally need highly specialized experts extracting parameters and often even creating new models, with the caveat that you always have to compromise on model-extraction accuracy to fit the particular application corner you are designing to.

So, for example, if you are doing high power, you'll optimize one of the standard models for that corner and sacrifice low-power accuracy, or vice versa. If you are doing RF/uW devices, you make a different set of compromises than you would if you were doing digital or LF linear. In 40 years it's never become turn-key and automated - the degrees of freedom in the models generally don't properly match those of reality. Too many or too few cause problems with the extraction.

There are other areas related to SPICE model extraction that are very similar with just a small change of emphasis.

These include parametric process measurement, which monitors each fabrication step using end-of-line analog testing of specialized test structures. This is more focused on manufacturing process control and device operational integrity "out the door". A side area to this is reliability testing - when the devices fail in the field (and they will fail). Bread and butter to me. I've been doing stuff in this general area for most of my career.

Some books on my shelf are the following (they are so common they are usually referred to by the author's name):

Physics of Semiconductor Devices (Sze)

MOS (Metal Oxide Semiconductor) Physics and Technology (Nicollian/Brews)


Semiconductor Device Modeling with Spice (Kielkowski)

SPICE: Practical Device Modeling (Antognetti)

Semiconductor Material and Device Characterization (Schroder)

Failure Mechanisms in Semiconductor Devices (Amerasekera/Najm)

Failure Modes And Mechanisms In Electronic Packages (Singh/Viswanadham)

You can also hang out at /r/chipdesign which is probably the closest subreddit to this area. I'm a moderator there.

u/chronicENTity · 3 pointsr/Piracy

Just curious, do you know about The 8088 Project Book? According to this page on Helm PCB, it is much more useful than Walter's.

I did my fair share of searching around and found a copy at a community college library about 200 miles away (but near where I grew up). You may want to look into searching local libraries (typically they can exchange/share between many libraries in your area, so make sure to include other libraries in your search, if that's an option).
Edit: Here's where you can search libraries near you, assuming you're in the US.

u/pyroscopic24 · 3 pointsr/CUDA

I agree with this guy. Having a solid C++ background is good, but programming for CUDA specifically is something else.

The book that I used when I took CUDA programming as an undergrad was this:
Programming Massively Parallel Processors: A Hands-on Approach 3rd Edition


Here's a sample of the 1st edition of the book. It's not too far from the 3rd edition, but check out Chapter 3 to see how different it is from typical C++ programming.

u/mfukar · 3 pointsr/askscience

> When we say dual core or quad core processor, what we really mean is a single integrated chip (CPU) with 2 (dual) or 4 (quad) processors on it. In the old days processors were single core so this confusion didn't arise, as a single core processor was just a processor.
>

Many CPUs can be included in a single integrated die.

In "the old days" there were multi-chip modules includinged multiple CPUs (and/or other modules) in separate ICs.

> A processor consists of a control unit (CU) and an arithmetic logic unit (ALU).

And many other things, which it sometimes shares (MMUs, I/O controllers, memory controllers, etc). Don't be too picky over what a processing unit includes. For those who want to dive in, grab this or this book and read on.

> The combination of components is why just having more cores or more GHz doesn't always mean a faster CPU - As the onboard cache and other factors can also slow the processing down, acting as a bottleneck.

Bit of a superfluous contrast, these days. Using anything external to the CPU slows it down, by virtue of propagation delays alone. That's one of the reasons we want many cores / CPUs. The more CPUs or faster clocks question is a red herring - here's an article that explores why (the context is CAD, but the observations are valid in most areas of application).

u/FearMonstro · 3 pointsr/compsci

Nand to Tetris (coursera)

the first half of the book is free. You read a chapter, then you write programs that simulate hardware modules (like memory, the ALU, registers, etc). It's pretty insightful, giving you a richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses more on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.
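To give a flavor of the exercise - the course does this in its own simple HDL, but the same idea sketched in C looks like this (deriving the basic gates from NAND alone):

```c
#include <stdio.h>

/* Nand to Tetris-style warm-up, sketched in C instead of the
   course's HDL: every gate is built out of NAND. */
static int nand(int a, int b) { return !(a && b); }

static int not(int a)        { return nand(a, a); }
static int and(int a, int b) { return not(nand(a, b)); }
static int or(int a, int b)  { return nand(not(a), not(b)); }
static int xor(int a, int b) { return or(and(a, not(b)), and(not(a), b)); }

int main(void)
{
    for (int a = 0; a <= 1; a++)        /* print the truth tables */
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  and=%d or=%d xor=%d\n",
                   a, b, and(a, b), or(a, b), xor(a, b));
    return 0;
}
```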

Code: The Hidden Language of Hardware and Software

This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.

Computer Networking: A Top-Down Approach

one of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.

Operating System Design

A great OS book. It actually shows you the C code used to design and build the Xinu operating system. It's written by a Purdue professor. It offers a top-down look, but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!)

Digital Design and Computer Architecture

another good "build a computer from the ground up" book. The strength of this book is that it gives you more background on how real-life circuits are built (it uses VHDL and Verilog), and it provides a nice overview chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture that implements a MIPS processor. The diagrams used in this book are really nice.

u/Jake_Salmi · 3 pointsr/ECE

TI puts on precision labs seminars, videos here of the lectures, including noise:
http://www.ti.com/lsds/ti/amplifiers/op-amps/precision-op-amps-precision-labs.page#TIPLnoises
Also at TI, Art Kay wrote a book. I haven't read it, but I've seen most of Art's source material, so I expect the book is quite good as well: https://www.amazon.com/Operational-Amplifier-Noise-Techniques-Analyzing/dp/0750685255

u/motivated_electron · 3 pointsr/ECE

Hi,

I have a two-part suggestion for you. Naturally, this is really just what I did to move from your shoes to where I am now (writing device drivers and using real-time operating systems) on nice toolchains (sets of IDEs coupled with compilers and debugger interfaces).

The first thing you ought to do next is focus on learning C. ANSI C.

This is the book to use: Introduction to Embedded Systems by David Russell

By actually stepping through the book, you'll get to learn what embedded is all about, without having to learn a new debugging interface (because you won't have one), and without having to buy a new board.

The book uses the Arduino Uno board and the Arduino IDE to teach you how NOT to use the Arduino API and libraries. This teaches you about the "building" process of C programs - the compilation, linking, bootloaders, etc. You'll learn all the low-level stuff on a platform (the ATmega328P) that is easier to understand. You'll even get into making interrupt-based programs, a buffered serial driver, pulse-width modulation, input capture on timers, etc.
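As a taste of the level the book works at, here is a sketch of an interrupt-driven receive buffer for the ATmega328P written against avr-libc (my own illustration of the idea, not code from the book):

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define BUF_SIZE 64              /* power of two for cheap wrap-around */
static volatile uint8_t buf[BUF_SIZE];
static volatile uint8_t head, tail;

/* USART receive-complete interrupt: stash the byte in a ring buffer. */
ISR(USART_RX_vect)
{
    uint8_t byte = UDR0;         /* reading UDR0 clears the interrupt */
    uint8_t next = (head + 1) & (BUF_SIZE - 1);
    if (next != tail) {          /* drop the byte if the buffer is full */
        buf[head] = byte;
        head = next;
    }
}

/* Non-blocking read: returns -1 when no data is waiting. */
int16_t uart_getc(void)
{
    if (head == tail)
        return -1;
    uint8_t byte = buf[tail];
    tail = (tail + 1) & (BUF_SIZE - 1);
    return byte;
}

void uart_init(void)
{
    UBRR0 = 103;                          /* 9600 baud with a 16 MHz clock */
    UCSR0B = _BV(RXEN0) | _BV(TXEN0) | _BV(RXCIE0);
    UCSR0C = _BV(UCSZ01) | _BV(UCSZ00);   /* 8 data bits, no parity, 1 stop */
    sei();                                /* global interrupt enable */
}
```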

Ever since going through the book, I've been able to move to other platforms, knowing that the ideas are essentially the same, just more involved. If you were to jump straight into ARM Cortex-based 32-bit processors, you would feel rather overwhelmed by the complexity of the peripherals that typically get placed on the same die. You would end up resorting to high-level hardware abstraction libraries (like STMicroelectronics' horrid HAL) and MAYBE be able to use them, but you would have no idea what was actually going on. As soon as a library stops working, you need to be able to know where it broke and why.

So do this patiently.

Start with the 8 bit micro on the Arduino using the book, using basic timers, basic interrupts, basic C types.


Only later is it wise to pick up an ARM board to start experimenting with. These devices are powerful monsters that get work done. But you won't have an appreciation for what they and their powerful peripherals can do until you've wrestled with their simpler cousins like the ATmega328P.

You'll have to pick an IDE (or none, if you really want to know and understand the build process), a set of libraries to use (or none, if you want to learn the monster peripherals the hard way), and an operating system (or none, if you want to stick to bare-metal programs - but really, a 120 MHz CPU with all that power and no OS is sort of a waste).

I would recommend starting with the TIVA C series board from Texas Instruments. It's used in the very nicely paced Intro to ARM Microcontrollers by Jonathan Valvano.

That would be another book well worth the time to go through like a tutorial, because it is.

These books have also helped me along the way: The Definitive Guide to ARM Cortex-M3 and Cortex-M4 Processors by Joseph Yiu
and Computer Organization and Embedded Systems by Zvonko Vranesic.

If you want to explore this field, please be prepared to be patient. The learning curves here can be steep, but the things you'll be capable of making are incredible if you keep at it. I hope your exploration goes well!

u/ArithmeticIsHard · 3 pointsr/Cplusplus

When I took a High Performance Computing course, this book came in handy.

Programming Massively Parallel Processors: A Hands-on Approach https://www.amazon.com/dp/0128119861/ref=cm_sw_r_cp_api_i_Xc3SCbDS47WCP

u/jgm27 · 3 pointsr/ECE

The second book is good. You might also want to check out: Modern Processor Design: Fundamentals of Superscalar Processors https://www.amazon.com/dp/1478607831/ref=cm_sw_r_cp_api_M8YmxbVCM44YW

u/EighteenthVariable · 2 pointsr/robotics

Yes! It's totally possible to experience robotics without a physical robot. IMO a lot of robotics is the programming and watching how your little creation reacts to its environment, and it's possible to get a feel for what that's like using Gazebo. There are some tutorials for getting started with Gazebo.

First though, I suggest installing ROS and becoming familiar with it if you haven't already, and then maybe fiddling around with the TurtleBot to get a feel for things. (If you're really unfamiliar with ROS, I highly recommend starting with the beginner tutorials here and going through them one by one. There's also this book (sorry, couldn't find a different link). I think it was a bit outdated (I don't remember, but I think I heard someone say something haha), but I found it super useful when the ROS tutorials didn't really cut it.)

And then you can fiddle around with URDFs (that's what I worked with, but apparently SDFs are in now - it's up to you?), and I guess you can take it from there? There are some blogs that help out too, so I suggest seriously looking around for the odd tutorial.

Hope this helps!

ps. is there anything in specific you're looking to learn/make? Maybe I can point you in a more relevant direction than super general tutorials?

u/PCneedsBuilding · 2 pointsr/computerscience

I enjoyed this book https://www.amazon.ca/Definitive-Guide-Cortex®-M3-Cortex®-M4-Processors/dp/0124080820; you're also going to have to get acquainted with the technical reference manuals on ARM's website.

u/paranoidwififamily · 2 pointsr/ECE

Different kinds of math.

Here are the basics for most of an EE undergrad degree. I'll include some signals math for the sake of completeness, but not Discrete Math, since this isn't CS and we aren't making new languages or looking at complexity and Turing problems.

  • Calculus (single and multi variable and vector)
  • Differential Equations ( ODE, Systems of DE, PDEs (though it depends), Special Functions and Series Solutions)
  • Transforms (Laplace, Z, the whole Fourier alphabet soup in multiple dimensions)
  • Linear Algebra (EigenV(insert thing here), Spaces, operators, data fitting)
  • Probability
  • Statistics
  • Numerical Analysis or Techniques
  • Complex Analysis (aka more calculus in imaginary domain)
  • Possibly things like Finite Element Method

    Don't go taking things like Real Analysis or set theory or some nonsense. We take different math than the math department. That proof heavy shit will knock your ass flat if you aren't prepared for it.

    The thing that I find the most mathy (aka boring as sin) is DSP, Systems and Controls. Like it varies. If you get into image processing or Machine learning which are all in the realm of DSP or Controls then have fun.

    Software Engineering.... not really that mathy except for formal equivalence checking of software and states and stuff.

    CompE, not really. I mean maybe a bit in computer arithmetic, but not reallllyyyy. Unless you're doing queuing theory to benchmark computer performance. There's a bit of it - mathy shit, I mean - when you get to parallel processing, but it isn't the worst thing ever. Architecture is OK; VLSI, you don't need much math. The only field in CompE that needs math would be high-speed digital design, aka Electromagnetics for Dummies/CEs. Handbook of black magic - more like they should have read a book on transmission lines and Pozar, sheesh.

    Solid State Electronics... that's interesting. See, I doubt it for undergrad stuff. You need basic QM, which means you need to understand Linear Algebra, PDEs, and Numerical Techniques (3 classes in the math department, maybe 4 if you take a matrix theory class). It's mathy, but of a different kind, since it's PDEs and whatnot.

    Communications, ugh. Fourier series, spaces, a lot of probability. Comm is fairly math intensive, especially once you get past Comm I, with things like Digital Comm or Wireless Comm.

    Controls.... just no. Just noooooo. You get past feedback controls, aka WHOA YOU READ NISE'S BOOK TOOOOOOO, aka I know root locus and Nyquist and state space, does that mean I'm smahtttt guys? Super mathy, nonlinear controls and beyond, ugh. Have fun with those Z transforms all the livelong day. Those are just math classes. https://www.amazon.com/Computer-Controlled-Systems-Theory-Electrical-Engineering/dp/0486486133/ref=pd_sim_14_10?_encoding=UTF8&psc=1&refRID=H7HP229A7D4728MVCMVS mmmmm delicious..... If you like things like that, where your brain hurts after a while, where you wonder if you should use the fuzzy logic toolbox in MATLAB because you're so confused that you can't use the SISO tool anymore, where you wonder why the world has forsaken you, then be my guest.

    DSP - complex analysis at the basic level, probability and random processes at the intermediate levels. Lest we forget Fourier everything, every day, and with such great (read: boring) books from Oppenheim lauded everywhere, good luck with the coffee, you'll need it. That said, DSP itself is fun as heck; combined with mixed-signal circuit design you'll have fun once you get past the intro material on transforms. Once you get to filters, it's a good time, lots of real applications. Then you get to image processing and it goes back to math.

    You will have competition wherever you go. Get it out of your head that you won't. Just because this is a hard field doesn't mean the area isn't swamped with people. You will hardly stand out, even if you are working with, I don't know, triacs or tube electronics. And that's not a bad thing. They need engineers; you don't need to stand out or be a super genius to get a good job in the field.

    The reason some places may not have many positions is that they are incredibly in-depth. Every field in EE is difficult. But you don't see many people in Solid State since this isn't the 80s anymore: we don't have fabs everywhere, silicon is hard, forget other compound materials, and a Masters degree is the equivalent of a technician in a fab. You need a PhD, and for what - working on a material that may not go anywhere.

    You wanna know what was hard as hell, and the field is practically nonexistent? Quantum Electronics.

    Hot as hell in the 90s, hard as bananas, had every egocentric EE/Material Scientist/Physicist jerking each other off in an intellectual masturbation session. Look into how difficult it is to even understand the basics of a laser diode. You need to understand QM, EM, Optics, and SSE up to heterostructure devices. Then electron-photon interaction. Quantum electrodynamics - it's what the cool kids are learning. Yeah, you know who cares about that? The two or three laser diode makers there are? I mean LEDs are super popular, but they are also easy as bananas to understand. They thought CDs and DVDs would change everything, you'd be using laser diodes everywhere, free-space LOS would take over the world. Yeah right, no one cares.

    Trust me man, stick to software engineering, learn ARM programming, take a few architecture courses, and know your 8-bit to 64-bit microcontrollers. Pick up a few ARM M4s, a few AVRs, and a BeagleBone, learn some OS basics, and get yourself an embedded systems job.

    Don't mind me though, I am just bitter as hell about my degree in EE. I ended up making more doing general application development and SE, where I hardly use my EE degree. I did programming for fun, and that ended up being my job. I should've just done CS, but it doesn't really matter at the end of the day anymore. It's been two years since I graduated and I'm still bitter.

u/tolos · 2 pointsr/AskComputerScience

The dragon book is a bit dated. I would recommend Engineering a Compiler (Cooper). Though I have heard good things about Modern Compiler Design (Grune).

Stanford's open course on compilers is available on youtube but I can't offer a critique. https://www.youtube.com/watch?v=sm0QQO-WZlM&list=PLFB9EC7B8FE963EB8

u/creav · 2 pointsr/programming

> I'm personally happy that it didn't focus on any single language that much.

Programming Language Pragmatics is one of my favorite books of all time!

u/RoombaCultist · 2 pointsr/ECE

1st year EE student here.

For circuit analysis my biggest resources have been allaboutcircuits.com, and Jim Pytel's Youtube videos. I need to send that guy a dozen beers for getting me through my first two terms of circuits.

For the math: review algebra (exponents in particular) and trigonometry, and if your program will be using calculus, review that too. When you get to studying AC, you'll get to use phasors, which means far less calculus and honestly makes more sense. Phasors were completely new to me, but were completely doable thanks to Jim Pytel's lectures. Honestly though, it's mostly algebra; Ohm's Law and KVL will get you far.
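(If you're curious why phasors kill the calculus: each element gets a complex impedance, so AC analysis collapses into the same algebra as Ohm's Law - e.g. a series RL circuit at angular frequency ω is just Z = R + jωL, no differential equation in sight.)

```latex
Z_R = R, \qquad Z_L = j\omega L, \qquad Z_C = \frac{1}{j\omega C}, \qquad V = I Z
```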

If/when you get interested in digital circuit design, check out Ben Eater. His explanations of what is going on in a circuit are far clearer than any other resource I've found so far.

Of course all that theory is pretty dry for a fun summer break. It might be best to bookmark those until classes actually start, then use those resources to complement your classes. A more fun approach might be to get yourself a basic electronics kit, a breadboard, and a DMM (that's a digital multimeter) and build some circuits out of a book like any of Forrest Mims' books, or a more modern "maker flavored" book. Then probe the circuits you make and ask yourself "What's going on here?"

Also, the sooner you can get your hands on a good oscilloscope, the better. It's the best tool for seeing what's going on in a circuit because it shows how voltage relates to time, unlike a DMM, which shows an average. A DMM is indispensable (and affordable), so it will likely be the first major tool in your kit, but don't be shy of all the knobs on the fancy expensive box.

u/Zidanet · 1 pointr/arduino

If you have no idea where to start, try starting with this book: http://www.amazon.co.uk/30-Arduino-Projects-Evil-Genius/dp/007174133X

It's quite cheap, designed for beginners, and has awesome starter projects. I'd highly recommend it.

I was in your position, really excited about the tech, but had no idea where to start! The book has some really cool projects and is very hands on. Each project has a parts list and a tools list so you can make sure you are ordering the right thing and know what you are doing. It got me going right away, and it's very easy to understand.

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my bookshelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Computer Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/Shnazzyone · 1 pointr/randomactsofamazon

I must become even more of a madman. I will use this book (on my handyman wishlist)

Using the knowledge of Arduino, I will build a giant fighting robot.

u/AnonysaurusRex · 1 pointr/ECE

I run tutorials and labs for a "Foundations of Digital Design" unit. We use:

http://www.amazon.com/Digital-Design-Computer-Architecture-Harris/dp/0123704979

From my experience this is a good book, an improvement on what we used to use (Mano & Kime).

u/mrdrozdov · 1 pointr/cscareerquestions

I like reviewing the curriculum for the fundamental courses in undergrad/grad CS programs. One way to get ahead is to become familiar with the roots of programming language theory. I found the book Programming Language Pragmatics helpful, and it goes well with this course's curriculum, though I'm sure there are others. Alternatively, try building your own language/compiler using yacc and lex.
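If yacc and lex feel heavyweight, a hand-rolled recursive-descent parser shows the same ideas in a page of C. A toy sketch for integer arithmetic (not a production grammar):

```c
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

/* Toy recursive-descent evaluator for expressions like "2*(3+4)".
   Grammar:  expr   = term   { ('+'|'-') term }
             term   = factor { ('*'|'/') factor }
             factor = NUMBER | '(' expr ')'                      */
static const char *p;                /* cursor into the input string */

static long expr(void);

static void skip_ws(void) { while (isspace((unsigned char)*p)) p++; }

static long factor(void)
{
    skip_ws();
    if (*p == '(') {
        p++;                         /* consume '(' */
        long v = expr();
        skip_ws();
        if (*p != ')') { fprintf(stderr, "expected ')'\n"); exit(1); }
        p++;                         /* consume ')' */
        return v;
    }
    if (isdigit((unsigned char)*p)) {
        char *end;
        long v = strtol(p, &end, 10);
        p = end;
        return v;
    }
    fprintf(stderr, "unexpected '%c'\n", *p);
    exit(1);
}

static long term(void)
{
    long v = factor();
    for (;;) {
        skip_ws();
        if (*p == '*')      { p++; v *= factor(); }
        else if (*p == '/') { p++; v /= factor(); }
        else return v;
    }
}

static long expr(void)
{
    long v = term();
    for (;;) {
        skip_ws();
        if (*p == '+')      { p++; v += term(); }
        else if (*p == '-') { p++; v -= term(); }
        else return v;
    }
}

int main(void)
{
    p = "2 * (3 + 4) - 5";
    printf("%ld\n", expr());         /* prints 9 */
    return 0;
}
```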

u/demonGeek · 1 pointr/ECE

Modern Z80 computer project: N8VEM

Or if you really want to go old school, get a book from that era and build it from scratch:

Microcomputers and Microprocessors

Microprocessor Programming, Troubleshooting, and Interfacing

You can often find the best prices on AbeBooks.com

And finally, this.

u/newfor2019 · 1 pointr/ECE

There are plenty of resources online; not only are they comprehensive, they're also free. But if you really want a book to hold or something, there's a whole series of books about these processors:

https://www.amazon.com/Definitive-Guide-Cortex-Cortex-M0-Processors/dp/0128032774/ref=sr_1_2?keywords=cortex-m0&qid=1562115097&s=gateway&sr=8-2

u/blahbot90 · 1 pointr/AskEngineers

Ok, so you're into space. Go into EE. That's probably the most useful advice I can give you. Seriously, spacecraft are pretty much electrical projects and without people in CS and EE, you couldn't do ANYTHING.

Buy an Arduino - they're super cheap - and start experimenting with it. What I highly suggest you do, and what I also do in my spare time to learn, is the following: build a satellite!

  1. Buy this starter book

  2. Purchase one Arduino (Uno or Pro Mini would be my suggestion), 2 XBee radios (one for the satellite, one for the computer), and maybe a couple of sensors (accelerometer, altimeter, GPS, whatever you want).

  3. Now start programming and get your computer to send signals to the Arduino, or the other way around (for the computer side, see the sketch after this list).

    It's a very long project, especially for a beginner, and it can be expensive (~$200-300), but if you do it right, it's a ton of fun. I now have something that can take measurements and send them a mile out to my computer. For me, it's super awesome.

    I also built a rocket, launched that sucker up, and watched it come down. I had the satellite transmit data, plotted it on Google Earth, and had a 3D graph of the trajectory. It's such a great feeling to have something like this succeed.

    Edit: Forget the math/science reading side - you'll get that regardless of where you go, and I would highly recommend practical experience over reading about subjects that will likely just fly over your head. When I was 15, theory was the first thing that would make me fall asleep; actually building something with my hands would keep me up til the sun rose.

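For the computer side of step 3, a minimal POSIX serial reader in C looks something like this (the device path and baud rate are assumptions for a typical XBee-on-USB setup):

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Read telemetry the Arduino/XBee link writes to a serial port.
   /dev/ttyUSB0 and 9600 baud are assumptions -- adjust for your setup. */
int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);              /* raw bytes: no echo, no line editing */
    cfsetispeed(&tio, B9600);     /* XBees default to 9600 baud */
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    char buf[256];
    for (;;) {                    /* dump whatever the satellite sends */
        ssize_t n = read(fd, buf, sizeof buf - 1);
        if (n <= 0) break;
        buf[n] = '\0';
        fputs(buf, stdout);
    }
    close(fd);
    return 0;
}
```
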
u/SecretJerker · 1 pointr/C_Programming

Check out The Definitive Guide to ARM Cortex-M* by Joseph Yiu. I've only read the M3/M4 one, but it gives a great overview: http://www.amazon.com/gp/aw/d/0128032774/ref=mp_s_a_1_2?qid=1453564034&sr=8-2&pi=SY200_QL40&keywords=the+definitive+guide+to+arm&dpPl=1&dpID=51gPqtgAklL&ref=plSrch

Also, a course on edX just started about programming the M4 on a TIVA C dev board (TM4C123GH6PM is the MCU).
edit: link to edx course
https://www.edx.org/course/embedded-systems-shape-world-utaustinx-ut-6-03x

u/NLJeroen · 1 pointr/embedded

Fellow Embedded Engineer here.
You learn this from books: The Definitive Guide to ARM® Cortex®-M3 and Cortex®-M4 Processors.
And just RTFM of course: Cortex M4 technical reference manual.

And of course the chip vendor's documentation, since there will be some implementation-defined stuff (e.g. which memory bank it will boot from).

Don't forget the compiler and linker documentation. Lots of stuff is there; just scrolling through once gives you so much more understanding of what you might find there if you're solving some problem later on - like the special instructions, and the compiler flags and conditions for it to actually use the FPU.

If you're going to try bare-metal assembly programming, I'd recommend the Cortex-M0, since an M4 often comes in a complex chip.

u/tkphd · 1 pointr/HPC

What are you trying to do with it? Programming Massively Parallel Processors was useful to me, but without more info, it's hard to make recommendations.

u/biglambda · 1 pointr/gpgpu

I started with this book. I think it's mainly CUDA focused, but switching to OpenCL was not that hard.

u/jakub_h · 1 pointr/promos

The Emacs thing was an OS joke. Having said that, whether "writing a full-featured text editor is crazily complicated" is debatable. It's been very well researched by now how to do it, and it's absolutely possible to design a compact, powerful, highly extensible editor in a comparatively small amount of code. Scintilla is hardly an example of that, though.

Somehow I get the impression that you've decided to weigh yourself down with the accidental complexity of re-doing many other people's mistakes. If you've decided to do that, the million lines of code obviously makes sense. In the case of communication protocols there's at least some reason, so if you absolutely need to do that, that's a justification. Me, if I were to part from existing libraries (which I'm kind of about to do with an algebraic system), it would be because I wouldn't want to redo things that have already been done.

u/Truth_Be_Told · 1 pointr/learnprogramming

Computer Systems: A Programmer's Perspective is a very good book (the only book of which I have all 3 editions!). You can easily get the cheap South Asian editions and save money.

u/DJ027X · 1 pointr/comparch

A quick search found http://www.amazon.com/Modern-Processor-Design-Fundamentals-Superscalar/dp/1478607831 as well as http://www.ebay.com/itm/like/111520297055?lpid=82
I have no idea what their contents are though, or if they cover what you are looking for

u/genjipress · 1 pointr/Python

Further notes - here are some of the books I've been looking at:

Modern Compiler Implementation (there's Java, C, and ML editions; this is C)

https://www.amazon.com/Modern-Compiler-Implementation-Andrew-Appel/dp/052158390X

Design Concepts in Programming Languages

https://www.amazon.com/Design-Concepts-Programming-Languages-Press/dp/0262201755

Engineering A Compiler

https://www.amazon.com/Engineering-Compiler-Keith-Cooper/dp/012088478X

Programming Language Pragmatics

https://www.amazon.com/Programming-Language-Pragmatics-Michael-Scott/dp/0124104096

u/DuckGod · 1 pointr/norge

Arduino is a very good place to start, in my opinion, if you don't think it's overwhelming. There is admittedly a bit of electronics and code that has to be learned before you can do cool projects, but it's really not that bad as long as you don't bite off too much at once. One advantage of Arduino is that you use a language very similar to C++, so the jump from Arduino to C++ isn't that big.

I'm somewhat less sure about textbooks, but there are a number of hobby books with X projects in them that could be a good basis, like this one or this one, for example.

If he's more interested in coding than necessarily building gadgets, I'd instead point him toward Python. Code Academy has a perfectly decent (free) beginner course.

u/Hantaile12 · 1 pointr/IWantToLearn

Assuming you’re a beginner, and are starting with little to no knowledge:

I bought the 3rd edition of the book “Practical Electronics for Inventors” by Scherz and Monk. It starts from the basics and you slowly build more and more complex and practical circuits.
https://www.amazon.com/dp/1259587541/ref=cm_sw_r_cp_api_eTs2BbXN9S1DN

Another fun one by Monk is “The Maker's Guide to the Zombie Apocalypse: Defend Your Base with Simple Circuits, Arduino, and Raspberry Pi”
https://www.amazon.com/dp/1593276672/ref=cm_sw_r_cp_api_XVs2BbYMVJT5N

If you are looking for something more theory-based (which I wouldn't recommend initially, unless you're just curious), there's a whole slew of textbooks, depending on what exactly you're interested in, that you can pick up cheap at a used book store or on Amazon.

Remember to build slowly in the beginning until you get a good grasp of the content, and have fun. Diving in too deep too quickly can overwhelm you and kill morale.

Happy learning!

u/StrskyNHutch · 1 pointr/ECE

The research project I was working on was specifically geared towards improving TLB misses by restructuring the cache, but over the course of being prepared for this, I read this book front to back. I learned a lot about how a microprocessor is designed and how its pipelined stages work together, and I wanted to know if that knowledge could be translatable to a hardware related field. I know I'm probably not qualified to work in this field full time but I was hoping I could get at least get my foot through the door with an internship