Reddit reviews: Digital Design and Computer Architecture

We found 22 Reddit comments about Digital Design and Computer Architecture. Here are the top ones, ranked by their Reddit score.

Digital Design and Computer Architecture
Morgan Kaufmann Publishers

22 Reddit comments about Digital Design and Computer Architecture:

u/Enlightenment777 · 42 pointsr/ECE


BOOKS


Children Electronics and Electricity books:

u/myrrlyn · 18 pointsr/learnprogramming

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

This book is an excellent primer for a bottom-up look into how computers as machines function.

https://www.amazon.com/gp/aw/d/0123944244/ref=ya_aw_od_pi

This is my textbook from the class where we built a CPU. I greatly enjoy it, and it also starts at the bottom and works up excellently.

For OS development, I am following Philipp Opperman's excellent blog series on writing a simple OS in Rust, at http://os.phil-opp.com/

And as always Wikipedia walks and Reddit meanders fill in the gaps lol.

u/OddCoincidence · 15 pointsr/compsci

I think Digital Design and Computer Architecture is the best book for this.

u/nsfwelchesgrapejuice · 9 pointsr/cscareerquestions

If you already have an engineering degree then you already know how to study. What experience do you have with embedded? If you don't have any then you should be sure it's what you want before you commit to anything huge.

I think the best way to get a job in embedded systems is to build embedded systems, and not bother with language certifications. I might be going against the grain here a bit, but I would suggest starting to dip your toes into embedded systems by buying an Arduino and messing around with it.

Arduino gets a lot of flak for being "not real" embedded systems, and while it's true nobody is going to hire you because you can make an impressive Arduino project, IMHO it's a great introduction to what embedded is about. The hardware equivalent of "hello world" is blinking an LED. If you are serious about learning then you will quickly outgrow the Arduino, but you can always throw away the bootloader and try to program the ATmega with bare-metal gcc and avrdude.
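
To make that "hardware hello world" concrete, here is a minimal bare-metal blink sketch (not from the comment) for an ATmega328P, the chip on most Arduino Unos. It assumes avr-gcc, avr-libc, and avrdude are installed; the programmer flags in the comments depend on your setup.

```c
/* blink.c - minimal bare-metal LED blink for an ATmega328P (Arduino Uno's MCU).
 * Build and flash, roughly (programmer flags depend on your hardware):
 *   avr-gcc -mmcu=atmega328p -DF_CPU=16000000UL -Os -o blink.elf blink.c
 *   avr-objcopy -O ihex blink.elf blink.hex
 *   avrdude -p m328p -c usbasp -U flash:w:blink.hex
 */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << DDB5);            /* PB5 (Uno pin 13) as output */
    for (;;) {
        PORTB ^= (1 << PORTB5);     /* toggle the LED */
        _delay_ms(500);
    }
}
```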

I don't know what you already know nor how you feel about math, but things you will want to learn include:

  • Analog electrical theory, DC and AC, resistance/capacitance/inductance. Understand basic circuit networks and input vs output impedance. Hopefully you remember complex numbers and frequency response. You don't need a lot of circuit theory, but you will need to understand what a pull-up resistor is and why it's necessary. Depending on your math background you can get into filters, frequency response, and Fourier analysis. A good introduction here might be www.allaboutcircuits.com

  • Digital theory, starting with boolean algebra, logic gates, adders/multiplexers/flip-flops, all the way up to computer architecture. I like this book because it has a very holistic approach to this area https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244/ref=sr_1_1?ie=UTF8&qid=1494262358&sr=8-1&keywords=harris+digital+design

  • Linux, C. Linux and C. You need to understand pointers, and the best way to understand C is to understand computer architecture. If you're not already running Linux, install Linux, as well as gcc and build-essential. Start learning how to manipulate memory with C (there's a short pointer sketch after this list). Learning about computer architecture will help here. My favourite book on C is one of the classics: https://www.amazon.com/Programming-Language-Brian-W-Kernighan/dp/0131103628/ref=sr_1_1?ie=UTF8&qid=1494262721&sr=8-1&keywords=the+c+programming+language

    If you get this far and still want to become an embedded systems engineer then you're doing pretty well. I would say just try to build projects that utilize these skills. Maybe you can use your mech background to build a robot and then design the software to support it. Get used to reading datasheets for parts, and imagining what the digital logic inside looks like. Get used to searching google for answers to your questions.
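
As a taste of the pointer-and-memory work mentioned in the C bullet above, here is a small illustrative example (an editorial addition, not the commenter's code):

```c
/* pointers.c - a tiny taste of manipulating memory directly in C.
 * Build: gcc -Wall -o pointers pointers.c
 */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint8_t buf[4] = {0};

    uint32_t value = 0xDEADBEEF;
    memcpy(buf, &value, sizeof value);   /* copy the raw bytes of 'value' into buf */

    /* Walk the buffer byte by byte with a pointer; the order you see
     * depends on the machine's endianness. */
    for (uint8_t *p = buf; p < buf + sizeof buf; ++p)
        printf("byte at %p = 0x%02X\n", (void *)p, *p);

    return 0;
}
```
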
u/okiyama · 7 pointsr/learnprogramming

Yeah, it's actually really interesting! When I say ARM, MIPS, and x86 I'm talking about processor architectures. Basically, your desktop has a processor that speaks x86, whereas your smartphone only knows ARM. MIPS is primarily for low-power things like routers. Learning an assembly language means learning a lot about processors, which is also a lot of fun.

If you do want to learn about all of that, this book is great. Well written and pretty fun to read

http://www.amazon.com/Digital-Design-Computer-Architecture-Edition/dp/0123944244

u/Echohawkdown · 6 pointsr/TechnologyProTips

In the interim, I suggest the following books:

  • Digital Design and Computer Architecture, by Harris & Harris - covers the circuitry & hardware logic used in computers. Should also cover how data is handled on a hardware level - memory's a bit rusty on this one, and I can't find my copy of it right now. Recommend that you read this one first.

  • Computer Organization and Design, by Patterson & Hennessy - covers the conversion of system code into assembly language, which itself turns into machine language (in other words, covers the conversion of programs from operating system code into hardware, "bare metal" code; see the short sketch after this list). Knowledge of digital circuitry is not required before reading, but strongly recommended.

  • Operating System Concepts, by Silberschatz, Galvin & Gagne - covers all the basic Operating System concepts that each OS today has to consider and implement. While there are Linux-based ones, there are so many different Linux "flavors" that, IMO, a book that covers a specific Linux base (called a Linux kernel) exclusively would be incomplete and fail to address all the key aspects you'll find in modern OSes. Knowledge of coding is required for this one, and therefore should be read last.

     

    As for the coding books, I suggest you pick one up on Python or Java - I'm personally biased towards Python over Java, since I think Python's syntax and code style looks nicer, whereas Java makes you say pretty much everything you're doing. Both programming languages have been out for a long time and see widespread usage, so there's plenty of resources out there for you to get started with. Personally, I'd suggest going with this book for Java and this book for Python, but if you go to Coursera or Codecademy, you might be able to get better, more interactive learning experiences with coding.

    Or you can just skip reading all of the books I recommended in favor of MIT's OpenCourseWare. Your choice.
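
As a rough illustration of the C-to-assembly-to-machine-code path the Patterson & Hennessy bullet describes (the exact output a real compiler emits will differ):

```c
/* add.c - the kind of translation Patterson & Hennessy walk through:
 * one C statement, the assembly a MIPS compiler might roughly emit for it,
 * and, one level lower, a 32-bit machine word per instruction that the
 * processor actually executes. */
int add(int b, int c)
{
    int a = b + c;    /* roughly:  add  $t0, $a0, $a1   # a = b + c        */
    return a;         /* roughly:  move $v0, $t0        # return value     */
                      /*           jr   $ra             # return to caller */
}
```
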
u/donnacav · 5 pointsr/AskComputerScience

Hi, I can highly recommend “Digital Design and Computer Architecture” by Harris & Harris!

https://www.amazon.com/Digital-Design-Computer-Architecture-Harris/dp/0123944244

Extremely approachable and well-constructed for someone with little background knowledge

u/crusaderblings2 · 4 pointsr/FPGA

Digital Design and Computer Architecture is my favorite plain-language start to digital design.

You start with transistors and logic gates, and move all the way up to assembly language and compilers. Basically all the knowledge you need to create a simple processor.

It's going to help you think of Verilog as what it is, a description of physical hardware on a chip, as opposed to a set of instructions. Programming is like giving someone directions on a map, HDL is like building the roads. An if statement in C is a set of instructions executed in order. An if statement in Verilog is saying "Place a multiplexer here, and wire it up to these other pieces of hardware."
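
A rough way to restate that contrast in C terms (this sketch is an editorial addition, not from the comment):

```c
/* Software view: this if statement compiles to instructions the CPU runs in
 * sequence, taking one branch or the other at runtime. */
int select_sw(int sel, int a, int b)
{
    if (sel)
        return a;
    return b;
}
/* Hardware view: the Verilog equivalent, "assign out = sel ? a : b;" (or the
 * same if/else inside a combinational always block), is not a sequence of
 * steps at all. It tells the synthesizer to place a 2-to-1 multiplexer and
 * wire 'sel', 'a', and 'b' to it; the "choice" exists permanently in silicon. */
```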

u/CheeseZhenshi · 4 pointsr/AskScienceDiscussion

Based on my understanding of what you're interested in, I would see if your college offers a Digital Logic Design class. It's all about how you use and/or/not gates to create more complex logic gates, and then how you use those to do things like math and storing values.
If you don't want to shell out the money for a class, or it's not available, the book we use in mine is this one.
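
As a small illustration of "gates doing math" (not from the comment), here is ripple-carry addition written with C bitwise operators standing in for the gates:

```c
/* adder.c - adding two numbers using only "gates" (C bitwise operators
 * standing in for AND, OR, XOR). Chaining full adders like this is how a
 * digital logic class builds a multi-bit ripple-carry adder.
 * Build: gcc -Wall -o adder adder.c
 */
#include <stdio.h>

/* One full adder: sum and carry-out of three input bits. */
static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = a ^ b ^ cin;                     /* XOR gates      */
    *cout = (a & b) | (cin & (a ^ b));       /* AND + OR gates */
}

int main(void)
{
    /* Add 0b101 (5) and 0b011 (3) one bit at a time, like the hardware does. */
    int a[3] = {1, 0, 1}, b[3] = {1, 1, 0};  /* least-significant bit first */
    int carry = 0, result = 0;

    for (int i = 0; i < 3; i++) {
        int s;
        full_adder(a[i], b[i], carry, &s, &carry);
        result |= s << i;
    }
    result |= carry << 3;
    printf("5 + 3 = %d\n", result);          /* prints 8 */
    return 0;
}
```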

 

Also, if possible I would purchase a small FPGA. They're Field Programmable Gate Arrays - we use something like this in my labs, but there are some cheaper and smaller ones if you google for them. Basically they allow you to programmatically describe logic gates and run electrical signals through them, then output the results to various things like the LEDs - it really helps ground the knowledge to reality.

Fortunately Digital Logic Design is pretty much a starting point, so it doesn't have any pre-reqs (at my college), should you choose to take it.

u/fuzzyPuppy · 4 pointsr/embedded

in college, my intro computer engineering class used Digital Design and Computer Architecture by Harris and Harris. It talks mostly about the PIC32, but the concepts are the same. It also covers FPGAs. The 1st edition doesn't include a chapter on C, so it might be better for you to get the second edition.

u/maredsous10 · 3 pointsr/ECE

VHDL was what I was first exposed to. I was initially reluctant to switch to Verilog, which is a lot less verbose and less strict. The less strict part can cause issues when you make assumptions and don't verify what the synthesis tools produce. Now I prefer Verilog and feel I am able to create faster with it.

Altera, Xilinx, Microsemi, or Lattice all offer free software tools you can download. Get yourself an evaluation board then read related documentation for the tools and particular FPGA used on the board.

Here are my book suggestions.

Advanced Chip Design, Practical Examples in Verilog [Paperback]
http://www.amazon.com/Advanced-Design-Practical-Examples-Verilog/dp/1482593335/
A cheap book by a practicing engineer that covers a lot of material.

Digital Design and Computer Architecture, Second Edition
http://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244/
This book covers Digital Design, Computer Architecture, VHDL, and Verilog.

Embedded SoPC Design with Nios II Processor and Verilog Examples

http://www.amazon.com/Embedded-Design-Processor-Verilog-Examples/dp/1118011031/

This book covers some Verilog and then moves on to Altera's SOPC Builder.

FPGA Prototyping By VHDL or Verilog Examples: Xilinx Spartan-3 Version
http://www.amazon.com/FPGA-Prototyping-VHDL-Examples-Spartan-3/dp/0470185317/

http://www.amazon.com/FPGA-Prototyping-Verilog-Examples-Spartan-3/dp/0470185325/

u/3242542342352 · 3 pointsr/learnprogramming

Google: stack frame, calling convention, ISA

Also, for more insight into how CPUs work: this
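
For something concrete to read those terms against, here is a tiny illustrative program (an editorial addition, not part of the original comment); compile it without optimization and inspect it with gdb or gcc -S:

```c
/* frames.c - compile with "gcc -O0 -g frames.c" and step through it in gdb,
 * or look at the assembly with "gcc -S". Each call to square() pushes a new
 * stack frame holding its argument, the local 'result', and the return
 * address; the calling convention (the ISA's ABI) decides whether 'x'
 * arrives in a register or on the stack. */
#include <stdio.h>

static int square(int x)
{
    int result = x * x;   /* local variable lives in square()'s stack frame */
    return result;        /* return value handed back per the calling convention */
}

int main(void)
{
    printf("%d\n", square(7));
    return 0;
}
```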

u/lostchicken · 3 pointsr/AskElectronics

If you want something a little more hardware oriented and more aimed at the practicing engineer, this book has great reviews: http://www.amazon.com/Digital-Design-Computer-Architecture-Edition/dp/0123944244

(Full disclosure, I helped work on this book when I was a student.)

u/morto00x · 2 pointsr/FPGA

Yes, it's possible. But unless you have a good reason to do so, I'd stick to using development boards to learn FPGA design. Making the FPGA board alone will be a very time consuming project if you don't have much experience making hardware.

Your Digilent board should be fine for your requirements unless you want to make large or high-speed design projects (which is unlikely if you are just learning).

I'd look into this book for a good refresher. You can also find projects in www.FPGA4fun.com.

u/ArnaudF · 2 pointsr/videos

A freaking great book about how computers work: https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244

It goes from transistors to microarchitecture, raising the abstraction level each chapter. There are a lot of exercises to put everything into practice.

u/lovelikepie · 2 pointsr/ECE

Read a book that approaches computer architecture from an implementation perspective, something like Digital Design and Computer Architecture.

Take something relatively simple like RISC-V and read the ISA spec.

Using this spec, figure out what state the machine defines. What registers must you keep track of in order to be ISA-compliant? Implement the basic machine state.

Figure out what you need to do to implement specific operations. What information is encoded in all the fields of the instruction? What state is modified? Take ADDIU, for example: what does it mean to add an unsigned immediate, where is the immediate stored, which register do you read, and which do you write? Implement a single instruction.

Write tests, then start implementing more of these operations. Learn about the rest of the ISA's features (memory handling, exceptions). Implement this in any language. Try running small hand-written assembly programs in your simulator, then try larger programs.
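
A sketch of that "machine state plus one instruction" step (an editorial addition, not the commenter's code), using the MIPS-style ADDIU the comment mentions; RISC-V's ADDI behaves the same way:

```c
/* sim.c - the smallest possible "implement the machine state, then one
 * instruction" simulator, MIPS-style. ADDIU is I-type:
 *   [31:26] opcode = 0x09   [25:21] rs   [20:16] rt   [15:0] immediate
 * and despite the name it sign-extends the immediate; the "U" just means
 * no overflow trap. Build: gcc -Wall -o sim sim.c
 */
#include <stdint.h>
#include <stdio.h>

/* Architectural state the ISA says we must track. */
struct cpu {
    uint32_t regs[32];   /* general-purpose registers; regs[0] is always 0 */
    uint32_t pc;
};

static void execute(struct cpu *c, uint32_t instr)
{
    uint32_t opcode = instr >> 26;
    uint32_t rs     = (instr >> 21) & 0x1F;
    uint32_t rt     = (instr >> 16) & 0x1F;
    int32_t  imm    = (int16_t)(instr & 0xFFFF);   /* sign-extend */

    switch (opcode) {
    case 0x09:                                     /* ADDIU rt, rs, imm */
        if (rt != 0)
            c->regs[rt] = c->regs[rs] + (uint32_t)imm;
        break;
    default:
        fprintf(stderr, "unimplemented opcode 0x%02x\n", opcode);
        break;
    }
    c->pc += 4;
}

int main(void)
{
    struct cpu c = { .pc = 0 };
    c.regs[8] = 40;                                /* $t0 = 40 */

    /* addiu $t1, $t0, 2  ->  opcode 0x09, rs=8, rt=9, imm=2 */
    uint32_t instr = (0x09u << 26) | (8u << 21) | (9u << 16) | 2u;
    execute(&c, instr);

    printf("$t1 = %u, pc = %u\n", c.regs[9], c.pc); /* expect 42 and 4 */
    return 0;
}
```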

u/realistic_hologram · 1 pointr/learnprogramming

Love Code, very accessible. If you want more of a textbook, I thought this one was very good while still being quite accessible: http://www.amazon.com/gp/aw/d/0123944244/ref=redir_mdp_mobile/189-7884811-1167562

u/[deleted] · 1 pointr/EngineeringStudents

Your interests are almost identical to mine! I didn't fully decide on my major until Junior year (Electrical Engineering). Started going into Biomedical end of senior year, went to grad school for it, found out it wasn't for me and switched back to EE. Now I'm researching Aerospace navigation systems and absolutely love it.

So no, you aren't going to die if you don't have your life figured out by end of sophomore year. The nice thing about engineering is a lot of the skills carry over between disciplines.

My suggestions are to study topics you are interested in on your own; don't wait to learn them in your classes. If you think you are interested in computer engineering and computer science, start teaching yourself digital electronics, learning C, and building projects with microcontrollers (AVR microcontrollers are very common). Don't wait for your career to find you; you have to seek it out. Follow your passion and the rest will come naturally.

Side note: I've been teaching myself AVR microcontrollers and digital design this summer, with the end goal of launching a high-altitude balloon carrying an HD camera, temp sensors, and a Geiger counter. Here are the books I've been using; both are fantastic and easy to follow.
http://www.amazon.com/gp/aw/d/1449355781
http://www.amazon.com/gp/product/0123944244

u/westernrepublic · 1 pointr/learnprogramming

Digital Design and Computer Architecture, Second Edition. It was my textbook for my college's Computer Organization class. You'll learn how to build a processor!

u/Truth_Be_Told · 1 pointr/C_Programming

First, note that Career/Job/Market is quite different from Knowledge/Intellectual satisfaction. So you have to keep "earning money" separate from "gaining knowledge", but do both in parallel. If you are one of the lucky few who has both aligned in a particular job, you have got it made. Mostly that is never the case, and hence you have to work on your Motivation/Enthusiasm and keep hammering away at the difficult subjects. There are no shortcuts :-)

I prefer Books to the Internet for study since they are more coherent and less distracting, allowing you to focus better on a subject. Unless newer editions are required, buy used/older editions to save money and build a large library. So here is a selection from my library (in no particular order):