Best computer programming logic books according to redditors
We found 48 Reddit comments discussing the best computer programming logic books. We ranked the 24 resulting products by number of redditors who mentioned them. Here are the top 20.
-----
-----
BOOKS
Children Electronics and Electricity books:
Newbie Electronics books:
Basic Circuit Theory books:
Analog Design books:
Digital Design books:
(download old edition)
Digital Signal Processing books:
Computer Design books:
6502,
6800,
6809,
8080,
8085,
Z80,
68000,
x86
processors on Wikipedia.
8051,
ARM,
AVR,
PIC,
RISC-V
microcontrollers on Wikipedia.
Electronics Reference books:
Historical books:
-----
-----
MAGAZINES
Current Electronics Magazines: (subscribe now)
Historical Electronics Magazines: (archives)
Historical Computer Magazines: (archives)
"Kilobaud"
-----
No such thing as required reading for Cambridge courses - the lecturers are expected to deliver all their material in the lectures. Oh, there was the time we read Sketches of an Elephant - but it would have been ridiculous and unreasonable to expect us to buy the books.
When you start programming the FPGA, I would definitely start with SystemVerilog. There are some really good resources on it, and it is even easier to pick up than Verilog while getting the same thing done. By and large the tools from Xilinx and Altera accomplish the same things as well, but I wouldn't completely write off the Xilinx toolchain. You should also expect to spend most of your learning time at first inside the simulator; even when you think you know what you are doing, the simulator is an invaluable tool. For learning the ins and outs of SystemVerilog, the FPGA itself is really just a way to demonstrate what you've built: 95% of the learning can happen without it. You will probably also hear a lot about synthesizable vs. non-synthesizable SystemVerilog; for that I would definitely recommend https://www.amazon.com/Logic-Design-Verification-Using-SystemVerilog/dp/1500385786, as it's what Carnegie Mellon uses to start off its digital logic classes.
Edit: Also the Pynq board is a great place to start learning about SoC development, while still having high level resources like Python and Linux available should you need them for a project.
http://store.digilentinc.com/pynq-z1-python-productivity-for-zynq/
Yes, of course you can. In fact, there are infinitely many that could be designed and built but haven't been yet, just as there are still many, many PCBs that no one has built yet. In essence, a custom IC, generally known as an ASIC (with CMOS VLSI being the most popular circuit technology on ASICs today), is a tiny PCB. What you design is literally a tiny circuit board with tiny transistors, resistors, capacitors and connecting wires, which then gets built by etching silicon, depositing materials on top, etching again, and so on, pretty much like an electronic layer cake.
The problem is that the setup costs are incredibly high, and the software you need costs millions per year per seat. If you're affiliated with a university, you can get the software cheaper (thousands instead of millions), and there are wafer pooling programs that bring down setup costs, just like PCB pooling programs bring down the cost per PCB if you only need small numbers. But even then you're still looking at something between $50k and $100k. The actual production cost, however, is very low: you could say the first one costs $100k, but the second, third, fourth, ... one costs $0.10 each.
The process is complicated, much more complicated than a large regular PCB. You have to observe a large number of design constraints, and you have exactly one shot to get it right. If anything goes wrong, anything at all, you just burned $100k and probably lost your job. That's why in ASIC design everything is generally double- and triple-checked at every single design step. And I do mean literally triple-checked: with multiple different software packages from multiple vendors, and by multiple engineers.
If you want to know how this works in detail, look at "CMOS VLSI Design: A Circuits and Systems Perspective, 4th edition" by Neil Weste and David Harris, which you can get at the book store of your least distrust (here for instance), or find as a decentralized backup copy PDF via your favorite search engine...
Now, for simple requirements like yours, there is actually something easier and much cheaper: an FPGA, or a CPLD. They're essentially the same thing, but an FPGA has more design elements; CPLDs have dozens or hundreds, FPGAs have tens of thousands to millions.
Now what are these? Instead of building the circuit from scratch by etching silicon as with an ASIC, you get predefined logic blocks and a switchable interconnect fabric between them, premade for you. The logic blocks usually consist of one look-up table (which can simulate a number of connected AND, OR, XOR and NOT gates) and flip-flops as storage elements. Often they also have a few specialized elements that are needed frequently and would otherwise use up a large number of logic elements; popular ones are SRAM blocks and DSP ALUs (math units optimized for signal processing). The larger ones even have entire ARM cores next to the FPGA fabric, with connections into the programmable logic, on which you can run a regular Linux (or whatever else you want; it's a regular CPU).
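To make the look-up table idea concrete, here is a toy Python sketch (not real FPGA tooling): a k-input LUT is just a 2^k-entry truth table, so by filling in the table you can make it behave like any boolean function of its inputs. The function names here are made up for illustration.

```python
# Toy model of an FPGA look-up table (LUT): precompute a truth table
# for an arbitrary boolean function, then evaluate it by indexing.

def make_lut(func, k):
    """Precompute the 2**k-entry truth table for a k-input boolean function."""
    table = []
    for i in range(2 ** k):
        bits = [(i >> n) & 1 for n in range(k)]  # input bits, LSB first
        table.append(func(*bits))
    return table

def lut_eval(table, *inputs):
    """Evaluate the 'programmed' LUT: pack the inputs into an index."""
    index = sum(bit << n for n, bit in enumerate(inputs))
    return table[index]

# "Program" a 3-input LUT to implement (a AND b) XOR c.
lut = make_lut(lambda a, b, c: (a & b) ^ c, 3)

print(lut_eval(lut, 1, 1, 0))  # (1 AND 1) XOR 0 = 1
```

The point of the sketch is that the LUT contents, not any fixed gates, determine the logic function, which is exactly what the bitstream configures on a real device.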
It's essentially a huge Lego set of logic elements. These are much cheaper than ASICs (you can get simple ones for less than $10 a piece, even without volume rebates), and they're much easier to program for (but still not trivial). While for an ASIC you need an actual IC foundry (i.e. a factory/company that makes the ICs), an FPGA is generic: you program it with a bitstream, which in turn is generated from your description of the hardware you want inside, written in either VHDL or Verilog. Those are the same languages you also write the hardware for ASICs in. (They're both equally good/bad, so pick one. I would recommend VHDL, but just like with vi and emacs, if you ask 100 people you get 250 different recommendations and reasonings. So pick one, start with it, and learn the other later once you know the basics. Ultimately you'll need to know both.)
For a first glance into this area of electronics, have a look at this tutorial from Intel/Altera (Intel's FPGA division, formerly Altera, is one of the big three FPGA makers, the other two being Xilinx and Lattice Semiconductor). As cheap first hardware to go along with it and get your feet wet, have a look at the MAX1000 board from Arrow Electronics/Trenz Electronics, which costs about $30, give or take. That's about as low as you can get with FPGA boards, price-wise. Sadly there isn't an "FPGA Arduino" yet, but I believe Adafruit is actively working on one right now, so that may happen very soon.
Either way, feel free to ask questions if you have any. Good luck and have fun!
Well, you need to learn an HDL first, VHDL or Verilog for example. I would recommend picking up a reference book like Volnei Pedroni's VHDL book, or one by Pong P. Chu.
http://www.amazon.com.br/Circuit-Design-Simulation-Volnei-Pedroni/dp/0262014335
http://www.amazon.com/FPGA-Prototyping-VHDL-Examples-Spartan-3/dp/0470185317/ref=asap_bc?ie=UTF8
And you can practice here: http://www.edaplayground.com/ without having to download any tools. Once you've got the hang of the HDL methodology, you can start to think about prototyping, placing, routing and layout on a physical FPGA. I would also recommend taking a look at hardware verification and testbench design; depending on what you're implementing, it will be crucial to have a robust verification environment to avoid huge debugging efforts and lots of headaches.
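The self-checking testbench idea mentioned above can be sketched in plain Python rather than an HDL: drive stimulus into the design under test and compare every output against a trusted reference model. The adder below is a stand-in example, not from any of the recommended books.

```python
# Self-checking testbench concept: a bit-by-bit 4-bit adder as the
# "design under test", checked against a simple arithmetic reference.
import random

def ripple_carry_add(a, b, width=4):
    """Bit-by-bit ripple-carry adder, standing in for the DUT."""
    carry, result = 0, 0
    for i in range(width):
        abit = (a >> i) & 1
        bbit = (b >> i) & 1
        s = abit ^ bbit ^ carry                      # sum bit
        carry = (abit & bbit) | (carry & (abit ^ bbit))  # carry out
        result |= s << i
    return result, carry

def reference(a, b, width=4):
    """Trusted reference model: ordinary integer addition."""
    total = a + b
    return total & ((1 << width) - 1), (total >> width) & 1

# Random stimulus, every result checked against the reference.
random.seed(0)
for _ in range(1000):
    a, b = random.randrange(16), random.randrange(16)
    assert ripple_carry_add(a, b) == reference(a, b), (a, b)
print("all stimulus passed")
```

In a real HDL testbench the structure is the same: a stimulus generator, the DUT, a reference model, and a checker that flags any mismatch automatically instead of you eyeballing waveforms.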
Depends very much where you start. "FPGA development" is a pretty broad field touching electronic and digital design, system architecture, hardware design languages, toolchains, simulation, testing, design synthesis, timing analysis and more. I'm not aware of "one" book covering everything. Here are some popular titles from entry level to advanced with a decent coverage of the mentioned topics:
https://www.amazon.com/Programming-FPGAs-Getting-Started-Verilog/dp/125964376X
https://www.amazon.com/Fundamentals-Digital-Logic-Verilog-Design/dp/0073380547
https://www.amazon.com/Advanced-FPGA-Design-Architecture-Implementation/dp/0470054379
Possibly one of these? They were the only books about coding/computers with the name Charles that I could find. I'm guessing you're talking about the first one. It looks like a more popular version of things, but probably all still new stuff for me, so I'll check it out!
EDIT: The second one looks really promising too. Thanks for the suggestion!
Code: The Hidden Language of Computer Hardware and Software by Charles Petzold
Fundamentals of Logic Design by Charles H. Roth Jr. & Larry L. Kinney
This book is awesome: https://www.amazon.com/Analysis-Design-Linear-Circuits/dp/1118065581
Found my old textbook: https://www.amazon.com/Fundamentals-Logic-Design-Companion-CD-ROM/dp/0495471690/ref=sr_1_2?ie=UTF8&qid=1478084681&sr=8-2&keywords=fundamentals+of+logic+design
There might be better ones out there, but this did the job for me.
I found the book:
http://www.amazon.ca/Fundamentals-Logic-Design-Companion-CD/dp/0495471690/ref=sr_1_1?ie=UTF8&qid=1254258981&sr=8-1
0. Calculus up to derivatives & integrals
(Circuit analysis)
(Mixed logic design & Synthesis of circuits)
Before these, I would highly urge you to finish calculus. These two books are what I started with as a hardware engineer @ university (in Silicon Valley). Then move on to FPGA development. The basic fundamentals are crucial for you to be able to move forward.
Hmmmm I don't consider myself an authority on VHDL/Verilog but I can try and answer your questions.
Here's a link to a PDF, but I do advise buying the book. https://is.muni.cz/el/1433/podzim2008/PV200/um/_eBook__MIT_Press_-_Circuit_Design_with_VHDL__2005_.pdf
https://www.amazon.com/Circuit-Design-Simulation-VHDL-Press/dp/0262014335/ref=sr_1_1?ie=UTF8&qid=1469912010&sr=8-1&keywords=circuit+design+and+simulation
Hope this helps! Feel free to msg me if you have any VHDL questions.
Agree. I picked up on that from the intro to GEB, stopped reading GEB, and decided to get a better understanding of Gödel's proof by reading the book Hofstadter says introduced him to Gödel - Gödel's Proof, by Ernest Nagel and James R. Newman. I recommend it as a very approachable introduction to Gödel's incompleteness theorems. Even now I can recall moments reading that little book where I'd get a big smile on my face as the force of his argument and conclusion would bear down on me. What Gödel did is nothing short of mind blowing.
After that, if you want more, then go to Gödel's Incompleteness Theorems by Raymond M. Smullyan (You'll want to buy this one used). This one is a much more technical, though still approachable if you're prepared at an undergrad level, guide through to Gödel's conclusions. You should go into it with an undergrad level of fluency in propositional and predicate logic.
You can read GEB without all that, certainly without the second book, but I've found it a better experience having more familiarity with Gödel as I work through it.
https://www.amazon.com/Starting-Programming-Design-Computer-Science/dp/0134801156/
I used to be a nanosystems engineering major; most nano engineering is geared towards MEMS and MOSFET production (think mechEng at a tiny scale).
http://www.amazon.com/gp/product/0982299109
http://www.amazon.com/Introduction-Microelectronic-Fabrication-Modular-Devices/dp/0201444941
For books, my adviser seems to like: https://www.amazon.com/VHDL-Digital-Design-Frank-Vahid/dp/0470052635/ref=sr_1_11?ie=UTF8&s=books&qid=1231766888&sr=8-11.
I'm sure it has nothing to do with Frank Vahid being his old advisor... but Frank does know his stuff and his books are typically easy to follow (I have not read this book). It's obviously VHDL and not Verilog, but it's not hard to go from one language to another.
I'm currently reading through: https://www.amazon.com/gp/product/1523364025/ref=oh_aui_detailpage_o04_s03?ie=UTF8&psc=1.
It's SystemVerilog, still not Verilog, but SV is a superset of Verilog, so it may still be useful. Also, if you use SV for verification you will be happy. DPI is your friend.
Hopefully this was a little helpful.
There are a million solutions to the Fermi Paradox; in my mind it really isn't a paradox at all. If you're interested in reading more deeply into this subject, and in considering more possibilities as to why the aliens are not here, I would HIGHLY recommend this book. It's one of the most interesting pieces of non-fiction I have ever read:
http://www.amazon.com/Contact-Alien-Civilizations-Encountering-Extraterrestrials/dp/1441921079/ref=sr_1_1?ie=UTF8&qid=1370528739&sr=8-1&keywords=contact+with+alien+civilizations
You might want to start with a book on logic and program design rather than focus on a specific language. Some of it you will have already picked up implicitly, but it's a good place to start nonetheless. There are PDFs of this book floating around the web.
This book was the text for my Circuits I course. It was also my sister's text, and I think even my dad had a copy from way back.
I'm sure there are others, but this one works pretty well I think. If you're looking for a book.
I would consider reading this book (https://www.amazon.com/Starting-Out-Programming-Logic-Design/dp/0133985075). It is independent of any particular language and teaches you the basics of programming using pseudocode. If you do decide to learn a particular language, you can then write out the pseudocode in that language to help you learn.
Edit: find an earlier edition to save money, maybe the 2nd edition if you can find it.
Tony Gaddis's Starting Out with Programming Logic and Design is pretty good. Pretty language-agnostic in that it uses pseudocode to illustrate programming concepts you've mentioned, plus things like program design.
If you intend on getting up to higher circuit analysis it’s going to require a bit more math. To be able to understand microelectronics/amplification/filtering, you’ll need to learn a bit about differential equations and linear algebra. You can check out this great circuits book (https://www.amazon.com/Analysis-Design-Linear-Circuits/dp/1118065581/ref=nodl_) and once you’ve gotten through that, you can dig deeper into integrated circuits with this book (http://global.oup.com/us/companion.websites/umbrella/sedrasmith/). Good luck!
Fundamentals of Logic Design - Charles H. Roth Jr.
This was my textbook for Digital Electronics Design and Analysis. I can definitely say it was the most useful textbook I ever purchased for school. The class was "self-taught" and the book is written for people who are trying to learn on their own. It makes concepts really easy to understand and covers a wide range of topics, including digital components, VHDL programming, algorithm development and practical applications. If you are even the least bit interested in learning anything about digital electronics, this book is excellent.
Book recommendations:
https://www.amazon.com/Starting-Programming-Design-Computer-Science/dp/0134801156/ref=mp_s_a_1_1?keywords=tony+gaddis+logic&qid=1568820550&s=gateway&sr=8-1
Cheaper older editions of this book would be just as good; it's basically the same content. It's a college book, so they just make small, insignificant changes to keep charging top price (hint: google older-edition PDFs). The book is good, though.
https://www.amazon.com/gp/aw/d/B07SZPTZ1K/ref=tmm_kin_title_0?ie=UTF8&qid=1568820681&sr=8-1-spons
This book will help you write flowcharts and then translate them to an actual programming language (Python) and back. I highly recommend it. There are tons of examples, practice problems, quizzes, etc., with answer keys, all on the author's website.
The authors are a married couple who used to work in software engineering and are now actual high school CS teachers. In other words, they actually use teaching theory in their instruction, specifically the spiral approach (https://en.m.wikipedia.org/wiki/Spiral_approach), which is rare in tech books.
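The pseudocode-first workflow these books teach can be sketched with a tiny example (the pseudocode below is illustrative, not from any particular book): write the logic in pseudocode first, then map each line directly onto a statement in a real language such as Python.

```python
# Pseudocode-to-Python translation, Gaddis-style: the pseudocode lines
# appear as comments, and each maps onto one Python statement.

# Declare Integer total = 0
total = 0

# For counter = 1 To 5
for counter in range(1, 6):
    # Set total = total + counter
    total = total + counter

# Display "Sum is ", total
print("Sum is", total)  # Sum is 15
```

Going the other direction, reading working code and writing the pseudocode for it, is equally good practice for building program-design intuition before worrying about syntax.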
Our Logic class was entirely self-taught; our book was this. It seemed to be good enough, and I'm sure getting a much older edition wouldn't change much at all.
Well, if you want to become an engineer, you'll need to go to college. After you get your prereqs out of the way, the first courses you'll take will be something like Circuits 1 and 2, covering RLC circuits and basic transistors, opamps, etc., and a digital course covering logic gates, flip-flops, etc. Later on, you'll get into Fourier and Laplace transforms, more analog and digital, and elective subjects based on your specialization.
Typical books:
Circuits: http://www.amazon.com/Fundamentals-Electric-Circuits-Charles-Alexander/dp/0077263197
Digital Design: http://www.amazon.com/Fundamentals-Logic-Design-Companion-CD-ROM/dp/0495471690
Thanks!
Once you're up and running and have a few designs working you might find this book useful:
http://www.amazon.com/Advanced-FPGA-Design-Architecture-Implementation/dp/0470054379
https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw
https://www.youtube.com/channel/UC9-y-6csu5WGm29I7JiwpnA
https://www.youtube.com/channel/UCr-5TdGkKszdbboXXsFZJTQ
https://www.youtube.com/channel/UClcE-kVhqyiHCcjYwcpfj9w
https://www.youtube.com/channel/UCeQhZOvNKSBRU0Mdg7V44wA
https://www.youtube.com/channel/UCMy_zy0dw4fCfs2cL7UPBQA
https://www.youtube.com/channel/UCxzC4EngIsMrPmbm6Nxvb-A
https://www.youtube.com/channel/UCYaNsGvyvIupxpecr4rZY9A
- Exercises for programmers: https://www.amazon.fr/dp/1680501224/ref=cm_sw_r_tw_dp_U_x_vAMPCbVG5MBP5
- Daily coding problem: https://www.amazon.fr/dp/1793296634/ref=cm_sw_r_tw_dp_U_x_2AMPCbT6F7Q4S
- Any book on 'programming logic and design'; I particularly like the one by Tony Gaddis titled 'Starting Out with Programming Logic and Design': https://www.amazon.fr/dp/0134801156/ref=cm_sw_r_tw_dp_U_x_2BMPCbEG3XSET
There are some variants of programming languages based on other natural languages, but pretty much every country codes in English. The main exceptions are Russia and China, which I believe have well-developed programming languages in their native languages; I'm not too sure about French. See here for more information: https://en.wikipedia.org/wiki/Non-English-based_programming_languages
The ones I use most in my work are Python and Matlab for the more scientific stuff, but I also use C and C++ a lot.
Hello, I tried clicking the weekly questions link in the sidebar and was directed here. This isn't really a purchase-help post, but I'm really interested in learning the theory behind audio and I don't know where to begin. I took Physics II last year, but I wasn't interested in any of it then and barely remember anything. I still have access to my physics book, which has two chapters about sound and waves, so I plan on reading those.
I understand what amps, preamps, DACs and different kinds of headphones are, but not really the theory behind them. Since I have next to zero knowledge about this, I would like to ask what I should read to learn all this; I'm mainly interested in headphones. Someone recommended Op Amps for Everyone, but I'm kind of intimidated by it. I'm not so sure.
Thanks, any input would be great.
Syllabus looks pretty standard for a digital electronics course. Sedra/Smith's book is also pretty popular among integrated circuit designers. What exactly are you having problems with? Circuits with transistors? CMOS devices? What circuit-based classes have you taken before this?
I've taken a similar class called VLSI design and this is the book we used in it. Explanations in it could be a little easier for you than Sedra/Smith.
You (meaning OP) may find useful either Smullyan's book http://www.amazon.com/Godels-Incompleteness-Theorems-Oxford-Guides/dp/0195046722 or Hofstadter's http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567 .
Gödel's Incompleteness Theorems, by Raymond Smullyan.
From the preface:
> [intended] for the general mathematician, philosopher, computer scientist and any other curious reader who has at least a nodding acquaintance with the symbolism of first-order logic..and who can recognize the logical validity of a few elementary formulas.
I'm guessing most of the people on /r/math meet that description and more. If you want a general introduction to mathematical logic and computation, you should read Computability and Logic by George Boolos. If you can read Boolos, you can probably read Smullyan, and if you read them both you should emerge with some understanding of incompleteness.
EDA Playground (simulate VHDL/Verilog online)
https://www.edaplayground.com/
Check youtube for usage tutorials.
Vendor Materials for General VHDL/VERILOG & FPGA Concepts
Altera Intel https://www.altera.com/support/training/university/materials.html
Xilinx https://www.xilinx.com/support/university/vivado/vivado-teaching-material/hdl-design.html
Past Recommendations
https://www.reddit.com/r/ECE/comments/50tlkl/fpga_bookresource_reccomendations/d7c08i8/
https://www.reddit.com/r/ECE/comments/451e7z/can_fpga_be_self_taught/czusnlc/
I recently picked up two books to refresh my VHDL knowledge. The first one appears to give a good introductory rundown.
https://www.amazon.com/Circuit-Design-Simulation-VHDL-Press/dp/0262014335/
https://www.amazon.com/Effective-Coding-VHDL-Principles-Practice/dp/0262034220/