(Part 2) Best computer hardware & DIY books according to redditors

Jump to the top 20

We found 1,470 Reddit comments discussing the best computer hardware & DIY books. We ranked the 449 resulting products by number of redditors who mentioned them. Here are the products ranked 21-40. You can also go back to the previous section.

Next page

Subcategories:

Microprocessor & system design books
Mainframes & minicomputers books
Computer hardware upgrade & repair books
Robotics books
Computer hardware peripherals books
Computer design & architecture books
Internet & networking books
Personal computer books
Single board computers books

Top Reddit comments about Computer Hardware & DIY:

u/sunny20d · 55 points · r/ProgrammerHumor

I love that book. I just bought the original for school.

Edit: For the curious: http://www.amazon.com/Nutshell-Fourth-Edition-Arnold-Robbins/dp/0596100299

u/EngSciGuy · 51 points · r/science

A decent book that isn't too complicated (some matrix algebra is really the only background necessary, some probability wouldn't hurt);

http://www.amazon.ca/Introduction-Quantum-Computing-Phillip-Kaye/dp/019857049X

u/Enlightenment777 · 42 points · r/ECE

-----

BOOKS

Children's electronics and electricity books:

u/wheeman · 25 points · r/ECE

The Scientist & Engineer's Guide to Digital Signal Processing is a pretty decent book as a crash course. It covers the high level concepts in the first half and the hard core math at the very end.

In the middle there’s a chunk of stuff that’s very practical if you don’t have the time to learn all the math behind it. This is the stuff that I found most useful. It covered the various filters, why you would use one over the other, and basic implementations.

If you really want to learn DSP, a course might be useful but it all depends on what you want from it.

u/DeepMusing · 23 points · r/engineering

Any engineering job is going to have a significant amount of domain knowledge that is specific to that company's products, services, or research. Getting an engineering degree is just the beginning. Once you get a job at a company, you will need to learn a shit load of new terms, IP, history, and procedures that are specific to that company. It's the next level of your education, and will take years to fully assimilate. School doesn't teach you anywhere near enough to walk into most engineering jobs and be independently productive. You are there to learn as much as to do. The senior engineers are your teachers, and gaining their knowledge and experience is the key to building a successful career. You need to look at them as a valuable resource that you should be taking every opportunity to learn from. If you don't understand what they are saying, then ask, take notes, and do independent research to fill in your knowledge gaps. Don't just dismiss what they say as techno-babble.

!!!!!! TAKE THIS TO HEART !!!!! - The single biggest challenge you will have in your engineering career is learning how to work well with your peers, seniors, and managers. Interpersonal skills are ABSOLUTELY critical. Engineering is easy: math, science, physics, chemistry, software, electronics.... all of that is logical and learnable, and a piece of cake compared to dealing with the numerous and often quirky personalities of the other engineers and managers. Your success will be determined by your creativity, productivity, initiative, and intelligence. Your failure will be determined by everyone else around you. If they don't like you, no amount of cleverness or effort on your part will get you ahead. Piss off your peers or managers, and you will be stepped on, marginalized, criticized, and sabotaged. It's the hard truth about the work world that they don't teach you in school. You aren't going anywhere without the support of the people around you. You are much more likely to be successful as a moron that everyone loves than a genius that everyone hates. It sucks, but that's the truth.

You are the new guy, you have lots to learn, and that is normal and expected. It's going to be hard and frustrating for a while, but you will get the hang of it and find your footing. Learn as much as you can, and be appreciative for any help or information that you can get.

As for digitizing a signal, it is correct that you should stick with powers of 2 for a number of technical reasons. At the heart of the FFT algorithm, the signal processing is done in binary. This is part of the "Fast" in Fast Fourier Transform. By sticking with binary and powers of 2, you can simply shift bits or drop bits to multiply or divide by 2, which is lightning fast in hardware. If you use non-power-of-2 values or fractional sampling rates, then the algorithm would need to do extensive floating-point math, which can be much slower for DSPs, embedded CPUs, and FPGAs with fixed-point ALUs. It's about the efficiency of the calculations on a given platform, not what is theoretically possible. Power-of-2 sample counts are much more efficient to process with integer math for almost all digital signal processing.
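
To make the power-of-2 point above concrete, here is a rough numpy timing sketch (the sizes and the exact speedup are only illustrative; they depend on your numpy build and CPU): transforming a power-of-2 number of samples versus a nearby prime-length signal.

```python
# Rough timing sketch: FFT of a power-of-2 length vs. a nearby prime length.
# Only numpy is assumed; the exact ratio depends on your numpy build and CPU.
import time
import numpy as np

def avg_fft_seconds(n, repeats=50):
    """Average wall-clock time for an n-point FFT of random data."""
    x = np.random.rand(n)
    start = time.perf_counter()
    for _ in range(repeats):
        np.fft.fft(x)
    return (time.perf_counter() - start) / repeats

pow2 = 2 ** 16        # 65,536 samples: radix-2 friendly
prime = 65_537        # prime length: no nice factorization
print(f"{pow2}-point FFT: {avg_fft_seconds(pow2) * 1e3:.3f} ms")
print(f"{prime}-point FFT: {avg_fft_seconds(prime) * 1e3:.3f} ms")
```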

I highly recommend reading the book "The Scientist and Engineer's Guide to Digital Signal Processing" by Steven W. Smith. It is by far the best hand-holding, clearly-explained, straight-to-the-point, introductory book for learning the basics of digital signal processing, including the FFT.

You can buy the book from Amazon [here](https://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/ref=sr_1_1?ie=UTF8&qid=1492940980&sr=8-1&keywords=The+Scientist+and+Engineer%27s+Guide+to+Digital+Signal+Processing). If you can afford it, the physical book is great for flipping through and learning tons about different signal processing techniques.

Or you can download the entire book in PDF form legally for free here. The author is actually giving the book away for free in electronic form (chapter by chapter).

Chapter 12 covers FFTs.



u/abstractifier · 22 points · r/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/Horizivertigraph · 16 points · r/QuantumComputing

Don't get discouraged, it's possible to get to a reasonable understanding with some sustained effort. However, you need to get the following into your head as quickly as possible:

Popular level explanations of anything quantum are a waste of your time.

Go back and read that again. You will never get close to understanding the field if you rely on someone else managing to "find the right metaphors" for you. Quantum computing is a mathematical field, and if you want to understand a mathematical field, you need to do mathematics. This sounds super scary, but it's actually no problem! Math is not what you think it is, and is actually a lot of fun to learn. You just need to put some work in. This just means maybe doing an hour or so of learning every day before you go to work, or afterwards.

Let's look at a little bit of a roadmap that you can follow to get to a reasonable understanding of quantum computing / quantum information. This is pretty much the path I followed, and now I am just about to submit my PhD thesis on quantum computational complexity. So I guess it worked out OK.

  1. You can get really far in quantum computing with some basic understanding of linear algebra. Go to Khan Academy and watch their fantastic introduction.

    If Sal asks you to do an exercise, do the exercise.

  2. Once you know what a vector is, can kind of grasp what a vector space is, and have some good intuition for how matrix-vector and matrix-matrix multiplication works, you can probably make a reasonable start on this great intro book (there is a small worked sketch of this vectors-and-matrices picture after this comment): https://www.amazon.co.uk/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965

    Start from the start, take it slowly, and do all of the exercises. Not some of the exercises, do all of the exercises. If you don't know a term, then look it up on wikipedia. If you can't do an exercise, look up similar ideas on Google and see if you can muddle your way through. You need to get good at not being scared of mathematics, and just pushing through and getting to an answer. If there is an explanation that you don't understand, look up that concept and see if you can find somebody else's explanation that does it better. Do the first few intro chapters, then dip in to some of the other chapters to see how far you get. You want to get a pretty good coverage of the topics in the book, so you know that the topics exist and can increase your exposure to the math involved.

  3. If you manage to get through a reasonable chunk of the book from point 2), then you can make a start on the bible: Quantum Computation and Quantum Information by Nielsen and Chuang (https://www.amazon.co.uk/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=pd_lpo_sbs_14_img_1?_encoding=UTF8&psc=1&refRID=S2F1RQKXKN2268JJF3M2). Start from the start, take it slowly, and do all of the exercises.

    Nielsen and Chuang is not easy, but it's doable if you utilise some of the techniques I mention in point 2): Google for alternative explanations of concepts that the book explains in a way that confuses you, do all of the exercises, and try to get good coverage throughout the whole book. Make sure you spend time on the early linear algebra and basic quantum chapters, because if you get good at that stuff then the world is your oyster.

    Edit:

    Just remembered two more excellent resources that really helped me along the way

    A) Quantum mechanics and quantum computation, a video lecture course by Umesh Vazirani (YouTube playlist here) is fantastic. Prof. Vazirani is one of the fathers of the field of quantum computing, with a bunch of great results. His lecture course is very clear, and definitely worth devoting serious attention to. Also, he has a wonderful speaking voice that is very pleasant to listen to...

    B) Another lecture course called "Quantum Computing for the determined", this time given by Michael Nielsen (YouTube playlist here). In my opinion Nielsen is one of the best scientific communicators alive today (see also his unrelated discourse on neural networks and machine learning, really great stuff), and this series of videos is really great. Communicating this sort of stuff well to non-practitioners is pretty much Nielsen's whole jam (he quit academia to go on and write about science communication ), so it's definitely worth looking at.
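
As a small taste of the vectors-and-matrices picture that roadmap leans on, here is a minimal numpy sketch (an illustration only, not taken from any of the books above): a qubit is a length-2 complex vector, a gate is a 2x2 unitary matrix, and measurement probabilities are squared amplitudes.

```python
# Minimal linear-algebra picture of a single qubit.
# A state is a unit-length complex vector; a gate is a unitary matrix.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                  # |0>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)     # Hadamard gate

state = H @ ket0                   # put |0> into an equal superposition
probs = np.abs(state) ** 2         # Born rule: probability = |amplitude|^2
print("amplitudes:", state)        # [0.707..., 0.707...]
print("P(0), P(1):", probs)        # [0.5, 0.5]

# H is its own inverse, so applying it twice returns the original state.
print("H H |0> =", np.round(H @ H @ ket0, 10))   # back to [1, 0]
```

Nothing here is specific to quantum hardware; it is exactly the kind of matrix-vector arithmetic the Khan Academy material in point 1 covers.
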
u/quixotidian · 15 points · r/compsci

Here are some books I've heard are good (I haven't read them myself, but I provide commentary based on what I've heard about them):

u/pmjones · 13 points · r/PHP

Scalable Internet Architectures by Theo Schlossnagle. He was my boss at OmniTI and knows his stuff.

u/[deleted] · 12 points · r/programming

So I've been doing a bit more ASM programming etc lately. I liked this book when I read it, but these days I've gotten interested in really doing fast programming, i.e. taking advantage of the processor's design in your code.

So if you liked this book and wanted to take it to the next step in superfast, I suggest these resources:

  • Agner Fog's optimization page
  • Jon Stokes' book Inside the Machine is AMAZING and covers the dawn of advanced x86 processor design up until recently - all the way from the Pentium to the Core 2 line, and covers PPC design too!

    And if you're on Linux, you NEED to install perf and check if your CPU has any performance counters it can take advantage of. They give tons of insight and may upset some assumptions you have about how your code really performs. valgrind's cachegrind tool is wonderful in the same vein, but only for simulated L1-L2 cache usage.

    Also, if you have one of those fancy new phones with a modern processor, ARM assembly is wonderful and fun (I do it on my iPhone.) Shit, some of them are dual core now. Throw your C code in gcc -S or whatever and look at the generated assembly. I'll try and find my resources for that later.
u/ToTimesTwoisToo · 12 points · r/C_Programming

C targets a virtual memory system and instruction set architecture (ISA). It's an abstraction over the hardware implementation of the ISA. Those worlds are just different, and you'll gain a better understanding if you just study them separately.

for computer architecture, I've found two books to be most helpful.

https://www.amazon.com/Digital-Design-Computer-Architecture-Harris-ebook/dp/B00HEHG7W2

https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/ref=sr_1_1?ie=UTF8&qid=1536687062&sr=8-1&keywords=tanenbaum+computer+architecture&dpID=41B7uYANs%252BL&preST=_SX218_BO1,204,203,200_QL40_&dpSrc=srch

There is a low-level operating system book that uses a lot of C code to explain how to build a kernel. This might interest you:

https://www.amazon.com/Operating-System-Design-Approach-Second/dp/1498712436

u/kainolophobia · 9 points · r/programming

Look into computer engineering. If you're interested in where hardware meets software, you'll explore computer architecture.

See: Computer Architecture: A Quantitative Approach

and
Computer Organization and Design

u/Boojum · 9 points · r/programming

I wouldn't worry about it too much unless it shows up as a hotspot in a profile. Mature compilers these days are pretty good about optimizing loops and applying small tricks like conditional moves to make some things branchless.

Still, the Agner Fog manuals are a pretty good place to start for learning about some of these things. And a good introductory text on computer architecture such as Hennessy and Patterson can be helpful too.

u/VK2DDS · 9 points · r/DSP

+1 for Cortex-M (with FPUs). I'm building a guitar pedal with an STM32F407 and it handles 12x oversampled distortion and a bunch of biquads at 48kHz (mono). It is paired with a CS4272 audio codec with DMA handling the I2S data.

It won't handle any reasonable FIR filter and the RAM limits it to ~500ms delay. There is a discovery board with external RAM but I haven't tried using it.

The F7 series are clocked a bit faster and some come with a double precision FPU instead of single but they have the same instruction set as the F4s. The Cortex-M7 has a longer pipeline (6 Vs 3 stages, probably to support the higher clock rate) so branching is probably less of a penalty on the M4.

This book is an excellent guide to the low-level guts of the Cortex-M3 & M4 chips and contains a chapter dedicated to DSP on the M4. Long story short, it contains a bunch of DSP instructions such as saturating integer arithmetic, integer SIMD, floating-point fused multiply-accumulate, etc., which makes it semi-competitive against "true" DSP cores. The book compares the M4 and the SHARC DSP to show that there's a big jump between them, but the M4 wins hands down for ease of learning & development (strong community, free (GNU) tools, etc.).
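
For a sense of what one of those biquads computes per sample, here is a minimal direct-form-I biquad sketch in Python; the coefficients are an RBJ-cookbook style low-pass at 1 kHz for a 48 kHz rate, chosen purely for illustration and not taken from the pedal described above.

```python
# One biquad (direct form I): the per-sample recurrence a Cortex-M4 would run.
# Coefficients are an illustrative 1 kHz low-pass at fs = 48 kHz.
import math
import numpy as np

fs, f0, Q = 48_000.0, 1_000.0, 0.707
w0 = 2 * math.pi * f0 / fs
alpha = math.sin(w0) / (2 * Q)
cosw = math.cos(w0)

a0 = 1 + alpha
b = np.array([(1 - cosw) / 2, 1 - cosw, (1 - cosw) / 2]) / a0
a = np.array([1.0, -2 * cosw, 1 - alpha]) / a0        # a[0] == 1 after scaling

def biquad(x, b, a):
    """Filter a block of samples one at a time, as an audio interrupt would."""
    y = np.zeros_like(x)
    x1 = x2 = y1 = y2 = 0.0
    for n, xn in enumerate(x):
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, xn, y1, yn
        y[n] = yn
    return y

noise = np.random.randn(48_000)          # one second of white noise
smoothed = biquad(noise, b, a)           # high frequencies attenuated
print(smoothed[:4])
```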

Edit: If you want hardware, this audio codec can be paired with this STM32F7 kit, or this motherboard paired with this STM32F4Discovery board can take it as well.

u/robinoob · 9 points · r/buildapc

In this book you'll find a chapter on the evolution of PC architectures and parts, but I'm not sure this is what you're looking for.

u/jmct · 9 points · r/Physics

Your best bet is to read an introductory text first and wrap your head around what quantum computing is.

I suggest this one: Intro Text

I like it because it isn't very long and still gives a good overview.

My former supervisor has a web tutorial: here

Lastly, Michael Nielsen has a set of video lectures: here

The issue is, there is a decent-sized gap between what these introductions and tutorials will give you and the current state of the art (like the articles you read on arXiv). A good way to bridge this gap is to find papers that are published in something like Physical Review Letters (here is their virtual journal on quantum information) and see what they cite. When you don't understand something, either refer to a text or start following the citations.

Basically, if you can start practicing this kind of activity (the following of references) now, you'll already have a good grasp on a large part of what grad school is about.

Best of luck!

u/Dear_Occupant · 8 points · r/SubredditDrama

At the very earliest, pretty much anything on a computer at the time was written in BASIC, so it was a trivial matter to do things like add new words to Hangman, or change the colors in a Space Invaders clone to make the ships more visible. Later, when the C64 came out, I got this bad boy right here and started learning how to make tweaks to machine code. It was all very primitive, and involved a lot of trial and error, but figuring out things like how to get extra lives in a super hard game like Raid Over Moscow was sooo worth it.

The first game I ever played that came with a "modding tool" as it were was Enchanted Scepters. A friend of mine had that on the Mac. We tried to make our own game and didn't get very far with it, but we did have a lot of fun putting vulgar words in the dialogue of the base game.

E: Wrong link.

u/scasagrande · 7 points · r/ECE

As someone who just finished their MSc at the Institute for Quantum Computing there is a lot of interest for those with EE experience in quantum computing. The IQC has people from many disciplines, including EE.

In our group I was one of the few people that had any EE experience. I did a lot of circuit design and microwave engineering. You'd be surprised how poor the average physics grad student is at the basics of using T&M equipment. If you're more into the chip fabrication side of EE, there are also groups for that in QC.

You will want to take as many quantum mechanics electives as you can. If your school does not offer a QC specific course I suggest you read this book: http://www.amazon.com/Introduction-Quantum-Computing-Phillip-Kaye/dp/019857049X

If you have any specific questions feel free to ask me either here or in a PM.

u/DVWLD · 7 points · r/node

You should start by learning Go, Erlang, Rust and C.

/trolololololololol

Seriously, though, if you're talking about cramming as many users onto a single machine as possible, then Node is not your runtime.

Node is great at building things that scale horizontally. It makes it really easy to write realtime, event based code. Node is really good at doing things that involve a lot of network IO since it's easy to do that in a non-blocking way. It's not a great choice for a high scale game server where memory usage is key.

If you want to know more about horizontal scaling patterns (which Eve only qualifies for if you squint a bit), I'd recommend starting here:

http://www.amazon.com/Scalable-Internet-Architectures-Theo-Schlossnagle/dp/067232699X

And looking at distributed consensus approaches, message queues, and bumming around http://highscalability.com/ a bit.

u/EngrToday · 7 points · r/CUDA

As far as I know this is the go-to for most people learning CUDA programming. For CUDA 9+ specific features, your best bet is probably looking at the programming guide on NVIDIA's site for the 9 or 10 release. I don't believe there's much in terms of published books on specific releases like there is for C++ standards.

u/theoldwizard1 · 7 points · r/hardware

>ISAs are commonly categorized by their complexity, i.e., the size of their instruction space: large ISAs such as x86-64 are called Complex Instruction Set Architectures (CISC), while the chips powering smartphones and other portable, low-power devices are based on a Reduced Instruction Set Architecture (RISC). The huge instructions space of the typical CISC ISA necessitates equally complex and powerful chips while RISC designs tend to be simpler and therefore less power hungry.

I do understand that this is a "once over lightly", but, based on spending a few years of my career working with a team to select a "next gen" embedded processor for a Fortune 50 company (that would purchase millions), I do feel qualified to make these comments. (I am also the proud owner of a well-worn 1st Edition of the Hennessy and Patterson Computer Architecture: A Quantitative Approach.)

The lines between RISC and CISC keep getting muddier every year that goes by. While probably no longer true, the biggest differentiator between RISC and CISC was that RISC used fixed-length instructions. This made decoding the instructions MUCH simpler. The decode portion of a CISC CPU had to grab a few bytes, partially decode them, and then decide how many more bytes to grab.
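
A toy sketch of why fixed-length decoding is simpler: with fixed 4-byte instructions the boundaries are known without looking at the bytes, while a variable-length encoding has to partially decode each instruction just to find the next one. The opcode-to-length table below is entirely made up for illustration.

```python
# Toy contrast between fixed-length and variable-length instruction decoding.
# The "ISA" is invented; only the shape of the decode loop matters here.

def decode_fixed(stream, width=4):
    """Fixed-length: boundaries are i*width, no decoding needed to find them."""
    return [stream[i:i + width] for i in range(0, len(stream), width)]

# Variable-length: the opcode byte determines the instruction's total size.
OPCODE_LENGTH = {0x01: 1, 0x02: 2, 0x03: 4, 0x04: 6}    # made-up encoding

def decode_variable(stream):
    """Variable-length: each opcode must be inspected before moving on."""
    out, i = [], 0
    while i < len(stream):
        size = OPCODE_LENGTH[stream[i]]
        out.append(stream[i:i + size])
        i += size
    return out

print(decode_fixed(bytes(range(12))))                        # three 4-byte words
print(decode_variable(bytes([0x02, 0xAA, 0x01, 0x03, 0x10, 0x20, 0x30])))
```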

The old Digital Equipment Corporation VAX architecture was (and probably still is) the MOST complex instruction set architecture. Most arithmetic and logical operations could have 3 operands, and each operand could have any combination of multiple addressing modes. Worse, the VAX architecture dedicated 3 of the only 16 registers to "context" (SP, FP and AP).

RISC machines had more registers than CISC machines and, over time, compiler writers figured out how to do the equivalent of the FP and AP from deltas off the SP. With the larger number of registers, typically one register was a dedicated constant zero register, necessary because all memory was accessed via indirect addressing. For embedded processors that had no loader to do "fix up" at load time, 1 or 2 more registers became dedicated pointers to specific types of memory (perhaps RAM vs ROM, or "short" data vs "complex" data, i.e. arrays, strings, etc.).

With smaller die sizes, RISC machines could have more cache on chip. More cache meant "more faster" !

u/elchief99 · 5 points · r/raspberry_pi

Buy the Maker's Guide to the Zombie Apocalypse and make some of the contraptions in it. They use RPis

u/dolphinrisky · 5 points · r/Physics

Ah gotcha, yeah to be honest this approach probably won't be terribly illuminating. The problem is that the D-Wave really doesn't work in any kind of classically equivalent way. When you think about algorithms classically, the procedure is highly linear. First you do this, then that, and finally the other. The D-Wave One involves nothing of the sort.

Here's a quick rundown of what a quantum annealing machine actually does, with analogies to (hopefully) clarify a few things. In fact, an analogy is where I'll start. Suppose you had a problem you were working on, and in the course of trying to find the solution you notice that the equation you need to solve looks just like the equation describing how a spring moves with a mass hanging from it. Now you could continue your work, ignoring this coincidence, and solve out the equation on your own. Alternatively, you could go to the storage closet, grab a spring and a mass, and let the physics do the work for you. By observing the motion of the spring, you have found the solution to your original problem (because the equations were the same to begin with).

This is the same process used by the D-Wave One, but instead of a spring and a mass, the D-Wave system uses the physics of something called an Ising system (or model, or problem, etc.). In an Ising system, you have a series of particles^ with nonzero spin that can interact with each other. You arrange this system so that you can easily solve for the ground state (lowest energy) configuration. Now with the system in this ground state, you very, very slowly vary the parameters of the system so that the ground state changes from the one you could easily solve to one that you can't. Of course this new ground state, if you've done things correctly, will be the solution to the problem you were actually concerned with in the first place, just like the spring-mass example above.

So perhaps now I have explained at least a little bit of why I don't call the D-Wave One a "computer". It doesn't compute things. Rather, by a happy coincidence, it sets up an experiment (i.e. the Ising system) which results in a measurement that gives you the answer to the problem you were trying to solve. Unfortunately for you, the software engineer, this resembles precisely nothing of the usual programming-based approach to solving problems on a classical computer.
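
To make the Ising picture a bit more concrete, here is a purely classical simulated-annealing toy in Python: it cools a small spin chain toward a low-energy configuration. This is only a caricature of the optimization problem being described; the D-Wave hardware reaches the ground state through physical annealing, not a Metropolis loop, and the couplings below are invented.

```python
# Classical caricature of "find the ground state of an Ising system".
# This is the optimization target the comment describes, solved here by
# ordinary simulated annealing rather than by quantum hardware.
import math
import random

random.seed(0)
N = 10
J = [random.choice([-1.0, 1.0]) for _ in range(N - 1)]   # invented couplings

def energy(spins):
    """Ising chain energy: E = -sum_i J_i * s_i * s_(i+1)."""
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(N - 1))

spins = [random.choice([-1, 1]) for _ in range(N)]
T = 2.0
while T > 0.01:
    for _ in range(200):
        i = random.randrange(N)
        before = energy(spins)
        spins[i] *= -1                    # propose flipping spin i
        dE = energy(spins) - before
        if dE > 0 and random.random() > math.exp(-dE / T):
            spins[i] *= -1                # reject the flip
    T *= 0.9                              # cool slowly

print("spins :", spins)
print("energy:", energy(spins), "(minimum possible is", -(N - 1), ")")
```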

My advice is this: if you want to learn some quantum computing, check out An Introduction to Quantum Computing by Kaye, Laflamme, and Mosca, or the classic Quantum Computation and Quantum Information by Nielson and Chuang.

^
They don't actually have to be single particles (e.g. electrons), but rather they are only required to have spin interactions with each other, as this is the physical mechanism on which computations are based.

Edit: Okay, this was supposed to be a reply to achille below, but apparently I'm not so good with computers.

u/smeezy · 5 points · r/iOSProgramming
  1. You should learn Objective-C. Start with Learning Objective-C from the Developer site, and follow the rabbit trail to other documents. Also, read up on iOS Application Design

  2. Yes. You can register your app to be woken up in case of a significant location change. Or, you can register your app for continuous location updates in the background, which will kill the user's battery if not used correctly. See Executing Code in the Background.
  3. It may be easier for you to pick up Cocoa programming on the Mac before going to the iPhone. Pick up Aaron Hillegass's excellent Cocoa Programming for Mac OSX and read the first five chapters. (I noticed that Hillegass has produced a new iPhone Programming textbook. I haven't read it but it has good reviews).
u/fogus · 5 points · r/programming

The Commodore 64 Programmer's Reference Guide. http://www.amazon.com/Commodore-64-Programmers-Reference-Guide/dp/0672220563

I wore that book out.
-m

u/nwndarkness · 4 points · r/FPGA

Computer Organization and Design RISC-V Edition: The Hardware Software Interface (ISSN) https://www.amazon.com/dp/B0714LM21Z/ref=cm_sw_r_cp_api_i_Wn3xDbMYHH61S

Computer Architecture: A Quantitative Approach (The Morgan Kaufmann Series in Computer Architecture and Design) https://www.amazon.com/dp/0128119055/ref=cm_sw_r_cp_api_i_jp3xDbRYQ12GA

u/rcaraw1 · 4 points · r/iphone

Thanks a whole lot!

I read a few basic books like this one for the first few weeks. Then I really just kept an idea journal and picked a few easy ideas out of it to get started. Once I decided what I thought should go into the app, I just dove in and started messing around until I eventually reached something that worked.

I only started programming a year before in Java and Android but decided to give iPhone development a shot because I used an iPhone.

u/yumology · 4 points · r/arduino

Might want to check out the Maker's Guide to the Zombie Apocalypse. It has lots of home defense things you can do with an Arduino.

u/xiangwangzhe · 4 points · r/csharp

Pro C# 7: With .NET and .NET Core https://www.amazon.co.uk/dp/1484230175/ref=cm_sw_r_cp_api_i_suTDCbEGBZC36

Really very good. Teaches the modern fundamentals of OOP in a clear way, as well as comprehensively covering C# and .NET features.

u/jhillatwork · 3 points · r/compsci

In addition to these, check out Computer Architecture: A Quantitative Approach by Hennessy & Patterson. I had this as a textbook as an undergrad and still throw it at folks when they are doing very low-level optimizations that require intimate understanding of modern computers.

u/RetardedSmackWhore · 3 points · r/QuantumComputing

I began by just plowing through wikipedia articles and trying to get the basic concepts in broad strokes. I would suggest that you do that.

I found this book, that is totally awesome: Quantum computing for computer scientists

Quantum computing for the determined, already mentioned here, is also awesome.

There is also IBMs quantum experience web site, that contains a front-end for programming a real 5-qubit quantum computer. It is free of charge, you just need to register with an email address in order to access it. Soon (as in a few days) there will also be a 16-qubit computer to play with at that site. Playing with the real thing is very good for study morale, at least for me..

u/MatmaRex · 3 points · r/pics

O'Reilly books look like that. This one was, apparently, Unix in a Nutshell.

u/Shagnasty · 3 points · r/iphone

I say Border's Book - since they're going out of business, I got the Big Nerd Ranch guide to iPhone programming for 50% off.

u/TheHighlander71 · 3 points · r/c64

Although many C64 machines continue to work flawlessly, there is a probability that the original hardware will fail. So, when you buy a machine, make sure it actually works and that all the keys on the keyboard work well.

Eventually chips may fail. The usual suspects are the two CIA chips, the PLA chip, memory chips and perhaps even the SID (sound) and VIC (video) chips. You'll have to replace them if they fail. Note that Commodore produced cost reduced main boards towards the end of the C64's lifespan, which are not 100% the same as the ones that came before.

C64 reloaded is a C64 board you can buy which allows you to insert legacy c64 chips in a new main board.

Ultimate64 is an FPGA based 'implementation' of the original c64 hardware. Doesn't need any legacy hardware, but is a full working C64

1541Ultimate is a 1541 disk drive and tape emulator that slots into your C64 (much like the sd2iec)

Ray Carlsen is a great resource for hardware related things.

The original PSU has a tendency to fail. Failure of the PSU can fry chips in your C64. There are modern PSU's to prevent that from happening, or you can get a 'power saver' which serves the purpose of protecting your c64 from PSU failures.

Mapping the C64: learn this and you know everything there is to know about your C64 hardware. It's a lot to take in.

Mapping the C64, the book: this book is also essential, together with Commodore's "Programmer's Reference Guide".

Programmer's Reference Guide: you need this.

Welcome to the world of C64, have a nice journey.

u/mschaef · 3 points · r/programming

> Second, you don't have books or toolchain to make "native" software. Third, you don't even know what books or toolchain are required.

I don't know if the situation was quite as terrible as you seem to imply. There were quite a few hobbyist magazines that, back in the day, went into some great depth on how to program the machines. Until the late 80's, Byte magazine even included articles describing how to build hardware, up to and including full computers and co-processor boards. There were also a large number of technical reference books commonly available at bookstores.

It's worth noting that all of these were very commonly available at mass market bookstores. (At least the bookstores I went to in Houston.)

http://www.amazon.com/Commodore-64-Programmers-Reference-Guide/dp/0672220563

http://www.amazon.com/Mapping-Commodore-64C-Sheldon-Leemon/dp/0874550823/ref=sr_1_1?ie=UTF8&s=books&qid=1213025886&sr=8-1

http://www.amazon.com/Mapping-Atari-Ian-Chadwick/dp/0874550041/ref=sr_1_1?ie=UTF8&s=books&qid=1213025986&sr=1-1

On the Apple ][, this book originally came with the machine, and covered everything from unpacking the box, to a firmware listing, to a schematic and pinouts.

http://apple2history.org/museum/books_manuals/a2refmanorig.html

Regarding toolchain availablity, Apple machines came with a BASIC that loaded on startup and a built in assembly language monitor (call -151, IIRC). On the C64, Jim Butterfield had a nice monitor that was also commonly available. IBM machines came with a complete BIOS listing and pinouts of all major ports, including the expansion slots.


And, while Google wasn't around, BBSs were around, and they tended to be more specialized to computer hobbyists than 'the Internet', so search was less important.

u/pyroscopic24 · 3 points · r/CUDA

I agree with this guy. Having a solid C++ background is good, but programming for CUDA specifically is something else.

The book that I used when I took CUDA programming as an undergrad was this:
Programming Massively Parallel Processors: A Hands-on Approach 3rd Edition


Here's a sample of the 1st edition of the book. It's not too far from the 3rd edition, but check out Chapter 3 to see how different it is from typical C++ programming.

u/solid7 · 3 points · r/learnprogramming

In that pile-o-stuff there are really two main subjects: architecture and operating systems. I'd pick up recent copies of the dinosaur book and where's waldo. Silberschatz and Tanenbaum are seminal authors on both subjects.

There are numerous resources to learn C. Since I seem to be recommending books, Kernighan and Ritchie's book is pretty much the gold standard.

Good luck.

u/xamino · 3 points · r/learnprogramming

I don't think we need to go that deep. An excellent book on how CPUs work at the assembly level is Inside the Machine. I can only recommend it even for programmers.

u/pitch_away · 3 points · r/AskElectronics

Here are 2 awesome guides: 1 & 2. But as indicated in this thread, you probably should get a well-known microcontroller and use it to build a knowledge base. The Arduino is an Italian microcontroller board that is based on an Atmel chipset. It has a massive online following, and support for it can be found in /r/Arduino or here at their website. It has numerous shields that can be added on to add features. These are things like GPRS (SMS and mobile connectivity), Ethernet (networking) and motor control. You can buy components and such from: https://www.sparkfun.com/ ; https://www.adafruit.com/ ; http://www.mouser.com/ . The Arduino favors hardware prototyping and tinkering. The Arduino is programmed using its own software that is free and available. It has its own IDE (Integrated Development Environment) and is programmed using a simplified dialect of C/C++. It is quite easy to use, well supported and open source.

Also, you might consider the Raspberry Pi, which is explained in this TED Talk by Eben Upton, one of the creators of the board. I believe it uses an ARM-based Broadcom chip. The "A" board is slightly smaller (storage) and boots Linux from an SD card or flash, as does the "B", which has slightly more storage and networking hardware. The Pi is typically programmed in Python but can be used, I think, with almost any language (C, C++, Assembly, etc.).

Also, there is a micro called the BeagleBone. It is similar to the Pi but has a few different features. It is very powerful and can be researched (as a starting point) here. I know very little about this board and believe it to be more advanced than the former 2 I had mentioned.

These resources can be used for the Arduino: Getting Started and Cookbook.

A few resources to get started: Python & Pi and Getting Started

The first 2 resources I listed, 1 & 2, are absolutely brilliant. They teach basic electronics and introduce Eagle CAD, a free PCB (printed circuit board) program that people use to draw schematics and PCBs, which is pretty important. I linked the free version, which is more than powerful enough for a beginner. The resource [1] is really helpful; I would read it thoroughly if interested. Also, places like http://makezine.com/ are good for DIYers. Also you might like this news channel that follows hacker stuff (it is from Y Combinator, an incubator for some Silicon Valley startups) listed here. These links should cover you for a while.

u/motivated_electron · 3 points · r/ECE

Hi,

I have two-part suggestion for you. Naturally, this is really just what I did to move from your shoes to where I am now (writing device drivers, and using real-time operating systems) on nice toolchains (sets of IDEs coupled with compilers and debugger interfaces).

The first thing you ought to do next is focus on learning C. ANSI C.

This is the book to use: Introduction to Embedded Systems by David Russell

By actually stepping through the book, you'll get to learn what embedded is all about, without having to learn a new debugging interface (because you won't have one), and without having to buy a new board.

The book uses the Arduino Uno board, and the Arduino IDE, to teach you how to NOT use the Arduino API and libraries. This teaches you about the "building" process of C programs - the compilation, linking, bootloaders, etc. You'll learn all the low-level stuff, on a platform (the ATmega328P) that is easier to understand. You'll even get into making interrupt-based programs, a buffered serial driver, pulse-width modulation, input capture on timers, etc.

Ever since having gone through the book, I've been able to move to other platforms, knowing that the ideas are essentially the same, but more involved. If you were to jump straight into the Arm Cortex-based 32 bit processors, you would feel rather overwhelmed by the complexity of the peripherals that typically get placed onto the same die. You would end up resorting to high level hardware abstraction libraries (like ST Microelectronic's horrid HAL) and MAYBE be able to use them, but you would have no idea what would actually be going on. As soon as a library stops working, you need to be able to know where it broke and why.

So do this patiently.

Start with the 8 bit micro on the Arduino using the book, using basic timers, basic interrupts, basic C types.


Only later is it wise to pick up an ARM board to start experimenting with. These devices are powerful monsters that get work done. But you won't have an appreciation for what they and their powerful peripherals can do until you've wrestled with their simpler cousin, the ATmega328P.

You'll have to pick an IDE, (or none if you really want to know and understand the build process), a set of libraries to use (or none, if you want to learn the monster peripherals the hard way), and an operating system (or none, if you want to stick to bare-metal programs, but really, a 120 MHz cpu with all that power and no OS is sort of a waste).

I would recommend starting with the TIVA C series board from Texas Instruments. It's used in the very nicely paced Intro to ARM Microcontrollers by Jonathan Valvano.

That would be another book well worth the time to go through like a tutorial, because it is.

These book have also helped me along the way: Definitive Guide Cortex M3 Cortex M4 Processors by Joseph Yiu
and Computer Organization and Embedded Systems by Zvonko Vranesic.

If you want to explore this field, please be prepared to be patient. The learning curves here can be steep. But the things you'll be capable of making are incredible if you keep it up. I hope your exploration goes well!

u/wgren · 3 points · r/dcpu_16_programming

Code: The Hidden Language of Computer Hardware and Software, The Elements of Computing Systems and Inside the Machine were recommended on Hacker News.

I have the last one, I will re-read it over Easter holidays...

u/nbneo · 3 points · r/fsharp

I found this to be a good guide to .net: Pro C# 7: With .NET and .NET Core

u/ArithmeticIsHard · 3 points · r/Cplusplus

When I took a High Performance Computing course, this book came in handy.

Programming Massively Parallel Processors: A Hands-on Approach https://www.amazon.com/dp/0128119861/ref=cm_sw_r_cp_api_i_Xc3SCbDS47WCP

u/RoombaCultist · 2 points · r/ECE

1st year EE student here.

For circuit analysis my biggest resources have been allaboutcircuits.com, and Jim Pytel's Youtube videos. I need to send that guy a dozen beers for getting me through my first two terms of circuits.

For the math: review algebra (exponents in particular), trigonometry, and, if your program will be using calculus, review that too. When you get to studying AC, you'll get to use phasors, which make for far less calculus and honestly make more sense. Phasors were completely new for me, but were completely doable thanks to Jim Pytel's lecture. Honestly though, it's mostly algebra; Ohm's Law and KVL will get you far.
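
As a concrete example of the phasor point, here is a short Python snippet that analyses a series RC divider at one frequency using complex impedances instead of a differential equation; the component values are arbitrary.

```python
# Phasor analysis of a series RC low-pass divider at a single frequency.
# Complex arithmetic stands in for the calculus; values are arbitrary.
import cmath
import math

R = 1_000.0        # ohms
C = 100e-9         # farads (100 nF)
f = 1_000.0        # hertz
Vin = 5.0          # volts, taken as the 0-degree reference phasor

omega = 2 * math.pi * f
Zc = 1 / (1j * omega * C)           # capacitor impedance as a complex number
Vout = Vin * Zc / (R + Zc)          # ordinary voltage-divider algebra

print(f"|Vout| = {abs(Vout):.3f} V")
print(f"phase  = {math.degrees(cmath.phase(Vout)):.1f} degrees")
```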

If/when you get interested digital circuit design check out Ben Eater. His explanations for what is going on in a circuit are far more clear than any other resource I've found so far.

Of course all that theory is pretty dry for a fun summer break. It might be best to bookmark those until classes actually start, then use those resources to complement your class. A more fun approach might be to get yourself a basic electronics kit, a breadboard, and a DMM (that's digital multimeter) and build some circuits out of a book like any of Forrest Mims's books, or a more modern "maker flavored" book. Then probe the circuits you make and ask yourself "What's going on here?"

Also, the sooner you can get your hands on a good oscilloscope, the better. It's the best tool for seeing what's going on in a circuit because it shows how voltage relates to time, unlike a DMM which shows an average. A DMM is indispensable (and affordable), so it will likely be the first major tool in your kit, but don't be shy of all the knobs on the fancy expensive box.

u/maredsous10 · 2 points · r/ECE

Links
www.dspguru.com

Videos
Oppenheim's MIT Lectures
(http://ocw.mit.edu/resources/res-6-008-digital-signal-processing-spring-2011/video-lectures/)
Digital Filters I through V (Hamming Learning to Learn on Youtube)
Monty's Presentations http://www.xiph.org/video/

Books
Schaum's Digital Signal Processing (<= Recommended It's good and cheap.)
http://www.amazon.com/Schaums-Outline-Digital-Processing-Edition/dp/0071635092
Signals and Systems Made Ridiculously Simple
http://www.amazon.com/Signals-Systems-Made-Ridiculously-Simple/dp/0964375214
ftp://ftp.cefetes.br/cursos/EngenhariaEletrica/Hans/Sinais%20e%20Sistemas/ZIZI%20Press%20-%20Signals%20and%20Systems%20Made%20Ridiculously%20Simple.pdf

Discrete Time Signal Processing
http://www.amazon.com/Discrete-Time-Signal-Processing-Edition-Prentice/dp/0131988425/

Discrete Time Signal Processing (Older version I used in school)
http://www.amazon.com/Discrete-Time-Signal-Processing-Edition-Prentice-Hall/dp/0137549202/

DSP using MATLAB
http://www.amazon.com/Digital-Signal-Processing-Using-MATLAB/dp/1111427372/

Digital Signal Processing, Proakis
(Similar to the Oppenheim book, but I found it clearer in some instances, from what I remember.)
http://www.amazon.com/Digital-Signal-Processing-4th-Edition/dp/0131873741/

Books I've seen around
Understanding Digital Signal Processing
http://www.amazon.com/Understanding-Digital-Signal-Processing-Edition/dp/0137027419/

The Scientist and Engineer's Guide to Digital Signal Processing
http://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/

http://www.dspguide.com


u/skovos · 2 points · r/needadvice

It's a broad field, but I think "How Computers Work" by Ron White would be a great starting point to get the core concepts down:
https://www.amazon.com/How-Computers-Work-9th-White/dp/0789736136

I'm happy you want to get involved with computers, they truly are more amazing the more you learn about them. Let me know if I can help.

u/albatrossy · 2 points · r/DSP

It kind of sounds like you'd be good just getting a textbook. I think any book will be fine since you mainly just want questions (and presumably answers), but try to find one that implements code in a language that you're comfortable with, or that you want to learn.

There are a lot of different "final year" DSP courses, but it sounds like you want something covering the fundamentals rather than anything too advanced. I started off with The Scientist & Engineer's Guide to Digital Signal Processing and then used Signals and Systems for my first undergraduate course, but we used it largely because he co-authored it. I would recommend scouring the web for some free books though. There are books like ThinkDSP popping up that seem pretty neat.

Edit: Oppenheim is always mentioned also.

u/mitchell271 · 2 points · r/audioengineering

Software dev checking in. If you want to go into plugin design, make sure you read books like The Scientist And Engineer's Guide to Digital Signal Processing, and have a heavy focus on algorithms, physics, and matrix math.

There are SDKs and APIs to help though. The Steinberg VST SDK is how VST plugins are made, and it removes a lot of the underlying math that you need to know. Writing multi-threaded C code with a library like OpenMP will also help, as your plugins will be more efficient, resulting in less latency.

u/loubs001 · 2 points · r/hardware

Agree. It depends on what you want to know, and how much you're willing to commit to learning. It's a big world. Code is a nice book if you want a very, very simple explanation of the basics of bits and bytes and logic gates. It might be a good place to start, though it's intended for a non-technical audience and you may find it a little TOO simple. A proper digital systems book will go into much more detail about digital logic (AND gates, flip-flops, etc.). You might be surprised just how easy the fundamentals are to learn. I learned from Tocci, which I found to be excellent, but that was a long time ago and I'm sure there's many other good ones around.

That's pretty low-level digital circuits though. If you are really serious about learning computer architecture, I'd highly recommend Patterson and Hennessy. It covers the guts of how processors execute instructions, pipelining, caches, virtual memory and more.

If you're more interested in specific, modern technologies... then obviously Wikipedia, or good tech review sites. Especially reviews that focus on major new architectures. I remember reading lots of good in depth stuff about Intel's Nehalem architecture back when it was new, or nvidia's Fermi. There's a wealth of information out there about CUDA and GPU computing which may give you a sense of how GPUs are so different to CPUs. Also when I first started learning many years ago, I loved my copy of Upgrading and Repairing PCs , great for a less technical, more hobbyist perspective.

Lastly, ask questions! For example, you ask about DDR vs GDDR. Deep inside the memory chips themselves, there's actually not a great deal of difference. But the interfaces between the memory and the processor are quite different; they're designed for very different purposes. I'm simplifying here, but CPUs have relatively low levels of parallelism, they tend to operate on small units of memory (say a single value) at a time, they have quite unpredictable access patterns so low latency is essential, and the cores often work tightly together so coherency has to be maintained. GPUs have a very predictable access pattern, so you can load much larger chunks at a time, latency is less important since you can easily keep your processors busy while memory is streamed in, and the GPU's many tiny processors for the most part all work on separate words of memory, so coherence usually does not need to be maintained and they have much less need for caches.

The "L" (Level) naming for caches is quite simple. Memory that is closer to the core is faster to access. Generally each core has it's own L1 and L2, with L2 being slightly slower but there's more of it, and all cores share an L3, slower still but way more of it. Memory on the cpu is made out of transistors and is super fast but also takes up alot of space. Look how big the L3 is (here)[http://www.anandtech.com/show/8426/the-intel-haswell-e-cpu-review-core-i7-5960x-i7-5930k-i7-5820k-tested] and that's just 20MB. external ram is obviously much slower, but it is made out of capacitors and has much higher densities.

u/Hello_Dongan · 2 points · r/DSP

I personally like The Scientist & Engineer's Guide to Digital Signal Processing. The author explains a lot of concepts very clearly in layman's terms. I think the only flaw is that it doesn't cover a ton of material, only the basics.

Other than that, I think Mitra is a good book. One thing to look out for is its errata list. It's somewhat frustrating to have to double check for errors in the book when working homework problems.

u/zitterbewegung · 2 points · r/compsci

If I were you I would read this book, Quantum Computing for Computer Scientists, which is geared toward undergraduates.

He has an arXiv article on which he based his book. See An Introduction to Quantum Computing.

Someone else suggested other books. Those are more geared toward graduates and I would read these books first and then try to tackle those.

u/mearse · 2 points · r/computers

One of the best books on the topic. It's been revised year after year. While I can't speak to this current version, I used an earlier version years ago and it had just about everything you could ever want to know on the hardware side of things.

http://www.amazon.com/Upgrading-Repairing-21st-Scott-Mueller/dp/0789750007

u/beeff · 2 points · r/lisp

Check your sources; all of the listed techniques have been around in computer architectures for decades. Cf. Computer Architectures

u/CodeTamarin · 2 points · r/computerscience

The Stanford Algorithms book is complete overkill in my opinion; do NOT read that book. That's insane. Read it when you've been doing programming for a while and have a grasp of how it even applies.

Here's my list, it's a "wanna be a decent junior" list:

  • Computer Science Distilled
  • Java/ C# / PHP/ JS (pick one)
  • Do some Programming Challenges
  • SQL
  • Maybe build a small web app. Don't worry about structure so much, just build something simple.
  • Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  • Head First Design Patterns
  • Clean Architecture
  • Refactoring: Improving the Design of Existing Code
  • If you're interested in Web
  • Soft Skills: Power of Habit, A Mind for Numbers, Productivity Project

    ​

    Reasoning: So, the first book is to give you a sense of all that's out there. It's short and sweet and primes you for what's ahead. It helps you understand most of the basic industry buzz words and whatnot. It answers a lot of unknown unknowns for a newbie.

    Next is just a list of languages off the top of my head. But you can pick anything; seriously, it's not a big deal. I did put Java first because that's the most popular and you'll likely find a mountain of resources.

    Then after some focused practice, I suggest grabbing some SQL. You don't need to be an expert but you gotta know about DBs to some degree.

    Then I put an analysis book that's OOP-focused. The nifty thing about that book is it breaks into design patterns nicely, with some very simple design patterns to introduce you to design patterns and GRASP.

    Then I put in a legit Design Patterns book that explains and explores design patterns and principles associated with many of them.

    Now that you know how code is structured, you're ready for a conversation about Architecture. Clean architecture is a simple primer on the topic. Nothing too crazy, just preps you for the idea of architecture and dealing with it.

    Finally, refactoring is great for working devs. Often your early work will be focused on working with legacy code. Then knowing how to deal with those problems can be helpful.

    FINAL NOTE: Read the soft skills books first.

    The reason for reading the soft skills books first is it helps develop a mental framework for learning all the stuff.

    Good luck! I get this isn't strictly computer science and it's likely focused more toward Software Development. But I hope it helps. If it doesn't. My apologies.
u/gmartres · 2 points · r/askscience

A simpler quantum algorithm: https://secure.wikimedia.org/wikipedia/en/wiki/Deutsch%27s_Algorithm though I'm not sure that'll enlighten you. If you're really interested I'd recommend Quantum Computing for Computer Scientists, which is a great intro and doesn't assume much knowledge (the first chapter introduces complex numbers, the second linear algebra).
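
Since the linked algorithm is tiny, it can be simulated directly with the linear algebra from that book's early chapters. Below is a hedged numpy sketch of Deutsch's algorithm: one query to the standard |x, y> -> |x, y XOR f(x)> oracle decides whether a one-bit function is constant or balanced.

```python
# Deutsch's algorithm simulated with plain numpy linear algebra.
# One oracle query decides whether f: {0,1} -> {0,1} is constant or balanced.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I = np.eye(2)

def oracle(f):
    """The textbook oracle |x, y> -> |x, y XOR f(x)> as a 4x4 permutation."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1])        # start in |0>|1>
    state = np.kron(H, H) @ state          # superpose both registers
    state = oracle(f) @ state              # a single oracle query
    state = np.kron(H, I) @ state          # interfere the query register
    p_one = state[2] ** 2 + state[3] ** 2  # probability the first qubit reads 1
    return "balanced" if p_one > 0.5 else "constant"

print(deutsch(lambda x: 0))       # constant
print(deutsch(lambda x: 1))       # constant
print(deutsch(lambda x: x))       # balanced
print(deutsch(lambda x: 1 - x))   # balanced
```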

u/solyanik · 2 points · r/compsci

Why not change the school?

If this is not an option, you have to study yourself. Here are the books:
http://www.amazon.com/dp/0262033844
http://www.amazon.com/dp/0123704901

u/creav · 2 points · r/programming

> I'm personally happy that didn't focus on any single language that much.

Programming Language Pragmatics is one of my favorite books of all time!

u/FormerlyTurnipHugger · 2 points · r/Physics

The good thing about quantum information is that it's mostly linear algebra, once you're past the quantization itself. Better still, you don't have to understand the quantization in order to understand QI.

There are books written about quantum computing specifically for non-physicists. Mind you, they are written for engineers and computer scientists instead and they're supposed to know more maths and physics than you as well. Still, you could pick up one of those, e.g. the one by Mosca, or even better the one by David Mermin.

There are also two very new popular-science books on the topic, one by Jonathan Dowling, Schrödinger's Killer App, and one by Scott Aaronson, Quantum computing since Democritus.

u/PCneedsBuilding · 2 points · r/computerscience

I enjoyed this book: https://www.amazon.ca/Definitive-Guide-Cortex®-M3-Cortex®-M4-Processors/dp/0124080820. You're also going to have to get acquainted with the technical reference manuals on ARM's website.

u/tjlusco · 2 points · r/ElectricalEngineering

I'm going to be frank, this is probably the worst engineering article I've ever read. I may be biased because I majored in control systems, but this article doesn't even remotely cover what would be a control systems 101 introductory lecture, it is littered with grammatical and technical inaccuracies, and is completely devoid of technical depth that someone who would bother reading the article would be interested in. It is also obvious that the submitter is also affiliated with the site, not that I have a problem with shameless self promotion but this is simply bad content.

For those who would like a good introduction to control systems, this is IMHO the best text on the subject: Modern Control Systems, R.H. Bishop. (Amazon, Torrent)

u/Caret · 2 points · r/hardware

As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented so to speak).

They are not EE textbooks (as far as I know) but they are text books nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.

Hope that helps!

u/NiuRouGan · 2 points · r/csharp

I used "Pro C# 5.0 and the .NET 4.5 Framework 6th Ed" to teach myself C#.

​

I'm pretty sure there are newer editions now, but the content will be mostly the same, especially at beginner levels.

u/subheight640 · 2 points · r/engineering

http://www.amazon.com/Modern-Control-Systems-12th-Edition/dp/0136024580

This is the book written by my former controls professor.

u/stupider_than_you · 2 points · r/robotics

I learned from Modern Control Systems by Bishop and Dorf; I thought it was pretty good. Here is the newest version.

u/replicated · 2 points · r/Cyberpunk

Subjects like this book on computers and physics interest me A LOT. Does this mean I might like electrical engineering?

Although I like the subjects, I'm horrible at math and by NO means an expert at anything beyond those casual presentations. I'm nearing college and with so many interests, I need to decide on something. I love cyberpunk and what you've said sounds great; I'm just worried about the math... I don't mean to hijack, but I didn't want to start a new post on "cyberpunk careers".

u/Jiveturkeyjibbajabba · 2 points · r/buildapc

1200 pages. Get reading.

On a more serious note, just keep reading and learning. If you come across a term you don't know what it means, google it for pictures, and to understand what it does.

u/vtomole · 1 point · r/QuantumComputing

I haven't entirely read this book, but I remember that it explains Grover's algorithm well. It might be what you are looking for because it states that its target audience is undergraduate computer science students.

u/ElectricWraith · 1 point · r/AskEngineers

Control Systems Engineering, 6th Ed, Nise

Modern Control Systems, 12th Ed, Dorf & Bishop

Automatic Control Systems, 9th Ed, Golnaraghi & Kuo

Control Systems Design: An Introduction To State-Space Methods

Control Handbook, 2nd Ed

Those are some that I have. The Nise book is excellent; the Dorf book is as well, and it was my primary text for Controls I & II, supplemented by the Kuo book, which has more on digital controls. All three focus primarily on classical control theory and methods, but the Nise book goes into more depth on modern methods. I got the state-space methods book because it's more focused. The Control Handbook is a beastly collection, but it's very broad, hence not possessed of much depth. It's more of a reference than a text.

If you want to dive deeply into PID control, look no further than Åström and Hägglund's works on the subject; it doesn't get much better.

Source: I'm a degreed EE that specialized in control systems and a licensed control systems PE.

u/carbacca · 1 pointr/engineering

This is just about the only book that was prescribed in my mechatronics programme:

http://www.amazon.com/Modern-Control-Systems-12th-Edition/dp/0136024580

Can't say I actually used it apart from some last-minute cramming and highlighter abuse just before the exam - most of what I know and do just came from real work experience.

u/weegee101 · 1 pointr/Python

I started by going through Gray Hat Python. If you're an experienced programmer and enjoy the low-level stuff once in a while, it is a very fun way to learn Python.

If you're specifically looking to learn how to program the Raspberry Pi and learn Python at the same time (which will keep your interest up), Programming the Raspberry Pi: Getting Started with Python is a great book for beginners of all skill levels. If you are a newbie programmer, buy this book.

u/bytewarrior · 1 pointr/AskEngineers

You are looking at an area called Control System Engineering. If you are familiar with the Laplace transform I strongly recommend reading through this book.

http://www.amazon.ca/Modern-Control-Systems-12th-Edition/dp/0136024580

Even if you do not understand the Laplace transform this book covers the material initially using traditional Differential Equations. You can get a copy online through resourceful means.

u/farmvilleduck · 1 pointr/electronics

For understanding how computers work, there's a good book by the co-founder of Ars Technica.

http://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593271042

u/SgtTechCom · 1 pointr/explainlikeimfive

You could get this book. I have it, and it explains what you're asking.

u/seca · 1 pointr/funny

I've seen this monkey in my dad's house on the bookshelf. Can't say I noticed the title though. He works with Oracle, so you might not be too far off. Can we figure out what it is?

Found it: http://www.amazon.com/Unix-Nutshell-Fourth-Arnold-Robbins/dp/0596100299/ref=ntt_at_ep_dpt_3

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my book shelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/PinkyThePig · 1 pointr/raspberry_pi

I liked this one. http://www.amazon.com/Programming-Raspberry-Pi-Getting-Started/dp/0071807837/ref=sr_1_1?ie=UTF8&qid=1371223914&sr=8-1&keywords=python+raspberry+pi

It has an intro section that describes how to setup your raspberry pi and it ramps up pretty quickly with the final chapters covering programming the GPIO pins.

u/nireon · 1 pointr/CompTIA

Is this a good book? I already have it, but I am not sure if/how it relates to getting the A+.

u/brucehoult · 1 pointr/ComputerEngineering

Welcome!

You need two books:

https://www.amazon.com/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757

Get the original MIPS or later ARM version if you prefer -- they're absolutely fine, and the principles you learn on one apply to everything -- but the RISC-V one is the newest and is the only one that you're actually legally allowed to make an implementation of at home and distribute, put on github etc.

But of course designing and making your own 16 bit ISA is huge fun, so I definitely recommend that too!

Once you've digested all that, their other book is more advanced. But the first one will get you a long way. This next one is the absolute bible of real computer architects and hardware designers.

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

That's by these guys, who originally invented the RISC-I and MIPS processors in the early 80s, invented the term "RISC" (and also RAID, btw). They recently received the Turing award for their lifetime efforts:

https://www.youtube.com/watch?v=3LVeEjsn8Ts

Join comp.arch on usenet / google groups. There are lots of actual working or retired computer architects there, and they're helpful to energetic students and amateurs designing their own toy stuff.

u/throwdemawaaay · 1 pointr/AskComputerScience

https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

After that:

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

These authors are the foremost authorities in the field. The second book is *the* textbook for computer architecture. These are the people that invented RISC.

u/xiongchiamiov · 1 pointr/cscareerquestions

Fairly frequently while I was in college - I built up about a shelf's worth then.

I find that I go to books far less frequently now. Part of it is that I'm much more invested in the current technologies I'm using. Another part is that many books are of the "learn X language!" type, and I know enough general programming that those aren't as useful.

The things I still buy are things like The Mythical Man-Month, Scalable Internet Architectures, and Design for Hackers - that is, less reference and more for reading to get ideas from.

u/CodeSquad · 1 pointr/raspberry_pi

Whatever you do, do not buy this book, or this one. Just a waste of money.

u/arghcisco · 1 pointr/homelab

So I thought about writing such a post, and I started an outline when I realized it was going to turn into a textbook which would rapidly become obsolete. And then I remembered that someone already wrote it, and I read the thing cover to cover when I was a kid:

http://www.amazon.com/Upgrading-Repairing-PCs-21st-Edition/dp/0789750007/ref=pd_sim_sbs_b_1?ie=UTF8&refRID=1E2A3Z3N0Q2WV4MBXA4K

Go to your library or bookstore or whatever and get a copy before you do anything else. Seriously.

On the other hand there's a need for the homelab community to encourage fresh blood so they can get real jobs and contribute something. I was really lucky and had good mentors when I was growing up. I have trouble imagining myself in my current position had they not gently applied the dumbass correction stick now and then.

If you want you can message me privately and I can answer your questions. I'll filter the good ones up to /r/homelab in a writeup.

The bottom line is that putting together a homelab necessarily means dealing with an extremely fast-paced industry. A huge part of creating and maintaining a homelab is learning how to learn at the speed of the industry. I think most of us would agree that speccing a build without at least a few hours of refreshing current knowledge is nuts. If you're not constantly testing new stuff in your lab then why bother having it?

That being said, I can answer a few of your basic questions:

> explaining the basics of different kinds of servers

They're all basically the same: it's just a computer which is more reliable than a typical PC. This usually means it has ECC memory, redundant disks (RAID), and some kind of out of band management so you don't have to touch it to fix it. Generally you want to get different cheap ones from your local flea market or ebay so you can learn about the nuances between different vendors. Don't blow all your cash on nice stuff, you'll limit your learning opportunities.

Intel wants you to think that Xeon == server but I've got a Celeron J1900 drawing about 20 watts and it's one of the nicest little server boards I've ever owned. Bottom line: go try it for yourself. If it runs your server software then it's a server, period.

> What are RAID controllers

A hardware RAID controller gangs together a bunch of disks so they look like one big one. This lets the operating system save time by issuing a single input or output request that the controller spreads across all the disks at once, and the controller also writes redundancy data so that if one disk fails the data is still available. Some RAID controllers are "software RAID" because they cheat and only help with the math for writing the fault-tolerance data; these are cheaper but slower because the OS has to issue one I/O request per disk. You really should just get a crappy one and some crappy disks and mess around with them to help the concepts stick.
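To make the fault-tolerance math concrete, here's a tiny Python sketch (my own toy illustration with made-up data, not anything from a vendor manual) of RAID-5-style XOR parity - lose any one data disk and XOR-ing the survivors with the parity gives it back:

```python
# Toy RAID-5-style parity: three "disks" of made-up bytes plus one parity stripe.
data_disks = [b"\x01\x02\x03\x04", b"\x10\x20\x30\x40", b"\xaa\xbb\xcc\xdd"]

# Parity is the byte-wise XOR of every data disk.
parity = bytes(b0 ^ b1 ^ b2 for b0, b1, b2 in zip(*data_disks))

# Pretend disk 1 died: XOR the surviving disks with the parity to rebuild it.
rebuilt = bytes(a ^ c ^ p for a, c, p in zip(data_disks[0], data_disks[2], parity))
assert rebuilt == data_disks[1]  # the "lost" data comes back intact
```

Real controllers do essentially this per stripe, just in dedicated hardware.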

> a list of kind of software people should have to make sure things are secure and will work.

This is going to get expensive real fast. My usual recommendation for broke people is spiceworks, Windows Defender and some kind of netflow analyzer like ntop.

As far as making sure things will work, it hasn't been invented yet. The entire point of the lab is to try things and see if they work together.

> hyper visor, server OS, firewall

I recommend Ubuntu or CentOS and kvm as the hypervisor. (I personally cut my teeth on FreeBSD and still recommend it but the learning curve is really steep.) If you can't deal with Linux then get Hyper-V Server 2012 R2:

http://www.microsoft.com/en-us/evalcenter/evaluate-hyper-v-server-2012-r2

You'll need hvremote to configure it, unless you feel like learning about DCOM guts first:

http://code.msdn.microsoft.com/windowsapps/hyper-v-remote-management-26d127c6

As far as the server OS goes it depends on what you want to run on top of it. Evaluations are free, go try them!

> the amount of RAM you should use based on what your doing.

If apps are slow you probably need more. Really, it's that simple. See, memory is like money: it's a precious resource and it's a waste if you're not at least getting interest on it. Memory is faster than disk, so the OS keeps copies of the stuff on the disk in what would be your free memory. If the cached memory (see the performance tab in task manager on Windows) is less than say 5% of your total amount of memory then you don't have enough.
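If you want to actually watch that cache number, here's a rough sketch using the third-party psutil package (an assumption on my part - the `cached` field only exists on Linux, so the sketch falls back to zero elsewhere):

```python
# Rough sketch: how much of RAM is currently acting as file cache.
import psutil

mem = psutil.virtual_memory()
cached = getattr(mem, "cached", 0)  # "cached" is a Linux-only field; 0 if absent
print(f"total:  {mem.total // 2**20} MiB")
print(f"cached: {cached // 2**20} MiB ({100 * cached / mem.total:.1f}% of total)")
```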

> what components and software make up a good server.

ECC memory, hardware RAID, and out of band management at a minimum. But don't focus on getting a "good" server. Focus on getting lots of crappy ones and building redundancy. It's cheaper and you get a wider variety of experience. The minor quality differences between the products will become apparent very quickly. Pontificating about which product is "better" is a huge waste of time. Get stuff and try it.

> someone could explain how you can start having more then one box hooked up and getting switches and ether-nets involved

I usually take the house address, divide by 256 and take the remainder (modulo operation). Example: 31337 divided by 256 has remainder 105. Let's call this X. Use these settings:

Network: 192.168.X.0
Subnet mask: 255.255.255.0
Gateway (router): 192.168.X.1
DHCP range: 192.168.X.100-200
Switches (if managed): 192.168.X.250 + switch number

Just plug everything into switches and connect the switches together. Don't plug any switch into itself and don't plug any switches together with more than one cable. If you want to know more then you'll have to read a book.
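Here's the house-number arithmetic from above as a quick Python sketch (the 31337 address is just the example number, and the switch count is made up):

```python
# Turn a street address into a home-lab subnet, per the modulo trick above.
house_number = 31337
x = house_number % 256          # 31337 % 256 == 105

print(f"Network:     192.168.{x}.0")
print(f"Subnet mask: 255.255.255.0")
print(f"Gateway:     192.168.{x}.1")
print(f"DHCP range:  192.168.{x}.100 - 192.168.{x}.200")
for switch in range(1, 4):      # managed switches, numbered 1..3
    print(f"Switch {switch}:    192.168.{x}.{250 + switch}")
```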

u/jalagl · 1 pointr/explainlikeimfive

I recommend you grab this book. I used it in university, and it gives a pretty good explanation of how computers work.

That being said, you would need input from materials physicists, electronics engineers, chemists, and a bunch of other professionals to really understand how a computer works. It is a complex machine, and building one combines knowledge from many, many disciplines.

u/a_raconteur · 1 pointr/iOSProgramming

I've only begun learning iOS and Objective-C, with very little previous coding experience (some work with Visual Basic in high school...Har har). I'm using The Big Nerd Ranch Guide to iPhone Programming and Programming in Objective-C 2.0. Both come pretty highly recommended, and are even suggested for beginners, though both seem geared towards those with some previous coding experience. Either way I haven't had too much trouble yet, so I imagine someone with expertise in another language shouldn't have issues with these books.

u/alexpud · 1 pointr/csharp

There is this book, which basically talks about everything in C# 7 - a good and detailed book. https://www.amazon.com/Pro-NET-Core-Andrew-Troelsen/dp/1484230175

u/MuckYu · 1 pointr/learnpython
u/brintoul · 1 pointr/pics

I think this is a good book.

u/tkphd · 1 pointr/HPC

What are you trying to do with it? Programming Massively Parallel Processors was useful to me, but without more info, it's hard to make recommendations.

u/biglambda · 1 pointr/gpgpu

I started with this book. I think it's mainly Cuda focused but switching to OpenCL was not that hard.

u/mrchowmein · 1 pointr/csMajors
u/arthurno1 · 1 pointr/linuxquestions

When you say you want to make a simplistic OS, do you mean you want to put together a simplistic Linux distro, or do you want to code an OS from scratch?

In former case DSL might be your friend (Damn Small Linux):
http://www.damnsmalllinux.org/. There are other similar distros that might fit under 25 megabyte, google is your friend. As already mentioned by somebody else linuxfromscratch.org is another option. If you go with LFS, you want to look at minimal libraries instead of standard GNU libs for C library and standard system applications. For example you would like to get https://www.uclibc.org/ for c library (or some other similar, there are few) and say busybox https://www.busybox.net/ for your system apps. There other "micro" versions of some popular software (X server etc) which you might wish to consider if you are going completely custom route.

If I were you, I wouldn't do it, since many others have had the same thoughts and have already put hours of effort into making it, so why repeat all that work when you can just get a distro like DSL, install it, and simply customize/change what you dislike? If you want it as an educational experience then certainly go for it; LFS might be very rewarding in that case.

If you want to code your own kernel and OS, then you might wish to take a CS class on operating systems; Tanenbaum is your eternal friend:
https://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/013359162X/ref=sr_1_1?ie=UTF8&qid=1498831929&sr=8-1&keywords=andrew+tanenbaum

https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/ref=sr_1_4?ie=UTF8&qid=1498831929&sr=8-4&keywords=andrew+tanenbaum

And don't forget Google ...

u/Hantaile12 · 1 pointr/IWantToLearn

Assuming you’re a beginner, and are starting with little to no knowledge:

I bought the 3rd edition of the book called "Practical Electronics for Inventors" by Scherz and Monk. It starts from the basics, and you slowly build more and more complex and practical circuits.
https://www.amazon.com/dp/1259587541/ref=cm_sw_r_cp_api_eTs2BbXN9S1DN

Another fun one by Monk is "The Maker's Guide to the Zombie Apocalypse: Defend Your Base with Simple Circuits, Arduino, and Raspberry Pi"
https://www.amazon.com/dp/1593276672/ref=cm_sw_r_cp_api_XVs2BbYMVJT5N

If you are looking for something more theory-based (I wouldn't recommend that initially unless you're just curious), there's a whole slew of textbooks, depending on what exactly you're interested in, that you can pick up cheap at a used book store or on Amazon.

Remember to build slowly in the beginning until you get a good grasp of the content, and have fun. Diving in too deep too quickly can overwhelm and kill morale.

Happy learning!

u/genjipress · 1 pointr/Python

Further notes, here's some of the books I've been looking at:

Modern Compiler Implementation (there are Java, C, and ML editions; this is the C one)

https://www.amazon.com/Modern-Compiler-Implementation-Andrew-Appel/dp/052158390X

Design Concepts in Programming Languages

https://www.amazon.com/Design-Concepts-Programming-Languages-Press/dp/0262201755

Engineering A Compiler

https://www.amazon.com/Engineering-Compiler-Keith-Cooper/dp/012088478X

Programming Language Pragmatics

https://www.amazon.com/Programming-Language-Pragmatics-Michael-Scott/dp/0124104096

u/exp11235 · 1 pointr/buildapc

The formal name for this field is "computer architecture." The most popular textbook by far is Patterson and Hennessy, and it's pretty easy to find materials from college courses posted online, e.g. MIT OpenCourseWare or UC Berkeley.

For something less likely to put you to sleep, Ben Eater has a great Youtube channel where he explains computer architecture from a practical angle. He's got a great series where he builds a simple 8-bit computer from scratch, explaining all the pieces along the way.

u/Sogeking89 · 1 pointr/AskEngineers

Hmmm, well, there are a lot of books that could be recommended depending on how you want your guitar tuner to work and what sort of methods you will be using to model your system as well as control it. Do you want books on signal processing as well? Do you want discrete control? State space? Or just a book that will cover most bases? Either way, I have put down a couple of basic texts that could help.

http://www.amazon.co.uk/Modern-Control-Systems-Richard-Dorf/dp/0136024580

http://www.amazon.co.uk/Modern-Control-Engineering-International-Version/dp/0137133375/ref=sr_1_sc_1?s=books&ie=UTF8&qid=1382300545&sr=1-1-spell&keywords=control+ogatta

u/NLJeroen · 1 pointr/embedded

Fellow Embedded Engineer here.
You learn this from books: The Definitive Guide to ARM® Cortex®-M3 and Cortex®-M4 Processors.
And just RTFM of course: Cortex M4 technical reference manual.

And of course the chip vendor's documentation, since there will be some implementation-defined stuff (e.g. which memory bank it will boot from).

Don't forget the compiler and linker documentation. Lots of stuff is there; just scrolling through once gives you so much more understanding of what you might find there when you're solving some problem later on - like the special instructions, and the compiler flags and conditions for it to actually use the FPU.

If you're going to try this bare metal assembly programming, I'd recommend the Cortex M0, since an M4 often comes in a complex chip.

u/nullcone · 1 pointr/QuantumComputing

...also I recommend to you the textbook by Kaye, Laflamme, and Mosca. Alternatively, if you're feeling daring you can pick up the QC bible. And if you're feeling too cheap to buy books, you can find lecture notes from a lot of QC courses posted online. Check out John Preskill's website, or maybe MIT open courseware.

u/mrdrozdov · 1 pointr/cscareerquestions

I like reviewing the curriculum for the fundamental courses in undergrad/grad CS programs. One way to get ahead is to become familiar with the roots of programming language theory. I found the book Programming Language Pragmatics helpful, and it goes well with this course's curriculum although I am sure there are others. Alternatively, try building your own language/compiler using yacc and lex.

u/magpi3 · 1 pointr/technology

This book is amazing. I'm a programmer and system administrator who has been working with computers his entire life, but I learned something after reading this for five minutes.

u/harlows_monkeys · 1 pointr/apple

The same folks who did that first book you recommend have a similar book for iPhone programming: iPhone Programming: The Big Nerd Ranch Guide

u/Azoth_ · 1 pointr/IAmA

I don't know too much about quantum computing, I've just had some basic courses in quantum physics and a side interest in it. I recently started reading An Introduction to Quantum Computing. It's a pretty great book.

My school doesn't offer any undergrad courses (or possibly even grad, I haven't checked) on the subject, but from what I've learned it's basically a mix of linear algebra, cs, ee, and physics.

u/PinPinIre · 1 pointr/learnprogramming

It largely depends on which Computer Science degree you are going to do. There can be some that focus heavily on software and very little on hardware, and some that get a nice balance between the two. If the degree is going to focus on hardware, I would recommend reading up on the underlying logic of a computer and then reading this book (Inside the Machine). ITM isn't a very technical book (I would label it as the computer science equivalent of popular science), but it gives a nice clear overview of what happens in a processor.

When it comes to programming, I would recommend starting with Java and Eclipse. Java gets quite a bit of hate but for a newcomer, I think Java would be easier to grasp than the likes of C/C++. C/C++ are nice languages but a newcomer may find their error messages a little bit obscure and may get confused with the nitty-gritty nuances of the languages.

Though the one thing you should realise is that programming is a skill that isn't confined to one language. If you understand the basic concepts of recursion, arrays, classes, generics/templates, inheritance, etc., you can apply this knowledge to almost any language. Ideally I would recommend two books on programming: Algorithmics and Introduction to Algorithms. Algorithmics is another book I would label as the CS equivalent of popular science, but the early chapters give a nice overview of exactly what algorithms actually are. Introduction to Algorithms is a more technical book that I would recommend to someone once they know how to program and want a deeper understanding of algorithms.

The rest is personal preference, personally I prefer to use a Unix machine with Sublime Text 2 and the command line. Some will try to convince you to use Vim or Emacs but you should just find whichever you are most comfortable with.

u/Nihilist_T21 · 1 pointr/learnprogramming
u/ChrisF79 · 1 pointr/learnprogramming

Programming in Objective C (Amazon link) is pretty well thought of as the bible for Objective C programming (the language iPhone apps are written in). I'm making the assumption you're talking iPhone here. Once you've gone through that book, which actually doesn't take a whole lot of time, you can watch the Stanford University iTunes courses on Objective C and iOS development. They're pretty great. If you still want more hands-on learning, the Big Nerd Ranch guide is awesome. It is screenshotted the whole way through and basically tells you to drag this here, click this, etc. to guide you through the creation of some programs.

u/godlikesme · 1 pointr/learnprogramming

I highly recommend Structured Computer Organization by Andrew Tanenbaum. It is a great book for beginners

u/agaskell · 0 pointsr/AskReddit

There's a book called "How Computers Work" - that'd be a good place to start.