Reddit reviews Computer Systems: A Programmer's Perspective (2nd Edition)

We found 32 Reddit comments about Computer Systems: A Programmer's Perspective (2nd Edition). Here are the top ones, ranked by their Reddit score.


32 Reddit comments about Computer Systems: A Programmer's Perspective (2nd Edition):

u/DucBlangis · 20 pointsr/netsecstudents

Here is a "curriculum" of sorts I would suggest, as it's fairly close to how I learned:

  1. Programming. Definitely learn C first, as all of the Exploitation and Assembly courses below assume you know C. The bible is pretty much Dennis Ritchie and Kernighan's "The C Programming Language", and here is the .pdf (this book is from 1988, I don't think anyone would mind). I actually prefer Kochan's book "Programming in C", which is very beginner friendly and was written in 2004 rather than 1988, making the language a little more "up to date" and accessible. There are plenty of "C Programming" tutorials on YouTube that you can use in conjunction with either of the aforementioned books as well. After learning C you can try out some other languages. I personally suggest Python as it is very beginner friendly and well documented. Ruby isn't a bad choice either.

  2. Architecture and Computer basics:
    Generally you'll probably want to look into IA-32, and the best starting point is the Intel Architecture manual itself; the .pdf can be found here (pdf link).
    Because of the depth of that .pdf I would suggest using it mainly as a reference guide while studying "Computer Systems: A Programmer's Perspective" and "Secrets of Reverse Engineering".

  3. Operating Systems: Choose which you want to dig into: Linux or Windows, and put the effort into one of them; you can come back to the other later. I would probably suggest Linux unless you are planning on specializing in Malware Analysis, in which case I would suggest Windows. Linux: No Starch's "How Linux Works" is a great beginner resource, as is their "Linux Command Line" book. I would also check out "Understanding the Linux Kernel" (that's a .pdf link). For Windows you can follow the Windows Programming wiki here or you can buy the book "Windows System Programming". The Windows Internals books are generally highly regarded; I didn't learn from them, I use them more as a reference, so I can't really speak to how well they would teach a "beginner".

  4. Assembly: You can't do much better than OpenSecurityTraining's "Introductory Intel x86: Architecture, Assembly, Applications, & Alliteration" class lectures from Xeno Kovah, found here. The book "Secrets of Reverse Engineering" has a very beginner friendly introduction to Assembly as does "Hacking: The Art of Exploitation".

  5. Exploitation: OpenSecurityTraining also has a great video series for Introduction to Exploits. "Hacking: The Art of Exploitation" is a really, really good book that is completely self-contained and will walk you through the basics of assembly. The author does introduce you to C and some basic principles of Linux but I would definitely suggest learning the basics of C and Linux command line first as his teaching style is pretty "hard and fast".

  6. Specialized fields such as Cryptology and Malware Analysis.


    Of course if you just want to do "pentesting/vuln assessment" in which you rely more on toolsets (for example, Nmap>Nessus>Metasploit) structured around a methodology/framework, then you may want to look into one of the PACKT books on Kali or Backtrack, get familiar with the tools you will use such as Nmap and Wireshark, and learn basic Networking (a simple CompTIA Network+ book will be a good enough start). I personally did not go this route nor would I recommend it, as it generally shies away from the foundations and seems to me to be settling for becoming comfortable with tools that abstract you from the real "meat" of exploitation and all the things that make NetSec great, fun and challenging in the first place. But everyone is different and it's really more of a personal choice. (By the way, I'm not suggesting this is "lame" or anything, it was just not for me.)

    *edited a name out





u/rtz90 · 8 pointsr/embedded

Sounds like you don't know some of your low-level computing fundamentals as well as you should for the jobs you want. I recommend studying up on those, and then developing more familiarity with them by tinkering or doing relevant projects.


If you're looking for a book recommendation, try Computer Systems: A Programmer's Perspective. If you read and understand chapter 2 (it's dry, hang in there), your question #1 will seem trivial to you (and you'll learn much more as well; pretty much all of it is important material). The book overall is a great read for embedded programmers, and anyone doing any form of low-level computing. There is a newer edition but the one I linked is the one I read.


u/srnull · 7 pointsr/cscareerquestions

I'm always sad to see this book never get mentioned in these sorts of discussions: Computer Systems: A Programmer's Perspective.

This book is awesome, and should be required reading for any serious programmer. It covers so much, and does it clearly.

You cover the complete system of computing as seen by a programmer, from the lowest-level representation of numbers, through more complicated data structures and programming-language constructs, up to entire programs. By this I mean how n-bit integers are represented, floats and doubles, structs, classes, and entire binaries including how they are linked and loaded.

You see programming languages from the lowest level machine code to modern x86/x64 assembly and higher level languages like C and Java.

You learn about processor architecture from simple sequential execution through modern pipelined architectures.

You learn about the memory hierarchy, from I/D caches to L1, L2, main memory, disk, network, etc. You learn about operating system constructs like virtual memory.

You learn system-level I/O, network programming, and concurrent programming including both I/O multiplexing and threads exploiting real parallelism.

By the concluding chapter, you've built a small toy threaded web server. And you understand its execution from loading the binary into memory down through how it's executed on modern architectures.

u/bonekeeper · 5 pointsr/compsci

Computer Systems - A Programmer's Perspective is a great book IMO, from Carnegie Mellon's CS course.

u/SneakingNinjaCat · 4 pointsr/AskComputerScience

I read Computer Systems: A Programmer's Perspective in my first BSc year in Software Engineering.

u/nabnob · 3 pointsr/AskReddit

Are you in high school or college?

C# is very similar to Java - it's object oriented, has garbage collection (meaning you can get away with not learning about memory), and is strongly typed. I wouldn't really say it's that useful to learn if you already know Java, unless you end up working for a software company that does work in C#.

C doesn't have any of those nice features of Java and C# (strong typing, garbage collection), and all variables - pointers, integers, characters - are treated as bits stored somewhere in memory, either on the stack or the heap. Arrays and structs (similar to objects in Java, sort of) are longer blocks of memory. C++ is an object-oriented version of C, and if you already know C and Java you would be able to pick up C++ fairly quickly.

Learning C forces you to learn a lot of memory and system concepts. It's not really used in the software industry as much because, since it's missing all those nice Java and C# features, it can be difficult to write huge, complicated systems that are maintainable. If you want to be a serious developer, you DO need to learn these things before you graduate from college. Most major software companies ask systems/memory type questions in their interviews.

However, if you're in high school, I wouldn't say it's really necessary to try to learn C on your own unless you really want to. A good computer science program in college would require at least one class on C programming. If you are really interested, I would look at this to learn C, and later this for more information on how computers work.

TL;DR: Learn C in college if you want to be a software engineer and, if you're in high school, learn whatever you find interesting.

u/hewhomustbenamed · 3 pointsr/learnprogramming

Umm... if you want to do that, then write a simple program in C to, let's say, sum 10 numbers. Now compile this file and "step" through the program in gdb. As you see each assembly instruction executed, you will have an understanding of what's going on.

However , for some sanity please refer the Intel manual or use this book (there might be other references as well) ... http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040


There's a free beta edition somewhere... and you will need Chapters 2 and 3. Take one full day to read both of them thoroughly and you'll be golden. Let me know how it goes.

u/opensourcedev · 3 pointsr/programming

If you are looking to go a little deeper, I can recommend this book as well:

"Computer Systems: A Programmer's Perspective"

http://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040/ref=sr_1_1?ie=UTF8&qid=1292882697&sr=8-1

This book has a similar thread, but is much more in-depth and covers modern Intel and AMD processors, modern Operating Systems and more.

I learned all about linkers, compiler optimization, the crazy memory management structure of x86...

u/sarpunk · 3 pointsr/learnprogramming
  • I second the other comments about practice & sticking with projects. Perfectionism can be a great thing, but if it keeps you from finishing a project, let it go. The first iterations of your projects don't have to be perfect - just getting through them will help you grow.

  • Procrastinating on homework assignments will also tank your grade (been there, done that), even if the material seems easy - some programming assignments just take loads of time.

  • It sounds like you're still in school, so you'll probably be exposed to lots of different languages and paradigms, and that's a good thing. If you're going to insist on perfection in personal projects, though, it might be easiest to focus on one area, like halfercode suggested.

  • Finally, for reading material: It sounds like you don't need any basic intros, so look for advanced tutorials to new languages you want to learn, or just read the language documentation. This is a pretty good competency matrix to rate yourself against - if something looks unfamiliar, browse through the wiki page. Other great books: Computer Systems: A Programmer's Perspective - doesn't assume a ton of prior knowledge, but gets to a fair amount of depth pretty quickly. There are also really cool systems programming labs. Matt Might's list of everything a CS major should know is really comprehensive, with lots of reading material referenced. If I were you, I would focus specifically on the Data Structures & Algorithms and Theory sections, supplementing with practical projects.

  • As for projects: Start small, no matter the final size of the project. Focus on getting out a minimal example of what you want to do before you worry about what the UI looks like or perfect functioning.

    tl;dr Practice & perseverance are the main points. No one is really any good at programming until they've got a few years of churning out code, so don't get discouraged. Finally: don't let the breadth of the computer science/software world overwhelm you. Focus on small pieces, and in a few years you'll have learned more than you would have expected.
u/Intrexa · 2 pointsr/shittyprogramming

Because unlike what the OP said, IPs that are actively being used to route information are ints in memory. When I ping Facebook.com, the program doesn't get 173.252.120.6 back, it gets 2919004166 back (which is identical to getting 0xADFC7806 back, because it's just a bit pattern; there's no way to differentiate the decimal and the hex, it's just a display thing). It has to convert that number to the 4 octets that make sense to humans.

Same thing in reverse. When I go to http://173.252.120.6, it can't just start sending data out to 173.252.120.6, it has to convert it to 0xADFC7806 first. This is true of any program that does networking. Why is that? It saves space in an area where space is most important. So you see, it's not like the program is doing an extra step to allow this; it's actually doing one less step. What's more, I bet Mozilla (I use Firefox, but this is probably true of all browsers) didn't implement a specific "convert IP to int" function; this is default functionality of networking libraries, libraries that would routinely be handling both octet notation and ints, because most programs dealing with this would already be dealing with low-level ints. So not only are they doing one less step, Mozilla would have to go out of their way to specifically disallow this.

And again, for completeness' sake, an IP address that is traversing a network is stored as a big-endian int, meaning the bit order you see here (1010 1101 1111 1100 0111 1000 0000 0110) isn't actually the bit order a network switch will receive when it needs to route the packet. Also, it's technically not an int, it's a structure that only contains a single int.

/* Internet address. */
struct in_addr {
    uint32_t s_addr;  /* address in network byte order */
};

If you want to learn about low-level hardware and what is actually happening behind the scenes, I strongly recommend Computer Systems: A Programmer's Perspective. It's a hard book, no doubt, but it will show you everything that happens that you don't see when you compile a program and then run it, including memory management, cache fetching, how hard drives store data, how processor pipelining works, all to the level of detail in my posts.

u/ultimatt42 · 2 pointsr/linux

You're looking for a clear dividing line, and there isn't one. The term "emulator" is more descriptive of the problem you're trying to solve (I have a program for X but I only have Y, how can I get it to run on Y?) than any particular implementation. It's all in the name, "emulate" means to copy or imitate. If that's the goal of the software, or even just how you're using software that was designed for another purpose, it could be considered an emulator.

> So, [...] it has to be called "emulating"?

No, but if it fits the definition you shouldn't complain if someone says it is.

> I really think that "computer system" refers to hardware, not software.

Maybe you'll trust the textbook I was taught from.

> A computer system consists of hardware and systems software that work together to run application programs.

u/ac1d8urn · 2 pointsr/asm

Maybe you're not looking for this sort of thing and it's a bit more advanced (it expects you to know C or Java, though any programming experience will help), but it's a goldmine of information and covers a broad range of topics.

http://www.amazon.com/dp/0136108040/

From a quick Googling you can find a pdf of the most recent version of it on this guy's Github:
https://github.com/largetalk/datum/blob/master/others/Computer%20Systems%20-%20A%20Programmer's%20Perspective%20(2nd%20Edition).pdf

You can view the raw file to download the PDF.

u/bluebathysphere · 2 pointsr/compsci

The two starting books that gave me a great deal of understanding on systems (which I think is one of the toughest things to grasp; CLRS and The Art of Computer Programming have already been mentioned):

[Computer Systems: A Programmer's Perspective] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=sr_1_2?ie=UTF8&qid=1407529949&sr=8-2&keywords=systems+computer)

This along with its labs served as a crash course in how the system works, particularly a lot about assembly and low-level networking.

The Elements of Computing Systems: Building a Modern Computer from First Principles

I've mostly only done the low-level stuff but it is the most fun way I have found to learn starting all the way at gate architecture. It pairs well if you have read Petzold's Code. A great introduction to the way computers work from the ground up.

u/rjt_gakusei · 2 pointsr/programming

This book has a pretty strong breakdown of how computers and processors work, and goes into more advanced things that modern day hacks are based off of, like address translation and virtualization with the recent Intel bugs:
https://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040
The book can be found online for free. The author's website has practice challenges that you can download, one of them being reverse engineering a "binary bomb". I did a challenge similar to it, and it felt pretty awesome when I was able to get around safeguards by working with the binaries and causing buffer overflows.

u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex & yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/zhay · 1 pointr/compsci
u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works, from electricity, to EE principles , through assembly to compilers may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/ironcrown9 · 1 pointr/learnprogramming

There's a course at my school that covers exactly that (216 at UMD).
The book that's recommended is Computer Systems: A Programmer's Perspective; it's exactly what you're looking for. Code is only used for examples, C and more often assembly. Mostly details on CPU instructions, hardware implementation and the creation of Unix.

u/Watabou90 · 1 pointr/learnprogramming

If you want x86 assembly, this book is very good: http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=dp_ob_title_bk/180-6741587-3105245

I'm taking an assembly class this semester that involves writing assembly from scratch, and this book (which is required for the class) is a lifesaver because the professor isn't that great at summarizing the important points.

I think it's a good book. It starts easy and it has a lot of exercises that have answers on the back of the chapter so you can check your answers pretty easily.

u/Sean-Der · 1 pointr/uCinci

Just for my own clarification, is that just applied programming? I am a freshman also, and I have to make a decision about this also.

On one side it really is not that hard to learn how to program. Anyone can make it through LPTHW or hell even K&R... but being able to grapple with SICP is a whole other story.

I really enjoy the whole spectrum, but what I am really looking for is the traditional theoretical courses. These sorts of lessons are what really make me a better programmer, I have found. I was a crappy PHP dev until I learned C, then I was a crappy C dev until I picked up Computer Systems: A Programmer's Perspective

The one thing I want to avoid is to sit through garbage I am never gonna use, like C#. First off, it's non-standard (no ISO or ECMA for the later versions). Secondly, non-free software doesn't teach you anything; it merely makes you memorize what buttons and knobs to press.

So any upper classmen want to give advice to clueless youngins :D

u/huck_cussler · 1 pointr/learnprogramming

Computer Systems: A Programmer's Perspective is the one we use at my school, and it is pretty awesome. It's engaging and entertaining inasmuch as a book on systems programming can be. There are tons of exercises and there is a website where you can work on lab assignments that the authors created.

u/__mp · 1 pointr/sysadmin

For this case I usually recommend Computer Systems: A Programmer's Perspective. However it's a layer deeper than the OP is looking for:

> This book focuses on the key concepts of basic network programming, program structure and execution, running programs on a system, and interaction and communication between programs. (Amazon)

However, I think that this should give a better understanding of system and networking internals and will allow her to pick up any other topic. Additionally, I think that it's one of those books that contain timeless knowledge, which stands in contrast to a lot of other IT books out there.
My physical copy stands directly next to Physically-Based Rendering.

u/PicklesInParadise · 1 pointr/learnprogramming

I haven't read it in years, but I remember The C Programming Language being very useful.

If you want to learn more about the low level details of how computers work in general, I own the following books and recommend them:

---

u/wgunther · 1 pointr/learnprogramming

The question you want to ask is how dynamic memory allocation is implemented. You can probably find something online that walks through an implementation of malloc. Essentially, it has to take a big block of memory, and when a request is received it has to find space for it in this big block. A naive implementation might have a header for each block of memory, and that header points to the next block of memory and the current state of the block (free or allocated). When a memory allocation request is received, it finds a block of free memory large enough by "jumping" from header to header, alters that header to say allocated, and then adds a header at the end of the allocated bit of memory which says how much is left over. When memory is freed, the header is changed to reflect its status and merged with surrounding free blocks.

Clearly, I'm brushing over a lot of details. And a challenge is finding strategies for avoiding fragmenting memory. You should read the chapter on memory allocation from Computer Systems.

u/modernzen · 1 pointr/cuboulder

We used this one when I took the course two years ago. I don't think it's changed, but you might want to double check that.

u/zifyoip · 1 pointr/learnprogramming

This kind of stuff is commonly taught in university CS courses called something like "Computer Architecture." The book that was used in my computer architecture course was Computer Systems: A Programmer's Perspective, by Bryant and O'Hallaron (book home page). This book uses C and IA32 assembly (or rather something the authors call "Y86," which is a simplified version of IA32).

I cannot support copyright violations, so I will not say anything that might lead you to believe you might be able to find a PDF of this book on the Web if you Google for the title.

u/Merad · 1 pointr/askscience

> I wanted to write a program to emulate a CPU so I could fully understand how it's operation actually worked, but I have no idea of what the "start up" process is, or how we determine when to get a new instruction

The CPU begins loading instructions at a fixed address known as its reset vector. On AVR microcontrollers the reset vector is always at address 0, and it is always a jump instruction with the address where startup code actually begins. On x86 processors the reset vector is at address 0xFFFFFFF0. For a CPU emulator, which presumably doesn't need to deal with interrupt vectors or anything like that, I would just start loading instructions from address 0.

Also, you should should look at some of the simplified CPU designs that have been made for teaching. In my classes we used the LC-3 (a very simple and straightforward design) then moved to the y86 (a simplified x86 design mainly used to teach pipelining). It will be much more realistic to make an emulator for one of them rather than an extremely complex real-world design. I've linked below to textbooks that are based around each of those designs.

http://highered.mheducation.com/sites/0072467509/index.html

http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040

u/c3261d3b8d1565dda639 · 1 pointr/programming

> I know basically nothing about x86 internals to make an accurate statement

If you're interested in learning about the internals, check out some Real World Technologies articles. For instance, Intel's Haswell CPU Microarchitecture. On page 3, Haswell Out-of-Order Scheduling, it talks about the register renaming that goes on to support out-of-order execution.

It's more detail than most people really need to know, but it's interesting to understand what modern microprocessors are doing under the hood during program execution.

For anyone else reading, an even easier introduction to the topic is in the awesome Computer Systems: A Programmer's Perspective. It'll get you comfortable with the machine-language representations of programs first, then move on to basic architecture for sequential execution, and finally pipelined architecture. It's a solid base to move forward from to modern architecture articles like those on Real World Technologies. There are more detailed treatments if you're really interested, e.g. Computer Architecture: A Quantitative Approach, but I have never read it so can't say much about it.

u/[deleted] · 0 pointsr/learnprogramming

Hey dude, definitely do Computer Systems: A Programmer's Perspective.. Also, i saw you mention that google didn't help, but it definitely would have. Look up Stack Overflow's most recommended books and you'd find some awesome C stuff. Good luck!

u/slaystation25 · 0 pointsr/learnprogramming

Sorry, I just didn't put that right. I meant how data is stored in memory, and how it can be manipulated using C. I'm using this book for the course, if that helps clear what I'm trying to say.