Best mainframes & minicomputers books according to redditors

We found 117 Reddit comments discussing the best mainframes & minicomputers books. We ranked the 28 resulting products by number of redditors who mentioned them. Here are the top 20.

Top Reddit comments about Mainframes & Minicomputers:

u/sunny20d · 55 pointsr/ProgrammerHumor

I love that book. I just bought the original for school.

Edit: For the curious: http://www.amazon.com/Nutshell-Fourth-Edition-Arnold-Robbins/dp/0596100299

u/EngSciGuy · 51 pointsr/science

A decent book that isn't too complicated (some matrix algebra is really the only background necessary, some probability wouldn't hurt);

http://www.amazon.ca/Introduction-Quantum-Computing-Phillip-Kaye/dp/019857049X

u/Horizivertigraph · 16 pointsr/QuantumComputing

Don't get discouraged, it's possible to get to a reasonable understanding with some sustained effort. However, you need to get the following into your head as quickly as possible:

Popular level explanations of anything quantum are a waste of your time.

Go back and read that again. You will never get close to understanding the field if you rely on someone else managing to "find the right metaphors" for you. Quantum computing is a mathematical field, and if you want to understand a mathematical field, you need to do mathematics. This sounds super scary, but it's actually no problem! Math is not what you think it is, and is actually a lot of fun to learn. You just need to put some work in. This just means maybe doing an hour or so of learning every day before you go to work, or afterwards.

Let's look at a little bit of a roadmap that you can follow to get to a reasonable understanding of quantum computing / quantum information. This is pretty much the path I followed, and now I am just about to submit my PhD thesis on quantum computational complexity. So I guess it worked out OK.

  1. You can get really far in quantum computing with some basic understanding of linear algebra. Go to Khan Academy and watch their fantastic introduction.

    If Sal asks you to do an exercise, do the exercise.

  2. Once you know what a vector is, can kind of grasp what a vector space is, and have some good intuition for how matrix-vector and matrix-matrix multiplication work, you can probably make a reasonable start on this great intro book: https://www.amazon.co.uk/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965

    Start from the start, take it slowly, and do all of the exercises. Not some of the exercises, do all of the exercises. If you don't know a term, then look it up on wikipedia. If you can't do an exercise, look up similar ideas on Google and see if you can muddle your way through. You need to get good at not being scared of mathematics, and just pushing through and getting to an answer. If there is an explanation that you don't understand, look up that concept and see if you can find somebody else's explanation that does it better. Do the first few intro chapters, then dip in to some of the other chapters to see how far you get. You want to get a pretty good coverage of the topics in the book, so you know that the topics exist and can increase your exposure to the math involved.

  3. If you manage to get through a reasonable chunk of the book from point 2), then you can make a start on the bible: Quantum Computation and Quantum Information by Nielsen and Chuang (https://www.amazon.co.uk/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=pd_lpo_sbs_14_img_1?_encoding=UTF8&psc=1&refRID=S2F1RQKXKN2268JJF3M2). Start from the start, take it slowly, and do all of the exercises.

    Nielsen and Chuang is not easy, but it's doable if you utilise some of the techniques I mention in point 2): Google for alternative explanations of concepts that the book explains in a way that confuses you, do all of the exercises, and try to get good coverage throughout the whole book. Make sure you spend time on the early linear algebra and basic quantum chapters, because if you get good at that stuff then the world is your oyster.

    Edit:

    Just remembered two more excellent resources that really helped me along the way

    A) Quantum mechanics and quantum computation, a video lecture course by Umesh Vazirani (YouTube playlist here) is fantastic. Prof. Vazirani is one of the fathers of the field of quantum computing, with a bunch of great results. His lecture course is very clear, and definitely worth devoting serious attention to. Also, he has a wonderful speaking voice that is very pleasant to listen to...

    B) Another lecture course called "Quantum Computing for the Determined", this time given by Michael Nielsen (YouTube playlist here). In my opinion Nielsen is one of the best scientific communicators alive today (see also his unrelated discourse on neural networks and machine learning, really great stuff), and this series of videos is excellent. Communicating this sort of stuff well to non-practitioners is pretty much Nielsen's whole jam (he quit academia to go on and write about science communication), so it's definitely worth looking at.
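
The linear-algebra groundwork in step 1 is small enough to play with by hand. Here's a minimal plain-Python sketch (no libraries) of the central idea: a qubit state is a vector, a gate is a matrix, and applying a gate is just matrix-vector multiplication:

```python
import math

def matvec(M, v):
    # Multiply a matrix (list of rows) by a column vector.
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]    # the Hadamard gate as a 2x2 matrix
ket0 = [1.0, 0.0]        # the |0> state as a column vector
plus = matvec(H, ket0)   # equal superposition (|0> + |1>)/sqrt(2)
back = matvec(H, plus)   # H is its own inverse, so this returns to |0>
```

Multiplying H into |0> gives the equal superposition, and applying H again gets you back to |0> — exactly the kind of small, concrete computation the Khan Academy exercises prepare you for.
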

u/suhcoR · 14 pointsr/programming

u/PenTestWS · 10 pointsr/HowToHack

Read through Sparc Flow's books and then take his 24 hour challenge. I did this a couple weeks ago and it was amazing:

Hack Like A Porn Star : Amazon Link

Hack Like A God : Amazon Link

And here is the PenTest Walk Through Guide for the 24 Hour PenTest he offers as an added service to the two books above:

Ultimate Hacking Challenge : Amazon Link

The hacking challenge is the really fun part. The book will walk you through the entire thing if you get stuck along the way. It took me about 12 hours to get through the full challenge. It's like an extra $10 with the discount you'll find inside either book, $15 without if I remember correctly.

u/jmct · 9 pointsr/Physics

Your best bet is to read an introductory text first and wrap your head around what quantum computing is.

I suggest this one: Intro Text

I like it because it isn't very long and still gives a good overview.

My former supervisor has a web tutorial: here

Lastly, Michael Nielsen has a set of video lectures: here

The issue is, there is a decent-sized gap between what these introductions and tutorials will give you and the current state of the art (like the articles you read on arXiv). A good way to bridge this gap is to find papers that are published in something like Physical Review Letters (here is their virtual journal on quantum information) and see what they cite. When you don't understand something, either refer to a text or start following the citations.

Basically, if you can start practicing this kind of activity (the following of references) now, you'll already have a good grasp on a large part of what grad school is about.

Best of luck!

u/kev009 · 8 pointsr/programming

I think there is a lot of hidden awesome in systems like NonStop, VMS, and IBM's 'i' (aka AS/400, System i). Very fertile ground for current systems programmers to review.

For instance, OS/400 (or whatever you want to call it) has a notion of machine independence that allowed them to move from CISC to RISC with minimal impact. It also has an object filesystem (think: WinFS) that is in the system address space; therefore, RAM is almost indistinguishable from disk. Migration to tiered storage with SSDs is as simple as adding it to the storage pool (www.ibm.com/systems/resources/ssd_ibmi.pdf). Fortress Rochester, written by the architect of the AS/400, is one of the better systems books I've ever read, and that's high praise coming from me, a UNIX systems zealot.

Unfortunately, there isn't a lot of lesson-learning done beforehand because these classic systems aren't HTTP based and therefore assumed to not be webscale by the masses.

u/scasagrande · 7 pointsr/ECE

As someone who just finished their MSc at the Institute for Quantum Computing there is a lot of interest for those with EE experience in quantum computing. The IQC has people from many disciplines, including EE.

In our group I was one of the few people who had any EE experience. I did a lot of circuit design and microwave engineering. You'd be surprised how poor the average physics grad student is at the basics of using T&M equipment. If you're more into the chip fabrication side of EE, there are also groups for that in QC.

You will want to take as many quantum mechanics electives as you can. If your school does not offer a QC specific course I suggest you read this book: http://www.amazon.com/Introduction-Quantum-Computing-Phillip-Kaye/dp/019857049X

If you have any specific questions feel free to ask me either here or in a PM.

u/dolphinrisky · 5 pointsr/Physics

Ah gotcha, yeah to be honest this approach probably won't be terribly illuminating. The problem is that the D-Wave really doesn't work in any kind of classically equivalent way. When you think about algorithms classically, the procedure is highly linear. First you do this, then that, and finally the other. The D-Wave One involves nothing of the sort.

Here's a quick rundown of what a quantum annealing machine actually does, with analogies to (hopefully) clarify a few things. In fact, an analogy is where I'll start. Suppose you had a problem you were working on, and in the course of trying to find the solution you noticed that the equation you need to solve looks just like the equation describing how a spring moves with a mass hanging from it. Now you could continue your work, ignoring this coincidence, and solve the equation on your own. Alternatively, you could go to the storage closet, grab a spring and a mass, and let the physics do the work for you. By observing the motion of the spring, you have found the solution to your original problem (because the equations were the same to begin with).

This is the same process used by the D-Wave One, but instead of a spring and a mass, the D-Wave system uses the physics of something called an Ising system (or model, or problem, etc.). In an Ising system, you have a series of particles^ with nonzero spin that can interact with each other. You arrange this system so that you can easily solve for the ground state (lowest energy) configuration. Now with the system in this ground state, you very, very slowly vary the parameters of the system so that the ground state changes from the one you could easily solve to one that you can't. Of course this new ground state, if you've done things correctly, will be the solution to the problem you were actually concerned with in the first place, just like the spring-mass example above.

So perhaps now I have explained at least a little bit of why I don't call the D-Wave One a "computer". It doesn't compute things. Rather, by a happy coincidence, it sets up an experiment (i.e. the Ising system) which results in a measurement that gives you the answer to the problem you were trying to solve. Unfortunately for you, the software engineer, this resembles precisely nothing of the usual programming-based approach to solving problems on a classical computer.
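
For the curious, "find the ground state of an Ising system" can be stated in a few lines of classical code. This brute-force sketch (with made-up couplings, nothing to do with D-Wave's actual interface) is exactly the computation the annealer replaces with physics:

```python
from itertools import product

def energy(spins, J, h):
    # Ising energy: E = -sum_ij J_ij * s_i * s_j - sum_i h_i * s_i,
    # with each spin s_i in {-1, +1}.
    e = -sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), Jij in J.items():
        e -= Jij * spins[i] * spins[j]
    return e

def ground_state(n, J, h):
    # Brute force over all 2^n configurations -- the exponential search
    # that the annealer sidesteps by letting the physics relax for it.
    return min(product((-1, 1), repeat=n), key=lambda s: energy(s, J, h))

# Hypothetical couplings: a 3-spin ferromagnetic chain with a small field
# on the first spin to break the up/down symmetry.
J = {(0, 1): 1.0, (1, 2): 1.0}
h = [0.1, 0.0, 0.0]
gs = ground_state(3, J, h)  # all spins align
```

The art of "programming" an annealer is entirely in choosing J and h so that this ground state encodes the answer to the problem you actually care about.
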

My advice is this: if you want to learn some quantum computing, check out An Introduction to Quantum Computing by Kaye, Laflamme, and Mosca, or the classic Quantum Computation and Quantum Information by Nielsen and Chuang.

^ They don't actually have to be single particles (e.g. electrons); they are only required to have spin interactions with each other, as this is the physical mechanism on which computations are based.

Edit: Okay, this was supposed to be a reply to achille below, but apparently I'm not so good with computers.

u/NorseZymurgist · 4 pointsr/IBMi

IBM i developer and consultant here - who also went through a similar program as yours (I was in Mankato).

I would suggest you try to get an internship with IBM, or places (like Help Systems) where they do a lot of IBM i programming. There is very little good alternative to real-world usage, and once you get your foot in the door you should be able to explore many other facets of the platform and decide where you'd like to end up. Like any BS degree, you're learning how to use the basic tools; what makes you an expert (and sets your career path) will be the education you receive once you're out of school. For example, you can learn OOD/OOP (C++/Java) principles in school, but you don't really learn them until you're in the workplace, wrangling with other people's code, making your own mistakes, and deciding where the time vs. perfection vs. functionality balance lies.

In terms of 'real world uses', the IBM i OS was designed to be a 'database support system', i.e. primarily transaction-based, so you'll find relational database technologies to be most helpful - commitment control, journaling, normalization etc. Also learn how that database integrates and communicates with the outside world (JDBC/ODBC etc.). Usually I see Linux-based front-ends accessing the database on the '400. Pay attention to IASPs and work management, RDB entries etc.
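
Commitment control is the transactional side of this. A tiny sketch, using Python's built-in sqlite3 purely as a stand-in for Db2 for i (on the real system you'd reach the database through ODBC/JDBC, but the commit/rollback idea is the same):

```python
import sqlite3

# sqlite3 stands in for Db2 for i here -- this is about the transaction
# pattern, not the IBM i API.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")

with conn:  # one commit cycle: committed when the block succeeds
    conn.execute("INSERT INTO orders (qty) VALUES (5)")

try:
    with conn:  # rolled back automatically when the block raises
        conn.execute("INSERT INTO orders (qty) VALUES (-1)")
        raise ValueError("negative quantity, undo the whole unit of work")
except ValueError:
    pass

rows = conn.execute("SELECT qty FROM orders").fetchall()
# only the committed row survives the rollback
```

Journaling is the log that makes that rollback (and crash recovery) possible; on the IBM i it's a first-class OS feature rather than a database add-on.
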

You'll definitely want to have a good understanding of CL programming, and the IFS file system (which includes QSYS!). Understand how to access both streamfiles and record-based files (i.e. C++ _Ropen vs fopen). SQL too, and how to incorporate it into your code (like SQLExecDirect).

Complexity analysis is important - there are a lot of 'big data' applications on the IBM i where you're dealing with millions of records. My customers are the titans of the commerce and banking industries, and you'll find complex ecosystems where a database can span hundreds of LPARs, or even the big iron (z/OS). Although the IBM i is a 'midrange' computer, the past ten years have it running on some very nice hardware, and it has captured customers who would otherwise be on a small mainframe, so data sets can be huge. You don't need to know bubble sort vs. red-black trees if you're just presenting a list of HTML options, but when you're analyzing 50-terabyte databases that's a different story.

RPG is the traditional IBM i database programming language and it is still used in many places, so some knowledge of that will be helpful. I don't see COBOL very often. But what is unique about the programming languages on the IBM i is how they work together through binding and service programs. Definitely learn how to use that effectively, so that you can write your code in whatever language is best for the task. Think modules, service programs, programs and commands. Regardless of whether you're coding in RPG or Python, you'll want to know how to compile and bind etc.

Get really good at the "green screen". Sure, RDi is nice, but there's no excuse for not being proficient at PDM in 5250. For user interactions we don't use screen files (SDA) much; UIM panels (and the APIs to make them work) are the way to go. Menus are easy to create, and I see them used a lot by admins to make the life of operators easier.

Don't skimp on the security side of things, that will ALWAYS be important.

If you're really motivated to learn the history of the system, read this book:

https://www.amazon.com/Fortress-Rochester-Inside-Story-iSeries/dp/1583040838

Hopefully you can find a library that has it.

OK - back to work - hopefully there is some useful information in here. I guess the tl;dr is if you want to play in the IBM i industry, focus on database technologies and how to create/compile/bind and call program procedures.

u/avilay · 3 pointsr/QuantumComputing

As a beginner myself, what is helping me most is the book Quantum Computing Explained by David McMahon. It has almost no physics but is pretty heavy on the math.

I have also started to look at these online resources and find them promising:

u/RetardedSmackWhore · 3 pointsr/QuantumComputing

I began by just plowing through wikipedia articles and trying to get the basic concepts in broad strokes. I would suggest that you do that.

I found this book, which is totally awesome: Quantum Computing for Computer Scientists

Quantum computing for the determined, already mentioned here, is also awesome.

There is also IBM's Quantum Experience web site, which contains a front-end for programming a real 5-qubit quantum computer. It is free of charge; you just need to register with an email address in order to access it. Soon (as in a few days) there will also be a 16-qubit computer to play with at that site. Playing with the real thing is very good for study morale, at least for me.

u/MatmaRex · 3 pointsr/pics

O'Reilly books look like that. This one was, apparently, Unix in a Nutshell.

u/amazon-converter-bot · 3 pointsr/FreeEBOOKS

Here are all the local Amazon links I could find:


amazon.co.uk

amazon.ca

amazon.com.au

amazon.in

amazon.com.mx

amazon.de

amazon.it

amazon.es

amazon.com.br

amazon.nl

amazon.co.jp

Beep bloop. I'm a bot to convert Amazon ebook links to local Amazon sites.
I currently look here: amazon.com, amazon.co.uk, amazon.ca, amazon.com.au, amazon.in, amazon.com.mx, amazon.de, amazon.it, amazon.es, amazon.com.br, amazon.nl, amazon.co.jp, if you would like your local version of Amazon adding please contact my creator.

u/LiquidLogic · 3 pointsr/arduino

For learning to program the Arduino, I HIGHLY recommend Simon Monk's Programming Arduino: Getting Started with Sketches.


For the electronics side, his Electronics Cookbook: Arduino and Raspberry Pi is also great.


Jeremy Blum's Exploring Arduino is also very nice.

u/pitch_away · 3 pointsr/AskElectronics

Here are 2 awesome guides: 1 & 2. But as indicated in this thread, you probably should get a well-known microcontroller and use it to build a knowledge base. The Arduino is an Italian microcontroller board based on an Atmel chipset. It has a massive online following, and support for it can be found in /r/Arduino or at their website. It has numerous shields that can be added on for extra features, such as GPRS (SMS and mobile connectivity), Ethernet (networking) and motor control. You can buy components and such from: https://www.sparkfun.com/ ; https://www.adafruit.com/ ; http://www.mouser.com/ . The Arduino favors hardware prototyping and tinkering. It is programmed using its own free, open-source IDE (integrated development environment) in a simplified dialect of C/C++. It is quite easy to use and well supported.

Also, you might consider the Raspberry Pi, which is explained in this TED Talk by Eben Upton, one of the creators of the board. It uses an ARM processor. The "A" board is slightly smaller and boots Linux from an SD card, as does the "B", which has more memory and built-in Ethernet. The Pi is typically programmed in Python but can be used with almost any language (C, C++, assembly, etc.).

Also, there is a micro called the BeagleBone. It is similar to the Pi but has a few different features. It is very powerful and can be researched (as a starting point) here. I know very little about this board and believe it to be more advanced than the former 2 I mentioned.

These resources can be used for the Arduino: Getting Started and Cookbook.

A few resources to get started: Python & Pi and Getting Started

The first 2 resources I listed, 1 & 2, are absolutely brilliant. They teach basic electronics and introduce Eagle CAD, a free PCB (printed circuit board) program that people use to draw schematics and PCBs, which is pretty important. I linked the free version, which is more than powerful enough for a beginner. The resource [1] is really helpful; I would read it thoroughly if interested. Also, places like http://makezine.com/ are good for DIYers. You might also like this news channel that follows hacker stuff (it is from Y Combinator, an incubator for some Silicon Valley start-ups), listed here. These links should cover you for a while.

u/grotgrot · 3 pointsr/technology

They tend to be throughput-optimised, and this is done by having more processors throughout the system. For example, an x86 system has processor interrupts whenever anything happens in the hardware, such as network packets arriving, disk transfers completing, and key presses. The AS/400 has separate processors on the I/O components, and they handle the fiddly work, leaving the main processors to keep going. The separate ones can be very high level: for example, disk processors can do database work, going off and finding relevant records, and only interrupting the main processor with the results once done.

I highly recommend reading Frank Soltis' book about the AS/400, which covers how it came about (very much a business system) and how the operating system and apps work, which again are very different from x86-style hardware and software.

Edit: the hardware (and hence operating system) has more reliability features. For example most things can be hot plugged (memory, cpus etc) and there is more error checking (think ECC memory). You can do that kind of thing on x86 too, but not the commodity stuff. Another example software difference is the database being part of the operating system - not some extra component. I don't mean distributed with it, but actually part of the OS kernel.

u/jay9909 · 2 pointsr/financialindependence

Programming the Raspberry Pi, Second Edition: Getting Started with Python

I find I learn a new language better when I have some goal to tinker with along the way, like playing with a Raspberry Pi. :)

u/gawron · 2 pointsr/QuantumComputing

Quantum games. It is a simple topic and might be quite interesting for young students. As a reference you can use a paper by my collaborator:
An Invitation to Quantum Game Theory

There are also two recent papers about the subject (both paywalled)

u/zitterbewegung · 2 pointsr/compsci

If I were you I would read the book Quantum Computing for Computer Scientists, which is geared toward undergraduates.

He has an arXiv article on which he based his book; see An Introduction to Quantum Computing.

Someone else suggested other books. Those are more geared toward graduates and I would read these books first and then try to tackle those.

u/gmartres · 2 pointsr/askscience

A simpler quantum algorithm: https://secure.wikimedia.org/wikipedia/en/wiki/Deutsch%27s_Algorithm, though I'm not sure that'll enlighten you. If you're really interested, I'd recommend Quantum Computing for Computer Scientists, which is a great intro and doesn't assume much knowledge (the first chapter introduces complex numbers, the second linear algebra).
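
Deutsch's algorithm is small enough to simulate exactly on a classical machine. Here's a plain-Python sketch: the 2-qubit state is four amplitudes, Hadamards are explicit index arithmetic, and one oracle call decides whether f: {0,1} -> {0,1} is constant or balanced:

```python
import math

def had(state, q):
    # Apply a Hadamard to qubit q (0 = first qubit) of a 2-qubit state,
    # stored as 4 amplitudes indexed by the bit pattern (q0 q1).
    s = 1 / math.sqrt(2)
    new = [0.0] * 4
    for i in range(4):
        bit = (i >> (1 - q)) & 1
        i0 = i & ~(1 << (1 - q))   # same index with qubit q = 0
        i1 = i | (1 << (1 - q))    # same index with qubit q = 1
        new[i] = s * (state[i0] + (-1) ** bit * state[i1])
    return new

def oracle(state, f):
    # The standard query oracle: |x, y> -> |x, y XOR f(x)>.
    new = [0.0] * 4
    for i in range(4):
        x, y = (i >> 1) & 1, i & 1
        new[(x << 1) | (y ^ f(x))] += state[i]
    return new

def deutsch(f):
    state = [0.0] * 4
    state[0b01] = 1.0              # start in |0>|1>
    state = had(state, 0)          # Hadamard both qubits
    state = had(state, 1)
    state = oracle(state, f)       # single query to f
    state = had(state, 0)          # Hadamard the first qubit again
    # Probability of measuring the first qubit as 1 is 0 or 1 exactly.
    p1 = sum(a * a for i, a in enumerate(state) if (i >> 1) & 1)
    return "balanced" if p1 > 0.5 else "constant"
```

Classically you'd need two evaluations of f to tell the cases apart; here interference after a single query does it, which is the whole punchline of the algorithm.
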

u/QuirkyQuarQ · 2 pointsr/raspberry_pi

Umm, it's basically a glorified 8x8 RGB LED matrix. They give you a very nice "driver" library, allowing you to simply address the LEDs as X-Y with R/G/B values. You even have example code on displaying lots of different math-based patterns, or even a small graphic file. You can't really expect a sample Android notifier application too...

Conceptually, you could easily scale your notification from one single-color LED to an 8x8 RGB matrix like the Unicorn hat. So I don't know why you thought this was the tool to start learning Python with.

I recommend you start off with a good introductory book, such as Programming the Raspberry Pi: Getting Started with Python. Whenever a project in the book talks about connecting an LED to GPIO, you can skip that and use your Unicorn HAT instead.
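
To bridge the gap between a single notification LED and the 8x8 matrix, a good habit is to keep the logic in a plain, testable function and only touch the hardware at the edge. A sketch (the `unicornhat` module name and its `set_pixel`/`show` calls are the driver library's assumed API, so check the vendor docs):

```python
def notification_frame(count, color=(255, 0, 0)):
    # One lit LED per pending notification, filling the 8x8 grid row by
    # row. Returns a list of (x, y, r, g, b) tuples -- a plain-Python
    # result you can unit-test without any hardware attached.
    pixels = []
    for i in range(min(count, 64)):  # the grid caps out at 64 LEDs
        pixels.append((i % 8, i // 8) + color)
    return pixels

# On the Pi itself, the hardware edge would look something like:
#   import unicornhat as uh
#   for x, y, r, g, b in notification_frame(3):
#       uh.set_pixel(x, y, r, g, b)
#   uh.show()
```

Swapping in a different pattern (a scrolling bar, a color per app) then only means changing the pure function, not the hardware code.
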

u/FormerlyTurnipHugger · 2 pointsr/Physics

The good thing about quantum information is that it's mostly linear algebra, once you're past the quantization itself. Better still, you don't have to understand the quantization in order to understand QI.

There are books written about quantum computing specifically for non-physicists. Mind you, they are written for engineers and computer scientists instead and they're supposed to know more maths and physics than you as well. Still, you could pick up one of those, e.g. the one by Mosca, or even better the one by David Mermin.

There are also two very new popular-science books on the topic, one by Jonathan Dowling, Schrödinger's Killer App, and one by Scott Aaronson, Quantum Computing Since Democritus.

u/[deleted] · 2 pointsr/linux

My Guru made me buy the following books. I scoffed at him, "I can get everything on the internet." But they've been invaluable in my learning how to do various things on the system:

u/vtomole · 1 pointr/QuantumComputing

I haven't entirely read this book, but I remember that it explains Grover's algorithm well. It might be what you are looking for because it states that its target audience is undergraduate computer science students.

u/KermMartian · 1 pointr/TI_Calculators

Your example sounds like you want to do binning of samples; you definitely cannot enter ranges on the calculator. Although your questions are a little broad for a single Reddit answer, might I be so bold as to do some self-promotion of "Using the TI-84 Plus", 2nd Edition, which explains statistics on the TI-84 Plus CE (and basically everything else you can do with it) in great detail: http://www.amazon.com/dp/1617293156

u/weegee101 · 1 pointr/Python

I started by going through Gray Hat Python. If you're an experienced programmer and enjoy the low-level stuff once in a while, it is a very fun way to learn Python.

If you're specifically looking to learn how to program the Raspberry Pi and learn Python at the same time (which will keep your interest up), Programming the Raspberry Pi: Getting Started with Python is a great book for beginners of all skill levels. If you are a newbie programmer, buy this book.

u/seca · 1 pointr/funny

I've seen this monkey in my dad's house on the bookshelf. Can't say I noticed the title though. He works with Oracle, so you might not be too far off. Can we figure out what it is?

Found it: http://www.amazon.com/Unix-Nutshell-Fourth-Arnold-Robbins/dp/0596100299/ref=ntt_at_ep_dpt_3

u/PinkyThePig · 1 pointr/raspberry_pi

I liked this one. http://www.amazon.com/Programming-Raspberry-Pi-Getting-Started/dp/0071807837/ref=sr_1_1?ie=UTF8&qid=1371223914&sr=8-1&keywords=python+raspberry+pi

It has an intro section that describes how to setup your raspberry pi and it ramps up pretty quickly with the final chapters covering programming the GPIO pins.

u/CodeSquad · 1 pointr/raspberry_pi

Whatever you do, do not buy this book, or this one. Just a waste of money.

u/MuckYu · 1 pointr/learnpython

u/ssaltmine · 1 pointr/raspberry_pi

There is a wealth of information out there, starting with books and kits on simple electronics. Ultimately what may frustrate most people is the lack of patience and time to develop useful projects. You need to keep motivated to continue going.

u/Enlightenment777 · 1 pointr/arduino

Do you own these books?

u/nullcone · 1 pointr/QuantumComputing

...also I recommend to you the textbook by Kaye, Laflamme, and Mosca. Alternatively, if you're feeling daring you can pick up the QC bible. And if you're feeling too cheap to buy books, you can find lecture notes from a lot of QC courses posted online. Check out John Preskill's website, or maybe MIT OpenCourseWare.

u/Azoth_ · 1 pointr/IAmA

I don't know too much about quantum computing, I've just had some basic courses in quantum physics and a side interest in it. I recently started reading An Introduction to Quantum Computing. It's a pretty great book.

My school doesn't offer any undergrad courses (or possibly even grad, I haven't checked) on the subject, but from what I've learned it's basically a mix of linear algebra, cs, ee, and physics.