Reddit reviews Applied Cryptography: Protocols, Algorithms and Source Code in C

We found 8 Reddit comments about Applied Cryptography: Protocols, Algorithms and Source Code in C. Here are the top ones, ranked by their Reddit score.

Applied Cryptography: Protocols, Algorithms and Source Code in C (Wiley)

8 Reddit comments about Applied Cryptography: Protocols, Algorithms and Source Code in C:

u/JobDestroyer · 5 points · r/GoldandBlack

I post this because this is the money system of a libertarian society, and we might have to think about some solutions to the problems he's outlining here. Also, this isn't just some ignorant NY Times reporter; this is the guy who wrote the book on cryptography. Applied Cryptography was, and still is, one of the first books people recommend on cryptographic protocols, so it seems like a good idea to consider what he has to say on the subject.

u/HenryJonesJunior · 3 points · r/AskComputerScience

You mention a diverse set of topics, and you're probably not going to find any one book that covers all of them.

For cryptographic algorithms, signatures, protocols, etc., the definitive go-to (last I checked) was still Schneier's Applied Cryptography.

For a history of cryptography, I'm fond of Kahn's The Codebreakers, but be forewarned that it is a large book.

For Network Security and Information Assurance concepts, I like Anderson's Security Engineering, but the state of the art changes so rapidly that it's difficult to recommend a book.

u/josejimeniz2 · 3 points · r/crypto

Applied Cryptography by Bruce Schneier.

Yes, it's older, but it will get you up to speed with the concepts.

I think the book really is the gold standard when it comes to introducing cryptography. I read it cover-to-cover in 1999 and it really explains everything well. I used encryption software before that, but this explains how it all works.

u/AaronKClark · 2 points · r/OMSCyberSecurity

The cryptography course is the one I'm really excited about. I read the red book when I was like 19, and I've been waiting for that course ever since.

u/perladdict · 2 points · r/hacking

Yeah, just don't buy it unless you really want to do and explore a lot of C programming. That said, it was great as a sort of reference for my systems programming class. But as for attack/defense, start looking into networking if you aren't familiar with it, then network security if you are; and then, of course, there is actual information security, which is what most people mean by crypto even though there is a lot more to it. For crypto, I'd recommend this, but that's more an overview of what your crypto algorithms actually do.

u/OmegaNaughtEquals1 · 2 points · r/cpp_questions

Everything you ever wanted to know about cryptography (but not necessarily all cryptographic algorithms) is in Practical Cryptography. If that doesn't fill your cup, then put on your big-boy pants and dive into Applied Cryptography. You will note that Bruce Schneier is a common author between those two books. There is a reason for that. :-)

u/tufty_thesinger · 1 point · r/cryptography

Read: Applied Cryptography by Bruce Schneier. Goes through implementation and attack details on several older algorithms, as well as all sorts of cool applications. It's an older book, but the older algorithms are easier to understand and start with.

u/CypherZealot · 1 point · r/singularity

From Applied Cryptography (1996):

>One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

>Given that k = 1.38×10⁻¹⁶ erg/°Kelvin, and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10⁻¹⁶ ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

>Now, the annual energy output of our sun is about 1.21×10⁴¹ ergs. This is enough to power about 2.7×10⁵⁶ single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2¹⁹². Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.

>But that's just one star, and a measly one at that. A typical supernova releases something like 10⁵¹ ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

>These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
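For anyone who wants to sanity-check the arithmetic in that passage, here is a minimal sketch in C (in keeping with the book's title) that recomputes the figures from the two constants Schneier quotes. The file name, variable names, and the rounding down to whole powers of two are my own choices; this is a back-of-the-envelope bound, not a statement about any real hardware.

```c
/* Back-of-the-envelope check of Schneier's thermodynamic argument
 * (Applied Cryptography, 1996). Quantities in ergs and kelvin,
 * matching the quoted passage. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double k    = 1.38e-16; /* Boltzmann constant, erg/K */
    const double T    = 3.2;      /* cosmic background temperature, K */
    const double sun  = 1.21e41;  /* annual energy output of the sun, ergs */
    const double nova = 1e51;     /* energy of a typical supernova, ergs */

    double e_bit  = k * T;                 /* minimum energy to set or clear one bit */
    double year1  = sun / e_bit;           /* bit changes powered by one year of sunlight */
    double year32 = 32.0 * year1;          /* bit changes powered by 32 years of sunlight */
    double nova_f = nova / e_bit;          /* bit changes powered by one supernova */

    printf("energy per bit change:     %.2e ergs\n", e_bit);
    printf("flips per solar year:      %.2e  (~2^%.0f)\n", year1,  floor(log2(year1)));
    printf("flips per 32 solar years:  %.2e  (~2^%.0f)\n", year32, floor(log2(year32)));
    printf("flips per supernova:       %.2e  (~2^%.0f)\n", nova_f, floor(log2(nova_f)));
    return 0;
}
```

Compiled with something like `cc landauer.c -lm`, it reproduces the ~2^187 and ~2^192 figures quoted above, and the supernova figure comes out within a bit of the book's 219-bit counter.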