## 04 January 2012

### Seven Books that Everyone Should Read

If you want a solid background in the theoretical parts of computing, then you should read the following books. The best time to do so is in high school and as an undergraduate. But it is never too late.

Types and Programming Languages. You will enjoy this book if you like to manipulate symbols. Once you understand ordered sets and induction, you are ready to begin reading. You will learn at least the following:

• Lambda calculus is the computational model of functional languages.
• Well-typed programs don't fault (except when they do).
• You construct more complicated types, such as variants, records, and function types, out of simpler ones.
• Types are much like propositions in logic, and polymorphic types are much like propositions in second-order logic.
• Mutable state is tricky.
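The first point can be made concrete with a toy interpreter. Below is a minimal sketch in Python (my own illustration, not code from the book): terms are tagged tuples, and abstractions evaluate to Python closures, so Church numerals run directly.

```python
# A tiny call-by-value evaluator for the untyped lambda calculus.
# Terms are tuples: ('var', name), ('lam', name, body), ('app', fun, arg).

def evaluate(term, env):
    kind = term[0]
    if kind == 'var':
        return env[term[1]]
    if kind == 'lam':
        name, body = term[1], term[2]
        # An abstraction becomes a Python closure over the current environment.
        return lambda v: evaluate(body, {**env, name: v})
    if kind == 'app':
        return evaluate(term[1], env)(evaluate(term[2], env))
    raise ValueError(f'unknown term: {term!r}')

# The Church numeral two = \f. \x. f (f x).
two = ('lam', 'f', ('lam', 'x',
       ('app', ('var', 'f'), ('app', ('var', 'f'), ('var', 'x')))))

# Applying it to a Python successor and zero recovers the integer 2.
assert evaluate(two, {})(lambda n: n + 1)(0) == 2
```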

Algorithms. You will enjoy this book if you like the problems used in Google Code Jam. Once you understand the notation $O(f(n))$ and you can solve the recurrence $x_{n+1}=2x_n+1$, you are ready to begin reading. You will learn at least the following:

• Reductions are *the* algorithm design technique.
• When you reduce an instance of a problem to smaller instances of the same problem, you say that you use recursion (or, if you want a fancier phrase, say ‘divide and conquer’).
• If you cache the results of recursive calls, then you say that you use memoization.
• If you use at most one recursive call, then you are greedy.
• If you are stuck, then try to reduce your problem to linear programming, or some other problem that is famous and has good implementations.
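The memoization point, and the recurrence from the prerequisite, fit in a few lines. The sketch below (mine, not the book's) caches recursive calls with `functools.lru_cache` and checks the result against the closed form $x_n = 2^n - 1$ for $x_0 = 0$.

```python
from functools import lru_cache

# The recurrence x_{n+1} = 2 x_n + 1, with x_0 = 0, computed by memoized
# recursion. Without the cache the call tree is a chain here, but for
# recurrences with several recursive calls memoization turns exponential
# time into polynomial time.
@lru_cache(maxsize=None)
def x(n):
    if n == 0:
        return 0
    return 2 * x(n - 1) + 1

# The closed form of the recurrence is 2^n - 1.
assert all(x(n) == 2**n - 1 for n in range(60))
```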

Purely Functional Data Structures. You will enjoy this book if you have puzzled over how to implement breadth-first search in a programming language without mutation. Once you know the basics of a functional language (such as OCaml or Haskell), you are ready to begin reading. You will learn at least the following:

• Persistent implementations of heaps and of search trees are elegant.
• Amortization is a technique for data structure design, although it is usually presented as a technique for algorithm analysis.
• The requirements of persistence and amortization imply the need for lazy evaluation.
• Number representations correspond to collection implementations.
• Polymorphic recursion is useful.
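Persistence is easy to demonstrate with path copying. The sketch below is mine, in Python for consistency with the other examples (the book works in Standard ML and Haskell): `insert` builds a new search tree that shares every unchanged subtree with the old version.

```python
# A persistent (immutable) binary search tree. Insertion copies only the
# path from the root to the new node; everything else is shared.

class Node:
    __slots__ = ('key', 'left', 'right')
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(tree, key):
    if tree is None:
        return Node(key)
    if key < tree.key:
        return Node(tree.key, insert(tree.left, key), tree.right)
    if key > tree.key:
        return Node(tree.key, tree.left, insert(tree.right, key))
    return tree  # key already present: share the whole tree

def member(tree, key):
    while tree is not None:
        if key == tree.key:
            return True
        tree = tree.left if key < tree.key else tree.right
    return False

t1 = None
for k in [5, 2, 8]:
    t1 = insert(t1, k)
t2 = insert(t1, 1)                  # a new version of the tree...
assert member(t2, 1) and not member(t1, 1)  # ...while the old one is intact
assert t2.right is t1.right         # the untouched subtree is shared
```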

Information Theory, Inference, and Learning Algorithms. You will enjoy this book if you like probabilities (or gambling). Once you can answer ‘What is the probability that the sum of a pair of dice is $\ge3$?’, you are ready to begin reading. You will learn at least the following:

• Information equals entropy; it is measured in bits.
• Communication systems are reliable because they use codes, which rely on redundancy.
• Out of functions whose spectrum has finite support, those of type $\mathbb N\to\mathbb R$ carry as much information as those of type $\mathbb R\to\mathbb R$.
• The bandwidth and the signal-to-noise ratio of a channel determine a fundamental upper limit on its capacity, known as the Shannon limit.
• The codes that best approach the Shannon limit are based on sparse graphs.
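The first point, and the dice question from the prerequisite, can be checked numerically. A small sketch of my own: entropy in bits of a discrete distribution, applied to a coin and to the sum of two dice.

```python
from math import log2

# Shannon entropy of a discrete distribution, measured in bits.
def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per toss; a biased coin carries less.
assert entropy([0.5, 0.5]) == 1.0
assert entropy([0.9, 0.1]) < 1.0

# The sum of two dice takes 11 values with probabilities 1/36, 2/36, ..., 1/36.
# Its entropy is below log2(11) because the distribution is not uniform.
dice = [min(s - 1, 13 - s) / 36 for s in range(2, 13)]
assert abs(sum(dice) - 1.0) < 1e-12
assert 3.2 < entropy(dice) < log2(11)
```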

The Art of Computer Programming. You will enjoy this book if you like detailed precision. Once you have written a program in a low-level language, you are ready to begin reading. You will learn at least the following:

• You must prove that your programs are correct and fast.
• There are ${2n \choose n}/(n+1)$ binary trees with $n$ nodes; that is, $\sim 4^n/(n^{3/2}\sqrt{\pi})$.
• One of the best hashes of $x$ takes the high bits of $\lfloor 2^wa\rfloor x$, where $w$ is the word width, and $a$ is some irrational number from $(0,1)$.
• BDDs represent boolean functions; ZDDs represent families of sets.
• You can generate the permutations of $n$ objects using only a constant number of operations for each step.

Approximation Algorithms. You will enjoy this book if you like good-enough, fast code more than you like perfect, but nonexistent code. Once you have finished the Algorithms book from above, you are ready to begin reading. You will learn at least the following:

• You need not despair if you have to solve an NP-hard optimization problem.
• Greedy algorithms are surprisingly good, but you should know the examples on which they perform at their worst.
• The approximation ratio compares the solutions produced by an algorithm to optimal solutions.
• The greedy algorithm for Set-Cover has approximation ratio $O(\lg n)$, where $n$ is the size of the universe.
• To design a minimization algorithm, search for a certificate that provides a lower bound.
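The greedy Set-Cover algorithm mentioned above fits in a dozen lines. A sketch of my own, with a made-up instance: repeatedly pick the set that covers the most still-uncovered elements; this achieves the $H_n = O(\lg n)$ ratio.

```python
# Greedy Set-Cover: at each step, choose the set covering the largest
# number of uncovered elements. The number of chosen sets is within a
# factor H_n = 1 + 1/2 + ... + 1/n = O(lg n) of the optimum.
def greedy_set_cover(universe, sets):
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        if not (uncovered & best):
            raise ValueError('the given sets do not cover the universe')
        chosen.append(best)
        uncovered -= best
    return chosen

# A toy instance: the two large sets suffice, and greedy finds them.
universe = range(1, 11)
sets = [set(range(1, 7)), set(range(5, 11)), {1, 2}, {9, 10}]
cover = greedy_set_cover(universe, sets)
assert set().union(*cover) >= set(universe)
assert len(cover) == 2
```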

Computational Complexity. You will enjoy this book if you like to know what computers can do in principle more than you like to know what computers really do. Once you understand why the halting of a program cannot be decided, you are ready to begin reading. You will learn at least the following:

• ‘Can we solve a problem efficiently if we can check its answers efficiently?’ is the most famous open question in theoretical computer science.
• We can efficiently find models for 3CNF formulas only if the answer to the famous question is ‘yes’.
• Every problem that can be solved with reasonable amounts of memory can be solved (not much slower) by playing a game.
• Some proofs don't teach you anything.
• If you can quickly count models of CNF formulas, then you are good at playing games.
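Counting models of a CNF formula is easy to do slowly, which is exactly the point: no fast method is known, and by Toda's theorem a fast counter would dominate the whole polynomial hierarchy. A brute-force sketch of my own, with the usual DIMACS-style literal convention:

```python
from itertools import product

# Brute-force model counting for CNF formulas (the #SAT problem).
# A clause is a list of literals; literal +i means variable i, -i its negation.
# This enumerates all 2^n assignments, so it is exponential in num_vars.
def count_models(num_vars, clauses):
    count = 0
    for bits in product([False, True], repeat=num_vars):
        def holds(lit):
            value = bits[abs(lit) - 1]
            return value if lit > 0 else not value
        if all(any(holds(lit) for lit in clause) for clause in clauses):
            count += 1
    return count

# (x1 or x2) and (not x1 or x3): of the 8 assignments, 4 are models.
assert count_models(3, [[1, 2], [-1, 3]]) == 4
```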

Disclaimer. I read only parts of these books.

Other lists. I wrote this post after I saw two other book lists: one for programmers (on a Romanian site popular among high-school students) and one for theoretical computer scientists (on CSTheory).

What other books would you put on this list? Which ones would you take off this list?
