In this chapter we introduce the elements of probability theory. In the spirit of the book, we confine ourselves to the discrete case, that is, to probabilities on finite domains, leaving the infinite case aside. We begin by defining probability functions on a finite sample space and identifying some of their basic properties. So much is simple mathematics. This is followed by some words on different philosophies of probability, and warnings about traps that arise in applications. Then it is back to the mathematical work: we introduce the concept of conditional probability, outline its behaviour, trace its connections with independence, and show how it is deployed in Bayes’ rule. An interlude reflects on a curious configuration known as Simpson’s paradox. In the final section we explain the notions of a payoff function and expected value.