Principles of Computer Systems

Butler W. Lampson


Citation: The on-line version is for the 2006 edition of the course. This material was developed jointly with Bill Weihl, Nancy Lynch, Martin Rinard and Daniel Jackson.

Links: Acrobat (2.6 MB, about 450 pages), Word (5.5 MB), Web page (this is the course Web page at MIT). Slides for a short talk on the essential ideas of the course are here.

Email: blampson@microsoft.com. This paper is at http://research.microsoft.com.


Abstract:

This is a course for computer system designers and builders, and for people who want to really understand how systems work, especially concurrent, distributed, and fault-tolerant systems.

The course teaches you how to write precise specifications for any kind of computer system, what it means for an implementation to satisfy a specification, and how to prove that it does. It also shows you how to use the same methods less formally, and gives you some suggestions for deciding how much formality is appropriate (less formality means less work, and often a more understandable spec, but also more chance to overlook an important detail).
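To give a concrete flavor of what "an implementation satisfies a specification" means, here is a minimal sketch (in Python rather than the course's own Spec notation, and with names of my own choosing): the spec is an abstract state machine, the implementation is ordinary code, and an abstraction function maps each implementation state to a spec state so that every operation of the implementation simulates the corresponding spec operation.

```python
# Sketch of "implementation satisfies specification" via an abstraction
# function and a simulation check. Illustrative only; the course itself
# uses its Spec language and state machines for this.

class SpecSet:
    """Specification: a mathematical set of integers (abstract state machine)."""
    def __init__(self):
        self.s = set()
    def insert(self, x):
        self.s.add(x)
    def member(self, x):
        return x in self.s

class ListSet:
    """Implementation: an unsorted list with no duplicates."""
    def __init__(self):
        self.items = []
    def insert(self, x):
        if x not in self.items:      # maintain the representation invariant
            self.items.append(x)
    def member(self, x):
        return x in self.items

def abstraction(impl):
    """Abstraction function: map an implementation state to a spec state."""
    return set(impl.items)

# One step of the simulation argument: if the states correspond before an
# operation, they still correspond after it, and visible results agree.
spec, impl = SpecSet(), ListSet()
for x in [3, 1, 4, 1, 5]:
    spec.insert(x)
    impl.insert(x)
    assert abstraction(impl) == spec.s
    assert impl.member(x) == spec.member(x)
```

The same pattern scales up: a proof that a real system meets its spec is essentially this simulation argument carried out over all reachable states and all operations, rather than checked on a few test runs.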

The course also teaches you a lot about the topics in computer systems that we think are the most important: persistent storage, concurrency, naming, networks, distributed systems, transactions, fault tolerance, and caching. The emphasis is on

careful specifications of subtle and sometimes complicated things,

the important ideas behind good implementations, and

how to understand what makes them actually work.

We spend most of our time on specific topics, but we use the general techniques throughout. We emphasize the ideas that different kinds of computer system have in common, even when they have different names.