Time: MW 3:00 — 4:15
Place: Anderson Hall 309
Questions to: Piazza
Instructor: Norman Ramsey, Halligan 222

Quick Reference: Syllabus, Calendar, Photos, Scouting

At this point the main resource for the class is a syllabus, which is available in both HTML and PDF. I will gradually build out a calendar.

Information about readings will be forthcoming; in order to avoid spoiling the first day of class, I am withholding it for a time.

Resources from scouting reports

Erlang resources:

Case-study resources

In addition to the curated reading below, there is an uncurated list at the bottom of this page.

Curated Reading


Andrew Appel, A Run-time System, Lisp and Symbolic Computation, 1990.

By making a number of simplifying assumptions, Appel built a run-time system that is relatively simple, relatively performant, and capable of supporting an ambitious language like Standard ML.

Garbage Collection

Henry Lieberman and Carl Hewitt. 1983. A real-time garbage collector based on the lifetimes of objects. Commun. ACM 26, 6 (June 1983), 419-429. http://dx.doi.org/10.1145/358141.358147

Early paper on generational GC. Also notable for dividing the heap into pages. Collection is incremental. Contains many interesting observations. A distinctive design choice: when an old object points to a new object, it does so through a level of indirection.
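To make the indirection idea concrete, here is a small Go sketch (my own illustration, not the paper's implementation): an old-generation object stores an index into an entry table rather than a direct pointer into the new generation, so the collector can relocate a new object by updating one table slot, and can find all old-to-new references by scanning the table instead of the whole old generation.

```go
package main

import "fmt"

// Obj stands in for a heap object; only the payload matters here.
type Obj struct {
	value int
}

// EntryTable mediates every reference from the old generation into
// the new generation. Old objects store an index, never a pointer.
type EntryTable struct {
	slots []*Obj
}

// Add records an old->new reference and returns the index the old
// object will store in place of a direct pointer.
func (t *EntryTable) Add(newObj *Obj) int {
	t.slots = append(t.slots, newObj)
	return len(t.slots) - 1
}

// Deref follows an indirect reference from the old generation.
func (t *EntryTable) Deref(i int) *Obj { return t.slots[i] }

// Relocate simulates the collector moving a new-generation object:
// only the table slot changes; old objects are untouched.
func (t *EntryTable) Relocate(i int, copyTo *Obj) {
	*copyTo = *t.slots[i]
	t.slots[i] = copyTo
}

func main() {
	table := &EntryTable{}
	young := &Obj{value: 7}
	idx := table.Add(young) // an old object would store idx
	fmt.Println(table.Deref(idx).value)

	survivor := &Obj{}
	table.Relocate(idx, survivor) // a collection moves the object
	fmt.Println(table.Deref(idx).value) // the old reference still works
}
```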

David Ungar. 1984. Generation Scavenging: A non-disruptive high performance storage reclamation algorithm. In Proceedings of the first ACM SIGSOFT/SIGPLAN software engineering symposium on Practical software development environments (SDE 1). ACM, New York, NY, USA, 157-167. http://dx.doi.org/10.1145/800020.808261

A careful approach to the performance of memory management. Reduced memory-management costs by a factor of eight and got Berkeley Smalltalk running 1.7 times faster overall. The system includes an “old object area” in which objects are tenured and therefore never reclaimed. Collection is not incremental, but pause times are nevertheless short, though not short enough for real-time animation.

Andrew W. Appel. 1989 (February). Simple Generational Garbage Collection and Fast Allocation. Software—Practice & Experience, 19(2):171–183.

A “simple recipe” for building your own generational collector. Slickly sold.
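The heart of the recipe is that allocation must be cheap: bump a pointer, with a single limit check that triggers a minor collection when the nursery fills. A Go sketch of that fast path (the collection itself is stubbed out; in the real scheme live nursery data is copied into the older generation before the nursery is reset):

```go
package main

import "fmt"

// nurserySize is measured in words; the size is arbitrary for this sketch.
const nurserySize = 1024

// Heap holds a contiguous nursery and a bump pointer.
type Heap struct {
	nursery  [nurserySize]uint64
	next     int // next free word in the nursery
	minorGCs int
}

// Alloc returns the index of n freshly allocated words. The fast path
// is one comparison and one addition.
func (h *Heap) Alloc(n int) int {
	if h.next+n > nurserySize { // the one check on the fast path
		h.minorCollect()
	}
	p := h.next
	h.next += n
	return p
}

// minorCollect stands in for a minor collection: copying live nursery
// data to the old generation is omitted; we just reset the nursery.
func (h *Heap) minorCollect() {
	h.minorGCs++
	h.next = 0
}

func main() {
	h := &Heap{}
	for i := 0; i < 1000; i++ {
		h.Alloc(4) // e.g., a small record with a header word
	}
	fmt.Println(h.minorGCs)
}
```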

Examining the stack

(Coming soon.)

Message-passing concurrency

Books and papers

C. A. R. Hoare. 1978 (August). Communicating Sequential Processes. Communications of the ACM, 21(8):666–677.

The original paper on synchronization using communication primitives, not locks. Once you fight past the notation, which is based on Dijkstra’s guarded commands, you’ll see that the paper is mostly examples.

A key construct is the “alternative command,” a generalization of an if statement whose “guards” are communication primitives. This construct lets a process choose among whichever communications are ready to proceed. It is an essential part of the model, but it is quite difficult to implement, and thus has been omitted from many libraries based on Hoare’s ideas.
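Go’s select statement, a descendant of Hoare’s alternative command by way of Newsqueak, shows the construct in a language that did implement it: each case is guarded by a send or a receive, and one ready case is chosen. A small example, merging two streams in whatever order values arrive:

```go
package main

import "fmt"

// merge forwards values from either input to out, whichever is ready,
// until both inputs are closed. Setting a channel to nil disables its
// select case (a receive on a nil channel blocks forever).
func merge(a, b <-chan int, out chan<- int) {
	for a != nil || b != nil {
		select { // the alternative command: choose a ready communication
		case v, ok := <-a:
			if !ok {
				a = nil
				continue
			}
			out <- v
		case v, ok := <-b:
			if !ok {
				b = nil
				continue
			}
			out <- v
		}
	}
	close(out)
}

func main() {
	a := make(chan int)
	b := make(chan int)
	out := make(chan int)
	go func() { a <- 1; a <- 2; close(a) }()
	go func() { b <- 10; close(b) }()
	go merge(a, b, out)
	sum := 0
	for v := range out {
		sum += v
	}
	fmt.Println(sum) // 13, accumulated in a nondeterministic order
}
```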

There is also a book.

Rob Pike. 1989 (April). Newsqueak: A Language for Communicating with Mice. Technical Report 143, AT&T Bell Laboratories, Computing Science Dept.

Rob Pike. 1990 (July). The Implementation of Newsqueak. Software—Practice & Experience, 20(7), 649-659.

Newsqueak is a really nice implementation, in an interpreter, of a concurrent programming language based on synchronous message passing.
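The flavor of Newsqueak-style programming carries over directly into Go. The classic demonstration is the concurrent prime sieve: a generator process feeds candidates into a pipeline, and each prime discovered spawns a filter process that removes its multiples. (The pipeline’s goroutines run forever and are simply abandoned when the program exits; that is fine for a demo.)

```go
package main

import "fmt"

// generate sends the integers 2, 3, 4, ... down ch.
func generate(ch chan<- int) {
	for i := 2; ; i++ {
		ch <- i
	}
}

// filter copies values from in to out, dropping multiples of prime.
func filter(in <-chan int, out chan<- int, prime int) {
	for {
		if v := <-in; v%prime != 0 {
			out <- v
		}
	}
}

// firstPrimes returns the first n primes by growing the pipeline:
// each new prime adds one more filter stage.
func firstPrimes(n int) []int {
	primes := make([]int, 0, n)
	ch := make(chan int)
	go generate(ch)
	for i := 0; i < n; i++ {
		p := <-ch // the head of the pipeline is always prime
		primes = append(primes, p)
		next := make(chan int)
		go filter(ch, next, p)
		ch = next
	}
	return primes
}

func main() {
	fmt.Println(firstPrimes(5)) // [2 3 5 7 11]
}
```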

Rob Pike. 1989 (Spring). A Concurrent Window System. Computing Systems, 2(2):133–153.

A very nice case study of using synchronous message passing to implement a window system. It’s a huge step up from the systems of the era, which were based on event loops.
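The architectural idea is that each window is a separate process that owns its own state and multiplexes its inputs with an alternative command, instead of registering callbacks with a central event loop. A toy Go version of that structure (the message types here are invented for the sketch):

```go
package main

import "fmt"

// Click is a mouse event; invented for this illustration.
type Click struct{ X, Y int }

// window is a process that owns its state outright: because only this
// goroutine touches lastClick, no locks are needed. It multiplexes
// mouse input, repaint requests, and shutdown with select.
func window(clicks <-chan Click, repaint <-chan chan string, done <-chan struct{}) {
	lastClick := Click{-1, -1} // window-private state
	for {
		select {
		case c := <-clicks:
			lastClick = c
		case reply := <-repaint:
			reply <- fmt.Sprintf("window with last click at (%d,%d)",
				lastClick.X, lastClick.Y)
		case <-done:
			return
		}
	}
}

func main() {
	clicks := make(chan Click)
	repaint := make(chan chan string)
	done := make(chan struct{})
	go window(clicks, repaint, done)

	clicks <- Click{3, 4}
	reply := make(chan string)
	repaint <- reply // ask the window to describe itself
	fmt.Println(<-reply)
	close(done)
}
```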

John H. Reppy. 1988. Synchronous Operations as First-class Values. SIGPLAN Notices (Proc. SIGPLAN ’88 Conf. on Prog. Lang. Design and Implementation), 23(7):250–59.

John H. Reppy. 1991 (June). CML: a higher-order concurrent language. In Proceedings of the ACM SIGPLAN ’91 Conference on Programming Language Design and Implementation.

John H. Reppy. 1999. Concurrent Programming in ML. Cambridge University Press, Cambridge, England.

John Reppy invented a beautiful generalization of synchronous communication: call “send” and “receive” events, and allow programmers to create new events by composing existing events. By enabling synchronous communication to support abstraction and composition, Reppy makes it a first-class citizen of a programming language.

The 1988 paper describes the idea. The 1991 paper shows it in a higher-order setting (ML). The 1999 book develops related programming techniques at length.
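To show what “first-class” buys, here is a miniature emulation of the event interface in Go, using reflect.Select for the underlying nondeterministic choice. A receive on a channel becomes a value (an Event); Wrap attaches a post-processing function; Choose composes alternatives; only Sync commits to a communication. This illustrates the interface, not CML’s implementation.

```go
package main

import (
	"fmt"
	"reflect"
)

// Event is a synchronous operation reified as a value: a set of
// alternative communications, each with a function to apply to its
// result after synchronization.
type Event struct {
	cases []reflect.SelectCase
	wraps []func(interface{}) interface{}
}

func identity(x interface{}) interface{} { return x }

// RecvEvt is the event of receiving a value on ch.
func RecvEvt(ch chan int) Event {
	return Event{
		cases: []reflect.SelectCase{{Dir: reflect.SelectRecv, Chan: reflect.ValueOf(ch)}},
		wraps: []func(interface{}) interface{}{identity},
	}
}

// Wrap returns e with f applied to its result after synchronization.
func Wrap(e Event, f func(interface{}) interface{}) Event {
	wraps := make([]func(interface{}) interface{}, len(e.wraps))
	for i, g := range e.wraps {
		g := g
		wraps[i] = func(x interface{}) interface{} { return f(g(x)) }
	}
	return Event{cases: e.cases, wraps: wraps}
}

// Choose is the event that offers all the alternatives of its arguments.
func Choose(events ...Event) Event {
	var out Event
	for _, e := range events {
		out.cases = append(out.cases, e.cases...)
		out.wraps = append(out.wraps, e.wraps...)
	}
	return out
}

// Sync blocks until one base event can proceed, commits to it, and
// returns its wrapped result.
func Sync(e Event) interface{} {
	i, v, _ := reflect.Select(e.cases)
	return e.wraps[i](v.Interface())
}

func main() {
	a := make(chan int, 1)
	b := make(chan int)
	a <- 20 // only a is ready, so Sync must choose it
	double := func(x interface{}) interface{} { return x.(int) * 2 }
	fmt.Println(Sync(Choose(Wrap(RecvEvt(a), double), RecvEvt(b)))) // 40
}
```

Note how composition happens before synchronization: an abstraction can hand back an Event, and its client can still combine it with other events before committing, which is the point Reppy is making.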

A note on design

Some big questions on message passing:


Lua in context

Roberto Ierusalimschy, Luiz H. de Figueiredo, and Waldemar Celes. 1996 (June). Lua—an Extensible Extension Language. Software—Practice & Experience, 26(6):635–652.

Introduction to the language, with a couple of illustrative applications.

Roberto Ierusalimschy, Luiz H. de Figueiredo, and Waldemar Celes. 2007 (June). The Evolution of Lua. In Proceedings of the third ACM SIGPLAN conference on History of Programming Languages, pages 2-1 to 2-26.

Twenty years later, a retrospective. In addition to providing more technical detail, this paper points at some of the things that make Lua important and successful in its niche.

Roberto Ierusalimschy, Luiz Henrique de Figueiredo, and Waldemar Celes. 2005. The Implementation of Lua 5.0. Journal of Universal Computer Science, 11(7):1159–1176.

This paper is not really an overview of the implementation; it focuses on a few key aspects of Lua 5.0. From our point of view, the most interesting section is probably the one on coroutines, but the section on the virtual machine may provide some context in which to understand the implementation.

Details of Lua’s implementation

Ravi Programming Language Documentation, Lua 5.3 Bytecode Reference.

Contains a nice picture of the Lua machine state, plus some of the notation used in the implementation. Has examples of bytecodes executed for calls.

Ravi Programming Language Documentation, Lua Parsing and Code Generation Internals.

Mostly out of scope for us. Overlaps with the bytecode reference, but also contains a short description of register windows.

Kein-Hong Man. 2006. A No-Frills Introduction to Lua 5.1 VM Instructions.

A much more detailed and discursive account of the VM’s instruction set, with more pictures and more examples. Far more detail than we might wish to know.

Lua Coroutines

Programming in Lua, Chapter 9, Coroutines.

A nice, short introduction to coroutines. Emphasizes common patterns of usage.

Moura, A.L.D. and Ierusalimschy, R., 2009. Revisiting coroutines. ACM Transactions on Programming Languages and Systems (TOPLAS), 31(2), p.6.

A thorough, scholarly look at the design space of coroutines. Advocates for a revival of coroutines as a useful control mechanism for concurrency, and shows they are equivalent in expressive power to one-shot continuations. This paper mostly subsumes an earlier paper “Coroutines in Lua,” although the earlier paper does contain a short survey of coroutines and similar features in other languages.
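The control behavior of Lua’s asymmetric coroutines, with their resume/yield protocol, can be emulated in Go with a goroutine and two unbuffered channels. (Lua’s implementation is quite different: each coroutine gets its own Lua stack. This sketch mimics only the transfer of control, and it leaks a goroutine if a coroutine is abandoned mid-run.)

```go
package main

import "fmt"

// Coroutine pairs two rendezvous channels: one carries values into
// the coroutine on Resume, the other carries values out on yield.
type Coroutine struct {
	toCo   chan int // values passed in by Resume
	fromCo chan int // values passed out by yield
}

// NewCoroutine creates a suspended coroutine; body does not run until
// the first Resume. (Unlike Lua, this sketch discards the first
// Resume's argument instead of passing it to the body.)
func NewCoroutine(body func(yield func(int) int)) *Coroutine {
	co := &Coroutine{toCo: make(chan int), fromCo: make(chan int)}
	go func() {
		<-co.toCo // wait for the first Resume
		yield := func(v int) int {
			co.fromCo <- v // suspend, handing v back to Resume
			return <-co.toCo
		}
		body(yield)
		close(co.fromCo) // the coroutine is now dead
	}()
	return co
}

// Resume transfers control to the coroutine until it yields or
// finishes; ok reports whether the coroutine is still alive.
func (co *Coroutine) Resume(v int) (result int, ok bool) {
	co.toCo <- v
	result, ok = <-co.fromCo
	return
}

func main() {
	co := NewCoroutine(func(yield func(int) int) {
		for i := 1; i <= 3; i++ {
			yield(i * 10)
		}
	})
	for i := 0; i < 3; i++ {
		v, _ := co.Resume(0)
		fmt.Println(v) // 10, then 20, then 30
	}
}
```

The asymmetry is visible in the types: Resume names a particular coroutine, while yield always returns to whoever resumed, which is exactly the distinction the paper draws against symmetric designs like Icon’s.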

Other coroutines

Wampler, S.B. and Griswold, R.E. 1983. Co-expressions in Icon. The Computer Journal, 26(1), pp.72-78.

Ralph E. Griswold and Madge T. Griswold. 1996. The Icon Programming Language. Third Edition, Chapter 9 (Co-Expressions).

Coroutines in a language that is expression-oriented, not statement-oriented. This is a “symmetric” alternative to Lua’s “asymmetric” coroutines. Probably you cannot understand Icon’s co-expressions without also understanding Icon’s generators. They are well worth understanding—it is a highly crafted and effective evaluation model, especially for search problems and string processing—but it is a bit far afield for our course.

Uncurated Reading

There’s a big mess.