I’m a second-year PhD student at MIT’s Probabilistic Computing Project, co-advised by Vikash Mansinghka and Josh Tenenbaum. Before coming to MIT, I taught high school computer science at Commonwealth School in Boston. And before that, I was a student at Yale, where I received a B.S. in computer science and mathematics in 2015.

I’m interested in designing systems and abstractions that help people apply, invent, and reason about sophisticated algorithms for probabilistic modeling and inference. I’m also interested in using those systems to explore ideas in knowledge representation, expert systems, logic, and program synthesis.

### Trace Types and Denotational Semantics for Sound Programmable Inference in Probabilistic Languages (POPL 2020)

**Alexander Lew**, Marco Cusumano-Towner, Benjamin Sherman, Michael Carbin, Vikash Mansinghka

We develop a type system for probabilistic programs that enables us to mechanically verify measure-theoretic correctness properties of inference algorithms. Using our type system and semantics, we design sound-by-construction versions of programmable MCMC, SMC, and variational inference.

### Few-Shot Bayesian Imitation Learning with Logical Program Policies (AAAI 2020)

Tom Silver, Kelsey Allen, **Alexander Lew**, Leslie Kaelbling, Josh Tenenbaum

We define an expressive class of program-based policies and derive an approximate Bayesian inference algorithm for learning them from demonstrations. In five strategy games played on a 2D grid, after just a handful of demonstrations, the inferred policies generalize to new game instances that differ substantially from the demonstrations.

### Gen: A General-Purpose Probabilistic Programming System with Programmable Inference (PLDI 2019)

Marco Cusumano-Towner, Feras Saad, **Alexander Lew**, Vikash Mansinghka

We introduce Gen, a new probabilistic programming system that outperforms state-of-the-art systems, sometimes by orders of magnitude, on diverse problems including object tracking, estimating 3D body pose from depth images, and inferring the structure of a time series.

### Probabilistic Scripts for Automating Common-Sense Tasks (StrangeLoop 2019)

As engineers, we love to automate tedious tasks, but when the task at hand requires common sense, automation can be difficult. Probabilistic scripting is a style of declarative programming, based on probabilistic programming, that lets us replace heuristics with principled common-sense reasoning.

### Gen: A General-Purpose Probabilistic Programming System (JuliaCon 2019)

A ten-minute introduction to Gen, and some thoughts on how it compares to other probabilistic programming systems.

**At MIT**, I co-taught a January-term course on applied probabilistic programming, and served twice as a TA for 6.885: Probabilistic Programming & AI.

**From 2015 to 2018**, I taught computer science full-time at Commonwealth School.

### CS 1: Intro to Program Design

In this hands-on introduction to computer science using Racket and Python, students create dozens of interactive programs while exploring core ideas in computation like data representation, abstraction, and recursion.

### CS 2: AP Computer Science Principles

Students explore big ideas in computing, from machine learning to networking to cryptocurrencies, by coding up the relevant algorithms themselves.

### CS 3: Data Structures and Algorithms

We take a deep dive under the hood to see how lists, trees, and arrays really work. Then we practice algorithmic thinking and algorithm design.

### CS 4: Advanced Topics in Computer Science

This course is different each year depending on student interest.

**From 2015 to 2019**, I served as a TA at the Duke Machine Learning Summer School.

### TensorFlow fundamentals

A slow and thorough introduction to TensorFlow from the ground up, designed for use at the Duke Machine Learning Summer School. Rather than jump straight into building a neural net, we focus first on the fundamentals: graphs, operations, tensors, placeholders, variables, and sessions.

### Robot Mind Meld

A fun little word game powered by machine learning. With a hundred thousand words to choose from, can you and the robot converge to the same one?

Email me at alexlew AT mit DOT edu.