CS369E: Communication Complexity (for Algorithm Designers)
Instructor: Tim Roughgarden (office hours: by appointment, Gates 474).
Lectures: 10AM--Noon on Thursdays in Gates 100.
The best algorithm designers can prove both
possibility and impossibility results --- both upper and lower
bounds. For example, every serious computer scientist knows a
collection of canonical NP-complete problems and how to reduce them
to other problems of interest.
Communication complexity offers a clean theory that is extremely
useful for proving lower bounds for lots of different fundamental
problems. The two biggest goals of the course are:
1. Learn several canonical problems that have proved the most useful
for proving lower bounds (Disjointness, Index, Gap-Hamming, etc.).
2. Learn how to reduce lower bounds for fundamental algorithmic
problems to communication complexity lower bounds.
Along the way, we'll also:
3. Get exposure to lots of cool computational models and some famous
results about them --- data streams and linear sketches, compressive
sensing, space-query time trade-offs in data structures,
sublinear-time algorithms, and the extension complexity of linear
programs.
4. Scratch the surface of techniques for proving communication
complexity lower bounds (fooling sets, discrepancy bounds, and the
corruption method).
Prerequisites: undergraduate algorithms (CS161, or equivalent).
Coursework: occasional written exercises. Students taking
the course for a letter grade should also give a 20-minute
presentation on a research paper related to the course.
All lecture notes in one PDF
- Lecture 1 (Streaming: Upper and Lower Bounds) [beta version]
- Lecture 2 (Lower Bounds for 1-Way Communication Complexity) [beta version]
- Lecture 3 (Lower Bounds for Compressive Sensing) [beta version]
- Lecture 4 (Communication Complexity Boot Camp) [beta version]
- Lecture 5 (Extended Formulations) [alpha version]
- Lecture 6 (Data Structures) [beta version]
- Lecture 7 (Algorithmic Game Theory) [beta version]
- Lecture 8 (Property Testing) [beta version]
Notes on large deviation inequalities (Markov, Chebyshev, and Chernoff-Hoeffding): see Avrim Blum's notes
Exercises: We'll be continually adding to the
list here. (Most recent additions on 3/17/15.)
Detailed schedule and references:
- Lecture 1:
The streaming model of computation.
The AMS algorithm for F_2 estimation.
1-way communication complexity.
Some reductions from Disjointness.
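The AMS estimator from Lecture 1 can be sketched in a few lines of Python. This is an illustrative toy, not the production algorithm: it uses fully random +/-1 signs per distinct item (the actual AMS analysis only needs 4-wise independent hash functions, which is what makes the space usage small), and the function and parameter names are mine.

```python
import random
from statistics import mean, median

def ams_f2_estimate(stream, trials=9, copies=30, seed=0):
    """Median-of-means AMS estimate of F_2 = sum_i f_i^2 (sketch)."""
    rng = random.Random(seed)
    trial_means = []
    for _ in range(trials):
        estimates = []
        for _ in range(copies):
            sign = {}          # random +/-1 sign per distinct item
            z = 0
            for item in stream:
                if item not in sign:
                    sign[item] = rng.choice((-1, 1))
                z += sign[item]
            estimates.append(z * z)  # unbiased: E[z^2] = F_2
        trial_means.append(mean(estimates))
    return median(trial_means)

# Frequencies f = (5, 3, 2), so F_2 = 25 + 9 + 4 = 38.
stream = [1] * 5 + [2] * 3 + [3] * 2
est = ams_f2_estimate(stream)
```

Averaging reduces the variance of the basic estimator; taking the median of independent averages then boosts the success probability, which is the standard median-of-means trick from the lecture.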
- Lecture 2: Lower Bounds on 1-Way Randomized
Communication Complexity. Disjointness, Index, and Gap-Hamming.
- Lecture 3: Lower Bounds for Compressive Sensing.
(Appetizer: Randomized communication complexity of Equality.)
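The Equality appetizer from Lecture 3 admits a one-line-per-round randomized protocol. Below is a minimal public-coin sketch (shared randomness is an assumption of this toy; the lecture also treats the private-coin case), with names of my choosing.

```python
import random

def equality_protocol(x, y, rounds=32, seed=0):
    """Public-coin randomized protocol for Equality (sketch).

    Each round both parties read a shared random 0/1 vector r;
    Alice sends the single bit <x, r> mod 2 and Bob compares it
    with <y, r> mod 2. If x == y the bits always match; if x != y
    they differ with probability 1/2 per round, so the error
    probability is at most 2**-rounds.
    """
    rng = random.Random(seed)
    for _ in range(rounds):
        r = [rng.randint(0, 1) for _ in x]
        alice_bit = sum(a * b for a, b in zip(x, r)) % 2
        bob_bit = sum(a * b for a, b in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False   # Bob announces "not equal"
    return True            # never errs when x == y
```

The total communication is O(rounds) bits, independent of the input length, in contrast to the deterministic lower bound of n bits.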
- Lecture 4: Boot camp on communication complexity.
Deterministic and randomized protocols. Monochromatic rectangles,
fooling sets, and covering lower bounds. Distributional complexity
and the corruption method.
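The fooling-set argument from Lecture 4 can be checked mechanically on a small instance. The sketch below verifies, for Equality on 3-bit strings, the two defining properties of the canonical fooling set (the setup and variable names are mine):

```python
from itertools import product

def eq(x, y):
    """Equality function on bit strings."""
    return int(x == y)

# S = {(x, x)} is a fooling set for EQ: eq = 1 on all of S, and for
# any two distinct members (x, x), (y, y), one of the mixed inputs
# (x, y), (y, x) has eq = 0. Hence no 1-monochromatic rectangle
# contains two members of S, so D(EQ) >= log2 |S| = n.
n = 3
S = [(x, x) for x in product((0, 1), repeat=n)]

assert all(eq(a, b) == 1 for a, b in S)
assert all(eq(x, y) == 0 or eq(y, x) == 0
           for (x, _), (y, _) in product(S, repeat=2) if x != y)
```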
- Lecture 5:
Covers and nondeterministic communication complexity.
The Face-Vertex problem. Yannakakis's Theorem: nonnegative rank of
the slack matrix characterizes extension complexity.
Lower bound on the extension complexity of the correlation polytope.
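The central statement of Lecture 5 can be written compactly. For a polytope $P = \{x : a_i^\top x \le b_i,\ i = 1,\dots,p\} = \operatorname{conv}\{v_1,\dots,v_q\}$, with slack matrix $S_{ij} = b_i - a_i^\top v_j \ge 0$, Yannakakis's theorem says

```latex
\operatorname{xc}(P) \;=\; \operatorname{rank}_+(S),
\qquad
\operatorname{rank}_+(S) \;=\;
\min\bigl\{\, r : S = TU,\;
T \in \mathbb{R}_{\ge 0}^{p \times r},\;
U \in \mathbb{R}_{\ge 0}^{r \times q} \,\bigr\},
```

so extension complexity lower bounds reduce to nonnegative-rank lower bounds on the slack matrix, which is how the correlation-polytope bound is proved.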
- Lecture 6:
The approximate nearest neighbors problem.
From communication protocols to dimension reduction in the Hamming cube.
Asymmetric communication complexity.
The richness method.
Data structure lower bounds. Optimality of KOR dimensionality
reduction.
- Lecture 7:
Lower bounds in algorithmic game theory. The welfare maximization problem.
Multi-party communication complexity (number in hand).
Lower bounds for multi-party disjointness.
Lower bounds for equilibria.
- Lecture 8:
Lower bounds in property testing. The edge tester for monotonicity.
The case of a large range. Lower bounds from Disjointness.
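The edge tester from Lecture 8 is simple enough to sketch directly. The toy below samples random hypercube edges and rejects on any violation; the query bound in the docstring is the one from the Boolean-range analysis, and the function names are mine.

```python
import random

def edge_tester(f, n, trials=200, seed=0):
    """Edge tester for monotonicity of f: {0,1}^n -> {0,1} (sketch).

    Repeatedly sample a random hypercube edge -- a point x and a
    coordinate i, compared at the 0- and 1-settings of bit i -- and
    reject if f decreases along the edge. Monotone functions are
    always accepted; functions eps-far from monotone are rejected
    with constant probability after O(n/eps) samples.
    """
    rng = random.Random(seed)
    for _ in range(trials):
        x = [rng.randint(0, 1) for _ in range(n)]
        i = rng.randrange(n)
        lo, hi = list(x), list(x)
        lo[i], hi[i] = 0, 1
        if f(tuple(lo)) > f(tuple(hi)):
            return False  # found a violating edge
    return True

# A monotone threshold function passes; an anti-monotone one is caught.
majority = lambda x: int(sum(x) >= 2)
```

The one-sided error is what lets the lower-bound reductions from Disjointness apply: any tester must find an explicit violated pair to reject.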