**[general info] [lecture notes] [exams and projects]**

- 10/31. Notes for Lecture 14, Notes for Lecture 18
- 10/23. Notes for Lecture 12, updated plan for future lectures
- 10/18. Notes for Lecture 16
- 10/12. Notes for Lecture 10
- 10/7. Notes for Lecture 7
- 10/3. updated notes for Lecture 5, Notes for Lecture 11
- 9/28. Notes for Lecture 5
- 9/27. Notes for Lecture 9, update to Notes for Lecture 8
- 9/22. Notes for Lecture 8
- 9/21. Notes for Lecture 6
- 9/16. Notes for Lecture 4
- 9/13. Summary of Lecture 6
- 9/11. Summary of Lecture 5
- 9/8. Notes for Lecture 2 and Notes for Lecture 3
- 9/7. Summary of Lecture 4
- 8/31. Summary of Lecture 3
- 8/30. Summary of Lecture 2
- 8/28. Notes for Lecture 1
- 4/24. Page created

**Instructor**: Luca Trevisan, 625 Soda Hall, email *luca at berkeley dot edu*

**Classes** are TuTh, 3:30-5pm, in 310 Soda

**Office hours**: Wednesdays 2-3pm, 625 Soda Hall (Starting August 30)

**Piazza**: piazza.com/berkeley/fall2017/cs294134

**Topics:**

- Properties of random graphs
- Cliques in random graphs
- Independent sets in random graphs
- Planted bisection and the stochastic block model
- Recovering a planted sparse vector in a random subspace
- Matrix completion
- Finding cuts in semi-random graph models
- Stable cuts and stable clustering
- Compressed sensing, LP decoding, and sparse FFT
- Smoothed analysis of the simplex
- Randomized hashing of worst-case data vs deterministic hashing of random data
- An introduction to average-case complexity
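
To give a concrete feel for the first few topics above, here is a minimal illustrative sketch (not course material; all function names are made up for this example): in G(n, 1/2) the largest clique has size about 2 log₂ n with high probability, yet a simple greedy scan only finds one of about half that size — the kind of gap the opening lectures examine.

```python
import random

def sample_gnp(n, p, seed=0):
    """Sample an Erdos-Renyi G(n, p) graph as a dict of adjacency sets."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def greedy_clique(adj):
    """Scan the vertices once, keeping each vertex that is adjacent to
    everything kept so far.  In G(n, 1/2) this typically returns about
    log2(n) vertices, while the true maximum clique has about 2*log2(n);
    no efficient algorithm is known to close that factor-2 gap."""
    clique = []
    for v in adj:
        if all(v in adj[u] for u in clique):
            clique.append(v)
    return clique

adj = sample_gnp(400, 0.5)
clique = greedy_clique(adj)
print(len(clique))  # typically close to log2(400), about 8.6
```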

**Prerequisites:** CS170 or equivalent, and a good understanding of discrete probability.

**Assignments:** scribing one lecture, a take-home midterm exam, and a final project.

**Lectures:**

- 8/24. Introduction, clique in random graphs. [notes]
- 8/29. Max Cut in random graphs, independent set in sparse random graphs. [summary] [notes]
- 8/31. Independent set and Max Cut in sparse random graphs. [summary] [notes]
- 9/5. Semidefinite programming and the SDP relaxation of Max Cut. [summary] [notes]
- 9/7. Optimal bounds for Max Cut in random graphs using the SDP relaxation. [summary] [notes]
- 9/12. Understanding the spectrum of the adjacency matrix of a random graph. [summary] [notes] [see sections 5 and 6 in these notes]
- 9/14. Finding planted cliques in random graphs. Introduction to random k-SAT. [notes]
  [See this paper for the planted clique algorithms]
- 9/19. More about certifying the unsatisfiability of random k-SAT. [notes (updated 9/27)]
  [this is the paper introducing the reduction from certifying unsatisfiability of random k-SAT to certifying upper bounds on independent set in random graphs]
  [this is the Feige-Ofek paper on spectral properties of random graphs with constant average degree]
- 9/21. Introducing the stochastic block model. [notes]
- 9/26. Matrix norms other than the spectral norm, and Grothendieck's inequality. [notes]
- 9/28. Approximate reconstruction in the stochastic block model. [notes]
  See also this paper.
- 10/3. Exact reconstruction in the stochastic block model. [notes]
  See this paper and these notes.
- 10/5. Exact reconstruction in the stochastic block model, continued. [notes in preparation]
- 10/10. Exact reconstruction in the stochastic block model, conclusion. Semi-random models for cut problems. [notes]
- 10/12. Notions of "stability" for cut and clustering problems. Certifying that a random linear subspace has no sparse vector.
  The first part of the lecture referenced:
  - Balcan, Blum, and Gupta on approximation stability
  - Bilu and Linial on a different notion of stability
  - Makarychev, Makarychev, and Vijayaraghavan on solving Max Cut in Bilu-Linial-stable instances
- 10/17. Finding a planted sparse vector in a random subspace using linear programming. [notes]
- 10/19. Guest lecture on tensor decomposition by Tengyu Ma (Facebook). [notes in preparation]
  See also this paper.
- 10/24. Certifying that a random subspace has no sparse vector. [notes]
  This lecture and the next cover a weaker version of a result in Section 7 of this paper.
- 10/26. Certifying that a random subspace has no sparse vector, continued. [notes in preparation]
- 10/31. An overview of tensor decomposition using semidefinite programming. [notes in preparation]
  See this paper.
- 11/2. Matrix completion using the nuclear norm. [notes in preparation]
  See this paper.
- 11/7. Guest lecture by Tengyu Ma on matrix completion using non-convex optimization. [notes in preparation]
  See this paper.
- 11/9. An introduction to average-case complexity. [notes in preparation]
- 11/14. Daniel Dadush on smoothed analysis of the simplex. [notes in preparation]
  See this paper.
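
As a similarly illustrative aside for the Max Cut lectures (a sketch under my own naming, not the course's code): single-flip local search cuts at least half the edges of any graph, while in G(n, 1/2) the maximum cut is only m/2 + O(n^{3/2}) — the sharper upper bound that the 9/7 lecture obtains via the SDP relaxation.

```python
import random

def sample_gnp(n, p, seed=0):
    """Sample an Erdos-Renyi G(n, p) graph as a dict of adjacency sets."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def local_search_cut(adj, seed=1):
    """Start from a random bipartition and flip any vertex with more
    neighbors on its own side than across.  Each flip strictly increases
    the cut, so this terminates; at a local optimum every vertex has at
    least half its neighbors across, hence at least m/2 edges are cut."""
    rng = random.Random(seed)
    side = {v: rng.randrange(2) for v in adj}
    improved = True
    while improved:
        improved = False
        for v in adj:
            same = sum(1 for u in adj[v] if side[u] == side[v])
            if 2 * same > len(adj[v]):  # flipping v strictly helps
                side[v] = 1 - side[v]
                improved = True
    return sum(1 for v in adj for u in adj[v] if side[u] != side[v]) // 2

adj = sample_gnp(200, 0.5)
m = sum(len(adj[v]) for v in adj) // 2
print(local_search_cut(adj), "of", m, "edges cut")  # always at least m/2
```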

**Plan for future lectures:**

- 10/31. Tensor decomposition using semidefinite programming
- 11/2. Matrix completion using nuclear norm
- 11/7. Matrix completion using gradient descent, guest lecture by Tengyu Ma
- 11/9. An introduction to average-case complexity
- 11/14, 11/16. Guest lectures on smoothed analysis by Daniel Dadush (CWI)
- 11/21. More on average-case complexity
- 11/28, 11/30 (dead week). Presentations