Lecture 7-8

We discussed combinatorial and spectral notions of graph expansion and sketched why a random d-regular graph is an expander for d large enough. Then I presented Margulis’ expander construction and the Gabber-Galil analysis. To end, we proved that expanders require logarithmic distortion to embed into any Euclidean space.

Here is a modified analysis of the Margulis expanders.


Homework #1

This homework is due in two weeks on Wednesday, April 18th. You may work in small groups of 2 or 3. You should cite any discussions with others, books, lecture notes, or other resources that you use in coming up with your solutions.

[Reminder: The notation {A \lesssim B} is the same as {A \leq O(B)} and {A \asymp B} is defined as the conjunction of {A \lesssim B} and {B \lesssim A} .]

Problems

  1. Let {A} be the adjacency matrix of a {d} -regular graph {G} . Prove that {G} is bipartite if and only if {A} has {-d} as an eigenvalue.
  2. Consider the {L \times L} two-dimensional grid graph {G_{L,L}} . Let {n=L^2} . Prove that

    \displaystyle  \lambda_2(G_{L,L}) \asymp \frac{1}{n},

    where {\lambda_2} refers to the second eigenvalue of the combinatorial Laplacian. (Prove this without using the Spielman-Teng theorem for planar graphs.) For the lower bound, it may help to remember that {\lambda_2(P_L) \asymp \frac{1}{L^2}} , where {P_L} is the path on {L} vertices. (A numerical sanity check of this scaling, and of the one in the next problem, appears after the problem list.)

  3. Let {n=2^h-1} for some integer {h \geq 1} . Prove that if {T_h} is the complete binary tree of height {h} , we have

    \displaystyle  \lambda_2(T_h) \asymp \frac{1}{n},

    where again {\lambda_2} refers to the combinatorial Laplacian.

  4. Let {G} be an arbitrary graph. Recall that

    \displaystyle  h_G = \min_{|S| \leq n/2} \frac{|E(S)|}{|S|}.

    Let {\lambda_2} be the second eigenvalue of the combinatorial Laplacian on {G} .

    Prove that,

    \displaystyle  \lambda_2 = \min_{f : V \rightarrow \mathbb R} \max_{c \in \mathbb R} \frac{\sum_{u \sim v} |f(u)-f(v)|^2}{\sum_{u \in V} |f(u)-c|^2}\,,

    and

    \displaystyle  h_G = \min_{f : V \rightarrow \mathbb R} \max_{c \in \mathbb R} \frac{\sum_{u \sim v} |f(u)-f(v)|}{\sum_{u \in V} |f(u)-c|}\,,

    where both minima are over non-constant functions.

  5. Let {G} be a {d} -regular graph and set {M = \frac12(I+\frac{1}{d} A)} , where {A} is the adjacency matrix of {G} . Let {\tau = \tau_{\infty}} be defined as in the notes for lecture 3. Let {\mu_1, \mu_2, \ldots, \mu_n} be the eigenvalues of {M} . Prove that

    \displaystyle  \sum_{i=1}^n \mu_i^{2\tau} \leq O(1)\,,

    by evaluating {\mathrm{tr}(M^{2\tau})} in two different ways.
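
As a sanity check (not a substitute for a proof), the scalings claimed in problems 2 and 3 can be confirmed numerically. The sketch below assumes networkx and numpy are available; the helper name lambda2 and the chosen sizes are mine.

import networkx as nx
import numpy as np

def lambda2(G):
    """Second-smallest eigenvalue of the combinatorial Laplacian of G."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    return np.sort(np.linalg.eigvalsh(L))[1]

for side in [8, 16, 32]:
    G = nx.grid_2d_graph(side, side)        # the side x side grid graph
    n = G.number_of_nodes()                 # n = side^2
    print(f"grid  L={side:3d}  n={n:5d}  n*lambda_2 = {n * lambda2(G):.3f}")

for h in [4, 6, 8]:
    # networkx's balanced_tree(2, h-1) has 2^h - 1 vertices, matching problem 3
    T = nx.balanced_tree(2, h - 1)
    n = T.number_of_nodes()
    print(f"tree  h={h:3d}  n={n:5d}  n*lambda_2 = {n * lambda2(T):.3f}")

In both cases the products {n \cdot \lambda_2} should stay bounded between positive constants as {n} grows, consistent with {\lambda_2 \asymp \frac{1}{n}} .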

Remarks on Lecture 3

Today we saw the connection between mixing times of random walks and the second eigenvalue of the normalized Laplacian. Recall that {G=(V,E,w)} is an {n} -vertex, weighted graph.

First, we introduced the lazy walk matrix,

\displaystyle  W_G = \frac{1}{2} (I + A_G D_G^{-1})\,,

which is related to the normalized Laplacian via,

\displaystyle  W_G = I - \frac{1}{2} D_G^{1/2} \mathcal{L}_G D_G^{-1/2}\,.
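
One way to check this identity, assuming (as in the earlier lectures) that {\mathcal{L}_G = I - D_G^{-1/2} A_G D_G^{-1/2}} :

\displaystyle  I - \frac{1}{2} D_G^{1/2} \mathcal{L}_G D_G^{-1/2} = I - \frac{1}{2}\left(I - A_G D_G^{-1}\right) = \frac{1}{2}\left(I + A_G D_G^{-1}\right) = W_G\,.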

In particular, if we arrange the eigenvalues of {W_G} as {1=\beta_1 \geq \beta_2 \geq \cdots \geq \beta_n} , then

\displaystyle  \beta_i = 1 - \frac{\lambda_i}{2}\,,

where {\lambda_i} is the {i} th smallest eigenvalue of {\mathcal{L}_G} . Furthermore, the eigenfunctions of {W_G} are given by {g = D_G^{1/2} f} , where {f} is an eigenfunction of {\mathcal L_G} .

The eigenfunctions of {W_G} are orthogonal with respect to the inner product

\displaystyle  (f,g) = \sum_{u \in V} \frac{1}{w(u)} f(u) g(u)\,.

We will refer to this Hilbert space as {\ell^2(1/w)} . Recall that {\|f\|_{\ell^2(1/w)} = \sqrt{(f,f)} = \sqrt{\sum_{v \in V} \frac{1}{w(v)} f(v)^2}} .
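
To see the orthogonality claim, suppose {f} and {\tilde f} are eigenfunctions of {\mathcal{L}_G} that are orthogonal in the standard inner product, and let {g = D_G^{1/2} f} and {\tilde g = D_G^{1/2} \tilde f} . Then

\displaystyle  (g,\tilde g) = \sum_{u \in V} \frac{1}{w(u)} \sqrt{w(u)}\, f(u)\, \sqrt{w(u)}\, \tilde f(u) = \sum_{u \in V} f(u) \tilde f(u) = 0\,.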

If we define the stationary distribution {\pi : V \rightarrow [0,1]} by

\displaystyle  \pi(u) = \frac{w(u)}{w(V)} = \frac{w(u)}{2 w(E)},

then {W_G \pi = \pi} .
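
This can be verified directly: since {(D_G^{-1}\pi)(v) = \frac{1}{w(V)}} for every {v} , we have

\displaystyle  (A_G D_G^{-1} \pi)(u) = \sum_{v : u \sim v} \frac{w(u,v)}{w(V)} = \frac{w(u)}{w(V)} = \pi(u)\,,

and hence {W_G \pi = \frac{1}{2}(\pi + A_G D_G^{-1} \pi) = \pi} .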

Let {f_1, f_2, \ldots, f_n} be an {\ell^2(1/w)} -orthonormal system of eigenvectors for {W_G} . Now fix an initial distribution {p_0 : V \rightarrow [0,1]} and write,

\displaystyle  p_0 = \sum_{i=1}^n \alpha_i f_i\,.

Then we have,

\displaystyle  p_t = W^t p_0 = \pi + \sum_{i > 1} \alpha_i \left(1-\frac{\lambda_i}{2}\right)^t f_i\,.
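
Here we have used that the {i=1} term equals {\pi} : with the normalization {f_1 = \pi/\|\pi\|_{\ell^2(1/w)}} , one computes

\displaystyle  \alpha_1 f_1 = \frac{(p_0,\pi)}{\|\pi\|_{\ell^2(1/w)}^2}\, \pi = \frac{\sum_{u} p_0(u)/w(V)}{\sum_{u} w(u)/w(V)^2}\, \pi = \pi\,,

since {\sum_u p_0(u) = 1} and {\sum_u w(u) = w(V)} .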

In particular,

\displaystyle  \|p_t - \pi\|_{\ell^2(1/w)}^2 = \sum_{i > 1} \alpha_i^2 \left(1-\frac{\lambda_i}{2}\right)^{2t} \leq \left(1-\frac{\lambda_2}{2}\right)^{2t} \sum_{i > 1} \alpha_i^2 \leq \left(1-\frac{\lambda_2}{2}\right)^{2t} \|p_0\|_{\ell^2(1/w)}^2\,.

Let {\pi_* = \min_{u \in V} \pi(u)} be the minimum stationary measure. A particularly strong notion of mixing time is given by the following definition.

\displaystyle  \tau_{\infty} = \sup_{p_0} \min \left\{ t : \|p_t - \pi\|_{\infty} < \frac{\pi_*}{4} \right\}\,

where the supremum is taken over all initial distributions {p_0} . By time {\tau_{\infty}} , the distribution {p_t} is pointwise very close to the stationary distribution {\pi} .

Let {w^* = \max_{u \in V} w(u)} and {w_* = \min_{u \in V} w(u)} . Using our preceding calculation, we have

\displaystyle  \|p_t - \pi\|_{\infty} \leq \sqrt{w^*} \|p_t - \pi\|_{\ell^2(1/w)} \leq \sqrt{w^*} \left(1-\frac{\lambda_2}{2}\right)^{t} \|p_0\|_{\ell^2(1/w)} \leq \left(1-\frac{\lambda_2}{2}\right)^{t} \sqrt{\frac{w^*}{w_*}}\,.
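
Here the first inequality uses {|f(u)|^2 \leq w(u) \cdot \frac{f(u)^2}{w(u)} \leq w^* \|f\|_{\ell^2(1/w)}^2} for any {f : V \rightarrow \mathbb R} and {u \in V} , and the last inequality uses {\|p_0\|_{\ell^2(1/w)}^2 = \sum_{u \in V} \frac{p_0(u)^2}{w(u)} \leq \frac{1}{w_*} \sum_{u \in V} p_0(u) = \frac{1}{w_*}} .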

Observe that {\frac{w^*}{w_*} \leq \frac{1}{\pi_*}} , so if we now choose {t \asymp \frac{1}{\lambda_2} \log \frac{1}{\pi_*}} appropriately, then

\displaystyle  \|p_t - \pi\|_{\infty} < \frac{\pi_*}{4},

yielding our main result:

\displaystyle  \tau_{\infty} \lesssim \frac{1}{\lambda_2} \log \frac{1}{\pi_*}\,.
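
As a quick numerical illustration of this bound (not from the lecture), one can compute {\lambda_2} and the actual value of {\tau_{\infty}} for a small weighted graph and compare them. The example graph (a cycle with random edge weights) and all names below are mine.

import numpy as np

rng = np.random.default_rng(0)
n = 50
A = np.zeros((n, n))
for u in range(n):                        # a weighted n-cycle
    v = (u + 1) % n
    A[u, v] = A[v, u] = rng.uniform(0.5, 2.0)

w = A.sum(axis=1)                         # w(u) = total edge weight at u
W = 0.5 * (np.eye(n) + A @ np.diag(1.0 / w))   # lazy walk matrix
pi = w / w.sum()                          # stationary distribution
pi_star = pi.min()

# lambda_2 of the normalized Laplacian I - D^{-1/2} A D^{-1/2}
Ds = np.diag(w ** -0.5)
lam2 = np.sort(np.linalg.eigvalsh(np.eye(n) - Ds @ A @ Ds))[1]

# tau_infty, taking the worst case over point-mass starting distributions
tau = 0
for s in range(n):
    p = np.zeros(n)
    p[s] = 1.0
    t = 0
    while np.abs(p - pi).max() >= pi_star / 4:
        p = W @ p
        t += 1
    tau = max(tau, t)

print("measured tau_infty:", tau)
print("(1/lambda_2) log(1/pi_*):", np.log(1.0 / pi_star) / lam2)

The measured value should be at most a constant times the second printed quantity, as the bound predicts.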

One can also prove that {\tau_{\infty} \gtrsim \frac{1}{\lambda_2}} (see the exercises). In general, we will be concerned with weaker notions of mixing. We'll discuss those when they arise.

Exercises

Prove that {\tau_{\infty} \gtrsim \frac{1}{\lambda_2}} for any graph {G} .

Agenda

Our agenda for the next few weeks looks something like this:

  1. The discrete Cheeger inequality
  2. The connection between {\lambda_2} and the mixing time of random walks
  3. Planar separators, eigenvalues, and circle packings
  4. Expander graphs and rapid mixing
  5. The small-set expansion problem and the Unique Games Conjecture
  6. Higher-order Cheeger inequalities
  7. Cheeger inequalities for very large eigenvalues (Arora-Barak-Steurer)
  8. Mixing times and the spectral profile