Remarks on Lecture 2

In this lecture, we generalized our setting to weighted graphs {G=(V,E,w)} where {w : E \to \mathbb R^+} is a positive weight function on edges. We also looked at the basic spectral theory of the normalized Laplacian, defined by

{\mathcal L_G = I - D_G^{-1/2} A_G D_G^{-1/2}} ,

which operates on functions {f : V \to \mathbb R} via

{\displaystyle \mathcal L_G f(u) = f(u) - \sum_{v \sim u} \frac{w(u,v)}{\sqrt{w(u) w(v)}} f(v)} .
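To make this concrete, here is a short numpy sketch that builds {\mathcal L_G} from this definition and applies it to a function; the three-vertex weighted path used here is an illustrative example, not one from the lecture.

```python
import numpy as np

# Weighted adjacency matrix of a small path graph (edge weights illustrative).
A = np.array([
    [0.0, 2.0, 0.0],
    [2.0, 0.0, 1.0],
    [0.0, 1.0, 0.0],
])

# w(u) is the weighted degree of u.
w = A.sum(axis=1)

# Normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(w))
L = np.eye(len(w)) - D_inv_sqrt @ A @ D_inv_sqrt

# Applying L to a function f: (L f)(u) = f(u) - sum_{v ~ u} w(u,v)/sqrt(w(u) w(v)) f(v).
f = np.array([1.0, 0.0, -1.0])
print(L @ f)
```

Since {\mathcal L_G} is symmetric and the graph is connected, its smallest eigenvalue is {0} (achieved by {f = D^{-1/2} \mathbf{1}}).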

In particular, we associated with {\mathcal L_G} the normalized Rayleigh quotient,

{\displaystyle \mathcal R_G(f) = \frac{\sum_{u \sim v} w(u,v) (f(u)-f(v))^2}{\sum_{u \in V} w(u) f(u)^2}} ,

and we showed that we can express the second-smallest eigenvalue as

{\lambda_2(\mathcal L_G) = \min \{ \mathcal R_G(f) : \sum_{v \in V} w(v) f(v) = 0 \}} .
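One can check this variational characterization numerically. The sketch below (the small weighted graph is illustrative) confirms that the second eigenvector of {\mathcal L_G}, pulled back to a function on {V} via {f = D^{-1/2} g}, satisfies the weighted mean-zero constraint and achieves {\mathcal R_G(f) = \lambda_2}.

```python
import numpy as np

# A small weighted graph (weights illustrative): a 4-cycle with one weighted chord.
A = np.array([
    [0.0, 1.0, 0.5, 1.0],
    [1.0, 0.0, 1.0, 0.0],
    [0.5, 1.0, 0.0, 1.0],
    [1.0, 0.0, 1.0, 0.0],
])
w = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(w))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

# eigh returns eigenvalues in ascending order; column 1 is the lambda_2 eigenvector.
lam, g = np.linalg.eigh(L)
f = D_inv_sqrt @ g[:, 1]   # eigenvector of L pulled back to a function on V

def rayleigh(f):
    # Numerator sums over each edge once, matching the quotient above.
    num = sum(A[u, v] * (f[u] - f[v])**2
              for u in range(4) for v in range(u + 1, 4))
    return num / (w @ (f**2))

# R_G(f) equals lambda_2, and sum_v w(v) f(v) = 0.
print(lam[1], rayleigh(f), w @ f)
```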

We defined the expansion of a subset {S \subseteq V} by

{\phi_G(S) = \mathcal R_G(\mathbf{1}_S)} ,

where {\mathbf{1}_S} is the characteristic function of {S} ; equivalently, {\phi_G(S)} is the total weight of the edges crossing from {S} to {V \setminus S} , divided by {w(S)} .

Finally, we defined the expansion of a graph by

{\Phi_G = \min \left\{ \phi_G(S) : S \subseteq V,\; 0 < w(S) < w(V)/2\right\}} .
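On small graphs one can compute {\Phi_G} by brute force over all subsets. The sketch below uses an illustrative example (not from the lecture): two unit-weight triangles joined by a bridge edge, a graph with an obvious bottleneck.

```python
import numpy as np
from itertools import combinations

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2,3).
A = np.zeros((6, 6))
for u, v in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]:
    A[u, v] = A[v, u] = 1.0
w = A.sum(axis=1)                      # weighted degrees w(u)

def phi(S):
    """phi_G(S) = R_G(1_S): weight crossing the cut, divided by w(S)."""
    mask = np.zeros(6, dtype=bool)
    mask[list(S)] = True
    return A[mask][:, ~mask].sum() / w[mask].sum()

# Brute-force Phi_G over all S with 0 < w(S) < w(V)/2.
half = w.sum() / 2
Phi = min(phi(S) for k in range(1, 6)
          for S in combinations(range(6), k)
          if w[list(S)].sum() < half)
print(Phi)
```

Note that the strict inequality {w(S) < w(V)/2} excludes the natural cut {S = \{0,1,2\}} here, since its weight is exactly {w(V)/2} ; the minimum is instead attained by a set such as {S = \{0,1\}}.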

We then stated and (almost) proved the following discrete Cheeger inequality.

Theorem 1. For any weighted graph {G=(V,E,w)} , we have

{\displaystyle \frac{\lambda_2}{2} \leq \Phi_G \leq 2 \sqrt{\lambda_2}} ,

where {\lambda_2 = \lambda_2(\mathcal L_G)} .
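As a sanity check, we can verify both inequalities numerically on the same illustrative "dumbbell" graph (two triangles joined by a bridge, not an example from the lecture):

```python
import numpy as np
from itertools import combinations

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2,3).
A = np.zeros((6, 6))
for u, v in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]:
    A[u, v] = A[v, u] = 1.0
w = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(w))
L = np.eye(6) - D_inv_sqrt @ A @ D_inv_sqrt
lam2 = np.linalg.eigvalsh(L)[1]        # second-smallest eigenvalue

def phi(S):
    mask = np.zeros(6, dtype=bool)
    mask[list(S)] = True
    return A[mask][:, ~mask].sum() / w[mask].sum()

half = w.sum() / 2
Phi = min(phi(S) for k in range(1, 6)
          for S in combinations(range(6), k)
          if w[list(S)].sum() < half)

# Both sides of the Cheeger inequality: lambda_2 / 2 <= Phi_G <= 2 sqrt(lambda_2).
print(lam2 / 2 <= Phi <= 2 * np.sqrt(lam2))
```

The bottleneck makes {\lambda_2} small (about {0.2} here), while {\Phi_G = 1/2}, so both inequalities hold with room to spare.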

Exercises (optional)

  1. Prove that for the normalized Laplacian, just as for the combinatorial Laplacian, the number of connected components in a graph G is precisely the multiplicity of the smallest eigenvalue.
  2. We saw that the right-hand inequality in Theorem 1 is asymptotically tight. Prove that the left-hand inequality is asymptotically tight for the complete graph on n nodes.

5 thoughts on “Remarks on Lecture 2”

  1. Is (2) still true if we allow arbitrary edge weights? It seems like both lambda_2 and the edge expansion scale proportionally… but I couldn’t figure out the asymptotic bound.

    • Both the Rayleigh quotient and the expansion (which is just the Rayleigh quotient of a characteristic function) are homogeneous in the weights, so if you scale the weights uniformly, the bound remains true. If you put arbitrary weights on the edges, you can obtain essentially any Laplacian, so in that case, no, it doesn’t hold.
