Lean formalization: Finding Colorings in One-Sided Expanders
I formalized in Lean the results in Section 4 of our paper Finding Colorings in One-Sided Expanders. Section 4 proves one of the paper's core structural results: If a graph is a one-sided expander, then its adjacency matrix has only a constant number of eigenvalues below a certain negative threshold.
The Lean code was generated with ChatGPT Pro 5.1 and GPT-5.1-Codex-Max. The Lean repo is on GitHub.
Code overview
The main file is Colorexpanders.lean. Theorem 4.1 corresponds to the theorem large_bottom_rank_implies_large_top_rank, and Corollary 4.2 corresponds to the theorem small_top_rank_implies_small_bottom_rank.
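Corollary 4.2 follows from Theorem 4.1 by contraposition: an upper bound on the number of large eigenvalues forces an upper bound on the number of very negative ones. Purely to illustrate that shape, here is a hypothetical signature of my own (the actual statement in Colorexpanders.lean may differ; the proof body is left as a sorry):

```lean
-- Hypothetical shape of Corollary 4.2, for illustration only; the actual
-- statement is in Colorexpanders.lean and may differ in its exact form.
-- Assumed context: {n : Type*} [Fintype n] with Mathlib imported.
example [Nonempty n] (A : Matrix n n ℝ) (hHerm : A.IsHermitian)
    (hNonneg : ∀ i j, 0 ≤ A i j) (hOp : ‖A‖ ≤ (1 : ℝ))
    {μ σ : ℝ} (hμ : 0 ≤ μ) (hσ₀ : 0 < σ) (hσ₁ : σ < 1)
    {k : ℕ} (hTop : topThresholdRank A hHerm ((μ ^ (2 : ℕ) - σ) / (1 - σ)) ≤ k) :
    (bottomThresholdRank A hHerm μ : ℝ) ≤ (k : ℝ) / σ ^ 2 := by
  -- Would follow from large_bottom_rank_implies_large_top_rank by contraposition:
  -- if the bottom rank exceeded k / σ^2, the theorem would force more than k
  -- eigenvalues above the threshold, contradicting hTop.
  sorry
```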
The statement of Theorem 4.1 is formalized as follows:
theorem large_bottom_rank_implies_large_top_rank
[Nonempty n] (A : Matrix n n ℝ)
(hHerm : A.IsHermitian)
(hNonneg : ∀ i j, 0 ≤ A i j)
(hOp : ‖A‖ ≤ (1 : ℝ))
{μ : ℝ} (hμ : 0 ≤ μ)
{t : ℕ} (hBottom : bottomThresholdRank A hHerm μ ≥ t)
{σ : ℝ} (hσ₀ : 0 < σ) (hσ₁ : σ < 1) :
(topThresholdRank A hHerm ((μ^(2 : ℕ) - σ) / (1 - σ)) : ℝ)
≥ σ^2 * (t : ℝ)
In words, this says that if A is a real Hermitian matrix with non-negative entries and operator norm at most 1, and if A has at least t eigenvalues less than or equal to -μ, then for every 0 < σ < 1, it has at least σ^2 * t eigenvalues greater than or equal to (μ^2 - σ) / (1 - σ).
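As a concrete sanity check on the threshold formula, one can verify the arithmetic for a particular choice of parameters. The following example is not part of the repo; it assumes only the file's Mathlib imports. With μ = 1/2 and σ = 1/8, the threshold (μ^2 - σ) / (1 - σ) evaluates to 1/7, so the theorem guarantees at least σ^2 * t = t/64 eigenvalues greater than or equal to 1/7:

```lean
-- Illustrative arithmetic check, not from the repo: with μ = 1/2 and σ = 1/8,
-- the top threshold (μ^2 - σ) / (1 - σ) equals (1/8) / (7/8) = 1/7.
example : ((1/2 : ℝ) ^ (2 : ℕ) - 1/8) / (1 - 1/8) = 1/7 := by norm_num
```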
The statement uses the definitions topThresholdRank and bottomThresholdRank, which are:
noncomputable def topThresholdRank
(A : Matrix n n ℝ) (hA : A.IsHermitian) (τ : ℝ) : ℕ :=
Fintype.card { i : n // τ ≤ hA.eigenvalues i }
noncomputable def bottomThresholdRank
(A : Matrix n n ℝ) (hA : A.IsHermitian) (μ : ℝ) : ℕ :=
Fintype.card { i : n // hA.eigenvalues i ≤ -μ }
That is, topThresholdRank A hHerm τ counts the eigenvalues of A that are at least τ, while bottomThresholdRank A hHerm μ counts the eigenvalues that are at most -μ. In both cases, eigenvalues are counted with multiplicity.
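One basic sanity property of these definitions is that each threshold rank is bounded by the dimension, since it counts a subtype of the index type n, which indexes the eigenvalues. The following is a sketch of my own rather than a lemma from the repo; it assumes the file's variable context ({n : Type*} [Fintype n]) and Mathlib's Fintype.card_le_of_injective:

```lean
-- Sketch, not from the repo (assumed context: {n : Type*} [Fintype n],
-- Mathlib imported): a threshold rank counts a subtype of the index set,
-- so it is at most the total number of indices, i.e. of eigenvalues.
example (A : Matrix n n ℝ) (hA : A.IsHermitian) (τ : ℝ) :
    topThresholdRank A hA τ ≤ Fintype.card n := by
  unfold topThresholdRank
  exact Fintype.card_le_of_injective Subtype.val Subtype.val_injective
```

The same argument applies verbatim to bottomThresholdRank.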