I proved a super simple helper lemma.
In English the proof is 1 line.
In Lean 4, it took me around 50 lines of code. 1/n
The proof took me around 5 hours (lol).
Any help shortening would be appreciated.
2/n
The trickiest things: 1. You can't take basic math -- e.g., commutativity, or automatic casting of naturals to reals -- for granted. 2. Tactics feel simultaneously powerful and weak. E.g., sometimes 'linarith' proves an inequality instantly, sometimes it fails. 3. GPT-4 is not very helpful.
3/n
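To illustrate points 1 and 2, here are two standalone toy examples (not from my proof) of the kind of thing I mean:

```lean
import Mathlib

-- `linarith` happily closes goals that follow from a linear combination
-- of the hypotheses...
example (x y : ℝ) (hx : 0 ≤ x) (hxy : x ≤ y) : x ≤ 2 * y := by
  linarith

-- ...but casts between ℕ and ℝ must be handled explicitly,
-- e.g. with `push_cast`, before algebra tactics can fire.
example (n : ℕ) : ((n : ℝ) + 1) = ((n + 1 : ℕ) : ℝ) := by
  push_cast
  ring
```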
Next up for me is figuring out mathlib's support for convexity, differentiability, and the fundamental theorem of calculus. I'm also about 7 chapters into the Lean 4 tutorial; I'll continue working through that in parallel.
I'm posting my code to a GitHub repo, which I'll link below.
4/n
You can find the code at the following link:
The bound had also been stated, but not proven, in a 1994 paper. The 2015 paper gave a fairly simple, self-contained proof.
This question had been put to a few prominent people in the field. They didn't know the answer offhand, so the bound isn't well known -- though I'm sure they could have figured it out.
I didn't know this literature at all, and I tried not to read it except through interaction with LLMs.
The solutions the LLMs gave always had a flaw, and their approach didn't look like the one in the 2015 paper.
Next formalization is the "gradient inequality" for smooth convex functions.
The proof (next tweet) is a few sentences of English.
My Lean 4 proof (below) is roughly 350 lines of code.
1/n
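For reference, here's roughly how the statement looks in Lean 4 with mathlib (a sketch only -- the hypotheses and names may differ slightly from my actual file, and the proof is elided):

```lean
import Mathlib

open scoped RealInnerProductSpace

-- Statement sketch of the gradient inequality for a convex function `f`
-- whose gradient at every point is given by `f'`. Proof elided.
example {F : Type*} [NormedAddCommGroup F] [InnerProductSpace ℝ F]
    [CompleteSpace F]
    (f : F → ℝ) (f' : F → F)
    (hconv : ConvexOn ℝ Set.univ f)
    (hgrad : ∀ x, HasGradientAt f (f' x) x)
    (x y : F) :
    f x + ⟪f' x, y - x⟫_ℝ ≤ f y := by
  sorry
```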
The proof in English is short. It is a consequence of a simple convex-analysis fact:
If f is a convex function with f(0) = 0 and f(y) ≥ o(y) as y → 0, then f(y) ≥ 0 globally.
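One way to render that fact as a Lean statement (a sketch: the hypothesis `ho` is just one of several reasonable ways to formalize "f(y) ≥ o(y)", and the proof is elided):

```lean
import Mathlib

open Filter Topology

-- Statement sketch: a convex f with f 0 = 0 that eventually dominates
-- every linear lower bound -ε * |y| near 0 is nonnegative everywhere.
example (f : ℝ → ℝ)
    (hconv : ConvexOn ℝ Set.univ f)
    (h0 : f 0 = 0)
    (ho : ∀ ε > 0, ∀ᶠ y in 𝓝 (0 : ℝ), -ε * |y| ≤ f y) :
    ∀ y, 0 ≤ f y := by
  sorry
```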
To formalize this proof, I first had to learn whether mathlib had support for gradients. Interestingly, this was only added in October: leanprover.zulipchat.com/#narrow/stream…
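For anyone searching: the relevant file is Mathlib.Analysis.Calculus.Gradient.Basic, which defines `gradient` (via the Riesz representation of `fderiv`) and `HasGradientAt`. A tiny sketch of the API as I understand it (double-check the names against current mathlib):

```lean
import Mathlib

-- `gradient f x` is the vector representing `fderiv ℝ f x` via the inner
-- product; `HasGradientAt f g x` says `g` is the gradient of `f` at `x`.
-- This example only exercises the types; it carries no proof content.
noncomputable example (f : ℝ → ℝ) (x : ℝ) : ℝ :=
  gradient f x
```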