Edward Raff
Chief Scientist @BoozAllen. Chair @CamlisOrg. Author of #InsideDeepLearning (@ManningBooks) and of the JSAT machine learning library. PhD from and Visiting Prof at @UMBC.
Jul 13, 2023 7 tweets 3 min read
This is getting a lot of traction and was basically my PhD thesis area! Compression algorithms are a super cool way to do a lot of very useful AI/ML, and can be more scalable than other options! JOIN ME IN A THREAD 🧵

You use compression to measure similarity, which is awesome, but it is also slow. This ACL paper cited/noted my first work in this space. If you know which compression algorithm you are going to use, you can pull the compression apart into an ML approach directly and make it faster!
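For readers who want to see the idea concretely, here is a minimal sketch (not from the thread itself) of the Normalized Compression Distance (NCD), the compression-based similarity measure this line of work builds on. The choice of gzip and the helper names are my own illustration:

```python
# Minimal sketch of compression-based similarity via the Normalized
# Compression Distance (NCD). Standard library only; gzip is an
# arbitrary choice of compressor for illustration.
import gzip

def compressed_size(data: bytes) -> int:
    """Length of the gzip-compressed representation of `data`."""
    return len(gzip.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y): smaller values mean more similar inputs."""
    cx, cy = compressed_size(x), compressed_size(y)
    cxy = compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar strings share structure, so they compress well together
# and get a lower distance than unrelated strings.
print(ncd(b"machine learning with compression", b"machine learning with gzip"))
print(ncd(b"machine learning with compression", b"completely unrelated text!"))
```

The slowness he mentions comes from re-running the compressor for every pair of items; "hacking out" the compressor into a direct ML approach avoids that pairwise cost.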
Mar 30, 2021 12 tweets 5 min read
Chapters 5 & 6 of Inside Deep Learning are out! manning.com/books/inside-d… The first 4 introduce more of a "classic" set of neural networks as they existed in ~2000 and earlier, but using a modern framework. These two chapters have the job of lifting the reader to 2021.

Chapter 5 does this by focusing on how we solve the optimization problem of neural networks. Turns out we can do much better than plain SGD, and I want you to see how big a difference these changes can make. The current favorite son of optimization, Adam, will not be here forever!
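As a small illustration of the point (my own sketch, not code from the book): in PyTorch, moving beyond plain SGD is often a one-line change, yet it can dramatically change convergence. The `model` and data here are hypothetical stand-ins:

```python
# Sketch: swapping the optimizer in a standard PyTorch training loop.
import torch

model = torch.nn.Linear(10, 1)                         # stand-in model
data = [(torch.randn(32, 10), torch.randn(32, 1))]     # stand-in batch
loss_fn = torch.nn.MSELoss()

# Plain SGD vs. Adam: one line differs, behavior can differ a lot.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for x, y in data:
    optimizer.zero_grad()          # clear gradients from the last step
    loss = loss_fn(model(x), y)    # forward pass + loss
    loss.backward()                # backpropagate
    optimizer.step()               # update parameters
```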
Dec 24, 2020 10 tweets 4 min read
So big personal news: I've been working on a book w/ @ManningBooks and it's now in early access! manning.com/books/inside-d… I'm targeting what I think is an under-served area: the middle ground between "give me a tool" and "CS/Stats/ML PhD graduate book", one that gives both utility and understanding.

Throughout the book I'm not going to shy away from showing the math involved. I want people to learn it because it's a concise way to read and describe what is happening. But I'm going to try and explain the math at an intuitive level & annotate equations back to English & code.
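To give a flavor of what "annotating equations back to English & code" means (a toy example of my own, not taken from the book), consider the linear model $\hat{y} = Xw + b$:

```python
# Toy illustration: the equation y_hat = X @ w + b, annotated.
import numpy as np

X = np.random.randn(5, 3)   # X: 5 examples, 3 features each
w = np.random.randn(3)      # w: one weight per feature
b = 0.5                     # b: bias / intercept

# y_hat_i = sum_j X_ij * w_j + b
# In English: "each prediction is a weighted sum of its features, plus a bias."
y_hat = X @ w + b
print(y_hat.shape)          # (5,) -> one prediction per example
```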