Probabilistic Numerics: big ideas for the internals of learning machines — and now also a BOOK, w/ @maosbot and @HansKersting!
What is #ProbabilisticNumerics, and why does it matter for ML? Thread below, with a link to a free pdf of the book at the end :)
We hear a lot about *models* in ML. But what actually happens inside the “machine” during learning is a set of *numerical* tasks: Optimization (for loss minimization), Integration (for Bayesian inference), Simulation (for control, RL, physics, ...). And Linear Algebra for, well, everything.
Numerical analysis is well established, so ML’ers tend to think of numerical methods as primordial, immutable. But a numerical routine is itself a (low-level) learning machine: it _estimates_ an _unknown_ quantity from _data_, data that is _computed_, not collected.
Thus, we can understand numerical methods in the language of (Bayesian) machine learning. The result of this process is, you guessed it, Probabilistic Numerics! After a decade of research by many wonderful colleagues, this idea now produces practical, fast methods—and a book!
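To make this concrete, here is a toy sketch of Bayesian quadrature, one of the simplest probabilistic numerical methods: put a GP prior on the integrand, condition on a few (computed!) evaluations, and read off a Gaussian posterior over the integral. Illustrative code only, not from the book; the squared-exponential kernel, lengthscale, node placement, and test integrand are all arbitrary choices of ours:

```python
# Toy Bayesian quadrature in 1D: estimate Z = \int_0^1 f(t) dt under a GP prior.
# A minimal sketch, assuming a squared-exponential (SE) kernel and a uniform
# integration measure on [0, 1]; all hyperparameters here are illustrative.
import numpy as np
from scipy.special import erf

def bq(f, x, ell=0.2, sig=1.0, jitter=1e-9):
    """Gaussian posterior over Z = \int_0^1 f(t) dt, given evaluations f(x)."""
    y = f(x)
    # SE-kernel Gram matrix on the evaluation nodes
    K = sig**2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
    K += jitter * np.eye(len(x))
    # Kernel mean z_i = \int_0^1 k(t, x_i) dt (closed form for the SE kernel)
    z = sig**2 * ell * np.sqrt(np.pi / 2) * (
        erf((1 - x) / (np.sqrt(2) * ell)) + erf(x / (np.sqrt(2) * ell)))
    # Double integral \int_0^1 \int_0^1 k(t, t') dt dt' (also closed form)
    zz = sig**2 * (np.sqrt(2 * np.pi) * ell * erf(1 / (np.sqrt(2) * ell))
                   + 2 * ell**2 * (np.exp(-0.5 / ell**2) - 1))
    w = np.linalg.solve(K, z)               # quadrature weights
    mean = w @ y                            # point estimate of the integral
    var = zz - z @ np.linalg.solve(K, z)    # remaining uncertainty about Z
    return mean, var

f = lambda t: np.exp(-t) * np.sin(3 * t)    # toy integrand
x = np.linspace(0, 1, 8)                    # data that is computed, not collected
mean, var = bq(f, x)
print(f"Z ≈ {mean:.4f} ± {np.sqrt(var):.4f}")
```

Note what falls out: the posterior mean is a weighted quadrature rule like the classical ones, while the posterior variance is the new part, a built-in error bar on a *computed* quantity.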
Probabilistic numerical methods are not just a nice idea: they bring the advantages of Bayesian inference to numerics, which can yield algorithmic shortcuts (speed) and more reliable uncertainty estimates. We will give an overview of these benefits in the next thread, coming soon.
To find out why _you_ should care about #PNbook, please stay tuned for threads to follow, or simply start reading the book: