Discover and read the best of Twitter Threads about #pnbook


Random numbers (e.g. from PRNGs) are everywhere in AI—but are they actually a good idea? In particular, are random numbers the best we can do for numerical problems like linear algebra, integration and optimisation? Probabilistic Numerics (PN) has (very radical!) views 🧵 [Image: a chaotic, glowing, six-sided die]
In numerical integration, e.g. estimating ∫_{-3}^{3} f(x) dx, one popular approach is Monte Carlo (named after the casino), which uses random numbers to select the locations x_i of evaluations f(x_i).
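A minimal sketch of exactly this baseline (my illustration, not from the thread; it assumes uniform sampling on [-3, 3] and uses the integrand f defined in the second thread below):

```python
# Plain Monte Carlo estimate of F = integral of f over [-3, 3]:
# random numbers choose the evaluation locations x_i.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.exp(-np.sin(3 * x) ** 2 - x ** 2)

a, b = -3.0, 3.0
n = 10_000                      # number of random evaluations
x = rng.uniform(a, b, size=n)   # random evaluation locations x_i
F_mc = (b - a) * f(x).mean()    # MC estimate of the integral
print(F_mc)                     # converges at the slow O(1/sqrt(n)) rate
```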

The PN approach to numerical integration is called Bayesian quadrature (BQ).
BQ—e.g. our algorithm WSABI—can be faster than Monte Carlo, using fewer evaluations to achieve a required level of error. In this example, Monte Carlo takes minutes (using thousands of evaluations) to reach an error that WSABI achieves in seconds (using a handful of evaluations).
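WSABI itself adds a square-root warping and clever point selection, so the following is only a hedged sketch of plain ("vanilla") Bayesian quadrature, not WSABI: put a Gaussian-process prior with an RBF kernel on f, condition on a few evaluations, and integrate the posterior mean in closed form. The kernel hyperparameters and the evenly spaced nodes are illustrative assumptions:

```python
# Vanilla Bayesian quadrature sketch: the posterior mean of the
# integral is a weighted sum of the function evaluations, with
# weights derived from the GP model rather than equal MC weights.
import numpy as np
from scipy.special import erf

def f(x):
    return np.exp(-np.sin(3 * x) ** 2 - x ** 2)

a, b = -3.0, 3.0
ell, sigma2 = 0.3, 1.0            # assumed RBF hyperparameters

def k(x1, x2):                    # RBF kernel matrix
    return sigma2 * np.exp(-(x1[:, None] - x2[None, :]) ** 2 / (2 * ell ** 2))

def kernel_mean(x):               # z_i = integral of k(x, x_i) over [a, b], closed form
    c = np.sqrt(2) * ell
    return sigma2 * ell * np.sqrt(np.pi / 2) * (erf((b - x) / c) - erf((a - x) / c))

X = np.linspace(a, b, 12)         # a handful of evaluation nodes
K = k(X, X) + 1e-10 * np.eye(len(X))   # jitter for numerical stability
weights = np.linalg.solve(K, kernel_mean(X))
F_bq = weights @ f(X)             # posterior mean of the integral F
print(F_bq)
```

Even this vanilla version shows the key idea: each evaluation is weighted by how informative the GP model considers it, which is why so few evaluations can suffice.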
What is Probabilistic Numerics (PN)? To illustrate, take one core use case of PN: computing integrals. Most integrals are intractable (life is hard), so we must often integrate numerically. Sadly, numerical integrators are unreliable & computationally expensive.

PN can help! 🧵
Consider

F = ∫_{-3}^{3} f(x) dx
f(x) = exp(-(sin 3x)^2 - x^2)

The integrand f(x) here is simple: ~20 characters, only atomic functions, and it can be evaluated in nanoseconds. However, the integral F is intractable! Let's try to calculate F numerically using PN.
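A quick, hedged sanity check of that "nanoseconds" claim (my sketch, not from the thread; timings vary by machine and include Python call overhead):

```python
# Time the integrand: a one-liner that evaluates in roughly
# hundreds of nanoseconds per call on ordinary hardware.
import math
import timeit

def f(x):
    return math.exp(-math.sin(3 * x) ** 2 - x ** 2)

n = 1_000_000
t = timeit.timeit(lambda: f(0.5), number=n)
print(f"{t / n * 1e9:.0f} ns per evaluation")  # cheap; yet F has no closed form
```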
The central idea of Probabilistic Numerics is to treat a numerical method as a *learning machine*. What about when the numerical method is an integrator? Well, a learning machine

• receives data,

• predicts and then

• takes actions.
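A hedged sketch of that loop for an integrator, reusing the vanilla-BQ posterior from the first thread above (the kernel, its hyperparameters, and the max-variance action rule are illustrative assumptions, not the book's prescribed algorithm):

```python
# The integrator as a learning machine:
# receive data -> predict F -> act by choosing the next evaluation.
import numpy as np
from scipy.special import erf

def f(x):
    return np.exp(-np.sin(3 * x) ** 2 - x ** 2)

a, b = -3.0, 3.0
ell, sigma2 = 0.3, 1.0                    # assumed RBF hyperparameters

def k(x1, x2):
    return sigma2 * np.exp(-(x1[:, None] - x2[None, :]) ** 2 / (2 * ell ** 2))

def kernel_mean(x):                       # integral of k(x, x_i) over [a, b], closed form
    c = np.sqrt(2) * ell
    return sigma2 * ell * np.sqrt(np.pi / 2) * (erf((b - x) / c) - erf((a - x) / c))

grid = np.linspace(a, b, 400)             # candidate actions
X = np.array([0.0])                       # start from a single node
for step in range(10):
    y = f(X)                                          # 1. receive data
    K = k(X, X) + 1e-10 * np.eye(len(X))
    F_hat = kernel_mean(X) @ np.linalg.solve(K, y)    # 2. predict the integral
    KX = k(grid, X)
    alpha = np.linalg.solve(K, KX.T)
    var = sigma2 - np.sum(KX * alpha.T, axis=1)       # pointwise posterior variance
    X = np.append(X, grid[np.argmax(var)])            # 3. act where most uncertain
    print(f"step {step}: F ≈ {F_hat:.4f}")
```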
It’s out! Oh boy!

Probabilistic Numerics: big ideas for the internals of learning machines — and now also a BOOK, w/ @maosbot and @HansKersting!

What is #ProbabilisticNumerics, and why does it matter for ML? Thread below, with a link to a free pdf of the book at the end :) [Image: a picture of the book]
We hear a lot about *models* in ML. But what actually happens within the “machine” during learning is a *numerical* task: Optimization (for loss minimization), Integration (for Bayesian inference), Simulation (for control, RL, physics, ...). Linear Algebra for, well, everything.
Numerical analysis is well established, so ML’ers tend to think of numerical methods as primordial, immutable. But a numerical routine is itself a (low-level) learning machine! Because it _estimates_ an _unknown_ quantity from _data_: data that is _computed_, not collected.
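A toy, hedged illustration of that reading (assumed example of mine: plain gradient descent on a quadratic loss): the routine's "data" are gradients it computes itself, and its iterate is an evolving estimate of the unknown minimizer.

```python
# A numerical routine read as a learning machine: gradient descent
# estimates the unknown minimizer x* from data it computes itself
# (gradient evaluations), rather than data it collects.
def grad(x):                 # "data source": a computed gradient
    return 2.0 * (x - 1.5)   # toy quadratic loss L(x) = (x - 1.5)^2

x, lr = 0.0, 0.1             # initial estimate of the unknown x*
for _ in range(50):
    g = grad(x)              # receive data (computed, not collected)
    x = x - lr * g           # update the estimate of x*
print(x)                     # ≈ 1.5, the estimated minimizer
```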
