Particle filters are general algorithms for inferring the state of a system with noisy dynamics and noisy measurements. Here's an example with a robot in a circular room. Red=true robot, blue=guesses, occasional red line=noisy range sensor measurement. Details in thread 1/
A particle filter (PF) does the same job as a Kalman filter (KF). Generally: you have a system in an unknown state, evolving over time according to some known dynamics + noise, and you occasionally get noisy sensor data. The task is to infer the current state 2/
In a KF, your current uncertainty about the system is represented by a Gaussian, the dynamics are linear + Gaussian noise, and measurements have Gaussian noise. All those assumptions together mean that at any time your uncertainty is captured by a multivariate Gaussian 3/
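To make those assumptions concrete, here's a minimal 1-D Kalman filter sketch in Python. The dynamics coefficient `a`, process noise variance `q`, and measurement noise variance `r` are made-up illustrative values, not parameters from the animation.

```python
def kf_predict(mean, var, a=1.0, q=0.1):
    # Linear dynamics x' = a*x + noise (variance q): a Gaussian belief
    # pushed through these dynamics stays Gaussian.
    return a * mean, a * a * var + q

def kf_update(mean, var, z, r=0.5):
    # Noisy measurement z = x + noise (variance r).
    k = var / (var + r)                      # Kalman gain
    return mean + k * (z - mean), (1.0 - k) * var

mean, var = 0.0, 1.0                         # initial Gaussian belief
mean, var = kf_predict(mean, var)            # between measurements
mean, var = kf_update(mean, var, z=0.7)      # when a measurement arrives
```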
In a PF, your uncertainty of the system is represented by a discrete collection of possible states ('particles'). Each particle evolves according to the dynamics of the system (+ sampling whatever noise is involved) 4/
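A sketch of that propagation step, assuming a hypothetical 2-D state and a placeholder drift function standing in for the real dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000
particles = rng.uniform(-1.0, 1.0, size=(N, 2))   # N equally plausible states

def propagate(particles, dt=0.1, noise_std=0.05):
    # Push every particle through the (possibly nonlinear) dynamics and
    # sample fresh process noise for each one. The drift used here is just
    # an illustrative placeholder, not the robot's actual dynamics.
    drift = np.stack([np.cos(particles[:, 1]), np.sin(particles[:, 0])], axis=1)
    return particles + dt * drift + rng.normal(0.0, noise_std, size=particles.shape)

particles = propagate(particles)
```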
Initially, the 'particles' are equally weighted. When you make a measurement, you reweight each particle according to the likelihood of the measurement. When the weights become sufficiently unbalanced, you resample with replacement according to weight 5/
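A sketch of the reweight/resample step, assuming a made-up range-like sensor (distance from the origin plus Gaussian noise) as the measurement model:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 1000
particles = rng.uniform(-1.0, 1.0, size=(N, 2))
weights = np.full(N, 1.0 / N)                      # initially equally weighted

def reweight(particles, weights, z, sensor_std=0.1):
    # Multiply each weight by the likelihood of measurement z given that
    # particle's state; the 'sensor' here is a stand-in that reads the
    # distance from the origin plus Gaussian noise.
    predicted = np.linalg.norm(particles, axis=1)
    weights = weights * norm.pdf(z, loc=predicted, scale=sensor_std)
    return weights / weights.sum()

def resample(particles, weights):
    # Multinomial resampling: draw N particles with replacement in
    # proportion to weight, then reset the weights to uniform.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

weights = reweight(particles, weights, z=0.8)
# One common rule: resample when the effective sample size 1/sum(w^2)
# drops below N/2 (the simulation in the thread resamples every time).
if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
    particles, weights = resample(particles, weights)
```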
It's essentially Bayesian inference with discrete blobs of probability mass. (In the simulation, I've done the re-sampling step after every measurement, because it makes it clearer what's going on) 6/
Here's what happens to the state of the particle filter if the robot never makes a measurement 7/
Advantages of PFs: 1) The dynamics don't have to be linear; they can be anything at all. 2) The noise in the dynamics and the measurement can follow any distribution whatsoever. 3) They're really simple to code up 8/
Disadvantages: 1) being sample-based, they won't work well in very high dimensions. 2) if you end up with no samples near the true state of the system, you can't really recover 9/
Sometimes you can model systems with a hybrid KF/PF approach. E.g., you have a variable x whose distribution p(x) you represent with samples via a PF, and another variable y whose conditional p(y|x) you model as a Gaussian with a KF (so you have one KF per particle) 10/
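A structural sketch of that hybrid (often called a Rao-Blackwellised particle filter), where every particle carries a sample of x plus its own Gaussian belief over y. Here propagate_x, kf_predict and kf_update are placeholders for the problem-specific models:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class HybridParticle:
    x: float             # sampled value of the PF-handled variable
    y_mean: np.ndarray   # Gaussian belief over y, conditional on this particle's x
    y_cov: np.ndarray
    weight: float

def step(p, z, propagate_x, kf_predict, kf_update):
    # Sample x forward, run this particle's own Kalman filter on y | x,
    # and reweight by the marginal likelihood of the measurement z
    # returned by the KF update.
    x = propagate_x(p.x)
    y_mean, y_cov = kf_predict(p.y_mean, p.y_cov, x)
    y_mean, y_cov, lik = kf_update(y_mean, y_cov, x, z)
    return HybridParticle(x, y_mean, y_cov, p.weight * lik)
```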
This can be especially useful in very high-dimensional systems where a very large collection of variables can be modelled by a multivariate Gaussian, conditional on a small number of variables that you handle with a PF 11/11
Err, 12/11. Here's a kernel density estimate of the posterior distribution over time, computed from the particle positions. It's fun to watch the distribution spreading out between measurements before dramatically collapsing
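One way to compute such a kernel density estimate from the particle positions, shown here with scipy's gaussian_kde on stand-in data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
particles = rng.normal(size=(500, 2))            # stand-in particle positions

# gaussian_kde expects variables in rows and samples in columns.
kde = gaussian_kde(particles.T)

# Evaluate the estimated posterior density on a grid (e.g. for a heatmap).
xs, ys = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
```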