Robert Peharz · Nov 28
Happy to advertise our paper, "Active Bayesian Causal Inference," to be presented at #NeurIPS2022, @NeurIPSConf.

openreview.net/forum?id=r0bjB…
Causal models are powerful reasoning tools, but usually we don't know which model is the correct one. Traditionally, one first aims to find the correct causal model from data, which is then used for causal reasoning.
How about a Bayesian approach? I.e., we put a prior over a class of causal models, and, when observing new data, we simply update the posterior using Bayes' rule?
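In symbols (notation here is my own sketch, not necessarily the paper's): writing a causal model as a pair of graph and mechanism parameters, the update on data D is just Bayes' rule:

```latex
% Sketch notation (mine, not necessarily the paper's): a causal model is a
% pair m = (G, \theta) of graph G and mechanism parameters \theta; D is data.
p(G, \theta \mid D) \;\propto\; p(D \mid G, \theta)\, p(\theta \mid G)\, p(G)
```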

Advantage 1. We maintain our uncertainty about the causal model in a rigorous way (because it's Bayesian 🥰😍).
Advantage 2. This uncertainty can be used for causal predictions via the Bayesian posterior predictive distribution. Specifically, we define the "causal query function" q, which represents what we want to know from the model (the whole graph, particular edges, some causal effect, etc.).
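Concretely (again in sketch notation of my own, assuming a deterministic query q): the posterior over the query is obtained by pushing the model posterior through q:

```latex
% Posterior of the causal query Y = q(G, \theta): push the model posterior
% through the query function q (sketch notation; q assumed deterministic).
p(Y \mid D) \;=\; \sum_{G} \int \delta\bigl(Y - q(G, \theta)\bigr)\, p(G, \theta \mid D)\, \mathrm{d}\theta
```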
Advantage 3. We can get active about learning the model, e.g. by maximizing Information Gain derived from our posterior. Roughly, this iteratively designs optimal interventional experiments, so that we learn as much as possible about the model.
Advantage 4 is a combination of Advantage 2 and 3: we can get active about causal queries of interest, by maximizing Information Gain of the query posterior! Thus, even if we are still uncertain about the model, we might already be pretty certain about the target causal query!
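In formulas (sketch notation, mine): with y_a the random outcome of an interventional experiment a, the next experiment maximizes the expected information gain, either about the model itself (Advantage 3) or directly about the query Y = q(G, θ) (Advantage 4):

```latex
% Experiment selection by expected information gain (sketch notation):
% y_a is the random outcome of interventional experiment a, D the data so far.
a^{*}_{\mathrm{model}} \;=\; \arg\max_{a}\; I\bigl((G, \theta)\,;\, y_a \mid D\bigr)
\qquad
a^{*}_{\mathrm{query}} \;=\; \arg\max_{a}\; I\bigl(Y\,;\, y_a \mid D\bigr), \quad Y = q(G, \theta)
```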
We have developed a tractable implementation for non-linear additive Gaussian noise models, using a DiBS prior over causal graphs and Gaussian processes for the causal mechanisms.
The implementation scales to several dozen variables and indeed learns target causal queries faster than competing methods.
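To make the experimental-design step concrete, here is a minimal, self-contained toy of my own (not the paper's code, and nothing like its DiBS + GP implementation): two binary variables whose two Markov-equivalent graphs induce the same observational joint, so only an intervention can tell them apart; expected information gain picks which hard intervention to run next. All names and parameter values are made up for illustration.

```python
import numpy as np

# Hypothesis space: "X->Y" vs "Y->X", with known mechanisms chosen so that
# both graphs induce the SAME observational joint P(X, Y); observational
# data alone therefore cannot distinguish them.
p_x = 0.3                        # under X->Y: P(X=1)
p_y_given_x = {0: 0.2, 1: 0.8}   # under X->Y: P(Y=1 | X=x)

# Joint implied by X->Y.
joint = np.zeros((2, 2))         # joint[x, y] = P(X=x, Y=y)
for x in (0, 1):
    px = p_x if x == 1 else 1 - p_x
    for y in (0, 1):
        py = p_y_given_x[x] if y == 1 else 1 - p_y_given_x[x]
        joint[x, y] = px * py

# Y->X re-factorizes the same joint as P(Y) P(X | Y).
p_y = joint[:, 1].sum()                                        # P(Y=1)
p_x_given_y = {y: joint[1, y] / joint[:, y].sum() for y in (0, 1)}

prior = {"X->Y": 0.5, "Y->X": 0.5}   # uniform prior over the two graphs

def entropy(p):
    p = np.clip(np.asarray(p), 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def outcome_dist(graph, intervention):
    """Distribution of the non-intervened variable under a hard intervention."""
    var, val = intervention
    if var == "X":
        # Under do(X=val): Y responds only if X is its parent.
        py1 = p_y_given_x[val] if graph == "X->Y" else p_y
        return np.array([1 - py1, py1])
    else:
        # Under do(Y=val): X responds only if Y is its parent.
        px1 = p_x_given_y[val] if graph == "Y->X" else p_x
        return np.array([1 - px1, px1])

def expected_info_gain(intervention):
    """Mutual information I(graph ; outcome | do(...)) under the current prior."""
    mix = sum(prior[g] * outcome_dist(g, intervention) for g in prior)
    h_marginal = entropy(mix)
    h_conditional = sum(prior[g] * entropy(outcome_dist(g, intervention)) for g in prior)
    return h_marginal - h_conditional

candidates = [("X", 1), ("Y", 1)]
gains = {c: expected_info_gain(c) for c in candidates}
best = max(gains, key=gains.get)
print(gains)                                  # info gain of each candidate experiment
print("next experiment: do(%s=%d)" % best)    # run the most informative one
```

In the full method the same idea is applied sequentially: run the chosen experiment, update the posterior with the observed outcome, and design the next intervention; the toy above only shows a single design step with the graph as the target quantity.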
We are happy if you drop by on Tuesday 29th Nov in New Orleans! Let's have a chat about this cool work!

Spotlight, Lightning Talks 1A-3, 12:00-12:15, presented by @chritoth
neurips.cc/virtual/2022/s…

Poster session: 16:00-18:00.
With fantastic co-authors: @chritoth, Lars Lorch, @ChriKnoll, @arkrause, Franz Pernkopf, and @JKugelgen.
