Founder, Prof of #machinelearning at ELLIS Institute Tübingen and University of Freiburg, 3x ERC Grant holder, EurAI & ELLIS fellow. All opinions are my own.
Jan 13 • 11 tweets • 4 min read
#TabPFN v2 also excels on time series!
Just before our Nature paper came out, we had this paper at the #NeurIPS time series workshop: We simply cast time series forecasting as tabular regression and use exactly the model from the Nature paper. It’s crazy that this works!
🧵1/10 openreview.net/forum?id=ho8Yx…
2/10 The #TabPFN-TS approach is straightforward: Convert timestamps to tabular features (e.g., day of week, day of month, time of day), and then simply apply TabPFN v2 regression to these features for a probabilistic prediction at any given timestamp.
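The timestamp-to-features step can be sketched in a few lines of plain Python. This is a minimal illustration of the recipe described above, not the paper's exact feature set; the feature names and the commented-out TabPFN regressor call are assumptions.

```python
from datetime import datetime

def timestamp_features(ts: datetime) -> dict:
    """Encode one timestamp as a row of tabular features (illustrative set)."""
    return {
        "year": ts.year,
        "month": ts.month,
        "day_of_month": ts.day,
        "day_of_week": ts.weekday(),  # 0 = Monday
        "hour": ts.hour,
    }

# Each (timestamp, value) pair in the series becomes one tabular row.
# A TabPFN v2 regressor (hypothetical sklearn-style call) would then be
# fit on these rows and queried at future timestamps:
#   reg = TabPFNRegressor()
#   reg.fit(X_train, y_train)
#   y_pred = reg.predict(X_future)
row = timestamp_features(datetime(2025, 1, 13, 9, 30))
```

Because the model only sees the encoded timestamps, any timestamp (past or future) can be queried, which is what makes a probabilistic prediction "at any given timestamp" possible.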
Jan 8 • 19 tweets • 6 min read
The data science revolution is getting closer. TabPFN v2 is published in Nature: On tabular classification with up to 10k data points & 500 features, TabPFN takes 2.8s and on average outperforms all other methods, even when they are tuned for up to 4 hours 🧵1/19 nature.com/articles/s4158…
2/19 Two years ago, I tweeted about TabPFN v1 “This may revolutionize data science”. I meant this as “This line of work may revolutionize tabular ML”. We’re now a step closer to this. Like every model, TabPFN v2 will have failure modes, but it gets closer to delivering on that promise.
Oct 21, 2022 • 7 tweets • 3 min read
This may revolutionize data science: we introduce TabPFN, a new tabular data classification method that takes 1 second & yields SOTA performance (better than hyperparameter-optimized gradient boosting in 1h). Current limits: up to 1k data points, 100 features, 10 classes. 🧵1/6
TabPFN is radically different from previous ML methods. It is meta-learned to approximate Bayesian inference with a prior based on principles of causality and simplicity. Here’s a qualitative comparison to some sklearn classifiers, showing very smooth uncertainty estimates. 2/6
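To illustrate what “smooth uncertainty estimates” means, here is a toy stand-in (not TabPFN): a one-dimensional Bayes classifier under Gaussian class-conditionals, whose class probability varies smoothly with the input, contrasted with a decision stump that can only output hard 0/1 probabilities. All parameters here are made up for illustration.

```python
import math

def gaussian_pdf(x: float, mu: float, sigma: float) -> float:
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def smooth_proba(x: float, mu0: float = -1.0, mu1: float = 1.0, sigma: float = 1.0) -> float:
    """Bayes rule with equal priors: smooth probability of class 1 given x."""
    p0 = gaussian_pdf(x, mu0, sigma)
    p1 = gaussian_pdf(x, mu1, sigma)
    return p1 / (p0 + p1)

def stump_proba(x: float, threshold: float = 0.0) -> float:
    """Decision stump: hard 0/1 output, no graded uncertainty between classes."""
    return 1.0 if x > threshold else 0.0
```

Between the two class means, `smooth_proba` transitions gradually through 0.5 instead of jumping, which is the qualitative behavior the comparison figure highlights for TabPFN's predictions.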