Valeriy M.
Sep 24 · 15 tweets · 4 min read
Motivated by having seen yet another Platt's scaler post.

Platt scaling and isotonic regression are roughly 20 years old at this point. Neither comes with mathematical guarantees of validity, and both are outperformed by the conformal prediction method Venn-ABERS.

#conformalprediction
Venn-ABERS is in effect a better-regularised version of isotonic regression: it constructs two isotonic regressions by postulating that the test object can a priori have either 0 or 1 as its label.
By doing so, Venn-ABERS achieves theoretical guarantees of validity (lack of bias), at the expense of multiprobability prediction: the test object receives two probabilities of class 1 instead of one, where p0 is a lower bound and p1 an upper bound on the class-1 probability.
The actual probability of the object belonging to class 1 is guaranteed to lie within this interval. As Venn-ABERS belongs to the conformal prediction framework, such guarantees hold regardless of the data distribution, the type of underlying classifier, and even the size of the data.
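The two-isotonic-regressions construction can be sketched in a few lines. This is a simplified illustration of the idea using scikit-learn's `IsotonicRegression`, not the efficient inductive Venn-ABERS (IVAP) algorithm from the paper; the calibration scores and labels below are invented toy data.

```python
# Simplified Venn-ABERS sketch: fit isotonic regression on the calibration
# set augmented with the test object, once assuming its label is 0 and
# once assuming it is 1, then read off the fitted value at the test score.
import numpy as np
from sklearn.isotonic import IsotonicRegression

def venn_abers_sketch(cal_scores, cal_labels, test_score):
    """Return (p0, p1): lower and upper probabilities of class 1."""
    probs = []
    for assumed_label in (0, 1):
        scores = np.append(cal_scores, test_score)
        labels = np.append(cal_labels, assumed_label)
        iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
        iso.fit(scores, labels)
        probs.append(iso.predict([test_score])[0])
    return probs[0], probs[1]

# Toy calibration set: raw classifier scores with their true labels
cal_scores = np.array([0.1, 0.2, 0.4, 0.5, 0.7, 0.9])
cal_labels = np.array([0, 0, 0, 1, 1, 1])
p0, p1 = venn_abers_sketch(cal_scores, cal_labels, test_score=0.6)
print(p0, p1)  # an interval (p0, p1) containing P(class 1)
```

The two fits differ only in the label hypothesised for the test object, which is exactly why the output is an interval rather than a point: the gap between p0 and p1 measures how much that one label assumption can move the calibrated estimate.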
The width of the interval (p0, p1) is valuable information in itself, as it reflects how difficult the specific object is to classify. For decision making, the interval can easily be combined into a single probability of class 1 as p = p1 / (1 - p0 + p1).
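Collapsing the interval into a point estimate with that formula is a one-liner; the inputs below are illustrative values, not output from a real model.

```python
# Combine the Venn-ABERS interval (p0, p1) into a single probability
# of class 1, using the formula p = p1 / (1 - p0 + p1) from the thread.
def merge_venn_abers(p0: float, p1: float) -> float:
    return p1 / (1.0 - p0 + p1)

p = merge_venn_abers(0.55, 0.65)
print(p)  # a point estimate lying between p0 and p1
```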
Unlike Platt's scaler, which rests on the restrictive assumption that the underlying classifier produces sigmoid-shaped scores (true for the SVMs Platt's scaler was developed for, but not for other classifiers), Venn-ABERS makes no assumptions beyond IID data, and hence, as part of conformal prediction, offers robust mathematical guarantees.
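For contrast, Platt scaling amounts to fitting a single sigmoid to held-out scores. The sketch below shows that shape assumption with a plain one-feature logistic fit (Platt's original method additionally regularises the targets, which is omitted here); the scores and labels are toy data.

```python
# Platt-scaling sketch: fit p = 1 / (1 + exp(-(A*s + B))) to held-out
# classifier scores s. The whole calibration map is one fixed sigmoid,
# which is the restrictive assumption Venn-ABERS avoids.
import numpy as np
from sklearn.linear_model import LogisticRegression

cal_scores = np.array([-2.0, -1.5, -0.5, 0.3, 1.0, 2.2]).reshape(-1, 1)
cal_labels = np.array([0, 0, 0, 1, 1, 1])

platt = LogisticRegression()       # 1-D logistic fit = Platt's sigmoid
platt.fit(cal_scores, cal_labels)
print(platt.predict_proba([[0.0]])[:, 1])  # calibrated P(class 1) at score 0
```

If the classifier's raw scores are not actually sigmoid-shaped against the true class frequencies, no choice of the two parameters A and B can make this map well calibrated, whereas the isotonic-based Venn-ABERS map adapts its shape to the data.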
Multiple papers have shown that the Venn-ABERS calibrator also often outperforms both Platt's scaler and isotonic regression empirically, while not suffering from prediction bias.
There is a Python library with which anyone can run Venn-ABERS in a few lines of code: github.com/ptocca/VennABE…

#opensource
There is a paper by the creator of #conformalprediction, Prof Vladimir Vovk: proceedings.neurips.cc/paper/2015/fil…
A talk explaining the paper
As well as several tutorials about Venn ABERS on Awesome Conformal Prediction github.com/valeman/awesom…
Video tutorial

