Expected value is one of the most fundamental concepts in probability theory and machine learning.

Have you ever wondered what it really means and where it comes from?

The formula doesn't tell the entire story right away.

💡 Let's unravel what's behind the scenes! 💡
First, let's take a look at a simple example.

Suppose that we are playing a game. You toss a coin, and

• if it comes up heads, you win $1,
• but if it is tails, you lose $2.

Should you even play this game with me? 🤔

We are about to find out!
After 𝑛 rounds, your earnings can be calculated as the number of heads times $1 minus the number of tails times $2.

If we divide total earnings by 𝑛, we obtain the average earnings per round.
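Writing 𝐻 and 𝑇 for the counts of heads and tails (my notation, not from the thread), this average can be sketched as:

```latex
\frac{\text{earnings}}{n} = \frac{H \cdot 1 - T \cdot 2}{n} = \frac{H}{n} - 2\,\frac{T}{n}
```

This form makes the next step visible: everything depends only on the ratios 𝐻/𝑛 and 𝑇/𝑛.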

What happens if 𝑛 approaches infinity? 🤔
As you have probably guessed, the number of heads divided by the number of tosses will converge to the 𝑝𝑟𝑜𝑏𝑎𝑏𝑖𝑙𝑖𝑡𝑦 of a single toss being heads.

In our case, this is 1/2.

(Similarly, tails/tosses also converges to 1/2.)
So, your average earnings per round are -1/2. This is the 𝑒𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝑣𝑎𝑙𝑢𝑒.
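To see this convergence in action, here is a minimal simulation sketch of the coin game (function name and seed are my own choices, not from the thread):

```python
import random

def average_earnings(n_rounds, seed=0):
    """Simulate n_rounds of the coin game and return average earnings per round.

    Heads pays +$1, tails costs $2, each with probability 1/2.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    total = 0
    for _ in range(n_rounds):
        if rng.random() < 0.5:  # heads
            total += 1
        else:                   # tails
            total -= 2
    return total / n_rounds

print(average_earnings(100_000))  # close to the expected value -0.5
```

With more and more rounds, the printed average drifts ever closer to −1/2.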

By the way, you definitely shouldn't play this game. 😉

💡 How can we calculate the expected value for a general case? 💡
Suppose that, similarly to the previous example, the outcome of your experiment can be quantified. (Like rolling a die or making a bet at the poker table.)

The expected value is just the average outcome you have per experiment when you let it run infinitely. ♾️🤯
The description above is simply the expected value formula in English.

If we formally denote the variable describing the outcome of the experiment with 𝑋 and its possible values with 𝑥ᵢ, we get back the formula in the first tweet.
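Written out in standard notation (the original tweet's image isn't reproduced here, so this is my reconstruction), the formula reads:

```latex
\mathbb{E}[X] = \sum_i x_i \, P(X = x_i),
\qquad
\text{coin game: } \mathbb{E}[X] = 1 \cdot \tfrac{1}{2} + (-2) \cdot \tfrac{1}{2} = -\tfrac{1}{2}.
```

Plugging in the coin game recovers the −1/2 we found by hand.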

It looks much easier now, doesn't it?
This concept came up recently when I gave this explanation to a friend.

Inspired by @haltakov and his awesome recent thread (), I figured this could be interesting for a lot of you!

Next time, I am planning to explain entropy! What do you think?
Update: I have just posted a thread explaining the formula behind entropy, as promised! Check it out!

