1/ The key to understanding "news" is to think about probabilities and incentives. A thread... 👉👉👉
2/ Let's say you're trying to estimate the chances of some future event happening.

As an example, let's take P(Trump runs for President in 2024).
3/ So you read the news about such an event. CNN, FoxNews, Politico, whatever. And it feels like a steady drip of "news" about tiny events which could affect that probability.

So how do we integrate all these pieces of information into a prediction?
4/ Naively, here's what you should do if you're a good Bayesian:

- Start with some baseline prior probability.
- For every news story you see, update that probability based on the content and importance of the news.

Doing this well gets you the best possible estimate.
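
A rough sketch of that recipe (the prior and the likelihood ratios below are invented purely for illustration, not real estimates): the update is easiest in odds form, where you multiply your prior odds by each story's Bayes factor, P(story | event) / P(story | no event).

```python
# Toy Bayesian updating, purely illustrative.
# posterior_odds = prior_odds * product of each story's likelihood ratio,
# where a story's likelihood ratio is P(story | event) / P(story | no event).

def update(prior_prob, likelihood_ratios):
    """Fold a stream of stories (as Bayes factors) into a prior probability."""
    odds = prior_prob / (1.0 - prior_prob)   # probability -> odds
    for lr in likelihood_ratios:
        odds *= lr                           # one Bayes-rule update per story
    return odds / (1.0 + odds)               # odds -> probability

# Hypothetical numbers: start at 40%, see two mildly "he'll run" stories
# (LR > 1) and one "he won't" story (LR < 1).
print(update(0.40, [1.5, 1.2, 0.8]))  # ~0.49
```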
5/ But, while this isn't exactly *wrong*, it's close enough to wrong. Reason:

You're ignoring the process by which news stories are generated!!!
6/ The model of "news happens, media reports on it" is completely incorrect. There's far more news and pseudo-news happening than media outlets have time to report on.

They perform an editing function by selecting what to report on.
7/ So far this is all obvious, and it's also obvious that they select based on perceived interest (clicks, viewers, etc). But that doesn't help us yet.

The key question: "How do (perceived) probabilities affect story selection?"
8/ If you're a good Bayesian, you recognize this as precisely a Bayes' theorem question. And here's what I'm hereby calling Lebron's Theory Of Media Probabilities:

"Media companies select stories which maximize entropy."
9/ If everyone agrees something is going to happen, it's no longer news. There aren't any stories about the chances the sun won't come up tomorrow. If it didn't, that would be HUGE news.

But P(no sun tomorrow) = 0%.

Not news.
10/ If you recall your 2nd-year information theory: for a binary outcome, entropy is maximized when P(true) = 50%.

So for any binary-outcome event, media are incentivized to report stories which move the perceived probability toward 50%.

That maximizes uncertainty and interest.
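
To make the "maximized at 50%" claim concrete, here's a small sketch of binary Shannon entropy, H(p) = -p*log2(p) - (1-p)*log2(1-p); it peaks at exactly 1 bit when P(true) = 0.5.

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a binary outcome with P(true) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no surprise
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.01, 0.25, 0.5, 0.75, 0.99):
    print(f"P(true)={p:.2f}  entropy={binary_entropy(p):.3f} bits")
# Entropy peaks at 1 bit when P(true)=0.5 and falls toward 0 near certainty.
```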
11/ Will Trump run for President in 2024?

My read is that the perceived probability is under 50%, which is why news organizations are falling over themselves to find any indication that he will. The more such stories they run, the closer the perceived probability moves to 50%.
12/ Right now it seems likely Biden will run again. Over 50%.

So any piece of news that indicates Biden won't run is going to get a big push. Moving perceived probability closer to 50%.
13/ So when you're estimating probabilities, you have to back out the selection bias of the stories you're seeing. You have to undo the squeeze towards 50%.

They don't care about providing *accurate* probabilities. It's just not in their set of incentives.
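
The thread doesn't give a formula for undoing the squeeze, but one toy way to picture it (my assumption, not the author's method) is to discount the evidential weight of the stories you see, precisely because they were selected to pull your estimate toward 50%.

```python
def debiased_update(prior_prob, likelihood_ratios, discount=0.5):
    """Shrink each story's Bayes factor toward 1 (by raising it to
    `discount` < 1) to account for stories being selected to pull the
    perceived probability toward 50%. The discount value is a made-up
    illustrative knob, not something from the thread."""
    odds = prior_prob / (1.0 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr ** discount            # discount=1.0 recovers the naive update
    return odds / (1.0 + odds)

# Same (hypothetical) stream of "he'll run" stories, naive vs. discounted:
stories = [1.3, 1.2, 1.25]
print(debiased_update(0.30, stories, discount=1.0))  # naive:      ~0.46 (squeezed toward 50%)
print(debiased_update(0.30, stories, discount=0.5))  # discounted: ~0.37 (closer to the prior)
```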
14/ But if you're a trader, or otherwise in the business of making accurate predictions, you need to back out their incentives.

That way you can make good use of the info they provide.
15/ This also generalizes to non-binary outcomes. After all, entropy is a property of any probability distribution.
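
For the general case: Shannon entropy H = -Σ p_i * log2(p_i) is maximized by the uniform distribution, so the same selection pressure pushes coverage toward "anything could happen." A tiny sketch with a hypothetical three-way outcome:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical three-way outcome, e.g. {runs, doesn't run, undecided}:
print(entropy([0.80, 0.15, 0.05]))   # ~0.88 bits: lopsided, less "newsworthy"
print(entropy([1/3, 1/3, 1/3]))      # ~1.58 bits: maximal uncertainty
```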

TL;DR: News is an entropy-maximizing process.

/END
