Apr 21
Read 12 tweets

The Gram-Schmidt process is one of the most important algorithms in linear algebra.

Its task is simple: orthogonalizing vector sets.

Its applications are endless: matrix decompositions, eigenvalue problems, numerical linear algebra...

This is how it works:
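The process itself fits in a few lines. A minimal NumPy sketch of classical Gram-Schmidt (the function name and the tolerance for dropping dependent vectors are my own choices; this version also normalizes, producing an orthonormal set):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a set of vectors with classical Gram-Schmidt.

    Subtract from each vector its projections onto the previously
    computed orthonormal basis vectors, then normalize what remains.
    """
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w -= np.dot(w, q) * q  # remove the component along q
        norm = np.linalg.norm(w)
        if norm > 1e-12:           # skip (near-)linearly dependent vectors
            basis.append(w / norm)
    return np.array(basis)

# The rows of Q are orthonormal, so Q @ Q.T is the identity matrix.
Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
```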

Apr 20
The Gram-Schmidt process is one of the most important algorithms in linear algebra.

Its task is simple: orthogonalizing vector sets.

Its applications are endless: matrix decompositions, eigenvalue problems, numerical linear algebra...

This is how it works:
Read 14 tweets

Apr 12
Read 9 tweets

Here is a probabilistic puzzle.

Feedex and Acme are two delivery companies. Feedex's trains are 80% on time, while only 40% of its trucks are.

However, Acme's trains are 100% on time, and 60% of its trucks are as well.

Yet, Feedex is more reliable! Why?
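The puzzle above is an instance of Simpson's paradox: Acme wins on each vehicle type, yet can lose overall, because the two companies split their shipments differently between trains and trucks. A sketch with an assumed shipment mix (the 90/10 split below is illustrative, not from the thread):

```python
# Simpson's paradox sketch: per-vehicle rates vs. overall reliability.
# The shipment mix is assumed for illustration -- the resolution hinges
# on each company's train/truck split, not on the rates alone.

def overall_on_time(mix):
    """mix: list of (shipment_count, on_time_rate) pairs."""
    total = sum(n for n, _ in mix)
    on_time = sum(n * rate for n, rate in mix)
    return on_time / total

# Suppose Feedex ships mostly by (reliable) train...
feedex = overall_on_time([(90, 0.80), (10, 0.40)])  # trains, trucks
# ...while Acme ships mostly by (less reliable) truck.
acme = overall_on_time([(10, 1.00), (90, 0.60)])

print(f"Feedex: {feedex:.0%}, Acme: {acme:.0%}")  # -> Feedex: 76%, Acme: 64%
```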

This lesson is brought to you by @brilliantorg's Introduction to Probability course. Their interactive, first-principles approach will make sure you understand and retain what you learn there.

Since I'm partnering with them, I have a special offer for you later.

Let's go!

Apr 10
Read 27 tweets

In machine learning, we take gradient descent for granted. We rarely question why it works.

What's usually told is the mountain-climbing analogy: to find the valley, step in the direction of steepest descent.

But why does this work so well? Read on.

Our journey leads through

• differentiation, as the rate of change,

• the basics of differential equations,

• and equilibrium states.

Buckle up! Deep dive into the beautiful world of dynamical systems incoming. (Full post link at the end.)
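Those three ingredients combine into one picture: gradient descent is the Euler discretization of the gradient flow ODE x'(t) = −∇f(x(t)), whose equilibrium states are exactly the critical points of f. A minimal sketch on a toy quadratic (the function and step size are illustrative choices, not from the thread):

```python
import numpy as np

# Gradient descent as Euler steps along the gradient flow
#   x'(t) = -grad f(x(t)).
# The trajectory settles into an equilibrium of the flow,
# i.e. a critical point of f.

def grad_f(x):
    # f(x, y) = x^2 + 3 y^2  ->  grad f = (2x, 6y)
    return np.array([2 * x[0], 6 * x[1]])

x = np.array([4.0, -2.0])
lr = 0.1  # Euler step size, a.k.a. the learning rate
for _ in range(200):
    x = x - lr * grad_f(x)  # step in the direction of steepest descent

# x converges to the equilibrium (0, 0), the minimum of f
```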

Mar 29
Read 17 tweets

The single most important "side-effect" of solving linear equation systems: the LU decomposition.

Why? Because in practice, it is the engine behind inverting matrices and computing their determinants.

Here is how it works.
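As a sketch, here is a Doolittle-style LU decomposition without pivoting (assuming no zero pivots arise; the function name is mine), with the determinant falling out for free as the product of U's diagonal:

```python
import numpy as np

def lu(A):
    """LU decomposition without pivoting (assumes nonzero pivots).

    Returns L (unit lower triangular) and U (upper triangular)
    such that A = L @ U, by recording Gaussian elimination steps.
    """
    n = len(A)
    L = np.eye(n)
    U = np.array(A, dtype=float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # elimination multiplier
            U[i, :] -= L[i, k] * U[k, :]  # zero out the entry below the pivot
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu(A)
# det(L) = 1, so det(A) = det(U) = product of U's diagonal
det = np.prod(np.diag(U))
```

In production code one would use a pivoted routine instead, since plain elimination breaks down on zero (or tiny) pivots.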

Mar 28
Read 12 tweets

What can go wrong will go wrong.

This is Murphy's famous First Law. Here is a probabilistic reformulation: "what can go wrong with probability 𝑝 > 0, will go wrong with probability 1". But when will it go wrong?

Surprisingly, this is encoded in the probability.
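The "when" is governed by the geometric distribution: over n independent trials, the failure has occurred with probability 1 − (1 − p)^n, which tends to 1, and the expected wait until the first failure is 1/p trials. A quick sketch (p = 0.05 is an illustrative value, not from the thread):

```python
# Murphy's law, quantified: if something fails with probability p > 0
# on each independent trial, it eventually fails with probability 1,
# and the first failure takes 1/p trials on average.

p = 0.05

def prob_failed_by(n):
    """Probability of at least one failure within n trials."""
    return 1 - (1 - p) ** n

print(prob_failed_by(10))   # ~0.40 after 10 trials
print(prob_failed_by(100))  # ~0.994 after 100 trials

expected_first_failure = 1 / p  # 20 trials on average
```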

Understanding probabilistic thinking is one of the best investments you can make, and @brilliantorg's Introduction to Probability will teach you the very essence.

Since I'm partnering with them, I have a special offer for you later.

Let's get started!
