
#30DaysOfJAX Day 4

So far I've learned how jax.numpy operations are lowered to XLA ops using the lower-level lax API.

But where does jit fit into all of this?

[this is cool but hard, grab a coffee! ☕️]

1/12🧵
Even though JAX operations run on XLA, they are still dispatched and executed one at a time, in sequence.

Can we do better?
Yes, by compiling the whole function! 🤯

2/12🧵
Compilers are great pieces of software, and one of the magic tricks 🪄 they perform is looking at your code and creating an optimized, faster version by, for example:

• fusing ops ➕
• not allocating temp variables 🚫🗑️

3/12🧵
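A minimal sketch (my own example, not from the thread) of what jax.jit buys you: wrap a function made of several elementwise ops, and XLA can fuse them into one compiled kernel instead of dispatching each op separately.

```python
import jax
import jax.numpy as jnp

def selu(x, alpha=1.67, lam=1.05):
    # Three elementwise ops (exp, where, multiply) that XLA can fuse.
    return lam * jnp.where(x > 0, x, alpha * (jnp.exp(x) - 1))

# jit compiles on the first call and reuses the compiled version after.
selu_jit = jax.jit(selu)

x = jnp.arange(5.0)
print(selu_jit(x))
```

Note the first call pays a compilation cost; the speedup shows up on repeated calls with the same input shapes.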
Read 13 tweets
What is JAX?

JAX is Autograd and XLA, brought together for high-performance numerical computing and ML research. It provides composable transformations of Python+NumPy programs: differentiate, vectorize, parallelize, JIT compile to GPU/TPU, and more.
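The "composable transformations" part can be made concrete with a tiny sketch (my own illustration, assuming nothing beyond the public jax API): grad, vmap, and jit are plain functions that wrap other functions, so they stack freely.

```python
import jax
import jax.numpy as jnp

f = lambda x: jnp.sin(x) ** 2          # a scalar function

# differentiate, then vectorize over a batch, then JIT-compile:
df = jax.jit(jax.vmap(jax.grad(f)))

print(df(jnp.linspace(0.0, 1.0, 4)))   # per-element derivatives
```

Each transformation returns a new function, which is why they compose in any order that type-checks.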

🤔🧐

#30DaysOfJAX

1/11🧵
That's already a lot to take in!
Let's try to understand the key words first.

What is:
β€’ Autograd
β€’ XLA
β€’ Differentiate
β€’ Just-in-time compile

2/11🧵
What is Differentiate?

🚨 If you studied Calculus you might remember this one. (bear with me, don't run!)

Imagine you have a function:

-> f(x) = 3*x + 4

and you want to know how sensitive your output (f) is to changes in the input (x).

3/11🧵
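That sensitivity is exactly what jax.grad computes. A minimal sketch using the thread's own f(x) = 3*x + 4:

```python
import jax

def f(x):
    return 3.0 * x + 4.0

df = jax.grad(f)   # returns a new function: the derivative of f w.r.t. x

print(df(2.0))     # f'(x) = 3 for every x, since f is linear
```

No symbolic math and no finite differences: grad traces the Python function and derives an exact gradient function from it.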