Patrick Kidger
Sep 13, 2021 · 9 tweets
Announcing Equinox v0.1.0! Lots of new goodies for your neural networks in JAX.

- The big one: models work with native jax.jit and jax.grad!
- filter, partition, combine to manipulate PyTrees
- new filter functions
- much-improved documentation
- PyPI availability!

A thread:
1/n 🧵
First: simple models can be used directly with jax.jit and jax.grad. This is because Equinox models are just PyTrees like any other. And JAX understands PyTrees.

2/n
More complex models might have arbitrary Python types in their PyTrees -- we don't limit you to just JAX arrays.

In this case, filter/partition/combine offer a succinct way to split one PyTree into two, and then recombine them.

3/n
Since this usually happens around JIT/grad, we also provide some convenient wrappers. (You can still use the previous explicit version if you prefer.)

How about that for easy-to-use syntax!

4/n
(For those who've used Equinox before: "filter_jit" and "filter_grad" are the successors to the old "jitf" and "gradf" functions.

We've tidied up the interface a bit, and as with tweet 3, optionally separated the filtering and the transformation.)

5/n
New filter functions, covering all the common use cases:

(is_array_like has also gotten a huge speed improvement)

6/n
Much improved documentation! In fact all the code snippets above are from the documentation. :)

This also includes new examples, such as how to use Equinox in the "classical" init/apply way e.g. in conjunction with other libraries.

7/n
And finally -- sound the trumpets! -- Equinox is now available via "pip"!

(Huge thanks to the guy who previously had "equinox" registered on PyPI, and agreed to let us use it.)

8/n
Equinox demonstrates how to use a PyTorch-like class-based API without compromising on JAX-like functional programming.

It is half tech-demo, half neural network library, and comes with no behind-the-scenes magic, guaranteed.

Give it a try:

github.com/patrick-kidger…

9/9

More from @PatrickKidger

Feb 8, 2022
⚡️ My PhD thesis is on arXiv! ⚡️

To quote my examiners it is "the textbook of neural differential equations" - across ordinary/controlled/stochastic diffeqs.

w/ unpublished material:
- generalised adjoint methods
- symbolic regression
- + more!

arxiv.org/abs/2202.02435

🧵 1/n
If you follow me then there's a decent chance that you already know what an NDE is. (If you don't, go read the introductory Chapter 1 of my thesis haha -- it's only 6 pages long.) Put a neural network inside a differential equation, and suddenly cool stuff starts happening.

2/n
Neural differential equations are a beautiful way of building models, offering:
- high-capacity function approximation;
- strong priors on model space;
- the ability to handle irregular data;
- memory efficiency;
- a foundation of well-understood theory.

3/n
Aug 3, 2021
Announcing Equinox!

github.com/patrick-kidger…

A JAX neural network library with
- a PyTorch-like class API for model building
- whilst *also* being functional (no stored state)

It leverages two tricks: *filtered transformations* and *callable PyTrees*.

1/n🧵
First of all, I know what you're thinking. We already have e.g. Flax and Haiku (+ a few others as well).

What's new, and do we really need another?

To the best of my knowledge, Equinox overcomes some of the core technical difficulties faced in previous libraries.

2/n
We love having a PyTorch-like class API for model building.

We love having JAX-like functional programming.

But these seem like completely different paradigms, and making them work together is tricky.

3/n
May 12, 2021
New paper: Neural Rough Differential Equations!

Greatly increase performance on long time series, by using the mathematics of rough path theory.

arxiv.org/abs/2009.08295
github.com/jambo6/neuralR…

Accepted at #ICML2021!

🧵: 1/n
(including a lot about what makes RNNs work)
(So first of all, yes, it's another Neural XYZ Differential Equations paper.

At some point we're going to run out of XYZ differential equations to put the word "neural" in front of.)

2/n
As for what's going on here!

We already know that RNNs are basically differential equations.

Neural CDEs are the example closest to my heart. These are the true continuous-time limit of generic RNNs:
arxiv.org/abs/2005.08926
github.com/patrick-kidger…

3/n
