3 ML frameworks you should check out:

1. VISSL
2. AdaNet
3. Archai neural architecture search

Here they are. Boom!⬇️
VISSL for self-supervised learning
1/3
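
To give a feel for what VISSL automates, here is a minimal sketch of a SimCLR-style contrastive objective in plain PyTorch. It illustrates the self-supervised idea VISSL implements at scale; it is not VISSL's own API, and the encoder/augment names in the usage comment are placeholders.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, temperature=0.1):
    """Simplified NT-Xent loss: two augmented views of the same image attract."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # pairwise cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)  # i-th view matches i-th view
    return F.cross_entropy(logits, targets)

# Usage (placeholders): embed two random augmentations of the same batch
# z1, z2 = encoder(augment(images)), encoder(augment(images))
# loss = contrastive_loss(z1, z2)
```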
AdaNet for neural network discovery
2/3
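
For flavor, a hedged sketch of how AdaNet grows an ensemble from a pool of candidate subnetworks, loosely following the AutoEnsembleEstimator pattern in AdaNet's documentation. Exact class and argument names may differ across AdaNet/TensorFlow versions, and train_input_fn is a placeholder for your data pipeline.

```python
import adanet
import tensorflow as tf

head = tf.estimator.BinaryClassHead()
feature_columns = [tf.feature_column.numeric_column("x", shape=[10])]

# AdaNet grows the ensemble iteratively: at each iteration it trains the
# candidate subnetworks and keeps whichever most improves the ensemble.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool=lambda config: {
        "linear": tf.estimator.LinearEstimator(
            head=head, feature_columns=feature_columns, config=config),
        "dnn": tf.estimator.DNNEstimator(
            head=head, feature_columns=feature_columns,
            hidden_units=[64, 32], config=config),
    },
    max_iteration_steps=1000)

# estimator.train(input_fn=train_input_fn, steps=5000)
```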
Archai for neural architecture search
3/3

More from @TheSequenceAI

6 Oct
Our favorite open ML courses!

Topics?

1. Deep Learning by @ylecun
2. NLP with Deep Learning by @stanfordnlp
3. Deep Learning for NLP by @CompSciOxford and @DeepMind

Here they are⬇️
1) Deep Learning by @ylecun

This unique course material consists of:
- closed-captioned lecture videos
- detailed written overviews
- executable Jupyter Notebooks with PyTorch implementations

2) NLP with Deep Learning by @stanfordnlp

A course from one of the most famous NLP groups in the world.

20 Sep
AutoML means using ML to create better ML models.

Meta-learning improves the learning of sophisticated tasks by reusing knowledge.

3 main forms of meta-learning relevant to AutoML⬇️
1) Based on Model Evaluations

Learn a new task by reusing model configurations that worked well on similar, already-evaluated tasks (a tiny sketch of this idea follows below).

Configuration = hyperparameter settings, pipeline, and/or network architecture components.
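
A minimal sketch of that idea, assuming we simply rank candidate configurations by how they scored on previously evaluated tasks and try the best-ranked ones first on the new task. The function and data layout are illustrative, not taken from any particular AutoML library.

```python
import numpy as np

def rank_configs(past_scores):
    """past_scores: dict {config_name: [score on task 1, score on task 2, ...]}."""
    names = list(past_scores)
    scores = np.array([past_scores[n] for n in names])   # shape: [configs, tasks]
    ranks = scores.argsort(axis=0).argsort(axis=0)       # per-task rank, 0 = worst
    mean_rank = ranks.mean(axis=1)                       # average rank across tasks
    return [names[i] for i in np.argsort(-mean_rank)]    # best-ranked config first

# Usage: evaluate the top-ranked configurations first on the new task
# priority = rank_configs({"cfg_a": [0.91, 0.76], "cfg_b": [0.88, 0.81]})
```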
2) Based on Task Properties

Transfer knowledge between empirically similar tasks.

This method describes each task by a vector of meta-features and estimates how well a configuration will carry over by measuring the distance between the new task's meta-feature vector and those of previously seen tasks (sketched below).
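
A companion sketch of the task-properties form: represent each task by a meta-feature vector (e.g., number of samples, number of classes, class skew), find the nearest previously solved tasks, and reuse their best-known configurations as warm starts. All names here are illustrative assumptions.

```python
import numpy as np

def warm_start_configs(new_task_meta, past_tasks, k=3):
    """past_tasks: list of (meta_feature_vector, best_known_config) pairs."""
    dists = [np.linalg.norm(new_task_meta - meta) for meta, _ in past_tasks]
    nearest = np.argsort(dists)[:k]                  # k most similar past tasks
    return [past_tasks[i][1] for i in nearest]       # reuse their configurations

# Usage: seed a hyperparameter search with configs from the most similar tasks
# seeds = warm_start_configs(meta_features(new_dataset), experience_db)
```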
14 Sep
Master Neural Architecture Search (NAS) to automate the creation of neural networks.

4 topics you need to cover⬇️
1) The concept of NAS

1. Read one of the fundamental papers, "A Survey on Neural Architecture Search" @IBMResearch
2. Explore our dedicated Edge#4 for free thesequence.substack.com/p/thesequence-…
2) NAS algorithms

1. Differentiable Architecture Search (a minimal sketch of the core idea follows this list)
2. Differentiable ArchiTecture Approximation
3. eXperts Neural Architecture Search
4. Petridish (Read "Efficient Forward Architecture Search")
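
Since Differentiable Architecture Search comes first on that list, here is a minimal PyTorch sketch of its core trick: relax the discrete choice among candidate operations into a softmax over learnable architecture weights, so the choice itself can be trained by gradient descent. This illustrates the idea only, not the authors' reference code, and the candidate-op set is deliberately tiny.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One edge of a DARTS-style cell: a weighted mixture of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
        ])
        # one learnable architecture weight per candidate operation
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        # during search: weighted sum of all candidates;
        # after search: keep only the op with the largest weight
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```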
4 Sep
NAS is one of the most promising areas of deep learning.

But it remains super difficult to use.

Archai = an open-source framework that enables the execution of state-of-the-art NAS methods in PyTorch.⬇️
Archai exposes modern NAS methods through a simple command-line interface (a hedged usage example follows the algorithm list below).

The Archai developers aim to keep the list of supported algorithms up to date.

Currently supported algorithms:
- PC-DARTS
- Geometric NAS
- ProxyLess NAS
- SNAS
- DATA
- RandNAS
2/5
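
A hedged example of what that command-line usage looked like around the time of this thread. This is an assumption based on the project's early README, not something stated in the thread; entry points and flags may have changed in later versions.

```bash
# quick toy run of a DARTS search through Archai's unified launcher
python scripts/main.py --algos darts

# full-scale search and evaluation for the same algorithm
python scripts/main.py --algos darts --full
```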
Benefits for adopters of NAS techniques:
- Declarative Approach and Reproducibility
- Search-Space Abstractions
- Mix-and-Match Techniques
- & more!

You can find more details here: microsoft.github.io/archai/feature…
3/5