Discover and read the best of Twitter Threads about #keras


What do Vision Transformers learn? How do they encode anything useful for image recognition? In our latest work, we reimplement a number of works in this area & investigate various ViT model families (DeiT, DINO, the original ViT, etc.).

Done w/ @ariG23498

We also reimplemented different models in #Keras. These were first populated w/ pre-trained parameters & were then evaluated to ensure correctness.
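The populate-then-verify flow can be sketched with Keras's `set_weights`; the arrays below are random stand-ins for actual pre-trained ViT parameters, not the real checkpoint values:

```python
import numpy as np
import tensorflow as tf

# Random arrays stand in for parameters ported from a pre-trained checkpoint.
source_kernel = np.random.randn(4, 8).astype("float32")
source_bias = np.zeros(8, dtype="float32")

# Build the matching Keras layer and populate it with the ported weights.
layer = tf.keras.layers.Dense(8)
layer.build((None, 4))
layer.set_weights([source_kernel, source_bias])

# Evaluate on a probe input to confirm the port is numerically faithful.
probe = np.ones((1, 4), dtype="float32")
expected = probe @ source_kernel + source_bias
ported = layer(probe).numpy()
```

The same evaluate-to-verify step, run against the original model's outputs, is how correctness of a port is usually checked.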

Code, models, a tutorial, interactive demos (w/ @huggingface Spaces), visuals:…

We’ve used the following methods for our analysis:

* Attention rollout
* Classic heatmap of the attention weights
* Mean attention distance
* Viz of the positional embeddings & linear projections
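Of these, attention rollout is the easiest to sketch: multiply the head-averaged attention matrices across layers, mixing in the identity to account for residual connections. This is a minimal NumPy sketch, not the exact code from the post:

```python
import numpy as np

def attention_rollout(attentions):
    """attentions: list of (num_heads, tokens, tokens) arrays, one per layer."""
    num_tokens = attentions[0].shape[-1]
    rollout = np.eye(num_tokens)
    for attn in attentions:
        attn = attn.mean(axis=0)                      # average over heads
        attn = 0.5 * attn + 0.5 * np.eye(num_tokens)  # account for residuals
        attn /= attn.sum(axis=-1, keepdims=True)      # re-normalize rows
        rollout = attn @ rollout                      # accumulate across layers
    return rollout

# Two layers of random attention weights (rows sum to 1, as after softmax).
rng = np.random.default_rng(0)
attns = [np.exp(rng.normal(size=(3, 5, 5))) for _ in range(2)]
attns = [a / a.sum(axis=-1, keepdims=True) for a in attns]
rolled = attention_rollout(attns)
```

The result stays row-stochastic, so each row can be read as how much the corresponding token attends to every other token across the whole network.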

We hope our work turns out to be a useful resource for those studying ViTs.

Read 5 tweets
Wanna try out the ConvNeXt models in @TensorFlow / #keras, inspect them, fine-tune them, etc.? Well, here they are:…

Includes a total of 15 ImageNet-1k and ImageNet-21k ConvNeXt models + conversion scripts, off-the-shelf inference, and fine-tuning code.

These models ARE NOT opaque. You can load them like so and inspect whatever you need to:
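The loading path below is a placeholder (the actual release URL is in the linked repo), and the stand-in model is only there to show the inspection idiom, which works on any Keras model:

```python
import tensorflow as tf

# The released ConvNeXt checkpoints load as Keras SavedModels; the path
# below is a placeholder, not the actual release URL:
# model = tf.keras.models.load_model("convnext_tiny_1k_224")

# The same inspection idiom works on any Keras model; a small stand-in here:
model = tf.keras.applications.MobileNetV2(weights=None)
model.summary()                        # layer-by-layer architecture
layer = model.get_layer(index=1)       # grab any layer to inspect
print(layer.name, [w.shape for w in layer.get_weights()])
```

`model.layers`, `get_layer()`, and `get_weights()` let you poke at every parameter tensor directly, which is what "not opaque" means here.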

Here's the full disclosure on the accuracy scores. Differences are mainly due to implementation differences between the libraries. But I'm happy to stand corrected if someone has other suggestions.

Read 5 tweets
RegNets have been successfully added to `tf.keras.applications` by @adityakane1 with tremendous help from the #Keras team.

Great architecture for studying scaling behaviors.…

Aditya had already added RegNets (-Y) to TF-Hub last year during GSoC. But he wanted to streamline them a bit, and that endeavor resulted in `tf.keras.applications.regnet`. This is sheer hard work, not to mention the cognitive load. Ain't that gritty?…
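Assuming TF ≥ 2.9, the new models are one call away; `weights=None` below keeps this sketch from downloading the pretrained ImageNet parameters:

```python
import tensorflow as tf

# RegNetY002 from tf.keras.applications; pass weights="imagenet" to get
# the pretrained parameters instead of a random initialization.
model = tf.keras.applications.regnet.RegNetY002(weights=None)
logits = model(tf.random.normal((1, 224, 224, 3)))
print(logits.shape)  # (1, 1000) — ImageNet-1k classes
```

All the usual `tf.keras.applications` knobs (`include_top`, `pooling`, `classes`) apply here too, which makes these handy for scaling studies.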

If you're looking for DL/CV interns, please consider @adityakane1. Beyond his knowledge, he is a good person and asks questions like a pro.

Read 3 tweets
This has to be the best #Keras example in 2021 so far. My criteria: brief coverage of the underlying concepts, code understandability and correctness, and lucidity of language. But beyond these factors, this example is just too much awesomeness.…
The author, András Béres, goes to great lengths to provide a comprehensive overview of the SoTA in GANs, offering tips, tricks, and lessons from their experience.

IMO, this is what makes an article truly fantastic.

Of course, the code is really high-quality.
Still `model.compile()` and ``, btw! Pretty audacious, ain't it?

Can't help mentioning this time and time again 😂
Read 3 tweets

Thread of the very best #YouTube channels and #Twitter accounts to follow for:

#AI/ #ML, #DeepLearning, #neural and all things #datascience

#AI #machinelearning @wiserin10 #datascience #bigdata #artificialintelligence


Analytics India Magazine includes discussions on news, tips for the data ecosystem, and deep dives into #AI/#ML, #deeplearning and #neural networks

#YouTube subscriber count: 38k


Krish Naik is a co-founder and specialises in #machinelearning, #deeplearning, and computer vision. Krish’s #YouTube channel is a deep dive into all things #AI/#ML, perfect for beginners

YouTube subscriber count: 421k
Read 15 tweets
The Adversarial Robustness Toolbox (ART) is a framework for evaluating and defending deep learning models against adversarial attacks.

GANs = the most popular form of generative models.

Adversarial attack settings:
* White-box attacks: the adversary has access to the training environment and knowledge of the training algorithm.
* Black-box attacks: the adversary has no such additional knowledge.
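The white-box setting is easy to illustrate with the Fast Gradient Sign Method (FGSM), which uses the model's own gradients. This is a generic hand-rolled sketch, not ART's actual API; the tiny classifier is a stand-in:

```python
import tensorflow as tf

def fgsm(model, images, labels, epsilon=0.01):
    """Perturb images in the direction that increases the loss
    (white-box: requires gradient access to the model)."""
    images = tf.convert_to_tensor(images)
    with tf.GradientTape() as tape:
        tape.watch(images)
        loss = tf.keras.losses.sparse_categorical_crossentropy(
            labels, model(images))
    grad = tape.gradient(loss, images)
    return images + epsilon * tf.sign(grad)

# Tiny stand-in classifier for demonstration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
clean = tf.random.normal((4, 8, 8, 1))
adv = fgsm(model, clean, labels=tf.constant([0, 1, 2, 3]))
```

A black-box attacker, by contrast, could only query `model(images)` and would have to estimate such gradients (or transfer them from a surrogate model).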
The goal of ART = to provide a framework to evaluate the robustness of a neural network.

The current version of ART focuses on four types of adversarial attacks:
Read 5 tweets
New #Keras example is up on *consistency regularization*, an important recipe for semi-supervised learning and for tackling distribution shifts, as shown in *Noisy Student Training*.…

This example provides a template for performing semi-supervised / weakly supervised learning. A few things one can plug right in:

* Incorporate more data while training the student.
* Filter the high-confidence predictions while training the student.
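The core of the recipe can be sketched as a consistency loss between teacher and student predictions on (differently augmented) views of the same image; the names and loss choice here are illustrative, not the example's actual code:

```python
import tensorflow as tf

def consistency_loss(teacher_logits, student_logits):
    """Push the student's distribution toward the (stop-gradient) teacher's."""
    teacher_probs = tf.stop_gradient(tf.nn.softmax(teacher_logits))
    student_probs = tf.nn.softmax(student_logits)
    # Mean squared error between the two predicted distributions.
    return tf.reduce_mean(tf.square(teacher_probs - student_probs))

t = tf.constant([[2.0, 0.5, 0.1]])
loss_same = consistency_loss(t, t)   # identical predictions -> zero loss
loss_diff = consistency_loss(t, -t)  # disagreement -> positive loss
```

Because only the student receives gradients, unlabeled data can be folded in simply by adding this term to the supervised loss.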

The example uses Stochastic Weight Averaging while training the teacher to induce geometric ensembling. With elements like Stochastic Dropout, the performance might be even better.

Here are the full experiments:
Read 5 tweets
(1/7) My experience today migrating my code from #Keras to Keras for @TensorFlow (tf.keras) has unfortunately been a bit painful. Some issues I encountered (@fchollet, maybe this is of interest to you?):
(2/7) I found out that 1.13 doesn't support float16 in linear algebra operators; luckily, this pull request has now landed in 1.14-RC0…
(3/7) It seems that Dropout on an input with a partially unknown shape and a noise_shape of (None, 1, None), does not work in tf.keras. I could work around it, but in the original Keras this works just fine.
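For context, `noise_shape=(None, 1, None)` tells Dropout to share one mask across axis 1 (e.g. all timesteps of a sequence); with a fully defined input shape, recent TF versions handle this in tf.keras as well:

```python
import tensorflow as tf

# One dropout mask per (batch, feature) pair, broadcast across timesteps.
layer = tf.keras.layers.Dropout(rate=0.5, noise_shape=(None, 1, None))
x = tf.ones((2, 4, 3))        # (batch, time, features)
y = layer(x, training=True)   # each feature is dropped for ALL timesteps, or none
```

The bug being described was specific to inputs whose static shape is only partially known at graph-construction time.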
Read 7 tweets
✨💡 This is an ace idea from @sarah_edo! 💕

👩‍💻 Be on the lookout for a @TensorFlow Advent Calendar tomorrow, as well, highlighting meaningful, high-impact projects and papers from our community. If you'd like for yours to be considered, please shoot me an @-mention!
#TFadvent begins today! 😄

For our first project, I'd like to highlight this accessibility example from @shekitup that uses @TensorFlowJS to interpret sign language—and then translates those signs into input that can be used by a home assistant! 🗣️✨…
🎁 Day #2 of #TFadvent:

🤖 Check out this project from a recent hackathon at @ColbyCollege! The team trained a @TensorFlow model to learn muscle movements, then used that model to send signals to a prosthetic arm, controlling one finger at a time. ✨

Read 32 tweets
✨🧠 The ecosystem that has grown up around @TensorFlow in the last few years blows my mind. There's just so much functionality, compared to some of the other, newer frameworks.

👉Consider this an ever-expanding thread for me to take notes + wrap my brain around products. Ready?
1) @TensorFlow Extended (TFX)

It's no secret that I 💕 #TFX and all of its tooling for deploying machine learning models into production. If you care about keeping your models up-to-date and monitoring them, you should check out the product + its paper.
2) @TensorFlow Hub

If you want to train your model on a small data set, or improve generalization, you'll need to use something called transfer learning. #TFHub modules make it easy—and are available in an #OSS marketplace:
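The transfer-learning idiom referred to here can be sketched without the hub dependency: freeze a backbone and train a fresh head on top. `weights=None` keeps the sketch offline; in practice you'd load pretrained weights or a #TFHub module there:

```python
import tensorflow as tf

# Frozen backbone (stand-in for a pretrained TF-Hub module) + new task head.
base = tf.keras.applications.MobileNetV2(
    include_top=False, weights=None, input_shape=(96, 96, 3), pooling="avg")
base.trainable = False  # keep the pretrained features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(5, activation="softmax"),  # small-dataset head
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Only the head's handful of parameters get trained, which is exactly why this helps on small datasets and improves generalization.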

Read 40 tweets
