Sometimes you need to build a Machine Learning model that cannot be expressed with the Sequential API.

For these moments, when you need a more complex model with multiple inputs and outputs or with residual connections, the Functional API is what you need!

[2.46 min]

The Functional API is more flexible than the Sequential API.

The easiest way to understand it is to visualize the same model created using both the Sequential and the Functional API.

You can think of the Functional API as a way to create a Directed Acyclic Graph (DAG) of layers while the Sequential API can only create a stack of layers.
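As a minimal sketch (assuming TensorFlow 2.x and tf.keras, with invented layer sizes), here is the same small model built both ways:

```python
import tensorflow as tf

# Sequential API: a plain stack of layers.
seq_model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Functional API: wire the layers together explicitly, building a graph.
inputs = tf.keras.Input(shape=(10,))
x = tf.keras.layers.Dense(64, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(1)(x)
fn_model = tf.keras.Model(inputs=inputs, outputs=outputs)
```

Both models have the same architecture here, but only the Functional one could grow extra branches, inputs, or outputs.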

The Functional API is also known as the Symbolic or Declarative API.

Thinking with graph creation in mind, given that layers A and B are "vertices", when you call B on the output of A you're creating an edge between them, like: A ➡️ B
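In code (a toy sketch with made-up layer names and sizes), calling one layer on another layer's output is what creates that edge:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(8,))          # entry point of the graph
layer_a = tf.keras.layers.Dense(16, activation="relu", name="A")
layer_b = tf.keras.layers.Dense(4, name="B")

# Calling layer_b on layer_a's output adds the edge A ➡️ B to the graph.
x = layer_a(inputs)
y = layer_b(x)
model = tf.keras.Model(inputs, y)
```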

Benefits of the Functional API:

• Plotting the model and model.summary() work as expected, giving a good visualization
• Debugging happens during model definition, since layers can be "type-checked"
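A quick sketch of the first benefit (toy model; tf.keras.utils.plot_model additionally needs pydot and graphviz installed, so it is left commented out):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)

model.summary()  # prints each layer, its output shape, and parameter counts
# tf.keras.utils.plot_model(model, show_shapes=True)  # renders the DAG as an image
```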

The Functional API has the limitation of enabling only directed acyclic graphs; if you need something like a dynamic or recursive network, then you won't be able to build it with this API.

This blog post by @random_forests has an even better explanation of the Functional API…

This guide has more information, and playing with it can give you more insights about the Functional API:…

One good example of Functional API usage is in this tutorial:…

YAMNet has 3 outputs. To use it to extract audio embeddings, the model has to be able to deal with multiple outputs! It's a perfect fit for the Functional API.
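YAMNet itself is loaded from TF Hub, but the multi-output pattern can be sketched with a toy Functional model (invented layer sizes and head names, not YAMNet's real architecture):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(64,))
x = tf.keras.layers.Dense(32, activation="relu")(inputs)

# Three heads sharing one backbone, analogous to YAMNet's
# (scores, embeddings, spectrogram) outputs.
scores = tf.keras.layers.Dense(10, activation="softmax", name="scores")(x)
embeddings = tf.keras.layers.Dense(16, name="embeddings")(x)
spectro = tf.keras.layers.Dense(8, name="spectrogram_like")(x)

model = tf.keras.Model(inputs, [scores, embeddings, spectro])
```

One forward pass returns all three outputs at once, so downstream code can keep only the embeddings.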



More from @gusthema

5 Apr
This week, let's talk a little about model building.

In the TensorFlow world, the simplest way of building a model is with a Sequential model.

But what is it, and how do you build one?

[4 minutes]

First, let's go over some basics.

The goal here is to build a Neural Network, or in other words, create a set of neurons, distributed in layers and connected by weights.

Each Layer applies some computation on the values or tensors it receives.

The simplest way to create this NN is to just do a plain stack of layers where each layer has exactly one input and one output.

This is exactly what a Sequential model does.
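Such a plain stack can be sketched in a few lines (toy layer sizes, assuming tf.keras):

```python
import tensorflow as tf

# A plain stack of layers: each layer has exactly one input and one output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```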

3 Apr
Today is my birthday!

As my gift to you, I created this thread with all my NLP posts of this week to give you some technical content for your weekend!

[2 minutes]

Let's start by What is NLP?

"What is a Text Embedding?"

2 Apr
One very interesting task in the NLP field is text generation.

There are very advanced techniques, a lot of research on it, and even businesses based solely on it!

But how does it work?

[I guarantee it's a much better read than doomscrolling!!!]

Let's think: what would a model have to do to generate text?

The rationale is that, as humans, we form sentences by trying to create a sequence of words that makes sense.

The less random this sequence looks, the better the output text is and the closer it is to human-like.

Here is where ML can help.

A model should learn how to combine words in the best way possible.
The simplest way to teach this is: given a sentence, hide the last word and let the model try to guess it.

The loss function measures how good the model's guess is.
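The "hide the last word" idea can be sketched in plain Python (toy sentence; the pair format is just for illustration):

```python
# Turn a sentence into (context, next_word) training pairs:
# the model sees the context and tries to guess the hidden word.
sentence = "the cat sat on the mat".split()

pairs = [
    (sentence[:i], sentence[i])   # context so far, word to guess
    for i in range(1, len(sentence))
]
# e.g. one pair is (["the", "cat", "sat"], "on")
```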

2 Apr
One cool use of NLP is for NER (Named-entity recognition)

This enables you to find person names, organizations, locations, quantities, monetary values, percentages, etc. in a piece of text.

If you only want to use it on your data, this API… can help

Sometimes you need to create your own model for your specific data corpus (e.g. legal, scientific, or medical texts)

To create your own model, AutoML Natural Language can help you:

If you want to build everything from scratch, then you'll need:
• a language embedding (like BERT, ELMo, or USE), and #TFHub has all you need
• a dataset and this… can help you find one

31 Mar
Encoding text as numbers is a very important part of NLP: the better this can be done, the better the possible results!

Word embeddings work, but they don't have the full context of the sentence.

This is where BERT comes in

But what is BERT?

When we do word embeddings, the two sentences
• They are running a test
• They are running a company

will have very similar embeddings, but their meanings are very different. Without this context, a model using this encoding will be blind to it.

This is where Bidirectional Encoder Representations from Transformers (BERT) comes in play!

It is a Transformer-based network created in 2018 that takes the context of a word's occurrence into account. For the previous example, it gives very different embeddings.

27 Mar
If you are looking for something to learn during the weekend,

How about on-device Machine Learning?

You'll only need some understanding of ML and some mobile development experience.

Let me give you all the pointers in FAQ style:

[reading: 5.84 min]

"Which tools will I need to start?"

"Ok, how can a ML model run on a phone?"

