Usually we imagine Machine Learning models running on expensive servers with lots of memory and resources

A change that is enabling completely new types of apps is executing ML models on the edge, like on phones and microcontrollers

Let's find out why that matters

[2min]

1/6🧵
Why would you want to run an ML model on a phone?

Lower Latency: if you want inference in real time, running a model over a cloud API is not going to give a good user experience.

Running locally is much better and potentially much faster ⚡️

2/6🧵
When running ML models on-device, your app will be able to keep working even without network connectivity

For example, if your app translates text, it will still work in another country, when you need it most, and it won't waste your hard-earned money on roaming fees!

3/6🧵
ML models running on-device also preserve your privacy 🕵️ as your data won't leave the device or end up on some server run by a service you might not trust yet.

4/6🧵
TensorFlow Lite is the framework that enables you to run TF models completely on-device.

TFLite is available on multiple platforms: Android, iOS and microcontrollers

tensorflow.org/lite/guide/get…
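To make this concrete, here's a minimal sketch of what on-device inference looks like with the Python tf.lite.Interpreter (the Android/iOS APIs follow the same load → allocate → run pattern; the model path and dummy input here are placeholders):

```python
import numpy as np
import tensorflow as tf

# Load a converted TFLite model (file name is a placeholder).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input with the shape and dtype the model expects.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```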

5/6🧵
I'll keep posting about on-device ML this week.

Leave your questions in the comments so I can address them in future posts.

6/6🧵

More from @gusthema

24 Mar
When you want to deploy an ML model on-device, you may need to optimize it.

A model with higher accuracy might also be bigger in size, use more memory, and run slower

Do you need real-time inference?

Let's take a look at how to optimize your model

[3.14 minutes]

1/6🧵
TFLite has the Model Optimization Toolkit to help you with this very important task: tensorflow.org/model_optimiza…

Among the techniques are: Quantization and Pruning
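As an illustration (not from the original thread), a minimal pruning sketch using the tensorflow_model_optimization package; the tiny model, random data and 50% sparsity target are placeholder choices:

```python
import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot  # pip install tensorflow-model-optimization

# A tiny stand-in model; in practice you would prune your real model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Wrap the model so half of its weights are zeroed out during training.
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
    model,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0),
)
pruned_model.compile(optimizer="adam",
                     loss="sparse_categorical_crossentropy",
                     metrics=["accuracy"])

x = np.random.rand(64, 10).astype("float32")
y = np.random.randint(0, 2, size=(64,))

# UpdatePruningStep keeps the pruning schedule in sync during fit().
pruned_model.fit(x, y, epochs=1,
                 callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Strip the pruning wrappers before exporting or converting to TFLite.
final_model = tfmot.sparsity.keras.strip_pruning(pruned_model)
```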

2/6🧵
Quantization works by reducing the precision of the numbers used to represent a model's parameters, which by default are float32.

This results in a smaller model and faster computation.

More info here: blog.tensorflow.org/2020/04/quanti…
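A minimal sketch of post-training quantization applied during conversion, assuming you have a SavedModel on disk (the paths and file names are placeholders):

```python
import tensorflow as tf

# Convert a SavedModel with default post-training quantization.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantizes the weights
tflite_model = converter.convert()

# Write the smaller, quantized model to disk.
with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```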

3/6🧵
23 Mar
When you have your TensorFlow Model and want to use it on a mobile device, you'll need to convert it to the TFLite format.

This process can be done in two ways:
- Using the Python API
- Using a command line tool

Let's look into some more details….

1/6🧵
Why do we need to convert?

TFLite is an optimized format (FlatBuffers) for faster loading.
To keep the framework lite and fast, all the operations are optimized for mobile execution, but not all TF operations are available

The available ops: tensorflow.org/lite/guide/ops…

2/6🧵
How to convert the model?

The Python API to convert a model is straightforward and it's the recommended method of conversion

You can, during the conversion, apply some optimizations, like post-training quantization, to reduce your model's size and latency.
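A minimal sketch of the Python conversion API, using a throwaway Keras model as a stand-in for your trained one:

```python
import tensorflow as tf

# Tiny stand-in model; in practice you convert the model you trained.
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(4,)),
                             tf.keras.layers.Dense(1)])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optional: apply post-training quantization during conversion.
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Save the converted model so it can be bundled with your app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```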

3/6🧵
1 Mar
Let's start with some theory

I've been working with ML in the audio domain, and at first I couldn't understand much, but as I kept reading I managed to figure some things out.

Let me share some of the basic theory with you:

[10 minutes read]

1/n🧵
Sound is a vibration that propagates as an acoustic wave.

It has some properties:
- Frequency
- Amplitude
- Speed
- Direction

For us, Frequency and Amplitude are the important features.

en.wikipedia.org/wiki/Sound#Sou…

2/n🧵
An important aspect is that sounds are a mixture of component sinusoidal waves (waves that follow a sine curve) of different frequencies

In the equation y(t) = A * sin(2π * f * t):
- A is amplitude
- f is frequency
- t is time

The code below replicates the formula and composes a third wave from two others
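The original code image isn't included here, but a minimal NumPy sketch of the same idea could look like this (the sample rate and frequencies are arbitrary choices):

```python
import numpy as np

sample_rate = 8000                    # samples per second (arbitrary choice)
t = np.arange(0, 1, 1 / sample_rate)  # one second of time steps

def sine_wave(amplitude, frequency, t):
    # y(t) = A * sin(2*pi*f*t)
    return amplitude * np.sin(2 * np.pi * frequency * t)

wave_a = sine_wave(1.0, 440, t)  # 440 Hz component
wave_b = sine_wave(0.5, 880, t)  # 880 Hz component

# The composed sound is just the sum of its component waves.
composed = wave_a + wave_b
```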

3/n🧵
19 Jan
For developers, a good debugger and profiler are fundamental tools for their productivity.

In the ML world, TensorBoard can help you with that by enabling:

- Visualizing metrics, model, histograms of weights or biases
- Displaying images, text and audio data
- Profiling

1/5🧵
You can load TensorBoard directly in Colab using one magic command to load the extension and another to run the tool.

The nice part is that this does not require installing anything on your computer.
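The two magic commands look roughly like this (the log directory name is a placeholder):

```python
# In a Colab/Jupyter notebook cell:
%load_ext tensorboard          # load the TensorBoard notebook extension
%tensorboard --logdir logs     # start TensorBoard pointing at the log directory
```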

2/5🧵
To visualize your training data, you'll need to create a callback and pass it to the fit method.

The callback just needs the directory where the log will be written.
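A minimal sketch with a throwaway model, just to show where the callback goes (paths and sizes are placeholders):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model and data, only to illustrate the callback wiring.
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(32, 3)
y = np.random.rand(32, 1)

# The callback just needs the directory where the logs will be written.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs/fit")
model.fit(x, y, epochs=2, callbacks=[tensorboard_cb])
```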

3/5🧵
18 Jan
Machine Learning models can be classified by how much human supervision they need.

This affects the algorithms used and the types of tasks they can solve.

You can group Machine Learning models into 4 major categories:

1/5🧵 #ML #MondayMotivation
Supervised Learning is when you train a model from the input data and ALL their corresponding labels.

Examples of
- Tasks: classification and regression
- Algorithms: kNN, Linear and Logistic regression, SVM, Decision Tree, Neural Networks(*)
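As a quick illustration (scikit-learn isn't mentioned in the thread; it's just a convenient library for the sketch), a supervised kNN classifier trained on inputs and their labels:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Labelled data: features X and their corresponding labels y.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train on inputs *and* labels, then score predictions on unseen inputs.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))
```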

2/5🧵
Unsupervised Learning is when you use unlabelled data to train your model.

Examples of
- Tasks: Clustering, Anomaly Detection, Visualization and Dimension reduction, Association rule
- Algorithms: K-means, PCA, DBSCAN
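And an unsupervised sketch with K-means on made-up, unlabelled points (again using scikit-learn purely for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans

# Unlabelled data: two blobs of points, no labels given to the model.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (50, 2)),
                  rng.normal(5, 1, (50, 2))])

# K-means discovers the two clusters on its own.
kmeans = KMeans(n_clusters=2, n_init=10).fit(data)
print(kmeans.labels_[:10])
print(kmeans.cluster_centers_)
```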

3/5🧵