One term I learned when I started studying ML is "hyperparameter."

What is it?
When should I worry about it?

Let me try to clarify it...

[3 min]

1/9🧡
First, what are the parameters of an ML model?

Those are typically the weights that you end up with after training your model. Example:

If you are creating a model to solve
AX + B = Y

A and B are the parameters you'll find. They are also known as the weights of the model.
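As a sketch, here is a tiny, hypothetical training loop that finds A and B by gradient descent (the data and learning rate below are made up for illustration):

```python
# Learn the parameters A and B of Y = A*X + B by gradient descent.
# Hypothetical data generated from y = 2x + 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]

a, b = 0.0, 0.0   # the parameters (weights) we are trying to find
lr = 0.01         # the learning rate -- a hyperparameter!

for _ in range(5000):
    # gradients of the mean squared error with respect to a and b
    grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 2), round(b, 2))  # close to A=2, B=1
```

The values of `a` and `b` after training are the model's parameters; `lr` and the number of steps are hyperparameters.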

2/9🧡
Hyperparameters are the values that control the learning process.
Let's suppose we have the code in the image.

Hyperparameters could be: units (from the Dense layer), the learning_rate from the optimizer, or even the batch_size.

3/9🧡
The next question is: how do I choose the best Hyperparameters for my model?

This is where it gets tricky! Choosing them is a process of experimentation, even if you are very experienced!

You could manually try multiple different values and compare the models' metrics by hand, or...
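A manual search might look like this pure-Python sketch (the tiny model, data, and candidate learning rates are hypothetical, just to show the idea):

```python
# Hypothetical manual hyperparameter search: try a few learning rates
# on a tiny linear model and keep the one with the lowest final loss.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # y = 2x + 1

def train(lr, steps=2000):
    """Train y = a*x + b by gradient descent, return the final loss."""
    a = b = 0.0
    for _ in range(steps):
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / len(xs)
        a, b = a - lr * grad_a, b - lr * grad_b
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

results = {lr: train(lr) for lr in [0.0001, 0.001, 0.01]}
best_lr = min(results, key=results.get)
print(best_lr)  # the learning rate with the lowest final loss
```

This works, but it quickly becomes tedious as the number of hyperparameters grows.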

4/9🧡
This is where Keras Tuner can help!

blog.tensorflow.org/2020/01/hyperp…

5/9🧡
Keras Tuner is a library that can help you experiment with hyperparameters for your TF model!

You define a set of variables to try and their ranges, and the library will run multiple training trials, automatically searching for the best values for the hyperparameters.
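The real Keras Tuner API differs, but the core idea of a random search can be sketched in plain Python (the search space, trial count, and evaluate function below are hypothetical stand-ins for a real training run):

```python
import random

random.seed(0)

# Hypothetical search space, mirroring the kind of ranges you'd hand
# to a tuner (this is NOT the real Keras Tuner API, just the idea).
search_space = {
    "units": [32, 64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def evaluate(hp):
    # Stand-in for a real training run that returns a validation score.
    # This made-up score favours units=128 and learning_rate=1e-3.
    return -abs(hp["units"] - 128) - 1000 * abs(hp["learning_rate"] - 1e-3)

best_hp, best_score = None, float("-inf")
for _ in range(10):  # max_trials
    hp = {name: random.choice(values) for name, values in search_space.items()}
    score = evaluate(hp)
    if score > best_score:
        best_hp, best_score = hp, score

print(best_hp)
```

Keras Tuner wraps this loop for you and also offers smarter strategies than random search, such as Hyperband and Bayesian optimization.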

6/9🧡
You can learn more about it in this notebook: tensorflow.org/tutorials/kera…

It will show you how to use a basic setup for an image recognition model.

I highly recommend trying it out, you'll get a much deeper understanding!

7/9🧡
The hyperparameter values tried during the tuning process can also be visualized on TensorBoard for an even better understanding of the process.

You can try it here: tensorflow.org/tensorboard/hy…

8/9🧡
Hyperparameter tuning is an important part of creating the best ML models.

Understanding how to do it in a reproducible way is key, and Keras Tuner is here to help.

Do you use any other tools for this task?

9/9🧡


More from @gusthema

18 Jun
This week I've been posting about the itertools Python🐍 module.

If you want to improve your coding skills, one way is adding new tools to your toolbox

Itertools enables you to solve problems that would otherwise be absurdly hard.

[2 min]

1/7🧡
After you've learned the basics of Python, I'd suggest going deeper into collection manipulation:

• Slicing
• Comprehension
• Generators
• Iterators
• Itertools
• map/filter/zip

I've posted about all of this in the past; I can revisit it if you'd like.
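A quick sampler of a few of these tools, just to show the flavor of each:

```python
nums = [1, 2, 3, 4]

squares = [n * n for n in nums]                    # comprehension
evens = list(filter(lambda n: n % 2 == 0, nums))   # filter
doubled = list(map(lambda n: n * 2, nums))         # map
pairs = list(zip(nums, squares))                   # zip
lazy_squares = (n * n for n in nums)               # generator (lazily evaluated)

print(squares)  # [1, 4, 9, 16]
print(evens)    # [2, 4]
print(pairs)    # [(1, 1), (2, 4), (3, 9), (4, 16)]
```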

2/7🧡
This week I've explained all the functions in the itertools module.

Starting with the basic ones:
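The thread's code images aren't captured here, but among the most basic functions are the three "infinite" iterators; a quick sketch:

```python
import itertools as it

counter = it.count(10, 2)     # 10, 12, 14, ... (start, step), endless
cycler = it.cycle('AB')       # A, B, A, B, ... forever
repeater = it.repeat('x', 3)  # x, x, x (stops after 3 repetitions)

print([next(counter) for _ in range(3)])  # [10, 12, 14]
print([next(cycler) for _ in range(4)])   # ['A', 'B', 'A', 'B']
print(list(repeater))                     # ['x', 'x', 'x']
```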

3/7🧡
17 Jun
Generating combinations and permutations is usually a tricky task in programming.

In Python 🐍, the itertools module makes this much easier, less memory-intensive, and faster!

Let me tell you what I learned with them

[4.45 min]

1/13🧡
Let's suppose you want to create all possible cards of a regular deck. You need to combine:

• All 13 card ranks: A, 2 to 10, J, Q and K
• The 4 suits: ♥️♣️♦️♠️

How to generate all cards?

2/13🧡
The operation that solves this is the cartesian product:

import itertools as it

ranks = ['A'] + list(range(2, 11)) + ['J', 'Q', 'K']
suits = ['♥️', '♣️', '♦️', '♠️']

all_cards = list(it.product(ranks, suits))
>>> [('A', '♥️'), ('A', '♣️'), ('A', '♦️'), ('A', '♠️'), ...]
len(all_cards)
>>> 52

3/13🧡
15 Jun
Following up on my previous thread, let's continue taking a look at some additional itertools methods.

Some of them, as you will see, have very similar built-in versions, but the key here is: itertools works on iterables and generators, which are lazily evaluated collections.

1/14🧡
One example is islice. It does slicing, but for iterables (potentially endless collections). The main difference is that it doesn't accept negative indexes like regular slicing does.

import itertools as it

numbers = range(10)
items = list(it.islice(numbers, 2, 4))
>>> [2, 3]

2/14🧡
zip_longest vs zip
Both do the same thing: aggregate elements from multiple iterators.

The difference is that zip stops when the shortest iterator is exhausted, while zip_longest keeps going until the longest one ends, filling in the missing values with any fillvalue you want.
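A minimal comparison of the two:

```python
import itertools as it

a = [1, 2, 3]
b = ['x', 'y']

print(list(zip(a, b)))
# [(1, 'x'), (2, 'y')]            <- stops at the shortest iterator

print(list(it.zip_longest(a, b, fillvalue='-')))
# [(1, 'x'), (2, 'y'), (3, '-')]  <- pads the shortest with fillvalue
```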

3/14🧡
9 Jun
Quick⚡️ Python🐍 trick:
How do you merge two dictionaries?

[55 sec]🤯

1/6🧡
For Python 3.5 and up you can do:

d1 = {"a":1, "b":2}
d2 = {"c":3, "d":4}
d3 = {**d1, **d2}
d3 == {"a": 1, "b": 2, "c":3, "d":4}

Why?

2/6🧡
The ** operator expands the collection, so you can think of it like this:

d1 = {"a":1, "b":2}
d2 = {"c":3, "d":4}
d3 = {**d1, **d2} -> {"a":1, "b":2, "c":3, "d":4}
d3 == {"a": 1, "b": 2, "c":3, "d":4}
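Worth noting: since Python 3.9 there is also a dedicated dict merge operator, |, which does the same thing more directly:

```python
d1 = {"a": 1, "b": 2}
d2 = {"c": 3, "d": 4}

d3 = d1 | d2  # Python 3.9+: returns a new merged dict
d1 |= d2      # in-place merge, equivalent to d1.update(d2)

print(d3)  # {'a': 1, 'b': 2, 'c': 3, 'd': 4}
```

In both forms, keys present in both dicts take the value from the right-hand operand.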

3/6🧡
8 Jun
I was telling a friend that one cool feature of Python is its list slice notation.

So instead of just posting the link I decided to do a brief explanation.

[5 min]

Python's regular array indexing, as in many other languages, is a[index]:

a = [0, 1, 2, 3, 4]
a[0] == 0

1/12🧡
Python has negative indexing too:

a = [0, 1, 2, 3, 4]
a[-1] == 4
a[-2] == 3

2/12🧡
Python also enables creating sub-lists from a list, known as slices:

-> a[start_index:last_index]

a = [0, 1, 2, 3, 4]
a[1:3] == [1, 2]

start_index is inclusive
last_index is exclusive
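Beyond start and stop, a few more slice forms are worth knowing; a quick sketch:

```python
a = [0, 1, 2, 3, 4]

print(a[1:3])   # [1, 2]           start inclusive, stop exclusive
print(a[:2])    # [0, 1]           omitted start defaults to 0
print(a[2:])    # [2, 3, 4]        omitted stop defaults to len(a)
print(a[::2])   # [0, 2, 4]        an optional third number sets the step
print(a[::-1])  # [4, 3, 2, 1, 0]  a negative step reverses the list
```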

3/12🧡
30 May
I've been trying the new TensorFlow Decision Forests (TF-DF) library today and it's very good!

Not only the ease of use but also all the available metadata, documentation and integrations you get!

Let me show you some of the cool things I've learned so far...

[5 min]

1/11🧡
TensorFlow Decision Forests implements 3 algorithms:

• CART
• Random Forest
• Gradient Boosted Trees

You can get this list with tfdf.keras.get_all_models()

All of them support classification, regression and ranking tasks.

2/11🧡
CART, or Classification and Regression Trees, is a simple decision tree algorithm. 🌳

The process divides the dataset into two parts:
The first is used to grow the tree, while the second is used to prune it.

This is a good basic algorithm to learn and understand Decision Trees

3/11🧡
