I started as an Android developer 📱🤖 and today I'm an ML developer 🧠🤖.
I learned a lot of technical stuff (A LOT!), but there's much more than that.
Let me tell you some of the things I learned...
1/11 🧵
Being the big fish 🐳 in a small pond is cool, but you may run out of room to grow.
Starting at this job meant becoming a tiny fish 🐟 in a big pond!
And here is where the growth opportunities are!
2/11 🧵
Working with people smarter than you can be very hard on your ego!
There's a lot of impostor syndrome in the beginning.
But it's also a big chance to learn what these smarter people do, adapt it, and grow into a bigger fish 🦈.
3/11 🧵
Diversity is very important for everyone's growth! Different perspectives from different backgrounds are also key to project success!
I've learned so many things about different cultures and backgrounds, and this has made me, at the very least, a more empathetic person overall.
4/11 🧵
"Treat people the way they want to be treated"
This one took me some time to fully understand, since I'd always been taught something different!
This is key for a better work environment!
5/11 🧵
Work is not always a sprint; it's mostly a marathon!
A daily 1% improvement is better in the long run, and better for your own mental health!
6/11 🧵
Documenting what you are doing and communicating it properly goes a long way! Even more so in this work-from-home era.
I try to have the answers to "What am I working on?" and "Why?" ready at all times!
With this, it's easier to get help and also to see the big picture.
7/11 🧵
Helping new hires adapt to the job is very important!
They come with a fresh perspective, full of ideas, enthusiasm, and energy, which sometimes wear off when you've been working in the same place for a long time!
It's a win-win situation (for you and for them)!
8/11 🧵
The networking aspect of your job is very important!
Working with people outside of your direct team, when possible, is a great opportunity to meet new people and learn new things.
These same people, if you did good work, will remember it and open new doors for you!
9/11 🧵
Self development sometimes needs extra time.
Moving from Android dev to ML required me to study outside of my working hours.
Was it worth it?
DEFINITELY!!
You are the best investment you can make!
10/11 🧵
Do you have any other tips to add?
If you have any questions just leave them in the comments!
And also share it so that more people can learn from it!
11/11 🧵
Quick update: this is by far the highest engagement I've ever gotten! It made everything else look like a flat line!
I have a theory about why it happened, and I'll share it in the future!
Keep sharing to reach more people and help them!
Follow me for more of this content!
12/11 🧵
• • •
Sometimes you need to build a machine learning model that cannot be expressed with the Sequential API.
For those moments, when you need a more complex model with multiple inputs and outputs or with residual connections, the Functional API is what you need!
[2.46 min]
1/8 🧵
The Functional API is more flexible than the Sequential API.
The easiest way to understand it is to visualize the same model created with both the Sequential and the Functional API.
2/8 🧵
You can think of the Functional API as a way to create a Directed Acyclic Graph (DAG) of layers, while the Sequential API can only create a stack of layers.
The Functional API is also known as the Symbolic or Declarative API.
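To make the comparison concrete, here is a minimal sketch (assuming TensorFlow 2.x and `tf.keras`; the layer sizes are arbitrary) of the same small model built with both APIs, plus a residual connection that only the Functional API can express:

```python
import tensorflow as tf
from tensorflow import keras

# Sequential API: a plain stack of layers.
seq_model = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
seq_model.build(input_shape=(None, 32))

# Functional API: layers are called on tensors, building a graph of layers.
inputs = keras.Input(shape=(32,))
x = keras.layers.Dense(64, activation="relu")(inputs)
outputs = keras.layers.Dense(10, activation="softmax")(x)
func_model = keras.Model(inputs=inputs, outputs=outputs)

# Same architecture, so the parameter counts match.
assert seq_model.count_params() == func_model.count_params()

# Only the Functional API can express a DAG, e.g. a residual connection:
h = keras.layers.Dense(64, activation="relu")(x)
res = keras.layers.Add()([x, h])  # skip connection: not possible in Sequential
res_out = keras.layers.Dense(10, activation="softmax")(res)
res_model = keras.Model(inputs=inputs, outputs=res_out)
```

The residual model reuses the intermediate tensor `x` in two branches, which is exactly the kind of DAG a layer stack cannot represent.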
One very interesting task in the NLP field is text generation.
There are very advanced techniques and a lot of research on it and even business based solely on it!
But how does it work?
[7.47 min]
[I guarantee it's a much better read than doom-scrolling!!!]
1/11 🧵
Let's think: what would a model have to do to generate text?
The rationale is that, as humans, we form sentences by trying to create a sequence of words that makes sense.
The less random this sequence looks, the better the output text is, and the closer it is to human-like writing.
2/11 🧵
Here is where ML can help.
A model should learn how to combine words in the best way possible.
The simplest way to teach this is: given a sentence, hide the last word and let the model try to guess it.
The loss function measures how good the model's guess is.
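The "hide the last word and guess it" idea can be sketched with a toy bigram model, counting which word usually follows which. This is NOT a real text generation model (those use neural networks such as RNNs or Transformers), and the tiny corpus below is made up, but the training objective is the same: predict the next word from the words before it.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus for illustration.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat ate the fish",
]

# Count which word follows each word across the corpus.
following = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1

def guess_next(word):
    """The model's 'prediction': the most frequent follower of `word`."""
    return following[word].most_common(1)[0][0]

# Hide the last word of a sentence and let the model guess it.
hidden = "the dog sat on the".split()[-1]
print(guess_next(hidden))  # guesses a word that often follows "the"
```

In a real model, a loss function would score how much probability the model assigned to the true hidden word, and training would nudge the weights to raise that probability.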
Sometimes you need to create your own model for your specific data corpus (e.g., legal, scientific, or medical texts).
To create your own model, AutoML Natural Language can help you:
2/4 🧵
If you want to build everything from scratch, then you'll need:
• a language embedding (like BERT, ELMo, or USE), and #TFHub has all you need
• a dataset, and this repo github.com/juand-r/entity… can help you find one
Encoding text as numbers is a very important part of NLP: the better this encoding, the better the possible results!
Word embeddings work, but they don't carry the full context of the sentence.
This is where BERT comes in
But what is BERT?
1/9 🧵
When we use plain word embeddings, both sentences
• They are running a test
• They are running a company
will have very similar embeddings, but the meanings of the two sentences are very different. Without this context, a model using this encoding will be blind to the difference.
2/9 🧵
This is where Bidirectional Encoder Representations from Transformers (BERT) comes into play!
It is a Transformer-based network, created in 2018, that takes into account the context in which each word occurs. For the previous example, it gives very different embeddings.
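Here is a toy sketch of that contrast. This is NOT real BERT: the hash below merely stands in for a learned embedding table, and the "contextual" function just blends in the rest of the sentence, to show why the same word can end up with different vectors in different sentences.

```python
import hashlib

def static_embed(word):
    """Static word embedding: one fixed vector per word, context ignored.
    (A hash stands in for a learned embedding table.)"""
    digest = hashlib.md5(word.encode()).digest()
    return [b / 255 for b in digest[:4]]

def contextual_embed(words, i):
    """Toy 'contextual' embedding: blend the word's own vector with the
    sentence average, so surrounding words influence the result."""
    sentence_vecs = [static_embed(w) for w in words]
    avg = [sum(dims) / len(words) for dims in zip(*sentence_vecs)]
    own = static_embed(words[i])
    return [(o + a) / 2 for o, a in zip(own, avg)]

test_sent = "they are running a test".split()
company_sent = "they are running a company".split()

# Static: "running" gets the exact same vector in both sentences.
assert static_embed("running") == static_embed("running")

# "Contextual": the vectors for "running" differ, because the sentences do.
assert contextual_embed(test_sent, 2) != contextual_embed(company_sent, 2)
```

Real BERT does something far richer (self-attention over the whole sentence, in both directions), but the effect it achieves is the one sketched here: the vector for a word depends on the words around it.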