THREAD: How does machine learning 🤖 differ from regular programming? 🧑💻
In both, we tell the computer 𝗲𝘅𝗮𝗰𝘁𝗹𝘆 what to do.
But there is one important difference...
2/ In regular programming, we describe each step the computer will take.
In machine learning, we write a program where the computer can alter some parameters based on the training examples.
How does this work?
3/ Our model has many tiny knobs, known as weights or parameters, that control how the program behaves.
We show the computer a lot of examples with correct labels.
Here is how this can play out...
4/ "Computer, this here is a dog".
The program looks at its output: "oh, to me this looks only slightly like a dog, but let me tweak my parameters to improve my performance next time".
This process is called training. Many parameters get updated many, many times very quickly.
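The knob-tweaking described above can be sketched as gradient descent on a single parameter. This is a toy illustration of the idea (not the actual training loop of any library):

```python
# Toy illustration: the computer adjusts one "knob" (weight w)
# so that w * x gets closer to the correct label y.
def train(examples, lr=0.01, steps=100):
    w = 0.0  # the knob starts at an arbitrary value
    for _ in range(steps):
        for x, y in examples:
            pred = w * x          # the program's current guess
            error = pred - y      # how wrong the guess was
            w -= lr * error * x   # tweak the knob to do better next time
    return w

# Learn that the label is roughly 2x the input.
w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(round(w, 2))  # close to 2.0
```

Real models do exactly this, just with millions of knobs updated in parallel.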
5/ Once the training is done, we get a computer program! One with many parameters that performs many simple operations.
A program that we can save, move around and run on new data.
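Saving a trained program really is that mundane. In plain Python the trained parameters can be written to disk and reloaded later (with PyTorch you would use torch.save / torch.load instead); a minimal sketch with a hypothetical trained weight:

```python
import json, os, tempfile

w = 1.98  # a trained parameter value (hypothetical)

# Save the trained program's parameters...
path = os.path.join(tempfile.gettempdir(), "model.json")
with open(path, "w") as f:
    json.dump({"w": w}, f)

# ...move the file anywhere, then load and run on new data.
with open(path) as f:
    w_loaded = json.load(f)["w"]

new_input = 5.0
print(w_loaded * new_input)  # a prediction on data the model never saw
```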
6/ If you find this as fascinating as I do, do check out this @fastdotai lecture where @jeremyphoward covers all of the above, and more, in greater detail!
THREAD: Can you start learning cutting-edge deep learning without specialized hardware? 🤖
In this thread, we will train an advanced Computer Vision model on a challenging dataset. 🐕🐈 Training completes in 25 minutes on my 3-year-old Ryzen 5 CPU.
Let me show you how...
2/ We will train on the challenging Oxford-IIIT Pet Dataset.
It consists of 37 classes, with very few examples (around 200) per class. These breeds are hard to tell apart for machines and humans alike!
Such problems are called fine-grained image classification tasks.
3/ These are all the lines of code that we will need!
Our model trains to a very good accuracy of 92%! This is across 37 classes!
How many people do you know who would be as good at telling dog and cat breeds apart?
2/ In 2016 Sarada founded the Perth ML Group to help others learn.
How can the community support you? 🤗
It can...
✅ help you set up your environments 🧑💻
✅ provide technically-sound answers to challenging questions 💡
✅ make learning more fun! 🥳
3/ What are some tips for community participation?
✅ explaining things to others will help you learn 🦉
✅ it's okay to be anxious about sharing your answers publicly - DMs are always an option 📨
✅ experiment with various approaches and learn in a way that suits you best 💡
This is how little code it takes to implement a siamese net using @fastdotai and @pytorch.
I share this because I continue to be amazed.
Here is a refactored version that will be easier to change.
The models above were my naive adaptations of the simple siamese network concept from cs.utoronto.ca/~gkoch/files/m… (screenshot on the left below) to a CNN setting.
On the right is the network from the "Learning Deep Convolutional Feature Hierarchies" section, but using pretrained models.
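The screenshots aren't preserved here, but the core idea is small enough to sketch in PyTorch. This is a minimal illustration of a siamese network (layer sizes are made up), not the code from the images:

```python
import torch
import torch.nn as nn

class SiameseNet(nn.Module):
    """Both inputs pass through the SAME encoder (shared weights);
    a small head scores how similar the two embeddings are."""
    def __init__(self, emb_size=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, emb_size))
        # Score the absolute difference of the two embeddings,
        # following the idea in Koch et al.'s siamese-network paper.
        self.head = nn.Linear(emb_size, 1)

    def forward(self, x1, x2):
        e1, e2 = self.encoder(x1), self.encoder(x2)
        return self.head((e1 - e2).abs())  # logit: same class or not?

net = SiameseNet()
a, b = torch.randn(4, 3, 64, 64), torch.randn(4, 3, 64, 64)
print(net(a, b).shape)  # one similarity logit per pair
```

The weight sharing is what makes it "siamese": there is only one encoder, applied twice.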
Now that the Quickdraw! competition is over, I added my solution to the repository github.com/radekosmulski/…. It is a model consisting of the outputs of a ResNet-50, an Inception-v4, and an RNN, along with country-code embeddings, fed to a fully connected classifier. The most interesting aspect of the architecture is how little code wiring everything together required, thanks to the @fastdotai lib. I replaced the classifier during training (while retaining the other weights), which the @fastdotai / @pytorch combo made very easy to accomplish.
In some sense there is nothing particularly interesting about my solution above and beyond the basic motions of building a model. For various reasons I didn't get to the point of applying competition-specific insights. The model is also only partially trained.
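Swapping the classifier mid-training while keeping the rest of the weights is straightforward in PyTorch. A hedged sketch with stand-in modules (the layer and class sizes are made up, not the competition code):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real model: a trained feature
# extractor ("backbone") with a classifier head on top.
backbone = nn.Sequential(nn.Linear(10, 64), nn.ReLU())
head = nn.Linear(64, 50)  # 50 is a made-up class count
model = nn.Sequential(backbone, head)

# Mid-training: swap in a fresh classifier while retaining
# every other weight in the model.
old_weight = backbone[0].weight.clone()
model[1] = nn.Linear(64, 50)  # new head, freshly initialized

# The backbone parameters are untouched by the swap.
assert torch.equal(model[0][0].weight, old_weight)
```

Because PyTorch modules are just Python objects, replacing one submodule leaves every other parameter tensor in place.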
The winning solution was a CNN ensemble done using LightGBM. It also