- Divide the training data into folds.
- Train a bunch of models: M1, M2, ..., Mn.
- Create full training-set predictions (using out-of-fold training) and test-set predictions with all these models. 2/4
- Everything till here is Level 1 (L1).
- Use the fold predictions from these models as features for another model.
This is now a Level 2 (L2) model.
- Use the same folds as before to train this L2 model. 3/4
- Now create OOF (out-of-fold) predictions on the training set and predictions on the test set.
- Now you have L2 predictions for the training data and also the final test-set predictions. 4/4
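The whole flow above can be sketched roughly like this. This is a minimal, illustrative sketch only, assuming scikit-learn models, binary classification, and synthetic data; the `get_oof` helper and the specific models are my own placeholders, not the exact pipeline from the thread:

```python
# Minimal stacking sketch (illustrative; models, helper and data are assumptions).
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, train_test_split

# Synthetic stand-in data so the sketch is self-contained.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)


def get_oof(model, X_tr, y_tr, X_te, n_splits=5, seed=42):
    """Out-of-fold predictions on the training set, fold-averaged predictions on the test set."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    oof = np.zeros(len(X_tr))
    test_preds = np.zeros((n_splits, len(X_te)))
    for i, (trn_idx, val_idx) in enumerate(kf.split(X_tr)):
        m = clone(model)
        m.fit(X_tr[trn_idx], y_tr[trn_idx])
        oof[val_idx] = m.predict_proba(X_tr[val_idx])[:, 1]  # OOF train predictions
        test_preds[i] = m.predict_proba(X_te)[:, 1]          # test predictions per fold
    return oof, test_preds.mean(axis=0)


# Level 1: each base model M1..Mn gives OOF train predictions and test predictions.
l1_models = [LogisticRegression(max_iter=1000), RandomForestClassifier(n_estimators=200)]
l1_train, l1_test = zip(*(get_oof(m, X_train, y_train, X_test) for m in l1_models))

# Level 2: the fold predictions from the L1 models become features for another model,
# trained with the same folds and the same OOF scheme.
X_train_l2, X_test_l2 = np.column_stack(l1_train), np.column_stack(l1_test)
l2_oof, final_test_preds = get_oof(LogisticRegression(max_iter=1000),
                                   X_train_l2, y_train, X_test_l2)
```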
Have you had trouble, or are you having trouble, arranging your machine learning projects? This thread should give you some idea of how to arrange machine learning / deep learning projects. See the folder structure: 1/6 👇
input/: This folder contains all the input files and data for your machine learning project. If you are working on NLP projects, you can keep your embeddings here. If you are working on image projects, all images go to a subfolder inside this folder. 2/6
src/: We will keep all the Python scripts associated with the project here. If I talk about a Python script, i.e. any *.py file, it is stored in the src folder. 3/6
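If you want to bootstrap this layout programmatically, here is a tiny sketch using Python's pathlib. It only covers the two folders described so far; the project name is hypothetical, and the remaining folders from the thread are not shown here:

```python
# Illustrative only: create the project skeleton described so far.
# "my_ml_project" is a hypothetical name; later tweets add more folders.
from pathlib import Path

project = Path("my_ml_project")
for folder in ("input", "src"):
    (project / folder).mkdir(parents=True, exist_ok=True)
```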
In this thread, I will show you how to train a deep learning-based sentiment classification model using BERT. First, you need the IMDb dataset; you can grab it from kaggle.com/lakshmi25npath…. Save this as imdb.csv. 👇
1/5
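As a starting point, here is a minimal sketch of loading the CSV and tokenizing a single review with Hugging Face's transformers library. The column names ("review", "sentiment") and the "bert-base-uncased" checkpoint are assumptions on my part, not details confirmed in this part of the thread:

```python
# Illustrative sketch: read imdb.csv and tokenize one review with BERT.
# Column names below are assumptions about the CSV layout.
import pandas as pd
import transformers

df = pd.read_csv("imdb.csv")
# Map sentiment labels to integers (assuming "positive"/"negative" values).
df["sentiment"] = df["sentiment"].map({"positive": 1, "negative": 0})

tokenizer = transformers.BertTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer.encode_plus(
    df["review"].iloc[0],
    max_length=512,
    truncation=True,
    padding="max_length",
    return_tensors="pt",
)
print(encoded["input_ids"].shape)  # e.g. torch.Size([1, 512])
```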