This repository contains the code for Stanford Cars image classification from the in-class Kaggle challenge.
The model is EfficientNetB7 (noisy-student) trained with AutoAugment + MixUp, the SGD + Lookahead optimizer, and a one-cycle cosine-annealing learning-rate schedule, without any model ensembling or extra training data.
The final submission scores 0.95920, which places me second on the final leaderboard.
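The one-cycle cosine-annealing schedule mentioned above can be sketched as a plain Python function (a minimal illustration, not the exact schedule used in the notebooks; the `max_lr` and warm-up fraction here are assumed values):

```python
import math

def one_cycle_cosine_lr(step, total_steps, max_lr=1e-2, warmup_frac=0.1):
    """One-cycle schedule: linear warm-up to max_lr, then cosine
    annealing down to ~0 over the remaining steps."""
    warmup_steps = int(total_steps * warmup_frac)
    if step < warmup_steps:
        # Linear warm-up phase
        return max_lr * step / max(1, warmup_steps)
    # Cosine annealing phase
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return max_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

A function like this can be wired into training via a per-step (or per-epoch) learning-rate callback.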
- Ubuntu 18.04.4 LTS
- Intel(R) Xeon(R) Gold 6154 CPU @ 3.00GHz
- 1x NVIDIA Tesla V100
```sh
virtualenv .
source bin/activate
pip3 install -r requirements.txt
```
Join the competition and download the dataset.
```sh
cd CarClassifier
kaggle competitions download -c cs-t0828-2020-hw1
unzip cs-t0828-2020-hw1
python dataset.py
```
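The actual preprocessing lives in `dataset.py`; the core idea of sorting labeled images into one folder per class can be sketched as follows (a hypothetical illustration: the `organize_by_label` helper and the `id,label` CSV columns are assumptions, not the script's real interface):

```python
import csv
import shutil
from pathlib import Path

def organize_by_label(images_dir, labels_csv, out_dir):
    """Copy each training image into a subfolder named after its class,
    so Keras-style directory loaders can infer labels from folder names.

    Assumes labels_csv has a header row with columns: id,label.
    """
    out = Path(out_dir)
    with open(labels_csv, newline="") as f:
        for row in csv.DictReader(f):
            src = Path(images_dir) / f"{row['id']}.jpg"
            # Class names may contain '/', which is illegal in paths
            dst = out / row["label"].replace("/", "-")
            dst.mkdir(parents=True, exist_ok=True)
            if src.exists():
                shutil.copy2(src, dst / src.name)
```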
```
CarClassifier
+- training_data/
+- testing_data/
+- training_labels.csv
+- tmp/
+- logs/
+- cutmix_keras.py
+- dataset.py
+- EfficientNetB7.ipynb
+- Normalization_testing.ipynb
+- panda.jpg
+- ResNet50.ipynb
```
Model | Image Size | Batch Size | Methods | Test Accuracy |
---|---|---|---|---|
ResNet50 | 256 | 16 | H-flip only | 0.89 |
EfficientNetB3 | 456 | 16 | + Rotate 10 | 0.915 |
EfficientNetB7 | 456 | 4 | | 0.932 |
EfficientNetB7 | 456 | 4 | AutoAugment | 0.944 |
EfficientNetB7 | 456 | 4 | Weight decay 1e-3 to 1e-4 | 0.949 |
EfficientNetB7 | 456 | 4 | Weight decay 1e-4 to 1e-5 | 0.951 |
EfficientNetB7 | 456 | 4 | + RAdam | 0.952 |
EfficientNetB7 | 456 | 4 | + SGD + Lookahead | 0.954 |
EfficientNetB7 | 600 | 4 | + Cutout | 0.956 |
EfficientNetB7 (noisy-student) | 600 | 4 | - Cutout, + MixUp | 0.9592 |
EfficientNetB7 (noisy-student) | 600 | 4 | + Dropout 0.5 | 0.9594 |
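MixUp, which drives the jump in the last two rows, blends pairs of images and their one-hot labels with a Beta-sampled coefficient. A minimal NumPy sketch of the idea (the `alpha=0.2` default here is an assumption, not necessarily the value used in the notebooks):

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """MixUp: blend each example with a randomly chosen partner
    from the same batch.

    x: float array of shape (batch, ...), e.g. images.
    y: one-hot labels of shape (batch, num_classes).
    Returns the mixed inputs and soft labels.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # one mixing coefficient per batch
    idx = rng.permutation(len(x))         # shuffled partners
    x_mix = lam * x + (1.0 - lam) * x[idx]
    y_mix = lam * y + (1.0 - lam) * y[idx]
    return x_mix, y_mix
```

The soft labels keep the loss well-defined with standard categorical cross-entropy, since each mixed label still sums to 1.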