Progress report of 'NLPlay with Transformers'
- got introduced to Google Colab
- learned basic Python programming
- learned libraries for ML: NumPy, pandas, Matplotlib
- learned to use the NLTK library for NLP and DL
- learned the theory behind neural networks
- learned to use the PyTorch library for building classifiers
- made a handwritten-digit classifier using the MNIST dataset for practice (sketch below)
- studied numerous advancements in deep NLP in detail
- further explored the applications of deep NLP
- learned to use various types of word embeddings (sketch below)
- made a sentiment classifier using a feed-forward NN (sketch below)
for this, the IMDB movie review dataset was used
a simple bag-of-words (BoW) representation was used; reached an accuracy of 80.02%
- learned how a recurrent neural network works
- made a sentiment classifier using an RNN (sketch below)
for this, the same IMDB movie review dataset was used
reached an accuracy of 87.28%
- learned the LSTM and GRU models
- implemented these in the sentiment classifier
reached accuracies of 86.51% and 89.86%
- learned what a transformer is and how it works (sketch below)
- made a sentiment classifier using BERT (sketch below)
- reached an accuracy of 90.36%
- made a sentiment classifier using RoBERTa
- reached an accuracy of 91.02%
- learned what GPT-2 and T5 are
- learned how to generate text using transformers
- created a custom dataset
- generated text using GPT-2 (sketch below)
- analysed the output by calculating the BLEU score (sketch below)
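
The code sketches referenced above follow. First, a minimal PyTorch sketch of the MNIST practice classifier; the layer sizes, optimizer, and number of epochs are assumptions for illustration, not the exact settings used.

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # load MNIST as tensors of 28x28 grayscale images
    train_data = datasets.MNIST(root="data", train=True, download=True,
                                transform=transforms.ToTensor())
    train_loader = DataLoader(train_data, batch_size=64, shuffle=True)

    # small fully connected network: 784 -> 128 -> 10 digit classes (sizes assumed)
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(2):                      # a couple of epochs is enough for a demo
        for images, labels in train_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()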
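
A sketch of using word embeddings: a hypothetical toy vocabulary looked up in a trainable nn.Embedding table; pre-trained vectors such as Word2Vec or GloVe can be loaded into the same kind of layer.

    import torch
    import torch.nn as nn

    # hypothetical toy vocabulary; index 0 is reserved for padding
    vocab = {"<pad>": 0, "the": 1, "movie": 2, "was": 3, "great": 4}

    # trainable embedding table: one 50-dimensional vector per word (size assumed)
    embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=50, padding_idx=0)

    # look up the vectors for the sentence "the movie was great"
    token_ids = torch.tensor([[vocab["the"], vocab["movie"], vocab["was"], vocab["great"]]])
    vectors = embedding(token_ids)
    print(vectors.shape)        # torch.Size([1, 4, 50])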
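
A sketch of the bag-of-words feed-forward sentiment classifier: each review becomes a fixed-length word-count vector that a small fully connected network maps to positive/negative logits. The vocabulary size, hidden size, and the helper build_bow_vector are assumptions for illustration.

    import torch
    import torch.nn as nn

    VOCAB_SIZE = 10000                  # assumed: keep the 10k most frequent words

    def build_bow_vector(token_ids):
        """Turn a list of word indices into a bag-of-words count vector."""
        vec = torch.zeros(VOCAB_SIZE)
        for idx in token_ids:
            if idx < VOCAB_SIZE:
                vec[idx] += 1
        return vec

    # feed-forward network: BoW vector -> hidden layer -> 2 sentiment logits
    model = nn.Sequential(nn.Linear(VOCAB_SIZE, 128), nn.ReLU(), nn.Linear(128, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # one training step on a dummy batch (real training iterates over the IMDB reviews)
    bow_batch = torch.stack([build_bow_vector([1, 5, 42]), build_bow_vector([7, 7, 9])])
    labels = torch.tensor([1, 0])       # 1 = positive, 0 = negative
    loss = loss_fn(model(bow_batch), labels)
    loss.backward()
    optimizer.step()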
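
A sketch of the recurrent sentiment classifier: an embedding layer feeds a recurrent layer, and the final hidden state is classified. Replacing nn.RNN with nn.GRU gives the GRU variant (nn.LSTM additionally returns a cell state); all sizes here are assumptions.

    import torch
    import torch.nn as nn

    class RNNClassifier(nn.Module):
        def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            # swap nn.RNN for nn.GRU directly; nn.LSTM also needs its (h, c) tuple unpacked
            self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            embedded = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
            _, hidden = self.rnn(embedded)            # final hidden state: (1, batch, hidden_dim)
            return self.fc(hidden.squeeze(0))         # (batch, num_classes)

    model = RNNClassifier()
    dummy_batch = torch.randint(1, 10000, (4, 50))    # 4 reviews, 50 tokens each
    logits = model(dummy_batch)                       # shape (4, 2)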
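
A sketch of the core transformer operation, scaled dot-product self-attention, written out directly to show how it works; the tensor sizes are arbitrary.

    import math
    import torch

    def scaled_dot_product_attention(q, k, v):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # similarity of each query to each key
        weights = torch.softmax(scores, dim=-1)            # attention weights sum to 1 per query
        return weights @ v                                 # weighted sum of the values

    # self-attention over a sequence of 5 tokens with 64-dimensional representations
    x = torch.randn(1, 5, 64)
    out = scaled_dot_product_attention(x, x, x)            # same shape as x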
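
A sketch of fine-tuning BERT for sentiment classification with the Hugging Face transformers library; changing the checkpoint name to "roberta-base" gives the RoBERTa classifier. The checkpoints, learning rate, and dummy batch are assumptions for illustration.

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    checkpoint = "bert-base-uncased"        # use "roberta-base" for the RoBERTa version
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    # one training step on a tiny made-up batch (real training iterates over IMDB)
    texts = ["A wonderful, moving film.", "Two hours I will never get back."]
    labels = torch.tensor([1, 0])
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

    outputs = model(**batch, labels=labels)  # the loss is returned when labels are passed
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()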
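
A sketch of generating text with GPT-2 through the transformers library; the prompt and sampling settings are assumptions, and fine-tuning on the custom dataset is not shown.

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # encode a prompt and let the model continue it
    prompt = "The movie starts with"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    output_ids = model.generate(
        input_ids,
        max_length=50,                        # total length including the prompt
        do_sample=True,                       # sample instead of greedy decoding
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,  # silences the missing-pad-token warning
    )
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))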
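
A sketch of scoring generated text with BLEU using NLTK (already used earlier in the project); the reference and candidate sentences are made up.

    from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

    # made-up example: a reference continuation versus the model's generated text
    reference = ["the", "movie", "was", "surprisingly", "good"]
    candidate = ["the", "movie", "was", "very", "good"]

    # smoothing avoids zero scores when higher-order n-grams do not match
    smooth = SmoothingFunction().method1
    score = sentence_bleu([reference], candidate, smoothing_function=smooth)
    print(f"BLEU score: {score:.3f}")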