NLPlay with Transformers

Progress report of the SoC project 'NLPlay with Transformers'


Week-1

  • Introduction to Google Colab
  • Learned basic Python programming
  • Learned libraries for ML: NumPy, pandas, Matplotlib

Week-2

  • Learned to use the NLTK library for NLP and DL
  • Learned the theory behind neural networks
  • Learned to use the PyTorch library for building classifiers (see the sketch below)
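
A minimal sketch of the kind of tooling covered in Week-2, assuming NLTK for tokenization and PyTorch for a small classifier; the sample sentence, layer sizes, and class names here are illustrative, not taken from the project.

```python
# Illustrative sketch: NLTK tokenization plus a tiny PyTorch classifier.
import nltk
import torch
import torch.nn as nn

nltk.download("punkt", quiet=True)      # tokenizer data used by word_tokenize
nltk.download("punkt_tab", quiet=True)  # needed by newer NLTK releases
tokens = nltk.word_tokenize("NLPlay with Transformers is a SoC project.")
print(tokens)

class TinyClassifier(nn.Module):
    """Two-layer feed-forward network mapping a feature vector to class logits."""
    def __init__(self, in_dim=100, hidden=64, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

model = TinyClassifier()
logits = model(torch.randn(4, 100))     # batch of 4 dummy feature vectors
print(logits.shape)                     # torch.Size([4, 2])
```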

Week-3

  • Built a digit classifier on the MNIST dataset for practice
  • Studied recent advancements in deep NLP in detail
  • Further explored the applications of deep NLP
  • Learned to use various types of word embeddings
  • Built a sentiment classifier using a feed-forward neural network;
    the IMDB movie review dataset was used for this task,
    with a simple bag-of-words (BOW) representation, reaching an accuracy of 80.02% (see the sketch below)
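
A rough sketch of the Week-3 setup, assuming scikit-learn's CountVectorizer for the bag-of-words features and a small feed-forward network in PyTorch; the placeholder reviews, hyperparameters, and toy training loop are stand-ins for the actual IMDB pipeline, not the project's exact code.

```python
# Illustrative sketch: bag-of-words features fed into a feed-forward net.
import torch
import torch.nn as nn
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder reviews; the real project used the IMDB movie review dataset.
texts = ["a wonderful, moving film", "boring plot and wooden acting",
         "great performances throughout", "a complete waste of time"]
labels = torch.tensor([1, 0, 1, 0])            # 1 = positive, 0 = negative

vectorizer = CountVectorizer(max_features=5000)
X = torch.tensor(vectorizer.fit_transform(texts).toarray(), dtype=torch.float32)

class BowClassifier(nn.Module):
    def __init__(self, vocab_size, hidden=128):
        super().__init__()
        self.fc1 = nn.Linear(vocab_size, hidden)
        self.fc2 = nn.Linear(hidden, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = BowClassifier(X.shape[1])
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(10):                         # toy training loop
    optimizer.zero_grad()
    loss = criterion(model(X), labels)
    loss.backward()
    optimizer.step()
print("final loss:", loss.item())
```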

Post mid-sem week (Week-4)

  • Learned how a recurrent neural network (RNN) works
  • Built a sentiment classifier using an RNN;
    the same IMDB movie review dataset was used,
    reaching an accuracy of 87.28%
  • Learned the LSTM and GRU models
  • Implemented both in the sentiment classifier,
    reaching accuracies of 86.51% and 89.86% (see the sketch below)
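
A sketch of the Week-4 recurrent classifiers; the vocabulary size, embedding size, and hidden size are assumed values, and the same module covers the RNN, LSTM, and GRU variants by swapping the recurrent layer.

```python
# Illustrative sketch: one classifier skeleton for RNN / LSTM / GRU.
import torch
import torch.nn as nn

class RecurrentClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=100, hidden=128,
                 n_classes=2, rnn_cls=nn.LSTM):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = rnn_cls(embed_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        out, _ = self.rnn(emb)                 # (batch, seq_len, hidden)
        return self.fc(out[:, -1, :])          # classify from the last time step

# Same interface for all three recurrent variants tried in Week-4.
for cls in (nn.RNN, nn.LSTM, nn.GRU):
    model = RecurrentClassifier(rnn_cls=cls)
    logits = model(torch.randint(0, 20000, (4, 50)))  # 4 dummy sequences of 50 tokens
    print(cls.__name__, logits.shape)
```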

Week-5

  • Learned what a Transformer is and how it works (see the self-attention sketch below)
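
A minimal sketch of scaled dot-product self-attention, the core operation inside a Transformer layer; real models add multi-head projections, residual connections, layer normalization, and feed-forward blocks, which are omitted here.

```python
# Illustrative sketch: scaled dot-product self-attention.
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """x: (batch, seq_len, d_model); returns attention-weighted values."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    weights = torch.softmax(scores, dim=-1)    # each position attends to all positions
    return weights @ v

d_model = 16
x = torch.randn(2, 5, d_model)                 # 2 sequences of 5 tokens
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([2, 5, 16])
```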

Week-6

  • Built a sentiment classifier using BERT
  • Reached an accuracy of 90.36% (see the sketch below)
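
A sketch of a BERT-based sentiment classifier, assuming the Hugging Face Transformers library and the bert-base-uncased checkpoint; the fine-tuning loop and hyperparameters that produced 90.36% are not reproduced here.

```python
# Illustrative sketch: BERT with a 2-class classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)         # 2 labels: negative / positive

batch = tokenizer(["what a great movie", "utterly dull"],
                  padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits
print(logits.argmax(dim=-1))                   # predicted classes (head untrained here)
```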

Week-7

  • Built a sentiment classifier using RoBERTa
  • Reached an accuracy of 91.02% (see the sketch below)
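
The RoBERTa classifier follows the same pattern as the BERT sketch with the checkpoint swapped; the roberta-base name below is an assumption about the setup, not necessarily the project's exact choice.

```python
# Illustrative sketch: same pipeline as BERT, RoBERTa checkpoint instead.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2)              # fine-tune on IMDB for sentiment
```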

Week-8

  • Studied what GPT-2 and T5 are
  • Learned how to generate text using Transformers
  • Created a custom dataset
  • Generated text using GPT-2
  • Analysed the output by calculating the BLEU score (see the sketch below)
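
A sketch of the Week-8 pipeline, assuming Hugging Face's GPT-2 checkpoint for generation and NLTK's sentence_bleu for scoring; the prompt, sampling settings, and reference text are illustrative placeholders, not the custom dataset used in the project.

```python
# Illustrative sketch: generate text with GPT-2, then score it with BLEU.
from transformers import AutoTokenizer, AutoModelForCausalLM
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The movie was"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20,
                            do_sample=True, top_k=50,
                            pad_token_id=tokenizer.eos_token_id)
generated = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(generated)

# BLEU compares the generated tokens against one or more reference texts.
reference = "The movie was surprisingly good and well acted".split()
candidate = generated.split()
score = sentence_bleu([reference], candidate,
                      smoothing_function=SmoothingFunction().method1)
print("BLEU:", score)
```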
