Code and slides to accompany the online series of webinars: https://data4sci.com/nlp-with-pytorch by Data For Science.
Natural language lies at the heart of current developments in Artificial Intelligence, User Interaction, and Information Processing. The combination of unprecedented corpora of written text produced by social media and the widespread availability of computational power has led to growing interest in modern NLP tools built on state-of-the-art Deep Learning techniques.
In this course, participants are introduced to the fundamental concepts and algorithms of Natural Language Processing (NLP) through an in-depth walkthrough of examples built with the PyTorch deep learning framework. Applications to real datasets are explored in detail.
- One-Hot Encoding (a small text-representation sketch follows the outline below)
- TF-IDF and Stemming
- Stopwords
- N-grams
- Working with Word Embeddings
- PyTorch review (a minimal training-loop sketch follows the outline below)
- Activation Functions
- Loss Functions
- Training procedures
- Network Architectures
- Feed Forward Networks
- Convolutional Neural Networks
- Applications
- Motivations
- Skip-gram and Continuous Bag of Words (see the embedding sketch below)
- Transfer Learning
- Recurrent Neural Networks (see the LSTM sketch below)
- Gated Recurrent Unit
- Long Short-Term Memory
- Encoder-Decoder Models
- Text Generation
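
As a rough illustration of the text-representation topics (one-hot encoding, TF-IDF, stopwords), here is a minimal sketch in Python. The toy corpus, vocabulary, and `one_hot` helper are made up for this example and are not taken from the course notebooks; the TF-IDF part uses scikit-learn's `TfidfVectorizer` with its built-in English stopword list.

```python
import torch
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus; these sentences are made up for the illustration
corpus = ["the cat sat on the mat", "the dog ate my homework"]

# One-hot encoding: map each unique token to an index, then to a 0/1 vector
vocab = {tok: i for i, tok in enumerate(sorted({t for doc in corpus for t in doc.split()}))}

def one_hot(sentence: str) -> torch.Tensor:
    """Return a (num_tokens, vocab_size) matrix with a single 1 per row."""
    ids = torch.tensor([vocab[t] for t in sentence.split()])
    return torch.nn.functional.one_hot(ids, num_classes=len(vocab)).float()

print(one_hot("the cat sat"))

# TF-IDF with stopword removal via scikit-learn's English stopword list
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(corpus)            # sparse (n_docs, n_terms) matrix
print(tfidf.get_feature_names_out())
print(X.toarray())
```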
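For the PyTorch review, activation functions, loss functions, and training procedures, the sketch below shows one possible minimal training loop for a feed-forward text classifier. The dimensions, random data, and hyperparameters are placeholders, not values from the course material.

```python
import torch
import torch.nn as nn

# Placeholder dimensions and data; real values come from the dataset at hand
vocab_size, hidden_dim, num_classes = 1000, 64, 2
X = torch.randn(32, vocab_size)            # e.g. a batch of TF-IDF vectors
y = torch.randint(0, num_classes, (32,))   # integer class labels

# A simple feed-forward classifier: Linear -> ReLU -> Linear
model = nn.Sequential(
    nn.Linear(vocab_size, hidden_dim),
    nn.ReLU(),
    nn.Linear(hidden_dim, num_classes),
)

loss_fn = nn.CrossEntropyLoss()            # combines LogSoftmax and NLLLoss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):
    optimizer.zero_grad()                  # reset accumulated gradients
    logits = model(X)                      # forward pass
    loss = loss_fn(logits, y)              # compare predictions with labels
    loss.backward()                        # backpropagate
    optimizer.step()                       # update the weights
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```

The same loop structure carries over to the other architectures in the outline; only the model definition and the data loading change.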
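The skip-gram idea can be sketched as an `nn.Embedding` layer trained to predict a context word from a center word. The `SkipGram` class, toy index pairs, and hyperparameters below are illustrative assumptions rather than the course's implementation, which in practice would build pairs from a sliding window over a real corpus and typically use negative sampling.

```python
import torch
import torch.nn as nn

# Toy (center, context) index pairs; placeholders for pairs extracted
# from a sliding window over a real corpus
pairs = torch.tensor([[0, 1], [1, 0], [1, 2], [2, 1]])
vocab_size, embed_dim = 10, 8

class SkipGram(nn.Module):
    """Predict a context word from a center word via a learned embedding."""
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # center-word vectors
        self.out = nn.Linear(embed_dim, vocab_size)       # scores over the vocabulary

    def forward(self, center):
        return self.out(self.embed(center))

model = SkipGram(vocab_size, embed_dim)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(100):
    optimizer.zero_grad()
    logits = model(pairs[:, 0])            # center words
    loss = loss_fn(logits, pairs[:, 1])    # predict their context words
    loss.backward()
    optimizer.step()

# After training, the rows of model.embed.weight are the learned word vectors
```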
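Finally, a minimal LSTM-based classifier along the lines discussed in the recurrent-network lessons. The sizes and the `LSTMClassifier` name are placeholders; the same pattern applies to GRUs by swapping `nn.LSTM` for `nn.GRU`.

```python
import torch
import torch.nn as nn

# Placeholder sizes; a real model would use the corpus vocabulary
vocab_size, embed_dim, hidden_dim, num_classes = 1000, 50, 128, 2

class LSTMClassifier(nn.Module):
    """Embed token ids, run them through an LSTM, classify from the last hidden state."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                 # (batch, seq_len) of int ids
        embedded = self.embed(token_ids)          # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)      # hidden: (1, batch, hidden_dim)
        return self.fc(hidden[-1])                # logits: (batch, num_classes)

model = LSTMClassifier()
logits = model(torch.randint(0, vocab_size, (4, 12)))  # a fake batch of 4 sequences
print(logits.shape)                                     # torch.Size([4, 2])
```

Classifying from the final hidden state is one common design choice; encoder-decoder and text-generation models instead consume the full sequence of hidden states.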