Deep Learning
This section is a collection of resources about deep learning.
Online Courses
- Geoffrey Hinton, Neural Networks for Machine Learning (2012): https://www.cs.toronto.edu/~hinton/coursera_lectures.html
- Andrew Ng, Deep Learning Specialization: https://www.coursera.org/specializations/deep-learning
- fast.ai Courses: https://www.fast.ai/
- UCL x DeepMind Lectures: https://www.youtube.com/playlist?list=PLqYmG7hTraZCDxZ44o4p3N5Anz3lLRVZF
- NYU Deep Learning course (PyTorch): https://atcold.github.io/pytorch-Deep-Learning/, https://www.youtube.com/playlist?list=PLLHTzKZzVU9eaEyErdV26ikyolxOsz6mq
Theory
- Deep Learning (Goodfellow et al., 2016): https://www.deeplearningbook.org/
- Grokking Deep Learning (Andrew Trask): https://www.manning.com/books/grokking-deep-learning
Applications
- Deep Learning with Python (François Chollet): https://www.amazon.co.uk/Deep-Learning-Python-Francois-Chollet/dp/1617294438
Image Classification
InceptionV3
- A Simple Guide to the Versions of the Inception Network
- Review: Inception-v3 — 1st Runner Up (Image Classification) in ILSVRC 2015
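For a hands-on complement to the articles above, here is a minimal sketch of loading a pretrained Inception-v3 from torchvision for inference. The weights enum follows recent torchvision releases (older versions use `pretrained=True`), and the random tensor only stands in for a real 299x299 image.

```python
# Minimal Inception-v3 inference sketch (torchvision >= 0.13 style API).
import torch
from torchvision import models

weights = models.Inception_V3_Weights.DEFAULT
model = models.inception_v3(weights=weights)
model.eval()

# Inception-v3 expects 299x299 inputs; a random tensor stands in for an image here.
dummy = torch.randn(1, 3, 299, 299)
with torch.no_grad():
    logits = model(dummy)
print(logits.shape)  # torch.Size([1, 1000]) (1000 ImageNet classes)
```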
Natural Language Processing
- Natural Language Processing is Fun!
- A Practitioner's Guide to Natural Language Processing (Part I) — Processing & Understanding Text
Text Models
RNNs (Recurrent Neural Networks) & LSTMs (Long Short-Term Memory)
- Understanding RNN and LSTM
- Recurrent Neural Networks and LSTM explained
- Recurrent Neural Networks
- Report on Text Classification using CNN, RNN & HAN
- Generating text using a Recurrent Neural Network
- Sentence Prediction Using a Word-level LSTM Text Generator — Language Modeling Using RNN
- Multi-Class Text Classification with LSTM
- Illustrated Guide to LSTM’s and GRU’s: A step by step explanation
- The magic of LSTM neural networks
- Video/Course: Long Short-Term Memory (LSTM)
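As a companion to the readings above, here is a minimal word-level LSTM text classifier sketch in PyTorch. The vocabulary size, dimensions, and number of classes are illustrative assumptions, not values from any of the linked articles.

```python
# Minimal word-level LSTM classifier sketch in PyTorch (illustrative sizes).
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256, num_classes=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)
        _, (h_n, _) = self.lstm(embedded)   # h_n: (1, batch, hidden_dim)
        return self.fc(h_n[-1])             # logits: (batch, num_classes)

model = LSTMClassifier()
dummy_batch = torch.randint(0, 10_000, (8, 20))  # 8 sequences of 20 tokens
print(model(dummy_batch).shape)                  # torch.Size([8, 4])
```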
Term Frequency - Inverse Document Frequency (TF-IDF)
- TF-IDF from scratch in python on real world dataset.
- What is TF-IDF in Feature Engineering?
- TF IDF | TFIDF Python Example
- How to process textual data using TF-IDF in Python
- TF-IDF/Term Frequency Technique: Easiest explanation for Text classification in NLP using Python (Chatbot training on words)
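The core computation behind these articles fits in a few lines. Below is a from-scratch sketch on a toy corpus; the whitespace tokenisation and the plain log IDF are simplifications, and real pipelines usually use a library such as scikit-learn's TfidfVectorizer.

```python
# From-scratch TF-IDF sketch on a toy corpus (illustrative only).
import math

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are friends",
]
tokenized = [d.split() for d in docs]

def tf(term, doc_tokens):
    # Term frequency: raw count normalised by document length.
    return doc_tokens.count(term) / len(doc_tokens)

def idf(term, corpus):
    # Inverse document frequency: log(N / number of docs containing the term).
    n_containing = sum(1 for doc in corpus if term in doc)
    return math.log(len(corpus) / n_containing)

def tfidf(term, doc_tokens, corpus):
    return tf(term, doc_tokens) * idf(term, corpus)

for doc in tokenized:
    print({t: round(tfidf(t, doc, tokenized), 3) for t in set(doc)})
```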
Word Embeddings
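A quick way to get a feel for word embeddings is to train a small Word2Vec model on a toy corpus. The sketch below uses gensim and assumes its 4.x API (where the embedding size argument is `vector_size`); the corpus and hyperparameters are made up for illustration.

```python
# Toy Word2Vec training sketch (assumes gensim 4.x: pip install gensim).
from gensim.models import Word2Vec

sentences = [
    ["deep", "learning", "uses", "word", "embeddings"],
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50, seed=0)

print(model.wv["words"].shape)                 # (50,) dense vector for "words"
print(model.wv.most_similar("words", topn=3))  # nearest neighbours by cosine similarity
```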
Sequence-to-Sequence Models
- Understanding Encoder-Decoder Sequence to Sequence Model (2019)
- Sequence To Sequence Models (2018)
- Sequence to sequence model: Introduction and concepts (2017)
- NLP | Sequence to Sequence Networks | Part 1 | Processing text data
- NLP | Sequence to Sequence Networks | Part 2 | Seq2seq Model (Encoder-Decoder Model)
- Sequence Modeling with Deep Learning
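To make the encoder-decoder idea concrete, here is a bare-bones seq2seq sketch in PyTorch with a GRU encoder and decoder and teacher forcing; the vocabulary and hidden sizes are arbitrary assumptions rather than values from the linked posts.

```python
# Bare-bones encoder-decoder (seq2seq) sketch in PyTorch with teacher forcing.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size=8000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        _, hidden = self.rnn(self.embedding(src))
        return hidden                       # final hidden state summarises the source

class Decoder(nn.Module):
    def __init__(self, vocab_size=8000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tgt, hidden):
        output, hidden = self.rnn(self.embedding(tgt), hidden)
        return self.out(output), hidden     # per-step vocabulary logits

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, 8000, (4, 12))       # source batch
tgt = torch.randint(0, 8000, (4, 10))       # target batch (teacher forcing)
logits, _ = decoder(tgt, encoder(src))
print(logits.shape)                         # torch.Size([4, 10, 8000])
```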
Attention
- Brief Introduction to Attention Models
- An introduction to Attention
- Intuitive Understanding of Attention Mechanism in Deep Learning
- Attention and its Different Forms
- Attention Mechanisms in Deep Learning — Not So Special
- Coursera Video: Attention Model
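The building block these articles keep returning to is scaled dot-product attention. Below is a minimal sketch with random tensors, intended only for checking shapes and the flow of the computation.

```python
# Scaled dot-product attention sketch (random tensors, shape checking only).
import math
import torch

def scaled_dot_product_attention(query, key, value):
    # query: (batch, q_len, d_k); key, value: (batch, k_len, d_k)
    d_k = query.size(-1)
    scores = query @ key.transpose(-2, -1) / math.sqrt(d_k)  # (batch, q_len, k_len)
    weights = torch.softmax(scores, dim=-1)                  # attention distribution
    return weights @ value, weights                          # weighted sum of values

q = torch.randn(2, 5, 64)
k = torch.randn(2, 7, 64)
v = torch.randn(2, 7, 64)
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # torch.Size([2, 5, 64]) torch.Size([2, 5, 7])
```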
Transformers
- What is a Transformer?
- How Transformers Work
- Transformer: A Novel Neural Network Architecture for Language Understanding (2017)
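For experimentation, PyTorch ships a ready-made encoder stack; the sketch below wires one up with arbitrary layer sizes. It omits token embeddings and positional encodings, which a real model would need.

```python
# Quick sketch using PyTorch's built-in Transformer encoder (arbitrary sizes).
import torch
import torch.nn as nn

encoder_layer = nn.TransformerEncoderLayer(d_model=128, nhead=8, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

tokens = torch.randn(4, 16, 128)   # (batch, seq_len, d_model) embeddings
print(encoder(tokens).shape)       # torch.Size([4, 16, 128])
```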
Bidirectional Encoder Representations from Transformers (BERT)
- BERT Explained: State of the art language model for NLP
- Understanding BERT: Is it a Game Changer in NLP?
- Google BERT — Pre Training and Fine Tuning for NLP Tasks
- Building State-of-the-Art Language Models with BERT
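A common way to try BERT in practice is through the Hugging Face `transformers` library. The sketch below extracts contextual token features from `bert-base-uncased`; it assumes `pip install transformers torch` and downloads the pretrained weights on first use.

```python
# Feature extraction with pretrained BERT via Hugging Face transformers.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT builds bidirectional representations.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state: one contextual vector per (sub)word token.
print(outputs.last_hidden_state.shape)  # (1, number_of_tokens, 768)
```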
Reinforcement Learning
- Reinforcement Learning: An Introduction (Sutton & Barto, 2018): https://www.amazon.co.uk/Reinforcement-Learning-Introduction-Richard-Sutton/dp/0262039249
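To ground the book's core algorithm, here is a tabular Q-learning sketch on a made-up 5-state chain environment; the environment and hyperparameters are illustrative assumptions, not taken from the text.

```python
# Tabular Q-learning sketch on a toy 5-state chain (illustrative environment).
import random

n_states, n_actions = 5, 2
Q = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma, epsilon = 0.1, 0.99, 0.1

def step(state, action):
    # Toy dynamics: action 1 moves right, action 0 moves left; reward at the far end.
    next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward

for episode in range(500):
    state = 0
    for _ in range(20):
        # Epsilon-greedy action selection.
        if random.random() < epsilon:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: Q[state][a])
        next_state, reward = step(state, action)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print([[round(q, 2) for q in row] for row in Q])
```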
Novel Topics
- CS 330: Deep Multi-Task and Meta-Learning: https://cs330.stanford.edu/
Cognitive Computing
- Probabilistic Models of Cognition: https://probmods.org/
- NYU Computational Cognitive Modeling (Spring 2020): https://brendenlake.github.io/CCM-site/