This workshop will provide an introduction to deep learning for natural language processing (NLP). It will cover some of the common deep learning architectures, describe their advantages and limitations, and provide some hands-on experience.
We will cover the following:
1) What is deep learning?
2) Motivation: some use cases where it has produced state-of-the-art results
3) Basic building blocks of neural networks (neuron, activation function, backpropagation algorithm, gradient descent algorithm)
4) Supervised learning (multi-layer perceptron, convolutional neural networks, recurrent neural networks)
5) Introduction to word2vec
6) Introduction to recurrent neural networks
7) Text classification using RNNs
8) Impact of GPUs (some practical thoughts on hardware and software)

Broadly, there will be two hands-on modules:
1) A simple multi-layer perceptron, to understand the basics of neural networks (everything will be coded from scratch)
2) A text classification problem and a text generation problem, both solved using recurrent neural networks

The data and software requirements are posted on the GitHub repository.
The repository for this workshop: https://github.com/rouseguy/intro2deeplearning/
The slides for this workshop are available at: https://speakerdeck.com/bargava/introduction-to-deep-learning
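To give a flavor of the "coded from scratch" building blocks listed above, here is a minimal sketch of a single neuron (sigmoid activation) trained with gradient descent; the toy dataset (logical OR), learning rate, and epoch count are illustrative choices, not material from the workshop itself:

```python
import math
import random

def sigmoid(x):
    """Activation function: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Toy dataset: logical OR (inputs, target label)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # weights
b = 0.0                                        # bias
lr = 1.0                                       # learning rate

for epoch in range(1000):
    for x, y in data:
        # Forward pass: weighted sum, then activation
        z = w[0] * x[0] + w[1] * x[1] + b
        pred = sigmoid(z)
        # Backward pass: with cross-entropy loss, the gradient of the
        # loss with respect to z simplifies to (pred - y)
        grad = pred - y
        # Gradient descent update
        w[0] -= lr * grad * x[0]
        w[1] -= lr * grad * x[1]
        b -= lr * grad

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(preds)  # the neuron learns OR: [0, 1, 1, 1]
```

A multi-layer perceptron stacks many such neurons in layers, and backpropagation applies the same chain-rule gradient computation layer by layer.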