Wednesday 4:15 PM–6:00 PM in Tutorial Room

Deep Learning and Modern NLP

Zachary S. Brown

Audience level:
Intermediate

Description

In this talk, we’ll cover the fundamental building blocks of neural network architectures and how they are used to tackle problems in modern natural language processing. Topics will include an overview of language vector representations, text classification, named entity recognition, and sequence-to-sequence modeling approaches.

Abstract

In this talk, we’ll cover the fundamental building blocks of neural network architectures and how they are used to tackle problems in modern natural language processing. Topics will include an overview of language vector representations, text classification, named entity recognition, and sequence-to-sequence modeling approaches. An emphasis will be placed on the shape of these problems from the perspective of deep learning architectures, which will help develop an intuition for identifying which neural network techniques are most applicable to new problems practitioners may encounter.
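To give a flavor of the kind of building block the abstract refers to, here is a minimal sketch of a text classifier built on a learned vector representation. This example is not taken from the talk: the library choice (TensorFlow/Keras), the layer sizes, and the toy data are all assumptions made purely for illustration.

```python
# Minimal sketch (illustrative only): an embedding layer that learns a vector
# for each token, pooled into a document vector and fed to a small classifier.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

vocab_size = 10_000   # assumed vocabulary size
max_len = 100         # assumed maximum sequence length (in tokens)

model = models.Sequential([
    # Learn a dense vector representation for each token id.
    layers.Embedding(input_dim=vocab_size, output_dim=64),
    # Collapse the sequence of token vectors into a single document vector.
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    # Single output unit for a binary text-classification task.
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy data: random token ids and labels, just to show the expected shapes.
x = np.random.randint(0, vocab_size, size=(8, max_len))
y = np.random.randint(0, 2, size=(8,))
model.fit(x, y, epochs=1, verbose=0)
```

The same embedding-then-encoder shape recurs across the other topics listed above; what changes is the head on top (per-token outputs for named entity recognition, a decoder for sequence-to-sequence models).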

This talk is targeted at anyone interested in natural language processing or deep learning. I'll assume little prior experience with NLP or deep learning, and will build intuition from the ground up using a highly visual approach to describing neural networks. It is ideal for data scientists working in, or interested in, NLP or deep learning, as well as analytics or business professionals curious about what kinds of problems modern NLP techniques can solve.
