
Natural Language Processing with LSTM Recurrent Neural Networks

Description

Natural language processing (NLP) is a field focused on developing automated methods for analyzing text and for computer-driven text generation (synthesis), for example in translation and text summarization. Recurrent neural networks have recently become a state-of-the-art approach to NLP, with the long short-term memory (LSTM) network being the primary model of this type. In this session, LSTM NLP models will be introduced with as little math as possible and with an emphasis on intuition. The concept of word embeddings will be introduced in the context of implementing LSTMs, and it will be explained how such models are used in practice for the analysis and generation of natural language (e.g., language translation).
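
As a rough illustration of the ideas above (not taken from the session materials), the short sketch below shows how word embeddings feed into an LSTM in a small text-classification model, written in PyTorch. The vocabulary size, layer dimensions, and toy input are illustrative assumptions only.

# Minimal sketch: word embeddings -> LSTM -> classifier (PyTorch).
# All sizes and the toy input below are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, num_classes=2):
        super().__init__()
        # Each word index is mapped to a dense vector (the word embedding).
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # The LSTM reads the sequence of embeddings and maintains a hidden state.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # The final hidden state summarizes the text for classification.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)      # hidden: (1, batch, hidden_dim)
        return self.classifier(hidden[-1])        # (batch, num_classes)

# Toy usage: a batch of two "sentences", each five token indices long.
model = LSTMClassifier(vocab_size=1000)
tokens = torch.randint(0, 1000, (2, 5))
logits = model(tokens)
print(logits.shape)  # torch.Size([2, 2])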

Before the session, all registrants will receive an e-mail with a link and meeting information.


Details

Status: Archived
Date: Wednesday, October 7th, 2020
Time: 4:30pm - 6:00pm
Location: Virtual Classroom
Leader: Lawrence Carin
Enrolled: 42