
Attention Networks for Natural Language Processing



Description

Neural-network-based methods for natural language processing (NLP) constitute an area of significant recent technical progress, with many interesting real-world applications. The Transformer is one of the newest and most powerful approaches of this type; it is built from repeated applications of attention mechanisms arranged in an encoder-decoder framework. This presentation will describe the basics of all-attention models (the Transformer) for NLP, with applications in areas such as text synthesis (e.g., suggesting email text) and language translation.
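As a rough illustration of the attention operation that the Transformer applies repeatedly, the sketch below implements scaled dot-product attention in plain NumPy. This is not the presenter's code; the function name, shapes, and the toy self-attention usage are illustrative assumptions.

```python
# A minimal sketch of scaled dot-product attention, the building block
# repeated throughout the Transformer's encoder-decoder stack.
# Assumes NumPy only; names and shapes here are illustrative.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted mix of the values

# Toy usage: self-attention over 4 tokens with 8-dimensional representations.
x = np.random.randn(4, 8)
out = scaled_dot_product_attention(x, x, x)           # Q = K = V for self-attention
print(out.shape)  # (4, 8)
```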

Please register if you would like to join; we will send instructions about the virtual session to all registrants in advance.





Details

Status: Archived
Date: Thursday, March 26th, 2020
Time: 4:30pm - 6:30pm
Location: Virtual Classroom
Leader: Lawrence Carin
Enrolled: 114
