The Transformer Network for Natural Language Processing
Neural-network-based methods for natural language processing (NLP) are an area of significant recent technical progress, with many interesting real-world applications. The Transformer network is one of the newest and most powerful approaches of this type. It is based on the repeated application of attention layers within an encoder-decoder framework. This presentation will cover the basics of all-attention models (the Transformer) for NLP, with applications in areas such as text synthesis (e.g., suggesting email text) and language translation.
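As background for the session, the attention operation the abstract refers to can be sketched in a few lines. This is a minimal illustration of scaled dot-product attention (the standard formulation from the Transformer literature), not material from the presentation itself; the names `q`, `k`, `v` follow the usual query/key/value convention.

```python
# Minimal sketch of scaled dot-product attention, the building block
# of the Transformer. Illustrative only; names follow the standard
# query/key/value convention, not anything specific to this talk.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    return softmax(scores) @ v

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
q = rng.standard_normal((3, 8))
k = rng.standard_normal((4, 8))
v = rng.standard_normal((4, 8))
out = attention(q, k, v)
print(out.shape)  # (3, 8)
```

A full Transformer stacks many such attention layers (with multiple heads, feed-forward sublayers, and residual connections) in both the encoder and the decoder.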
Before the session, all registrants will receive an e-mail with a link and meeting information.
Date: Thursday, September 10th, 2020
Time: 4:30pm - 6:00pm