Teaching
- Spring 2020, Advanced Topics in Information Processing; Neural Machine Translation (CMSC828B).
TuTh 3:30-4:45pm, IRB 1207
Neural sequence-to-sequence models have emerged as a powerful tool for addressing many natural language processing tasks, such as translating text from one language to another, summarizing documents, generating answers to questions, or rewriting language in a different style. The main objective of the course is to gain an understanding of state-of-the-art techniques for designing, training, and using sequence-to-sequence models to generate natural language, with a focus on machine translation.
Students will be expected to (1) read, present, and discuss a range of classic and recent papers drawn from the Natural Language Processing, Machine Learning, and Linguistics literature, to understand the strengths and limitations of neural sequence-to-sequence models, and (2) put these ideas into practice in a research project.
While there are no formal prerequisites, we will assume that students have background equivalent to introductory graduate courses in natural language processing and/or machine learning. A written exam at the beginning of the semester will help students assess their degree of preparation for the course.
- Fall 2019, Introduction to Natural Language Processing (CMSC470).
- Fall 2018, Introduction to Natural Language Processing (CMSC498S).
- Spring 2018, Introduction to Machine Learning (CMSC422).
- Fall 2017, Computational Linguistics I (CMSC723 / INST735 / LING723).
- Spring 2017, Introduction to Machine Learning (CMSC422).
- Fall 2016, Computational Linguistics I (CMSC723 / INST735 / LING723).
- Spring 2016, Introduction to Machine Learning (CMSC422).
- Fall 2015, Computational Linguistics I (CMSC723 / INST735 / LING723).
- Spring 2015, Multilingual Natural Language Processing (CMSC828I / LING848).