About the Natural Language Processing with Attention Models course
In the fourth course of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a transformer model for text summarization, c) Use T5 and BERT models to perform question answering, and d) Build a chatbot using the Reformer model.
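All four projects build on the attention mechanism named in the course title. As a quick, informal orientation (this is an illustrative NumPy sketch, not code from the course, which uses TensorFlow and Trax; the function name and toy data are hypothetical), scaled dot-product attention can be written in a few lines:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention over queries Q, keys K, values V,
    each of shape (seq_len, d). Returns one output vector per query."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # attention-weighted sum of the values

# Toy self-attention example: 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Each output row is a mixture of the value vectors, weighted by how strongly that position "attends" to every other position; the encoder-decoder, transformer, and Reformer models in the course all stack variants of this operation.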
By the end of this Specialization, you will have developed NLP applications that perform question answering and sentiment analysis, built tools for language translation and text summarization, and even built a chatbot!

Students should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow or Keras), and knowledge of calculus, linear algebra, and statistics. Please ensure you have completed Course 3 - Natural Language Processing with Sequence Models - before beginning this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an instructor of artificial intelligence at Stanford University who also helped create the Deep Learning Specialization. Lukasz Kaiser is a Staff Scientist at Google Brain and a co-author of TensorFlow, of the Tensor2Tensor and Trax libraries, and of the Transformer paper.