About the Improving Deep Neural Networks: Hyperparameter Tuning, Regularization, and Optimization course
In the second course of the Deep Learning Specialization, you'll open the black box of deep learning to understand the processes that drive performance and systematically generate good results.
By the end of the course, you will understand best practices for setting up training and test sets and analyzing bias/variance to build deep learning applications; use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply various optimization algorithms such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and implications of deep learning and prepare you to participate in the development of cutting-edge AI technologies. It provides you with the knowledge and skills to apply machine learning to your work, advance your technical career, and take the plunge into the world of AI.
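For orientation, here is a minimal sketch (an illustrative example, not course material) of how several of the techniques named above come together in TensorFlow: L2 and dropout regularization, He initialization, batch normalization, the Adam optimizer, and mini-batch training. The layer sizes and hyperparameter values are arbitrary assumptions for illustration.

```python
import tensorflow as tf

# Small binary classifier combining techniques covered in the course.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64,
        activation="relu",
        kernel_initializer="he_normal",                      # weight initialization
        kernel_regularizer=tf.keras.regularizers.l2(1e-4),   # L2 regularization
        input_shape=(20,),                                    # assumed 20 input features
    ),
    tf.keras.layers.BatchNormalization(),                    # batch normalization
    tf.keras.layers.Dropout(0.5),                            # dropout regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Adam optimizer; the learning rate is a typical hyperparameter to tune.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Mini-batch gradient descent: batch_size controls the mini-batch size.
# model.fit(X_train, y_train, batch_size=64, epochs=10)
```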