Week 1 - Derivatives and Optimization - Lesson 1 - Derivatives
Course Introduction
Machine Learning Motivation
Motivation to Derivatives - Part I
Derivatives and Tangents
Slopes, maxima and minima
Approximation of Derivatives
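The idea behind this lesson — estimating a derivative from nearby function values — can be sketched in Python (the test function and step size below are illustrative choices, not from the course):

```python
# Numerical approximation of a derivative via the central difference:
# f'(x) ~ (f(x + h) - f(x - h)) / (2h).
# The function math.sin and the step h are illustrative choices.
import math

def central_difference(f, x, h=1e-5):
    """Approximate f'(x) using a symmetric step of size h."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: d/dx sin(x) = cos(x), so the slope at x = 0 should be ~1.
slope = central_difference(math.sin, 0.0)
print(slope)  # close to 1.0
```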
Derivatives and their notation
Some common derivatives - Lines
Some common derivatives - Quadratics
Some common derivatives - Higher degree polynomials
Some common derivatives - Other power functions
The inverse function and its derivative
Derivative of trigonometric functions
Meaning of the Exponential (e)
The derivative of e^x
The derivative of log(x)
Existence of the derivative
Properties of the derivative: Multiplication by scalars
Properties of the derivative: The sum rule
Properties of the derivative: The product rule
Properties of the derivative: The chain rule
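The derivative properties in this lesson can be checked numerically; a minimal sketch for the chain rule (the test point and composite function are arbitrary illustrations):

```python
# Numerical check of the chain rule: for f(x) = sin(x^2),
# the chain rule gives f'(x) = 2x * cos(x^2).
# The point x0 below is an arbitrary choice for illustration.
import math

def numeric_derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 0.7
numeric = numeric_derivative(lambda x: math.sin(x ** 2), x0)
exact = 2 * x0 * math.cos(x0 ** 2)
print(numeric, exact)  # the two values agree to several decimal places
```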
Week 1 - Derivatives and Optimization - Ungraded Lab
(Optional) Downloading your Notebook and Refreshing your Workspace
Week 1 - Derivatives and Optimization - Lesson 2 - Optimization
Introduction to optimization
Optimization of squared loss - The one powerline problem
Optimization of squared loss - The two powerline problem
Optimization of squared loss - The three powerline problem
Optimization of log-loss - Part 1
Optimization of log-loss - Part 2
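The one-powerline flavor of squared-loss optimization can be sketched as follows; setting the derivative of the loss to zero yields the mean (the positions below are made up for illustration):

```python
# The "one powerline" squared-loss problem: place one line at position x
# to minimize sum_i (x - a_i)^2 over positions a_i.
# The derivative is 2 * sum_i (x - a_i); setting it to zero gives
# x = mean(a). The positions below are illustrative, not from the course.
positions = [1.0, 2.0, 5.0]

def squared_loss(x, pts):
    return sum((x - a) ** 2 for a in pts)

best_x = sum(positions) / len(positions)  # the mean minimizes squared loss
print(best_x, squared_loss(best_x, positions))
```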
Week 1 - Conclusion
Week 1 - Derivatives and Optimization - Programming Assignment: Optimizing Functions of One Variable: Cost Minimization
(Optional) Assignment Troubleshooting Tips
(Optional) Partial Grading for Assignments
Week 1 - Derivatives and Optimization - Lecture Notes
Week 1 - Slides
Week 2 - Gradients and Gradient Descent - Lesson 1 - Gradients
Introduction to Tangent planes
Partial derivatives - Part 1
Partial derivatives - Part 2
Gradients
Gradients and maxima/minima
Optimization with gradients: An example
Optimization using gradients - Analytical method
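The analytical method covered here — set every partial derivative to zero and solve — can be sketched on a small example (the function below is an illustrative choice, not the course's example):

```python
# Analytical optimization with gradients: for
# f(x, y) = (x - 1)^2 + (y + 2)^2 the gradient is
# (2(x - 1), 2(y + 2)); setting both partials to zero gives (1, -2).
# The function is an illustrative choice.
def f(x, y):
    return (x - 1) ** 2 + (y + 2) ** 2

def grad_f(x, y):
    return (2 * (x - 1), 2 * (y + 2))

x_star, y_star = 1.0, -2.0  # solves grad_f(x, y) = (0, 0)
print(grad_f(x_star, y_star))  # (0.0, 0.0)
```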
Week 2 - Gradients and Gradient Descent - Lesson 2 - Gradient Descent
Optimization using Gradient Descent in one variable - Part 1
Optimization using Gradient Descent in one variable - Part 2
Optimization using Gradient Descent in one variable - Part 3
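The one-variable update rule from these videos, x ← x − α·f′(x), can be sketched as (the function, learning rate, and iteration count are illustrative choices):

```python
# Gradient descent in one variable for f(x) = x^2 - 4x + 5,
# which has its minimum at x = 2. The learning rate and
# iteration count are illustrative choices.
def df(x):
    return 2 * x - 4  # derivative of x^2 - 4x + 5

x = 0.0    # starting point
lr = 0.1   # learning rate
for _ in range(100):
    x = x - lr * df(x)  # step downhill along the derivative
print(x)  # approaches 2.0
```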
Optimization using Gradient Descent in two variables - Part 1
Optimization using Gradient Descent in two variables - Part 2
Optimization using Gradient Descent - Least squares
Optimization using Gradient Descent - Least squares with multiple observations
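The least-squares case with multiple observations can be sketched as fitting a line by gradient descent on the mean squared error (the data, learning rate, and iteration count below are made-up illustrations):

```python
# Gradient descent for least squares with multiple observations:
# fit y = m*x + b by minimizing the mean squared error.
# Data, learning rate, and iteration count are illustrative.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # generated by y = 2x + 1

m, b = 0.0, 0.0
lr = 0.05
n = len(xs)
for _ in range(5000):
    # Partial derivatives of the mean squared error w.r.t. m and b
    dm = (2 / n) * sum((m * x + b - y) * x for x, y in zip(xs, ys))
    db = (2 / n) * sum((m * x + b - y) for x, y in zip(xs, ys))
    m -= lr * dm
    b -= lr * db
print(m, b)  # approaches m = 2, b = 1
```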
Week 2 - Conclusion
Week 2 - Gradients and Gradient Descent - Lecture Notes
Week 2 - Slides
Week 3 - Optimization in Neural Networks and Newton's Method - Lesson 1 - Optimization in Neural Networks
Regression with a perceptron
Regression with a perceptron - Loss function
Regression with a perceptron - Gradient Descent
Classification with a perceptron
Classification with a perceptron - The sigmoid function
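The sigmoid function from this video, together with its convenient derivative σ′(z) = σ(z)(1 − σ(z)), can be sketched as (the sample point is arbitrary):

```python
# The sigmoid function used for classification with a perceptron,
# and its derivative sigma'(z) = sigma(z) * (1 - sigma(z)).
# Standard definitions; the sample point is arbitrary.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_derivative(z):
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25
```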
Classification with a perceptron - Gradient Descent
Classification with a perceptron - Calculating the derivatives
Classification with a Neural Network
Classification with a Neural Network - Minimizing log-loss
Gradient Descent and Backpropagation
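The smallest instance of the gradient-descent-plus-backpropagation idea is a single sigmoid neuron trained on log-loss; a sketch (the toy data, learning rate, and iteration count are illustrative, not from the course):

```python
# A single sigmoid neuron trained with gradient descent on log-loss.
# For a sigmoid output with log-loss, d(loss)/dz simplifies to (p - y),
# which is the key derivative computed in this lesson.
# Toy data, learning rate, and iteration count are illustrative.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One feature: points below 0 labeled 0, points above 0 labeled 1.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    dw = db = 0.0
    for x, y in data:
        p = sigmoid(w * x + b)
        dw += (p - y) * x   # chain rule: d(loss)/dw = (p - y) * x
        db += (p - y)       # d(loss)/db = (p - y)
    w -= lr * dw / len(data)
    b -= lr * db / len(data)

print(sigmoid(w * 2.0 + b))   # close to 1 for a point labeled 1
print(sigmoid(w * -2.0 + b))  # close to 0 for a point labeled 0
```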
Week 3 - Optimization in Neural Networks and Newton's Method - Lesson 2 - Newton's Method
Newton's Method
Newton's Method: An example
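Newton's method for minimizing a one-variable function uses the update x ← x − f′(x)/f″(x); a sketch on a quadratic, where a single step lands exactly on the minimum (the function and starting point are illustrative):

```python
# Newton's method for minimization: x_{k+1} = x_k - f'(x_k) / f''(x_k).
# For f(x) = x^2 - 4x + 5, f'(x) = 2x - 4 and f''(x) = 2, so one step
# reaches the minimum x = 2 from any start. Illustrative example.
def df(x):
    return 2 * x - 4

def ddf(x):
    return 2.0

x = 10.0  # arbitrary starting point
for _ in range(5):
    x = x - df(x) / ddf(x)
print(x)  # 2.0
```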
The second derivative
The Hessian
Hessians and concavity
Newton's Method for two variables
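In two variables, Newton's method replaces the second derivative with the Hessian: solve H·step = ∇f and subtract the step. A sketch on a quadratic, using Cramer's rule for the 2×2 solve (the function and starting point are illustrative):

```python
# Newton's method in two variables: update (x, y) by solving
# H * step = grad and subtracting the step.
# Example f(x, y) = x^2 + 2*y^2 + x*y has its minimum at the origin,
# so one Newton step suffices. Illustrative function and start point.
def grad(x, y):
    return (2 * x + y, 4 * y + x)

def hessian(x, y):
    return ((2.0, 1.0), (1.0, 4.0))  # constant for this quadratic

x, y = 3.0, -2.0
for _ in range(3):
    gx, gy = grad(x, y)
    (h11, h12), (h21, h22) = hessian(x, y)
    det = h11 * h22 - h12 * h21
    # Solve the 2x2 system H * (sx, sy) = (gx, gy) by Cramer's rule
    sx = (h22 * gx - h12 * gy) / det
    sy = (h11 * gy - h21 * gx) / det
    x, y = x - sx, y - sy
print(x, y)  # (0.0, 0.0)
```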
Week 3 - Conclusion
[IMPORTANT] Reminder about end of access to Lab Notebooks
Week 3 - Optimization in Neural Networks and Newton's Method - Acknowledgments & Course Resources
Acknowledgments
(Optional) Opportunity to Mentor Other Learners
Week 3 - Optimization in Neural Networks and Newton's Method - Lecture Notes
Week 3 - Slides