Gradient Descent for Logistic Regression
Implementation of Gradient Descent for optimizing the Logistic Regression cost function
- For the sake of simplicity, assume that there are only two features, $x_1$ and $x_2$; the derivation below is for a single training example and is then extended by looping over the $m$ training examples. Vectorized notation takes care of multiple features and training examples without explicit loops.
- Also, to make the notation simple, the derivative of the cost function with respect to a variable $x$ will be written as $dx$, i.e. $dx$ is shorthand for $\frac{\partial J}{\partial x}$. The single-example quantities being differentiated are written out below.
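For reference, these are the standard single-example logistic regression definitions that the derivation relies on (reconstructed here, since the formulas did not survive in the original notes):

$$
\begin{aligned}
z &= w_1 x_1 + w_2 x_2 + b \\
a &= \hat{y} = \sigma(z) = \frac{1}{1 + e^{-z}} \\
\mathcal{L}(a, y) &= -\bigl(y \log a + (1 - y) \log(1 - a)\bigr)
\end{aligned}
$$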
Logistic Regression: Derivative calculation with two features
Objective: Calculate the derivatives of the loss function w.r.t. $w_1$, $w_2$ and $b$.
Backpropagating Step By Step:
- Calculate $da = \dfrac{\partial \mathcal{L}}{\partial a} = -\dfrac{y}{a} + \dfrac{1 - y}{1 - a}$
- Calculate $dz = \dfrac{\partial \mathcal{L}}{\partial z} = da \cdot \dfrac{\partial a}{\partial z} = a - y$, using $\dfrac{\partial a}{\partial z} = a(1 - a)$
- Calculate $dw_1 = x_1 \, dz$, $dw_2 = x_2 \, dz$ and $db = dz$
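The same steps in a minimal Python sketch (a hand-written illustration, not code from the original notes; `sigmoid` and the argument names are assumptions):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def single_example_gradients(w1, w2, b, x1, x2, y):
    """Forward pass and backprop for one example with two features."""
    # Forward: linear combination, activation, cross-entropy loss
    z = w1 * x1 + w2 * x2 + b
    a = sigmoid(z)
    loss = -(y * math.log(a) + (1 - y) * math.log(1 - a))

    # Backward: da -> dz -> (dw1, dw2, db)
    da = -y / a + (1 - y) / (1 - a)        # dL/da
    dz = da * a * (1 - a)                  # chain rule; simplifies to a - y
    dw1, dw2, db = x1 * dz, x2 * dz, dz    # dL/dw1, dL/dw2, dL/db
    return loss, dw1, dw2, db
```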
Looping over m examples: Pseudocode
    J = 0; dw1 = 0; dw2 = 0; db = 0
    for i = 1 to m
        z(i) = w1*x1(i) + w2*x2(i) + b
        a(i) = sigmoid(z(i))
        J += -( y(i)*log(a(i)) + (1 - y(i))*log(1 - a(i)) )
        dz(i) = a(i) - y(i)
        dw1 += x1(i)*dz(i)
        dw2 += x2(i)*dz(i)
        db += dz(i)
    J /= m; dw1 /= m; dw2 /= m; db /= m
    w1 = w1 - alpha*dw1
    w2 = w2 - alpha*dw2
    b = b - alpha*db

where $\alpha$ is the learning rate. Each such pass computes the cost $J$ and the averaged gradients over all $m$ training examples, then takes one gradient descent step; the pass is repeated until the parameters converge.
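As a runnable counterpart, here is a small NumPy sketch of the same loop (the dataset, function names and hyperparameters are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(w, b, X, y, alpha):
    """One pass over the m examples: accumulate gradients, then update once.

    X has shape (m, 2) in the two-feature case; w has shape (2,).
    """
    m = X.shape[0]
    J, dw, db = 0.0, np.zeros_like(w), 0.0
    for i in range(m):  # explicit loop, mirroring the pseudocode above
        z = np.dot(w, X[i]) + b
        a = sigmoid(z)
        J += -(y[i] * np.log(a) + (1 - y[i]) * np.log(1 - a))
        dz = a - y[i]            # dL/dz for example i
        dw += X[i] * dz          # accumulates dw1 and dw2 at once
        db += dz
    J, dw, db = J / m, dw / m, db / m   # average over the m examples
    return w - alpha * dw, b - alpha * db, J

# Illustrative usage on a tiny made-up dataset
X = np.array([[0.5, 1.2], [1.0, -0.7], [-1.5, 0.3], [2.0, 2.0]])
y = np.array([1.0, 0.0, 0.0, 1.0])
w, b = np.zeros(2), 0.0
for _ in range(1000):
    w, b, J = gradient_descent_step(w, b, X, y, alpha=0.1)
```

Replacing the inner loop with vectorized operations (`Z = X @ w + b`, `A = sigmoid(Z)`, `dw = X.T @ (A - y) / m`) removes the explicit for-loop entirely, which is what the note above means by vectorized notation handling multiple features and training examples.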
Written on November 29, 2017