Gradient Descent for Logistic Regression

An implementation walkthrough of gradient descent for optimizing the logistic regression cost function.

Assumptions:

  1. For the sake of simplicity, assume that there are only two features (the algorithm generalizes over training examples). Vectorized notation will take care of multiple features and training examples.
  2. Also, to keep the notation simple, the derivative of the cost function with respect to a variable $x$ will be written as $dx$; for example, $dw_1$ denotes $\partial \mathcal{L} / \partial w_1$.

Logistic Regression: Derivative calculation with two features

Input: $x_1, x_2$. Parameters: $w_1, w_2, b$.

Computation graph (forward pass):

$$z = w_1 x_1 + w_2 x_2 + b \;\longrightarrow\; a = \sigma(z) \;\longrightarrow\; \mathcal{L}(a, y)$$

where $\sigma(z) = \frac{1}{1 + e^{-z}}$ is the sigmoid activation and $\mathcal{L}(a, y) = -\big(y \log a + (1-y) \log(1-a)\big)$ is the loss for a single training example.


Objective: Calculate the derivatives of the loss function w.r.t. $w_1$, $w_2$ & $b$.

Backpropagating Step By Step:

  • Calculate $da$, or $\frac{\partial \mathcal{L}}{\partial a} = -\frac{y}{a} + \frac{1-y}{1-a}$
  • Calculate $dz = da \cdot \frac{\partial a}{\partial z} = a - y$ (derived below)
  • Calculate $dw_1 = x_1 \, dz$, $dw_2 = x_2 \, dz$, and $db = dz$
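The second bullet compresses a chain-rule step; written out in full (a worked derivation using the sigmoid and loss defined above):

$$
\begin{aligned}
da &= \frac{\partial \mathcal{L}}{\partial a} = -\frac{y}{a} + \frac{1-y}{1-a},
\qquad \frac{\partial a}{\partial z} = \sigma(z)\big(1 - \sigma(z)\big) = a(1-a) \\
dz &= da \cdot \frac{\partial a}{\partial z}
   = \left(-\frac{y}{a} + \frac{1-y}{1-a}\right) a(1-a)
   = -y(1-a) + (1-y)\,a = a - y \\
dw_1 &= dz \cdot \frac{\partial z}{\partial w_1} = x_1 \, dz,
\qquad db = dz \cdot \frac{\partial z}{\partial b} = dz
\end{aligned}
$$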
Looping over m examples: Pseudocode

Initialize: $J = 0$, $dw_1 = 0$, $dw_2 = 0$, $db = 0$

    for i = 1 to m:
        z(i) = w1*x1(i) + w2*x2(i) + b
        a(i) = sigma(z(i))
        J   += -( y(i)*log(a(i)) + (1 - y(i))*log(1 - a(i)) )
        dz(i) = a(i) - y(i)
        dw1 += x1(i)*dz(i)
        dw2 += x2(i)*dz(i)
        db  += dz(i)

    J   /= m
    dw1 /= m
    dw2 /= m
    db  /= m

Gradient descent update (one step):

    w1 = w1 - alpha*dw1
    w2 = w2 - alpha*dw2
    b  = b  - alpha*db

Here $\alpha$ (alpha) is the learning rate.
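The loop above touches each example and each feature explicitly; as noted in the assumptions, vectorization removes both loops. Below is a minimal NumPy sketch of one gradient descent step, assuming an input matrix `X` of shape `(n, m)` with one example per column and labels `Y` of shape `(1, m)` (the names and shapes are illustrative, not from the original notes):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(w, b, X, Y, alpha):
    """One vectorized gradient descent step for logistic regression.

    w: (n, 1) weights, b: scalar bias,
    X: (n, m) inputs (one example per column), Y: (1, m) labels.
    """
    m = X.shape[1]
    # Forward pass: activations for all m examples at once
    Z = np.dot(w.T, X) + b          # (1, m)
    A = sigmoid(Z)                  # (1, m)
    # Average cross-entropy cost over the m examples
    J = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    # Backward pass: dZ = A - Y, then gradients averaged over m
    dZ = A - Y                      # (1, m)
    dw = np.dot(X, dZ.T) / m        # (n, 1)
    db = np.sum(dZ) / m             # scalar
    # Parameter update
    w = w - alpha * dw
    b = b - alpha * db
    return w, b, J
```

Calling this function repeatedly for a fixed number of iterations (or until `J` stops decreasing) completes the optimization.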

Written on November 29, 2017