Chapter 2: Logistic Regression
Building a Logistic Regression model from scratch is straightforward because it shares almost all of its logic with the Linear Regression model that we implemented in the previous Learning Unit. The salient difference lies in the predict method, which passes the linear combination of inputs and weights through the sigmoid function.
import numpy as np

from supervised.linear_regression import LinearRegression
from utils.preprocessing import add_dummy_feature
from utils.activation import Sigmoid


class LogisticRegression(LinearRegression):

    def predict(self, X, weights=None):
        # Prepend a column of ones so the intercept is learned
        # as just another weight.
        if self.fit_intercept:
            X = add_dummy_feature(X)
        if weights is None:
            weights = self.coef_
        # Squash the linear output into (0, 1) with the sigmoid.
        sigmoid = Sigmoid()
        return sigmoid(X.dot(weights))
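To make the computation inside predict concrete, here is a minimal self-contained sketch using plain NumPy. The sigmoid helper and the toy X and weights below are illustrative assumptions, standing in for the book's Sigmoid class and a fitted model; the intercept column of ones mirrors what add_dummy_feature would prepend.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Toy design matrix with an intercept column of ones prepended
# (hypothetical data, mimicking add_dummy_feature's output).
X = np.array([[1.0, 2.0],
              [1.0, -1.0]])
weights = np.array([0.5, 1.0])  # hypothetical fitted coefficients

# The same two steps predict performs: linear combination, then sigmoid.
probabilities = sigmoid(X.dot(weights))
```

Every entry of `probabilities` is strictly between 0 and 1, which is what lets us interpret the output as a class probability.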
Now, the model will always output a class probability between 0 and 1. This means that in order to predict the actual class, all that is needed is to threshold this output at 0.5, which is equivalent to rounding it.
    def predict_classes(self, X, weights=None):
        return self.predict(X, weights) > 0.5
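To see the thresholding in action, here is a short sketch on some hypothetical probabilities, standing in for what predict would return:

```python
import numpy as np

# Hypothetical probabilities, as predict might return them.
probabilities = np.array([0.2, 0.7, 0.91])

# Thresholding at 0.5 produces a boolean array of class decisions.
classes = probabilities > 0.5

# Cast to int if 0/1 labels are preferred over True/False.
labels = classes.astype(int)
```

Note that the comparison returns a boolean array, so predict_classes yields True/False rather than 1/0; casting with astype(int) converts between the two when needed.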
And this is it!