Binary Logistic Regression
Building binary classification models with neural networks
Introduction
Binary logistic regression is a fundamental algorithm for binary classification tasks, where the output takes one of two values: 0 or 1, true or false, yes or no.
It is widely used in:
✅ Email spam detection (spam or not spam).
✅ Medical diagnosis (disease or no disease).
✅ Predicting customer churn (will leave or stay).
1️⃣ What is Binary Logistic Regression?
While linear regression predicts continuous values, logistic regression predicts probabilities between 0 and 1, which are then converted to class labels (0 or 1).
Equation:
[ p = \frac{1}{1 + e^{-(wx + b)}} ]
where:
- ( p ) = predicted probability of the positive class (1).
- ( w ) = weight (slope).
- ( x ) = input feature.
- ( b ) = bias (intercept).
- ( e ) = Euler's number (~2.718).
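As a quick worked example, plugging hypothetical values into this equation (the numbers below are purely illustrative, not taken from a trained model) gives:

```python
import numpy as np

# Hypothetical parameters, chosen only to illustrate the equation
w, b = 1.2, -3.0
x = 3.0

# p = 1 / (1 + e^{-(wx + b)})
p = 1 / (1 + np.exp(-(w * x + b)))
print(p)  # ~0.646, i.e. roughly a 65% probability of class 1
```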
2️⃣ The Sigmoid Function
The sigmoid function is used to map the output of the linear combination ( wx + b ) to a probability between 0 and 1.
[ \sigma(z) = \frac{1}{1 + e^{-z}} ]
If:
✅ ( p \geq 0.5 ), the output class = 1.
✅ ( p < 0.5 ), the output class = 0.
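Here is a minimal sketch of how the sigmoid and the 0.5 threshold work together; the z values are arbitrary linear outputs chosen only for illustration:

```python
import numpy as np

def sigmoid(z):
    """Map any real-valued z to a probability in (0, 1)."""
    return 1 / (1 + np.exp(-z))

# Arbitrary linear outputs z = wx + b, for illustration only
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
p = sigmoid(z)

# Apply the 0.5 decision threshold to get class labels
labels = (p >= 0.5).astype(int)

print("Probabilities:", p)           # [0.119 0.378 0.5   0.622 0.881]
print("Predicted classes:", labels)  # [0 0 1 1 1]
```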
3️⃣ Loss Function for Logistic Regression
Binary Cross-Entropy Loss is used:
[ L = -[y \log(p) + (1 - y) \log(1 - p)] ]
where:
- ( y ) = actual label (0 or 1).
- ( p ) = predicted probability.
The optimizer updates ( w ) and ( b ) to minimize this loss using gradient descent.
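To make this concrete, here is a minimal sketch of the binary cross-entropy loss together with a single gradient descent update; the data, initial parameters, and learning rate are illustrative assumptions, not part of the scikit-learn example that follows:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def bce_loss(y, p, eps=1e-12):
    # Average binary cross-entropy; eps avoids log(0)
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Illustrative 1-D data and starting parameters (assumed values)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])
w, b, lr = 0.0, 0.0, 0.1

# One gradient descent step: for sigmoid + BCE, dL/dz = p - y
p = sigmoid(w * x + b)
w -= lr * np.mean((p - y) * x)  # dL/dw
b -= lr * np.mean(p - y)        # dL/db

print("Loss after one step:", bce_loss(y, sigmoid(w * x + b)))
```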
4️⃣ Example in Python
Using scikit-learn for a simple binary classification:
```python
from sklearn.linear_model import LogisticRegression
import numpy as np

# Sample data
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([0, 0, 0, 1, 1, 1])  # Labels: 0 for low, 1 for high

# Create and train the model
model = LogisticRegression()
model.fit(X, y)

# Predict probabilities
print("Predicted probabilities:", model.predict_proba(X))

# Predict classes
print("Predicted classes:", model.predict(X))

# View learned weight and bias
print("Weight:", model.coef_)
print("Bias:", model.intercept_)
```
5️⃣ Why Binary Logistic Regression is Important
✅ Helps in classification tasks with clear decision boundaries.
✅ Introduces concepts of probabilities and thresholds.
✅ Forms a foundation for understanding classification in neural networks.
Conclusion
Binary logistic regression is a powerful yet simple tool for binary classification and is essential for building your deep learning foundation.
What's Next?
✅ Experiment with logistic regression on datasets like Iris (for binary subsets) or custom binary tasks.
✅ Learn how logistic regression extends to multiclass classification (softmax regression).
✅ Continue with classification using neural networks.
Join the SuperML Community to practice and share your projects while learning classification.
Happy Learning!