Comparison of Loss Functions for Linear Regression
In this section, we compare different loss functions commonly used in regression tasks: Mean Squared Error (MSE), Mean Absolute Error (MAE), and Huber Loss.
- First, the code calculates the MSE and MAE using the mean_squared_error and mean_absolute_error functions from the sklearn.metrics module.
- Then, it defines a custom function huber_loss to compute the Huber Loss, which is a combination of MSE and MAE, offering a balance between robustness to outliers and smoothness.
- Next, it calculates the Huber Loss with a specified delta value (delta=1.0) using the implemented huber_loss function.
- Finally, it plots the values of these loss functions for visualization using matplotlib, with labels indicating the type of loss function.
The plot provides a visual comparison of the loss values for the different functions, allowing you to observe their behavior and relative magnitudes.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error, mean_absolute_error
# Sample target and predicted values
y_true = np.array([3, 7, 4, 1, 8, 5])
y_pred = np.array([4, 6, 5, 3, 7, 6])
# Calculate MSE and MAE
mse = mean_squared_error(y_true, y_pred)
mae = mean_absolute_error(y_true, y_pred)
# Huber Loss implementation
def huber_loss(y_true, y_pred, delta=1.0):
    # Quadratic for small errors, linear beyond delta
    error = np.abs(y_true - y_pred)
    loss = np.where(error <= delta, 0.5 * error**2, delta * error - 0.5 * delta**2)
    return np.mean(loss)
huber_delta1 = huber_loss(y_true, y_pred, delta=1.0)
# Plot the loss functions
losses = [mse, mae, huber_delta1]
labels = ['MSE', 'MAE', 'Huber Loss (delta=1)']
# Providing x-values explicitly for plotting
x = np.arange(len(losses))
plt.figure(figsize=(10, 6))
plt.bar(x, losses, tick_label=labels)
plt.xlabel('Loss Function')
plt.ylabel('Loss Value')
plt.title('Comparison of Loss Functions')
plt.show()
Output: a bar chart comparing the loss values for MSE, MAE, and Huber Loss (delta=1).
In linear regression, the choice of loss function depends on the specific problem and the properties of the data. MSE is often used when errors are approximately normally distributed and outliers are not a significant concern. When robustness to outliers is crucial, MAE is the recommended choice, and Huber Loss provides robustness to outliers without sacrificing differentiability near zero.
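To illustrate this sensitivity, the sketch below (sample values are made up for illustration) recomputes the three losses after replacing one target with a large outlier; the squared term makes MSE balloon while MAE and Huber grow only linearly:

```python
import numpy as np

def huber(y_true, y_pred, delta=1.0):
    # Quadratic inside the delta band, linear outside it
    error = np.abs(y_true - y_pred)
    return np.mean(np.where(error <= delta,
                            0.5 * error**2,
                            delta * error - 0.5 * delta**2))

# Same data as above, except the last target is now a large outlier (50)
y_true = np.array([3.0, 7.0, 4.0, 1.0, 8.0, 50.0])
y_pred = np.array([4.0, 6.0, 5.0, 3.0, 7.0, 6.0])

mse = np.mean((y_true - y_pred) ** 2)   # dominated by the squared outlier term
mae = np.mean(np.abs(y_true - y_pred))  # grows only linearly with the outlier
hub = huber(y_true, y_pred, delta=1.0)  # linear beyond delta, like MAE

print(f"MSE:   {mse:.2f}")
print(f"MAE:   {mae:.2f}")
print(f"Huber: {hub:.2f}")
```

Here the single error of 44 contributes 44² = 1936 to the MSE sum but only 44 to the MAE sum, which is why MSE jumps by two orders of magnitude while MAE and Huber stay in single digits.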
Loss Function for Linear Regression in Machine Learning
The loss function quantifies the disparity between the predicted value and the actual value. In linear regression, the aim is to fit a linear equation to the observed data; the loss function evaluates the difference between the predicted and true values. By minimizing this difference, the model finds the best-fitting line that captures the relationship between the input features and the target variable.
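As a minimal sketch of this idea (the data values are invented for illustration), fitting a line that minimizes squared error can be done with np.polyfit, which solves the least-squares problem, i.e. minimizes the sum of squared residuals, equivalent to minimizing MSE up to a constant factor:

```python
import numpy as np

# Toy data roughly following y = 2x + 1 with small noise (illustrative values)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Fit a degree-1 polynomial (a line) by least squares
w, b = np.polyfit(x, y, deg=1)

# MSE of the fitted line on the training points
y_pred = w * x + b
mse = np.mean((y - y_pred) ** 2)
print(f"slope={w:.3f}, intercept={b:.3f}, MSE={mse:.4f}")
```

The recovered slope and intercept land close to the generating values (2 and 1), and the remaining MSE reflects only the injected noise.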
In this article, we will discuss Mean Squared Error (MSE), Mean Absolute Error (MAE), and Huber Loss.
Table of Contents
- Mean Squared Error (MSE)
- Computing Mean Squared Error in Python
- Computing Mean Squared Error using Sklearn Library
- Mean Absolute Error (MAE)
- Computing Mean Absolute Error in Python
- Computing Mean Absolute Error using Sklearn
- Huber Loss
- Computing Huber Loss in Python
- Comparison of Loss Functions for Linear Regression
- FAQs on Loss Functions for Linear Regression