I implemented, for training purposes, a linear regression in Python. The problem is that the cost is increasing instead of decreasing. For the data I use the Airfoil Self-Noise Data Set. The data can be found here
I import the data as follows:
import pandas as pd

def features():
    # columns 0-4 are the five inputs, column 5 is the target (scaled sound pressure level)
    features = pd.read_csv("data/airfoil_self_noise/airfoil_self_noise.dat.txt", sep="\t", header=None)
    X = features.iloc[:, 0:5]
    Y = features.iloc[:, 5]
    return X.values, Y.values.reshape(Y.shape[0], 1)
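For context on the inputs, it can help to look at the ranges of the raw columns; in this dataset they differ by several orders of magnitude (the first column is a frequency in Hz, the last a displacement thickness in metres). A minimal sketch, reusing the features() function above:

import numpy as np

X, Y = features()
# print the per-column min and max to see how different the feature scales are
print("min:", X.min(axis=0))
print("max:", X.max(axis=0))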
My code for the linear regression is the following:
import numpy as np
import random

class linearRegression():

    def __init__(self, learning_rate=0.01, max_iter=20):
        """
        Initialize the hyperparameters of the linear regression.

        :param learning_rate: the learning rate
        :param max_iter: the max number of iterations to perform
        """
        self.lr = learning_rate
        self.max_iter = max_iter
        self.m = None
        self.weights = None
        self.bias = None

    def fit(self, X, Y):
        """
        Run the gradient descent algorithm.

        :param X: the inputs
        :param Y: the outputs
        """
        self.m = X.shape[0]
        self.weights = np.random.normal(0, 0.1, (X.shape[1], 1))
        self.bias = random.normalvariate(0, 0.1)

        for iter in range(0, self.max_iter):
            A = self.__forward(X)
            dw, db = self.__backward(A, X, Y)
            # mean squared error cost: J = (1 / 2m) * sum((A - Y)^2)
            J = (1 / (2 * self.m)) * np.sum(np.power((A - Y), 2))
            print("at iteration %s cost is %s" % (iter, J))
            self.weights = self.weights - self.lr * dw
            self.bias = self.bias - self.lr * db

    def predict(self, X):
        """
        Make predictions on the inputs.

        :param X: the inputs
        :return: the predicted outputs
        """
        Y_pred = self.__forward(X)
        return Y_pred

    def __forward(self, X):
        """
        Compute the linear function on the inputs.

        :param X: the inputs
        :return:
            A: the activation
        """
        A = np.dot(X, self.weights) + self.bias
        return A

    def __backward(self, A, X, Y):
        """
        Compute the gradients of the cost with respect to the parameters.

        :param A: the activation
        :param X: the inputs
        :param Y: the outputs
        :return:
            dw: the gradient for the weights
            db: the gradient for the bias
        """
        dw = (1 / self.m) * np.dot(X.T, (A - Y))
        db = (1 / self.m) * np.sum(A - Y)
        return dw, db
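To rule out a bug in __backward, a finite-difference check can compare the analytic gradient against a numerical estimate of the cost. A minimal sketch (the toy data and epsilon here are arbitrary choices of mine; note the name-mangled access to the private method):

import numpy as np

def cost(model, X, Y):
    # same cost as in fit(): J = (1 / 2m) * sum((A - Y)^2)
    A = model.predict(X)
    return (1 / (2 * X.shape[0])) * np.sum((A - Y) ** 2)

rng = np.random.RandomState(0)
X_check = rng.rand(10, 5)
Y_check = rng.rand(10, 1)

model = linearRegression(max_iter=0)
model.fit(X_check, Y_check)  # max_iter=0 only initializes m, weights and bias

A = model.predict(X_check)
dw, _ = model._linearRegression__backward(A, X_check, Y_check)

eps = 1e-6
for i in range(model.weights.shape[0]):
    # perturb one weight in both directions and take the centered difference
    model.weights[i, 0] += eps
    J_plus = cost(model, X_check, Y_check)
    model.weights[i, 0] -= 2 * eps
    J_minus = cost(model, X_check, Y_check)
    model.weights[i, 0] += eps
    numeric = (J_plus - J_minus) / (2 * eps)
    print("analytic %.6f vs numeric %.6f" % (dw[i, 0], numeric))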
Then I instantiate the linearRegression class as follows:
from sklearn.model_selection import train_test_split

X, Y = features()
model = linearRegression()
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.33, random_state=42)
model.fit(X_train, y_train)
I tried to figure out why the cost is increasing, but so far I have not been able to. If someone could point me in the right direction it would be appreciated.
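One thing I noticed while debugging: the columns are on very different scales, so maybe the fixed learning rate of 0.01 is too large for the raw data and gradient descent overshoots? A sketch of what I mean by rescaling first (assuming sklearn's StandardScaler is an acceptable dependency):

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, Y = features()
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.33, random_state=42)

# fit the scaler on the training split only, then apply it to both splits
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = linearRegression()
model.fit(X_train, y_train)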
If in features() you change to X = features.iloc[:, 1:2] (instead of using the first four columns), your cost starts decreasing. Even when I use sklearn, I can't get a score better than .6 with the original data. Try constructing an artificial dataset that you know will play nicely with linear regression and see what kind of results you get with that. – Edwardedwardian
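Following that suggestion, a minimal sketch of such an artificial dataset (the coefficients, noise level, and hyperparameters here are arbitrary choices of mine):

import numpy as np

rng = np.random.RandomState(42)
n, d = 1000, 5

# inputs on a common scale, with a known linear relationship plus small noise
X_toy = rng.rand(n, d)
true_w = np.array([[1.0], [-2.0], [0.5], [3.0], [-1.0]])
Y_toy = np.dot(X_toy, true_w) + 0.7 + 0.01 * rng.randn(n, 1)

model = linearRegression(learning_rate=0.1, max_iter=200)
model.fit(X_toy, Y_toy)  # the cost should decrease steadily on data like this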