LinearRegression expects X to be an array with two dimensions, because internally it uses X.shape[1] to initialize an np.ones array. Converting X to an n x 1 array does the trick. So, replace:
regr.fit(x,y)
by:
regr.fit(x[:,np.newaxis],y)
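Note that np.newaxis requires NumPy to be imported as np. If you prefer, x.reshape(-1, 1) produces the same n x 1 layout; a quick sketch of the shape change (the toy array here is just for illustration):

>>> import numpy as np
>>> x = np.array([1.0, 2.0, 3.0])
>>> x.shape                     # 1-D: x.shape[1] does not exist
(3,)
>>> x[:, np.newaxis].shape      # one row per sample, one column for the feature
(3, 1)
>>> x.reshape(-1, 1).shape      # equivalent alternative
(3, 1)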
Either form fixes the problem. Demo:
>>> from sklearn import datasets
>>> from sklearn import linear_model
>>> import numpy as np
>>> clf = linear_model.LinearRegression()
>>> iris=datasets.load_iris()
>>> X=iris.data[:,3]
>>> Y=iris.target
>>> clf.fit(X,Y) # This will throw an error
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/sklearn/linear_model/base.py", line 363, in fit
X, y, self.fit_intercept, self.normalize, self.copy_X)
File "/usr/lib/python2.7/dist-packages/sklearn/linear_model/base.py", line 103, in center_data
X_std = np.ones(X.shape[1])
IndexError: tuple index out of range
>>> clf.fit(X[:,np.newaxis],Y) # This will work properly
LinearRegression(copy_X=True, fit_intercept=True, normalize=False)
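If you want a quick sanity check on the fit, the fitted estimator exposes coef_, intercept_, and score(); output is omitted here since the exact numbers depend on the data:

>>> clf.coef_                       # slope for the single feature
>>> clf.intercept_                  # fitted intercept
>>> clf.score(X[:,np.newaxis], Y)   # R^2 of the fit on the same data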
To plot the regression line, use the code below:
>>> from matplotlib import pyplot as plt
>>> plt.scatter(X, Y, color='red')
<matplotlib.collections.PathCollection object at 0x7f76640e97d0>
>>> plt.plot(X, clf.predict(X[:,np.newaxis]), color='blue')
<matplotlib.lines.Line2D object at 0x7f7663f9eb90>
>>> plt.show()
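For reference, here is the same demo as a single self-contained script (a sketch; it follows the session above, and should behave the same on newer scikit-learn versions, where 1-D X raises a clearer error but the fix is identical):

from sklearn import datasets, linear_model
from matplotlib import pyplot as plt
import numpy as np

iris = datasets.load_iris()
X = iris.data[:, 3]              # single feature -> 1-D array of shape (150,)
Y = iris.target

clf = linear_model.LinearRegression()
clf.fit(X[:, np.newaxis], Y)     # reshape to (150, 1) so fit() accepts it

plt.scatter(X, Y, color='red')
plt.plot(X, clf.predict(X[:, np.newaxis]), color='blue')
plt.show()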