Fitting a polynomial using np.polyfit in 3 dimensions
I have an array of data, with dimensions (N,3) for some integer N, that specifies the trajectory of a particle in 3D space, i.e. each row entry is the (x,y,z) coordinates of the particle. This trajectory is smooth and uncomplicated and I want to be able to fit a polynomial to this data.

I can do this with just the (x,y) coordinates using np.polyfit:

import numpy as np

#Load the data
some_file = 'import_file.txt'

data = np.loadtxt(some_file)
x = data[:,0]
y = data[:,1]

#Fit a 4th order polynomial
fit = np.polyfit(x,y,4)

This gives me the coefficients of the polynomial, no problems.
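For completeness, here is a minimal sketch (with made-up sample data standing in for the loaded file) of how the returned coefficients can be evaluated with np.polyval:

```python
import numpy as np

# Made-up sample data standing in for the loaded trajectory file
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x**2 - x + 0.5

fit = np.polyfit(x, y, 4)   # coefficients, highest power first
y_hat = np.polyval(fit, x)  # evaluate the fitted polynomial at the data points

print(np.allclose(y_hat, y))  # True: the degree-4 fit reproduces the quadratic
```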

How would I extend this to my case where I want a polynomial which describes the x,y,z coordinates?

Teodorateodorico answered 27/7/2017 at 12:30

You have several options here. First, let's expand on your 2D case, fit = np.polyfit(x, y, 4). This describes the particle's y position as a function of x. That is fine as long as the particle never moves backwards in x (i.e. there is exactly one y value for each x). Since movement in space decomposes into three independent coordinates, we can fit the coordinates pairwise to get a 3D model:

fitxy = np.polyfit(x, y, 4)
fitxz = np.polyfit(x, z, 4)

Now both y and z are polynomial functions of x. As mentioned before, this has the drawback that the particle can only move monotonically in x.
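Evaluating the two fits on a grid of x values reconstructs the 3D curve. A sketch, using a hypothetical trajectory that is monotonic in x:

```python
import numpy as np

# Hypothetical trajectory, strictly increasing in x
x = np.linspace(0.0, 2.0, 40)
y = np.sin(x)
z = x**2

fitxy = np.polyfit(x, y, 4)
fitxz = np.polyfit(x, z, 4)

# Evaluate both fits on a dense grid to reconstruct the curve as (x, y, z) rows
xs = np.linspace(x.min(), x.max(), 200)
curve = np.column_stack([xs, np.polyval(fitxy, xs), np.polyval(fitxz, xs)])
print(curve.shape)  # (200, 3)
```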

[figure omitted]

A true physical particle won't behave like that. Particles usually bounce around in all three dimensions, going whichever way they please. However, there is a fourth dimension in which they only move forward: time.

So let's add time:

t = np.arange(data.shape[0])  # simple assumption that data was sampled in regular steps

fitx = np.polyfit(t, x, 4)
fity = np.polyfit(t, y, 4)
fitz = np.polyfit(t, z, 4)

Now the particle is modeled to move freely in space, as a function of time.
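A sketch of this parametric approach on a hypothetical helix, a trajectory that no single function of x could describe:

```python
import numpy as np

# Hypothetical (N, 3) trajectory: a helix, which is not monotonic in x
t = np.linspace(0.0, 4.0 * np.pi, 100)
data = np.column_stack([np.cos(t), np.sin(t), t])

steps = np.arange(data.shape[0])  # time as the independent variable
fitx = np.polyfit(steps, data[:, 0], 4)
fity = np.polyfit(steps, data[:, 1], 4)
fitz = np.polyfit(steps, data[:, 2], 4)

# Reconstruct the modeled trajectory at the sample times
model = np.column_stack([np.polyval(f, steps) for f in (fitx, fity, fitz)])
print(model.shape)  # (100, 3)
```

The z coordinate is linear in time, so its fit is essentially exact; the x and y oscillations would need a higher degree (or a spline) for a tight fit, but the model is now free of the monotonicity restriction.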

Snipes answered 27/7/2017 at 12:57
I just want to say I really like your figures ;) — Panayiotis
@Panayiotis I feel honored :) — Snipes

You can do it with sklearn, using PolynomialFeatures.

For example, consider the following code:

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = np.random.rand(100, 2)
y = X[:, 0]**2 - 3 * X[:, 1]**3 + 2 * X[:, 0] * X[:, 1] - 5

poly = PolynomialFeatures(degree=3)
X_t = poly.fit_transform(X)

clf = LinearRegression()
clf.fit(X_t, y)
print(clf.coef_)
print(clf.intercept_)

This prints:

[  0.00000000e+00   1.08482012e-15   9.65543103e-16   1.00000000e+00
   2.00000000e+00  -1.18336405e-15   2.06115185e-15   1.82058329e-15
   2.33420247e-15  -3.00000000e+00]
-5.0

which recovers the coefficients (and intercept) of the generating polynomial exactly, up to floating-point noise.
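To use the fitted model on new inputs, run them through the same PolynomialFeatures transform before calling predict. A sketch with hypothetical test points:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = X[:, 0]**2 - 3 * X[:, 1]**3 + 2 * X[:, 0] * X[:, 1] - 5

poly = PolynomialFeatures(degree=3)
X_t = poly.fit_transform(X)
clf = LinearRegression().fit(X_t, y)

# New points must go through the *same* fitted transform
X_new = rng.random((5, 2))
y_pred = clf.predict(poly.transform(X_new))
y_true = X_new[:, 0]**2 - 3 * X_new[:, 1]**3 + 2 * X_new[:, 0] * X_new[:, 1] - 5
print(np.allclose(y_pred, y_true))  # True: the polynomial is recovered exactly
```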

Ammon answered 27/7/2017 at 12:57
Great answer, Miriam. Not wanting to create too many threads on Stack Overflow: would someone know how I can extract a goodness-of-fit measure, such as r-squared or the sum of squared errors, after fitting the polynomial to the original data set? — Wreak
from sklearn.metrics import r2_score, then r2_score(y, clf.predict(X_t)) — Tifanytiff
