
I'm working with 3D coordinate data (x, y, z). However, I know that the z coordinate is systematically wrong, and the error in z depends on both x and y. I do have some data where I know the correct z value, so I want to build a model that, given the (x, y, z) coordinates, returns the correct z value, but I'm not sure what the best way of doing so is.

I tried using an SVM (support vector regressor), but it gives pretty bad results. I also tested a k-nearest-neighbors regressor, and it works really well for the data points I have; however, there are holes in the 3D space where I don't have data, and I'm fairly certain the k-nearest-neighbors regressor will not give correct results there. I feel like there should exist a simple polynomial formula that solves the problem, which is why I went for an SVM, but is there any other simple method I can employ to do this correction? I feel like the problem should be solvable by a simple polynomial such as $z_{correct} = c_0 + c_1 x + c_2 x^2 + c_3 y + c_4 y^2 + c_5 z + c_6 z^2$. However, I don't know what formula to use or the best way to find the constants, so any suggestions would be greatly appreciated.

drulludanni
  • Please consider upvoting and accepting the answer or, alternatively, please describe why you consider it not correct or what is not clear in it. – noe Feb 24 '23 at 13:19

1 Answer


Fitting your data to a polynomial, as you described it, is called "polynomial regression". It is equivalent to a linear regression where you introduce new features for the higher-order terms $x^2, y^2, z^2, \ldots$
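As a minimal sketch of this idea with NumPy (the data and coefficient values here are synthetic, made up purely for illustration): build a design matrix containing exactly the features from your formula, then fit it with an ordinary least-squares solver.

```python
import numpy as np

# Synthetic example: assume the true correction really is the polynomial
# from the question, with made-up coefficients.
rng = np.random.default_rng(0)
x, y, z = rng.uniform(-1, 1, (3, 200))
true_c = np.array([0.5, 1.0, -2.0, 0.3, 0.7, 1.5, -0.4])
z_correct = (true_c[0] + true_c[1] * x + true_c[2] * x**2
             + true_c[3] * y + true_c[4] * y**2
             + true_c[5] * z + true_c[6] * z**2)

# Polynomial regression = linear regression on expanded features:
# one column per term c_0, x, x^2, y, y^2, z, z^2.
X = np.column_stack([np.ones_like(x), x, x**2, y, y**2, z, z**2])
c, *_ = np.linalg.lstsq(X, z_correct, rcond=None)
```

On this noise-free synthetic data, `c` recovers `true_c` up to numerical precision; with real noisy data it gives the least-squares best fit instead.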

There is a closed-form solution to find the optimal constants. Here you can find details on the algebra to implement it. Nevertheless, it is also possible to find the values for the constants by means of gradient descent methods. Here you can find a discussion on why using one method or the other.
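The closed-form solution is the normal equations, $c = (X^\top X)^{-1} X^\top y$. A short sketch with random stand-in data (any concrete design matrix would do), showing it agrees with a generic least-squares solver:

```python
import numpy as np

# Stand-in data: X would be your matrix of polynomial features,
# y the known-correct z values.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 7))
y = rng.normal(size=100)

# Normal equations: solve (X^T X) c = X^T y.
# np.linalg.solve is preferred over forming an explicit inverse.
c_closed = np.linalg.solve(X.T @ X, X.T @ y)

# The same constants from a generic least-squares routine:
c_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both approaches give the same coefficients when $X^\top X$ is well conditioned; for ill-conditioned feature matrices the `lstsq` route is the numerically safer choice.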

The gradient descent approach can be easily implemented with any deep learning framework, like PyTorch or TensorFlow. For instance, here you have an example of a PyTorch implementation.
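To show the idea without pulling in a framework, here is a plain-NumPy sketch of the same gradient-descent loop on synthetic data (a PyTorch version would follow the same pattern, with the coefficients as a tensor with `requires_grad=True` and autograd supplying the gradient):

```python
import numpy as np

# Synthetic data: design matrix with an intercept and two features,
# target generated from made-up true coefficients.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, (50, 2))])
true_c = np.array([1.0, -0.5, 2.0])
y = X @ true_c

# Gradient descent on the mean squared error.
c = np.zeros(3)
lr = 0.1
for _ in range(5000):
    grad = 2 / len(y) * X.T @ (X @ c - y)  # d(MSE)/dc
    c -= lr * grad
```

After the loop, `c` is close to `true_c`; the learning rate and iteration count here are ad hoc choices for this toy problem.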

noe