Board Programming
Author sucker (can't fix)
Title $60,000 offered for a solution: Perform a plane of best fit
Time 2015-11-29 Sun. 19:40:34


Some symbols in the original document cannot be rendered here; if interested, please email: gmail.com and I will send the original manuscript separately for reference.

Perform a plane of best fit

Given a long list of voxels with positions, perform a plane of best fit using SVD.

Let us first set up the problem: we have $n$ points with position vectors $r_i = (x_i, y_i, z_i)$, $i = 1..n$, which cluster around a plane of best fit that has unit normal $u = (a, b, c)$ and contains a point at position vector $p$. The plane necessarily contains the centroid of the point cloud, $\bar{r} = (\bar{x}, \bar{y}, \bar{z})$, where $\bar{x} = \frac{1}{n}\sum_i x_i$, etc.
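As a rough sketch of this setup in NumPy (the array `points` and its values are made-up example data, not from the original problem):

    import numpy as np

    # n x 3 array of voxel positions, one row per point r_i = (x_i, y_i, z_i)
    points = np.array([[0.1, 0.2, 1.0],
                       [1.1, 0.9, 1.2],
                       [2.0, 2.1, 0.9],
                       [0.9, 2.0, 1.1]])

    # Centroid r_bar = (1/n) * sum_i r_i; the best-fit plane must pass through it
    centroid = points.mean(axis=0)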

Let us also define the matrix $A$ to be the $n \times 3$ matrix containing the data points:

$$A = \begin{pmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_n & y_n & z_n \end{pmatrix}$$
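A minimal sketch of assembling $A$, continuing from the `points` array above and splitting it into coordinate arrays only to show the column-wise construction (the names `xs`, `ys`, `zs` are mine):

    # xs, ys, zs: 1-D arrays of the voxel coordinates
    xs, ys, zs = points[:, 0], points[:, 1], points[:, 2]

    # Stack them column-wise into the n x 3 matrix A, one row per data point
    A = np.column_stack((xs, ys, zs))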

The distance $d_i$ from the point at $r_i$ to the plane is simply the magnitude of the projection of $(r_i - p)$ onto the normal $u$:

$$d_i = (r_i - p) \cdot u / |u| = (r_i - p) \cdot u,$$

since $u$ is a unit vector.

Let us also define the objective (“goodness-of-fit”) function $O(\{r_i\}, u)$, where $\{r_i\}$ is the set of position vectors of all points in the point cloud:

$$O(\{r_i\}, u) = \sum_i d_i^2 = \sum_i \big((r_i - p) \cdot u\big)^2 = \sum_i (r_i \cdot u - p \cdot u)^2$$

As with any least-squares problem, the objective function is a sum of squared residuals, and minimising it gives the optimum value of the normal vector $u$.

It is easily shown that the centroid of the data must lie on the least-squares plane, so $p = \frac{1}{n}\sum_i r_i$. Further, if we assume that the data points have been translated so that their centroid is at the origin, we can set $p = \frac{1}{n}\sum_i r_i = 0$, and:

O({ri},u) = i(ri.u)2

However, establishing the orientation of the best-fitting plane is slightly more difficult. Essentially, we are faced with a constrained minimization problem: if we define $K = u \cdot u - 1 = |u|^2 - 1$, then we must minimize the objective function $O(\{r_i\}, u)$ subject to the additional constraint $K = 0$. Using the method of Lagrange multipliers, the minimum is shown to occur where

O = K,

for some real number $\lambda$. We note that we are trying to minimize the objective function with respect to the orientation of the plane, i.e. the components of $u = (a, b, c)$, so the gradient operator takes the form

$$\nabla = \left( \frac{\partial}{\partial a}, \frac{\partial}{\partial b}, \frac{\partial}{\partial c} \right).$$

Since O = 2(ATA)u and K = 2u, we find that our constrained minimization problem reduces to a simple eigenvalue problem. In solving this system of equations, we choose to use the singular value decomposition method (SVD); while this methodology is computationally intensive, it does promote numerical stability, particularly useful when the normal matrix (ATA)u is very ill-conditioned.

The SVD method yields three distinct eigenvectors, and we must determine the one such that the objective function $O(\{r_i\}, u)$ is minimized. We can write $(A^T A)u = \lambda u$ as:

ixi(u.ri) = a
iyi(u.ri) = b
izi(u.ri) = c

On multiplying the three equations by $a$, $b$ and $c$ respectively, and then summing them, we obtain:

$$\sum_i (u \cdot r_i)(a x_i + b y_i + c z_i) = \sum_i (u \cdot r_i)^2 = \lambda |u|^2 = \lambda$$

We note that this is the objective function, and now it is clear that to minimise it, we must choose the smallest eigenvalue and the eigenvector corresponding to it. This eigenvector points in the direction normal to the plane of best fit.
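Putting the pieces together, here is a minimal plane-of-best-fit sketch, continuing with the `points` array from the first snippet (the wrapper `fit_plane` is my own illustration, not code from the original post):

    def fit_plane(points):
        """Return (centroid, unit normal) of the least-squares plane."""
        centroid = points.mean(axis=0)
        _, _, Vt = np.linalg.svd(points - centroid)
        # np.linalg.svd returns singular values in descending order, so the
        # last row of Vt is the direction with the smallest eigenvalue of
        # A^T A, i.e. the normal of the plane of best fit.
        return centroid, Vt[-1]

    p, u = fit_plane(points)
    print("plane through", p, "with unit normal", u)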

--
※ Author: sucker  Time: 2015-11-29 19:40:34
※ Board: Programming  Post rating: 0  Current readers: 0  Total views: 1076
1F  Time: 2016-01-28 22:37:35 (Australia)
  01-28 22:37 AU
Where exactly is the email??
2F  Time: 2016-11-26 13:26:30 (Taiwan)
  11-26 13:26 TW
Is it somewhere far off in the sky?