
Joint Tracking of Features and Edges

1. LK Optical Flow

The basic LK optical flow brightness-constancy assumption:

\[I(x+u,y+v,t+1) = I(x,y,t) \]

A first-order Taylor expansion gives:

\[f(u,v;I) = I_xu+I_yv+I_t =0 \]

Because of the aperture problem, we additionally assume that neighboring pixels move identically; with this constraint, the flow can be solved by minimizing

\[E_{LK}(u,v) = K_{\rho}*(f(u,v;I))^2 \]
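As a minimal sketch (the function name and interface are mine, not the paper's; image derivatives are assumed precomputed), minimizing \(E_{LK}\) over a window amounts to solving a weighted 2×2 linear system:

```python
import numpy as np

def lk_flow(Ix, Iy, It, weights):
    """Solve the weighted LK normal equations for one feature window.

    Ix, Iy, It : image derivatives sampled over the window (flattened)
    weights    : window weights, e.g. the Gaussian kernel K_rho (flattened)
    Returns the displacement (u, v) minimizing E_LK.
    """
    # Structure tensor G = sum w * [Ix^2, Ix*Iy; Ix*Iy, Iy^2]
    G = np.array([[np.sum(weights * Ix * Ix), np.sum(weights * Ix * Iy)],
                  [np.sum(weights * Ix * Iy), np.sum(weights * Iy * Iy)]])
    # Right-hand side b = -sum w * [Ix*It, Iy*It]
    b = -np.array([np.sum(weights * Ix * It), np.sum(weights * Iy * It)])
    # Fails when G is singular -- exactly the aperture problem on an edge
    return np.linalg.solve(G, b)
```

When the window straddles a single straight edge, \(G\) is (near-)singular and the solve fails, which is what motivates the regularized formulations below.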

2. Horn-Schunck Optical Flow

\[E_{HS}(u,v) = \int _{\Omega} (f(u,v;I))^2+\lambda( |\nabla u|^2 +|\nabla v|^2)dxdy \]

Here \(\lambda\) is the regularization weight; the second term adds a smoothness constraint.

In the Euler-Lagrange equations, \(\nabla ^2u\) and \(\nabla ^2v\) are the Laplacians of \(u\) and \(v\), which can be approximated by

\[\nabla ^2u \approx h(\overline u - u) \]

i.e., expressed through the neighborhood mean \(\overline u\).
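Built on this approximation, the classic Jacobi-style Horn-Schunck iteration might be sketched as follows (my simplifications: periodic borders via `np.roll`, a 4-neighbor average, and `lam` playing the role of \(\lambda\) with \(h\) absorbed):

```python
import numpy as np

def neighbor_avg(f):
    # 4-neighbor average; np.roll gives periodic borders (a simplification)
    return 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                   np.roll(f, 1, 1) + np.roll(f, -1, 1))

def horn_schunck(Ix, Iy, It, lam=1.0, n_iter=500):
    """Iteratively minimize E_HS with the Laplacian replaced by (u_bar - u)."""
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(n_iter):
        u_bar, v_bar = neighbor_avg(u), neighbor_avg(v)
        # Solve the Euler-Lagrange equations pixelwise for (u, v),
        # treating u_bar, v_bar as constants (Jacobi iteration)
        num = Ix * u_bar + Iy * v_bar + It
        den = lam + Ix ** 2 + Iy ** 2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v
```

Because `lam > 0`, the denominator never vanishes: the smoothness term fills in motion at edge and flat pixels from their neighbors, which LK alone cannot do.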

3. Joint Tracking

\[E_{JLK} = \sum_{i=1}^N (E_D(i)+\lambda_i E_S(i)) \]

\[E_D(i) = K_{\rho}*(f(u_i,v_i;I))^2 \]

\[E_S(i) = (u_i-\hat{u}_i)^2+(v_i-\hat{v}_i)^2 \]

\((\hat{u}_i,\hat{v}_i)^T\) is the expected displacement, which may be obtained in any number of ways.
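With \((\hat{u}_i,\hat{v}_i)\) held fixed, each summand of \(E_{JLK}\) can be minimized in closed form; a sketch (function name and interface are mine):

```python
import numpy as np

def joint_update(Ix, Iy, It, weights, d_hat, lam):
    """Closed-form minimizer of E_D(i) + lam * E_S(i) for a single feature.

    d_hat : predicted displacement (u_hat, v_hat)
    """
    G = np.array([[np.sum(weights * Ix * Ix), np.sum(weights * Ix * Iy)],
                  [np.sum(weights * Ix * Iy), np.sum(weights * Iy * Iy)]])
    c = np.array([np.sum(weights * Ix * It), np.sum(weights * Iy * It)])
    # Setting the gradient of E_D + lam*E_S to zero gives
    #   (G + lam*I) d = -c + lam * d_hat
    return np.linalg.solve(G + lam * np.eye(2), -c + lam * np.asarray(d_hat))
```

Note that \(\lambda_i\) regularizes the structure tensor, so even a feature on an edge (singular \(G\), aperture problem) receives a well-defined update pulled toward the prediction.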

From the paper: "Instead, we predict the motion displacement of a pixel by fitting an affine motion model to the displacements of the surrounding features, which are inversely weighted according to their distance to the pixel. We use a Gaussian weighting function on the distance, with σ = 10 pixels."

So the prediction is obtained by fitting an affine transform to the surrounding features?

A prediction can also be computed from the feature points around each feature:

  • Directly use the mean of \((u,v)\) over the neighborhood

Feature selection:

\[\max(e_{\min},\eta e_{\max}), \quad \eta <1 \]

The paper uses \(\eta=0.1\).
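A sketch of this selection rule (I assume \(e_{\max}\) is the largest minimum-eigenvalue of the structure tensor over the image and \(e_{\min}\) a fixed floor; the paper may define them differently):

```python
import numpy as np

def select_features(min_eigs, e_min=1.0, eta=0.1):
    """Threshold per-pixel minimum eigenvalues of the structure tensor.

    A pixel is kept as a feature when its smaller eigenvalue exceeds
    max(e_min, eta * e_max), with e_max the largest value in the image.
    """
    e_max = min_eigs.max()
    thresh = max(e_min, eta * e_max)
    return min_eigs > thresh
```

The relative term \(\eta e_{\max}\) adapts the threshold to overall image contrast, while \(e_{\min}\) keeps near-textureless pixels out even in low-contrast images.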

4. Unified Point-Edgelet Feature tracking

  • As a further refinement, track edgelets rather than individual edge points
  • The prediction \((\hat{u},\hat{v})\) is no longer an average but is obtained by fitting an affine transform, with fitting weights computed from both distance and scale
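The affine prediction with Gaussian distance weighting (σ = 10 pixels, as in the joint-tracking paper) can be sketched as a weighted least-squares fit; the function name and interface are mine, and scale weighting is omitted:

```python
import numpy as np

def predict_displacement(p, pts, disps, sigma=10.0):
    """Predict (u_hat, v_hat) at point p by fitting an affine motion model
    to the displacements of surrounding features, with a Gaussian weight
    on their distance to p.

    pts   : (N, 2) feature positions
    disps : (N, 2) their measured displacements (u, v)
    """
    pts = np.asarray(pts, float)
    disps = np.asarray(disps, float)
    d2 = np.sum((pts - p) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Design matrix for the affine model [u, v] = [x, y, 1] @ M  (M is 3x2)
    X = np.hstack([pts, np.ones((len(pts), 1))])
    W = np.sqrt(w)[:, None]          # weighted least squares via row scaling
    M, *_ = np.linalg.lstsq(W * X, W * disps, rcond=None)
    return np.array([p[0], p[1], 1.0]) @ M
```

At least three non-collinear features are needed for the fit to be well posed; with fewer, falling back to the weighted mean would be a natural degenerate case.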

5. How the Predicted \(u,v\) Are Computed

They are computed as a weighted combination of the \((u,v)\) of neighboring features.

6. Next Steps

The common idea of these methods is to let point and edge features complement each other, so that each compensates for the other's weaknesses. My plan is to follow the joint-tracking approach, but replace the plain average with a weighted combination, as a simple first experiment.

7. References

  • Birchfield S T, Pundlik S J. Joint tracking of features and edges. CVPR 2008.
  • Sundararajan K. Unified point-edgelet feature tracking [D]. Dissertations & Theses - Gradworks, 2011.
posted @ 2019-10-25 16:16 椒盐蘑菇