Lucas-Kanade tracker

From MMVLWiki

Tracking of a texture patch in a NASA HD video with the Lucas-Kanade tracker (using a 2-D affine model)
Visualisation of Lucas-Kanade template tracking (2-D affine model, 2-D homography). Note that the algorithm is sensitive to illumination changes, which are not modelled in this implementation (the videos can also be downloaded here: 2-D affine on a book (1.78 MByte), 2-D homography on a book (2.28 MByte), and 2-D homography on a cup (2.05 MByte))
Tracking of a nano-indenter in a TEM video (using an isometric model) with high magnification and low magnification. The indenter is lost when it moves too fast for the tracking algorithm

The Lucas-Kanade tracking algorithm iteratively minimises the difference between the image and a warped template. The technique can be used for image alignment, tracking, optical flow analysis, and motion estimation.
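Following Baker and Matthews, the inverse compositional variant used below can be summarised in a few formulas (T is the template, I the image, and W(x; p) the warp with pose vector p):

```latex
% Sum-of-squares error that each tracking step reduces:
\sum_{\mathbf{x}} \bigl[ I(W(\mathbf{x};\mathbf{p})) - T(\mathbf{x}) \bigr]^2

% Hessian, precomputed once from template gradient and warp Jacobian:
H = \sum_{\mathbf{x}} \Bigl[ \nabla T \tfrac{\partial W}{\partial \mathbf{p}} \Bigr]^T
                      \Bigl[ \nabla T \tfrac{\partial W}{\partial \mathbf{p}} \Bigr]

% Per-iteration update and warp composition:
\Delta\mathbf{p} = H^{-1} \sum_{\mathbf{x}}
    \Bigl[ \nabla T \tfrac{\partial W}{\partial \mathbf{p}} \Bigr]^T
    \bigl[ I(W(\mathbf{x};\mathbf{p})) - T(\mathbf{x}) \bigr], \qquad
W(\mathbf{x};\mathbf{p}) \leftarrow W(\mathbf{x};\mathbf{p}) \circ W(\mathbf{x};\Delta\mathbf{p})^{-1}
```

In the implementation below, c corresponds to the steepest-descent images (the template gradient multiplied with the warp Jacobian), hs to H, s to the sum on the right-hand side, and d to the update of the pose vector.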

For the mathematical background, have a look at the web page of the CMU project "Lucas-Kanade 20 years on" (https://www.ri.cmu.edu/projects/project_515.html) and at the publication by Baker and Matthews (https://www.ri.cmu.edu/pub_files/pub3/baker_simon_2004_1/baker_simon_2004_1.pdf).

Implementation

The crucial parts of the implementation are only a few lines of code. An initial parameter vector p (obtained by performing object recognition), an image img, and a template tpl are required. In this example we are using a 2-D isometric model with three degrees of freedom, i.e. p has three elements: x-shift, y-shift, and rotation angle. In this case the tracking algorithm (inverse compositional Lucas-Kanade) is initialised as follows:

# three numbers indicating x-, y-position and angle
p = Vector[ xshift, yshift, rotation ]
# retrieve width and height of tracking template
w, h = *tpl.shape
# create a 2-D array with x-values and a 2-D array with y-values
x, y = xramp( w, h ), yramp( w, h )
# standard deviation of the Gaussian gradient filter
sigma = 5.0
# compute Gaussian gradient of the template in x- and y-direction
gx = tpl.gauss_gradient_x( sigma )
gy = tpl.gauss_gradient_y( sigma )
# multiply the warp Jacobian with the template gradient to get the
# steepest-descent images (note that x, y, gx, and gy are 2-D arrays)
c = Matrix[ [ 1, 0 ], [ 0, 1 ], [ -y, x ] ] * Vector[ gx, gy ]
# compute Hessian matrix
hs = ( c * c.covector ).collect { |e| e.sum }
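To make the Jacobian and Hessian computation concrete without HornetsEye, here is a minimal pure-Ruby sketch using the standard library's Matrix class instead of MultiArray. It uses a pure-translation model (two degrees of freedom) for brevity, central-difference gradients as a crude stand-in for the Gaussian gradients, and made-up template values:

```ruby
require 'matrix'

# Small synthetic template (values made up for illustration)
tpl = [ [0.0, 1.0, 2.0, 3.0],
        [1.0, 2.0, 4.0, 4.0],
        [2.0, 4.0, 8.0, 5.0],
        [3.0, 4.0, 5.0, 6.0] ]
h, w = tpl.size, tpl[0].size

# Central-difference gradients with clamped borders (a crude stand-in
# for gauss_gradient_x / gauss_gradient_y)
gx = Array.new(h) { |j| Array.new(w) { |i|
  (tpl[j][[i + 1, w - 1].min] - tpl[j][[i - 1, 0].max]) / 2.0 } }
gy = Array.new(h) { |j| Array.new(w) { |i|
  (tpl[[j + 1, h - 1].min][i] - tpl[[j - 1, 0].max][i]) / 2.0 } }

# For a pure translation W(x; p) = x + p the warp Jacobian is the
# identity, so the steepest-descent images are simply [gx, gy] and the
# Hessian is the sum of the per-pixel outer products
hs = Matrix.zero(2)
h.times { |j| w.times { |i|
  g = Matrix.column_vector([gx[j][i], gy[j][i]])
  hs += g * g.transpose } }
```

The isometric model above extends this with a third steepest-descent image, -y * gx + x * gy, coming from the rotation row [ -y, x ] of the warp Jacobian.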

A tracking step is then performed by applying the following piece of code to each image img. Usually the step is iterated several times per image to refine the pose estimate.

# allocate 3-D array with warp vectors
field = MultiArray.new( MultiArray::SFLOAT, w, h, 2 )
# compute first component of warp vectors
field[ 0...w, 0...h, 0 ] = x * cos( p[2] ) - y * sin( p[2] ) + p[0]
# compute second component of warp vectors
field[ 0...w, 0...h, 1 ] = x * sin( p[2] ) + y * cos( p[2] ) + p[1]
# take difference of warped image and template
diff = img.warp_clipped_interpolate( field ) - tpl
# project the difference onto the steepest-descent images
# (note that some elements of c are 2-D arrays)
s = c.collect { |e| ( e * diff ).sum }
# get estimate for change of pose
d = hs.inverse * s
# update pose vector
p += Matrix[ [  cos(p[2]), -sin(p[2]), 0 ],
             [  sin(p[2]),  cos(p[2]), 0 ],
             [          0,          0, 1 ] ] * d
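The complete loop can be sketched in pure Ruby for a translation-only model. Everything here is made up for illustration: the synthetic image, the bilinear interp helper standing in for warp_clipped_interpolate, and the patch position. Starting from a deliberately wrong pose estimate, repeated tracking steps pull the estimate back to the true position:

```ruby
require 'matrix'

# Synthetic image: a smooth intensity surface (made up for illustration)
img = Array.new(32) { |y| Array.new(32) { |x|
  Math.sin(x * 0.3) + Math.cos(y * 0.2) + 0.05 * x } }

# Bilinear interpolation, standing in for warp_clipped_interpolate
def interp(img, x, y)
  x0, y0 = x.floor, y.floor
  fx, fy = x - x0, y - y0
  x1, y1 = [x0 + 1, 31].min, [y0 + 1, 31].min
  (1 - fy) * ((1 - fx) * img[y0][x0] + fx * img[y0][x1]) +
        fy * ((1 - fx) * img[y1][x0] + fx * img[y1][x1])
end

# 8x8 template taken at the true position (10, 12)
tw, tx, ty = 8, 10.0, 12.0
tpl = Array.new(tw) { |j| Array.new(tw) { |i| interp(img, tx + i, ty + j) } }

# Template gradients by central differences (instead of Gaussian gradients)
gx = Array.new(tw) { |j| Array.new(tw) { |i|
  interp(img, tx + i + 0.5, ty + j) - interp(img, tx + i - 0.5, ty + j) } }
gy = Array.new(tw) { |j| Array.new(tw) { |i|
  interp(img, tx + i, ty + j + 0.5) - interp(img, tx + i, ty + j - 0.5) } }

# Hessian, precomputed once (the warp Jacobian of a translation is the identity)
hs = Matrix.zero(2)
tw.times { |j| tw.times { |i|
  g = Matrix.column_vector([gx[j][i], gy[j][i]])
  hs += g * g.transpose } }
hinv = hs.inverse

# Start from a wrong pose estimate and iterate the tracking step
p = Vector[tx + 1.5, ty - 1.0]
10.times do
  s = Matrix.column_vector([0.0, 0.0])
  tw.times { |j| tw.times { |i|
    # difference of warped image and template
    diff = interp(img, p[0] + i, p[1] + j) - tpl[j][i]
    s += Matrix.column_vector([gx[j][i] * diff, gy[j][i] * diff]) } }
  # estimate for the change of pose
  d = hinv * s
  # inverse compositional update: for a translation, composing with the
  # inverse incremental warp amounts to subtracting the estimate
  p -= Vector[d[0, 0], d[1, 0]]
end
# p should now be close to the true position (10, 12)
```

The isometric model in the listing above additionally rotates the incremental update into the current frame, which is what the matrix multiplication in the pose update does.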

A full implementation is available as an example application with HornetsEye. The implementation uses interpolation, which is very important for the stability of the Lucas-Kanade tracker. Furthermore, the gradient is computed using the surroundings of the initial template to avoid boundary effects. Note that the implementation does not model illumination changes, so the homography and affine models require controlled lighting conditions. You can find a listing of the source code here (https://www.wedesoft.demon.co.uk/hornetseye-api/files/lktracker-txt.html).

