Texture-Independent Long-Term Tracking Using Virtual Corners
by Lebeda, Karel, Hadfield, Simon, Matas, Jiří and Bowden, Richard
Abstract:
Long-term tracking of an object, given only a single instance in an initial frame, remains an open problem. We propose a visual tracking algorithm robust to many of the difficulties which often occur in real-world scenes. Correspondences of edge-based features are used to overcome the reliance on the texture of the tracked object and to improve invariance to lighting. Furthermore, we address long-term stability, enabling the tracker to recover from drift and to provide redetection following object disappearance or occlusion. The two-module principle is similar to the successful state-of-the-art long-term TLD tracker; however, our approach offers better performance in benchmarks and extends to cases of low-textured objects. This becomes obvious in cases of plain objects with no texture at all, where the edge-based approach proves the most beneficial. We perform several different experiments to validate the proposed method. Firstly, results on short-term sequences show the performance of tracking challenging (low-textured and/or transparent) objects which represent failure cases for competing state-of-the-art approaches. Secondly, long sequences are tracked, including one of almost 30,000 frames which, to our knowledge, is the longest tracking sequence reported to date. This tests the redetection and drift-resistance properties of the tracker. Finally, we report results of the proposed tracker on the VOT Challenge 2013 and 2014 datasets as well as on the VTB1.0 benchmark, and we show the relative performance of the tracker compared to its competitors. All the results are comparable to the state-of-the-art on sequences with textured objects and superior on non-textured objects. The new annotated sequences are made publicly available.
Reference:
Texture-Independent Long-Term Tracking Using Virtual Corners (Lebeda, Karel, Hadfield, Simon, Matas, Jiří and Bowden, Richard), In IEEE Transactions on Image Processing, volume 25, number 1, pages 359–371, 2016.
Bibtex Entry:
@Article{Lebeda15c,
  Title                    = {Texture-Independent Long-Term Tracking Using Virtual Corners},
  Author                   = {Lebeda, Karel and Hadfield, Simon and Matas, Ji{\v r}{\'\i} and Bowden, Richard},
  Journal                  = {IEEE Transactions on Image Processing},
  Year                     = {2016},
  Volume                   = {25},
  Number                   = {1},
  Pages                    = {359--371},
  Gsid                     = {16422168168389859169},
  Abstract                 = {Long-term tracking of an object, given only a single instance in an initial frame, remains an open problem. We propose a visual tracking algorithm robust to many of the difficulties which often occur in real-world scenes. Correspondences of edge-based features are used to overcome the reliance on the texture of the tracked object and to improve invariance to lighting. Furthermore, we address long-term stability, enabling the tracker to recover from drift and to provide redetection following object disappearance or occlusion. The two-module principle is similar to the successful state-of-the-art long-term TLD tracker; however, our approach offers better performance in benchmarks and extends to cases of low-textured objects. This becomes obvious in cases of plain objects with no texture at all, where the edge-based approach proves the most beneficial. We perform several different experiments to validate the proposed method. Firstly, results on short-term sequences show the performance of tracking challenging (low-textured and/or transparent) objects which represent failure cases for competing state-of-the-art approaches. Secondly, long sequences are tracked, including one of almost 30,000 frames which, to our knowledge, is the longest tracking sequence reported to date. This tests the redetection and drift-resistance properties of the tracker. Finally, we report results of the proposed tracker on the VOT Challenge 2013 and 2014 datasets as well as on the VTB1.0 benchmark, and we show the relative performance of the tracker compared to its competitors. All the results are comparable to the state-of-the-art on sequences with textured objects and superior on non-textured objects. The new annotated sequences are made publicly available.},
  Doi                      = {10.1109/TIP.2015.2497141},
  Keywords                 = {Machine vision, image motion analysis, visual tracking, long-term tracking, low texture, edge, line correspondence},
  Project                  = {EPSRC EP/I011811/1, GACR P103/12/G084},
  Status                   = {accepted},
  Url                      = {http://cvssp.org/Personal/KarelLebeda/papers/TIP2016.pdf}
}