International Journal of Image, Graphics and Signal Processing (IJIGSP)

ISSN: 2074-9074 (Print), ISSN: 2074-9082 (Online)

Published By: MECS Press

IJIGSP Vol.4, No.12, Nov. 2012

Occluded Human Tracking and Identification Using Image Annotation

Full Text (PDF, 337 KB), pp. 43-49



Devinder Kumar, Amarjot Singh

Index Terms

Image Annotation, Human Tracking, Optical Flow, Motion Tracking


Abstract

The important task of human tracking can be difficult to implement in real-world environments, as videos can involve complex scenes, severe occlusion, and even moving backgrounds. Tracking individual objects in a cluttered scene is an important aspect of surveillance. In addition, such systems should avoid misclassification, which can lead to inaccurate tracking. This paper makes use of an efficient image annotation method for human tracking. According to the literature survey, this is the first paper to propose applying an image annotation algorithm to human tracking. The method divides the video scene into multiple layers, assigning each layer to an individual object of interest. Since each layer is assigned to a specific object in the video sequence: (i) we can track and analyse the movement of each object individually; (ii) the method is able to refrain from misclassification, as each object has been assigned its own layer. The error incurred by the system as objects move from one frame to another is presented with detailed simulations and is compared against the conventional Horn-Schunck method alone.
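For context, the conventional Horn-Schunck baseline that the paper compares against can be sketched as follows. This is a minimal illustrative implementation of the classic Horn-Schunck dense optical flow, not the authors' annotation-based tracker; the function name, the wrap-around boundary handling, and the parameter defaults (`alpha`, `n_iter`) are assumptions made for this sketch.

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=500):
    """Minimal Horn-Schunck dense optical flow between two grayscale frames.

    alpha weights the smoothness term; n_iter is the number of
    Jacobi-style relaxation sweeps. Returns the per-pixel flow (u, v).
    """
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    # Spatial derivatives averaged over both frames; temporal difference.
    fx = (np.gradient(im1, axis=1) + np.gradient(im2, axis=1)) / 2.0
    fy = (np.gradient(im1, axis=0) + np.gradient(im2, axis=0)) / 2.0
    ft = im2 - im1

    def local_avg(f):
        # Weighted neighbourhood average (wrap-around boundaries via np.roll):
        # 1/6 for the 4-connected neighbours, 1/12 for the diagonals.
        return ((np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                 np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 6.0 +
                (np.roll(np.roll(f, 1, 0), 1, 1) +
                 np.roll(np.roll(f, 1, 0), -1, 1) +
                 np.roll(np.roll(f, -1, 0), 1, 1) +
                 np.roll(np.roll(f, -1, 0), -1, 1)) / 12.0)

    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar, v_bar = local_avg(u), local_avg(v)
        # Standard Horn-Schunck update: penalise brightness-constancy
        # violations, regularised by the smoothness weight alpha.
        d = (fx * u_bar + fy * v_bar + ft) / (alpha ** 2 + fx ** 2 + fy ** 2)
        u = u_bar - fx * d
        v = v_bar - fy * d
    return u, v
```

In a layered scheme like the one the abstract describes, a flow field of this kind could in principle be computed per annotated layer (masking each object's region) rather than over the whole frame, which is what lets each object be tracked and analysed individually.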

Cite This Paper

Devinder Kumar, Amarjot Singh, "Occluded Human Tracking and Identification Using Image Annotation", IJIGSP, vol. 4, no. 12, pp. 43-49, 2012.


References

[1] Pervasive Computing, Lecture Notes in Computer Science, vol. 3468, pp. 329-334, 2005.

[2] J. Sattar and G. Dudek, "Underwater Human-Robot Interaction via Biological Motion Identification," in Proc. Robotics: Science and Systems V (RSS), MIT Press, pp. 185-192, Seattle, WA, USA, June-July 2009.

[3] D. Wilson and C. Atkeson, "Simultaneous Tracking and Activity Recognition (STAR) Using Many Anonymous, Binary Sensors."

[4] L. M. Fuentes and S. A. Velastin, "Human Tracking in Surveillance Applications," in Proc. 2nd IEEE International Workshop on PETS, 2001.

[5] A. Treptow, G. Cielniak, and T. Duckett, "Real-time human tracking for mobile robots using thermal vision," Robotics and Autonomous Systems, vol. 54, no. 9, p. 729, 2006. ISSN 0921-8890.

[6] D. Moore, "A Real-World System for Human Motion Detection and Tracking," California Institute of Technology, 2003.

[7] R. Fablet and M. J. Black, "Automatic Detection and Tracking of Human Motion with a View-Based Representation," in Proc. ECCV, 2002.

[8] I. Haritaoglu, D. Harwood, and L. S. Davis, "W4: Real-time surveillance of people and their activities," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 809-830, 2000.

[9] I. Haritaoglu and M. Flickner, "Detection and tracking of shopping groups in stores," in Proc. IEEE Computer Vision and Pattern Recognition, vol. 1, pp. 431-438, 8-14 December 2001.

[10] C. Liu, W. T. Freeman, E. H. Adelson, and Y. Weiss, "Human-Assisted Motion Annotation," in Proc. IEEE Computer Vision and Pattern Recognition (CVPR), 2008.

[11] M. J. Black and P. Anandan, "The robust estimation of multiple motions: Parametric and piecewise-smooth flow fields," Computer Vision and Image Understanding, vol. 63, no. 1, pp. 75-104, January 1996.

[12] T. Brox, A. Bruhn, N. Papenberg, and J. Weickert, "High accuracy optical flow estimation based on a theory for warping," in Proc. ECCV, pp. 25-36, 2004.

[13] A. Bruhn, J. Weickert, and C. Schnörr, "Lucas/Kanade meets Horn/Schunck: Combining local and global optical flow methods," International Journal of Computer Vision, vol. 61, no. 3, pp. 211-231, 2005.

[14] B. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in Proc. Int. Joint Conf. on Artificial Intelligence, pp. 674-679, 1981.

[15] A. Agarwala, A. Hertzmann, D. H. Salesin, and S. M. Seitz, "Keyframe-based tracking for rotoscoping and animation," ACM SIGGRAPH, vol. 23, no. 3, pp. 584-591, 2004.