Understanding Human Activity (P C Yuen et al.)

Automatically understanding what people are doing in surveillance video is currently an active research topic in the computer vision and pattern recognition community because of its many practical applications, ranging from national-security video surveillance and shopping-mall monitoring to smart homes for the elderly. Achieving this goal requires research on (i) human detection and tracking within a camera view as well as across cameras, (ii) human action recognition, and (iii) human-object and human-human interaction analysis. The goal of this project is to understand what is happening in a surveillance scene.


Research Challenges:


Recent Findings:

 


Green box: initial location; blue box: proposed method; red box: IVT; yellow box: L1 tracker

Illumination insensitive face tracking results [4]
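
The tracker of [4] is not reproduced here; the sketch below only illustrates, on made-up toy data, one standard way of making template matching insensitive to global illumination changes: every candidate patch is normalised to zero mean and unit variance before correlation (normalised cross-correlation), so a brightness or contrast change does not alter the matching score.

```python
# Minimal illustration (not the method of [4]): normalised cross-correlation
# template tracking. Each candidate patch is reduced to zero mean and unit
# variance, so a global brightness/contrast change leaves the score unchanged.
import numpy as np

def normalise(patch):
    p = patch.astype(np.float64)
    return (p - p.mean()) / (p.std() + 1e-8)

def track(frame, template, prev_xy, radius=8):
    """Exhaustive search around the previous location; the best NCC score wins."""
    th, tw = template.shape
    tz = normalise(template)
    best_score, best_xy = -np.inf, prev_xy
    x0, y0 = prev_xy
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = x0 + dx, y0 + dy
            if x < 0 or y < 0 or y + th > frame.shape[0] or x + tw > frame.shape[1]:
                continue
            score = float((normalise(frame[y:y + th, x:x + tw]) * tz).sum())
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy

# Toy check: the target shifts by (4, 3) pixels and the illumination changes,
# yet it is still located correctly (expected output: (54, 43)).
rng = np.random.default_rng(0)
frame0 = rng.random((120, 160))
template = frame0[40:60, 50:70].copy()
frame1 = np.roll(frame0, (3, 4), axis=(0, 1)) * 1.8 + 0.2
print(track(frame1, template, prev_xy=(50, 40)))
```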

 

Person re-identification (tracking) under disjoint camera views [1]
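
As a rough illustration of the re-identification setting only (not the approach of [1]), the sketch below ranks gallery candidates seen by a second camera against a probe image by comparing simple colour-histogram descriptors; the descriptors, distance measure and toy data are all assumptions made for illustration.

```python
# Generic re-identification sketch (not the method of [1]): rank candidates from
# camera B against a probe from camera A using colour-histogram descriptors and
# the Bhattacharyya distance.
import numpy as np

def colour_histogram(img, bins=8):
    """Per-channel histograms of an HxWx3 image in [0,1], concatenated and normalised."""
    hist = np.concatenate([np.histogram(img[..., c], bins=bins, range=(0, 1))[0]
                           for c in range(3)]).astype(np.float64)
    return hist / hist.sum()

def bhattacharyya(p, q):
    return -np.log(np.sum(np.sqrt(p * q)) + 1e-12)

# Toy gallery: five identities, each with its own dominant appearance.
rng = np.random.default_rng(1)
base = [np.full(3, v) for v in (0.15, 0.35, 0.55, 0.75, 0.9)]
gallery = [np.clip(b + 0.1 * rng.standard_normal((64, 32, 3)), 0, 1) for b in base]
# Probe: identity 3 re-observed by another camera (fresh noise, same appearance).
probe = np.clip(base[3] + 0.1 * rng.standard_normal((64, 32, 3)), 0, 1)

d = [bhattacharyya(colour_histogram(probe), colour_histogram(g)) for g in gallery]
print(np.argsort(d))   # identity 3 is expected to be ranked first
```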

 


Spatio-temporal saliency detection results [7]
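
The detector of [7] is not reproduced here; the sketch below only conveys the general spatio-temporal idea on made-up data, combining a temporal term (frame differencing) with a crude spatial contrast term, so that regions which both move and stand out from the rest of the frame score highest.

```python
# Generic spatio-temporal saliency sketch (not the detector of [7]).
import numpy as np

def saliency(prev_frame, frame):
    temporal = np.abs(frame - prev_frame)       # motion energy
    spatial = np.abs(frame - frame.mean())      # crude spatial contrast
    s = temporal * spatial
    return s / (s.max() + 1e-12)

# Toy pair of frames: a bright square moves over a mid-grey background.
f0 = np.full((60, 80), 0.5); f0[20:30, 20:30] = 1.0
f1 = np.full((60, 80), 0.5); f1[20:30, 25:35] = 1.0
s = saliency(f0, f1)
print(np.unravel_index(np.argmax(s), s.shape))  # lands inside the newly covered part of the square
```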

 

 


Generating Salient Action Unit by information saliency curve [6]
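
The exact information-saliency measure of [6] is not reproduced here. As a hedged sketch of the idea, the code below scores each frame transition by the entropy of its frame-difference histogram and keeps the above-average local maxima of the resulting curve as candidate salient action units; both the entropy measure and the peak-picking rule are assumptions made purely for illustration.

```python
# Illustrative "information saliency curve": per-transition entropy of the
# frame-difference histogram, with above-average local maxima taken as
# candidate salient action units. The measure used in [6] may differ.
import numpy as np

def transition_entropy(diff, bins=16):
    hist, _ = np.histogram(diff, bins=bins, range=(0, 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def information_saliency_curve(frames):
    return np.array([transition_entropy(np.abs(frames[t] - frames[t - 1]))
                     for t in range(1, len(frames))])

def salient_units(curve):
    """Local maxima of the curve that are above its mean value."""
    return [t for t in range(1, len(curve) - 1)
            if curve[t] > curve[t - 1] and curve[t] > curve[t + 1]
            and curve[t] > curve.mean()]

# Toy sequence: quiet, a burst of motion around frames 10-15, quiet again.
rng = np.random.default_rng(0)
frames = [np.clip(0.5 + (0.3 if 10 <= t <= 14 else 0.02) * rng.standard_normal((32, 32)), 0, 1)
          for t in range(30)]
curve = information_saliency_curve(frames)
print(salient_units(curve))   # indices are expected to fall inside the motion burst
```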

 

Salient Action Units in Weizmann database [6]

 

Top five EigenActions in Weizmann database [6]
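
"EigenActions" are presumably the action analogue of Eigenfaces, i.e. principal components of vectorised action representations. Below is a minimal PCA sketch assuming each action sample is summarised as a single flattened image (for example a motion-energy image); the actual representation and decomposition used in [6] may differ.

```python
# Minimal "EigenAction"-style PCA sketch: principal components of vectorised
# action samples, computed with an SVD.
import numpy as np

def eigen_actions(samples, k=5):
    """Top-k principal components of row-vectorised samples (one row per action)."""
    X = samples.reshape(len(samples), -1).astype(np.float64)
    X -= X.mean(axis=0)                        # centre the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[:k]                              # each row is one "EigenAction"

# Toy data: 20 random 64x64 "motion-energy images" stand in for real actions.
rng = np.random.default_rng(0)
samples = rng.random((20, 64, 64))
components = eigen_actions(samples, k=5)
print(components.shape)                        # (5, 4096)
# Projecting any sample onto these components gives a compact descriptor
# that can then be fed to a classifier for action recognition.
```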

 

Block diagram of the co-training algorithm [5]
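
The block diagram of [5] is not turned into code here; the sketch below only shows the structure of a standard two-view co-training loop in the spirit of Blum and Mitchell, where each view's classifier labels the unlabelled samples it is most confident about and adds them to the shared labelled pool. The choice of classifier (Gaussian naive Bayes) and the toy data are assumptions; scikit-learn is required.

```python
# Generic two-view co-training loop (illustrative; the variant in [5] and its
# feature views may differ). Requires scikit-learn.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def co_train(Xa, Xb, y, labelled, unlabelled, rounds=5, per_round=2):
    """Xa, Xb: two feature views; labels y are only consulted at the `labelled` indices."""
    labelled, unlabelled, y = list(labelled), list(unlabelled), y.copy()
    clf_a, clf_b = GaussianNB(), GaussianNB()
    for _ in range(rounds):
        if not unlabelled:
            break
        clf_a.fit(Xa[labelled], y[labelled])
        clf_b.fit(Xb[labelled], y[labelled])
        for clf, X in ((clf_a, Xa), (clf_b, Xb)):
            proba = clf.predict_proba(X[unlabelled])
            # move the samples this view is most confident about into the
            # labelled pool, using the classifier's own prediction as the label
            chosen = np.argsort(proba.max(axis=1))[::-1][:per_round]
            for i in sorted(chosen, reverse=True):
                idx = unlabelled.pop(i)
                y[idx] = clf.predict(X[[idx]])[0]
                labelled.append(idx)
    clf_a.fit(Xa[labelled], y[labelled])
    clf_b.fit(Xb[labelled], y[labelled])
    return clf_a, clf_b

# Toy usage: two noisy 2-D views of a two-class problem with only 4 labelled samples.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
Xa = y[:, None] + 0.3 * rng.standard_normal((100, 2))
Xb = y[:, None] + 0.3 * rng.standard_normal((100, 2))
clf_a, clf_b = co_train(Xa, Xb, y, labelled=[0, 1, 50, 51],
                        unlabelled=[i for i in range(100) if i not in (0, 1, 50, 51)])
print(clf_a.score(Xa, y), clf_b.score(Xb, y))
```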

 

Manifold learning based action recognition framework [3]
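
The framework of [3] is not reproduced here; the sketch below only shows the generic pattern behind manifold-learning-based recognition: high-dimensional action descriptors are embedded into a low-dimensional space (Isomap is used as a stand-in) and a simple classifier operates in the embedded space. The toy descriptors and all parameters are made up; scikit-learn is required.

```python
# Generic manifold-learning recognition sketch (Isomap embedding + k-NN classifier);
# the actual framework of [3] is not reproduced. Requires scikit-learn.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Toy "action descriptors": 100 samples of 200-D vectors lying on a 1-D manifold,
# with the class determined by the position along that manifold.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
direction = rng.random(200)
X = np.outer(t, direction) + 0.02 * rng.standard_normal((100, 200))
y = (t > 0.5).astype(int)

model = make_pipeline(Isomap(n_neighbors=8, n_components=2),
                      KNeighborsClassifier(n_neighbors=3))
model.fit(X, y)
print(model.score(X, y))   # should be close to 1.0 on this toy data
```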

 

 

Block diagram of score level dependency modeling [3]

 

Block diagram of feature level dependency modeling [3]
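
As a rough illustration of where the two diagrams above differ (the dependency models of [3] themselves are not reproduced), the sketch below contrasts feature-level combination, where descriptors are fused before a single classifier, with score-level combination, where each descriptor gets its own classifier and only their output scores are fused afterwards. The logistic-regression classifiers, the plain score average and the toy data are assumptions; scikit-learn is required.

```python
# Feature-level vs score-level combination of two descriptors (illustrative;
# the dependency modelling of [3] is not reproduced). Requires scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 100)
feat_a = y[:, None] + rng.standard_normal((200, 5))   # descriptor A
feat_b = y[:, None] + rng.standard_normal((200, 5))   # descriptor B

# Feature level: concatenate the descriptors and model them jointly.
joint = np.hstack([feat_a, feat_b])
feature_level = LogisticRegression().fit(joint, y)

# Score level: one classifier per descriptor; only their output scores are
# combined (a plain average here, whereas [3] models the dependency between them).
clf_a = LogisticRegression().fit(feat_a, y)
clf_b = LogisticRegression().fit(feat_b, y)
scores = 0.5 * (clf_a.predict_proba(feat_a)[:, 1] + clf_b.predict_proba(feat_b)[:, 1])

print(feature_level.score(joint, y), ((scores > 0.5) == y).mean())
```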

 


Publications:


Grant Support:

This project is supported by the Research Grants Council (RGC) of Hong Kong SAR, the National Natural Science Foundation of China (NSFC), and the Faculty Research Grant (FRG) of Hong Kong Baptist University.


For further information on this research topic, please contact Prof. P C Yuen.