Research Projects

1 - PhD Project

1.1 - Title

Modelling and Comparing Human Gait from Monocular Video Sequences

1.2 - Collaborators

  • Dr Robert Bergevin (Ph.D. Advisor), Dept. of Electrical and Computer Engineering, Laval University, Quebec City, Quebec, Canada.
  • Dr Alexandra Branzan Albu (Ph.D. Co-advisor), Dept. of Electrical and Computer Engineering, University of Victoria, Victoria, British Columbia, Canada.

1.3 - Description

The detection and tracking of people by means of computerized camera systems has been the subject of many research projects in recent years. Several approaches have been proposed to solve this problem; however, these approaches are sometimes unrealistic, often requiring a constrained environment as well as the cooperation of the people being observed. It would be desirable to track and recognize people using more natural criteria and a less invasive approach, such as observing a person's gait.

A person's gait is mainly characterized by the position of each limb and the movement it carries out over time while the person is walking. Consequently, the problem first consists in automatically finding and tracking the principal body parts in a monocular video sequence. Most of the motion performed by a walking person involves the extremities, namely the head, the hands, and the feet. Since hand motion is not constrained during walking, only the head and the feet are found and tracked. The tracking algorithm developed during this project tracks the head and feet positions in real time. The following video sequence shows the resulting tracking for a person walking in front of a monocular camera. It is important to note that the tracking algorithm maintains the feet correspondence during feet occlusions, that is, the yellow and red dots stay on the same foot throughout the entire video sequence.
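The exact correspondence method is described in the papers; a minimal sketch of one common way to keep foot labels consistent across frames, namely choosing the label-to-detection pairing with the smaller total displacement, could look like this (function and label names are illustrative, not from the project):

```python
import math

def assign_feet(tracks, detections):
    """Assign two new foot detections to two existing foot tracks.

    tracks: dict label -> (x, y) last known position of each foot.
    detections: list of two (x, y) points from the current frame.
    Picks the pairing with the smaller total displacement, so labels
    (e.g. the yellow and red dots) stay on the same foot.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    la, lb = list(tracks)
    straight = dist(tracks[la], detections[0]) + dist(tracks[lb], detections[1])
    swapped = dist(tracks[la], detections[1]) + dist(tracks[lb], detections[0])
    if straight <= swapped:
        return {la: detections[0], lb: detections[1]}
    return {la: detections[1], lb: detections[0]}
```

A scheme like this keeps correspondence through brief occlusions as long as each foot's detection reappears closer to its own last position than to the other foot's.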

A "normalization" process has been developed in order to rectify the head and feet trajectories. This normalization is required in order to obtain body part trajectories that are invariant to the viewpoint and to changes in the walking direction. The body part trajectories are transformed so that they appear to have been extracted from a fronto-parallel view (side view), which is the best view for extracting gait characteristics. Once the body part trajectories are normalized, view-invariant gait characteristics can be extracted.
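The rectifying transform here is a planar projective mapping. Assuming a 3x3 homography matrix H has already been estimated for a given walking segment, applying it to trajectory points amounts to lifting each point to homogeneous coordinates and dividing by the resulting scale (a generic sketch, not the project's code):

```python
def apply_homography(H, points):
    """Map 2D points through a 3x3 homography H (list of 3 row lists).

    Each point (x, y) is lifted to (x, y, 1), multiplied by H, and
    divided by the resulting w to get rectified image coordinates.
    """
    out = []
    for x, y in points:
        xh = H[0][0] * x + H[0][1] * y + H[0][2]
        yh = H[1][0] * x + H[1][1] * y + H[1][2]
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        out.append((xh / w, yh / w))
    return out
```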

The following video sequence shows the result of the normalization process for two people following different paths. First, gait half-cycles are detected by analyzing the distance between the feet: a half-cycle ends and another begins when the feet distance is maximal. For each detected gait half-cycle, an apparent "walking plane" is defined using the head and feet positions at the beginning and the end of the half-cycle. These planes, which are distorted by perspective projection, can be rectified by computing a homography transformation that makes them appear as if they were observed from a fronto-parallel view (as rectangles). Once a homography has been defined for each walking plane, it can also be applied to the body part trajectories to rectify them.
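The half-cycle boundaries described above are the local maxima of the inter-feet distance over time. A minimal sketch of this detection step (ignoring the noise filtering a real signal would need) could be:

```python
import math

def half_cycle_boundaries(left_feet, right_feet):
    """Find frames where the inter-feet distance is a local maximum.

    left_feet, right_feet: lists of (x, y) foot positions per frame.
    Returns the frame indices where one gait half-cycle ends and the
    next begins (i.e. where the feet distance peaks).
    """
    d = [math.hypot(lx - rx, ly - ry)
         for (lx, ly), (rx, ry) in zip(left_feet, right_feet)]
    peaks = []
    for i in range(1, len(d) - 1):
        if d[i] >= d[i - 1] and d[i] > d[i + 1]:
            peaks.append(i)
    return peaks
```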

Gait models are to be constructed using characteristics extracted from the normalized body part trajectories. It will then be possible to recognize people by their gait, and/or to detect abnormal gait.
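The papers define the actual gait characteristics; as a purely illustrative example of what can be read off normalized trajectories once half-cycle boundaries are known, one might compute per-half-cycle durations and peak feet distances (names and features are hypothetical):

```python
def half_cycle_features(boundaries, feet_dist, fps):
    """Illustrative per-half-cycle gait characteristics.

    boundaries: frame indices where the feet distance peaks.
    feet_dist: per-frame feet distance from the rectified trajectories.
    fps: frame rate of the video.
    Returns (duration_seconds, peak_distance) for each half-cycle,
    taking the peak distance at the half-cycle's ending frame.
    """
    feats = []
    for a, b in zip(boundaries, boundaries[1:]):
        feats.append(((b - a) / fps, feet_dist[b]))
    return feats
```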

1.4 - Related Publications

  • F. Jean, A. Branzan Albu, and R. Bergevin: Towards View-Invariant Gait Modeling: Computing View-Normalized Body Part Trajectories, PR 2009
  • F. Jean, R. Bergevin, and A. Branzan Albu: Computing and Evaluating View-normalized Body Part Trajectories, IVC 2009
  • F. Jean, R. Bergevin, and A. Branzan Albu: Trajectories Normalization for Viewpoint Invariant Gait Recognition, ICPR 2008
  • F. Jean, R. Bergevin, and A. Branzan Albu: Computing View-normalized Body Parts Trajectories, CRV 2007
  • F. Jean, R. Bergevin, and A. Branzan Albu: Body Tracking in Human Walk from Monocular Video Sequences, CRV 2005

2 - Virtual Keyboard Project

2.1 - Collaborators

  • Dr Alexandra Branzan Albu (Ph.D. Co-advisor), Dept. of Electrical and Computer Engineering, University of Victoria, Victoria, British Columbia, Canada.

2.2 - Description

The "virtual keyboard" tries to mimic the bank of foot pedals one may find on an organ. It is intended to be used with new musical instruments, such as the Radio Drums. Some of these instruments require a foot control interface, since both hands are already occupied when playing the instrument.

The virtual keyboard itself is only a piece of wood on which colored keys have been painted. It is observed by a cheap webcam, and the images are analyzed with computer vision algorithms in order to track the performer's feet and detect key hits. A white marker is put on the tips of the performer's feet in order to get a reliable position for each foot tip. Since the positions and the layout of the keyboard are known to the algorithms, it is possible to detect which key has been hit and for how long a foot remained on that key. The following video sequence shows the use of the virtual keyboard. When a key is coloured green, it means that there is a foot over it, whereas the red colour means the key has been hit.
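Since the key layout is known in advance, detecting which key a foot is over reduces to a point-in-rectangle lookup on the tracked foot-tip position. A minimal sketch under the assumption that keys are axis-aligned rectangles in image coordinates (key names and the rectangle representation are illustrative):

```python
def key_under_foot(keys, foot):
    """Return the name of the key (if any) under a foot-tip position.

    keys: dict name -> (x_min, y_min, x_max, y_max) rectangle in image
    coordinates; the keyboard layout is assumed known and calibrated.
    foot: (x, y) tracked position of the white marker on the foot tip.
    """
    x, y = foot
    for name, (x0, y0, x1, y1) in keys.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

Hit duration can then be measured by counting how many consecutive frames the same key is returned for a given foot.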

2.3 - Related Publications

  • F. Jean, and A. Branzan Albu: The Visual Keyboard: Real-time Feet Tracking for the Control of Musical Meta-instruments, SPIC 2008
  • F. Jean, A. Branzan Albu, W. Schloss and P. Driessen: Computer Vision-based Interface for the Control of Musical Meta-instruments, HCI 2007
