Project-ASL Recognition

The first video demonstrates detection of a wh-question non-manual marker. The tracked face and head are shown on the left, while the right image shows the extracted spatial pyramid features. Red bars indicate detection of the wh non-manual marker, while blue bars indicate that the system detects no wh non-manual marker. In the second video we demonstrate that we can track eyebrow height (top right) and head pitch angle (bottom right) in an isolated utterance of a wh-question. The red graph line...
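
As a rough illustration of the spatial pyramid features mentioned above, the sketch below pools quantized local-feature codes over a coarse-to-fine grid of the face region. The input arrays (codes, positions) and the parameters (levels, vocab_size) are hypothetical and are not taken from the project's actual pipeline.

```python
import numpy as np

def spatial_pyramid_histogram(codes, positions, img_size, levels=2, vocab_size=200):
    """Pool quantized local-feature codes into a spatial pyramid descriptor.

    codes     : (N,) visual-word index per local feature (hypothetical input)
    positions : (N, 2) (x, y) location of each feature in the face region
    img_size  : (width, height) of the face region
    """
    w, h = img_size
    pyramid = []
    for level in range(levels + 1):
        cells = 2 ** level                      # 1x1, 2x2, 4x4, ... grids
        for cy in range(cells):
            for cx in range(cells):
                # Select the features that fall inside this grid cell.
                in_cell = (
                    (positions[:, 0] >= cx * w / cells) & (positions[:, 0] < (cx + 1) * w / cells) &
                    (positions[:, 1] >= cy * h / cells) & (positions[:, 1] < (cy + 1) * h / cells)
                )
                hist, _ = np.histogram(codes[in_cell], bins=np.arange(vocab_size + 1))
                pyramid.append(hist)
    feat = np.concatenate(pyramid).astype(float)
    return feat / (feat.sum() + 1e-8)           # L1-normalize the final descriptor
```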

Project-Children-Autism

In this project we assess the natural movements of children with Autism Spectrum Disorders in classroom settings. We use touch screens, a motion capture system (a Polhemus Liberty sampling movement at 240 Hz), multiple cameras, and a variety of fun toys to engage the children. We have developed visual stimuli to encourage the children to perform the designed tasks, and we are currently analyzing the data captured by the Polhemus system.
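
A minimal sketch of the kind of kinematic analysis one might run on the motion capture recordings: the 240 Hz sampling rate comes from the description above, while the trajectory data and the speed metric are purely illustrative and not the project's actual analysis.

```python
import numpy as np

SAMPLE_RATE_HZ = 240.0  # Polhemus Liberty sampling rate stated in the project description

def movement_speed(positions):
    """Per-sample speed (units/s) from a (T, 3) array of x, y, z sensor positions."""
    deltas = np.diff(positions, axis=0)                  # displacement between consecutive samples
    return np.linalg.norm(deltas, axis=1) * SAMPLE_RATE_HZ

# Synthetic trajectory standing in for one captured trial.
t = np.arange(0, 5, 1.0 / SAMPLE_RATE_HZ)
trial = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])  # hypothetical sensor path
speed = movement_speed(trial)
print(f"mean speed: {speed.mean():.3f}, peak speed: {speed.max():.3f}")
```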

Project-Expression Recognition

This project covers facial expression recognition and intensity estimation, as well as age estimation and human action recognition.

Project-Expression-Flow

We address the problem of correcting an undesirable expression in a face photo by transferring local facial components, such as a smiling mouth, from another photo of the same person that shows the desired expression. Direct copying and blending with existing compositing tools produces semantically unnatural composites, since expression is a global effect and a local component from one expression is often incompatible with the shape and other components of the face in another...
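
To make concrete what "direct copying and blending" means here, the sketch below shows a naive alpha-blend of a local component between two aligned face photos. This is the baseline the project argues against, not the Expression Flow method itself; the function and parameter names are hypothetical.

```python
import numpy as np

def naive_blend(target, source, mask, alpha=0.8):
    """Alpha-blend a local component (e.g., a mouth region) from `source` onto `target`.

    target, source : (H, W, 3) aligned face images as float arrays
    mask           : (H, W) soft mask of the component to transfer, in [0, 1]
    This kind of direct copy-and-blend tends to look unnatural, because the pasted
    component ignores the global shape and expression of the target face.
    """
    m = (alpha * mask)[..., None]               # broadcast the mask over color channels
    return (1.0 - m) * target + m * source
```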

Project-Eye-Blink

Fatigue from chronic partial sleep deprivation, circadian misalignment (e.g., slam-shifts), and work overload (e.g., EVAs) is a risk factor for people driving or performing critical tasks. There is a need for techniques that objectively and unobtrusively identify the presence of fatigue online. Tracking slow eyelid closures is one of the most reliable ways to detect fatigue. We developed a system to track slow eyelid closures using a single camera. The eyes are modeled by a single...
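
One common way to quantify slow eyelid closures is the fraction of time the eyes are nearly closed over a window (a PERCLOS-style measure). The sketch below assumes a per-frame eyelid-openness signal from the tracker; the thresholds, window size, and metric are an illustration, not the project's actual measure.

```python
import numpy as np

def fraction_eyes_closed(openness, closed_thresh=0.2, fps=30, window_s=60):
    """Fraction of frames with eyelid openness below a threshold, per window.

    openness : (T,) per-frame eyelid openness in [0, 1] (hypothetical tracker output)
    Returns one value per consecutive window; sustained high values suggest
    slow eyelid closures associated with fatigue.
    """
    closed = (np.asarray(openness) < closed_thresh).astype(float)
    win = int(fps * window_s)
    return np.array([closed[i:i + win].mean() for i in range(0, len(closed) - win + 1, win)])
```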

Project-FaceTrack

Accurate face tracking and 3D head pose prediction (shown in the top left as a 3D vector of pitch, yaw, and tilt) while the face makes various facial expressions as well as out-of-plane rotations. The 79 tracked landmarks corresponding to the eyes, eyebrows, nose, mouth, and face contour are shown as red dots.
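
A small sketch of what the per-frame tracker output might look like, based only on the quantities named above (79 landmarks and a pitch/yaw/tilt pose vector); the FaceTrackResult class itself is hypothetical.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FaceTrackResult:
    """Per-frame output of the face tracker, as described in the project summary."""
    landmarks: np.ndarray   # (79, 2) image coordinates of eyes, eyebrows, nose, mouth, contour
    head_pose: np.ndarray   # (3,) pitch, yaw, tilt angles

    def __post_init__(self):
        assert self.landmarks.shape == (79, 2)
        assert self.head_pose.shape == (3,)
```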

Project-Group-Activity

This approach effectively models group activities based on social behavior analysis. Unlike previous work that uses independent local features, this project explores the relationships between a subject's current behavior state and its actions. Our method does not depend on human detection or segmentation, so it is robust to detection errors. Instead, tracked spatio-temporal interest points provide a good basis for modeling group interaction. An SVM is used to find abnormal...
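
The description mentions tracked spatio-temporal interest points and an SVM classifier. The sketch below shows one plausible version of that pipeline, with mean-pooled per-clip descriptors and scikit-learn's SVC; the synthetic data, feature dimension, and pooling choice are assumptions made for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def pool_interest_points(descriptors):
    """Pool a variable number of spatio-temporal interest-point descriptors
    from one clip into a single fixed-length vector (mean pooling here)."""
    return np.mean(descriptors, axis=0)

# Hypothetical training data: one pooled feature vector per clip,
# with labels 0 = normal group activity, 1 = abnormal.
rng = np.random.default_rng(0)
X_normal = np.vstack([pool_interest_points(rng.normal(0.0, 1.0, size=(50, 162))) for _ in range(20)])
X_abnormal = np.vstack([pool_interest_points(rng.normal(0.5, 1.0, size=(50, 162))) for _ in range(20)])
X = np.vstack([X_normal, X_abnormal])
y = np.array([0] * 20 + [1] * 20)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict(X[:5]))
```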

Project-Synchrony

We investigate how the degree of interactional synchrony can signal whether trust is present, absent, increasing, or declining. We propose an automated, data-driven, and unobtrusive framework for deception detection and analysis in interrogation interviews using visual cues only. Our framework consists of face tracking, gesture detection, expression recognition, and synchrony estimation. This framework automatically tracks gestures and expressions of both the subject and the...
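
The synchrony estimator is not spelled out here; one common approach is windowed correlation between the two participants' tracked feature time series, sketched below under that assumption. The function name, frame rate, and window sizes are illustrative rather than the project's actual settings.

```python
import numpy as np

def windowed_synchrony(series_a, series_b, fps=30, window_s=4.0, step_s=1.0):
    """Pearson correlation between two people's feature time series (e.g., head
    pitch or smile intensity) over sliding windows; values near 1 indicate that
    the two signals move together within that window."""
    a, b = np.asarray(series_a, float), np.asarray(series_b, float)
    win, step = int(fps * window_s), int(fps * step_s)
    scores = []
    for start in range(0, min(len(a), len(b)) - win + 1, step):
        wa, wb = a[start:start + win], b[start:start + win]
        if wa.std() < 1e-8 or wb.std() < 1e-8:
            scores.append(0.0)                  # flat windows carry no synchrony signal
        else:
            scores.append(np.corrcoef(wa, wb)[0, 1])
    return np.array(scores)
```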