Affective Computing for Clinical Science and Practice (3 ECTS, including an additional exam)
Please use this link to register to the course: Registration form
Jeffrey Cohn (www.jeffcohn.net) is Professor of Psychology and Psychiatry at the University of Pittsburgh and Adjunct Professor of Computer Science at the Robotics Institute at Carnegie Mellon University. He leads interdisciplinary and inter-institutional efforts to develop advanced methods of automatic analysis and synthesis of facial expression and prosody and applies those tools to research in human emotion, social development, nonverbal communication, psychopathology, and biomedicine.
Lecture 1: (TS101, 9-11 am, Aug. 14)
What happens during psychotherapy that enables clients to improve? What affective mechanisms underlie depression? How does patient affect respond to novel biologically-based treatments? These are fundamental questions for clinical practice and affective computing.
Recent breakthroughs in affective computing make possible objective, valid, efficient, and clinically useful behavioral indicators of psychotherapy process and response to treatment. I will present: 1) human-observer and computational approaches to measurement that guide affective computing for clinical practice and research; 2) recent applications in depression and obsessive-compulsive disorder; 3) “expression transfer” that could enable sharing of clinical video without compromising patient anonymity; and 4) current challenges for affective computing in clinical practice and research.
Lecture 2: Applications and Challenges of Affective Computing Technology (TS101, 9-11 am, Aug. 15)
In this lecture I will go into more detail on two topics from the first talk. One is an application of automated analysis in biomedicine and infant development. The other is challenges for our technology; specifically, the generalizability of action unit detection to the non-frontal head poses common in real-world applications, and the generalizability of classifiers to new domains.
The Dynamics of Facial Expressiveness in Craniofacial Microsomia
Craniofacial microsomia (CFM) is a congenital condition associated with malformations of the bone and soft tissue of the face and facial nerves, all of which have the potential to impair facial expressiveness. Using automated face analysis, we investigated whether CFM-related variation in expressiveness is evident as early as infancy. Infants aged 13 months, with and without CFM, were observed in emotion induction tasks designed to elicit positive and negative affect. Expressiveness was quantified in two ways. One was based on facial action units (from the Facial Action Coding System, FACS); the other was a holistic approach involving the dynamics of face and head motion. Automatic AU detection showed high concordance with manual FACS coding and varied predictably with emotion induction. Few differences, however, emerged in AU-based expressiveness between CFM and control infants. By contrast, consistent differences emerged between cases and controls for two of three phenotypes and between infants of different ethnicities. Much is to be gained by considering the temporal envelope of facial expression and body motion.
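The two measurement approaches above can be illustrated with a minimal sketch. Everything here is an assumption for illustration (the synthetic AU intensity matrix, the activity threshold, and the motion statistic are not from the study); it shows only the general shape of an AU-based versus a dynamics-based expressiveness summary.

```python
import numpy as np

# Hypothetical AU intensity matrix: rows = video frames, columns = action units.
# Values, dimensions, and the 0-5 intensity scale are illustrative assumptions.
rng = np.random.default_rng(0)
au_intensity = rng.uniform(0, 5, size=(300, 12))  # 300 frames, 12 AUs

# AU-based expressiveness: proportion of frame/AU cells in which an AU is
# "active", i.e. its intensity exceeds a nominal threshold (assumed here).
active = au_intensity > 1.0
au_expressiveness = active.mean()

# Holistic, dynamics-based expressiveness: variability of frame-to-frame
# change magnitude, standing in for the temporal envelope of face/head motion.
motion = np.linalg.norm(np.diff(au_intensity, axis=0), axis=1)
holistic_expressiveness = motion.std()

print(au_expressiveness, holistic_expressiveness)
```

The contrast in the abstract is between summaries of which AUs fire (first statistic) and summaries of how movement unfolds over time (second statistic).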
Robustness to Pose and Data Source
Robustness to pose and to new data sources (i.e., domains) is critical for facial action unit detection. Most approaches to automatic AU detection treat the face as a 2D object, flat like a sheet of paper, and assume that algorithms trained on one set of data will generalize to new ones. Both assumptions are problematic. In real-world conditions, moderate to large head rotation is common, and system performance degrades as head pose departs from frontal. Regarding robustness to new data sources, little is known despite much research. To increase robustness to pose, we propose the FACSCaps architecture, which handles multi-view and multi-label facial action unit (AU) detection within a single model, generalizes to novel views, and enables insights into AU detection. To increase generalizability to new domains, we investigate domain adaptation.
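As a toy illustration of what feature-space domain adaptation means in this setting, the sketch below aligns the first-order statistics (per-feature mean and scale) of a synthetic target domain to a synthetic source domain. This is a deliberately simple stand-in, not the method from the lecture; all data and dimensions are assumptions.

```python
import numpy as np

# Synthetic source- and target-domain feature matrices with a deliberate
# distribution shift (different mean and spread in the target domain).
rng = np.random.default_rng(1)
source = rng.normal(0.0, 1.0, size=(500, 16))  # features from the training domain
target = rng.normal(2.0, 3.0, size=(500, 16))  # shifted features from a new domain

# Standardize target features per column, then re-express them using the
# source domain's statistics, so a source-trained model sees familiar inputs.
aligned = (target - target.mean(axis=0)) / target.std(axis=0)
aligned = aligned * source.std(axis=0) + source.mean(axis=0)

# After alignment, per-feature means of the two domains coincide.
print(np.abs(aligned.mean(axis=0) - source.mean(axis=0)).max())
```

More capable adaptation methods match richer structure than means and scales, but the goal is the same: reduce the mismatch between the domain a detector was trained on and the domain it is applied to.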
Last updated: 7.8.2018