Major advances in computer vision, signal processing, and machine learning both enable and are motivated by new questions about human behavior. This talk focuses on questions about psychopathology and emotion communication. In particular, what can objective measurement of multimodal behavior reveal about major depressive disorder and about emotion communication more broadly? To answer these and related questions, my interdisciplinary team of computer and behavioral scientists has developed advanced methods for automatically measuring facial expression, body motion, and gaze. I will review these advances and their applications in depression, infants' emotion expression, and a prototype system for assisting people with severe visual impairment.
Bio: Jeffrey Cohn is a professor of psychology and psychiatry at the University of Pittsburgh and an adjunct professor at the Robotics Institute, Carnegie Mellon University. He has led interdisciplinary and inter-institutional efforts to develop advanced methods for the automatic analysis and synthesis of facial expression, and has applied them to research in human emotion, interpersonal processes, social development, and psychopathology. He is an associate editor of IEEE Transactions on Affective Computing and has chaired major conferences in facial expression recognition, multimodal interfaces, and affective computing.