Big Data in Behavioral Medicine

Tuesday, November 22, 2016, 9:30am to 10:30am
Discovery Building Room 3280

Speaker Name: 

Jim Rehg

Speaker Institution: 

Professor, College of Computing, Georgia Institute of Technology

The explosion of health-related data in the form of electronic health records and genomics has captured the attention of both the machine learning and medical informatics communities. In this talk, I will describe an emerging opportunity to bring data analytics to bear on the behavioral dimension of health as well. As we move toward a preventive and anticipatory approach to health and medicine, understanding the genesis of adverse health-related behaviors (such as smoking and unhealthy eating), and developing more effective interventions for behavior change, become critical challenges. Advances in mobile sensing, from classical activity monitoring to the recent advent of wearable cameras, provide new opportunities to continuously measure behaviors under naturalistic conditions and to construct novel predictive models for adverse behavioral outcomes. Behavioral data is inherently multimodal and time-varying, with complex dynamics over multiple temporal scales, and it poses several interesting machine learning challenges.

This talk will provide an overview of this emerging research area and highlight several of our ongoing projects. Through the NIH-funded MD2K Center, we are collaborating with a multi-university team to construct an open-source platform for sensing health-related behaviors in the field and triggering mobile interventions. I will provide an overview of that effort and of the health challenges in smoking cessation and heart failure that we are addressing. I will describe recent work on efficient parameter learning for continuous-time hidden Markov models (CT-HMMs), which can support trajectory modeling and prediction for sequences of event data from clinical populations. I will also present work on developing novel sensors for social behaviors in the context of treating children with autism. In particular, I will describe a recent method for automatically measuring eye contact during face-to-face social interactions using wearable cameras, and its clinical applications. This is joint work with Drs. Agata Rozga, Yu-Ying Liu, and Fuxin Li, and PhD students Alexander Moreno, Yin Li, and Eunji Chong.
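For readers unfamiliar with the CT-HMM mentioned above: unlike a discrete-time HMM, a CT-HMM allows observations at irregular times, with state transitions governed by a rate matrix Q, so the transition probabilities over a gap of length dt are given by the matrix exponential expm(Q*dt). The following is a minimal illustrative sketch only (not the speaker's implementation or the efficient learning method from the talk); the toy 2-state rate matrix, Gaussian emission parameters, and uniform initial distribution are all assumptions for demonstration.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 2-state CT-HMM (illustrative values only):
# rate matrix Q (each row sums to 0), Gaussian emissions per state.
Q = np.array([[-0.5, 0.5],
              [0.2, -0.2]])
means, std = np.array([0.0, 3.0]), 1.0
pi0 = np.array([0.5, 0.5])  # assumed uniform initial state distribution

def loglik(times, obs):
    """Log-likelihood of irregularly sampled observations under the CT-HMM,
    via the forward algorithm with expm(Q*dt) transition matrices."""
    def emis(x):
        # Gaussian emission densities for each state
        return np.exp(-0.5 * ((x - means) / std) ** 2) / (std * np.sqrt(2 * np.pi))

    alpha = pi0 * emis(obs[0])
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()  # normalize to avoid underflow
    for k in range(1, len(times)):
        P = expm(Q * (times[k] - times[k - 1]))  # transition probs over the gap
        alpha = (alpha @ P) * emis(obs[k])
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll
```

A usage example: `loglik(np.array([0.0, 1.0, 3.5]), np.array([0.1, 2.9, 0.2]))` scores a three-observation trajectory with uneven gaps, which is exactly the setting (sequences of clinical event data at irregular visit times) that motivates the continuous-time formulation.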