Abstract — Robots are an increasingly important part of our world, from working in factories and hospitals to driving on city streets. It is particularly important that robots can be taught new skills on the fly by people who are not robotics experts. This requires new user interfaces, learning from demonstration, and new algorithms for execution. Our Behavior Tree-based CoSTAR system allows end users to quickly teach new skills to collaborative robots through a combination of explicit demonstration and abstract queries. This design was informed by a 35-person user study that explored which aspects of the system were most useful when constructing new task plans. However, even these plans are limited in how they can adapt to new environments on their own, and they often produce surprising behavior. New algorithms that combine perceptual abstractions with learning and motion planning allow for better adaptation to new environments and to new tasks. In particular, we find that combining learned skill models with tree search enables better adaptation and offers new ways of incorporating and visualizing prior knowledge in challenging planning problems. Together, these components allow end users to specify complex behaviors that generalize to new environments.
About the Speaker — Chris Paxton is a final-year graduate student at Johns Hopkins University, studying robotics and human-robot interaction. He is interested in ways to allow non-expert users to give robots the knowledge they need to plan and adapt to new environments, and in making robots into true co-workers that can handle the parts of a task that are difficult for humans. He did his undergraduate work at the University of Maryland, College Park, where he received a BS in Computer Science with a minor in Neuroscience. He was a member of the Johns Hopkins team that won the 2015 KUKA Award with the CoSTAR system.
Host — Bilge Mutlu (firstname.lastname@example.org)