Special Talk: Scaling Robot Learning with Language, Logic, and YouTube
My long-term research goal is to enable real robots to perform many different tasks in a wide variety of application scenarios, such as homes, hospitals, warehouses, and factories. Many of these tasks require long-horizon reasoning and the sequencing of skills to achieve a goal state, which is typically tackled with a hierarchical approach involving task and motion planners. I will present our work on scaling up long-horizon reasoning to a diverse set of objects, manipulation skills, and tasks. The talk will take a system-level perspective and demonstrate how learning-based approaches can support generalization. Specifically, I will discuss our work on using large-scale, language-annotated video datasets as a low-cost data source for learning a library of composable manipulation skills. I will also show how the same dataset can be used to learn grounded predicates that enable closed-loop, symbolic task planning. Lastly, I will present our work on using large language models for task planning and on verifying the feasibility of the resulting plans with learned skills. I will conclude by highlighting the remaining challenges in scaling robot learning.
Biography: Jeannette Bohg is an Assistant Professor of Computer Science at Stanford University. She received her PhD from KTH Royal Institute of Technology in Stockholm, where her thesis proposed novel methods for multi-modal scene understanding for robotic grasping. After her PhD, she was a postdoctoral researcher and group leader at the Autonomous Motion Department of the Max Planck Institute for Intelligent Systems. Jeannette’s research focuses on perception and learning for autonomous robotic manipulation and grasping. She is specifically interested in developing methods that are goal-directed, real-time, and multi-modal so that they can provide meaningful feedback for execution and learning. She has received several Early Career and Best Paper awards, most notably the 2019 IEEE Robotics and Automation Society Early Career Award and the 2020 Robotics: Science and Systems Early Career Award.