Perceiving Systems Talk Biography
12 April 2022 at 12:00 - 13:00 | https://us02web.zoom.us/j/81112835346

Mixing Synthetic and Real-World Captures for RGB Hand Pose Estimation

ORGANIZERS
Perceiving Systems
  • Guest Scientist

How can we learn models for hand pose estimation without any (real-world) labels? This talk presents our recent efforts to tackle the challenging scenario of learning from labelled synthetic data and unlabelled real-world data. I will focus on two strategies that we find effective: (1) cross-modal consistency and alignment for representation learning, and (2) pseudo-label correction and refinement. The second part of the talk introduces Assembly101, our newly recorded procedural activity dataset for 3D hand pose and action understanding over time, featuring multiview and egocentric videos of people assembling and disassembling “take-apart” toy vehicles.
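As a rough illustration of strategy (2), one common ingredient of pseudo-label refinement is confidence-based selection: a model trained on labelled synthetic data predicts labels for unlabelled real-world images, and only high-confidence predictions are kept as pseudo-labels for further training. The sketch below is a minimal, hypothetical example of that selection step (the threshold, scoring, and data are illustrative; the talk's actual correction and refinement method may differ):

```python
import numpy as np

def select_pseudo_labels(predictions, confidences, threshold=0.8):
    """Keep pseudo-labels whose confidence exceeds the threshold.

    Low-confidence real-world samples are left unlabelled rather than
    risk training on noisy pseudo-labels.
    """
    mask = confidences >= threshold
    return predictions[mask], mask

# Toy example: predicted 2D keypoints for four unlabelled real images,
# with per-image confidence scores from a synthetically trained model.
preds = np.arange(8, dtype=float).reshape(4, 2)   # hypothetical keypoints
conf = np.array([0.95, 0.40, 0.85, 0.60])
kept, mask = select_pseudo_labels(preds, conf)
print(kept.shape[0])  # 2 samples survive the confidence filter
```

In practice the surviving pseudo-labels would then be mixed with the labelled synthetic data for another training round, and the cycle repeated as the model improves on real-world imagery.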

Speaker Biography

Angela Yao (National University of Singapore)

Assistant Professor

Angela Yao is a Dean's Chair Assistant Professor at the National University of Singapore's School of Computing. She leads the Computer Vision and Machine Learning group, with a special focus on vision-based human motion analysis. Before joining NUS, she was a W1-Professor in Visual Computing at the University of Bonn. She is the recipient of Singapore’s National Research Foundation's Fellowship in Artificial Intelligence (2019) and the German Pattern Recognition (DAGM) award (2018).