Perceiving Systems Ph.D. Thesis 2025

Aerial Robot Formations for Dynamic Environment Perception

Perceiving Systems
  • Guest Scientist

Perceiving moving subjects, such as humans and animals, outside an enclosed and controlled laboratory environment is inherently challenging, since subjects can move beyond the view and range of static, extrinsically calibrated cameras and sensors. Previous state-of-the-art methods for such perception in outdoor scenarios rely on markers or sensors worn by the subject, which are intrusive and do not scale to animal subjects. To address this problem, we introduce robotic flying cameras that autonomously follow the subjects. For functions such as monitoring, behaviour analysis or motion capture, a single point of view is often insufficient due to self-occlusion, limited depth perception, and the need for coverage from all sides. Therefore, we propose a team of such robotic cameras that fly in formation to provide continuous coverage from multiple viewpoints. The position of the subject must be determined in real time using markerless, remote sensing methods. To solve this, we combine a convolutional neural network based detector with a novel cooperative Bayesian fusion method that tracks the detected subject across multiple robots. The robots then need to plan and control their own flight paths and orientations relative to the subject to achieve and maintain continuous coverage from multiple viewpoints. We address this with a model-predictive-control-based method that predicts and plans the motion of every robot in the formation around the subject. A preliminary demonstrator is implemented with multi-rotor drones. However, drones are noisy and potentially unsafe for the observed subjects. To address this, we introduce non-holonomic lighter-than-air autonomous airships (blimps) as the robotic camera platform. This type of robot requires dynamically constrained orbiting formations to achieve omnidirectional visual coverage of a moving subject in the presence of wind. Therefore, we introduce a novel model-predictive formation controller for a team of airships.
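The cooperative Bayesian fusion step can be illustrated with a minimal sketch. Assuming each robot contributes an independent Gaussian estimate of the subject's 3D position, the fused estimate is the information-weighted combination (a product of Gaussians). The function name and the independence assumption are illustrative only, not the thesis's actual formulation:

```python
import numpy as np

def fuse_gaussian_detections(means, covariances):
    """Fuse independent Gaussian position estimates from several robots.

    Product of Gaussians: the fused information matrix is the sum of the
    individual information (inverse-covariance) matrices, and the fused
    mean is the information-weighted average of the individual means.
    """
    info = np.zeros((3, 3))
    info_mean = np.zeros(3)
    for mu, cov in zip(means, covariances):
        inv = np.linalg.inv(cov)   # information matrix of one detection
        info += inv
        info_mean += inv @ mu
    fused_cov = np.linalg.inv(info)
    fused_mean = fused_cov @ info_mean
    return fused_mean, fused_cov
```

With two detections of differing quality, the fused mean lands closer to the more confident one and the fused covariance is tighter than either input, which is the property a multi-robot tracker exploits.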
We demonstrate and evaluate our complete system in field experiments involving both humans and wild animals as subjects. The collected data enables both outdoor human motion capture and animal behaviour analysis. Additionally, we propose our method for autonomous long-term wildlife monitoring. This dissertation covers the design and evaluation of aerial robots suitable for this task, including computer vision/sensing, data annotation and network training, sensor fusion, planning, control, simulation, and modelling.
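The model-predictive planning idea behind the formation controllers can likewise be sketched in miniature. The toy controller below uses random-shooting MPC for one robot with single-integrator dynamics: it samples candidate velocity sequences, rolls them out against a constant-velocity prediction of the subject, and returns the first action of the cheapest sequence, penalising deviation from a desired standoff radius. All names, the dynamics, and the sampling scheme are illustrative simplifications; the controllers in the thesis handle full formations, airship dynamics, and wind.

```python
import numpy as np

def mpc_orbit_step(robot_pos, subject_pos, subject_vel, radius,
                   horizon=10, dt=0.2, v_max=2.0, n_samples=256, seed=0):
    """One receding-horizon step of a random-shooting MPC.

    Samples candidate velocity sequences, rolls out robot and subject
    over the horizon, and returns the first velocity of the sequence
    with the lowest cumulative standoff-radius error.
    """
    rng = np.random.default_rng(seed)
    candidates = rng.uniform(-v_max, v_max, size=(n_samples, horizon, 2))
    best_cost, best_v = np.inf, np.zeros(2)
    for seq in candidates:
        pos = np.array(robot_pos, dtype=float)
        subj = np.array(subject_pos, dtype=float)
        cost = 0.0
        for v in seq:
            pos = pos + v * dt                 # robot rollout
            subj = subj + subject_vel * dt     # constant-velocity subject
            cost += (np.linalg.norm(pos - subj) - radius) ** 2
        if cost < best_cost:
            best_cost, best_v = cost, seq[0]
    return best_v
```

In a receding-horizon loop, only the returned velocity is applied before re-planning at the next step; a formation version would add per-robot phase offsets around the orbit and inter-robot collision costs.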

Author(s): Price, Eric
Year: 2025
Month: December
Day: 11
BibTeX Type: Ph.D. Thesis (phdthesis)
Address: Tübingen, Germany
Degree Type: PhD
DOI: http://dx.doi.org/10.15496/publikation-114304
School: University of Tübingen
State: Published

BibTeX

@phdthesis{AerialRobotFormations,
  title = {Aerial Robot Formations for Dynamic Environment Perception},
  abstract = {Perceiving moving subjects, such as humans and animals, outside an enclosed and controlled
  laboratory environment is inherently challenging, since subjects can move beyond the view
  and range of static, extrinsically calibrated cameras and sensors. Previous state-of-the-art
  methods for such perception in outdoor scenarios rely on markers or sensors worn by the
  subject, which are intrusive and do not scale to animal subjects. To address this problem,
  we introduce robotic flying cameras that autonomously follow the subjects. For functions
  such as monitoring, behaviour analysis or motion capture, a single point of view is often
  insufficient due to self-occlusion, limited depth perception, and the need for coverage from
  all sides. Therefore, we propose a team of such robotic cameras that fly in formation to
  provide continuous coverage from multiple viewpoints. The position of the subject must be
  determined in real time using markerless, remote sensing methods. To solve this, we combine
  a convolutional neural network based detector with a novel cooperative Bayesian fusion
  method that tracks the detected subject across multiple robots. The robots then need to
  plan and control their own flight paths and orientations relative to the subject to achieve
  and maintain continuous coverage from multiple viewpoints. We address this with a
  model-predictive-control-based method that predicts and plans the motion of every robot
  in the formation around the subject. A preliminary demonstrator is implemented with
  multi-rotor drones. However, drones are noisy and potentially unsafe for the observed
  subjects. To address this, we introduce non-holonomic lighter-than-air autonomous airships
  (blimps) as the robotic camera platform. This type of robot requires dynamically
  constrained orbiting formations to achieve omnidirectional visual coverage of a moving
  subject in the presence of wind. Therefore, we introduce a novel model-predictive formation
  controller for a team of airships. We demonstrate and evaluate our complete system in field
  experiments involving both humans and wild animals as subjects. The collected data enables
  both outdoor human motion capture and animal behaviour analysis. Additionally, we propose
  our method for autonomous long-term wildlife monitoring. This dissertation covers the
  design and evaluation of aerial robots suitable for this task, including computer
  vision/sensing, data annotation and network training, sensor fusion, planning, control,
  simulation, and modelling.},
  degree_type = {PhD},
  school = {University of Tübingen},
  address = {Tübingen, Germany},
  month = dec,
  year = {2025},
  author = {Price, Eric},
  doi = {http://dx.doi.org/10.15496/publikation-114304},
  month_numeric = {12}
}