Autonomous Drones (Perception + Sensor Fusion)

This project area focuses on autonomous navigation for drones, where reliable perception and decision-making depend on robust multi-sensor fusion and accurate localization under real-world constraints such as sensor dropouts, noisy measurements, and tight latency budgets.

Focus

  • Multi-sensor fusion for autonomous drones (RGB, LiDAR, multispectral, IMU, GNSS)
  • Localization and state estimation as part of a deployable autonomy stack (a minimal filter sketch follows this list)
  • Multi-agent motion forecasting using flow-based generative models (for interaction-aware perception; see the coupling-layer sketch below)
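
The localization and state-estimation layer typically centers on a Kalman-style predict/update loop. Below is a minimal sketch, assuming a linear constant-velocity model with IMU acceleration as the control input and GNSS position fixes as the correction; the class name, state layout, and noise values are illustrative assumptions, not this project's actual filter.

```python
# Minimal IMU + GNSS fusion sketch (illustrative, not the deployed filter).
import numpy as np

class ImuGnssFilter:
    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(4)                 # state: [px, py, vx, vy]
        self.P = np.eye(4)                   # state covariance
        self.F = np.eye(4)                   # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = 0.05 * np.eye(4)            # process noise (platform-tuned)
        self.H = np.eye(2, 4)                # GNSS observes position only
        self.R = 2.0 * np.eye(2)             # GNSS noise, order of meters

    def predict(self, accel_xy):
        """Propagate the state using IMU acceleration as a control input."""
        B = np.array([[0.5 * self.dt ** 2, 0.0],
                      [0.0, 0.5 * self.dt ** 2],
                      [self.dt, 0.0],
                      [0.0, self.dt]])
        self.x = self.F @ self.x + B @ np.asarray(accel_xy, dtype=float)
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_gnss(self, pos_xy):
        """Correct the state with a GNSS position fix."""
        y = np.asarray(pos_xy, dtype=float) - self.H @ self.x  # innovation
        S = self.H @ self.P @ self.H.T + self.R                # innovation cov.
        K = self.P @ self.H.T @ np.linalg.inv(S)               # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

A production drone stack would more likely run an error-state EKF over attitude, velocity, and IMU biases, but the predict/update structure is the same.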

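For the flow-based forecasting direction, the basic ingredient is an invertible transform with a tractable log-determinant, conditioned on agent or scene context. The sketch below shows a single RealNVP-style affine coupling layer in PyTorch; the module names, hidden sizes, and conditioning scheme are assumptions for illustration, not this project's model.

```python
# One conditional affine coupling layer over a flattened future trajectory
# (illustrative sketch; dimensions and depth are assumptions).
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, ctx_dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # Predict scale and shift for the second half of the trajectory
        # from the first half plus the context vector.
        self.net = nn.Sequential(
            nn.Linear(self.half + ctx_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, ctx):
        """Map trajectory x to latent z; returns z and log|det J|."""
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(torch.cat([x1, ctx], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)                  # keep scales numerically stable
        z2 = x2 * torch.exp(s) + t
        return torch.cat([x1, z2], dim=-1), s.sum(dim=-1)

    def inverse(self, z, ctx):
        """Sampling direction: map latent z back to a trajectory."""
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(torch.cat([z1, ctx], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)
        x2 = (z2 - t) * torch.exp(-s)
        return torch.cat([z1, x2], dim=-1)
```

A full forecaster would stack several such blocks with permutations between them, train by maximizing the exact log-likelihood (prior log-density plus the summed log-determinants), and generate diverse multi-agent futures by sampling the latent and inverting.
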
What I’m working on

  • Designing modular fusion pipelines so perception, tracking, and forecasting can evolve independently.
  • Fusing complementary signals across sensors to improve robustness to missing or noisy inputs (a minimal fusion sketch follows this list).
  • Improving forecasting quality in interactive scenes by modeling multi-agent dynamics.
  • Building deployment-minded systems: latency-aware, stable, and testable on real platforms.
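
On the robustness point above, one simple and testable baseline is precision-weighted fusion over whichever sensors actually reported in a given cycle, so a dropout degrades the estimate instead of corrupting it. A minimal sketch, with hypothetical sensor names and noise variances:

```python
# Inverse-variance fusion of per-sensor estimates; a sensor that drops out
# (mean is None) is simply excluded (illustrative, not the deployed pipeline).
import numpy as np

def fuse_estimates(estimates):
    """estimates: dict of name -> (mean vector or None, scalar variance).

    Returns the precision-weighted mean over sensors that reported,
    or None if every input is missing this cycle.
    """
    means, weights = [], []
    for name, (mean, var) in estimates.items():
        if mean is None:                 # sensor dropped out this cycle
            continue
        means.append(np.asarray(mean, dtype=float))
        weights.append(1.0 / var)        # noisier sensors contribute less
    if not means:
        return None
    w = np.array(weights)
    return (w[:, None] * np.stack(means)).sum(axis=0) / w.sum()

# Example cycle: LiDAR odometry dropped out; fusion degrades gracefully.
fused = fuse_estimates({
    "gnss":   (np.array([10.2, 4.9]), 4.0),   # coarse but absolute
    "lidar":  (None, 0.2),                    # missing this frame
    "visual": (np.array([10.0, 5.1]), 0.5),
})
```

Learned fusion layers can play the same role with per-sensor validity masks, but a closed-form baseline like this is easy to reason about and to test on real hardware.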

Project date

2026