Darius Burschka presents work on collaborative visual-SLAM approaches using mini-UAVs. The key aspects discussed are:
1) Using omnidirectional cameras rather than fish-eye lenses, so that the viewing angle of each image point can be recovered easily for localization and reconstruction (see the bearing-vector sketch after this list).
2) Proposing measures to boost the perception (depth) resolution of a camera swarm, such as increasing the baseline between cameras, the focal length, or the image resolution (illustrated after this list).
3) Describing a collaborative reconstruction approach in which two independently moving cameras estimate their relative extrinsic parameters from the image data itself and reconstruct 3D points via motion stereo, without any prior extrinsic calibration (see the two-view sketch below).
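
To picture the first point: with a calibrated central camera, every pixel maps back to a known unit viewing direction (bearing vector), which is what makes localization and reconstruction from viewing angles straightforward. The sketch below is only an illustration with pinhole intrinsics standing in for whatever omnidirectional model is actually calibrated; the function and parameter names are not from the presentation.

```python
import numpy as np

def pixel_to_bearing(u, v, K):
    """Map a pixel (u, v) to a unit viewing direction in the camera frame.

    K is the 3x3 intrinsic matrix of a calibrated central camera; a true
    omnidirectional sensor would use the back-projection of its own model,
    but the idea (pixel -> known viewing angle) is the same.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])
print(pixel_to_bearing(100, 200, K))  # unit vector giving the viewing angle of that pixel
```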
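The trade-offs in the second point follow from the standard stereo depth-error model, where depth uncertainty grows quadratically with range and shrinks with baseline, focal length, and disparity (pixel) resolution. The helper below is a generic illustration of that relationship under these assumptions, not code from the presentation.

```python
def depth_uncertainty(z, baseline, focal_px, disparity_error_px=0.5):
    """Approximate stereo depth error: dZ ~ Z^2 / (b * f) * d_d.

    Increasing the baseline b, the focal length f (in pixels), or the image
    resolution (which lowers the disparity error in pixels) all reduce dZ,
    i.e. boost the perception resolution of the camera pair.
    """
    return (z ** 2) / (baseline * focal_px) * disparity_error_px

# Example: doubling the baseline halves the expected depth error at 10 m range.
print(depth_uncertainty(10.0, baseline=0.5, focal_px=400.0))
print(depth_uncertainty(10.0, baseline=1.0, focal_px=400.0))
```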
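The third point can be read as a two-view problem solved online: given matched image points from the two independently moving cameras and their intrinsics, the relative pose (the extrinsics) is estimated from the correspondences themselves and the matches are then triangulated. The OpenCV-based sketch below is a generic stand-in for such a pipeline, not the presented method; the shared intrinsic matrix and the matched points are assumed inputs.

```python
import cv2
import numpy as np

def collaborative_reconstruction(pts1, pts2, K):
    """Estimate the relative pose between two cameras from point matches and
    triangulate the matched points, without prior extrinsic calibration.

    pts1, pts2: Nx2 float arrays of matched pixel coordinates (one row per match).
    K: 3x3 intrinsic matrix, assumed known and shared by both cameras.
    """
    # Relative pose (extrinsics) estimated from the correspondences themselves.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

    # Triangulate in the first camera's frame (translation known only up to scale).
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return R, t, (X_h[:3] / X_h[3]).T  # Nx3 reconstructed points
```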