DABA: Decentralized and Accelerated Large-Scale Bundle Adjustment

Taosha Fan1, Joseph Ortiz2, Ming Hsiao3, Maurizio Monge3, Jing Dong3, Todd Murphey4, Mustafa Mukadam1
1Meta AI, 2Imperial College London, 3Reality Labs Research, 4Northwestern University
RSS 2023

Abstract

Scaling to arbitrarily large bundle adjustment problems requires distributing data and computation across multiple devices. Centralized methods in prior work can only solve small- or medium-sized problems due to computation and communication overhead. In this paper, we present a fully decentralized method that alleviates computation and communication bottlenecks to solve arbitrarily large bundle adjustment problems. We achieve this by reformulating the reprojection error and deriving a novel surrogate function that decouples optimization variables from different devices. This function makes it possible to use majorization-minimization techniques and reduces bundle adjustment to independent optimization subproblems that can be solved in parallel. We further apply Nesterov's acceleration and adaptive restart to improve convergence while maintaining theoretical guarantees. Despite relying only on limited peer-to-peer communication, our method provably converges to first-order critical points under mild conditions. On extensive benchmarks with public datasets, our method converges much faster than decentralized baselines with similar memory usage and communication load. Compared to centralized baselines using a single device, our method, while being decentralized, yields more accurate solutions with significant speedups of up to 953.7x over Ceres and 174.6x over DeepLM.
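To illustrate the optimization machinery the abstract refers to, here is a minimal, hypothetical sketch (not the paper's implementation, and on a toy least-squares objective rather than a reprojection error) of majorization-minimization with Nesterov's acceleration and adaptive restart: each step minimizes a quadratic surrogate that upper-bounds the objective, momentum extrapolates between iterates, and the momentum is reset whenever the objective increases.

```python
import numpy as np

# Toy objective f(x) = 0.5 * ||A x - b||^2; A, b, and all names below are
# illustrative assumptions, not symbols from the paper.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad f (largest singular value squared)
x = np.zeros(5)                 # current iterate
y = x.copy()                    # accelerated (look-ahead) point
t = 1.0                         # Nesterov momentum parameter

for _ in range(200):
    # Majorization step: the surrogate g(x; y) = f(y) + <grad f(y), x - y>
    # + (L/2)||x - y||^2 upper-bounds f, and its minimizer is a gradient
    # step of size 1/L taken from the look-ahead point y.
    grad = A.T @ (A @ y - b)
    x_next = y - grad / L
    if f(x_next) > f(x):
        # Adaptive restart: momentum overshot, so reset it and take a
        # plain majorization-minimization step from x instead.
        t = 1.0
        y = x.copy()
        grad = A.T @ (A @ x - b)
        x_next = y - grad / L
    # Nesterov extrapolation for the next look-ahead point.
    t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)
    x, t = x_next, t_next

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_star, atol=1e-6))
```

In the decentralized setting described above, the surrogate additionally decouples variables held by different devices, so each device minimizes its own subproblem in parallel; this sketch only shows the single-objective acceleration and restart logic.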

Video

BibTeX


      @inproceedings{Fan:etal:DABA2023,
        title={Decentralization and Acceleration Enables Large-Scale Bundle Adjustment},
        author={Fan, Taosha and Ortiz, Joseph and Hsiao, Ming and Monge, Maurizio and Dong, Jing and Murphey, Todd and Mukadam, Mustafa},
        booktitle={Robotics: Science and Systems},
        year={2023}
      }

Contact

If you have any questions, please feel free to reach out to Taosha Fan or Joe Ortiz.