Recommendations for multi-camera fusion for 360 deg image?
Looking for recommendations for ROS nodes or C++ libraries that can take multiple independent camera image streams and combine them into a single 360 deg image stream. I’m looking for something that handles stitching the images together rather than just overlaying them based on the URDF file. Any recommendations?
Is the calibration between the cameras known and static? Or do you need to also do some feature matching and pose optimization before the stitching?
You could also give panotools a try, though I've only ever used it for stitching panoramas from my outdoor trips... That said, it's open source, high quality, and it should have all the features you might need.
The camera positions are static relative to each other, but the whole system is on a moving base.
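If the relative extrinsics really are fixed, one option worth trying (separate from panotools) is OpenCV's stitching module: estimate the warps once from a set of overlapping calibration frames, then reuse them to compose every subsequent frame set without re-running feature matching. A rough sketch, assuming OpenCV 4, four cameras, and placeholder image paths:

```cpp
// Minimal sketch: estimate stitching transforms once (rigid rig), then
// compose each new frame set using the cached transforms.
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/stitching.hpp>
#include <iostream>
#include <vector>

int main() {
    // One calibration-time frame per camera (assumed to have enough overlap).
    std::vector<cv::Mat> calib_frames = {
        cv::imread("cam0.png"), cv::imread("cam1.png"),
        cv::imread("cam2.png"), cv::imread("cam3.png")};

    // PANORAMA mode; the default warper projects onto a sphere, which suits
    // a full 360 deg view.
    auto stitcher = cv::Stitcher::create(cv::Stitcher::PANORAMA);

    // Estimate camera/warp parameters once, since the rig geometry is static.
    if (stitcher->estimateTransform(calib_frames) != cv::Stitcher::OK) {
        std::cerr << "transform estimation failed\n";
        return 1;
    }

    // At runtime, feed each synchronized frame set and only compose,
    // skipping feature detection and matching entirely.
    std::vector<cv::Mat> frames = calib_frames;  // stand-in for live images
    cv::Mat pano;
    if (stitcher->composePanorama(frames, pano) != cv::Stitcher::OK) {
        std::cerr << "composition failed\n";
        return 1;
    }
    cv::imwrite("pano.png", pano);
    return 0;
}
```

Wrapping something like this in a ROS node that subscribes to the synchronized image topics would be straightforward; the main caveat is that the exposure/blending step still costs time per frame, so you may want to downscale the inputs first.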