Stream virtual camera image following a tf frame?
Hey, I'm working on a SLAM implementation and I want to add functionality to visualize it in real-time in the browser, from devices other than the host machine, through a web server. Currently I am using RViz on the host machine and I can see the map being built in real-time just fine. Would it be possible to attach a sort of virtual camera to the scene that would hover a certain distance above the robot's frame, basically like an XYOrbit or a ThirdPersonFollower camera in RViz, but also publish whatever it captures to a topic that I could then send over the web server and display on the client side? My reasoning is that the generated point cloud is pretty large and is being published 10 times a second, so I have a pretty good feeling it would clog up rosbridge and the websocket and ruin performance. I'd also need the decay functionality that RViz provides when displaying point clouds. Any ideas or suggestions on how I could achieve this?
Yes, it's possible. Please check this link: https://roboticsknowledgebase.com/wik....
@osilva: you've been around here for a bit now. Please stop posting link-only answers/comments.
With the dynamic nature of the internet, links go stale almost the minute you post them. Please quote the relevant parts in your answer/comment.
Apologies, I thought that was OK for comments. I've added a more complete answer.
Similar question here https://github.com/lucasw/rviz_camera...
Also #q142351, but it doesn't cover the XYOrbit requirement. rviz_camera_stream doesn't provide that kind of UI, but maybe some other tf-generating solution does and could be used in combination with it.
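To illustrate the "other tf-generating solution" idea: you could write a small node that broadcasts a virtual camera frame hovering a fixed height above the robot, and point rviz_camera_stream (or any renderer that follows tf) at that frame. Below is a minimal sketch for ROS 1, assuming rospy/tf2_ros are available; the frame names `base_link` and `follow_cam`, the 2 m height, and the 10 Hz rate are placeholder assumptions, not values from this thread.

```python
#!/usr/bin/env python
# Hypothetical sketch: broadcast a "follow_cam" tf frame that hovers a
# fixed height above the robot's base frame, looking straight down
# (XYOrbit-like top-down view). Frame names and numbers are assumptions.
import math

def hover_transform(height=2.0):
    """Return (translation, quaternion) for a frame `height` metres above
    the parent frame. The 180-degree rotation about the parent's x-axis
    points the camera frame's z-axis (the optical axis) straight down."""
    half = math.pi / 2.0  # half of the 180-degree rotation angle
    translation = (0.0, 0.0, height)
    rotation = (math.sin(half), 0.0, 0.0, math.cos(half))  # (x, y, z, w)
    return translation, rotation

# The ROS wiring only runs inside an actual ROS environment.
try:
    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped
    HAVE_ROS = True
except ImportError:
    HAVE_ROS = False

if __name__ == "__main__" and HAVE_ROS:
    rospy.init_node("follow_cam_broadcaster")
    br = tf2_ros.TransformBroadcaster()
    rate = rospy.Rate(10)  # roughly match the 10 Hz map updates
    while not rospy.is_shutdown():
        t = TransformStamped()
        t.header.stamp = rospy.Time.now()
        t.header.frame_id = "base_link"  # robot frame (assumed name)
        t.child_frame_id = "follow_cam"  # virtual camera frame
        trans, rot = hover_transform()
        t.transform.translation.x, t.transform.translation.y, \
            t.transform.translation.z = trans
        t.transform.rotation.x, t.transform.rotation.y, \
            t.transform.rotation.z, t.transform.rotation.w = rot
        br.sendTransform(t)
        rate.sleep()
```

With a frame like this in the tree, rviz_camera_stream's camera could be attached to `follow_cam` and its image topic relayed to the browser (e.g. via web_video_server), avoiding pushing the raw point cloud through rosbridge.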