Using real-world Kinect with Gazebo
I'm new to the ROS community. I have a general question: Is it possible to use data from a real-world Kinect to guide a simulated robot (e.g. the PR2) in Gazebo? If so, how hard would it be to do this?
My hope is that the interface will be a drop-in replacement when I move to a real robot (someday), so that I can test a gesture-based control algorithm against a simulated robot first.
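To make the idea concrete, here's a rough sketch of what I'm imagining: a node that subscribes to the real Kinect's point cloud and publishes velocity commands to the simulated PR2's base. The topic names are my guesses from the openni_launch and PR2 simulator docs, and the gesture recognition itself is just a placeholder; none of this is tested.

    #!/usr/bin/env python
    # Sketch only: bridge a real Kinect's data to a simulated PR2 in Gazebo.
    import rospy
    from sensor_msgs.msg import PointCloud2
    from geometry_msgs.msg import Twist

    class GestureTeleop(object):
        def __init__(self):
            # Real Kinect: openni_launch publishes registered clouds here
            # by default (I think).
            self.sub = rospy.Subscriber('/camera/depth_registered/points',
                                        PointCloud2, self.cloud_cb)
            # Simulated PR2 base controller in Gazebo listens here
            # (again, my assumption from the docs).
            self.pub = rospy.Publisher('/base_controller/command', Twist,
                                       queue_size=1)

        def cloud_cb(self, cloud):
            # Placeholder: my gesture recognizer would turn the point
            # cloud into a motion command at this point.
            cmd = Twist()
            cmd.linear.x = 0.1  # e.g. a "move forward" gesture was seen
            self.pub.publish(cmd)

    if __name__ == '__main__':
        rospy.init_node('gesture_teleop')
        GestureTeleop()
        rospy.spin()

If this is roughly the right shape, then swapping the simulated robot for a real one later should just mean pointing the publisher at the real robot's command topic.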
Thanks in advance for any feedback.