0-. Clarification for where ROS nodes and non-ROS lower layer processes run.

on the QNX, I run roslaunch...

I assume you meant your computer is accessing QNX, rather than that you run roslaunch on QNX itself. AFAIK there's no roslaunch or any other ROS implementation for QNX (I managed to run Catkin on it years ago, but no other ROS tools have been ported yet).

the ros master core + ros bridge which run on the QNX as well as hostname and port for the nextage robot.

As per my comment above, this isn't correct.

1-. "ipython access" to the robot fails from another computer

All you need for the Python client to access the robot is the CORBA communication in between. The easiest check is to simply ping your robot to verify there's an IP connection. If the ping goes through, you're probably good (CORBA uses port 15005 for NEXTAGE, which usually isn't blocked). You should be able to skip the --port and --modelfile args by default.
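If ping succeeds but the client still can't connect, you can also probe the CORBA port directly. Here's a minimal sketch using Python's standard `socket` module; the hostname `nextage` and port 15005 are taken from the discussion above as assumptions, so substitute your robot's actual hostname/IP from your /etc/hosts.

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # "nextage" and 15005 are assumptions; adjust for your setup.
    host, port = "nextage", 15005
    if can_connect(host, port):
        print("CORBA port reachable")
    else:
        print("cannot reach CORBA port -- check hostname, cabling, firewall")
```

This only tells you the TCP port is open, not that the CORBA naming service is healthy, but it's a quick way to separate network problems from robot-side problems.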

(BTW, when we talk about the NEXTAGE computers resource-wise, we typically refer to QNX and the "robot" as the same thing, because that's where all the low-level hardware control processes run.)

2-. "wait for ModelLoader" issue

This kind of error message gives me the impression that something is not working right on the robot/QNX side. If possible, reboot QNX and try again. Also, please share the output of rtls with us.


Lastly, I assume you've been working through the tutorials at http://wiki.ros.org/rtmros_nextage/Tutorials. I'd be more than happy to hear which particular pages are unclear to you. You can keep using answers.ros.org, although GitHub actually works better for that kind of feedback.



UPDATE1 (Disclaimer: the following is mostly AFAIK, since I'm not affiliated with the manufacturer.) NEXTAGE Open should come with 2 computers by default, so seeing 2 machines on the robot's network isn't surprising.

The other IP address (called nextage I think in /etc/hosts) does not answer pings nor has open ports. Maybe this 'nextage' is the real QNX?

Normally yes, unless your setup has been customized.

What can I do if pinging this does not work?

In the factory setting, QNX should be using a specific IP address that I have in mind on the robot's LAN (i.e. not your office/lab's LAN). But the manufacturer doesn't want us to disclose that IP address publicly, so I'll email you their support contact info in case you don't have it.