Documentation on unit and integration tests for ROS2 core packages?

asked 2021-04-01 12:08:49 -0500 by shonigmann

I was debating between ROS Discourse and ROS Answers... So hopefully I chose correctly.

Bit of a multi-parter here, but the tl;dr version: is there documentation on the testing of ROS 2 core packages beyond the ROS 2 developer guide, which lists some best practices and quality targets, and the test source code that most core packages ship with?

Is there a better way to understand what the tests are covering besides going through and reading the test source and comments? That is, do any packages have more formal documentation on what they're testing and what requirements they're trying to meet?
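For context, this is the kind of thing I mean by "reading the test source": a minimal, hedged sketch in the pytest style used by Python core packages such as rclpy (the node name and test name here are made up, not taken from any real package), where the test name and asserts are usually the only record of the requirement being checked.

```python
# Minimal sketch of a pytest-style rclpy unit test (names are hypothetical).
import pytest
import rclpy
from std_msgs.msg import String


@pytest.fixture
def node():
    rclpy.init()
    node = rclpy.create_node('doc_question_example')  # hypothetical node name
    yield node
    node.destroy_node()
    rclpy.shutdown()


def test_publisher_topic_name_is_resolved(node):
    # The test name and this assert are often the only "documentation" of
    # what behaviour is being verified.
    pub = node.create_publisher(String, 'chatter', 10)
    assert pub.topic_name == '/chatter'
    node.destroy_publisher(pub)
```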

REP 2004 mentions that a "set of measurable standards and explanations of the goals they equate to is likely the most convincing method of proving to stakeholders that the package is properly tested", but it doesn't seem to set any requirements beyond test coverage (>90-95%, depending on where you look). Have any packages used more of a standards/goals-based approach, and if so, are those explanations recorded somewhere?

Are there any hard requirements for performance tests for any of the packages, or are these primarily used for benchmarking and to ensure that updates aren't degrading performance?
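To illustrate the distinction I'm asking about, here is a hedged sketch of what a hard performance requirement could look like as a test, versus pure benchmarking. The topic name, message size, and 1 ms/message budget are all assumptions for illustration and are not taken from any core package's tests.

```python
# Sketch of a timing check used as a regression gate (all thresholds assumed).
import time

import rclpy
from std_msgs.msg import String


def test_publish_stays_within_latency_budget():
    rclpy.init()
    node = rclpy.create_node('perf_sketch')  # hypothetical node name
    pub = node.create_publisher(String, 'perf_topic', 10)

    msg = String()
    msg.data = 'x' * 1024  # 1 KiB payload, arbitrary

    start = time.perf_counter()
    for _ in range(1000):
        pub.publish(msg)
    elapsed = time.perf_counter() - start

    # A hard assert like this turns a benchmark into a requirement; the
    # alternative is to just record the number and track the trend over time.
    assert elapsed / 1000 < 1e-3

    node.destroy_node()
    rclpy.shutdown()
```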

Is there a dashboard or similar resource (e.g. similar to the discussion here) that summarizes package quality for the core packages? For example, one that lists the currently declared quality level, maybe the desired quality level, what's missing from the package to reach that level, and some data or statistics on test coverage from the nightly builds. If it doesn't exist, I think it could be an interesting thing to develop, e.g. on ROS Metrics, to help inform new users about the "maturity" of packages and to help direct new contributors who want to help mature ROS 2.

Thanks in advance for any insight that can be shared!
