
B. ROS2 Framework

A. Yilmaz edited this page Nov 29, 2024 · 11 revisions

Figure: ROS2 graph of the fruit detector package, showing its nodes and topics.

The ROS2 framework of the system is shown above. RGB and depth images are received from a ROS2 camera node on the /camera/image_raw and /camera/depth topics. The camera intrinsics, published on /camera/camera_info, are also required if the 3D poses of the detected fruits are to be computed.
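
The intrinsics are needed because recovering a fruit's 3D position from its pixel centroid and depth value uses the pinhole camera model. A minimal sketch of that back-projection (the function and the matrix values below are illustrative, not taken from the package):

```python
import numpy as np

# Illustrative pinhole back-projection; not the package's actual code.
# K is the 3x3 intrinsic matrix delivered in sensor_msgs/CameraInfo.
def backproject(u: float, v: float, depth_m: float, K: np.ndarray) -> np.ndarray:
    """Map a pixel (u, v) with depth in metres to a 3D point in the camera frame."""
    fx, fy = K[0, 0], K[1, 1]   # focal lengths in pixels
    cx, cy = K[0, 2], K[1, 2]   # principal point
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example with made-up intrinsics for a 640x480 camera:
K = np.array([[525.0,   0.0, 319.5],
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])
point = backproject(319.5, 239.5, 0.8, K)  # principal point lies on the optical axis
```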

Figure: 3D fruit pose estimation in ROS2.

Once the RGB and depth images are received, they are converted to OpenCV images and merged into an RGBD stack array. This stack, together with a unique image ID taken from the ROS2 message, is passed to the pre-trained predictor, which returns annotations and depth masks for the detected fruits in COCO JSON format.
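
The stacking step can be pictured as appending the depth map as a fourth channel. In the real node the two arrays come from cv_bridge conversions of the ROS image messages; the sketch below uses synthetic arrays of matching resolution instead:

```python
import numpy as np

# Illustrative RGBD stacking; in the package the inputs are cv_bridge
# conversions of the /camera/image_raw and /camera/depth messages.
def stack_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:
    """Append the depth map as a fourth channel to the RGB image."""
    if rgb.shape[:2] != depth.shape[:2]:
        raise ValueError("RGB and depth resolutions must match")
    # np.dstack promotes mixed dtypes (here uint8 + uint16 -> uint16).
    return np.dstack([rgb, depth])

rgb = np.zeros((480, 640, 3), dtype=np.uint8)      # 3-channel colour image
depth = np.zeros((480, 640), dtype=np.uint16)      # single-channel depth map
rgbd = stack_rgbd(rgb, depth)                      # (480, 640, 4) stack
```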

The resulting annotations describe each detected fruit in terms of its instance ID, category ID, segmentation mask, bounding box, size (as pixel-wise area), detection confidence, centroid and orientation estimates in image space, and semantic category, such as ripe or unripe.
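
These fields can be pictured as one COCO-style record per fruit. The key names below are illustrative, not the package's actual schema:

```python
import json

# Hypothetical COCO-style annotation for a single detected fruit;
# key names and values are illustrative only.
annotation = {
    "id": 1,                       # instance ID
    "category_id": 2,              # semantic category, e.g. 1 = unripe, 2 = ripe
    "segmentation": [[312, 104, 355, 104, 355, 150, 312, 150]],  # polygon mask
    "bbox": [312, 104, 43, 46],    # x, y, width, height in pixels
    "area": 1978,                  # fruit size as pixel-wise area
    "score": 0.93,                 # detection confidence
    "centroid": [333.5, 127.0],    # centroid estimate in image space
    "orientation": 0.42,           # orientation estimate in image space (radians)
}
serialised = json.dumps(annotation)  # the predictor emits COCO JSON like this
```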

Once obtained, the annotations are stored in the custom FruitInfoArray message.

With the default settings, the FruitInfoArray stores the per-fruit annotations in FruitInfoMessage format, the RGB and depth images used for detection in sensor_msgs/Image format, and an annotated image for visualisation purposes, also in sensor_msgs/Image format. In ROS2, the visualisation function can draw each detected fruit's centroid, mask, and orientation onto the original RGB image.
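
An interface along these lines would capture that layout. The field names below are hypothetical; consult the package's msg/ directory for the real definition:

```
# FruitInfoArray.msg (hypothetical sketch, field names not from the package)
FruitInfoMessage[] fruits         # per-fruit annotations
sensor_msgs/Image rgb_image       # RGB image used for detection
sensor_msgs/Image depth_image     # depth image used for detection
sensor_msgs/Image composed_image  # annotated image for visualisation
```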

The detection results are published on the /fruit_info topic, and the marked annotations on the /image_composed topic. The 3D poses of the detected fruits are also published as RViz marker arrays on the /fruit_markers topic.

Publishing of the RGB image, the depth image, the annotated image, and the marker arrays can each be switched off via the ROS2 parameter server.
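
For example, a ROS2 parameter file along these lines could disable the extra publishers at launch. The node and parameter names here are hypothetical; check the parameters the package actually declares:

```yaml
/fruit_detector:                 # hypothetical node name
  ros__parameters:
    publish_rgb_image: false     # hypothetical parameter names
    publish_depth_image: false
    publish_composed_image: false
    publish_markers: true
```

Such a file is passed with the standard `--ros-args --params-file params.yaml` arguments when starting the node.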
