Registering two point clouds means finding the optimal rigid transformation that aligns one with the other. For a connected autonomous vehicle (CAV), accurate localization of an 'ego' vehicle can be achieved by registering its point cloud to LiDAR data from other connected 'cooperative' vehicles.
This paper utilizes an advanced object detection algorithm to select observation points that lie on detected vehicles. As a prerequisite, a general probability distribution (cf. left figure) is established from the observation points of all detected vehicles.
For the registration, in a first step, observation points from a cooperative vehicle are assigned to detected bounding boxes. Each set of points belonging to one bounding box is then registered to the general probability distribution, resulting in a 'probability map'. In a second step, the probability map is used as shared information, and the point cloud of the ego vehicle is registered to it.
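The construction of the general distribution and the per-box probability lookup can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the general distribution is a normalized 2D histogram over box-normalized coordinates, and all function names are hypothetical.

```python
import numpy as np

def build_general_distribution(normalized_pts, bins=32):
    """Density over box-normalized (x, y) coordinates, built from the
    observation points of all detected vehicles (histogram model is an
    assumption; the paper's exact distribution model may differ)."""
    hist, xe, ye = np.histogram2d(
        normalized_pts[:, 0], normalized_pts[:, 1],
        bins=bins, range=[[0, 1], [0, 1]])
    density = hist / max(hist.sum(), 1)
    return density, (xe, ye)

def probability_map(box_pts, density, edges):
    """Look up the general-distribution probability for each observation
    point assigned to one bounding box, yielding a per-point value of the
    'probability map'."""
    xe, ye = edges
    ix = np.clip(np.searchsorted(xe, box_pts[:, 0], side="right") - 1,
                 0, density.shape[0] - 1)
    iy = np.clip(np.searchsorted(ye, box_pts[:, 1], side="right") - 1,
                 0, density.shape[1] - 1)
    return density[ix, iy]
```

In this sketch the histogram acts as a shared, compact stand-in for raw points: only the density grid and the per-box values would need to be transmitted to the ego vehicle.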
Different from the Euclidean distance metric of the Iterative Closest Point (ICP) algorithm and the consensus-count metric of the maximum consensus method, a new probability-related metric is proposed for coarse registration. It provides an initial transformation, which is afterwards refined by ICP.
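The coarse-then-refine scheme can be sketched in 2D as below. This is an illustrative stand-in under stated assumptions, not the paper's method: the probability metric is approximated by a Gaussian likelihood of each point under its nearest reference point, the coarse search is a simple grid over candidate transforms, and the refinement is a textbook point-to-point ICP.

```python
import numpy as np

def rigid2d(theta, t):
    """2D rotation matrix and translation vector."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]), np.asarray(t, dtype=float)

def prob_score(src, ref, sigma=0.5):
    """Probability-related metric (assumption): mean Gaussian likelihood
    of each source point under its nearest reference point."""
    d2 = ((src[:, None, :] - ref[None, :, :]) ** 2).sum(-1).min(1)
    return np.exp(-d2 / (2 * sigma ** 2)).mean()

def coarse_then_icp(src, ref, thetas, shifts, iters=10):
    # Coarse step: pick the candidate transform maximizing the metric.
    def apply(p):
        R, t = rigid2d(*p)
        return src @ R.T + t
    best = max(((th, t) for th in thetas for t in shifts),
               key=lambda p: prob_score(apply(p), ref))
    cur = apply(best)
    # Refinement: point-to-point ICP (nearest neighbours + SVD alignment).
    for _ in range(iters):
        nn = ref[((cur[:, None, :] - ref[None, :, :]) ** 2).sum(-1).argmin(1)]
        mu_s, mu_r = cur.mean(0), nn.mean(0)
        U, _, Vt = np.linalg.svd((cur - mu_s).T @ (nn - mu_r))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        cur = (cur - mu_s) @ R.T + mu_r
    return cur
```

The coarse metric rewards transforms that place many points in high-probability regions, which is more tolerant of a bad initial guess than ICP's Euclidean objective; ICP then removes the residual error left by the discrete search grid.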
The registration is based entirely on the vehicle information in the scene. The algorithm is evaluated on the collective perception data set COMAP, with a focus on scenes that are challenging for existing registration algorithms, such as traffic jams or open spaces where no sufficient overlap of observed static objects exists. In these scenarios, the algorithm shows good performance in terms of accuracy and robustness.
The left figure shows the general distribution of observation points, while the figure on the right shows the registration result between the 'probability map' of a cooperative vehicle and the LiDAR points of the ego vehicle.