Automatic Construction of Lane-level HD Maps for Urban Scenes

Yiyang Zhou, Yuichi Takeda, Masayoshi Tomizuka, Wei Zhan

High definition (HD) maps have demonstrated their essential role in enabling full autonomy, especially in complex urban scenarios. As a crucial layer of the HD map, lane-level maps are particularly useful: they contain geometrical and topological information for both lanes and intersections. However, large-scale construction of HD maps is limited by tedious human labeling and high maintenance costs, especially for urban scenarios with complicated road structures and irregular markings. This paper proposes an approach based on a semantic particle filter to tackle the automatic lane-level mapping problem in urban scenes. The map skeleton is first structured as a directed cyclic graph from the online mapping database OpenStreetMap. Our proposed method then performs semantic segmentation on 2D front-view images from ego vehicles and explores the lane semantics in a bird's-eye-view domain with true topographical projection. Exploiting OpenStreetMap, we further infer lane topology and reference trajectories at intersections from the aforementioned lane semantics. The proposed algorithm has been tested in densely urbanized areas, and the results demonstrate accurate and robust reconstruction of the lane-level HD map.
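The particle-filter core of such a pipeline can be illustrated with a minimal sketch: particles hypothesize a lane-boundary offset, and each hypothesis is reweighted by how well it agrees with an observation derived from semantic segmentation. All names, noise parameters, and the Gaussian likelihood below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical 1D example: particles estimate a lane boundary's lateral
# offset (meters) from noisy semantic observations. Parameters are
# illustrative, not from the paper.

def update(particles, weights, observation, motion_std=0.05, obs_std=0.2, rng=None):
    """One predict-update-resample cycle of a particle filter."""
    rng = np.random.default_rng() if rng is None else rng
    # Predict: diffuse particles with motion noise.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: reweight by a Gaussian likelihood of the semantic observation.
    weights = weights * np.exp(-0.5 * ((particles - observation) / obs_std) ** 2)
    weights = weights / weights.sum()
    # Resample (systematic) when the effective sample size drops too low.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        positions = (rng.random() + np.arange(len(particles))) / len(particles)
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions),
                         len(particles) - 1)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

rng = np.random.default_rng(0)
particles = rng.uniform(-2.0, 2.0, 500)      # flat prior over the offset
weights = np.full(500, 1.0 / 500)
for obs in [0.4, 0.45, 0.5, 0.48]:           # simulated semantic detections
    particles, weights = update(particles, weights, obs, rng=rng)
estimate = float(np.sum(particles * weights))
```

After a few observations near 0.45 m, the weighted mean concentrates around the true offset; the real method operates on richer bird's-eye-view lane semantics rather than a single scalar.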
