BIT team publishes multi-model dataset for autonomous driving in complex environments


A team led by Professors Xu Bin, Wang Weida, and Wang Li from the State Key Laboratory of Intelligent Unmanned System Technology, School of Mechanical Engineering and Vehicle Engineering at Beijing Institute of Technology, together with collaborators from Tsinghua University and the National University of Singapore, has recently published Dual Radar, the world's first autonomous driving dataset for complex environments built on dual 4D imaging millimeter-wave radars, in a Nature Portfolio journal.

The dataset comprises perception data collected under non-ideal conditions over more than 400 kilometers of driving, with over 10,000 synchronized frames annotated for 3D object detection and tracking, enabling evaluation of existing perception algorithms on real-world non-ideal data.

The related research results were published under the title "Dual Radar: A Multi-modal Dataset with Dual 4D Radar for Autonomous Driving" in Scientific Data, a journal dedicated to publishing datasets of scientific value.

The environmental perception of autonomous driving systems is crucial for ensuring safety and reliability, but existing sensor technologies still face many challenges in complex and dynamic driving environments. Compared with traditional 3D radar, 4D millimeter-wave radar offers higher point cloud density and precise elevation (vertical) resolution, making it promising for environmental perception in autonomous driving.

Currently, the field of autonomous driving lacks comparative analyses of the point cloud densities and noise levels of different 4D millimeter-wave radars. Existing datasets each rely on a single type of 4D millimeter-wave radar, so long-range point cloud data and wide-field point cloud data cannot be captured in the same scene, preventing a direct comparison of point cloud densities and noise levels.
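To make the comparison concrete, the sketch below computes two simple per-frame statistics, point count (a density proxy) and mean range, for two radar point clouds. This is a minimal illustration under assumed synthetic data; the Dual Radar dataset's actual file format and any helper names here are not taken from the paper.

```python
import math
import random

def point_cloud_stats(points):
    """Return (count, mean range in meters) for a list of (x, y, z) points.

    The count serves as a per-frame density proxy; the mean range hints at
    whether a sensor favors long-range or wide-field coverage.
    """
    ranges = [math.sqrt(x * x + y * y + z * z) for x, y, z in points]
    return len(points), sum(ranges) / len(ranges)

# Hypothetical frames: a long-range radar (sparser, farther returns)
# versus a wide-field radar (denser, closer returns).
random.seed(0)
long_range = [(random.uniform(20, 150), random.uniform(-10, 10), 0.0)
              for _ in range(200)]
wide_field = [(random.uniform(5, 60), random.uniform(-40, 40), 0.0)
              for _ in range(1200)]

for name, cloud in [("long-range", long_range), ("wide-field", wide_field)]:
    n, r = point_cloud_stats(cloud)
    print(f"{name}: {n} points, mean range {r:.1f} m")
```

Because both radars in Dual Radar observe the same scene at the same time, statistics like these can be compared frame by frame rather than across unrelated datasets.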

To address this issue, the research team proposed a new dataset, Dual Radar, which pairs two 4D radars with distinct characteristics, one optimized for long range and one for a wide field of view, so that their markedly different point cloud data can be analyzed in the same scenes. This advancement aims to propel the development of environmental perception technologies for autonomous driving in adverse weather and complex lighting scenarios.

The data collected in this work covers challenging environments that traditional datasets have not fully addressed, such as adverse weather (rain and fog), low-light nighttime driving, and backlighting. This fills the gap in existing datasets for extreme conditions and provides new data support for the development of perception technologies in autonomous driving. The work shows particular promise for enhancing the all-weather reliability and safety of autonomous vehicles in complex environments, and is expected to drive autonomous driving perception systems toward low cost, high robustness, and all-scenario capability.
