High-Resolution mm-Wave Imaging and Detection for Self-Driving Cars
Recent advances in computer vision have given autonomous systems such as self-driving cars and robots the ability to see the environment as humans do and to interact with it. Millimeter-wave (mm-wave) radars, as a new sensory modality for autonomous systems, can further enable the perception of things that are otherwise invisible to human eyes: objects hidden behind occlusions, around corners, and in conditions where cameras fail, such as fog and smog. Despite these capabilities, however, the imaging and sensing performance of mm-wave radars remains significantly limited compared to other sensors such as cameras. In this talk, I will describe how we can use deep learning to enhance mm-wave radar imaging and sensing to capture rich perceptual and contextual information about the environment. Specifically, I will discuss how we can enable self-driving cars to see through fog by achieving high-resolution imaging and accurate detection with mm-wave radars. I will also demonstrate how to take advantage of large-scale unlabeled data to improve perception performance through self-supervised learning. Finally, I will describe how mm-wave radar combined with AI can help us sense materials.