How Arbe is Leveraging NVIDIA's DRIVE Sim to Boost the Development of Robust Perception Imaging Radars

Perception is the ability to be aware, to interpret, to understand. To achieve truly safe autonomous driving, sensors need to go beyond the act of sensing to support genuine understanding. Perception is critical for safe autonomous driving, and imaging radar is critical for perception.

Arbe's 360° Radar-based Perception Environment provides unparalleled safety to autonomous vehicles through unified AI-based analysis of the vehicle's surroundings. A combination of perception radars and surround imaging radars provides full sensing coverage tailored to every position around the vehicle, while integrated perception software ties the sensors at each position together into a comprehensive, coherent understanding of the driving environment. Radar is a critical sensor in the perception suite, since only radar remains functional in all weather and lighting conditions, penetrating dust, rain, fog, and snow. By infusing radar with ultra-high resolution in every dimension, Arbe repositions it from a supporting player to the backbone of the autonomous perception suite.

Arbe leverages artificial intelligence algorithms to train its Imaging Radar's perception and cover all corner cases. Recently, we have begun to utilize NVIDIA DRIVE Sim™ to both simplify and improve this process. NVIDIA DRIVE Sim is an end-to-end, multi-sensor simulation platform that supports AV development, improving productivity and accelerating time to market. By taking advantage of the DRIVE Sim ecosystem, Arbe delivers a better, more thoroughly tested, more easily integrated product in less time.

Deep Learning AI Training, Solved

To train its deep learning AI, Arbe needs enormous amounts of data. We use machine learning, which requires massive quantities of data labeled to the pixel level, for SLAM, perception, super resolution, and a host of additional next-generation radar developments. With DRIVE Sim, AV developers can improve their productivity, efficiency, and test coverage, accelerating time to market while minimizing real-world driving. To train AI models efficiently, the industry rule of thumb is that data must be generated at a rate 180 times faster than the sensor's real acquisition rate, something that is only possible with a simulation of the depth, breadth, and quality of DRIVE Sim.
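To make the 180x rule of thumb concrete, here is a minimal back-of-the-envelope sketch in Python. The radar frame rate and the target number of driving hours are illustrative assumptions, not Arbe or NVIDIA figures.

```python
# Back-of-the-envelope sketch of the 180x rule of thumb.
# The frame rate and target driving hours below are illustrative
# assumptions, not published Arbe or NVIDIA specifications.

REAL_TIME_FACTOR = 180         # synthetic data generated 180x faster than real acquisition
RADAR_FRAME_RATE_HZ = 20       # assumed radar frames per second (illustrative)
TARGET_DRIVING_HOURS = 10_000  # assumed driving hours needed for one training cycle


def synthetic_generation_hours(target_hours: float, factor: float = REAL_TIME_FACTOR) -> float:
    """Wall-clock hours of simulation needed to cover `target_hours` of driving."""
    return target_hours / factor


def labeled_frames(target_hours: float, frame_rate_hz: float = RADAR_FRAME_RATE_HZ) -> int:
    """Number of per-frame labeled radar samples produced for `target_hours` of driving."""
    return int(target_hours * 3600 * frame_rate_hz)


if __name__ == "__main__":
    hours = synthetic_generation_hours(TARGET_DRIVING_HOURS)
    frames = labeled_frames(TARGET_DRIVING_HOURS)
    print(f"{TARGET_DRIVING_HOURS} driving hours -> ~{hours:.0f} h of simulation "
          f"and ~{frames:,} labeled radar frames")
```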

Access to the Unusual

DRIVE Sim also makes it easier for Arbe to test uncommon or dangerous scenarios simply and safely. DRIVE Sim taps into NVIDIA's core technologies, including NVIDIA RTX, Omniverse, and AI, to deliver a powerful, cloud-based simulation platform capable of generating a wide range of real-world scenarios for autonomous vehicle development and validation. Without it, intentionally testing Perception Imaging Radar's capabilities in situations like accidents, running red lights, driving at very high speeds, or a child bolting into the street would be not only difficult but unethical.

When testing in a virtual environment, we ensure that we have examined these problematic events from every "angle": choosing which road participants are present, defining unexpected weather and lighting conditions, and even introducing randomization into the scene to test options we hadn't anticipated and uncover corner cases. These are scenarios that are difficult to reliably reproduce in real life, and DRIVE Sim helps us make sure that our Perception Imaging Radar solution has been evaluated to the highest standard. And because the simulator includes lidar and camera in every scenario, we have the information we need to train and format our radar data in a way that makes sensor fusion as seamless as possible.
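As a rough illustration of this kind of scenario randomization, the sketch below samples road participants, weather, lighting, and ego speed for a batch of test scenes. It is a generic Python example with hypothetical scenario fields; it does not use the actual DRIVE Sim scenario format or API.

```python
from __future__ import annotations

import random
from dataclasses import dataclass, field

# Hypothetical scenario description used only to illustrate randomized
# corner-case generation; this is not the DRIVE Sim scenario format or API.

WEATHER = ["clear", "rain", "fog", "snow", "dust"]
LIGHTING = ["day", "dusk", "night", "low_sun_glare"]
PARTICIPANTS = ["pedestrian", "cyclist", "car", "truck", "motorcycle", "child_running_into_street"]


@dataclass
class Scenario:
    weather: str
    lighting: str
    ego_speed_kph: float
    participants: list[str] = field(default_factory=list)


def random_scenario(rng: random.Random) -> Scenario:
    """Sample one randomized scene, including rare, hard-to-stage events."""
    return Scenario(
        weather=rng.choice(WEATHER),
        lighting=rng.choice(LIGHTING),
        ego_speed_kph=rng.uniform(10.0, 130.0),
        participants=rng.sample(PARTICIPANTS, k=rng.randint(1, 4)),
    )


if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so interesting corner cases can be reproduced
    for scenario in (random_scenario(rng) for _ in range(5)):
        print(scenario)
```

Seeding the random generator is what makes a discovered corner case repeatable: once a randomized scene exposes a weakness, the same scene can be regenerated and rerun after every change to the perception stack.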

Risk-free Evaluation and Development

Finally, DRIVE Sim supports our customers directly as well: Arbe's sensor models are built into DRIVE Sim and can be made available to OEMs and Tier 1s for training and validating their perception algorithms, with a synthetic radar sensor that is as realistic and as close to Arbe's Phoenix and Lynx systems as possible. OEMs and Tier 1s can test their new systems online for a first evaluation. Further, just as DRIVE Sim saves Arbe training time and effort, it does the same for our clients, whose perception teams can develop their algorithms and enhance their capabilities, even before hardware is available, without many of the test drives that used to be necessary.
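To illustrate what validation against simulation ground truth can look like, here is a minimal sketch that scores a perception stack's detections against object positions exported from a simulated frame. The data layout, matching threshold, and sample values are illustrative assumptions, not part of Arbe's or NVIDIA's tooling.

```python
import numpy as np

# Minimal sketch of scoring perception output against ground truth exported
# from simulation. The data layout and matching threshold are assumptions.


def detection_recall(gt_xyz: np.ndarray, det_xyz: np.ndarray, max_dist_m: float = 1.0) -> float:
    """Fraction of ground-truth objects matched by at least one detection
    within `max_dist_m` metres (simple nearest-neighbour matching)."""
    if len(gt_xyz) == 0:
        return 1.0
    if len(det_xyz) == 0:
        return 0.0
    # Pairwise distances between ground-truth objects and detections.
    dists = np.linalg.norm(gt_xyz[:, None, :] - det_xyz[None, :, :], axis=-1)
    return float(np.mean(dists.min(axis=1) <= max_dist_m))


if __name__ == "__main__":
    # Ground-truth object centroids (x, y, z) from one simulated frame, in metres.
    gt = np.array([[10.0, 1.5, 0.0], [35.0, -2.0, 0.0], [60.0, 0.5, 0.0]])
    # Detections produced by the perception stack under test for the same frame.
    det = np.array([[10.3, 1.4, 0.0], [59.2, 0.6, 0.0]])
    print(f"recall: {detection_recall(gt, det):.2f}")  # 2 of 3 objects matched
```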

Teaming Up To Achieve Autonomy

The automotive autonomy industry understands that radar is a primary sensor, indispensable for driving the revolution that will enable the future of autonomous transportation.

"The ability to test and validate radar sensors in simulation is critical to bringing robust autonomous driving to the market. NVIDIA DRIVE Sim enables Arbe to make massive strides every day with its advanced Imaging Radar solutions," explained Zvi Greenstein, General Manager, Automotive at NVIDIA.

Arbe is leading the industry in Imaging Radar solutions and is an enabler of full autonomy. By supercharging the development and testing process, DRIVE Sim supports the advancement of autonomous technology achieved by Arbe's Perception Imaging Radar, helping to ensure a safer autonomous transportation future.
