Nvidia has developed a new cloud-based virtual testing system for autonomous vehicles. The computing platform is built on photo-realistic simulations run by two separate servers. The first server runs the DRIVE Sim software, which simulates a self-driving vehicle's sensors, such as cameras and radar, while the second server contains a powerful Nvidia DRIVE Pegasus AI vehicle computer that runs the complete autonomous vehicle software stack and processes all the simulated data as if it were coming from the sensors of a car driving on a real road.
The simulation server is powered by Nvidia GPUs, each generating a stream of simulated sensor data that is then transferred to the DRIVE Pegasus for processing. The DRIVE Sim software generates photoreal data streams to create a vast range of testing environments. It can simulate different weather conditions, such as rainstorms and snowstorms; blinding glare at different times of day or limited vision at night; and all manner of road surfaces and terrain. Dangerous situations can be scripted in simulation to test the autonomous car's ability to react, all without ever putting anyone in harm's way.
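To make the closed loop described above concrete, here is a minimal sketch of a hardware-in-the-loop-style simulation in Python. Everything in it is hypothetical, not Nvidia's actual API: `simulate_sensor_frame` stands in for the simulation server, `vehicle_computer` stands in for the in-loop DRIVE Pegasus running the driving software, and the `Scenario` fields mirror the kinds of conditions (weather, time of day, glare) the article says DRIVE Sim can vary.

```python
import random
from dataclasses import dataclass

# Hypothetical scenario parameters, echoing conditions the simulator can script.
@dataclass
class Scenario:
    weather: str       # e.g. "clear", "rainstorm", "snowstorm"
    time_of_day: str   # e.g. "noon", "night"
    glare: bool

def simulate_sensor_frame(scenario: Scenario, step: int) -> dict:
    """Stand-in for the simulation server: emit one frame of fake sensor data."""
    visibility = 1.0
    if scenario.weather in ("rainstorm", "snowstorm"):
        visibility *= 0.5
    if scenario.time_of_day == "night":
        visibility *= 0.6
    if scenario.glare:
        visibility *= 0.7
    return {
        "step": step,
        "visibility": visibility,
        "obstacle_distance_m": random.uniform(5.0, 100.0),
    }

def vehicle_computer(frame: dict) -> dict:
    """Stand-in for the in-loop vehicle computer: map sensor data to a command."""
    # Brake hard for close obstacles; otherwise drive slower in poor visibility.
    if frame["obstacle_distance_m"] < 10.0:
        return {"throttle": 0.0, "brake": 1.0}
    return {"throttle": min(1.0, frame["visibility"]), "brake": 0.0}

def run_episode(scenario: Scenario, steps: int = 100) -> list:
    """Closed loop: simulated frame -> vehicle computer -> command, repeated."""
    commands = []
    for step in range(steps):
        frame = simulate_sensor_frame(scenario, step)
        commands.append(vehicle_computer(frame))
    return commands

# Script a dangerous scenario entirely in software, with no one at risk.
commands = run_episode(Scenario(weather="rainstorm", time_of_day="night", glare=False))
```

The point of the structure, as in the real system, is that the driving logic never knows it is being tested: it consumes frames exactly as it would consume live sensor input, so rare corner cases can be replayed at will.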
Speaking about the new technology, Rob Csongor, vice president and general manager of Automotive at Nvidia, said, “Deploying production self-driving cars requires a solution for testing and validating on billions of driving miles to achieve the safety and reliability needed for customers. With DRIVE Constellation, we’ve accomplished that by combining our expertise in visual computing and data centers. With virtual simulation, we can increase the robustness of our algorithms by testing on billions of miles of custom scenarios and rare corner cases, all in a fraction of the time and cost it would take to do so on physical roads.”
We understand that the new technology will be available to Nvidia’s partners, such as Volkswagen, Tesla, and Uber, in the third quarter of 2018.