Tuesday, 31 December 2019

NVIDIA introduces software-defined platform for autonomous machines

NVIDIA has introduced NVIDIA DRIVE AGX Orin, a software-defined platform for autonomous vehicles (AVs) and robots. Orin is designed to handle the large number of applications and deep neural networks that run simultaneously in autonomous vehicles and robots, while achieving systematic safety standards such as ISO 26262 ASIL-D.

The platform is powered by a new system-on-a-chip (SoC) called Orin which consists of 17 billion transistors and is the result of four years of R&D investment. The Orin SoC integrates NVIDIA’s GPU architecture and Arm Hercules CPU cores, as well as new deep learning and computer vision accelerators that, in aggregate, deliver 200 trillion operations per second — nearly 7x the performance of NVIDIA’s previous generation Xavier SoC.

DRIVE AGX Orin is designed to enable architecturally compatible platforms that scale from Level 2 to full self-driving Level 5 vehicles, enabling OEMs to develop large-scale and complex families of software products. Since both Orin and Xavier are programmable through open CUDA and TensorRT APIs and libraries, developers can leverage their investments across multiple product generations.

“Creating a safe autonomous vehicle is perhaps society’s greatest computing challenge,” said Jensen Huang, founder and CEO of NVIDIA.

“The amount of investment required to deliver autonomous vehicles has grown exponentially, and the complexity of the task requires a scalable, programmable, software-defined AI platform like Orin.” 

“NVIDIA’s long-term commitment to the transportation industry, along with its innovative end-to-end platform and tools, has resulted in a vast ecosystem — virtually every company working on AVs is utilising NVIDIA in its compute stack,” said Sam Abuelsamid, Principal Research Analyst at Navigant Research.

“Orin looks to be a significant step forward that should help enable the next great chapter in this ever-improving technology story.”

DRIVE has become a de facto standard for AV development, used broadly by automakers, truck manufacturers, robotaxi companies, software companies and universities. The NVIDIA DRIVE AGX Orin family will include a range of configurations based on a single architecture, targeting automakers’ 2022 production timelines.

NVIDIA separately announced that it will provide the transportation industry with access to its NVIDIA DRIVE deep neural networks (DNNs) for autonomous vehicle development. Safe self-driving cars require dozens of DNNs tackling redundant and diverse tasks to ensure accurate perception, localisation and path planning.

NVIDIA has spent years developing and training the DNNs that run on the NVIDIA DRIVE AGX platform, turning raw sensor data into a deep understanding of the real world. These DNNs cover such tasks as traffic-light and sign detection, object detection (for vehicles, pedestrians, bicycles) and path perception, as well as gaze detection and gesture recognition inside the vehicle.
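Redundancy across independent networks is a key part of this approach: several detectors observing the same scene can cross-check one another. The sketch below is a deliberately simplified, hypothetical illustration of that idea (not NVIDIA's implementation), combining class labels from redundant detectors by majority vote.

```python
from collections import Counter

def majority_vote(detections):
    """Combine class labels from redundant, independent detectors by
    majority vote -- a toy illustration of redundancy in perception.
    Returns the winning label, or None if no strict majority exists."""
    counts = Counter(detections)
    label, votes = counts.most_common(1)[0]
    return label if votes > len(detections) / 2 else None

print(majority_vote(["pedestrian", "pedestrian", "cyclist"]))  # pedestrian
print(majority_vote(["pedestrian", "cyclist", "vehicle"]))     # None
```

In a real perception stack the cross-check would operate on bounding boxes and confidences rather than bare labels, but the principle is the same: no single network's output is trusted in isolation.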

AV developers will have access to NVIDIA's pretrained artificial intelligence (AI) models and training code. Using a suite of NVIDIA AI tools, the ecosystem can extend and customise the models to increase the robustness and capabilities of their own self-driving systems.

“The AI autonomous vehicle is a software-defined vehicle required to operate around the world on a wide variety of datasets,” said Huang. “By providing AV developers access to our DNNs and the advanced learning tools to optimise them for multiple datasets, we’re enabling shared learning across companies and countries, while maintaining data ownership and privacy.

"Ultimately, we are accelerating the reality of global autonomous vehicles.”

“NVIDIA leads the world in developing the deepest and broadest suite of DNNs and AI tools for the transportation industry,” said Luca De Ambroggi, Senior Research Director of Artificial Intelligence at IHS Markit.

“Making these algorithms available to others, along with the tools and workflow infrastructure to customise them, will help enable the deployment of safe autonomous transportation.”

In addition to providing access to the DNNs, NVIDIA announced the availability of a suite of advanced tools so developers can customise and enhance NVIDIA’s DNNs using their own datasets and target feature set. These tools enable the training of DNNs using different types of learning:

• Active learning improves model accuracy and reduces data collection costs by automating data selection using AI, rather than manual curation.

• Federated learning enables companies to utilise datasets across countries and with other companies while maintaining data privacy and protecting their intellectual property.

• Transfer learning gives DRIVE customers the ability to speed development of their perception software by leveraging NVIDIA’s significant investment in AV development, then further developing these networks for their own applications and target capability.
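The first of these strategies is easy to illustrate. The sketch below is a minimal, hypothetical example of uncertainty-based active learning (not NVIDIA's tooling): it ranks unlabelled frames by prediction entropy and selects the most uncertain ones for labelling, automating data selection instead of curating by hand.

```python
import math

def entropy(probs):
    """Shannon entropy of a class-probability distribution; higher = more uncertain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_labelling(frames, predict, budget):
    """Rank unlabelled frames by the entropy of the model's prediction and
    return the `budget` most uncertain ones -- the core loop of
    uncertainty-based active learning."""
    scored = [(entropy(predict(f)), f) for f in frames]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [f for _, f in scored[:budget]]

# Toy stand-in: each "frame" is already a probability vector here,
# so the model is the identity function.
frames = [[0.98, 0.01, 0.01],   # confident -> low entropy
          [0.34, 0.33, 0.33],   # near-uniform -> high entropy
          [0.70, 0.20, 0.10]]
picked = select_for_labelling(frames, predict=lambda f: f, budget=1)
print(picked)  # the near-uniform frame is chosen
```

A production system would run a real perception model over raw camera frames and send the selected frames to human annotators, but the selection criterion works the same way.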

Access to the AI models will be available on NVIDIA GPU Cloud (NGC). NGC is the hub for GPU-optimised software for deep learning, machine learning, and high-performance computing.

NVIDIA also shared that Didi Chuxing (DiDi), the mobile transportation platform, will leverage NVIDIA GPUs and AI technology to develop autonomous driving and cloud computing solutions. Delivering 10 billion passenger trips per year, DiDi is working toward the safe, large-scale application of autonomous driving technology, leveraging its own technology capacities, data resources and open collaboration with tech leaders and OEM partners.

DiDi will use NVIDIA GPUs in data centre servers for training machine learning algorithms and NVIDIA DRIVE for inference on its Level 4 autonomous driving vehicles. A Level 4 vehicle is fully autonomous under specific conditions. In August 2019, DiDi upgraded its autonomous driving unit into an independent company and began a wide range of collaborations with industry partners.

As part of the centralised AI processing of DiDi’s autonomous vehicles, NVIDIA DRIVE enables data to be fused from all types of sensors (cameras, lidar, radar, etc.) using DNNs to understand the 360-degree environment surrounding the car and plan a safe path forward.
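As a rough illustration of the fusion idea (not NVIDIA's pipeline), one simple "late fusion" step combines per-sensor position estimates for the same object, weighted by each detector's confidence. The sensor names, coordinates and confidences below are entirely hypothetical.

```python
def fuse_detections(detections):
    """Confidence-weighted average of per-sensor (x, y) position estimates --
    a toy late-fusion step. Each detection is (sensor, x, y, confidence)."""
    total = sum(conf for _, _, _, conf in detections)
    x = sum(xi * conf for _, xi, _, conf in detections) / total
    y = sum(yi * conf for _, _, yi, conf in detections) / total
    return x, y

# Hypothetical detections of the same vehicle from three sensor types.
detections = [
    ("camera", 10.2, 4.1, 0.6),
    ("lidar",  10.0, 4.0, 0.9),
    ("radar",  10.4, 4.3, 0.5),
]
x, y = fuse_detections(detections)
print(x, y)
```

Real AV stacks fuse far richer information (full 3D boxes, velocities, track histories) and often fuse earlier, at the feature level, but the weighting principle — trust the more confident sensor more — carries over.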

“Developing safe autonomous vehicles requires end-to-end AI, in the cloud and in the car,” said Rishi Dhall, VP, Autonomous Vehicles at NVIDIA. “NVIDIA AI will enable DiDi to develop safer, more efficient transportation systems and deliver a broad range of cloud services.”

For cloud computing, DiDi will also build an AI infrastructure and launch virtual GPU (vGPU) cloud servers for computing, rendering and gaming. DiDi Cloud will adopt a new vGPU license mode to provide users with better experiences, richer application scenarios and more efficient, flexible GPU cloud computing services. Currently, DiDi Cloud is collaborating with industry partners including NVIDIA to provide services in transportation, AI, graphics rendering, video games and education.
