Accelerating In-Vehicle GPU Computing, from Sensing to Inference
Software Engineer, Pony.ai
Senior Systems Software Engineer, NVIDIA
Pony develops on the latest NVIDIA Ampere-based GPU technology to push AV compute performance across the stack and achieve safer autonomous driving in complex environments. This session shares our experience optimizing the GPU computing pipeline all the way from sensor data input to neural network output, including:

• Hardware-software co-design for high-bandwidth, low-latency GPU-to-GPU and GPU-to-FPGA communication via RDMA (see the peer-to-peer sketch after this list).
• Zero-copy pub-sub communication that keeps sensor data on the GPU until model inference completes (see the CUDA IPC sketch after this list).
• GPU-based camera and lidar data processing (see the preprocessing kernel after this list).
• Mixed-precision model inference that fully utilizes NVIDIA Ampere GPUs (see the TensorRT-style sketch after this list).
• Built-in performance benchmarking and tracing that can run on every trip (see the tracing sketch after this list).
• An overview of the fundamental NVIDIA technologies in Pony's next-generation L4 system.
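
The session does not publish implementation details for the first bullet; as a minimal sketch of the GPU-to-GPU half of that story, the CUDA runtime's peer-to-peer APIs can move a buffer directly between two GPUs over NVLink or PCIe without staging through host memory. The function and parameter names here are illustrative, and the GPU-to-FPGA path (GPUDirect RDMA with a vendor driver) is intentionally out of scope.

    #include <cuda_runtime.h>

    // Copy a buffer directly from GPU 0 to GPU 1 over NVLink/PCIe peer-to-peer,
    // bypassing host memory. GPU-to-FPGA transfers would instead use GPUDirect
    // RDMA through a vendor driver, which is not shown in this sketch.
    void copy_gpu0_to_gpu1(const void* d_src_on_gpu0, void* d_dst_on_gpu1,
                           size_t bytes, cudaStream_t stream) {
      cudaSetDevice(0);
      cudaDeviceEnablePeerAccess(1, 0);   // enable once per device pair
      cudaMemcpyPeerAsync(d_dst_on_gpu1, 1, d_src_on_gpu0, 0, bytes, stream);
    }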
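For the zero-copy pub-sub bullet, one common pattern (an assumption here, not necessarily Pony's implementation) is to keep each sensor frame in device memory and publish only a small CUDA IPC handle over the message bus; the consumer process maps the same buffer and runs inference on it without any host copy. The publish/inference calls below are hypothetical placeholders.

    #include <cuda_runtime.h>

    // Producer: the capture pipeline writes the frame into device memory;
    // only the small IPC handle travels over the pub-sub transport.
    void publish_gpu_frame(size_t frameBytes) {
      void* d_frame = nullptr;
      cudaMalloc(&d_frame, frameBytes);        // GPU buffer holding the sensor frame
      cudaIpcMemHandle_t handle;
      cudaIpcGetMemHandle(&handle, d_frame);   // exportable handle to the device buffer
      // publish(&handle, sizeof(handle));     // hypothetical pub-sub call: send handle only
    }

    // Consumer (separate process): map the same device buffer and feed it to
    // the model directly, so the frame never leaves GPU memory.
    void consume_gpu_frame(cudaIpcMemHandle_t handle) {
      void* d_frame = nullptr;
      cudaIpcOpenMemHandle(&d_frame, handle, cudaIpcMemLazyEnablePeerAccess);
      // run_inference(d_frame);               // hypothetical: inference on GPU-resident data
      cudaIpcCloseMemHandle(d_frame);          // release the mapping when done
    }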
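The camera and lidar processing bullet covers many kernels; as one illustrative, non-Pony-specific example, the kernel below converts an interleaved 8-bit RGB camera frame into the normalized planar float tensor a network typically expects, entirely on the GPU.

    #include <cuda_runtime.h>
    #include <cstdint>

    // Illustrative preprocessing kernel: interleaved RGB8 -> normalized CHW float.
    __global__ void rgb8_to_chw_float(const uint8_t* rgb, float* chw,
                                      int width, int height) {
      int x = blockIdx.x * blockDim.x + threadIdx.x;
      int y = blockIdx.y * blockDim.y + threadIdx.y;
      if (x >= width || y >= height) return;

      int pixel = y * width + x;
      int plane = width * height;
      for (int c = 0; c < 3; ++c) {
        chw[c * plane + pixel] = rgb[pixel * 3 + c] / 255.0f;  // scale to [0, 1]
      }
    }

    // Example launch on a caller-provided stream:
    //   dim3 block(16, 16);
    //   dim3 grid((width + 15) / 16, (height + 15) / 16);
    //   rgb8_to_chw_float<<<grid, block, 0, stream>>>(d_rgb, d_chw, width, height);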
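For mixed-precision inference on Ampere, a minimal TensorRT-style sketch is shown below: it builds an engine with FP16 kernels enabled (INT8 would additionally require a calibrator). The model file name and the overall flow are assumptions for illustration, not the production pipeline.

    #include <NvInfer.h>
    #include <NvOnnxParser.h>
    #include <iostream>

    class Logger : public nvinfer1::ILogger {
      void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
      }
    } gLogger;

    int main() {
      auto builder = nvinfer1::createInferBuilder(gLogger);
      auto network = builder->createNetworkV2(
          1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH));
      auto parser = nvonnxparser::createParser(*network, gLogger);
      parser->parseFromFile("model.onnx",  // hypothetical model path
                            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING));

      auto config = builder->createBuilderConfig();
      config->setFlag(nvinfer1::BuilderFlag::kFP16);    // allow FP16 Tensor Core kernels
      // config->setFlag(nvinfer1::BuilderFlag::kINT8); // INT8 also needs a calibrator

      auto serialized = builder->buildSerializedNetwork(*network, *config);
      // ...deserialize with an IRuntime and run inference on GPU-resident sensor data...
      return serialized != nullptr ? 0 : 1;
    }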
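Finally, for built-in benchmarking and tracing, one common pattern (again an assumption, not the talk's implementation) combines CUDA events for per-stage GPU timing with NVTX ranges that appear as named spans in Nsight Systems traces. The stage name and callback below are placeholders.

    #include <cuda_runtime.h>
    #include <nvToolsExt.h>
    #include <cstdio>

    // Time one pipeline stage on the GPU and mark it for trace tools.
    float time_stage(const char* stage_name, cudaStream_t stream,
                     void (*run_stage)(cudaStream_t)) {
      cudaEvent_t start, stop;
      cudaEventCreate(&start);
      cudaEventCreate(&stop);

      nvtxRangePushA(stage_name);              // named range visible in the trace
      cudaEventRecord(start, stream);
      run_stage(stream);                       // enqueue the stage's kernels/copies
      cudaEventRecord(stop, stream);
      nvtxRangePop();

      cudaEventSynchronize(stop);
      float ms = 0.0f;
      cudaEventElapsedTime(&ms, start, stop);  // GPU time between the two events
      printf("%s: %.3f ms\n", stage_name, ms);

      cudaEventDestroy(start);
      cudaEventDestroy(stop);
      return ms;
    }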