Autonomous vehicles will represent the key innovation in the automotive industry for years to come. Built with driver assistance (DA) and autonomous driving (AD) technologies, they will impart a superior ability to perceive what is happening in the vehicle's vicinity under diverse driving conditions. The driving experience will be more comfortable and enriched, as drivers will have access to quality information about the driving route. Driver-assistance systems use multiple LiDAR, camera, and RADAR sensors; this redundancy helps reject false positives and enhances safety.
Introduction to LiDAR
Light Detection and Ranging (LiDAR) is an optical remote-sensing method that measures the distance to the surface of a distant object, along with its exact position and size. It uses the RADAR principle with one crucial difference: in lieu of radio waves, pulses of laser light scan the environment. Because LiDAR operates at much shorter wavelengths than RADAR, it can detect tiny objects. This high resolution produces a detailed map of the vehicle's vicinity and reproduces an object's shape as an exact monochromatic 3D image.
As AI processors enter the autonomous-vehicle market, Field Programmable Gate Arrays (FPGAs) claim a substantial share of the ADAS value chain through deft processing of highly advanced imaging radar. FPGAs play an essential role in processing sensor data at the edge: since LiDAR and RADAR generate immense amounts of data, it makes sense to process that data in the sensor module itself. Moreover, FPGAs let designers implement proprietary instruction sets on a compute-efficient platform.
LiDAR Use in Autonomous Vehicles
LiDAR is a guidance system customized for autonomous vehicles. The scanner's speed and accuracy enable data to be funneled into a system and processed in near real time. LiDAR effectively functions as the eyes of a self-driving vehicle, offering a 360-degree view of the surroundings, detecting obstacles, and allowing the vehicle to update its route rapidly.
Time-sensitive airborne missions require brisk onboard LiDAR data processing. Algorithms employed for autonomous driving must run in real time and need minimal processing latency. This is difficult to achieve with standard embedded CPU solutions. GPU devices are often employed for parallel processing, but they consume excess power, a critical factor in this context. Autonomous vehicles need both real-time processing speed and low power consumption.
FPGAs are low-power devices that can be configured as customized integrated circuits. They execute massive multi-level parallel processing along with data communication on chip. FPGA platforms satisfy both the computational requirements and the power-consumption constraints; in addition, they offer scalability, portability, adaptability, and modularity in design.
Figure 1: Block diagram of a LiDAR system
The LiDAR system sends thousands of laser pulses every second. The pulses are projected downward from an airborne platform (Figure 1). These beams strike nearby objects and are reflected back. The scanner sweeps the beam across the target area, measuring anywhere between 20,000 and 150,000 points per second. This data is subsequently processed: each time-interval measurement (from the moment a pulse is sent to the moment its return is received) is converted to a distance.
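That conversion is simple time-of-flight arithmetic: light travels to the target and back, so the one-way range is half the round-trip time multiplied by the speed of light. A minimal sketch (the 667 ns example value is hypothetical):

```python
# Convert a LiDAR pulse's round-trip time to distance (time-of-flight).
# The light travels to the target and back, so divide by two.

C = 299_792_458.0  # speed of light, m/s

def tof_to_range_m(round_trip_s: float) -> float:
    """Range in meters from a pulse's round-trip travel time in seconds."""
    return C * round_trip_s / 2.0

# Example: a return received 667 ns after emission is ~100 m away.
print(f"{tof_to_range_m(667e-9):.1f} m")  # -> 100.0 m
```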
The LiDAR sensor collates considerable amounts of data. A single survey may quickly generate billions of points, amounting to several terabytes. The onboard computer records the reflection point of every laser pulse and translates this rapidly updating point cloud into an animated 3D representation.
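Conceptually, each recorded reflection combines the measured range with the scanner's beam angles to yield one 3D point. A simplified sketch (real sensors apply per-beam calibration on top of this geometry):

```python
import math

def scan_point_to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LiDAR return (range + beam angles) to Cartesian x, y, z."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# A 100 m return, 30 degrees left of center, 2 degrees below horizontal.
print(scan_point_to_xyz(100.0, 30.0, -2.0))
```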
The 3D image is created from the speed of light and the measured distances. This calculation reveals the vehicle's position, the observable surroundings, and the distance to objects. The system identifies obstructions, illuminates them with the laser, and generates a high-resolution digital image. LiDAR enables collision avoidance by measuring the gap between the vehicle and any vehicle ahead of it. The LiDAR unit is affixed to the vehicle's roof or bumper. The system monitors this gap and instructs the brakes to decelerate or stop the vehicle; conversely, the car speeds up when the road ahead is clear.
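A minimal decision sketch of that braking-and-acceleration behavior (the thresholds and "two-second rule" heuristic are illustrative, not production values):

```python
def throttle_brake_decision(gap_m: float, speed_mps: float) -> str:
    """Toy following-distance logic: brake hard, decelerate, hold, or speed up."""
    safe_gap = 2.0 * speed_mps   # crude "two-second rule" following distance
    if gap_m < 0.5 * safe_gap:
        return "brake hard"
    if gap_m < safe_gap:
        return "decelerate"
    if gap_m > 1.5 * safe_gap:
        return "speed up"
    return "hold speed"

# At 25 m/s (90 km/h), a 35 m gap is inside the safe distance.
print(throttle_brake_decision(gap_m=35.0, speed_mps=25.0))  # -> "decelerate"
```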
LiDAR scanners generally store acquired data in vendor-specific binary formats. These proprietary formats vary between vendors and are thus unsuitable for data exchange and processing. The American Society for Photogrammetry and Remote Sensing (ASPRS) solved the interoperability problem between LiDAR vendors and users by establishing a standard: the LAS format, a simple binary exchange file format.
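Because LAS is an open standard, files can be inspected with off-the-shelf readers. A sketch using the laspy library (assuming laspy 2.x; "points.las" is a hypothetical path):

```python
import laspy  # pip install laspy

# Open a LAS point cloud and inspect its header and first point.
las = laspy.read("points.las")

print(las.header.version)            # LAS specification version of the file
print(las.header.point_count)        # number of point records
print(las.x[0], las.y[0], las.z[0])  # scaled coordinates of the first point
```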
Figure 2: Driver Assistance System Functional Diagram
FPGAs are ideal for managing automated driving and complex ADASs. Figure 2 shows the driver assistance system, which comprises the sensing, decision-making, and environmental-characterization sections. The standard sensors used include cameras and RADAR/LiDAR. Camera interface processing is performed at the pixel level to ensure superior image quality.
LiDAR FPGA Solutions
Xilinx's FPGAs and heterogeneous SoCs in ADAS systems are instrumental in processing complex sensory data extracted from multiple sources, spanning automotive imaging, computer vision, LiDAR, RADAR, in-car networking, and image processing.
LiDAR is essential to developing a Level 4 / Level 5 autonomous car. Because LiDAR scanners are high resolution, they output enormous quantities of data, amounting to billions of points. The challenges of data exchange, storage, and distribution are reduced by compressing the data with software or hardware techniques; fully parallel hardware compression is approximately 250 times faster than software processing. It follows that most LiDAR systems use intelligent FPGAs to meet these unique processing needs.
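To give a flavor of why point clouds compress well: neighboring returns tend to be numerically close, so even a simple delta encoding shrinks the symbol range dramatically before entropy coding. A toy sketch of that idea (not the hardware pipeline referenced above):

```python
def delta_encode(values):
    """Store the first value, then only each difference from the previous one."""
    out = [values[0]]
    for prev, cur in zip(values, values[1:]):
        out.append(cur - prev)
    return out

def delta_decode(deltas):
    """Invert delta_encode by running-summing the differences."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

coords = [100250, 100252, 100251, 100260]  # e.g., scaled x-coordinates
assert delta_decode(delta_encode(coords)) == coords
print(delta_encode(coords))  # -> [100250, 2, -1, 9]
```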
Figure 3 illustrates the new world of mobility. Several vehicle safety sensors output large amounts of data, and all of them must be fused into one data “pipe” for subsequent processing. For example, the Xilinx Automotive Grade Zynq UltraScale+ MPSoC's Data Aggregation, Pre-processing, and Distribution (DAPD) capability helps improve AI processing by fusing sensor data, effectively prepping it for the performance modules, depending on the system architecture's needs.
Figure 3: Building blocks for Automated Driving
Compute acceleration refers to the computation performed on pre-processed sensor data. It determines vehicle behavior and constitutes a primary AD operation. The results are subsequently sent to the safety-processing elements to achieve vehicle control. Xilinx's programmable logic fabric provides the compute-architecture flexibility to execute such critical tasks. A high-speed LiDAR needs multiple gigasamples per second of laser-pulse return data to be processed. Highly integrated LiDAR systems can be implemented with an RFSoC; this would include on-chip ADCs to capture the analog LiDAR pulse-return signals, programmable logic to accelerate pulse processing, and a complete SoC processor to tie it all together.
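The pulse processing in question boils down to finding return echoes in a sampled waveform. A minimal software model of threshold-based peak detection (the sample rate and waveform values are made up for illustration):

```python
C = 299_792_458.0   # speed of light, m/s
SAMPLE_RATE = 1e9   # 1 GS/s ADC (illustrative)

def detect_returns(samples, threshold):
    """Return (sample_index, range_m) for each local peak above threshold."""
    hits = []
    for i in range(1, len(samples) - 1):
        if samples[i] >= threshold and samples[i - 1] < samples[i] >= samples[i + 1]:
            t = i / SAMPLE_RATE          # round-trip time of this echo
            hits.append((i, C * t / 2))  # time-of-flight to range
    return hits

# A synthetic waveform with two echoes, at samples 100 and 400.
wave = [0] * 1024
wave[100], wave[400] = 80, 35
print(detect_returns(wave, threshold=20))  # -> [(100, ~15 m), (400, ~60 m)]
```

In programmable logic, this same comparison runs on many samples in parallel every clock cycle, which is what makes FPGAs a good fit for gigasample-rate return processing.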
For developers, the ADAS future lies in embedded vision, automotive systems infrastructure, automotive human-machine interface (HMI), sensors, and connectivity design. Xilinx solutions bring an integrated All Programmable system-on-a-chip (SoC) architecture and support fully programmable hardware, I/O, and software for a highly flexible platform. Examples include products such as the Zynq UltraScale+ MPSoC, XA Zynq-7000, and Zynq UltraScale+ RFSoC:
- The Zynq UltraScale+ MPSoC family fulfills AEC-Q100 test specifications and carries complete ISO 26262 ASIL-C level certification. This product integrates a richly featured 64-bit quad-core Arm Cortex-A53, a dual-core Arm Cortex-R5, and Xilinx UltraScale architecture programmable logic (PL) within a single device.
- XA Zynq-7000 All Programmable SoCs combine single-chip design flexibility with a dual-core Arm Cortex-A9 processor, flexible programmable logic, and high-speed programmable I/O. The XA devices are automotive-qualified, with extended Q-grade temperature ranges. They enable end-product differentiation with full IP control and help system designers stay abreast of ever-changing feature needs.
- The Zynq UltraScale+ RFSoC family integrates key subsystems for multiband, cable infrastructure (DOCSIS), and multi-mode cellular radios in an SoC platform that couples a feature-rich dual-core Arm Cortex-R5 and 64-bit quad-core Arm Cortex-A53 processing system with UltraScale architecture programmable logic, soft-decision FECs, and RF-ADCs. With its integrated data converters, coupled with the parallelism and flexibility of adaptable hardware, this family offers a unique solution for nascent LiDAR technologies, from ADAS to innovative 3D imaging applications.