Arm Holdings on Tuesday pushed deeper into the automotive world with the Cortex-A65AE chip aimed at handling the streams of sensor data expected to help self-driving cars eventually navigate the roads.
The Arm Cortex-A65AE (Automotive Enhanced), expected to hit markets in 2020, is the latest addition to Arm’s Automotive Enhanced portfolio of IP, designed for more efficient processing of the multiple streams of sensor data being generated in next-generation vehicles. It does this by delivering multithreading capability combined with integrated safety through Arm's "Split-Lock" technology.
In September, Arm introduced its first automotive-oriented chip, the Cortex-A76AE. That chip was the first from Arm with a new safety feature called “Split-Lock.”
The idea is that when car designers want the chip to work its fastest, they can split up the processing “cores” on the chip and process data in multiple cores at once. But when designers want to maximize safety, they can “lock” cores together to perform the same operations simultaneously and double check the chip’s work, minimizing the chance of computing errors.
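Split-Lock is a hardware feature, but the split-versus-lock trade-off described above can be illustrated with a small conceptual sketch. This is not an Arm API; the `process`, `lockstep`, and `split` functions are hypothetical stand-ins for what the silicon does in each mode.

```python
# Conceptual sketch only (not an Arm API). In "lock" mode, two cores run
# the same computation and their outputs are compared; a mismatch signals
# a possible hardware fault. In "split" mode, the cores instead work on
# independent inputs for maximum throughput.

def process(frame):
    # Hypothetical stand-in for per-core sensor processing.
    return sum(frame)

def lockstep(frame):
    # Both "cores" compute the same result; disagreement flags an error.
    a, b = process(frame), process(frame)
    if a != b:
        raise RuntimeError("lock-step mismatch: possible transient fault")
    return a

def split(frames):
    # Each "core" handles its own input independently.
    return [process(f) for f in frames]

print(lockstep([1, 2, 3]))        # 6
print(split([[1, 2], [3, 4]]))    # [3, 7]
```

In real hardware the comparison happens in dedicated checker logic on every cycle rather than in software, but the sketch captures the design choice: duplicate work for safety, or independent work for speed.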
While the Cortex-A76AE was focused on applications where high single-threaded performance is needed, the Cortex-A65AE is focused on high-throughput applications. The difference is, in a sense, the difference between demanding single-threaded workloads and numerous, highly parallel multi-threaded workloads.
The Cortex-A65AE is Arm’s first multi-threaded CPU core, allowing two threads to be executed per core.
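Simultaneous multithreading is a hardware feature, but the throughput idea behind it can be sketched in software: two threads sharing one execution resource, each draining its own sensor stream. The `drain` function and the camera/radar data below are hypothetical illustrations, not anything from Arm.

```python
# Software analogy only: two worker threads share one executor (the "core"),
# each processing an independent sensor stream concurrently.
from concurrent.futures import ThreadPoolExecutor

def drain(stream):
    # Hypothetical stand-in for per-thread sensor processing.
    return sum(stream)

camera = [1, 2, 3]   # example camera samples
radar = [4, 5, 6]    # example radar samples

# Two threads per "core", mirroring the two-threads-per-core claim.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(drain, [camera, radar]))

print(results)  # [6, 15]
```

The benefit in hardware is the same one the analogy suggests: when one thread stalls waiting on data, the other can keep the core's execution units busy.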
Arm did not disclose performance figures during its presentation, beyond noting that the new CPU core is advertised as delivering 3.5x higher throughput than the prior-generation core in the same market segment – in this case, the Cortex-A53.
In order to achieve higher levels of autonomy, there will be a large increase in the number of sensors monitoring the surroundings of the vehicle, including cameras, LiDAR and radar, resulting in a significant increase in throughput and compute requirements to safely process this data. Multiple sensor inputs allow cars to view their environment, perceive what is happening, plan possible paths ahead, and deliver commands to actuators on the determined path.
With so much data being collected at different points of the vehicle, high data throughput capability is a key part of the heterogeneous processing mix required to enable ADAS and autonomous applications. It’s also critical that safety is at the heart of these systems. Arm claims that the Cortex-A65AE is ideal for managing the high throughput required to gather sensor data, and that it can be used in lock-step mode connected to accelerators, such as ML or computer vision, to help process the data efficiently. Most critical, though, is that it does this with a high level of safety capability.
Alongside the increase in sensor inputs, more autonomy and advancing driver aids will change the human automotive experience. As part of this transition, there will be many more screens in our cars - augmented reality head-up displays, alerts and improved maps. Sensors will not only be sensing out, but will be sensing in, monitoring drivers. They will be able to monitor eyelid movement to detect tiredness, as well as body temperature, vital signs and behavioural patterns to personalise the in-car experience. These capabilities require high throughput, ML processing and a lot of heterogeneous compute.
Each year, 5.3 trillion miles are driven in cars and light vehicles that depend on Arm processors. Looking ahead, Arm’s automotive roadmap includes the Hercules-AE, optimized for 7nm and due in 2019, as well as future Cortex-R solutions.
Besides the various processing and safety features, the new chip also has a pathway for fast connection to graphics processors, such as those supplied by Nvidia, which are being adopted by carmakers and which Arm’s chips would complement.
Arm’s chips would compete directly against those supplied by Intel’s Mobileye self-driving car unit.