Researchers were able to trick a Tesla Inc. vehicle into speeding by putting a strip of electrical tape over a speed limit sign, spotlighting the kinds of potential vulnerabilities facing automated driving systems.
Security researchers from the McAfee Advanced Threat Research team demonstrated a repeatable method by which a Tesla Model X and a Tesla Model S, both from 2016 and both equipped with Tesla hardware pack 1, were made to autonomously accelerate by 50 miles per hour.
The researchers placed the piece of tape horizontally across the middle of the “3” on a 35 mile-per-hour speed limit sign. The change caused the Tesla vehicle to read the limit as 85 miles per hour, and its cruise control system automatically accelerated, according to research released by McAfee on Wednesday.
The tests involved a 2016 Model S and Model X that used camera systems supplied by Mobileye Inc., now a unit of Intel Corp. Mobileye systems are used by several automakers, though Tesla stopped using them in 2016.
Tests on Mobileye’s latest camera system didn’t reveal the same vulnerability, and Tesla’s latest vehicles apparently don’t depend on traffic sign recognition, according to McAfee.
McAfee says the issue isn't a serious risk to motorists. No one was hurt in the tests, and the researcher behind the wheel was able to slow the car safely.
It should be noted that the Tesla vehicles did not rely entirely on data from these cameras, such as the reading of road speed signs, for autonomous driving features such as automatic cruise control.
However, by extending the central bar of the number three on a 35 mile-per-hour speed limit sign with just a 2-inch piece of tape, the researchers produced a repeatable misclassification in the Tesla Model X and Model S test vehicles. Both 2016 models were equipped with Speed Assist (SA) and Tesla Automatic Cruise Control (TACC), and both were susceptible to the model hacking exploit.
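The kind of misclassification described above can be sketched with a deliberately simplified toy model. The snippet below is an illustration only, not Mobileye's actual system (which uses a far more complex neural network): a nearest-template digit reader in which a simulated strip of tape, darkening just three of 35 pixels, is enough to flip a "3" into an "8".

```python
# Toy illustration of sign-tampering misclassification (NOT the real
# Mobileye model): a nearest-template digit reader over 5x7 bitmaps.

TEMPLATES = {
    "3": [
        "#####",
        "....#",
        "....#",
        ".####",
        "....#",
        "....#",
        "#####",
    ],
    "8": [
        "#####",
        "#...#",
        "#...#",
        "#####",
        "#...#",
        "#...#",
        "#####",
    ],
}

def hamming(a, b):
    """Count mismatched pixels between two same-sized bitmaps."""
    return sum(ca != cb for ra, rb in zip(a, b) for ca, cb in zip(ra, rb))

def classify(image):
    """Return the template label with the fewest mismatched pixels."""
    return min(TEMPLATES, key=lambda digit: hamming(image, TEMPLATES[digit]))

# Start from a clean "3", then simulate a short strip of tape that
# darkens three pixels along the left edge of the middle rows.
tampered = [list(row) for row in TEMPLATES["3"]]
for row in (2, 3, 4):
    tampered[row][0] = "#"
tampered = ["".join(row) for row in tampered]

print(classify(TEMPLATES["3"]))  # -> 3
print(classify(tampered))        # -> 8 (three altered pixels out of 35)
```

The point of the sketch is proportion: the tampered image is still much closer to a "3" to a human eye, yet a classifier keyed on raw pixel agreement now finds "8" the better match, loosely mirroring how a small piece of tape shifted the real system's reading from 35 to 85.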
McAfee researchers say the findings were disclosed to Tesla on September 27, 2019, and that Tesla acknowledged the research. Tesla has not, according to the McAfee report, "expressed any current plans to address the issue on the existing platform."
A Mobileye spokesperson says that "the modifications to the traffic signs introduced in this research can confuse a human eye and therefore we do not consider this an adversarial attack. Traffic sign fonts are determined by regulators, and so advanced driver assistance systems (ADAS) are primarily focused on other more challenging use cases, and this system in particular was designed to support human drivers – not autonomous driving. Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety."
The findings, from an 18-month research project that ended last year, illustrate a weakness of the machine learning systems used in automated driving, according to Steve Povolny, head of advanced threat research at McAfee.