Tesla News: Autopilot Crash Result Of Sabotage?
As it turns out, Tesla's Autopilot system is not hack-proof: it is possible to manipulate the system so that it fails to protect the driver and passengers in the event of a collision.
A group of researchers from the University of South Carolina, China's Zhejiang University and Chinese security firm Qihoo 360 managed to prove that sabotage is indeed possible when it comes to Tesla's Autopilot program. Using a collection of off-the-shelf tools, they made the system think there was no object in the car's path when there was, and vice versa.
The trick lies in deceiving Tesla's own sensors. According to a report from Wired, the researchers used sound-, radio- and light-emitting tools to make a Tesla car think there was an object ahead when there was none. Conversely, they got the system to believe there was no object blocking its path when there was; radio interference made this possible.
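The jam-and-spoof idea can be sketched with a toy model. To be clear, this is purely illustrative: the function, thresholds and signal shapes below are invented assumptions, not Tesla's actual sensor code. The sketch shows why flooding a sensor's band with noise can "hide" a real obstacle, while injecting a fake early pulse can conjure a phantom one.

```python
# Toy model of an echo-ranging sensor (e.g., ultrasonic parking sensor).
# It reports distance to the first echo that rises clearly above the
# ambient noise floor. All names and numbers here are illustrative.

SPEED_OF_SOUND = 343.0  # m/s

def detect_obstacle(echo_samples, sample_rate_hz, snr_margin=0.3):
    """Distance (m) to the first echo exceeding the noise floor by
    snr_margin, or None when no echo stands out (nothing detected)."""
    noise_floor = sorted(echo_samples)[len(echo_samples) // 2]  # median
    for i, amplitude in enumerate(echo_samples):
        if amplitude > noise_floor + snr_margin:
            round_trip_s = i / sample_rate_hz
            return round_trip_s * SPEED_OF_SOUND / 2
    return None

rate = 10_000        # samples per second
quiet = [0.0] * 200  # baseline: no obstacle, no attack

# Genuine obstacle: a real echo at sample 100 (10 ms round trip).
real = quiet[:]
real[100] = 0.9
print(detect_obstacle(real, rate))    # 1.715 -> obstacle correctly seen

# Jamming: broadband noise raises the floor, so the same real echo no
# longer stands out and the obstacle "disappears".
jammed = [0.8] * 200
jammed[100] = 0.9
print(detect_obstacle(jammed, rate))  # None -> obstacle hidden

# Spoofing: an attacker-injected early pulse reads as a phantom
# obstacle far closer than anything real.
spoofed = quiet[:]
spoofed[10] = 0.9
print(detect_obstacle(spoofed, rate))  # ~0.17 m phantom obstacle
```

The real attacks are of course more involved, and the Tesla's radar and cameras use different physics, but the principle the researchers demonstrated is the same: corrupt the raw signal and the perception layer draws the wrong conclusion.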
Also read: Tesla Announces Autopilot Crash Fatality
Following the death of ex-U.S. Navy SEAL Joshua Brown while he was behind the wheel of his Tesla Model S on Autopilot, Tesla indicated that the sensors may have failed to detect the tractor trailer that Brown's car collided with. As Morning News USA previously reported, Tesla said the sensors failed in this instance because they could not detect the trailer against a "brightly lit sky."
Nonetheless, the researchers want to assure car owners that hacking a Tesla is neither easy nor cheap. For one thing, their experiment required a signal generator from Keysight Technologies that alone cost $90,000. On top of that, the researchers spent hundreds of dollars more on a VDI frequency multiplier.
Tesla has yet to comment on the researchers' hack of its Autopilot system.