Self-Driving Car’s Visual Technology Can Be Hacked, Making AutoPilot Systems Unsafe
While self-driving automobiles are an idea that excites many, the smart technology and artificial intelligence behind them have a long way to go before they can be considered foolproof.
CBS News reported last week that researchers at McAfee, working in conjunction with Tesla, were able to trick a test vehicle into speeding by placing electrical tape on a posted speed limit sign.
Technicians were able to change a 35 mile-per-hour speed limit sign to read 85 miles per hour by placing a strip of black electrical tape across the number three. This was enough to fool the car’s Mobileye camera system and cause the 2016 Model X to increase its speed. Newer Tesla models no longer use the Mobileye technology.
Jason Levine, director of the Center for Auto Safety, believes the study is useful because it shows that the so-called “Autopilot” name is misleading and that the feature can pose a danger to drivers, passengers, and others around the vehicle. “What this study demonstrates is how dangerous the feature can be for everyone on the road,” Levine said.
As recently as last year, both researchers and hackers were able to fool the cars using simple stickers. One researcher got a car to read a stop sign as a speed limit sign, while hackers used stickers placed on the road to steer the high-end automobile into the wrong lane.
According to a report in Science magazine, machine learning systems like those on board some Tesla vehicles can be tricked into misreading what they see, which could lead to an accident.