For every leap forward in self-driving car technology, there seems to be one step back. The latest hack may be the simplest yet, but it could have dire consequences.
Researchers at the University of Washington have found that simple stickers are enough to confuse a self-driving car. In this case, the researchers applied stickers generated by an attack algorithm to a standard-issue stop sign. Autonomous cars rely on cameras feeding an object detector and a classifier. If attackers know how the classifier works, they can craft a perturbation that makes it see a different sign entirely. Here, the self-driving car interpreted a stop sign as a 45 mph speed limit sign.
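The core idea can be sketched in a few lines. This is a toy, hypothetical model (a linear classifier, not the researchers' actual vision system), but it shows the gradient-sign trick behind such attacks: nudge each input feature in the direction that most increases the wrong class's score.

```python
import numpy as np

def predict(w, b, x):
    """Return 1 ('speed limit') if the linear score is positive, else 0 ('stop')."""
    return int(w @ x + b > 0)

# Toy weights and an input the model correctly classifies as a stop sign.
# All values here are illustrative, not from the actual research.
w = np.array([1.0, -2.0, 0.5])
b = -0.1
x = np.array([0.2, 0.9, 0.3])

# For a linear model, the gradient of the score w.r.t. the input is just w.
# Stepping each feature in the sign of that gradient pushes the score positive.
eps = 0.8
x_adv = x + eps * np.sign(w)

print(predict(w, b, x))      # 0 -> read as 'stop'
print(predict(w, b, x_adv))  # 1 -> misread as 'speed limit'
```

Real attacks on image classifiers work on thousands of pixels instead of three numbers, and constrain the perturbation to printable sticker shapes, but the principle is the same: small, deliberate changes that exploit the model's own gradients.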
Furthermore, even small changes to road signs can confuse the classifier and create dangerous consequences as the software works to identify a road marking. The stickers, printed on a standard-issue printer, encode the attack algorithm's output, and the car was unable to correctly read the signage from various angles and distances. It's the first time such a hack has been carried out successfully from up to 40 feet away.
Beyond stickers, the researchers also created a full overlay for a stop sign. In this case, the sign merely looks worn or splotchy to the human eye, but the self-driving car once again interpreted it as a 45 mph speed limit sign. Scary stuff.
The consequences are very real. In a self-driving, connected-car world, a vehicle could blow through a stop sign after interpreting it as a speed limit sign. Or, conversely, the car could read a speed limit sign as a stop sign and slam on the brakes at speed.
It's discoveries like this that should keep automakers and developers wary and vigilant, ensuring the technology is absolutely ready before individuals resort to malicious attacks.