An independent non-profit organisation (NPO) has demonstrated that Tesla’s Autopilot system can be tricked into operating even when nobody is in the driver’s seat.
Consumer Reports (CR), the NPO in question, discovered how easy it is to trick the car’s self-driving system into driving without someone behind the wheel, through a test conducted on its own closed track. The test also uncovered several shortcomings in Tesla’s Autopilot system.
The test follows a recent fatal accident in which two people died while riding in a 2019 Tesla Model S. When the bodies were recovered from the wreck, authorities found that no one had been in the driver’s seat.
To determine whether a Tesla could be made to drive autonomously with no one in the driver’s seat, the NPO drove a Tesla Model Y on its closed test track. While the Model Y differs from the Model S involved in the accident, the two share the same Autopilot system. Jake Fisher, Consumer Reports’ senior director of auto testing, ran the test.
During the first portion of the test, Fisher engaged Autopilot while the car was moving at 15 mph, then set the speed dial to 0. This brought the car to a complete stop but did not disengage Autopilot. Fisher noted in the video the NPO recorded that the car disengages Autopilot if the driver takes their hands off the wheel while the car is driving itself, so he attached a weighted chain to the steering wheel to simulate the weight of a person’s hand. He then slid over to the passenger seat without opening any of the car’s doors or unbuckling the seat belt. Throughout all of this, Autopilot remained engaged.
Now in the front passenger seat, and with the weighted chain attached to the steering wheel, Fisher was able to set the car’s speed using the same speed dial as before. At that point, he noted, he was riding in a car being driven by Autopilot with no one in the driver’s seat. Although Autopilot followed the lane lines, it gave no warning that nobody was in the driver’s seat, holding the steering wheel, or looking at the road.
Fisher stated that it was frightening how easy it was to work around Tesla’s Autopilot safeguards, which the test proved insufficient. He added that the test made it clear the vehicle’s systems failed to ensure the driver was paying attention to the road, or even that someone was in the driver’s seat at all.
“Tesla is falling behind other automakers like General Motors and Ford that, on models with advanced driver-assist systems, use technology to make sure the driver is looking at the road,” Fisher said.
CR’s test made it evident that Tesla’s Autopilot sorely needs improvement. The NPO recommends that all vehicles with automated driving features or adaptive cruise control include systems to make sure drivers are present and looking at the road. CR cited General Motors’ Super Cruise, which uses an infrared camera to monitor the driver, as an example of such a system.
The NPO isn’t alone in this recommendation either; other safety advocates, such as the US Insurance Institute for Highway Safety, have called for the feature as well.
Fisher also mentioned that Autopilot is prone to making mistakes and will shut itself off when it encounters a situation it can’t handle. In that moment, if the driver can’t react quickly enough, a crash is the inevitable result.
CR listed possible improvements Tesla could make to Autopilot. Aside from adding a driver monitoring system, these include using a weight sensor in the driver’s seat to determine whether a human is seated, as well as a camera-based system that warns drivers in real time to look at the road and slows the car to a stop if repeated warnings are ignored.
Written by John Paul Joaquin