Tesla says it has technology to detect if a driver is behind the wheel, but testing has shown it’s not failsafe – or as advanced as driver-monitoring cameras on certain Subaru, Ford and General Motors cars.
Tesla boss Elon Musk has broken his silence over a deadly Tesla crash in Texas, in which two occupants were killed; it appeared one of them had been attempting to operate the vehicle on Autopilot after vacating the driver’s seat.
And respected consumer advocacy group Consumer Reports has disputed claims by Tesla that its Autopilot function does not work without someone in the driver’s seat – discovering that the system can be “tricked”.
Tesla says its vehicles have sensors to ensure Autopilot can only operate when there is a person behind the wheel.
However, Consumer Reports discovered Tesla’s systems were not failsafe – and not as advanced as those of other manufacturers, which use seat sensors and tiny cameras that monitor a driver’s eye movement, such as the systems recently introduced on certain Subaru, Ford and General Motors cars.
Instead, Tesla uses sensors in the steering wheel and seatbelt buckle to detect if there is a person behind the wheel.
In a test on a private track, Consumer Reports assessed the effectiveness of Tesla’s driver attention system and was able to trick the technology into thinking there was a driver sitting in the seat.
Although Consumer Reports tested a newer Tesla Model Y rather than the older Tesla Model S involved in the crash, both cars were equipped with Autopilot.
“Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot … without anyone in the driver’s seat – a scenario that would present extreme danger if it were repeated on public roads,” the US consumer advocacy group reported overnight.
“Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty.”
The test used a weighted chain on the steering wheel to simulate the pressure of a driver’s hands, and the seatbelt was clipped in underneath the driver, enabling the tester to move to the front passenger seat. A video of the test is embedded in the Consumer Reports article.
“The system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” says Jake Fisher, the senior director of vehicle testing for Consumer Reports, who conducted the experiment.
“Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”
Last weekend, a 2019 Tesla Model S crashed into a tree on the outskirts of Houston, Texas, killing both occupants. Police said they believe one occupant was in the front passenger seat and the other was sitting in the back seat at the time of the crash.
The outspoken boss of the electric-car maker, Elon Musk, said on Twitter: “Data logs recovered so far show Autopilot was not enabled and this car did not purchase FSD (full self driving). Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”
The deadly crash has reignited concerns among safety authorities and car makers that some customers are placing too much faith in the technology, when in fact drivers should always remain in control of their vehicle, even when it is equipped with driver assistance systems.
Consumer Reports said: “Our evaluation does not provide specific insight into the Texas crash, but safety advocates and researchers at CR say that it does show that driver monitoring systems need to work harder to keep drivers from using the systems in foreseeably dangerous ways.”
Safety advocates – including the USA’s Insurance Institute for Highway Safety – recommend all vehicles that incorporate steering automation and radar cruise control “also include systems to make sure drivers are present and looking at the road”, such as dashboard cameras that monitor the driver’s eye movement, Consumer Reports said.
As for the test it conducted, Consumer Reports said “under no circumstance should anyone try this”.
“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger,” said Consumer Reports car expert Jake Fisher.
“We were able to perform this experiment because we have a private test track. We also had safety crews standing by, and at no time did the vehicle exceed 30mph (50km/h).”