New MIT study finds Tesla’s Autopilot unsafe after collecting 500,000 miles’ worth of data; the Full Self-Driving feature is not as safe as Tesla claims

Tech Startups

For those who have been following Elon Musk and Tesla on Twitter, almost no day goes by without a post or tweet about Tesla Full Self-Driving (FSD). FSD is a feature Tesla claims was “designed and engineered from the ground up to rapidly process neural networks — the foundation for how we train and develop Autopilot.”

Tesla further claims that “While [its] cars require active driver supervision and are not fully autonomous today, the FSD computer is capable of delivering intelligent performance and control to enable a new level of safety and autonomy, without impacting cost or range.”

But Tesla’s claims are now in doubt following a new study from MIT, which found that Tesla FSD is not as safe as the company claims. And maybe it’s time for Tesla to rephrase its FSD safety claims before the floodgates of class-action lawsuits open.

A new study out of MIT “confirmed how unsafe” Tesla’s Autopilot feature really is. As part of the study, MIT collected nearly 500,000 miles’ worth of data. The study, published in an Elsevier journal, reveals that Tesla’s Full Self-Driving (FSD) is not as safe as the company claims.

As part of the study, researchers followed Tesla Model S and X owners “driving their vehicles in their daily routine for periods of a year or more” and found that drivers become inattentive when using partially automated driving systems.

According to the study report, “most of the trips were recorded in the greater Boston area, but many others are from locations in New England and outside.” The data were drawn from the ongoing MIT Advanced Vehicle Technology (MIT-AVT) naturalistic driving study.

The study concluded with the following:

“Visual behavior patterns change before and after [Autopilot] disengagement. Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.”

Meanwhile, this week Tesla pressed ahead despite the ongoing scrutiny and rolled out its Full Self-Driving (FSD) Beta 10.0.1. In a tweet, Tesla CEO Elon Musk said: “Good feedback from FSD Beta 10 users! 10.0.1 point release rolling out now. 10.1 rolls out a week from Friday with beta request button.” However, there is no assurance that the FSD Beta 10.0.1 point release will address the safety issues raised in the MIT study.


Below is the abstract of the study.

Objective
We present a model for visual behavior that can simulate the glance pattern observed around driver-initiated, non-critical disengagements of Tesla’s Autopilot (AP) in naturalistic highway driving.

Background
Drivers may become inattentive when using partially-automated driving systems. The safety effects associated with inattention are unknown until we have a quantitative reference on how visual behavior changes with automation.

Methods
The model is based on glance data from 290 human-initiated AP disengagement epochs. Glance duration and transition were modelled with Bayesian Generalized Linear Mixed models.

Results
The model replicates the observed glance pattern across drivers. The model’s components show that off-road glances were longer with AP active than without and that their frequency characteristics changed. Driving-related off-road glances were less frequent with AP active than in manual driving, while non-driving related glances to the down/center-stack areas were the most frequent and the longest (22% of the glances exceeded 2 s). Little difference was found in on-road glance duration.

Conclusion
Visual behavior patterns change before and after AP disengagement. Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.

Application
The model can be used as a reference for safety assessment or to formulate design targets for driver management systems.
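The Methods section above models glance durations with Bayesian Generalized Linear Mixed models. As a rough intuition pump — not the authors’ actual model — the toy simulation below shows how a finding like “off-road glances were longer with AP active” would surface in glance-duration data. The lognormal distribution and every parameter value here are invented assumptions for illustration only:

```python
# Toy illustration (NOT the MIT study's model): simulate off-road glance
# durations as lognormal draws and compare AP-active vs. manual driving.
# All distribution choices and parameters are assumptions for illustration.
import random

random.seed(42)

def simulate_glances(mu, sigma=0.5, n=1000):
    """Draw n glance durations (seconds) from a lognormal distribution."""
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

# Assumption: longer off-road glances with Autopilot active
# (mu values chosen arbitrarily to encode that assumption).
ap_active = simulate_glances(mu=0.2)
manual = simulate_glances(mu=-0.2)

mean_ap = sum(ap_active) / len(ap_active)
mean_manual = sum(manual) / len(manual)
# Share of AP-active glances exceeding 2 s, echoing the kind of
# threshold statistic the abstract reports (22% of glances > 2 s).
share_over_2s = sum(d > 2.0 for d in ap_active) / len(ap_active)

print(f"mean off-road glance, AP active: {mean_ap:.2f} s")
print(f"mean off-road glance, manual:    {mean_manual:.2f} s")
print(f"share of AP-active glances > 2 s: {share_over_2s:.0%}")
```

In the actual study, the hierarchical (mixed) structure lets glance behavior vary by driver; this sketch collapses that to a single population per condition to keep the idea visible.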
