Dawn Project shows latest version of Tesla Full Self-Driving still plagued by safety defects

SANTA BARBARA, Calif., June 17, 2024 (GLOBE NEWSWIRE) -- Recent safety tests conducted by public safety advocacy group The Dawn Project have demonstrated that Tesla Full Self-Driving is far less safe than an average human driver and still fails to recognize a child crossing in a school crosswalk. This is despite Elon Musk and Tesla fans claiming for years that Tesla’s self-driving software is much safer than a human driver.

The Dawn Project’s tests have also shown that FSD cannot complete a U-turn, leading to unpredictable and dangerous situations for those in the vehicle. These tests have since been validated by some of Tesla’s most devoted fans, even as Musk proclaims that the data proves Supervised Full Self-Driving is safer than a human driver.

The Dawn Project has disproved a number of claims made by Elon Musk and his fans, including Omar Qazi, one of the most prolific and enthusiastic posters about Elon Musk, Tesla, and Musk’s other companies. Qazi often reports on upcoming products, features, release dates, and announcements before the press. He regularly hosts Spaces discussions with other Musk fans in which he extols the wonders of Tesla and Full Self-Driving and boasts about using FSD nearly 100% of the time. He was one of the first FSD Beta testers and has published thousands of hours of footage of FSD driving his Tesla around San Francisco and Los Angeles. He recently said that FSD is “doing all of my driving.”

The Dawn Project’s tests were further validated when Elon Musk announced on August 11, 2023 that Tesla was rewriting the software he had previously lauded, and when Qazi more recently acknowledged that the previous version of the software had been unsafe and should be pulled off the roads, despite having previously praised that very same software.

As The Dawn Project has demonstrated, Tesla’s defective FSD software is still not safe for public roads. This further highlights the danger of Tesla’s decision to release previous versions, now acknowledged to be even more dangerous than the current one, to untrained consumers on public roads. The Dawn Project’s tests also highlight the contradiction between Tesla enthusiasts’ promotion of FSD and the unsafe and reckless driving exhibited by the system.

Despite protests that FSD should be taken off the market, and despite The Dawn Project’s attempts to inform Tesla and the public of these dangers, Tesla has continued to sell defective self-driving software to safety-conscious consumers, telling them it would make their families four times safer.

Even Tesla executives have expressed doubts. Rohan Patel, Tesla’s former VP of Policy and Business Development, was a vocal supporter of FSD who promoted it to potential customers and regulators as a safety feature that would make their families safer; yet he admitted that even he wasn’t comfortable with his own family using v11.

Research from The Dawn Project has further shown that hundreds of thousands of people who paid up to $15,000 for FSD use it only 15% of the time, indicating that these users do not agree that FSD is safer than driving manually. The danger of Tesla’s self-driving software is also borne out by the experiences of customers such as Steve Wozniak, who reported that Tesla’s self-driving software phantom-braked “a hundred times” and “lurched towards” a semi. Wozniak asserted that his self-driving Tesla tried to kill him at every chance it could.

New data from NHTSA indicates that Tesla failed to count 82% of police-reported crashes involving its self-driving software, because it counted only crashes in which airbags were deployed. The real number of self-driving Tesla crashes is therefore more than five and a half times higher than Tesla reported in its Vehicle Safety Report.
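As a rough sanity check on that figure, assuming the 82% undercount applies uniformly so that Tesla’s airbag-deployment criterion captures only 18% of crashes, the arithmetic works out as follows:

    actual crashes ≈ reported crashes / (1 − 0.82)
                   = reported crashes / 0.18
                   ≈ 5.6 × reported crashes

which is consistent with the “more than five and a half times” estimate above.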

Despite NHTSA pointing out that Tesla fails to account for these accidents, Tesla continues to use this discredited data to deceive the public into believing that its self-driving technology is safer than driving manually. Tesla is also under investigation by NHTSA over its self-driving software's involvement in numerous crashes with emergency vehicles.

When all of the Tesla crashes are counted, the data is unequivocal that supervised Full Self-Driving is far less safe than a human driving by themselves.

NHTSA’s explanation of Tesla’s data corroborates The Dawn Project’s research and analysis, which has found that engaging Tesla’s Full Self-Driving software is much less safe than a human driving by themselves.

The Dawn Project has proven that Tesla Full Self-Driving cannot pass a DMV driver’s test, meaning that it is a worse driver than a licensed 16-year-old. The Dawn Project has further demonstrated that FSD will still blow past a stopped school bus with its red lights flashing and stop sign extended and run down a child in a school crosswalk, and that FSD does not understand Do Not Enter and Road Closed signs. Sometimes it inexplicably slams on the brakes for no reason. Tesla itself warns that FSD “can suddenly swerve even when driving conditions appear normal and straight-forward.”

Tesla claims that its self-driving software does not make the vehicle autonomous and that the driver must be ready to take over immediately to correct FSD’s mistakes. However, research has shown that the only way to ensure a driver is paying attention is to implement an effective driver monitoring system. The Dawn Project’s tests have shown that Tesla’s driver monitoring system is entirely unfit for purpose, endangering road users by failing to ensure that the driver is paying attention while operating Tesla’s Full Self-Driving software. NHTSA recently confirmed The Dawn Project’s findings, describing Tesla’s driver monitoring system as weak in its most recent recall.

NHTSA and the National Transportation Safety Board have both rebuked Tesla for failing to disable Autopilot on roads where Tesla knows it isn’t safe. To this day, Tesla continues to ignore these safety recommendations.

One of the arguments most commonly used by FSD fans to justify allowing defective self-driving software on public roads is that it will save millions of lives in the future, so it doesn’t matter if it kills a few people today. In reality, The Dawn Project has demonstrated that FSD continues to endanger lives on the road, rather than improving at a rate that would ever bring it to a level where it could save lives.

Consumers, regulators, and law enforcement have been misled about the safety of FSD, while FSD supporters continue to promote the latest version of the software with the same enthusiasm and praise they gave the previous versions that they now admit were unsafe. It is only a matter of time before the next version of FSD is released and its supporters admit that v12 is also unsafe and should be prohibited on public roads.

As The Dawn Project’s recent testing has demonstrated, FSD must be taken off public roads until it is proven safe by independent third parties, rather than by biased Tesla enthusiasts. The Dawn Project has demonstrated that Tesla cannot be trusted to self-certify the software or the safety claims it makes about it.

Source: Dan O’Dowd, Founder of The Dawn Project
