A full-page ad in Sunday’s New York Times took aim at Tesla’s “Full Self-Driving” software, calling it “the worst software ever sold by a Fortune 500 company” and offering $10,000, the same price as the software itself, to the first person who could name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”
The ad was taken out by The Dawn Project, a recently founded organization that aims to ban unsafe software from safety-critical systems that can be targeted by military-style hackers, as part of a campaign to remove Tesla Full Self-Driving (FSD) from public roads until it has “1,000 times fewer critical malfunctions.”
The advocacy group’s founder, Dan O’Dowd, is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety and security systems. At CES, the company said BMW’s iX vehicle is using its real-time operating system and other safety software, and it also announced the availability of its new over-the-air software product and data services for automotive electronic systems.
Despite the potential competitive bias of The Dawn Project’s founder, Tesla’s FSD beta software, an advanced driver assistance system that Tesla owners can access to handle some driving functions on city streets, has come under scrutiny in recent months after a series of YouTube videos showing glitches in the system went viral.
The NYT ad comes just days after the California Department of Motor Vehicles told Tesla it would be “revisiting” its opinion that the company’s test program, which uses consumers rather than professional safety operators, does not fall under the department’s autonomous vehicle regulations. The California DMV regulates self-driving testing in the state and requires other companies, such as Waymo and Cruise, that are developing, testing and planning to deploy robotaxis to report system failures called “disengagements.” Tesla has never issued these reports.
Tesla CEO Elon Musk has responded vaguely on Twitter, claiming that Tesla’s FSD has not resulted in an accident or injury since its launch. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating a report from the owner of a Tesla Model Y, who said his vehicle entered the wrong lane while making a left turn in FSD mode, resulting in the vehicle being struck by another driver.
Even if that was the first FSD accident, Tesla’s Autopilot, the automaker’s ADAS that comes standard on its vehicles, has been involved in around a dozen crashes.
Alongside the NYT ad, The Dawn Project published a fact check of its claims, referring to its own FSD safety analysis, which studied data from 21 YouTube videos totaling seven hours of driving.
The videos analyzed included beta versions 8 (released in December 2020) and 10 (released in September 2021), and the study avoided videos with significantly positive or negative titles to reduce bias. Each video was rated according to the California DMV Driving Performance Evaluation, which is what human drivers must pass to obtain a driver’s license. To pass the driving test, drivers in California must commit 15 or fewer scoring maneuver errors, such as failing to use turn signals when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, such as crashing or running a red light.
The study found that FSD v10 made 16 scoring maneuver errors on average in under an hour and a critical driving error about every 8 minutes. Errors did improve over the nine months between v8 and v10, the analysis found, but at the current rate of improvement, “it will take another 7.8 years (per AAA data) to 8.8 years (per Bureau of Transportation data) to achieve the accident rate of a human driver.”
The Dawn Project’s ad makes some bold claims that should be taken with a grain of salt, particularly because the sample size is far too small to be taken seriously from a statistical standpoint. If, however, the seven hours of footage is indeed representative of an average FSD drive, the findings could be indicative of a larger problem with Tesla’s FSD software and speak to the broader question of whether Tesla should be allowed to test this software on public roads without regulation.
“We didn’t sign up for our families to be crash test dummies for thousands of Tesla cars being driven on public roads…” reads the ad.
Federal regulators have started taking some action against Tesla and its Autopilot and FSD beta software systems.
In October, NHTSA sent two letters to the automaker targeting its use of non-disclosure agreements for owners who gain early access to the FSD beta, as well as the company’s decision to use over-the-air software updates to fix an issue in the standard Autopilot system that should have been a recall. In addition, Consumer Reports issued a statement over the summer saying the FSD version 9 software update did not appear to be safe enough for public roads and that it would test the software independently.
Since then, Tesla has rolled out many different versions of its v10 software — 10.9 should be here any day now, and version 11, with a “single city/highway software stack” and “many other architectural updates,” is coming out in February, according to CEO Elon Musk.
Reviews of the latest version, 10.8, are mixed, with some online reviewers saying it is much smoother and many others stating that they don’t feel confident using the technology at all. A thread reviewing the newest FSD version on the Tesla Motors subreddit shows owners sharing complaints about the software, with one even writing, “Definitely not ready for the general public yet…”
Another commenter said the car took too long to turn right onto “a straight and totally empty road… the next street, followed by a sudden deceleration because it changed its mind about the speed and now thought a 45 mph road was 25 mph.”
The driver said he eventually had to disengage the system entirely because it completely ignored an upcoming left turn, one that was to occur at a standard intersection “with lights and clear visibility in all directions and no other traffic.”
The Dawn Project’s campaign highlights Tesla’s own warning that its FSD “may do the wrong thing at the worst time.”
“How could anyone tolerate a safety-critical product on the market which may do the wrong thing at the worst time?” the advocacy group said. “Isn’t that the definition of defective? Full Self-Driving must be removed from our roads immediately.”
Neither Tesla nor The Dawn Project could be reached for comment.