New York Times ad warns of Tesla’s ‘Full Self-Driving’ – TechCrunch


A full-page ad in Sunday’s New York Times took aim at Tesla’s ‘Full Self-Driving’ software, calling it ‘the worst software ever sold by a Fortune 500 company’ and offering $10,000, the same price as the software itself, to the first person who could name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes”.

The ad was taken out by The Dawn Project, a recently founded organization that aims to ban unsafe software from safety-critical systems that can be targeted by military-style hackers, as part of a campaign to remove Tesla’s Full Self-Driving (FSD) from public roads until it has “1,000 times fewer critical malfunctions”.

The advocacy group’s founder, Dan O’Dowd, is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety- and security-critical systems. At CES, the company said that BMW’s iX vehicle uses its real-time operating system and other safety software, and it also announced the availability of its new over-the-air software product and data services for automotive electronic systems.

Despite The Dawn Project founder’s potential competitive bias, Tesla’s FSD beta software, an advanced driver assistance system that Tesla owners can access to handle some driving functions on city streets, has been the subject of intense scrutiny in recent months after a series of YouTube videos showing flaws in the system went viral.

The NYT ad comes just days after the California Department of Motor Vehicles told Tesla it would “revisit” its opinion that the company’s testing program, which uses consumers and not professional safety operators, does not fall under the department’s autonomous vehicle regulations. The California DMV regulates self-driving testing in the state and requires other companies like Waymo and Cruise that are developing, testing, and planning to deploy robotaxis to report crashes and system failures known as “disengagements.” Tesla has never issued these reports.

Tesla CEO Elon Musk has since responded on Twitter, claiming that Tesla’s FSD has not resulted in any accidents or injuries since its launch. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating a report from the owner of a Tesla Model Y, who said his vehicle went into the wrong lane while making a left turn in FSD mode, resulting in the vehicle being struck by another driver.

Although this was the first reported FSD crash, Tesla’s Autopilot, the automaker’s standard ADAS, has been involved in a dozen crashes.

Alongside the NYT ad, The Dawn Project released a fact check of its claims, referring to its own FSD safety analysis which studied data from 21 YouTube videos totaling seven hours of driving time.

The videos analyzed included beta versions 8 (released in December 2020) and 10 (released in September 2021), and the study avoided videos with significantly positive or negative titles to reduce bias. Each video was scored according to the California DMV’s driving performance evaluation, which is what human drivers must pass to earn a driver’s license. To pass a driving test, California drivers must have 15 or fewer scoring maneuver errors, such as failing to use turn signals when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, such as crashing or running a red light.

The study found that FSD v10 committed 16 scoring maneuver errors on average in under an hour and one critical driving error approximately every 8 minutes. There was an improvement in errors over the nine months between v8 and v10, according to the analysis, but at the current rate of improvement, “it will take another 7.8 years (per AAA data) to 8.8 years (per Bureau of Transportation data) to achieve the accident rate of a human driver.”
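For a back-of-the-envelope sense of how those rates follow from seven hours of footage, here is a minimal sketch of the arithmetic; the raw error counts below are hypothetical placeholders chosen to be consistent with the rates quoted above, not The Dawn Project’s actual tallies:

    # Back-of-the-envelope rate arithmetic behind the study's headline numbers.
    # The error totals here are hypothetical placeholders consistent with the
    # rates quoted in the article, not The Dawn Project's actual tallies.
    total_drive_minutes = 7 * 60  # 21 videos totaling roughly seven hours

    maneuver_errors = 112  # assumed total, i.e. ~16 per hour of driving
    critical_errors = 53   # assumed total, i.e. ~1 every 8 minutes

    maneuver_errors_per_hour = maneuver_errors / (total_drive_minutes / 60)
    minutes_per_critical_error = total_drive_minutes / critical_errors

    print(f"Maneuver errors per hour:   {maneuver_errors_per_hour:.1f}")    # 16.0
    print(f"Minutes per critical error: {minutes_per_critical_error:.1f}")  # ~7.9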

The Dawn Project’s ad makes some bold claims that should be taken with a grain of salt, particularly because the sample size is far too small to be taken statistically seriously. If, however, the seven hours of footage is indeed representative of an average FSD drive, the results could point to a larger problem with Tesla’s FSD software and speak to the larger question of whether Tesla should be allowed to test this software on public roads without any regulation.

“We did not sign up for our families to be crash test dummies for thousands of Tesla cars driven on public roads…” the ad reads.

Federal regulators have begun taking action against Tesla and its Autopilot and FSD beta software systems.

In October, NHTSA sent two letters to the automaker targeting its use of nondisclosure agreements for owners who get early access to the FSD beta, as well as the company’s decision to use over-the-air software updates to fix a problem in the standard Autopilot system that should have been a recall. In addition, Consumer Reports issued a statement over the summer saying that the FSD version 9 software upgrade did not appear to be safe enough for public roads and that it would test the software independently. Last week, the organization published the results of its tests, which found that “Tesla’s camera-based driver monitoring system fails to keep the driver’s attention on the road”. CR found that Ford’s BlueCruise, on the other hand, issues alerts when the driver’s eyes are diverted.

Since then, Tesla has rolled out many different versions of its v10 software – 10.9 should be here any day, and v11 with a “single city/highway software stack” and “many more architectural upgrades” will be released in February, according to CEO Elon Musk.

Reviews of the latest version, 10.8, are mixed, with some online reviewers saying it’s much smoother and many others saying they don’t feel confident using the technology at all. A thread reviewing the latest FSD version on the Tesla Motors subreddit shows owners sharing complaints about the software, with one even writing, “Definitely not ready for the mainstream yet…”

Another reviewer said the car took too long to turn right onto “a completely empty, straight road… Then it had to make a left and kept hesitating for no reason, blocking the oncoming lane, only to suddenly speed up once it had made it onto the next street, followed by an equally sudden deceleration because it changed its mind about the speed and now thought a 45 mph road was 25 mph.”

The driver said he eventually had to disengage entirely because the system completely ignored an upcoming left turn, one that was supposed to happen at a standard intersection “with lights and clear visibility in all directions and no other traffic.”

The Dawn Project’s campaign highlights a warning from Tesla that its FSD “can do the wrong thing at the worst time.”

“How can anyone tolerate a safety-critical product on the market that can do the wrong thing at the worst time,” the advocacy group said. “Isn’t that the definition of defective? Full Self-Driving must be removed from our roads immediately.”

Neither Tesla nor The Dawn Project could be reached for comment.


