ReportWire

Tag: Autopilot

  • Tesla could have avoided that $242.5M Autopilot verdict, filings show | TechCrunch


    Months before a jury awarded a $242.5 million verdict against Tesla over its culpability in a 2019 fatal crash, the automaker had a chance to settle for $60 million. Instead, Tesla rejected that offer, according to new legal filings that were first reported by Reuters.

    The settlement proposal, which was made in May, was disclosed in a filing that requested Tesla cover legal fees for the plaintiffs in the case.

    Earlier this month, a jury in federal court in Miami found Tesla partly to blame for a fatal 2019 crash that involved the use of the company’s Autopilot driver assistance system. One person was killed when a Tesla Model S with Autopilot engaged plowed through an intersection and hit a Chevrolet Tahoe. The crash victims, Naibel Benavides Leon and her boyfriend Dillon Angulo, were standing outside the vehicle on the shoulder at the time. Leon was killed, while Angulo was severely injured.

    The driver, who was not a defendant in this case, was sued separately. The lawsuit filed in 2021 against Tesla centered on Autopilot, which was engaged but did not brake in time to keep the car from going through the intersection. The jury assigned two-thirds of the blame to the driver and one-third to Tesla, and awarded $242.5 million in damages as part of its decision.

    Tesla, in a statement provided to TechCrunch earlier this month, said it plans to appeal the verdict “given the substantial errors of law and irregularities at trial.”

    TechCrunch has reached out to the plaintiffs’ attorneys as well as Tesla. An outside PR firm that previously provided statements on Tesla’s behalf declined to comment and directed TechCrunch to the company’s press address. Tesla disbanded its communications team several years ago.

    The lawsuit, case 1:21-cv-21940-BB, was filed in 2021 in the U.S. District Court for the Southern District of Florida.


    Kirsten Korosec


  • Nvidia


    Workers install cooling fans on a supercomputer that will train Tesla’s new Autopilot. The supercomputer will consist of 50,000 Nvidia H100 accelerators, and a data center of that scale requires approximately 75 megawatts of electricity. It is located at Tesla’s gigafactory in Texas.



  • Every Tesla capable of ‘full self driving’ can access the Autopilot feature for free. But is it safe?


    HOUSTON – Every single Tesla in the country can now drive on its own.

    The electric vehicle manufacturer is offering a free one-month trial of its Full Self-Driving technology to all capable U.S. vehicles.

    Tesla CEO Elon Musk announced the free trial on his social media platform X (formerly Twitter).

    The news comes with a new software update, and just a few months after the company recalled more than 2 million cars. Most Tesla vehicles come with the Full Self-Driving hardware built in; however, owners have to pay an upcharge of up to $12,000 to unlock the technology for use.

    Tesla recalled 2,031,220 vehicles, including their Model 3 (2017-2023), Model X (2016-2023), Model S (2012-2023) and Model Y (2020-2023).

    The reason?

    “In certain circumstances when Autosteer is engaged, and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged, there may be an increased risk of a crash,” the recall read in part.

    In response to the recall, Tesla issued a remedy to the problem via an over-the-air update.

    “At no cost to customers, affected vehicles will receive an over-the-air software remedy, which is expected to begin deploying to certain affected vehicles on or shortly after December 12, 2023,” Tesla said.

    But here’s the thing: The update didn’t actually disable the “Autosteer” function that is part of the Autopilot feature in Tesla vehicles. Instead, it beefed up the warnings drivers get if they’re found to not be paying attention to the vehicle.

    Keeping your eyes on the road and hands on the steering wheel, while being ready to take over at any moment, is part of the agreement while using Autopilot functions.

    “Neither the recall nor its remedy disables Autosteer or features that rely on Autosteer. As mentioned, the remedy will incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged, which includes keeping their hands on the steering wheel and paying attention to the roadway,” Tesla said.

    Fast forward three months, and Musk is now giving everyone who owns a Tesla capable of self-driving the opportunity to unlock the tech for free.

    “We didn’t know that until you said something about it,” said Darius Haywood while charging outside of a Houston Tesla dealership.

    The giveaway came in the wake of the massive recall, but it was also met with open arms.

    Haywood is one of many interested in trying out the feature that costs a third of the sticker price on an entry-level Model 3.

    “The possibilities is endless with these cars,” he said. “I feel like Elon Musk, I feel like he trying to push his brand out to stand out in a certain way than what other people think about it.”

    Regardless, the idea of more self-driving cars on Houston highways than ever before has some drivers, including fellow Tesla owners, a little concerned.

    “Some people are responsible, some people can be irresponsible,” Erni Salguero said while charging his Tesla.

    “I don’t know how to feel about a car driving by itself,” added Ashunti Sanders.

    Tesla’s own safety data says its vehicles driving on Autopilot average one crash for every five million miles driven.

    That’s roughly one-tenth of the U.S. national average.

    While the tech is there, it might just take time for the trust to follow.

    Copyright 2024 by KPRC Click2Houston – All rights reserved.


    Gage Goulding


  • New Study Points Out Scary Driver Behavior In Supposedly Safer Cars


    Having literally just passed a variable message sign intermittently reading “Texting while driving is … (pause) … 23x more dangerous” placed just before one of the most hazardous local maneuvers, known as the Rochester Curve, you might think society has gotten smarter about the age-old cliché of “eyes on the road, hands on the wheel, mind on the drive.”

    Or so we thought …

    A new study by the Insurance Institute for Highway Safety (IIHS) of people who owned vehicles with advanced driver assist features found that “…large percentages of users (53% of [GM’s] Super Cruise, 42% of [Tesla’s] Autopilot and 12% of [Nissan’s] ProPILOT Assist) indicated they were comfortable treating their systems as self-driving.” Self-driving cars are presently not available to consumers, despite misleading marketing from some manufacturers. The three aforementioned systems have what’s called “partial automation.” The in-the-flesh driver must still accomplish many routine driving tasks since those systems are not ready to launch ubiquitously.


    Along those lines, the study reports that Super Cruise and Autopilot users were more likely to engage in activities where they took their hands off the wheel and their eyes off the road. In fact, approximately 50% of Super Cruise and 42% of Autopilot “… users reported triggering a ‘lockout’ of the technology at some point, which occurs when a driver fails to respond to attention warnings.” So far, all mainstream systems require the driver’s active supervision.

    A possible reason: some manufacturers were very liberal with their marketing and executives’ public statements, which essentially encouraged drivers to treat the systems as autonomous. And that led to some car owners, like Param Sharma of San Francisco, being recorded on the highway riding as a backseat passenger with no humans in the front seat. Raj Mathai, a KNTV (NBC) news anchor in San Francisco, rightly described such behavior as “… very illegal.”


    The study’s findings call into question whether basic engineering rigor (e.g., examining the Safety of the Intended Functionality, a.k.a. SOTIF) was appropriately completed for these designs, and whether the public understands the difference between Advanced Driver Assistance Systems (ADAS) and autonomous driving. “Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” says IIHS President David Harkey. “In fact, the opposite may be the case if systems lack adequate safeguards.”

    As reported by the New York Times in June, the National Highway Traffic Safety Administration (NHTSA) upgraded … “its preliminary evaluation of Autopilot to an engineering analysis [that] … will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.”

    Meanwhile, eight short weeks later, Tesla released another beta version of its software that it tested with only 1,000 (lucky?) users due to “many major code changes.”


    Maybe the variable message sign should be warning drivers about more than just texting.


    Steve Tengler, Senior Contributor
