Tesla must pay portion of $329M damages after fatal Autopilot crash, jury says

172 points | koolba | 154 comments | 8/1/2025, 6:28:53 PM | cnbc.com ↗

Comments (154)

guywithahat · 6h ago
> The company must pay $329 million in damages to victims and survivor, including compensatory and punitive damages.

> A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.

> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly. Had the system been used correctly and Tesla been assigned more of the blame, would this be a 1 billion dollar case? This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road

furyofantares · 5h ago
> This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road

I hope we haven't internalized the idea that corporations should be treated the same as people.

There's essentially no difference between a $3M and a $300M fine against most individuals, but $3M means very little to Tesla. If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

That's another difference - fining an individual is not going to change risks much; the individual's behavior changing is not that meaningful compared to Tesla's behavior changing. And it's not like a huge fine is gonna make a difference in other drivers deciding to be better, whereas other automakers will notice a huge fine.

ratelimitsteve · 5h ago
>I hope we haven't internalized the idea that corporations should be treated the same as people.

Only when it comes to rights. When it comes to responsibilities the corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.

dontlaugh · 5h ago
Would be nice to see executions of corporations as punishment.
tialaramex · 4h ago
Perhaps better to achieve symmetry by ceasing to execute humans.

You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human, so you can't make the scores even, but you can stop adding to the total of human misery.

Historically it was impractical to permanently warehouse large numbers of humans, so death was more practical - but the US has been doing it for all sorts of crap for decades, so that's not a problem here.

The US would still have much harsher punishments than Western Europe even without the death penalty, because it routinely uses life-means-life sentences where no matter what you're never seeing the outside again.

thfuran · 4h ago
>You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human

What if we garnished 100% of the future wages of all the employees in perpetuity as well as dissolving the corporate entity? You know, to make sure the company stays all the way dead.

tialaramex · 2h ago
I guess my bad for not specifying that it'd need to be wicked for the corporation, not the humans.
JumpCrisscross · 4h ago
> Would be nice to see executions of corporations as punishment

Fines. Massive fines.

"Corporate death penalty" is a genius invention of corporate lawyers to distract from the punitive effect of massive fines.

Fines and license revocations are precedented. They take real money from the corporation and its owners. Corporate death penalties are a legal morass that doesn't actually punish shareholders, it just cancels a legal entity. If I own an LLC and have a choice between a fine and the LLC being dissolved, I'd almost always opt for the latter.

But fines are boring. Corporate death penalty sounds exciting. The anti-corporate folks tend to run with it like catnip, thus dissolving the coalition for holding large companies accountable. (Which, again, a corporate "execution" doesn't do. Nullifying my LLC doesn't actually hurt me, it just creates a little bit of work for my lawyer, and frankly, getting out of a fuckup by poofing the LLC without touching the underlying assets is sort of the selling point of incorporation.)

thfuran · 4h ago
Corporate fines are a genius invention of corporate execs' personal lawyers to distract from the fact that all corporate malfeasance is conducted by actual people who could be held accountable.
JumpCrisscross · 3h ago
> Corporate fines are a genius invention of corporate execs' personal lawyers

Ahistoric and orthogonal. Corporate fines and personal sanctions have coëxisted since corporations were a thing. Charter revocations, on the other hand, have almost always followed individual liability, because again, just poofing a corporation doesn't actually do anything to its assets, the part with actual value. (In the English world, corporations frequently came pinned with trade charters. The actual punishment was losing a trade monopoly. Not a legal fiction being dissolved.)

Nothing about corporate death penalties or corporate fines prevents personal liability. And neither particularly promotes it, either, particularly if guilt is never acknowledged as part of the proceedings.

HWR_14 · 3h ago
I'm guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders.
JumpCrisscross · 3h ago
> guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders

This is just expropriation. Which is legally complicated. And raises questions around what the government should do with a bunch of seized assets and liabilities that may or may not be in a going condition, and whether some stakeholders should be reimbursed for their losses, for example employees owed pay, also do pensions count, and if so executive pensions as well, and look at that the discussion got technical and boring and nobody is listening anymore.

On the other hand, a massive fine punts that problem to the company. If it can pay it, great. It pays. If it can’t, we have bankruptcy processes already in place. And the government winds up with cash, not a Tesla plant in China.

Corporate death penalties are stupid. They’re good marketing. But they’re stupid. If you want to hold large companies unaccountable, bring it up any time someone serious threatens a fine.

usefulcat · 3h ago
The “LL” in LLC stands for Limited Liability. The whole point is to financially insulate the owner(s).
JumpCrisscross · 37m ago
To be fair, I think they’re talking about seizing the LLC’s assets. Not the members’.
parineum · 1h ago
> Only when it comes to rights.

"Corporations are people" means a corporation is people, not a corporation is a person.

People have rights, whether they are acting through a corporation or not. That's what Citizens United determined.

I hope you think about who misled you into thinking that "corporations are people" meant a corporation is a person, and trust them a little less.

stouset · 5h ago
I agree that Tesla should face punitive damages. And the size of the punitive damages must be enough to discourage bad behavior.

I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.

I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?

bruce511 · 5h ago
>> it "feels" wrong to give a corporation-sized punishment to a small group of individuals

This feeling has a name: loss aversion.

It's a really interesting human trait. About 66% of people feel bad when someone else does well. The impact of this feeling on behavior (even behavior that is self-harming) is instructive.

The concept of "Fairness" comes into play as well. Many people have an expectation that the "world is fair" despite every evidence that it isn't. That results in "everything I don't get is unfair" whereas "everything I get I earned on my own." Someone else getting a windfall is thus "unfair".

thfuran · 3h ago
That really doesn't sound like loss aversion.
singleshot_ · 3h ago
> I'm not necessarily sure the victim(s) should get all of the punitive damages.

I have some great news for you, then: the attorney probably took a third (more if they win an appeal).

> But is there an even better possible place to disburse the funds from these types of fines?

Oh, my mistake: I thought you meant way worse.

thelastgallon · 5h ago
$300M means very little to Tesla. The stock didn't even drop a bit (other than the usual market fluctuations today). Perhaps $4.20B or $6.90B would've been meaningful. Elon notices these numbers.
cookszn · 4h ago
Not doing what it asks - "keep your hands on the wheel" and "eyes on the road" - and crashing the car is somehow Elon Musk's fault LOL hn logic. Can't wait to sue lane assist when I drive drunk and crash!
root_axis · 3h ago
It's supposed to stop if objects appear in its path. For sure you're an idiot if you trust Tesla's autopilot, but I think it's reasonable to partially fault Tesla for setting the consumer's expectation that the car stops when obstacles get in the way even if the vehicle isn't being operated exactly as suggested by the manufacturer.
bko · 5h ago
> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released and over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in scores.

This is obviously targeted and the court system should not be playing favorites or going after political opponents

xnx · 4h ago
> What behavior do you want them to change?

Don't advertise their driver assist system as "full self driving".

tzs · 4h ago
The system involved in this crash was never advertised as "full self driving".
thejazzman · 5h ago
- FSD came out in October 2020; I suppose rounding up puts it at "nearly 10 years since." That also, literally, doubles the number from its actual value (closer to 5 years).

- There have been a lot more than one incident. This is one court case about one incident.

- There are an insane number of accidents reported; does it only matter to you if someone dies? A lot more than one person has died in an accident that involved a vehicle that was equipped with FSD.

- Your comment is obviously targeted and disingenuous.

There was even a recall over it: https://www.eastbaytimes.com/2023/02/16/tesla-full-self-driv...

So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.

helsinkiandrew · 5h ago
> This is obviously targeted and the court system should not be playing favorites or going after political opponents

This was a jury trial of a civil case - the family of the deceased took Tesla to court, not an anti-Tesla/Musk court system conspiracy.

simion314 · 5h ago
> It's been nearly 10 years since it was released and over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in scores.

And how many times did humans have to take over and save themselves and others from Tesla killing or injuring them? Tesla won't tell us these numbers, guess why? The tech might be safe as a backup driver, but so far you need a human to pay attention to save himself from the car's bugs/errors/glitches etc.

I really hate these bullshit safety claims pulled from someone's ass. It's like me trying to convince you to get operated on by an AI doctor by claiming: "It's better than an old and drunk doctor; it only killed a few people when the humans supervising it weren't paying attention, but otherwise it was very safe. We can't tell you how many times real doctors had to do the hard work while our AI doctor only did the stitching, those numbers need to stay secret. But trust us, the human doctors who had to intervene are just there because of the evil laws; it could do the full job itself. We wouldn't call it a Fully Competent Doctor if it couldn't perform all expected tasks."

colingauvin · 5h ago
Tesla was found partially liable for this crash. The reason they were liable was that they sold something claiming (practically speaking) that it could do something. The customer believed that claim. It failed to do that thing and killed people.

So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.

onlyrealcuzzo · 5h ago
And the fine needs to be high enough to prevent them from just saying - oh, well, we can make money if we keep doing it.

If you could only fine a person for committing murder, you wouldn't fine a billionaire $5m and then hope he wouldn't go on killing everyone he'd rather have dead than keep the $5m.

marcosdumay · 5h ago
The US justice system uses punitive damages very heavily. And Tesla should absolutely get some punishment here.

In most other places you'd see it paying hundreds of millions in fines and a few million in damages.

IdSayThatllDoIt · 6h ago
I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

"[Plaintiffs] claimed Tesla’s Autopilot technology was flawed and deceptively marketed."

ratelimitsteve · 5h ago
do you think they heard "autopilot" or "full self driving"?
MBCook · 30m ago
You really think the defense wouldn’t have objected if the wrong term was used, or that the judge would allow its continued use?
close04 · 5h ago
> I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it misleads jurors the same way.

There's another similar argument to be made about the massive amount awarded as damages, which maybe will be lowered on appeal. If people (Tesla included) can make the argument that when a car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even for a time). There are millions of Teslas on the road today so proportionally it's a low amount per unsafe car.

Hamuko · 5h ago
"Autopilot" isn't even the most egregious Tesla marketing term since that honour goes to "Full Self-Driving", which according to the fine text "[does] not make the vehicle autonomous".

Tesla's self-driving advertising is all fucking garbage and then some George McGee browses Facebook while believing that his car is driving itself.

ajross · 5h ago
As gets pointed out ad nauseam, the very first "cruise control" product in cars was in fact called "Auto-Pilot". Also, real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free.

This is a fake argument (post hoc rationalization): It invents a meaning to a phrase that seems reasonable but that has never been rigorously applied ever, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.

JumpCrisscross · 4h ago
> real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free

Pilot here. If my G1000's autopilot were flying and I dropped my phone, I'd pick it up. If my Subaru's lane-keeping were engaged and I dropped my phone, I might try to feel around for it, but I would not go spelunking for several seconds.

tzs · 2h ago
The market Tesla is advertising to is not airplane pilots. It is the general car buying public.

If they are using any terms in their ads in ways other than the way the people the ads are aimed at (the general car buying public) can reasonably be expected to understand them, then I'd expect that could be considered to be negligent.

Much of the general public is going to get their entire idea of what an autopilot can do from what autopilots do in fiction.

metabagel · 5h ago
The dictionary definition for Americans is:

> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.

https://ahdictionary.com/word/search.html?q=automatic+pilot

Note that “autopilot” and “automatic pilot” are synonyms.

https://ahdictionary.com/word/search.html?q=Autopilot

An autopilot is supposed to be an automatic system, which doesn’t imply supervision.

https://ahdictionary.com/word/search.html?q=automatic

> Self-regulating: an automatic washing machine.

SoftTalker · 5h ago
Notably, an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way. It's just that the sky is very big and other aircraft are very small, so random collisions are extremely unlikely.
JumpCrisscross · 4h ago
> an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way

TAWS (terrain) and ACAS (traffic) are built into modern autopilots.

And Tesla lied about its autopilot's capabilities in proximity to this crash: "In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own. 'The person in the driver's seat is only there for legal reasons,' reads a caption that flashes at the beginning of the video. 'He is not doing anything. The car is driving itself.' (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)"

https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...

metabagel · 4h ago
Airplanes and automobiles differ in a number of ways.
aaomidi · 5h ago
That’s why we have a jury.

Autopilot quite literally means automatic pilot. Not “okay well maybe sometimes it’s automatic”.

This is why a jury is made up of the average person. The technical details of the language simply do not matter.

ratelimitsteve · 5h ago
Couldn't agree more. This thing where words have a common definition and then a secret custom definition that only applies in courts is garbage. Everyone knows what "full self driving" means, either deliver that, come up with a new phrase or get your pants sued off for deceptive marketing.
lotsofpulp · 4h ago
Autopilot is used when referring to a plane (until Tesla started using it as a name for their cruise control that can steer and keep distance).

In the context of a plane, autopilot always meant automatic piloting at altitude, and everyone knew it was the human pilots that were taking off and landing the plane.

MBCook · 27m ago
Did they?

I think you may be overestimating how much average people know about autopilot systems.

guywithahat · 5h ago
It's also worth mentioning he would have been required to keep his hands on the wheel while using autopilot, or else it starts beeping at you and eventually disables the feature entirely. The system makes it very clear you're still in control, and it will permanently disable itself if it thinks you're not paying attention too many times (you get 5 strikes for your ownership duration).
indemnity · 4h ago
The strikes reset after a week, they do not persist for the duration of your ownership of the vehicle.

https://www.tesla.com/support/autopilot - section “How long does Autopilot suspension last?”

cosmicgadget · 5h ago
Is there any contextual difference between the first instance of cruise control (which has since been relabeled cruise control, perhaps with reason), automatic flight control, and a company whose CEO and fanboys incessantly talk about vehicle autonomy?
EA-3167 · 5h ago
As is also pointed out ad nauseam, the claims made about autopilot (Tesla) go far beyond the name, partly because they sold a lot of cars on lies about imminent "FSD" and partly because as always Elon Musk can't keep his mouth shut. The issue isn't just the name, it's that the name was part of a full-court-press to mislead customers and probably regulators.
bko · 6h ago
> On one hand I don't think you can apply a price to a human life

Yes, although courts do this all the time. Even if you believe this as solely manufacturer error, there are precedents. Consider General Motors ignition switch recalls. This affected 800k vehicles and resulted in 124 deaths.

> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches

So about $5m per death, and $300m to the government. This seems excessive for one death, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.

https://en.wikipedia.org/wiki/General_Motors_ignition_switch...

panarky · 5h ago
If you make a manufacturing error without intentionally deceiving your customers through deceptive naming of features, you have to pay millions per death.

If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.

Makes sense to me.

bangaladore · 5h ago
Wouldn't that logic mean any automaker advertising a "collision avoidance system" should be held liable whenever a car crashes into something?

In practice, they are not, because the fine print always clarifies that the feature works only under specific conditions and that the driver remains responsible. Tesla's Autopilot and FSD come with the same kind of disclaimers. The underlying principle is the same.

panarky · 4h ago
There are plenty of accurate names Tesla could have selected.

They could have named it "adaptive cruise control with assisted lane-keeping".

Instead their customers are intentionally led to believe it's as safe and autonomous as an airliner's autopilot.

Disclaimers don't compensate for a deceptive name, endless false promises and nonstop marketing hype.

cosmicgadget · 5h ago
If it was called "comprehensive collision avoidance system" then yes.
bko · 5h ago
Right, this is the frustrating thing about courtroom activism and general anger towards Tesla. By any reasonable standard, this technology is reasonably safe. It has logged over 3.6 billion miles, currently about 8m miles per day. By all reasonable measures, this technology is safe and useful. I could see why plaintiffs go after Tesla. They have a big target on their back for whatever reason, and activist judges go along. But I don't get how someone on the outside can look at this and think that this technology or marketing over the last 10 years is somehow deceptive or dangerous.

https://teslanorth.com/2025/03/28/teslas-full-self-driving-s...

dmoy · 5h ago
> activist judges

Wait what? What activism is the judge doing here? The jury is the one that comes up with the verdict and damage award, no?

thorum · 5h ago
The product simply should not be called Autopilot. Anyone with any common sense could predict that many people will (quite reasonably) assume that a feature called Autopilot functions as a true autopilot, and that misunderstanding will lead to fatalities.
bangaladore · 5h ago
> feature called Autopilot functions as a true autopilot

What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.

I would argue you are creating a definition of "autopilot" that most people do not agree with.

lovehashbrowns · 4h ago
It can be called anything in an airplane because there the pilot has some level of training with the system and understands its limits. You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.

A car is another thing entirely because the general population's definition of "autopilot" does come into play and sometimes without proper training or education. I can just go rent a tesla right now.

cosmicgadget · 5h ago
IANAP but I think they can take their hands off the controls and pick up a dropped phone.
dzhiurgis · 4h ago
So you can in a Tesla when it's used correctly. Can you enable a plane's autopilot and still crash into a mountain?
HWR_14 · 3h ago
Modern autopilots? No, they will not crash into a mountain or another plane.
dzhiurgis · 37m ago
So you’re a pilot?
metabagel · 5h ago
What matters is the definition which most people use.

https://news.ycombinator.com/item?id=44761341

bangaladore · 5h ago
I guess I don't understand how.

> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.

Applies here. As far as I can tell the system did do exactly that. But the details of the actual event are unclear (I've heard parked car but also just hitting the back of another car?)

metabagel · 5h ago
It’s an emergent technology. The onus is on Tesla to be crystal clear about capabilities, and consistently so. People might quite reasonably infer that something which is labeled as “auto-“ or “automatic” doesn’t require supervision.
scbrg · 4h ago
Such as automobile?
metabagel · 4h ago
Automobiles are self-propelled, not self-navigating. They don't rely on horses to pull them.

automobile: self-propelled (self moving)

autopilot: self-piloting (self navigating)

https://www.ahdictionary.com/word/search.html?q=automobile

philistine · 2h ago
Indeed, since you do not need to externally supervise the drivetrain in a car.
nullc · 5h ago
A true autopilot is a system on a boat or aircraft that keeps it on a set heading. ISTM in this case that's what the autopilot did.
metabagel · 4h ago
Boats and aircraft are both different from automobiles. They have full freedom of movement. You can't set a course in the same way with an automobile, because the automobile will need to follow a roadway and possibly make turns at intersections. Boats and aircraft can have waypoints, but those waypoints can be mostly arbitrary, whereas a car needs to conform its path to the roadway, traffic, signage, etc.

It's an entirely different domain.

nullc · 2h ago
Yes, an autopilot is not what you need on a car.
dzhiurgis · 4h ago
Anyone who has used it knows its limitations. IDK, maybe in 2019 it was different tho; now it's full of warnings that make it barely usable when distracted. Ironically you are better off disabling it and staring into your phone, which seems to be what regulators actually want.

And by the way what is true autopilot? Is the average joe a 787 pilot who's also autopilot master?

Funny that pretty much every car ships with autosteer now. Ones I've used didn't seem to have many of the warnings, explanations, disclaimers or agreements that pundits here assume they should.

realusername · 5h ago
There are two conflicting goals here: Tesla's marketing department would really like to make you think the car is fully autonomous for financial reasons (hence "autopilot" and "full self driving"), and then there's Tesla's legal department, which would prefer to blame somebody else for their poor software.
blargey · 5h ago
The fault with an individual can be reasonably constrained to the one prosecuted death they caused. The fault with "autopilot by Tesla", a system that was marketed and deployed at scale, cannot.

And if you want to draw parallels with individuals, an individual driver's license would be automatically suspended and revoked when found at fault for manslaughter. Would you propose a minimum 1~3 year ban on autopilot-by-Tesla within the US, instead?

HWR_14 · 3h ago
329 million is not just compensatory damages (the value of the human life) but also punitive damages. That number floats up to whatever it takes to disincentivize Tesla in the future.
polotics · 4h ago
329 million too high? If you had the money and handing it over would save your life, would you rather keep the money as a corpse?
thfuran · 3h ago
So wrongful death liability should be infinite, or maybe just equal to the money supply (pick one, I guess)?
somerandomqaguy · 4h ago
I just did some googling around:

> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.

> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.

-https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered be turned over during discovery. If that's what happened here, then I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.

fuoqi · 4h ago
>I don't think you can apply a price to a human life

Not only can we, but it's also done routinely. For example, see this Practical Engineering video: https://www.youtube.com/watch?v=xQbaVdge7kU

abeppu · 5h ago
I think the conceptually messed up part is, when such an award includes separate components for compensatory damages and punitive damages, the plaintiff receives the punitive damages even if they're part of a much broader class that was impacted by the conduct in question. E.g. how many people's choice to purchase a Tesla was influenced by the deceptive marketing? How many other people had accidents or some damages? I think there ought to be a mechanism where the punitive portion rolls into the beginning of a fund for a whole class, and could be used to defray some costs of bringing a larger class action, or dispersed directly to other parties.
MBCook · 23m ago
So if I file a lawsuit and prove there is a small possibility my toaster can cause my arm to be cut off, because that's what it did to me, and win $400,000,000, should I only get $400 if it turns out they sold 1 million units?

It’s not a class action lawsuit. If they want their cash they should sue too. That’s how our system works.

beambot · 5h ago
On the flip side: Penalties should be scaled relative to one's means so that the wealthy (whether people or corporations) actually feel the pain & learn from their mistakes. Otherwise penalties for the wealthy are like a cup of coffee for the average Joe -- just a "cost of business."

I'm also a big proponent of exponential backoff for repeat offenders.

Simulacra · 1h ago
It's sort of like with the Prius acceleration debacle we had: people always want to blame the car and not their own actions.
dzhiurgis · 4h ago
Let's fine Apple too, since they allow smartphones to be used while driving.
pengaru · 6h ago
If anyone's confused about what to expect of autopilot and/or FSD, it's Tesla's doing and they should be getting fined into oblivion for the confusion and risks they're creating.
SilverElfin · 6h ago
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Hard for me to see this as anything but the driver’s fault. If you drop your phone, pull over and pick it up or just leave it on the floor. Everyone knows, and the car tells you, to pay attention and remain ready to take over.

naet · 4h ago
The argument is that if the driver was in a different vehicle he would have done just that, pulled over and picked it up, but because he believed the Tesla was capable of driving safely on its own he didn't do so.

Normally I turn the steering wheel when I want to turn my car. If you sold me a car and told me it had technology to make turns automatically without my input then I might let go of the wheel instead of turning it, something I would never have done otherwise. If I then don't turn and slam straight into a wall, am I at fault for trusting what I was sold to be true?

If the driver has heard that their Tesla is capable of autonomous driving, and therefore trusts it to drive itself, there may be a fair argument that Tesla shares in that blame. If it's a completely unreasonable belief (like me believing my 1998 Toyota is capable of self driving) then that argument falls apart. But if Tesla has promoted their self driving feature as being fully functional, used confusing descriptions like "Full Self-Driving", etc, it might become a pretty reasonable argument.

dzhiurgis · 4h ago
Except the driver already accepted liability.

Also this doesn't hold water today, as all new cars have some basic autosteer.

breadwinner · 3h ago
No one else falsely advertises it as "Full Self Driving".
dzhiurgis · 37m ago
FSD wasn’t even released at that time.
MBCook · 20m ago
Substitute Autopilot then.

No other auto maker uses similar language. Ford and GM use BlueCruise and SuperCruise, clearly implying an improved kind of cruise control.

m463 · 6h ago
The model s has terrible phone docks. Don't get me started on cupholders, I'll bet people have drink mishaps all the time that affect driving.

I'm actually kind of serious about this - keeping people's stuff secure and organized is important in a moving car.

I'm surprised the touchscreen controls and retiring of stalks aren't coming under more safety scrutiny.

With the new cars without a PRND stalk, how can you quickly reverse the car if you nose out too far and someone is coming from the side? will the car reverse or go forward into danger?

izacus · 5h ago
The driver at that time was Tesla Autopilot. So yeah, driver's fault, as the jury said.
tantalor · 4h ago
1/3 Tesla's fault, 2/3 the operator's
tr81 · 5h ago
Why would you pull over when you paid top dollar for Autopilot?
SoftTalker · 5h ago
And why was his mobile phone in his hand to drop, if he was driving? Most states have laws against mobile device usage while driving, and it was never a responsible thing to do even before the laws were enacted.
MBCook · 19m ago
Perhaps he thought it was safe. After all, he had autopilot.
nielsbot · 4h ago
Sure--dangerous and wrong. Despite that, Autopilot was driving at the time.
ineedasername · 5h ago
Maybe he would have pulled over if the car's capabilities hadn't been oversold. Two entities did stupid things: 1) The person, by not waiting to pull over because of Elon Musk's false claims, and 2) Tesla, via Elon Musk making those false claims.

It passes a classic “but for…” test in causality.

v5v3 · 4h ago
The article has been updated (correction) to say Tesla is liable for a portion of the damages. Not all of it.

33% liable

https://www.cnbc.com/2025/08/01/tesla-must-pay-329-million-i...

hshdhdhj4444 · 3h ago
33% for the compensatory damages and 100% of the punitive damages according to the article. So total of about $242 mm.
treetalker · 3h ago
The case number appears to be 1:21-cv-21940-BB (S.D. Fla.).

I practice in that court regularly. Beth Bloom has a reputation as a good trial judge, so I'm somewhat skeptical of Tesla's claims that the trial was rife with errors. That said, both the Southern District and the Eleventh Circuit are known to frequently and readily lop off sizable chunks of punitive damages awards.

The real battle is just beginning: post-trial motions (including for remittitur) will be due in about a month. Then an appeal will likely follow.

bsimpson · 6h ago
Wellbeing doesn't have a price tag. There's no amount of money someone could pay you to make up for your daughter dying early or your son becoming disabled.

However, $329M sounds like an imaginary amount of money in a liability claim. If this guy crashed into a parked car without Tesla being involved, the family would be unfathomably lucky to even get 1% of that amount.

narrator · 10m ago
The Supreme Court has ruled on punitive damage awards going back to the 1990s and limited the excesses. You can't, for example, get $4 million in punitive damages on $4000 in actual damages. The general rule, from the looks of the article, is 5x actual damages.

https://corporate.findlaw.com/litigation-disputes/punitive-d...
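
A rough ratio check on this particular verdict, using the figures CNBC reports later in the thread ($129M compensatory, $200M punitive, Tesla 33% at fault); whether the right base for the ratio is the whole compensatory award or only Tesla's share is exactly the sort of thing an appeal would argue about, so treat this as a sketch only:

    # back-of-the-envelope punitive-to-compensatory ratios (figures from CNBC, quoted later in the thread)
    compensatory = 129e6                      # total compensatory damages
    punitive = 200e6                          # punitive damages assessed against Tesla
    tesla_compensatory = 0.33 * compensatory  # Tesla's 33% share, about $42.6M

    print(punitive / compensatory)        # ~1.6x the full compensatory award
    print(punitive / tesla_compensatory)  # ~4.7x Tesla's share of it

Either way the multiplier stays within the ~5x rule of thumb described above.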

wedn3sday · 6h ago
Perhaps the driver would have acted differently if Tesla hadn't sold him a product called "autopilot."
cosmicgadget · 5h ago
But if it was called 'lane assist' or 'adaptive cruise' he might not have bought the car in the first place.
delecti · 4h ago
I agree. If Tesla was not allowed to give their product a misleading name, it probably would get fewer sales. That's the rationale behind false advertising laws.
terminalshort · 4h ago
Typically you need better evidence than "perhaps" to walk out of court with millions.
metabagel · 4h ago
I'm not an attorney, but I think typically juries operate on the "reasonable person" standard.
Hamuko · 5h ago
Most of that $329 million is punitive, not compensatory.
wagwang · 5h ago
I don't understand this at all. Mercedes recently announced that they'd be liable for accidents caused by their level 4 system, meaning that by default, all other self-driving features are driver-liable?
MBCook · 14m ago
Level 3 and level 2 are VERY different.

That’s apples and oranges. And Mercedes saying something certainly doesn’t change the law.

Mercedes built a sophisticated system designed to be able to handle what it might run into within the very limited domain it was designed for. Then they got it certified and approved by the government. Among other things, it only works at low speed. Not 60 MPH. And I think only on highways, not where there are intersections.

Tesla‘s system is not child’s play, sure. But they unleashed it everywhere. Without any certification. And relatively few limits, which I think they later had to tighten.

Mercedes was minimizing risk. Tesla was not.

metabagel · 4h ago
Mercedes Benz system in California and Nevada is level 3 autonomous.

https://www.kbb.com/car-advice/level-3-autonomy-what-car-buy...

https://cars.usnews.com/cars-trucks/features/mercedes-level-...

> Mercedes Chief Technology Officer Markus Schäfer recently told Automotive News that Level 4 autonomy is "doable" by 2030.

dmix · 5h ago
I haven't looked at it but that may be regulatory or insurance liability vs civil suits which in America like to invent fantastical numbers.
bookmtn · 6h ago
If Tesla is partially responsible for this fatal accident, who or what else was ruled as contributing to the crash?
MBCook · 13m ago
The article is clear. The driver bore the rest of the responsibility.
wedn3sday · 6h ago
Well, I don't think anyone else contributed a system called "autopilot."
bdcravens · 5h ago
And yet Tesla was only found 33% accountable, so the presence of "autopilot" was only 1/3 of the cause.
Hamuko · 5h ago
I'd guess the driver since he was using his phone, then dropped it and was scrambling to pick it up when the accident happened.
tantalor · 4h ago
Important: headline was just updated,

> CORRECTION: Tesla must pay portion of $329 million in damages after fatal Autopilot crash, jury says

This was added,

> The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages.

1vuio0pswjnm7 · 51m ago
HN commenter: "On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly."

What this sentence is describing are compensatory damages, e.g., compensation for loss of life

That number was 129 not 329, and Tesla was only found liable for 33% of it

CNBC: "Tesla's payout is based on $129 million in compensatory damages, and $200 million in punitive damages against the company."

CNBC: "The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages."

42.5 is not even close to the 329 number that the HN commenter claims "feels too high"

HN commenter: "This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road"

This sentence is describing something more akin to punitive damages, e.g., a fine

That number was 200 not 329

It seems the HN commenter makes no distinction between compensatory and punitive damages
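
A quick back-of-the-envelope check of those CNBC figures (a sketch only; rounding aside, it reproduces both the ~$42.5 million and ~$242 million numbers quoted in this thread):

    # figures quoted from CNBC above
    compensatory = 129e6   # total compensatory damages
    punitive = 200e6       # punitive damages against Tesla
    tesla_share = 0.33     # Tesla's share of fault for compensatory damages

    tesla_compensatory = tesla_share * compensatory  # ~$42.6M, i.e. CNBC's "about $42.5 million"
    tesla_total = tesla_compensatory + punitive      # ~$242.6M
    print(f"${tesla_compensatory / 1e6:.1f}M compensatory + ${punitive / 1e6:.0f}M punitive = ${tesla_total / 1e6:.1f}M")

That ~$242M total is the figure another commenter arrives at above, and it is indeed well short of the $329M headline number.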

TheAlchemist · 5h ago
Whatever one believes about the state of FSD / Autopilot today (it still doesn't work well enough to be safe and probably won't for the next 5 years), I just don't see how one can argue that, almost 10 years ago, when this thing would get into a fatal accident every 10 miles while Tesla was arguing it was safer than a human and would go from LA to NY autonomously within the year, the marketing was not total bullshit and deception.

"The person in the driver's seat is only there for legal reasons" - that was 2016... Funnily enough, in 2025 they are rolling exactly the same idea, as a Robotaxi, in California. Amazing.

debug-desperado · 3h ago
"Vehicle will not break when accelerator pedal is applied" is displayed on the screen in AutoPilot mode. The system warns you this every time you have foot on the pedal in AutoPilot mode. I wonder if it did that back in 2019 as well or if accidents like this spurred that as a UI change.

Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?

tzs · 1h ago
> Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?

He probably can't. Assuming the amount stands on appeal, and assuming that he doesn't have an insurance policy with policy limits high enough to cover it, he'll pay as much as he can out of personal assets and probably have his wages garnished.

He might also be able to declare bankruptcy to get out of some or all of it.

An interesting question is what the plaintiffs can do if they just cannot get anywhere near the amount owed from him.

A handful of states have "joint and several liability" for most torts. If this had been in one of those states the way damages work when there are multiple defendants is:

• Each defendant is fully responsible for all damages

• The plaintiff can not collect more than the total damage award

In such a state the plaintiffs could simply ask Tesla for the entire $329 million and Tesla would have to pay. Tesla would then have a claim for $220 million against McGee.

Of course McGee probably can't come up with that, so McGee still ends up in bankruptcy, but instead of plaintiffs being shortchanged by $220 million it would be Tesla getting shortchanged.

The idea behind joint and several liability is that in situations like this, where someone has to be screwed, it shouldn't be the innocent plaintiff.

Many other states have "modified joint and several liability". In those states rather than each defendant being fully responsible, only defendants whose share of the fault exceeds some threshold (often 50%) are fully responsible.

For example, suppose there are three defendants whose fault shares are 60%, 30%, and 10%. In a pure joint and several liability state the plaintiff could collect from whichever are most able to come up with the money. If that's the 10% one, they could demand 100% from them and leave it to that defendant to try to get reimbursed from the others.

In a modified joint and several liability state with a 50% threshold the most the plaintiff can ask from the 30% and 10% defendants is 30% and 10% respectively. They could ask the 60% defendant for 100% instead. If the 60% defendant is the deep pockets defendant that works out fine for the plaintiff, but if it is the 10% one and the other two are poor then the plaintiff is the one that gets screwed.

Finally, there are some states that have gotten rid of joint and several liability, including Florida in 2006. In those states the plaintiff can only collect from each defendant based on their share of fault. Tesla pays their 33%, McGee pays whatever tiny amount he can, and the plaintiff is screwed.
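
A minimal sketch of the apportionment rules described above, with hypothetical names and numbers (real statutes vary state by state, and in every variant the plaintiff's total recovery is still capped at the award):

    def max_collectible(fault_shares, award, threshold=None):
        """Most the plaintiff can demand from each individual defendant.

        threshold=None -> pure joint and several liability: any defendant
                          can be made to pay the whole award.
        threshold=0.5  -> modified rule: only defendants whose fault share
                          exceeds 50% are on the hook for the whole award;
                          the rest owe only their own share.
        """
        return {
            name: award if (threshold is None or share > threshold) else share * award
            for name, share in fault_shares.items()
        }

    shares = {"A": 0.60, "B": 0.30, "C": 0.10}
    print(max_collectible(shares, 329e6))                 # pure: all three can be asked for the full $329M
    print(max_collectible(shares, 329e6, threshold=0.5))  # modified: only A can be asked for the full $329M

Plugging in {"Tesla": 0.33, "McGee": 0.67} shows the swing: under the pure rule Tesla could be tapped for everything, under a 50% threshold only McGee could, and under Florida's current several-only rule each owes just their own share.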

al_borland · 4h ago
If I were running a car company, I may decide to stop development on assistive features and remove any that exist from future models. Given the way people are using them, and the way courts are punishing the companies for adding these safety features, is it worth it to them? Not having them would also bring down the base price on vehicles, which seems to be the primary complaint from people right now.
LorenPechtel · 4h ago
It's not assistive features in general that are the problem. It's that level 3 self-drive (which is what Tesla purports to have) should not be permitted to exist.

Things which simply help the driver, fine. Lane assist, adaptive cruise control and other such things that remove some of the workload but still leave the driver with the full responsibility of control. That's level 2.

Level 3 can generally run the car but is not capable of dealing with everything it might encounter. That's where you get danger--this guy thought he could pick up his dropped phone, and his car didn't even manage to stay in its lane. And the jury quite correctly said that's totally unacceptable.

The next step above level 2 should be skipping to level 4--a car that is capable of making a safe choice in all circumstances. That's Waymo, not Tesla. A Waymo will refuse to leave the area with the sufficiently detailed mapping it needs. A Waymo might encounter something on the road that it can't figure out--if that happens it will stop and wait for human help. Waymos can ask their control center for guidance (the remote operators do not actually drive, they tell the car what to do about what it doesn't understand) and can turn control over to a local first responder.

metabagel · 4h ago
Tesla is level 2 autonomous.

https://www.kbb.com/car-advice/level-3-autonomy-what-car-buy...

The only level 3 autonomous vehicles available for purchase in the U.S. are certain Mercedes-Benz models, and it only works on select roads in California and Nevada.

> Mercedes-Benz Drive Pilot: California and Nevada are currently the only states with roads approved for the Mercedes-Benz Drive Pilot. For 2025, only the EQS and S-Class offer Level 3 and then it’s by subscription.

MBCook · 10m ago
And that’s the problem.

Autopilot is/was fancy cruise-control. They don’t treat it like that or talk about it like that.

terminalshort · 4h ago
You probably should have read the article because this lawsuit is specifically about the product that you just said is "fine."
thfuran · 3h ago
Well, it becomes a lot less fine if you call it "full self driving", because it very much is not that.
fvgvkujdfbllo · 2h ago
Or just not do false advertising.
BonoboIO · 4h ago
Tesla hid evidence and was probably punished by the jury and I hardly feel sorry for Tesla. Maybe next time they will not do that, but I doubt it. Transparency was never a thing baked into the Tesla Musk DNA.

https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

mcrk · 5h ago
I bet most of the prosecution time was spent "explaining" the meaning of autopilot to the jury. Easy money.
dmix · 5h ago
Airplanes have autopilots too but pilots are still sitting there on the controls and have to be paying attention. A brand or technical name doesn't remove responsibility.
andrewmcwatters · 4h ago
Interesting reading https://www.tesladeaths.com/, which is linked in the article at the end.
devwastaken · 5h ago
Corps operate in jurisdictions where this is limited to hundreds of thousands regardless of what the jury says. Tesla will never pay this.
comte7092 · 5h ago
A lot of discussion in this thread about the technical meaning of “autopilot” and its capabilities vs FSD.

This is really missing the point. Tesla could have called it “unicorn mode” and the result would still be the same.

The true issue at hand is that Elon Musk has been going around telling people that their cars are going to drive themselves completely for over a decade now, overstating Tesla's capabilities in this area. Based on the sum totality of the messaging, many lay consumers believe Teslas have been able to safely drive themselves unsupervised for a long time.

From a culpability standpoint, you can’t put all this hype out and then claim it doesn’t matter because technically the fine print says otherwise.

andsoitis · 4h ago
> it doesn’t matter because technically the fine print says otherwise.

Every time you engage the system it tells you to pay attention. It also has sensors to detect when you don't, and forces you to pay attention. If you have more than N violations in a trip, the system is unavailable for the remainder of your trip.

I don’t know how much clearer it could be.

I would argue that the system is actually so good (but imperfect) that people overestimate how good it is, and let their guard down.

If a system were more error prone, people would not trust it so much.

BonoboIO · 4h ago
Tesla tried to hide evidence … the jury probably did not like to be lied to by Tesla.

F** around and find out

> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.

> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.

https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

jayess · 4h ago
"Schreiber acknowledged that the driver, George McGee, was negligent when he blew through flashing lights, a stop sign and a T-intersection at 62 miles an hour before slamming into a Chevrolet Tahoe that the couple had parked to get a look at the stars."
BonoboIO · 22m ago
Does that change the shitty behavior of Tesla?
ra7 · 4h ago
> “Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” Brett Schreiber, counsel for the plaintiffs, said in an e-mailed statement on Friday.

Tesla had every opportunity to do the safe thing and limit Autopilot to only highways, but they want the hype of a product that "works anywhere" and don't want to be seen as offering a "limited" product like their competitors. Then they are surprised their users misuse it and blame it all on the driver. Tesla wants to have their cake and eat it too.

You know what else prevents misuse? Implementing safeguards for your safety critical product. It took multiple NHTSA probes for Tesla to implement obvious driver monitoring improvements to Autopilot/FSD. There's a reason Tesla is always in hot water: they simply lack the safety culture to do the right thing proactively.

jayess · 5h ago
Good lord, $129 million in compensatory damages is absurd.

One thing you notice when visiting Florida is that every other billboard says "In a wreck? Get a check! Call Lawyer Ambulance Chaser Jones!!"

metabagel · 5h ago
How much is your life worth?
knodi123 · 4h ago
to me? An infinite amount! So trying to measure it in dollar terms is already a pointless exercise.

If you want to see the hole in this argument, try looking at it from the other side: how can you arrive at the conclusion that a life is worth only $329M? Why so low? Why not a billion, or a trillion?

If you have to come up with an objective and dispassionate standard, then I doubt you'd find one where the number it generates is $329 million.

metabagel · 4h ago
> So trying to measure it in dollar terms is already a pointless exercise.

That strikes me as an argument for a judgement of zero dollars, which is the exact opposite of the reality that life is worth infinitely more than money.

SoftTalker · 5h ago
What is a reasonable forecast of your personal balance sheet at the time of your natural death?

Of course human life itself doesn't have a dollar value, but that also means that money cannot replace a lost life.

metabagel · 4h ago
> money cannot replace a lost life

So, money is basically worthless compared to life. That's why there are huge judgements when someone is found at fault for the death of another.

I think Tesla's apparent carelessness is a factor in this judgement as well.

jayess · 4h ago
Probably a few million, tops.
Ekaros · 5h ago
Only 0.03% of market cap... Seems pretty cheap actually.