Tesla said it didn't have key data in a fatal crash. Then a hacker found it

270 clcaev 143 8/29/2025, 11:15:39 AM washingtonpost.com ↗

Comments (143)

metaphor · 1h ago
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.

> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.

Wow...just wow.

ryandvm · 17m ago
It is wild to me that people put so much trust in this company.

Even if Tesla hadn't squandered its EV lead and was instead positioned to be a robotics and AI superpower, is this really the corporate behavior you would want? This is some fucking Aperture Science level corporate malfeasance.

flatline · 44s ago
It’s pretty typical of corporations, the cult surrounding its leader notwithstanding. Not even just US corporations - the VW emissions scandal was huge, and today they are doing as well as ever. That was a big shakeup; the kind of stuff we are seeing from Tesla feels like business as usual.
A4ET8a8uTh0_v2 · 1h ago
I am trying to imagine a scenario under which that is defensible and does not raise questions about compliance, legal exposure, and retention. Not to mention: who were the people who put that code into production knowing it would do that?

edit: My point is that it would not have been one lone actor who made that change.

colejohnson66 · 1h ago
Assuming no malice, I'd guess it's for saving space in the car's internal memory. If the data has been uploaded off the car, there's no point keeping a copy on it.
const_cast · 48s ago
Dude we're at the point where cars are practically gathering data on the size of your big toe.

The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000x more data than we need. And that's not even an exaggeration.

giancarlostoro · 12m ago
As a developer, your answer seems the most logical to me: we often miss simple things, the PM overlooks it, and so it goes into production this way. I don't think it's malicious. Sometimes bugs just don't become obvious until things break. Sooner or later we have all found an unintended consequence of code that had nothing technically wrong with it.
wat10000 · 57m ago
Sounds like a pretty standard telemetry upload. You transmit it, keep your copy until you get acknowledgement that it was received so you can retry if it went wrong, then delete it when it succeeds.

It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.

buran77 · 48m ago
The process of collecting and uploading the data probably confuses a lot of non-technical readers even if it worked as per standard industry practices.

The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.

giancarlostoro · 9m ago
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

My money is on nobody having built a tool to look up the data: they have it, they just can't easily find it.

wat10000 · 22m ago
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted.

Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.

tobias3 · 51m ago
Sketchy is that then someone takes “affirmative action to delete” the data on the server as well.

Also this is not like some process crash dump where the computer keeps running after one process crashed.

This would be like a plane's black box uploading its data to the manufacturer, then deleting itself, after a plane crash.

wat10000 · 48m ago
I’ll bet another ten bucks that this is a generic implementation for all of their telemetry, not something special cased for crashes.

Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.

dylan604 · 11m ago
Not handling an automobile crash as a special case is the weird part. Even in the <$50 dashcams from Amazon there is a feature to mark a recording as locked so the auto-delete logic does not touch the locked file. Some of them even have automatic collision detection which locks the file for you.

How Tesla could say it's normal to detect a collision and not lock any of the data is just insane.

joshcryer · 39m ago
The problem with this is that it destroys any chain of evidence. Tesla "lost" this data, in fact. You would never want the "black box" in your car to delete itself after uploading to some service, because the service could go down, be hacked, or the provider could decide to withhold it, forcing you into a lengthy discovery/custody battle.

This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.

This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.

alistairSH · 40m ago
That might be the case, but the article seems to indicate the system knew the data was generated from an accident. So removing it to save space on the car should have been a secondary concern.
aredox · 39m ago
It is a car. A vehicle which can be involved in a fatal accident. It is not a website. There is no "oversight", nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
wat10000 · 33m ago
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.

buran77 · 18m ago
> their software is built by software people rather than by car people

The rogue engineer defense worked so well for VW and Dieselgate.

The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.

OutOfHere · 47m ago
That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.
phkahler · 14m ago
>> That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.

I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.

giancarlostoro · 12m ago
I don't think it's wrong. Have you ever pushed code that was technically correct, only to find months later that you, your PM, their manager, their boss's boss, etc. all missed one edge case? You're telling me no software developer has ever done this?
jeffbee · 1h ago
The artifact in question was a temporary archive created for upload. I can't think of a scenario in which you would not unlink it.
giancarlostoro · 15m ago
You were right in your first statement, but your follow-up is a bad assumption; I think everyone here will agree that in the case of a crash this data should be more easily available, and not deleted.

Assuming it's not intentionally malicious, this is a really dumb bug that I could have also written. You zip up a bunch of data, and then you realize that if you don't delete things you've uploaded you will fill up all available storage. So what do you do? You auto-delete anything that successfully makes it to the back-end server and mark the bug fixed, not realizing that you overlooked crash data as something you might want to keep.

I could 100% see this being what is happening.

JumpCrisscross · 54m ago
And then you delete the server copy?
semiquaver · 13m ago
They didn’t delete the server copy though. That’s what this article is about.

  > Tesla later said in court that it had the data on its own servers all along
jeffbee · 48m ago
Obviously no. The behavior of Tesla in discovery of this case is ridiculous. But treating this technical detail as an element of conspiracy is also ridiculous.
actionfromafar · 17m ago
If that was the only thing going wrong, yes. But when you have a pattern of conspiracy, deleting immediately on the client instead of having a ring buffer which ages out the oldest event, may be a malicious choice.
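
The ring-buffer alternative suggested here is nearly free to implement (a generic sketch using Python's standard library, nothing Tesla-specific):

```python
from collections import deque

class EventRing:
    """Bounded local event log: nothing is deleted at upload time; the
    oldest entry ages out only when the buffer is full and a newer
    event arrives."""
    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)

    def record(self, event):
        self._buf.append(event)  # evicts the oldest entry once full

    def events(self):
        return list(self._buf)
```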
jeffbee · 11m ago
I haven't seen anything in the (characteristically terrible and vague) coverage of this case that suggests the Tesla deleted the EDR.
constantly · 52m ago
> I can't think of a scenario in which you would not unlink it.

Perhaps if there is some sort of crash.

artursapek · 27m ago
Exactly. That's the last data I would ever delete from the car, if I was trying to preserve valuable data.
alias_neo · 13m ago
All of their actions point at intentionally wanting that data to disappear; they even suggested turning it on and updating it, which everyone who's ever tried to protect important information on a computer knows is the exact opposite of what you should do.

Any competent engineer who puts more than 3 seconds of thought into the design of that system would conclude that crash data is critical evidence and as many steps as possible should be taken to ensure it's retained with additional fail safes.

I refuse to believe Tesla's engineers aren't at least competent, so this must have been done intentionally.

jeffbee · 15m ago
What if you were the guy who got a ticket that just said "implement telemetry upload via HTTP"?

Which of these is evidence of a conspiracy:

  tar cf - data/ | curl -T - $URL
  TMPFILE=$(mktemp) ; tar cf $TMPFILE data/ ; curl -T $TMPFILE $URL ; rm $TMPFILE
alias_neo · 4m ago
That's reductive.

The requirements should have been clear that crash data isn't just "implement telemetry upload", a "collision snapshot" is quite clearly something that could be used as evidence in a potentially serious incident.

Unless your entire engineering process was geared towards collecting as much data that can help you, and as little data as can be used against you, you'd handle this like the crown jewels.

Also, to nit-pick, the article says the automated response "marked" the data for deletion, which means it's not deleted immediately, unlike your reductive example, which doesn't even verify the upload succeeded (at least && the last rm).

fny · 41m ago
You left out the worst part:

> someone at Tesla probably took “affirmative action to delete” the copy of the data on the company’s central database, too

1vuio0pswjnm7 · 6m ago
The top comment on the HN front-page story about this crash a few weeks ago claimed the damages award was too high.

Maybe this thread will be different

raincole · 46m ago
The 'wow' part is that they deleted data from server. The part you quoted sounds like nothing unusual to me.
lexicality · 38m ago
You don't think it's unusual that the software is designed to delete crash data from the crashed car?
Thorrez · 22m ago
The question is whether this is code that's special for crashes, or code that runs the exact same way for all data uploads, regardless of whether there's a crash.

You're implying it's special for crashes, but we don't know that.

dylan604 · 8m ago
You have it backwards. The issue is that, even after the special condition of a crash, the data was still allowed to be deleted. Sure, deleting normal data is fine, but that it clearly detected a crash and did not mark the file as do-not-delete in the special crash mode is mind-boggling. Everyone knows that data captured in crash-detection mode is very important. Not having code to ensure its retention is lazy at best, malevolent design at worst. And Tesla and its leadership haven't earned "at best" as our default assumption.
lexicality · 13m ago
The crash system uses this code, therefore they chose to do something that would delete the crash data after a crash.

Saying "hey, the upload_and_delete function is used in loads of places!" doesn't free you of the responsibility that you used that function in the crash handler.

Thorrez · 11m ago
Is this a crash handler, or is it their normal telemetry upload loop?
lexicality · 8m ago
Yes, it's a crash handler that uploads a blackbox "collision snapshot" of the entire car's state leading up to a crash. It's very well documented that Tesla does this, including in the article.
phkahler · 9m ago
>> You don't think it's unusual that the software is designed to delete crash data from the crashed car?

After it confirmed upload to the server? What if it was a minor collision? The car may be back on the road the same day, or get repaired and on the road next week. How long should it retain data (that is not legally required to be logged) that has already been archived, and how big does the buffer need to be?

lexicality · 2m ago
A very simple answer is "until the next time the car crashes", you just replace the previous crash data with the new data.

If the car requires that a certain amount of storage is always available to write crash data to, then it doesn't matter what's in that particular area of storage. That reserved storage is always going to be unavailable for other general use.

foobarian · 30m ago
Think of it as the scripts that run on CI/CD actions running unit tests. If a unit test fails, the test artifacts are uploaded to an artifact repository, and then, get this - the test runner instance is destroyed! But we don't think of that as unusual or nefarious.
smallpipe · 26m ago
No one dies when your unit test fails. Different stakes, different practices, what are all the Tesla apologists smoking here?
Ambroisie · 25m ago
I don't think you can equate CI/CD unit tests and killing humans with 2 tons of metal.
lexicality · 27m ago
That's because typically the test runner hasn't just crashed into another test runner at full highway speed
fabian2k · 1h ago
Do I understand it correctly? Crash data gets automatically transmitted to Tesla, and after it was transmitted is immediately marked for deletion?

If that is actually designed like this, the only reason I could see for it would be so that Tesla has sole access to the data and can decide whether to use it or not. Which really should not work in court, but it seems it has so far.

And of course I'd expect an audit trail for the deletion of crash data on Tesla servers. But who knows whether there actually isn't one, or nobody looked into it at all.

phkahler · 8m ago
>> Tesla has sole access to the data

All vehicle manufacturers have sole access to data. There isn't a standard for logging data, nor a standard for retrieving it. Some components log data that only the supplier has the means to read and interpret.

Someone · 54m ago
Another reason is if there’s other kinds of data that gets uploaded to Tesla, and the code for uploading crash data reuses that code.

For the first kind of data, deleting the data from the car the moment there's confirmation that it is now stored at Tesla can make perfect sense as a mechanism to prevent the car from running out of storage space.

Of course, if the car crashed, deleting the data isn't optimal, but the fact that it gets deleted may not be malice.

cube00 · 36m ago
Data retention is legal's bread and butter. There's no chance such a decision was accidentally made by reusing code.

Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.

Every byte that car records and how it is managed will be documented in excruciating detail by legal.

fabian2k · 46m ago
Deleting after a certain time makes sense, certainly. Deleting immediately seems dubious to me. Though the descriptions in the article are vague enough that we might be missing some big aspects.

But in the end we wouldn't be discussing this at all if Tesla had simply handed over the data from their servers. Whether they can't find it, it isn't actually there, or they deliberately removed it, each possibility affects how I view this process.

Two copies are better than one. If you immediately erase the data, you better be sure the transmitted data is safe and secure. And obviously it wasn't.

lgeorget · 48m ago
I guess one charitable way to look at it is that after a crash, external people could get access to the car and its memory, which could potentially expose private data about the owner/driver. And besides private data, if data about the car condition was leaked to the public, it could be made to say anything depending on who presents it and how, so it's safer for the investigation if only appointed experts in the field have access to it.

This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.

fabian2k · 44m ago
If Tesla securely stored this data and reliably turned it over to the authorities, I wouldn't argue much with this.

But the data was mostly unprotected on the devices, or it couldn't have been restored. And Tesla isn't exactly known for respecting the privacy of their customers, they have announced details about accidents publicly before.

And there is the potential conflict of interest, Tesla does have strong incentives to "lose" data that implicates Autopilot or FSD.

sanex · 35m ago
I would rather my cars not automatically rat me out to the authorities, personally.
souterrain · 2m ago
Your property isn't ratting you out. The software you license from Tesla is ratting you out.
gmd63 · 5m ago
I wouldn't want them to have selective memory in favor of juicing Elon's marketing scams either.
interactivecode · 17m ago
That's like worrying about external people having access to the driver's wallet in the case of a fatal crash. Sure, but it's more likely that Tesla is sketchy, considering their vested interest in controlling crash data reports.
clipclopflop · 1h ago
Years back I bought a Model 3 infotainment unit on eBay to hack on; the amount of data contained on them is absolutely insane. After gaining access to the system I was able to get the VIN of the car and find the salvage auction for the car it came out of: it had been wrecked. I was then able to get all the location data that gets logged, showing a glimpse of the previous owner's life (house, work, stores they went to, etc.) as well as the final resting place of the car. The last GPS locations logged were at the end of a "T" intersection in North Carolina; Google Street View gave a nice look at the trees the car most likely hit :>
foobarian · 28m ago
Neat! What's the hardware like, a Linux-ish computer with SD cards? Or SSD? Which filesystem?
gwbas1c · 21m ago
I suspect it's Windows, actually, and I'm pretty sure the UI is some form of C#.

They tried to recruit me for the UI. If I lived closer, I would have jumped on it. Not only was I bit of a Tesla fanboy at the time, I used to work across the street from their office and really liked that area. (Deer Creek Road in Palo Alto.)

dagmx · 5m ago
It’s Linux and the UI is Qt
normie3000 · 20m ago
> a glimpse of the previous owners life...

...and potentially death?

throw7 · 8m ago
After reading the article, I am never buying a Tesla.

Props to greenthehacker. May you sip Starbucks venti-size hot chocolates for many years to come.

voidUpdate · 1h ago
I'm still convinced that calling it "full self driving" is misleading marketing and really needs to stop, since by Tesla's own admission it isn't.
orlp · 1h ago
The marketing doesn't even matter. It either needs to be full self driving, or nothing at all. The "semi self-driving but you're still responsible when shit hits the fan" just doesn't work.

Humans are simply incapable of paying attention to a task for long periods if it doesn't involve some kind of interactive feedback. You can't ask someone to watch paint dry while simultaneously expect them to have < 0.5sec reaction time to a sudden impulse three hours into the drying process.

AuthorizedCust · 1h ago
I have a SAE level 2 car. Those features DO help!
tialaramex · 49m ago
Framing is crucial. Example: why was the Autonomous Emergency Braking configured to brake violently to a full stop? Let's consider two scenarios. In both cases we're not paying enough attention to the outside world and are about to strike a child on a bicycle, but the AEB policy varies.

1. AEB brakes violently to a full stop. We experience shock and dismay. What happened? Oh, a kid on a bike I didn't see. I nearly fucked up bad, good job AEB

2. AEB smoothly slows the vehicle to prevent striking the bicycle, we gradually become aware of the bike and believe we had always known it was there and our decision eliminated risk, why even bother with stupid computer systems?

Humans are really bad at accepting that they fucked up, if you give them an opportunity to re-frame their experience as "I'm great, nothing could have gone wrong" that's what they prefer, so, to deliver the effective safety improvements you need to be firm about what happened and why it worked out OK.

Jeremy1026 · 54m ago
Same. Not having to worry about keeping the car between the lines allows me to keep my focus on the other cars around me more. Offloading the cognitive load of fine tuning allows more dedication to the bigger picture.
AlexandrB · 25m ago
This makes no sense to me. Driving involves all senses, not just vision - if you're not feeling what the car is doing because you're not engaged with the steering wheel what good is it to see what's around you? I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.

Oh! And also, moving within the lane is sometimes important for getting a better look at what's up ahead or behind you or expressing car "body language" that allows others to know you're probably going to change lanes soon.

ghaff · 13m ago
I don't have personal experience, but friends who do have somewhat shifted my thinking on the topic. They note they do need to stay engaged, but that it is genuinely useful on long drives in particular. The control handover is definitely an issue; so is manual driving in general. Their consensus is that the current state of the art is by no means perfect, but it is an improvement, and it's not like there aren't problems with existing manual driving even with some assistive systems.
JumpCrisscross · 52m ago
If you live in a city, please send this article to your municipal and state electeds. Tesla is lobbying for the right to train and activate its Level 4 product, marketed as Level 5, in cities where Musk is deeply unpopular. There is massive political capital to be had in banning Tesla’s self-driving features on even the flimsiest grounds.
terminalshort · 49m ago
I would rather take a bullet than be a luddite who gets in the way of technological advancement on "the flimsiest of grounds."
JumpCrisscross · 44m ago
> be a luddite who gets in the way of technological advancement on "the flimsiest of grounds”

Blocking a technology is Luddism. Blocking a company is politics.

moduspol · 51m ago
I’m less convinced we need to keep bringing this up in every single thread involving Tesla.
acdha · 1h ago
Why do you think Musk put so much money into helping Trump win? Tesla was under multiple investigations for safety and unkept promises, and he knew that he would not have leverage to halt those under a Harris administration.
thowaway52729 · 1h ago
If that was his goal he would have minded his own business after the election, instead of spouting invective posts against Trump on X.
Maken · 1h ago
That was after Musk realized he had alienated his entire consumer base.
thowaway52729 · 1h ago
And he wants to bring them back by alienating Trump while doubling down on his rhetoric?
delfinom · 44m ago
He has an ego and narcissism, but he isn't dumb. He sees the problems but also can't admit he's wrong or anything.
antonvs · 42m ago
That happened much earlier. The split with Trump happened after it finally sank in that Republicans weren't actually interested in smaller government or cost savings; that was just a rhetorical weapon they deploy selectively to get elected.
rsynnott · 20m ago
I mean, if he was rational, sure, that's probably what he should have done. But, y'know, he clearly _isn't_.
lenkite · 49m ago
He was under the mistaken assumption, based on campaign speeches, that Trump cared about the national deficit. Once he realized that Trump really didn't care two hoots about it and only planned to increase it even more, he had a late case of buyer's remorse.
myrmidon · 1h ago
I'm absolutely not a fan of Trump, but this is a highly questionable assumption.

The much more likely hypothesis in my view is that he was helping Trump because of personal conviction (only in small parts motivated by naked self-interest).

You should expect rational billionaires to tend politically right out of pure self-interest and distorted perspective alone; because the universal thing that such parties reliably do when in power is cutting tax burden on the top end.

blizdiddy · 1h ago
That’s insane. Do you remember DOGE or Elon taking his cronies into the same departments investigating him? Do you even remember?
myrmidon · 21m ago
What would Elon even be in court for? Being a politically incorrect dumbass on ex-twitter is not punishable by law.

Sending a bunch of script kiddies around to cut government funding and gut agencies is not really how you make evidence "vanish"; how would that even work?

And, lastly, jumping in front of an audience at every opportunity and running your mouth is the absolute last thing anyone would do if the goal was to avoid prosecution. But it is perfectly in line with a person who has a very big ego and wants to achieve political goals.

blizdiddy · 4m ago
Labor violations, taxes, the National Highway Traffic Safety Administration investigating Tesla... are you willfully ignorant or a troll?
antonvs · 1h ago
Musk is on record saying to Tucker Carlson that “If [Trump] loses, I’m fucked.”

So this isn't so much of an assumption, as taking him at his word.

myrmidon · 15m ago
All the context I have for this is that he was grandstanding in front of a rightwing audience (after Trump was shot at, notably) and playing the "surely I would get unjustly prosecuted for my political incorrectness under the democrats".

What is your actual point? What would he stand in front of a judge for, right now, if Harris had won?

unnamed76ri · 57m ago
The Left was coming after Musk pretty hard before the election. I don’t know the context of the quote you pulled but it’s not hard to see how if Trump lost, there was going to be consequences for Musk.
Swenrekcah · 48m ago
He has committed a lot of fraud and was facing consequences for that. That has nothing to do with left or right.
unnamed76ri · 36m ago
Fraud has nothing to do with vandalizing Tesla dealerships last I checked.
Swenrekcah · 29m ago
You are right, it doesn't. That is (wrongly) done by people who are (rightly) mad at him for making American life harder and global life more dangerous in a self-serving attempt to evade the justice system.
terminalshort · 26m ago
Can you give an example of these many instances of fraud?
Swenrekcah · 4m ago
For instance he has made fraudulent statements regarding the current and near future capabilities of Tesla in an effort to inflate stock prices numerous times. He was in fact ordered by a judge to stop making such statements but he didn’t obey that.

He used to be quite charismatic, I believed him up until about 2017 or so. Then I figured he was just a bit greedy and maybe money got to his head but still a respectable innovator. However during 2020 or 2021 (I don’t exactly remember) he started to get quite unpleasant and making obviously short-term decisions, such as relying only on cameras for self driving because of chip shortages but dressing it up as an engineering decision.

scott_w · 21m ago
We can start from the linked article?
terminalshort · 16m ago
Yeah, that's where I started, and I would recommend you do the same:

> U.S. District Judge Beth Bloom, who presided over the case, said in an order that she did not find “sufficient evidence” that Tesla’s failure to initially produce the data was intentional.

walls · 42m ago
It does actually, because only one side is interested in finding or fighting fraud.
Swenrekcah · 31m ago
Currently yes, but it is not inherently so. The problem with the US regime is that it is compromised, corrupt and heading towards fascism.

The problem is not that the republican party used to be a conservative right party.

What I’m saying is this is not a sports competition where Musk is automatically an opponent of the Democratic party because he supported Trump. He supported Trump in order to improve his chances with the legal system because he knew Trump would be willing to be so corrupt.

Another world might be imagined in which the Democratic party was taken over in 2016 but that is not the world we live in.

terminalshort · 23m ago
Both are interested in finding and fighting fraud, but only from the other side. Letitia James charged Trump with a rack of felonies for putting false info on a loan application. The Trump DOJ charged Letitia James for doing exactly the same. Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.
walls · 12m ago
> Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.

This is how conservatives keep people going 'both sides!' even though they manufacture whatever is required to be that way.

razemio · 52m ago
Every time this comes up, I am on the opposite side of this. It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, brake, and navigate on its own. There are various videos online where FSD managed to drive a route start to finish without a single human override. That's full self driving. It can also crash like humans "can", and that's why it needs supervision. In this sense, we as humans are also "full self driving", with a much (?) lower risk of crashing.

As always, let the downvotes rain. If you downvote, it would be nice if you could tell me where I am wrong. It might change my view on things.

JumpCrisscross · 49m ago
> It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, break and navigate on its own. That's full self driving

All this demonstrates is the term “full self driving” is meaningless.

Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.

If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for that liability.

[1] https://www.sae.org/blog/sae-j3016-update

terminalshort · 46m ago
If it's a meaningless term then it can't be misrepresenting to use it.
JumpCrisscross · 4m ago
> If it's a meaningless term then it can't be misrepresenting to use it

It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.

claw-el · 34m ago
The other confusion with self driving, for me: is the "self" the human or the car?

Self driving could totally mean the human's own self driving.

Having SAE level is clearer.

razemio · 39m ago
That's something different. The problem with the levels is that they only focus on the attention the human driver needs to give to the automation. In this sense my Kia EV6 is also Level 2/3, same as FSD. However FSD can do so much more than my Kia EV6. That's a fact. Still the same level. Where did Tesla say FSD is SAE Level 5 approved? They would be responsible every time FSD is active during a crash. Tesla is full self driving with Level 2/3 supervision, and in my opinion this is not misleading.

Also, "All this demonstrates is the term “full self driving” is meaningless" proves my point that it is not misleading.

JumpCrisscross · 30m ago
> FSD can do so much more than my Kia EV6. That's a fact. Still the same level

The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.

> Where did Tesla say FSD is SAE Level 5 approved?

They didn’t say that. They said it could do what a Level 5 self-driving car can do.

“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.

‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”

> Tesla is full self driving with Level 2/3 supervision and in my opinion this is not missleading

This is tautology. You’re defining FSD to mean whatever Tesla FSD can do.

[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...

razemio · 17m ago
How would you name a system which can do everything a Level 5 system can, but with Level 2/3 supervision? A name a PR team would choose, without the misleading part, as you put it.
JumpCrisscross · 10m ago
> How would you name a system which can do everything a Level 5 system can, but with Level 2/3 supervision?

FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.

But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)

terminalshort · 1h ago
I see this brought up a lot, but I don't think it's really an issue. It's misleading in a very technical sense, but it's so misleading that nobody is misled. Just like nobody thinks the "Magic Eraser" is actually magic. I fundamentally just don't think anybody is out there actually believing this thing is L5 full self driving, especially after all the warnings it shows you and the disclaimers when you buy it.

The problem here isn't that people think they don't need to pay attention because their car can drive itself and then crash. The problem is that people who know full well that they need to focus on driving just don't because fundamentally the human brain isn't any good at paying attention 100% of the time in a situation where you can get away with not paying attention 99.9% of the time, and naming just can't solve this.

pu_pe · 59m ago
Volkswagen was caught cheating on its emission data and the CEO got fired, then prosecuted. Why shouldn't that be the case here?
rsynnott · 21m ago
The really weird thing about the diesel emissions scandal was that someone actually got in trouble for it. It is _rare_ for companies to be punished, particularly criminally, for that sort of thing.
bobmcnamara · 16m ago
Usually they'd get a DPA (deferred prosecution agreement).
JumpCrisscross · 55m ago
You’d need a coalition of Democratic attorneys general to bring a case in the mould of Big Tobacco.
MangoToupe · 35m ago
We'd need a third party if we actually wanted to fight American corporations. Unless you intended "small d" democratic.
dagmx · 4m ago
Good news, the CEO of this American corporation is making a third party… (the monkey paw curls)
metaphor · 1h ago
h1fra · 25m ago
The video is staggering: going super fast before an intersection, with no visibility, a blinking signal, and a clear stop sign in sight. I hope FSD got better.
whimsicalism · 15m ago
not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD, although both are definitely problematic
phkahler · 1m ago
>> not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD

And the distinction is what?

I'm not serious of course. There are huge swaths of the public whose eyes would glaze over if you tried to explain it, and that's my point.

gmd63 · 37m ago
None of this should be surprising to anyone who has given an ounce of effort to examining Elon's character.

Lies about capabilities, timelines, even things as frivolous as being rank one in a video game. He bought Twitter to scale his deception.

Noumenon72 · 16m ago
They should be saving every crash as a unit test to ensure it never happens again.
estearum · 1h ago
Surely this is the behavior of a company that's confident in the safety of its products!
05 · 46m ago
Don't worry, once Tesla figures out secure boot nobody will be able to call their bluff, and they'll be free to 'lose' crash data with the same impunity with which police lose their bodycam footage.
firesteelrain · 1h ago
If Tesla can’t ensure safeguarding of this information, it’s a feature that will get them in big trouble.
IlikeKitties · 5m ago
So, will tesla get nuked from orbit for what is obviously a serious, intentional and systemic discovery violation or is this just ok because it's a big corp?
aurizon · 24m ago
Can you imagine aircraft makers pulling this sort of black-box auto-delete? A red-handed catch!
OutOfHere · 43m ago
With everything that is wrong with Tesla, I'll be the first to say that all Tesla cars need to be taken off of the roads, at least until all of their auto-driving features have been fully removed.
chiffre01 · 1h ago
The description of the guy finding the data while at a Starbucks doesn't do justice to his setup shown in the photo. My dude has a seriously chaotic and awesome setup there.
jalk · 58m ago
I imagine he dumped the car data onto his laptop so that he could work on the problem in a cozier place than his messy bitcave.
ath3nd · 1h ago
Of course they will say they don't have the key data.

Do we expect them to admit they were outright lying and wrong considering their leader is a pill popping Nazi salute making workaholic known to abuse his workers?

rwmj · 1h ago
Lying to a court is usually pretty serious. Any sensible legal department will tell you never to do that, whatever your CEO says.
emsign · 1h ago
Unless you've paid off the president, they assume.
gruez · 20m ago
Did they have a falling out a few months ago?
psychoslave · 1h ago
You mean the other allegations against this same person would not be judged serious, and could even be recommended?
lawn · 1h ago
Usually.

But today you just have a private dinner with the president and he'll wave it away.