Tesla said it didn't have key data in a fatal crash, then a hacker found it

453 points by clcaev | 248 comments | 8/29/2025, 11:15:39 AM | washingtonpost.com ↗

Comments (248)

fabian2k · 3h ago
Do I understand it correctly? Crash data gets automatically transmitted to Tesla, and after it's transmitted, it's immediately marked for deletion?

If that is actually designed like this, the only reason I could see for it would be so that Tesla has sole access to the data and can decide whether to use it or not. Which really should not work in court, but it seems it has so far.

And of course I'd expect an audit trail for the deletion of crash data on Tesla servers. But who knows whether there actually isn't one, or nobody looked into it at all.

phkahler · 2h ago
>> Tesla has sole access to the data

All vehicle manufacturers have sole access to their data. There isn't a standard for logging data, nor a standard for retrieving it. Some components log data, and only the supplier has the means to read and interpret it.

dghlsakjg · 1h ago
Mostly incorrect. At least for the US.

If your car has an EDR, what data it collects is legislated. There is not a standard interface for retrieving it, but the manufacturer is required to ensure that there is a commercially available tool for data retrieval that any third party can use.

https://www.ecfr.gov/current/title-49/subtitle-B/chapter-V/p...

onlyrealcuzzo · 41m ago
Does it legislate that you can't "accidentally" delete all incriminating data?
sidewndr46 · 40m ago
It looks like this covers "and an unloaded vehicle weight of 2,495 kg (5,500 pounds) or less". From what I understand even my F-150 wouldn't fall under this legislation
Aurornis · 1m ago
Unloaded vehicle weight, not gross vehicle weight.

From a quick search, it's technically possible to configure some model year F-150s to have a curb weight over 5,500 pounds with all the right options, but most are lower.

jayd16 · 31m ago
Might not cover large trucks but most sedans are under that.

Is this one of those "that's why big cars are cheaper to make" situations?

__alexs · 1h ago
There is a world of difference between "you need our special hardware and software to read the data" and "we deleted it lol".
lgeorget · 2h ago
I guess one charitable way to look at it is that after a crash, external people could get access to the car and its memory, which could potentially expose private data about the owner/driver. And besides private data, if data about the car condition was leaked to the public, it could be made to say anything depending on who presents it and how, so it's safer for the investigation if only appointed experts in the field have access to it.

This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.

fabian2k · 2h ago
If Tesla securely stored this data and reliably turned it over to the authorities, I wouldn't argue much with this.

But the data was mostly unprotected on the devices, or it couldn't have been recovered. And Tesla isn't exactly known for respecting the privacy of their customers; they have announced details about accidents publicly before.

And there is the potential conflict of interest, Tesla does have strong incentives to "lose" data that implicates Autopilot or FSD.

sanex · 2h ago
I would rather my cars not automatically rat me out to the authorities, personally.
gmd63 · 2h ago
I wouldn't want them to have selective memory in favor of juicing Elon's marketing scams either.
souterrain · 2h ago
Your property isn't ratting you out. The software you license from Tesla is ratting you out.
salawat · 1h ago
Such a pity there is no way to get a minimal-electronics car control unit. Funny how that functionality stays so conspicuously unimplemented.
interactivecode · 2h ago
That's like worrying about external people having access to the driver's wallet in the case of a fatal crash. Sure, maybe, but it's more likely that Tesla is being sketchy, considering their vested interest in controlling crash data reports.
ChrisMarshallNY · 1h ago
It's probably a bit like "This call may be recorded for quality purposes." That's a disclaimer that's usually required by the authorities, to let you know that you're being recorded, but it lets them off the hook, if the recording would be inconvenient to them. If it supports their side, they 100% always have it, but if it supports the caller's side, then it seems they didn't actually record that call ...so sorry...

Tesla's fairly notorious for casual treatment of customer car data (which they have a lot of). There was an article, recently, about how in-car video recordings were being passed around the office.

I know that at least one porn actress recorded a scene in a self-driving Tesla. I'll bet that recording made the rounds "for quality purposes."

HillRat · 24m ago
As an FYI that might be helpful to some, in the case of sales, there's a positive legal obligation to maintain call recordings, so in the event of a courtroom dispute the failure to produce can lead to an adverse inference instruction.
criddell · 1h ago
> "This call may be recorded for quality purposes."

It's a disclaimer, but it also grants permission for you to record.

ChrisMarshallNY · 1h ago
This is true.

I knew a guy who used to record all his calls with companies, and would let them know they were being recorded, if they didn't have that disclaimer.

He would say "This call is being recorded." He told me that most of the companies hung up immediately, when he said that.

I never heard him say that his recording ever did him any good, though.

Someone · 2h ago
Another reason is if there’s other kinds of data that gets uploaded to Tesla, and the code for uploading crash data reuses that code.

For the first kind of data, deleting it from the car the moment there's confirmation that it is now stored at Tesla can make perfect sense as a mechanism to prevent the car from running out of storage space.

Of course, if the car crashed, deleting the data isn't optimal, but the fact that it gets deleted may not be malice.

cube00 · 2h ago
Data retention is legal's bread and butter. There's no chance such a decision is accidentally made by reusing code.

Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.

Every byte that car records and how it is managed will be documented in excruciating detail by legal.

Someone · 1h ago
> Data retention is legal's bread and butter.

As is deleting data. Also, for, say, training data for Tesla's software, I don't see legal requirements for keeping it around.

> There's no chance such a decision is accidently made by reusing code.

At Tesla? I know next to nothing about their software development practices, but from them, it wouldn't surprise me at all if this were accidental.

Edit: one scenario to easily introduce this bug is if the “delete data after upload” feature were added after the “on a crash, upload all data you have, in case the car burns down” feature.

Retric · 55m ago
> I don’t see legal requirements for keeping it around,

If you selectively delete data, courts can assume that data is the worst possible thing for a court case against you.

SoftTalker · 1h ago
Agreed. Tesla axed their marketing department, why assume they have much of a legal department overseeing how the data uploads are managed?
sidewndr46 · 27m ago
Not sure where you've worked, but the "data retention policy" at places I worked made it abundantly clear that we were not to retain any data unless personally ordered to by a court. If a line manager, C-level executive, or board member asked me to retain data, I could refuse under the policy.

Like many things, the retention policy was actually a destruction policy

mattmcknight · 1h ago
> Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.

In my experience, they are setting automated 90-day deletion policies on email so they don't end up with surprises in discovery.

Someone · 1h ago
Many large companies nowadays have 90 day deletion policies.
fabian2k · 2h ago
Deleting after a certain time makes sense, certainly. Deleting immediately seems dubious to me. Though the descriptions in the article are vague enough that we might be missing some big aspects.

But in the end we wouldn't be discussing this at all if Tesla had simply handed over the data from their servers. Whether they can't find it, it isn't actually there, or they deliberately removed it, each possibility affects how I view this process.

Two copies are better than one. If you immediately erase the data, you better be sure the transmitted data is safe and secure. And obviously it wasn't.

FireBeyond · 13m ago
Absolutely so.

I don't know how accurate it is right now, but previously, people have had to sue Tesla to get telemetry data from their own vehicle, not to use against Tesla, but to use in accident lawsuits against other parties.

Meanwhile, without your consent, Tesla will hold press conferences using your telemetry data to throw you under the bus (even deceptively) to defend themselves. "The vehicle had told the driver to pay attention!" NHTSA, four months later: "The vehicle had issued one inattention alert, eighteen minutes prior to the collision." (emphasis mine)

clipclopflop · 3h ago
Years back I bought a Model 3 infotainment unit on eBay to hack on. It's absolutely insane the amount of data contained on them. After gaining access to the system I was able to get the VIN of the car and find the salvage auction for the car it came out of; it had been wrecked. I was then able to get all the location data that gets logged, showing a glimpse of the previous owner's life (house, work, stores they went to, etc.) as well as the final resting place of the car. The last GPS locations logged were at the end of a "T" intersection in North Carolina; Google Street View gave a nice look at the trees the car most likely hit :>
foobarian · 2h ago
Neat! What's the hardware like, a Linux-ish computer with SD cards? Or SSD? Which filesystem?
clipclopflop · 1h ago
HW-wise, the older units were Intel Atom-based (the latest gen is AMD, I believe?). The hardware is typical embedded stuff: CPU + eMMC + BT/WiFi MCU + cellular daughter card. The OS is Linux + Qt UI stuff. I would expect things have changed for newer HW revisions, but the previous gen did not use encryption (dm-crypt), so all data was unprotected at rest.
clipclopflop · 1h ago
Following up on this: the actual 'self driving' part of the HW stack is an entirely separate board with 2x custom ARM chips on it. That HW/SW is much more locked down, and the OS/data is not accessible. I believe a lot of the self-driving info gleaned by types like green was built up from the first generation of Model S cars, where the 'self driving' HW was much less defended and it was much easier to gain access to it.
gwbas1c · 2h ago
I suspect it's Windows, actually, and I'm pretty sure the UI is some form of C#.

They tried to recruit me for the UI. If I lived closer, I would have jumped on it. Not only was I bit of a Tesla fanboy at the time, I used to work across the street from their office and really liked that area. (Deer Creek Road in Palo Alto.)

iamjake648 · 1h ago
If this were true, Tesla wouldn't have the only usable car infotainment system in the industry.
dagmx · 2h ago
It’s Linux and the UI is Qt
normie3000 · 2h ago
> a glimpse of the previous owner's life...

...and potentially death?

clipclopflop · 1h ago
I tried to dig up news articles in the area and could not find any reported fatalities - but yea, maybe?
breadwinner · 2h ago
> In the annotated video played for the jury, the vehicle detects a vehicle about 170 feet away. A subsequent frame shows it detecting a pedestrian about 116 feet away. As McGee hurtles closer and closer, the video shows the Tesla planning a path through Angulo’s truck, right where he and his girlfriend were standing behind signs and reflectors highlighting the end of the road.

So the Tesla detected the vehicle and the pedestrian, and then plans a path through them? Wow! How bad is this software?

skeezyboy · 1h ago
AI is so unlike anything weve ever seen and its going to revolutionise the world and its literally gna be skynet except it pathfinds like a counterstrike bot just ignore that bit
floren · 1m ago
we're still so early!
xgulfie · 43m ago
Just 2 more years bro just 2 more years and we'll have self driving cars working trust me bro
AndrewKemendo · 22m ago
https://waymo.com/rides/san-francisco/

You can take a Waymo any time of day in SF and they provide 1000s of successful rides daily

skeezyboy · 9m ago
And they've had to spend how many man-hours engineering around shit like the above?
vilhelm_s · 16m ago
I'm guessing they mean it detected a different vehicle and pedestrian but not the ones it hit. (If it was the victim I don't think they would have said "a".)
metaphor · 3h ago
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.

> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.

Wow...just wow.

ryandvm · 2h ago
It is wild to me that people put so much trust in this company.

Even if Tesla hadn't squandered its EV lead and was instead positioned to be a robotics and AI superpower, is this really the corporate behavior you would want? This is some fucking Aperture Science level corporate malfeasance.

flatline · 2h ago
It’s pretty typical of corporations, the cult surrounding its leader notwithstanding. Not even just US corporations - the VW emissions scandal was huge, and today they are doing as well as ever. That was a big shakeup; the kind of stuff we are seeing from Tesla feels like business as usual.
watwut · 1h ago
VW emission scandal ended with actual judgement and two prison sentences.

Miles and miles different - they were not completely untouchable the way tesla and similar hot companies are.

estearum · 1h ago
No, it's not typical, because you don't see huge numbers of people defending VW's emissions fraud.
SoftTalker · 1h ago
I don't defend it but the specifics never bothered me. They cheated because their cars didn't meet new emissions standards. They were fine by the standards of the year before. So a bureaucracy just declared that a legal level of emissions was now illegal.

In my mind it's like suddenly declaring that blue cars are illegal, and they made a color-shifting car that is blue except when the authorities are looking at it.

It is wrong in the sense that it is normalization of deviance, however. We live in a society, and if we don't like a law or regulation the correct response is to get it legally changed, not to ignore it and cheat.

estearum · 1h ago
I didn't say you are defending it. I'm saying that "companies do bad things sometimes" is not a full description of the Tesla phenomenon that people take issue with.
LightBug1 · 1h ago
Nope - the VW episode was terrible, but they faced large fines, corrected course, and it's history. I'm still slightly squeamish about accepting them, but they've turned it around, and I think I read they have just overtaken Tesla in EV sales in Europe (a self-inflicted Musk wound, of course).

I see no course correction from Tesla. Just continuing and utter tripe from its CEO, team, and Musk-d-riders.

This is an on-going issue for them and, at this point, with no further change? I hope it drives them into the ground (Autopilot, natch).

fred_is_fred · 1h ago
You can actively criticize VW on the internet without an army of sycophants coming for you. The standard behavior of Tesla stans is that any problem with the vehicle is in fact your fault and only your fault because it would not be possible for Tesla to do something wrong. It is cult-like.
lotsofpulp · 1h ago
I just hate the corrupt laws mandating car dealerships.
A4ET8a8uTh0_v2 · 3h ago
I am trying to imagine a scenario under which that is defensible and does not raise various compliance, legal, and retention questions. Not to mention: who were the people who put that code into production knowing it would do that?

edit: My point is that it was not one lone actor, who would have made that change.

colejohnson66 · 3h ago
Assuming no malice, I'd guess it's for space saving on the car's internal memory. If the data was uploaded off of the car, there’s no point keeping it in the car.
giancarlostoro · 2h ago
I think your answer is the most logical to me as a developer: we often miss simple things, the PM overlooks it, and so it goes into production this way. I don't think it's malicious. Sometimes bugs just don't become obvious until things break. We have all, sooner or later, found an unintended consequence of code that had nothing technically wrong with it.
ajross · 32m ago
In point of fact eMMC wear failure was an actual bug in early Tesla MCUs. They were logging too much, so when the car reached (via routine use) a certain fill level the logging started running over the same storage again and again and the chips started failing.

It's very easy to imagine a response to this being (beyond "don't log so much") an audit layer to start automatically removing redundant data.

The externalities of the company are such that people want to ascribe malice, but this is a very routine kind of thing.

const_cast · 2h ago
Dude we're at the point where cars are practically gathering data on the size of your big toe.

The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000x more data than we need. And that's not even an exaggeration.

OutOfHere · 2h ago
That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.
phkahler · 2h ago
>> That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.

I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.

giancarlostoro · 2h ago
I don't think its wrong, have you ever pushed code that was technically correct, only to find months later that you, your PM, their manager, their boss' boss, etc all missed one edge case? You're telling me no software developer has ever done this?
buran77 · 1h ago
You discover it the day a person dies and the relevant data is not there. Next time, it's no longer a "missed edge case".
giancarlostoro · 1h ago
In a perfect world where developers are omnipresent and all-knowing, sure. This isn't a perfect world. Heck, how do you account for the developer who coded it leaving the company, and now that code has been untouched for half a decade, if not more, because nothing is seemingly wrong with it? What then? Who realizes it needs to be changed? Nobody. The number of obscure bugs I find in legacy code that stump even the most experienced maintainers never ends.
matthewdgreen · 25m ago
There have been dozens of government investigations and lawsuits around Tesla crashes over the past decade (more likely hundreds or thousands, I'm just thinking of the ones that received significant national press and that I happened to notice.) In each of these cases, Tesla's data retention was questioned, sometimes by regulators and sometimes as a major legal question in the case. There is no way in 2025 that the retention process around crash data is some niche area of Tesla's code that the business leaders haven't thought about extremely carefully.

This is like saying "maybe nobody has recently looked at the ad-selection mechanism at Google." That's just not plausible.

OutOfHere · 1h ago
It's not an edge case; it's wanton criminal sabotage, destruction of evidence, and it deserves a prison sentence for anyone facilitating it at any level.
giancarlostoro · 1h ago
This is assuming malice out of the gate without any evidence, which is not what we do here on HN. If this is in fact maliciously done, please provide evidence.
wat10000 · 3h ago
Sounds like a pretty standard telemetry upload. You transmit it, keep your copy until you get acknowledgement that it was received so you can retry if it went wrong, then delete it when it succeeds.

It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
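That generic pattern can be sketched in a few lines of Python. This is a hypothetical illustration of the upload-ack-delete flow described above, not Tesla's actual code; `upload_then_unlink` and the `send` callback are invented names:

```python
import os


def upload_then_unlink(path: str, send) -> bool:
    """Generic telemetry pattern: transmit, wait for the server's
    acknowledgement, and only then delete the local copy. `send` is any
    callable that returns True once the upload is acknowledged."""
    with open(path, "rb") as f:
        payload = f.read()
    if send(payload):
        os.unlink(path)  # POSIX unlink(2): remove the now-redundant local copy
        return True
    return False  # no ack: keep the file so a retry is possible
```

On success the file is gone from the car, which is exactly the behavior the court filings describe; nothing about the pattern itself is crash-specific.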

buran77 · 2h ago
The process of collecting and uploading the data probably confuses a lot of non-technical readers even if it worked as per standard industry practices.

The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.

wat10000 · 2h ago
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted.

Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.

giancarlostoro · 2h ago
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

My money is on nobody built a tool to look up the data, so they have it, they just can't easily find it.

No comments yet

tobias3 · 2h ago
Sketchy is that then someone takes “affirmative action to delete” the data on the server as well.

Also, this is not like some process crash dump, where the computer keeps running after one process crashed.

This would be like a plane's black box uploading its data to the manufacturer, then deleting itself after a plane crash.

wat10000 · 2h ago
I’ll bet another ten bucks that this is a generic implementation for all of their telemetry, not something special cased for crashes.

Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.

dylan604 · 2h ago
Handling an automobile crash not as a special case is the weird part. Even the <$50 dashcams on Amazon have a feature to mark a recording as locked so the auto-delete logic does not touch the locked file. Some of them even have automatic collision detection, which locks the file for you.

How Tesla could say it is normal to detect a collision and not lock any of the data is just insane.
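The dashcam policy described above fits in a few lines. This is a minimal sketch; the class and method names are made up for illustration, not any vendor's actual API:

```python
class DashcamStore:
    """Dashcam-style retention: files flagged as locked (e.g. by a
    collision-detection trigger) survive the auto-delete sweep."""

    def __init__(self):
        self.files = []      # recordings, oldest first
        self.locked = set()  # do-not-delete flags

    def record(self, name: str) -> None:
        self.files.append(name)

    def lock(self, name: str) -> None:
        # Collision detected: mark this recording as exempt from cleanup.
        self.locked.add(name)

    def sweep(self, keep_latest: int) -> None:
        """Auto-delete pass: drop old recordings except locked ones,
        always keeping the newest `keep_latest` files."""
        old, recent = self.files[:-keep_latest], self.files[-keep_latest:]
        self.files = [f for f in old if f in self.locked] + recent
```

The point of the sketch: the "special case" is one flag checked in one place, which is why its absence in a crash-handling path is hard to excuse.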

immibis · 1h ago
That one's easy: nobody at Tesla cares about having this feature
pjob · 55m ago
That might not be a good bet. https://news.ycombinator.com/item?id=45063380
alistairSH · 2h ago
That might be the case but the article seems to indicate the system knew the data was generated from an accident. So, removing to save space on the car should now be a secondary concern.
joshcryer · 2h ago
The problem with this is that it destroys any chain of evidence. Tesla "lost" this data, in fact. You would never want the "black box" in your car to delete itself after uploading to some service, because the service could go down, be hacked, or the provider could decide to withhold the data, forcing you into a lengthy discovery/custody battle.

This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.

This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.

aredox · 2h ago
It is a car. A vehicle which can be involved in a fatal accident. It is not a website. There is no "oversight", nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
wat10000 · 2h ago
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.

throwway120385 · 1h ago
I'm a software person, but I still take the car-person approach when I know I'm building a car. You have a responsibility to understand the gravity of the enterprise you undertake and to take appropriate steps given that gravity. Ignorance shouldn't be a defense, and if you don't know what you don't know, then God help you.
buran77 · 2h ago
> their software is built by software people rather than by car people

The rogue engineer defense worked so well for VW and Dieselgate.

The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.

const_cast · 2h ago
There are software people who know what they're doing - some write flight software or medical equipment software. They know how to critically think about the processes of their systems in detail.

So either the problem is Tesla engineers are fucking stupid (doubtful) or this is a poor business/product design.

My money is on the latter.

kergonath · 1h ago
> One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

So we just shrug because software boys gotta be software boys? That’s completely insane and a big reason why a lot of engineers roll their eyes about developers who want to be considered engineers.

Software engineers who work on projects that can kill people must act like the lives of other people depend on them doing their job seriously, because that is the case. Look at the aviation industry. Is it acceptable to have a bug in the avionics suite down planes at random and then delete the black boxes? It absolutely is not, and when anything like that happens shit gets serious (think 737 MAX).

The developers who designed the systems are responsible, and so are their managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.

jeffbee · 3h ago
The artifact in question was a temporary archive created for upload. I can't think of a scenario in which you would not unlink it.
giancarlostoro · 2h ago
You were right in your first statement, but your follow-up is a bad assumption. I think everyone here will agree that in the case of a crash this data should be more easily available, not deleted.

Assuming it's not intentionally malicious, this is a really dumb bug that I could have also written. You zip up a bunch of data, and then you realize that if you don't delete things you've uploaded you will fill up all available storage. So what do you do? You auto-delete anything that successfully makes it to the back-end server, and you mark the bug fixed, not realizing that you overlooked crash data as something you might want to keep.

I could 100% see this being what is happening.
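That hypothetical oversight, and how small its fix would be, can be sketched like this. Everything here is invented for illustration (including the assumption that crash snapshots are distinguishable by filename); it is not a claim about Tesla's actual code:

```python
import os


def cleanup_after_upload(path: str, acked: bool) -> None:
    """The space-saving rule as described: once the server has
    acknowledged the upload, delete the local copy. No exceptions."""
    if acked:
        os.unlink(path)


def cleanup_after_upload_fixed(path: str, acked: bool) -> None:
    """The one-line policy the hypothetical bug missed: crash snapshots
    are potential evidence, so exempt them from space-saving deletion."""
    if acked and "collision" not in os.path.basename(path):
        os.unlink(path)
```

If the bug really was of this shape, the missing special case is a single condition, which is the crux of the "dumb bug vs. design choice" argument in this thread.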

JumpCrisscross · 2h ago
And then you delete the server copy?
semiquaver · 2h ago
They didn’t delete the server copy though. That’s what this article is about.

  > Tesla later said in court that it had the data on its own servers all along
JumpCrisscross · 1h ago
Wasn’t that after they’d been caught?
jeffbee · 2h ago
Obviously no. The behavior of Tesla in discovery of this case is ridiculous. But treating this technical detail as an element of conspiracy is also ridiculous.
actionfromafar · 2h ago
If that was the only thing going wrong, yes. But when you have a pattern of conspiracy, deleting immediately on the client instead of having a ring buffer which ages out the oldest event, may be a malicious choice.
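The ring-buffer alternative mentioned here bounds storage without deleting anything the moment an upload is acknowledged; Python's `collections.deque` gives the aging-out behavior for free. A minimal sketch (the `EventRing` name is illustrative):

```python
from collections import deque


class EventRing:
    """Bounded log storage: appending past capacity evicts only the
    oldest entry, so a recent crash snapshot stays on the car until much
    newer data crowds it out, instead of vanishing on upload."""

    def __init__(self, capacity: int):
        self.buf = deque(maxlen=capacity)

    def log(self, event) -> None:
        self.buf.append(event)  # silently drops the oldest entry if full

    def contents(self) -> list:
        return list(self.buf)
```

Same bounded-storage guarantee as delete-on-upload, but the local copy survives long enough to matter after a crash.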
jeffbee · 2h ago
I haven't seen anything in the (characteristically terrible and vague) coverage of this case that suggests the Tesla deleted the EDR.
constantly · 2h ago
> I can't think of a scenario in which you would not unlink it.

Perhaps if there is some sort of crash.

artursapek · 2h ago
Exactly. That's the last data I would ever delete from the car, if I was trying to preserve valuable data.
alias_neo · 2h ago
All of their actions point at intentionally wanting that data to disappear; they even suggested turning the car on and updating it, which everyone who's ever tried to protect important information on a computer knows is the exact opposite of what you should do.

Any competent engineer who puts more than 3 seconds of thought into the design of that system would conclude that crash data is critical evidence and as many steps as possible should be taken to ensure it's retained with additional fail safes.

I refuse to believe Tesla's engineers aren't at least competent, so this must have been done intentionally.

jeffbee · 2h ago
What if you were the guy who got a ticket that just said "implement telemetry upload via HTTP"?

Which of these is evidence of a conspiracy:

  tar cf - /data | curl --data-binary @- $URL
  TMPFILE=$(mktemp) ; tar cf $TMPFILE /data ; curl --data-binary @$TMPFILE $URL ; rm $TMPFILE
alias_neo · 2h ago
That's reductive.

The requirements should have been clear that crash data isn't just "implement telemetry upload", a "collision snapshot" is quite clearly something that could be used as evidence in a potentially serious incident.

Unless your entire engineering process was geared towards collecting as much data that can help you, and as little data as can be used against you, you'd handle this like the crown jewels.

Also, to nit-pick, the article says the automated response "marked" the data for deletion, which means it's not automatically deleted as in your reductive example, which doesn't even verify the upload succeeded (at least `&&` the last rm).

ozim · 1h ago
Well if it would be EU for GDPR you can assume contract was terminated because of force majeure and you are not allowed to keep customer data past contract. /s
fny · 2h ago
You left out the worst part:

> someone at Tesla probably took “affirmative action to delete” the copy of the data on the company’s central database, too

raincole · 2h ago
The 'wow' part is that they deleted data from server. The part you quoted sounds like nothing unusual to me.
lexicality · 2h ago
You don't think it's unusual that the software is designed to delete crash data from the crashed car?
Thorrez · 2h ago
The question is whether this is code that's special for crashes, or code that runs the exact same way for all data uploads, regardless of whether there's a crash.

You're implying it's special for crashes, but we don't know that.

dylan604 · 2h ago
You have it backwards. The fact that after the special condition of a crash it still allows the data to be deleted is the issue. Sure, deleting normal data is fine, but that it clearly detected a crash and still did not mark the file as do-not-delete is mind-boggling. Everyone knows that data from a detected crash is very important. Not having code to ensure its retention is lazy at best and malevolent design at worst. Tesla and its leadership have not earned "at best" as our default assumption.
lexicality · 2h ago
The crash system uses this code, therefore they chose to do something that would delete the crash data after a crash.

Saying "hey, the upload_and_delete function is used in loads of places!" doesn't free you of the responsibility that you used that function in the crash handler.
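
A sketch of the distinction being argued here (all names hypothetical, nothing from Tesla's actual code): even if a generic upload-and-delete routine exists, the crash path could trivially opt out of the deletion:

```python
import os

def upload(path: str) -> bool:
    """Hypothetical uploader; True on confirmed receipt."""
    return True

def handle_artifact(path: str, is_crash_snapshot: bool) -> None:
    uploaded = upload(path)
    # Routine telemetry can be purged once archived, but a collision
    # snapshot is potential evidence: keep the local copy even after
    # a successful upload.
    if uploaded and not is_crash_snapshot:
        os.remove(path)
```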

Thorrez · 2h ago
Is this a crash handler, or is it their normal telemetry upload loop?
lexicality · 2h ago
Yes, it's a crash handler that uploads a blackbox "collision snapshot" of the entire car's state leading up to a crash. It's very well documented that Tesla does this, including in the article.
sim7c00 · 1h ago
If it's not special-cased for crashes, that's criminally bad design in a safety-critical system.

You know, if for instance you weld a gas pipeline and an X-ray machine reveals a crack in your work, you can go to jail... but if you treat car software as an app-store item, that's totally fine??

Stop defending ridiculously bad design and corporate practices.

foobarian · 2h ago
Think of it as the scripts that run on CI/CD actions running unit tests. If a unit test fails, the test artifacts are uploaded to an artifact repository, and then, get this - the test runner instance is destroyed! But we don't think of that as unusual or nefarious.
smallpipe · 2h ago
No one dies when your unit test fails. Different stakes, different practices, what are all the Tesla apologists smoking here?
Ambroisie · 2h ago
I don't think you can equate CI/CD unit tests and killing humans with 2 tons of metal.
foobarian · 1h ago
And yet, that's what you get when your software org comes from that kind of devops culture. And here we are
lexicality · 2h ago
That's because typically the test runner hasn't just crashed into another test runner at full highway speed
phkahler · 2h ago
>> You don't think it's unusual that the software is designed to delete crash data from the crashed car?

After it confirmed upload to the server? What if it was a minor collision? The car may be back on the road the same day, or get repaired and on the road next week. How long should it retain data (that is not legally required to be logged) that has already been archived, and how big does the buffer need to be?

lexicality · 2h ago
A very simple answer is "until the next time the car crashes", you just replace the previous crash data with the new data.

If the car requires that a certain amount of storage is always available to write crash data to, then it doesn't matter what's in that particular area of storage. That reserved storage is always going to be unavailable for other general use.
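
That "keep it until the next crash" policy needs only a single reserved slot. A minimal sketch (hypothetical paths; a real vehicle would use a dedicated partition or preallocated region, not a regular file):

```python
import os

def record_crash(snapshot: bytes, slot: str) -> None:
    # Write to a temp file, then atomically replace the previous
    # snapshot. Storage stays bounded because only the latest crash
    # is kept, yet nothing is deleted until it is overwritten.
    tmp = slot + ".tmp"
    with open(tmp, "wb") as f:
        f.write(snapshot)
    os.replace(tmp, slot)
```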

kergonath · 1h ago
> What if it was a minor collision?

Then, I don’t know… Check if it was the case? Seriously, it’s unbelievable. It’s a company with a protocol to delete possibly incriminating evidence in a situation where it can be responsible for multiple deaths.

1vuio0pswjnm7 · 2h ago
The top HN comment on the front page story about this crash on HN several weeks ago claimed the damages award was too high

Maybe this thread will be different

throw7 · 2h ago
After reading the article, I am never buying a Tesla.

Props to greenthehacker. May you sip Starbucks venti-size hot chocolates for many years to come.

lexicality · 1h ago
Were you considering buying one before today? I'm curious as to what's different about this autopilot death compared to all the other autopilot deaths that have happened previously. Personally for me it was when the guy in Florida got decapitated when his car drove under a semitruck that made me never want to get in one again.
throw7 · 1h ago
I wasn't opposed to buying a Tesla. In my situation, I don't have the ability to charge EVs conveniently, so I'm not in the market, so to speak.

Plus, I'm not interested at this time in the "autopilot" "AI" stuff; I believe drivers should be responsible all the time, until such time that full legal liability is put on the manufacturer.

Don't get me wrong... I would love to call my car to come pick me up at the airport!

terminalshort · 1h ago
I'm curious as to what's different about any of the autopilot deaths and the 40,000 non-autopilot car wreck deaths that happen every year in the US other than the fact that one is considered news and the other isn't. I'm also curious as to how this would ever affect anyone's decision to buy a Tesla given that use of autopilot / FSD is entirely optional.
sneak · 1h ago
Autopilot is opt-in. You can drive it like any other car and never use autopilot.
lexicality · 1h ago
This is very true, but if you had to choose between two microwaves, one of which had a button that occasionally killed people and one which did not, which would you choose? Personally I would feel better buying a microwave that doesn't have the option to decapitate me, even if I would never press it.
whimsicalism · 1h ago
all cars have a button that occasionally kills people, it’s called the accelerator pedal
lexicality · 1h ago
I think you know that's a false equivalence, both because every control in a car has the possibility of killing you and also because every car has an accelerator pedal and I'm talking about an extra button.
terminalshort · 1h ago
Well then to go back to your microwave analogy, it's really more like choosing between a microwave with 9 buttons that can occasionally kill you or one with 10 buttons that could occasionally kill you, and that sounds about the same to me.
SirHackalot · 1h ago
So, Musk summoning the Luftwaffe like that didn’t dissuade you from buying one?
pu_pe · 3h ago
Volkswagen was caught cheating on its emission data and the CEO got fired, then prosecuted. Why shouldn't that be the case here?
rsynnott · 2h ago
The really weird thing about the diesel emissions scandal was that someone actually got in trouble for it. It is _rare_ for companies to be punished, particularly criminally, for that sort of thing.
bobmcnamara · 2h ago
Usually they'd get a DPA
csours · 1h ago
Firing the CEO is nominally up to the board of directors.

In Tesla's case, the board knows that the valuation of the company is wildly irrational, and they feel that the valuation is tied to the CEO.

JumpCrisscross · 2h ago
You’d need a coalition of Democratic attorneys general to bring a case in the mould of Big Tobacco.
MangoToupe · 2h ago
We'd need a third party if you actually wanted to fight American corporations. Unless you intended "small d" democratic.
dagmx · 2h ago
Good news, the CEO of this American corporation is making a third party… (the monkey paw curls)
MangoToupe · 28m ago
A third party... that sounds exactly like the other two.

Where is the anti-capitalism party? The anti-war party? The anti-corruption party? Aren't political parties supposed to represent DIFFERENT interests? Instead we're forced to choose between a party that hates immigrants and a party that hates immigrants slightly more.

And sure, you can criticize Republicans, but they actually invested in Intel. Wrong company, but a step in the right direction.

conradev · 1h ago

  Tesla recanted its employee’s testimony “after discovering evidence inconsistent with his stated recollection of events,” it said.
That’s a fancy way to say that he lied
3eb7988a1663 · 6m ago
That is my big question in this. What happens to the specific employees who provably lied? That sounds like a big no-no, but I wonder in our twisted system if they get some kind of protection as acting on behalf of the company.
gmd63 · 2h ago
None of this should be surprising to anyone who has given an ounce of effort to examining Elon's character.

Lies about capabilities, timelines, even things as frivolous as being rank one in a video game. He bought Twitter to scale his deception.

05 · 2h ago
Don't worry, once Tesla figures out secure boot nobody will be able to call their bluff, and they'll be free to 'lose' crash data with the same impunity with which the police lose their bodycam footage.
ubunthree · 1h ago
This should be modded up higher. Exactly. The only way hackers found this is because Tesla wasn't using secure boot or encrypted images. Every embedded developer knows about MCUboot. Except managers don't want the overhead, because it is complicated. Once embedded devs get the OK, all embedded firmware will basically be like a Signal chat with only the manufacturer having the keys. Heck, even PSA-compliant hardware MUST be resistant to multi-bit glitch attacks. Bye bye, hackers.
h1fra · 2h ago
The video is staggering: going super fast before an intersection, with no visibility, a blinking signal, and a clear stop sign in sight. I hope FSD has gotten better.
declan_roberts · 43m ago
It's not FSD, it's Tesla's cruise control.

My minivan would happily do the same thing (but without the telemetry).

whimsicalism · 2h ago
not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD, although both are definitely problematic
phkahler · 2h ago
>> not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD

And the distinction is what?

I'm not serious of course. There are huge swaths of the public whose eyes would glaze over if you tried to explain it, and that's my point.

whimsicalism · 1h ago
i think the public can generally grasp the difference between lane assist and a waymo/AV but the naming is bad agreed
AlotOfReading · 1h ago
Tesla's official, wildly misleading position is that FSD is a driver-assist system that should be treated no differently than Autopilot, not an autonomous system like Waymo. They've stated it in court, in regulatory filings, and if you open the owner's manual you'll find a bolded statement that FSD doesn't make the vehicle autonomous.

Everything else that you might be reasonably misled by? Puffery and the official position is that you really should have known better.

Sophira · 42m ago
I've seen videos of people literally using the Tesla mobile app to 'call' their FSD-enabled car to them. Given that they coded this functionality and expose it in their app, I really don't see how Tesla can be let off by officially stating that you must be behind the wheel at all times.
estearum · 3h ago
Surely this is the behavior of a company that's confident in the safety of its products!
rcpt · 28m ago
Cruise was shut down for less than this. TSLA won't even have a down day.

Corruption pays

childintime · 1h ago
I'd like to hear the law say that self-driving cars should collect data (video, sensor inputs, actuator outputs), and that it is the property of the law when an accident happens. No exceptions. The real question is how the law is written, for it should leave no doubt about what Tesla, or any other, is required to do.

Probably all cars should have a black box, as both modern electronics and humans can do weird stuff.

whimsicalism · 1h ago
good luck passing such a law in the US
firesteelrain · 3h ago
If Tesla can’t ensure safeguarding of this information, it’s a feature that will get them in big trouble.
voidUpdate · 3h ago
I'm still convinced that it being called "full self driving" is misleading marketing and really needs to stop, since it isn't according to Tesla
orlp · 3h ago
The marketing doesn't even matter. It either needs to be full self driving, or nothing at all. "Semi self-driving, but you're still responsible when shit hits the fan" just doesn't work.

Humans are simply incapable of paying attention to a task for long periods if it doesn't involve some kind of interactive feedback. You can't ask someone to watch paint dry while simultaneously expecting them to have a < 0.5 sec reaction time to a sudden impulse three hours into the drying process.

AuthorizedCust · 3h ago
I have a SAE level 2 car. Those features DO help!
tialaramex · 2h ago
Framing is crucial. For example, why was the Autonomous Emergency Braking configured to brake violently to a full stop? Let's consider two scenarios. In both cases we're not paying enough attention to the outside world and are about to strike a child on a bicycle, but the AEB policy varies.

1. AEB brakes violently to a full stop. We experience shock and dismay. What happened? Oh, a kid on a bike I didn't see. I nearly fucked up bad, good job AEB

2. AEB smoothly slows the vehicle to prevent striking the bicycle, we gradually become aware of the bike and believe we had always known it was there and our decision eliminated risk, why even bother with stupid computer systems?

Humans are really bad at accepting that they fucked up, if you give them an opportunity to re-frame their experience as "I'm great, nothing could have gone wrong" that's what they prefer, so, to deliver the effective safety improvements you need to be firm about what happened and why it worked out OK.

Jeremy1026 · 2h ago
Same. Not having to worry about keeping the car between the lines allows me to keep my focus on the other cars around me more. Offloading the cognitive load of fine tuning allows more dedication to the bigger picture.
AlexandrB · 2h ago
This makes no sense to me. Driving involves all senses, not just vision - if you're not feeling what the car is doing because you're not engaged with the steering wheel what good is it to see what's around you? I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.

Oh! And also, moving within the lane is sometimes important for getting a better look at what's up ahead or behind you or expressing car "body language" that allows others to know you're probably going to change lanes soon.

jamincan · 1h ago
I drive a VW with lane-keep assist, adaptive cruise control, and automatic emergency braking. It won't change lanes for me, but aside from the requirement that I have my hands on the wheel, it could otherwise drive itself on the highway.

I commute mainly on the highway, about 45 min to 1 hr each way every day, and it makes a big difference for driver fatigue. I was honestly a bit surprised. Even though I'm steering, it requires less effort. I don't have my foot on the gas and I'm not having to adjust my speed constantly.

Critically, though, I do have to pay attention to my surroundings. It's not taking so much out of my driving that I can't stay engaged to what's happening around me.

ghaff · 2h ago
I don't have personal experience but friends with personal experience have sort of shifted my thinking on the topic. They'll note they do need to stay engaged but that it is genuinely useful on long drives in particular. The control handover is definitely an issue but so is manual driving in general. Their consensus is that the current state of the art is by no means perfect but it is improved and it's not like there aren't problems with existing manual driving even with some assistive systems.
Jeremy1026 · 1h ago
My car requires hands on the wheel to continue to operate. So I do feel it moving.

> I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.

Once you have something assist you with that, you'll notice how much "effort" you are actually putting towards it.

sneak · 1h ago
I used to think this, but then I got a Model 3. I believe that FSD is presently better than most humans driving today even when they are theoretically “fully engaged in manual driving”.

FSD doesn’t lull humans into a false sense of security, humans do. FSD doesn’t let you use your phone while it’s on. This alone is an upgrade over most human beings, who think occasional quick phone usage while driving is fine (at least for themselves).

I believe that if you replaced all human drivers in the US with FSD as it exists today, fatalities would go down immediately.

Humans are not a gold standard, and the current median human driver is easy to outperform on safety.

JumpCrisscross · 2h ago
If you live in a city, please send this article to your municipal and state electeds. Tesla is lobbying for the right to train and activate its Level 4 product, marketed as Level 5, in cities where Musk is deeply unpopular. There is massive political capital to be had in banning Tesla’s self-driving features on even the flimsiest grounds.
terminalshort · 2h ago
I would rather take a bullet than be a luddite who gets in the way of technological advancement on "the flimsiest of grounds."
JumpCrisscross · 2h ago
> be a luddite who gets in the way of technological advancement on "the flimsiest of grounds”

Blocking a technology is Luddism. Blocking a company is politics.

moduspol · 2h ago
I’m less convinced we need to keep bringing this up in every single thread involving Tesla.
acdha · 3h ago
Why do you think Musk put so much money into helping Trump win? Tesla was under multiple investigations for safety and unkept promises, and he knew that he would not have leverage to halt those under a Harris administration.
thowaway52729 · 3h ago
If that was his goal he would have minded his own business after the election, instead of spouting invective posts against Trump on X.
Maken · 3h ago
That was after Musk realized he had alienated his entire consumer base.
antonvs · 2h ago
That happened much earlier. The split with Trump happened after it finally sunk in that Republicans weren't actually interested in smaller government or cost savings, and that that was just a rhetorical weapon they deploy selectively to get elected.
thowaway52729 · 3h ago
And he wants to bring them back by alienating Trump while doubling down on his rhetoric?
delfinom · 2h ago
He has an ego and narcissism, but he isn't dumb. He sees the problems but also can't admit he's wrong or anything.
sjsdaiuasgdia · 1h ago
> but he isn't dumb.

    Musk’s assistant peeked back in and muttered that he had another meeting. “Do you have any final thoughts?” she asked.

    “Yes, I want to say one thing,” the data scientist said. He took a deep breath and turned to Musk.

    “I’m resigning today. I was feeling excited about the takeover, but I was really disappointed by your Paul Pelosi tweet. It’s really such obvious partisan misinformation and it makes me worry about you and what kind of friends you’re getting information from. It’s only really like the tenth percentile of the adult population who’d be gullible enough to fall for this.”

    The color drained from Musk’s already pale face. He leaned forward in his chair. No one spoke to him like this. And no one, least of all someone who worked for him, would dare to question his intellect or his tweets. His darting eyes focused for a second directly on the data scientist.

    “Fuck you!” Musk growled.

https://www.techdirt.com/2024/10/25/lies-damned-lies-and-elo...
efficax · 1h ago
he's not a very smart man
rsynnott · 2h ago
I mean, if he was rational, sure, that's probably what he should have done. But, y'know, he clearly _isn't_.
lenkite · 2h ago
He was under some imaginary assumption that Trump cared about the national deficit because of his campaign speeches. Once he realized that Trump really didn't care two hoots about it and only planned to increase it even more he had a late buyer's realization.
baggachipz · 39m ago
If he thought Trump would actually adhere to anything he said... or, for that matter, would be the least bit consistent in what he did on a day-to-day basis, then Elon is not fit to pull his own pants up in the morning.
myrmidon · 3h ago
I'm absolutely not a fan of Trump, but this is a highly questionable assumption.

The much more likely hypothesis in my view is that he was helping Trump because of personal conviction (only in small parts motivated by naked self-interest).

You should expect rational billionaires to tend politically right out of pure self-interest and distorted perspective alone; because the universal thing that such parties reliably do when in power is cutting tax burden on the top end.

blizdiddy · 3h ago
That’s insane. Do you remember DOGE or Elon taking his cronies into the same departments investigating him? Do you even remember?
myrmidon · 2h ago
What would Elon even be in court for? Being a politically incorrect dumbass on ex-twitter is not punishable by law.

Sending a bunch of scriptkiddies around and having them cut government funding and gut agencies is not really how you make evidence "vanish", how would that even work?

And, lastly, jumping in front of an audience at every opportunity and running your mouth is the absolute last thing anyone would do if the goal was to avoid prosecution. But it is perfectly in line with a person who has a very big ego and wants to achieve political goals.

blizdiddy · 2h ago
Labor violations, taxes, the National Highway Traffic Safety Administration investigating Tesla... are you willfully ignorant or a troll?
myrmidon · 1h ago
I'm not a troll.

I scrutinise beliefs and assumptions even if they are convenient, and you should, too.

I don't believe that Musk's main motivation for participating in the 2024 election was to avoid prosecution, because his actions are not really compatible with that. There is a much more plausible alternative hypothesis: that he preferred the Republican platform (possibly no longer) out of personal conviction and for reasons unrelated to prosecution, which his actions are very compatible with.

> Labor violations, taxes, National Highway traffic safety administration investigation Tesla

Let me say it like this: billionaires generally don't have to care about minor infractions like these at all. The whole system is set up to shield them from liability, and wealth is an excellent buffer against effective prosecution regardless of who is president. There have been a plethora of infinitely more serious infractions with zero real consequences for the CEOs involved, and not because they participated in past presidential election campaigns. See: the VW diesel emissions fraud, or much worse, leaded gas in the last century (and what the associated industry did to keep that going).

DonHopkins · 37m ago
Oh, so you're willfully ignorant.
antonvs · 3h ago
Musk is on record saying to Tucker Carlson that “If [Trump] loses, I’m fucked.”

So this isn't so much of an assumption, as taking him at his word.

myrmidon · 2h ago
All the context I have for this is that he was grandstanding in front of a rightwing audience (after Trump was shot at, notably) and playing the "surely I would get unjustly prosecuted for my political incorrectness under the democrats".

What is your actual point? What would he stand in front of a judge for, right now, if Harris had won?

antonvs · 1h ago
My actual point is that when someone tells you who they are, you should consider believing them.

You'd have to ask Musk what he feels so guilty about that he had to buy an election.

unnamed76ri · 2h ago
The Left was coming after Musk pretty hard before the election. I don’t know the context of the quote you pulled but it’s not hard to see how if Trump lost, there was going to be consequences for Musk.
Swenrekcah · 2h ago
He has committed a lot of fraud and was facing consequences for that. That has nothing to do with left or right.
unnamed76ri · 2h ago
Fraud has nothing to do with vandalizing Tesla dealerships last I checked.
Swenrekcah · 2h ago
You are right it doesn’t. That is (wrongly) done by people who are (rightly) mad at him for making american life harder and global life more dangerous, in a self serving attempt to evade the justice system.
throwway120385 · 1h ago
We were talking about Tesla's fraud cases, not some vandalism cases last time I checked.
unnamed76ri · 1h ago
Actually we were talking about personal consequences to Musk.
terminalshort · 2h ago
Can you give an example of these many instances of fraud?
Swenrekcah · 2h ago
For instance he has made fraudulent statements regarding the current and near future capabilities of Tesla in an effort to inflate stock prices numerous times. He was in fact ordered by a judge to stop making such statements but he didn’t obey that.

He used to be quite charismatic, I believed him up until about 2017 or so. Then I figured he was just a bit greedy and maybe money got to his head but still a respectable innovator. However during 2020 or 2021 (I don’t exactly remember) he started to get quite unpleasant and making obviously short-term decisions, such as relying only on cameras for self driving because of chip shortages but dressing it up as an engineering decision.

terminalshort · 45m ago
I just can't take the accusation of lying to increase stock price seriously because Elon has on occasion come right out and said the stock is overvalued https://x.com/elonmusk/status/1256239815256797184

You will basically never hear another CEO of another publicly traded company say this. I just don't believe that the same person who cares so little about his stock price that he sends a tweet like that (and the stock dropped 10% on it) also is making fraudulent statements to inflate the price. A better explanation is that he just says what he thinks without regard for the stock price, which is also something you won't see any other CEO of a publicly traded company do.

scott_w · 2h ago
We can start from the linked article?
terminalshort · 2h ago
Yeah, that's where I started, and I would recommend you do the same:

> U.S. District Judge Beth Bloom, who presided over the case, said in an order that she did not find “sufficient evidence” that Tesla’s failure to initially produce the data was intentional.

immibis · 1h ago
Why do you believe it has nothing to do with left or right?

(Democrats aren't left btw)

walls · 2h ago
It does actually, because only one side is interested in finding or fighting fraud.
Swenrekcah · 2h ago
Currently yes, but it is not inherently so. The problem with the US regime is that it is compromised, corrupt and heading towards fascism.

The problem is not that the republican party used to be a conservative right party.

What I’m saying is this is not a sports competition where Musk is automatically an opponent of the Democratic party because he supported Trump. He supported Trump in order to improve his chances with the legal system because he knew Trump would be willing to be so corrupt.

Another world might be imagined in which the Democratic party was taken over in 2016 but that is not the world we live in.

terminalshort · 2h ago
Both are interested in finding and fighting fraud, but only from the other side. Leticia James charged Trump with a rack of felonies for putting false info on a loan application. The Trump DOJ charged Leticia James for doing exactly the same. Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.
walls · 2h ago
> Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.

This is how conservatives keep people going 'both sides!' even though they manufacture whatever is required to be that way.

terminalshort · 1h ago
Please explain how one person lying on a loan application is manufactured and another person lying on a loan application is a serious felony.
razemio · 2h ago
Every time this comes up, I am on the opposite side of this. It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, brake, and navigate on its own. There are various videos online where FSD managed to drive a route start to finish without a single human override. That's full self driving. It can also crash like humans "can", and that's why it needs supervision. In this sense, we as humans are also "full self driving", with a much (?) lower risk of crashing.

Like every time, let the downvotes rain. If you downvote, it would be nice if you could tell me where I am wrong. It might change my view on things.

JumpCrisscross · 2h ago
> It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, break and navigate on its own. That's full self driving

All this demonstrates is the term “full self driving” is meaningless.

Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.

If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for that liability.

[1] https://www.sae.org/blog/sae-j3016-update

terminalshort · 2h ago
If it's a meaningless term then it can't be misrepresenting to use it.
JumpCrisscross · 2h ago
> If it's a meaningless term then it can't be misrepresenting to use it

It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.

claw-el · 2h ago
The other source of confusion with "self driving" for me is: is the "self" the human or the car?

Self driving could totally mean the human's own self doing the driving.

Having SAE level is clearer.

sjsdaiuasgdia · 55m ago
Do you think anyone makes the same error when they see a "self cleaning" oven?

There's plenty wrong about the FSD terminology and SAE levels would absolutely be clearer, but I doubt more than a tiny fraction of people are confused as to the target of 'self' in the phrase 'full self driving'.

razemio · 2h ago
That's something different. The problem with the levels is that they focus only on the attention the human driver needs to give to the automation. In this sense my Kia EV6 is also Level 2/3, same as FSD. However, FSD can do so much more than my Kia EV6. That's a fact. Still the same level. Where did Tesla say FSD is SAE Level 5 approved? They would be responsible every time FSD is active during a crash. Tesla is full self driving with Level 2/3 supervision, and in my opinion this is not misleading.

Also, "All this demonstrates is the term 'full self driving' is meaningless." proves my point that it is not misleading.

JumpCrisscross · 2h ago
> FSD can do so much more than my Kia EV6. That's a fact. Still the same level

The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.

> Where did Tesla say FSD is SAE Level 5 approved?

They didn’t say that. They said it could do what a Level 5 self-driving car can do.

“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.

‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”

> Tesla is full self driving with Level 2/3 supervision and in my opinion this is not misleading

This is tautology. You’re defining FSD to mean whatever Tesla FSD can do.

[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...

razemio · 2h ago
How would you name a system that can do everything a Level 5 system can, but with Level 2/3 supervision? A name which a PR team would choose, without the misleading stuff, as you are saying.
JumpCrisscross · 2h ago
> How would you name a system which can do everything a Level 5 system can, but with Level 2/3 supervision?

FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.

But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)

terminalshort · 55m ago
But this speaks to the fundamental point the other commenter is making. A Waymo requires human intervention periodically too. It's just less than a Tesla with FSD, which is in turn less than a Tesla with Autopilot, which is dramatically less than my 20 year old truck. It's just that at some point we assume the probability of a crash is low enough that the human driver can zone out and hope for the best and nobody has the balls to come out and actually define an acceptable probability of serious injury or death to set an actually useful performance standard based on this.
Workaccount2 · 1h ago
>if you could tell me where I am wrong

It needs to have a crash rate equal to or ideally lower than a human driver.

Tesla does not release crash data (wonder why...), has a safety driver with a finger on the kill switch, and only lets select people take rides. Of course according to Elon always-honest-about-timelines Musk, this will all go away Soon(TM) and we will have 1M Robotaxis on the road by December 31st.

Completing a route without intervention doesn't mean much. It needs to complete thousands of routes without intervention.

Keep in mind that Waymos have selective intervention for when they get stuck. Teslas have active intervention to prevent them from mowing down pedestrians.

terminalshort · 3h ago
I see this brought up a lot, but I don't think it's really an issue. It's misleading in a very technical sense, but it's so obviously hyperbolic that nobody is actually misled. Just like nobody thinks the "Magic Eraser" is actually magic. I fundamentally just don't think anybody is out there actually believing this thing is L5 full self driving, especially after all the warnings it shows you and the disclaimers when you buy it.

The problem here isn't that people think they don't need to pay attention because their car can drive itself and then crash. The problem is that people who know full well that they need to focus on driving just don't because fundamentally the human brain isn't any good at paying attention 100% of the time in a situation where you can get away with not paying attention 99.9% of the time, and naming just can't solve this.

de6u99er · 1h ago
So first the data wasn't there, and suddenly it is. I think the only way to prevent this in the future is to litigate against the individuals who knowingly lie for a company.
alphawhisky · 1h ago
Litigate the company, not the individual. The hiding of the data was almost certainly a result of company ethos and most likely involved multiple levels of people. The maintenance tech was probably the lowest paid of everyone involved.
Noumenon72 · 2h ago
They should be saving every crash as a unit test to ensure it never happens again.
inetknght · 19m ago
More than a unit test -- a whole system test. But, as a software engineer with experience in robotics and drones with a focus on software safety, yes I 100% agree.

The unfortunate thing is that the state of the industry (or, my experience in it) currently is not set up to be able to do that cheaply nor at scale. Imagine you have tens of thousands of various unique problem scenarios to run through, and some might take several minutes of simulation to run the test. Even if your release cadence is slow, but especially if you have continuous deployment with dozens of micro-releases every day: how exactly do you cheaply scale such that simulation testing doesn't become a massive bottleneck?
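One partial answer to that bottleneck is embarrassingly parallel fan-out: each recorded scenario is an independent replay, so a suite can be distributed across cores or machines. Here's a minimal sketch in that spirit; all names are hypothetical, the "simulator" is a stand-in, and a real CPU-bound physics replay would use worker processes or a cluster rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical scenario runner. In a real pipeline this would replay
# logged sensor data from one crash/near-miss against the candidate
# software build and check safety invariants.
def run_scenario(scenario_id: int) -> tuple[int, bool]:
    passed = scenario_id % 2 == 0  # stand-in result for illustration
    return scenario_id, passed

def run_regression_suite(scenario_ids) -> list[int]:
    # Fan scenarios out across workers so wall-clock time scales with
    # worker count rather than suite size. A CPU-bound simulator would
    # want ProcessPoolExecutor or a distributed job queue instead.
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = dict(pool.map(run_scenario, scenario_ids))
    return [sid for sid, ok in results.items() if not ok]

if __name__ == "__main__":
    print(run_regression_suite(range(6)))  # stand-in fails odd IDs: [1, 3, 5]
```

The hard part isn't the fan-out itself but the cost per replay; that's why teams triage which scenarios gate every micro-release versus which run nightly.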

chiffre01 · 3h ago
The description of the guy finding the data while at a Starbucks doesn't do justice to his setup shown in the photo. My dude has a seriously chaotic and awesome setup there.
jalk · 3h ago
I imagine he dumped the car data onto his laptop so he could work on the problem somewhere cozier than his messy bitcave
outside1234 · 1h ago
This will continue until people go to prison
aurizon · 2h ago
Can you imagine aircraft makers getting away with this sort of black-box autodelete? A red-handed catch!
LightBug1 · 1h ago
Huge props to the hacker (@greentheonly) ... considering the cutbacks in journalism, perhaps we're entering a world where some of the most important investigative journalism will be done by hackers.

Unpaid, unrewarded excellence.

skeezyboy · 1h ago
the dirtiest of doggery
fblp · 1h ago
No paywall link https://archive.is/s1psp
IlikeKitties · 2h ago
So, will Tesla get nuked from orbit for what is obviously a serious, intentional and systemic discovery violation, or is this just OK because it's a big corp?
OutOfHere · 2h ago
With everything that is wrong with Tesla, I'll be the first to say that all Tesla cars need to be taken off the roads, at least until all of their auto-driving features have been fully removed.
ath3nd · 3h ago
Of course they will say they don't have the key data.

Do we expect them to admit they were outright lying and wrong considering their leader is a pill popping Nazi salute making workaholic known to abuse his workers?

rwmj · 3h ago
Lying to a court is usually pretty serious. Any sensible legal department will tell you never to do that, whatever your CEO says.
emsign · 3h ago
Unless you've paid off the president, they assume.
gruez · 2h ago
Did they have a falling out a few months ago?
psychoslave · 3h ago
You mean the other allegations against this same person wouldn't be judged as something serious, and could even be recommended?
lawn · 3h ago
Usually.

But today you just have a private dinner with the president and he'll wave it away.

declan_roberts · 1h ago
Everyone is mad at Tesla but they're literally the only company collecting this kind of crash metadata.

Other car manufacturers would never get in trouble for this because it's not even possible for them to do it in the first place!

jdiff · 1h ago
People aren't mad that they collect the data, everyone does that, but that they immediately deleted it, then lied about it ever happening, in a matter of life and death.

I would deeply encourage you to re-assess whatever led you to make this comment, because you have fallen wildly off the mark here. Corporations are not your friend.

jeremyjh · 1h ago
Everyone is mad because they killed people and lied about it.
fred_is_fred · 1h ago
Wrong. Almost all modern cars track location and tons of other data. Ford even has a screen that pops up saying, basically, "hey, you're opting into this, FYI".
declan_roberts · 36m ago
You think other cars are recording whether they detected a person, and that person's approximate location relative to the car?
inetknght · 15m ago
I certainly think that. Because, as a software engineer in robotics and drones, that's exactly what I would do. Using logs to recreate the scenario, especially for regression testing, is standard process for competent software engineers.