Tesla said it didn't have key data in a fatal crash, then a hacker found it
449 points by clcaev 8/29/2025, 11:15:39 AM | 246 comments | washingtonpost.com ↗
If that is actually designed like this, the only reason I could see for it would be so that Tesla has sole access to the data and can decide whether to use it or not. Which really should not work in court, but it seems it has so far.
And of course I'd expect an audit trail for the deletion of crash data on Tesla servers. But who knows whether there actually isn't one, or nobody looked into it at all.
All vehicle manufacturers have sole access to data. There isn't a standard for logging data, nor a standard for retrieving it. Some components log data, and only the supplier has the means to read and interpret it.
If your car has an EDR, what data it collects is legislated. There is not a standard interface for retrieving it, but the manufacturer is required to ensure that there is a commercially available tool for data retrieval that any third party can use.
https://www.ecfr.gov/current/title-49/subtitle-B/chapter-V/p...
Is this one of those "that's why big cars are cheaper to make" situations?
For the first kind of data, deleting it from the car the moment there's confirmation that it is now stored at Tesla can make perfect sense as a mechanism to prevent the car from running out of storage space.
Of course, if the car crashed, deleting the data isn't optimal, but the fact that it gets deleted may not be malice.
Anytime data is recorded, legal immediately asks about retention so they don't end up empty-handed in front of a judge.
Every byte that car records and how it is managed will be documented in excruciating detail by legal.
As is deleting data. Also, for, say, training data for Tesla’s software, I don’t see legal requirements for keeping it around,
> There's no chance such a decision is accidently made by reusing code.
At Tesla? I know next to nothing about their software development practices, but from them, it wouldn't surprise me at all if this were accidental.
Edit: one scenario to easily introduce this bug is if the “delete data after upload” feature were added after the “on a crash, upload all data you have, in case the car burns down” feature.
If you selectively delete data, courts can assume that data is the worst possible thing for a court case against you.
Like many things, the retention policy was actually a destruction policy
In my experience, they are setting automated 90-day deletion policies on email so they don't end up with surprises in discovery.
But in the end, we wouldn't be discussing this at all if Tesla had simply handed over the data from their servers. Whether they can't find it, it isn't actually there, or they deliberately removed it affects how I view this process.
Two copies are better than one. If you immediately erase the data, you better be sure the transmitted data is safe and secure. And obviously it wasn't.
Tesla's fairly notorious for casual treatment of customer car data (which they have a lot of). There was an article, recently, about how in-car video recordings were being passed around the office.
I know that at least one porn actress recorded a scene in a self-driving Tesla. I'll bet that recording made the rounds "for quality purposes."
It's a disclaimer, but it also grants permission for you to record.
I knew a guy who used to record all his calls with companies, and would let them know they were being recorded, if they didn't have that disclaimer.
He would say "This call is being recorded." He told me that most of the companies hung up immediately, when he said that.
I never heard him say that his recording ever did him any good, though.
This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.
But the data was mostly unprotected on the devices, or it couldn't have been restored. And Tesla isn't exactly known for respecting the privacy of their customers, they have announced details about accidents publicly before.
And there is the potential conflict of interest, Tesla does have strong incentives to "lose" data that implicates Autopilot or FSD.
I don't know how accurate it is right now, but previously, people have had to sue Tesla to get telemetry data from their own vehicle, not to use against Tesla, but to use in accident lawsuits against other parties.
Meanwhile, without your consent, Tesla will hold press conferences using your telemetry data to throw you under the bus (even deceptively) to defend themselves. "The vehicle had told the driver to pay attention!" NHTSA, four months later: "The vehicle had issued one inattention alert, eighteen minutes prior to the collision." (emphasis mine)
They tried to recruit me for the UI. If I lived closer, I would have jumped on it. Not only was I bit of a Tesla fanboy at the time, I used to work across the street from their office and really liked that area. (Deer Creek Road in Palo Alto.)
...and potentially death?
HN submission: https://news.ycombinator.com/item?id=41012443
So the Tesla detected the vehicle and the pedestrian, and then planned a path through them? Wow! How bad is this software?
You can take a Waymo any time of day in SF, and they provide thousands of successful rides daily.
> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.
Wow...just wow.
Even if Tesla hadn't squandered its EV lead and was instead positioned to be a robotics and AI superpower, is this really the corporate behavior you would want? This is some fucking Aperture Science level corporate malfeasance.
Miles and miles different - they were not completely untouchable the way Tesla and similar hot companies are.
In my mind it's like suddenly declaring that blue cars are illegal, and they made a color-shifting car that is blue except when the authorities are looking at it.
It is wrong in the sense that it is normalization of deviance, however. We live in a society, and if we don't like a law or regulation, the correct response is to get it legally changed, not to ignore it and cheat.
I see no course correction from Tesla. Just continuing and utter tripe from its CEO, team, and Musk-d-riders.
This is an on-going issue for them and, at this point, with no further change? I hope it drives them into the ground (Autopilot, natch).
edit: My point is that it was not one lone actor, who would have made that change.
It's very easy to imagine a response to this being (beyond "don't log so much") an audit layer to start automatically removing redundant data.
The externalities of the company are such that people want to ascribe malice, but this is a very routine kind of thing.
The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000× more data than we need. And that's not even an exaggeration.
I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.
This is like saying "maybe nobody has recently looked at the ad-selection mechanism at Google." That's just not plausible.
It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
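For what it's worth, "unlink" is just the POSIX name for removing a file's directory entry, and Python exposes the same call as os.unlink. A minimal illustration (a throwaway file standing in for an uploaded snapshot, nothing to do with Tesla's actual code):

```python
import os
import tempfile

# Create a throwaway file to stand in for an uploaded snapshot.
fd, path = tempfile.mkstemp()
os.write(fd, b"telemetry snapshot")
os.close(fd)

# "Unlinking" is just the POSIX term for removing the directory
# entry; once the last link and open handle are gone, so is the file.
os.unlink(path)

print(os.path.exists(path))  # the file is gone
```

Nothing sketchy about the call itself; the question is only what policy decides to invoke it.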
The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.
Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.
Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.
My money is on nobody built a tool to look up the data, so they have it, they just can't easily find it.
Also this is not like some process crash dump where the computer keeps running after one process crashed.
This would be like a plane's black box uploading its data to the manufacturer, then deleting itself after a plane crash.
Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.
That Tesla could call it normal to detect a collision and not lock all (or any) of the data is just insane.
This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.
This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.
Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.
The rogue engineer defense worked so well for VW and Dieselgate.
The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.
So either the problem is Tesla engineers are fucking stupid (doubtful) or this is a poor business/product design.
My money is on the latter.
So we just shrug because software boys gotta be software boys? That’s completely insane and a big reason why a lot of engineers roll their eyes about developers who want to be considered engineers.
Software engineers who work on projects that can kill people must act like the lives of other people depend on them doing their job seriously, because that is the case. Look at the aviation industry. Is it acceptable to have a bug in the avionics suite down planes at random and then delete the black boxes? It absolutely is not, and when anything like that happens shit gets serious (think 737 MAX).
The developers who designed the systems are responsible, and so are their managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.
Assuming it's not intentionally malicious, this is a really dumb bug that I could have also written. You zip up a bunch of data, and then you realize that if you don't delete things you've uploaded you will fill up all available storage, so what do you do? You auto-delete anything that successfully makes it to the back-end server, you mark the bug fixed, not realizing that you overlooked crash data as something you might want to keep.
I could 100% see this being what is happening.
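The hypothesized bug is easy to sketch. All names below are illustrative, not Tesla's actual code: a generic "delete after confirmed upload" rule that never special-cases crash snapshots.

```python
import os
import tempfile

def upload(path):
    """Stand-in for the real uploader; pretend it always succeeds."""
    return True

def sync_telemetry(paths):
    """Upload every pending file, deleting local copies on success.

    The hypothesized bug: nothing here exempts crash snapshots, so
    they vanish from the vehicle the moment the server confirms receipt.
    """
    for path in paths:
        if upload(path):
            os.unlink(path)  # frees storage -- but also erases crash data

# Demo: a routine log and a crash snapshot are treated identically.
tmpdir = tempfile.mkdtemp()
routine = os.path.join(tmpdir, "routine_log.bin")
crash = os.path.join(tmpdir, "collision_snapshot.bin")
for p in (routine, crash):
    with open(p, "wb") as f:
        f.write(b"data")

sync_telemetry([routine, crash])
remaining = os.listdir(tmpdir)
```

A retention-aware version would exempt anything matching a crash-snapshot pattern from local deletion, regardless of upload status.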
Perhaps if there is some sort of crash.
Any competent engineer who puts more than 3 seconds of thought into the design of that system would conclude that crash data is critical evidence and as many steps as possible should be taken to ensure it's retained with additional fail safes.
I refuse to believe Tesla's engineers aren't at least competent, so this must have been done intentionally.
Which of these is evidence of a conspiracy:
The requirements should have been clear that crash data isn't just "implement telemetry upload", a "collision snapshot" is quite clearly something that could be used as evidence in a potentially serious incident.
Unless your entire engineering process was geared towards collecting as much data that can help you, and as little data as can be used against you, you'd handle this like the crown jewels.
Also, to nit-pick, the article says the automated response "marked" the data for deletion, which means it is not automatically deleted as in your reductive example, which doesn't even verify the upload succeeded (at least put a && before the last rm).
> someone at Tesla probably took “affirmative action to delete” the copy of the data on the company’s central database, too
You're implying it's special for crashes, but we don't know that.
Saying "hey, the upload_and_delete function is used in loads of places!" doesn't free you of the responsibility that you used that function in the crash handler.
You know, if for instance you weld a gas pipeline and an X-ray machine reveals a crack in your work, you can go to jail... but if you treat car software as an app-store item, it's totally fine??
stop defending ridiculously bad design and corporate practices.
After it confirmed upload to the server? What if it was a minor collision? The car may be back on the road the same day, or get repaired and on the road next week. How long should it retain data (that is not legally required to be logged) that has already been archived, and how big does the buffer need to be?
If the car requires that a certain amount of storage is always available to write crash data to, then it doesn't matter what's in that particular area of storage. That reserved storage is always going to be unavailable for other general use.
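That reservation policy is trivial to express. A minimal sketch, with a hypothetical reserve size and function name, not any manufacturer's real code:

```python
import shutil

# Hypothetical headroom that must always stay free for crash snapshots.
CRASH_RESERVE_BYTES = 512 * 1024 * 1024

def can_store_general_data(mount_path, payload_size):
    """Admit general-purpose writes only if the crash reserve survives.

    Crash-data writes would bypass this check and use the reserve.
    """
    free = shutil.disk_usage(mount_path).free
    return free - payload_size >= CRASH_RESERVE_BYTES
```

With a check like this, general telemetry can be deleted lazily or aggressively without ever touching the space that crash data needs.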
Then, I don’t know… Check if it was the case? Seriously, it’s unbelievable. It’s a company with a protocol to delete possibly incriminating evidence in a situation where it can be responsible for multiple deaths.
Maybe this thread will be different
Props to greenthehacker. May you sip Starbucks venti-size hot chocolates for many years to come.
Plus, I'm not interested at this time in the "autopilot" "AI" stuff; I believe drivers should be responsible all the time, until such time that full legal liability is put on the manufacturer.
Don't get me wrong... I would love to call my car to come pick me up at the airport!
In Tesla's case, the board knows that the valuation of the company is wildly irrational, and they feel that the valuation is tied to the CEO.
Where is the anti-capitalism party? The anti-war party? The anti-corruption party? Aren't political parties supposed to represent DIFFERENT interests? Instead we're forced to choose between a party that hates immigrants and a party that hates immigrants slightly more.
And like you can criticize republicans, but they actually invested in intel. Wrong company, but a step in the right direction.
Lies about capabilities, timelines, even things as frivolous as being rank one in a video game. He bought Twitter to scale his deception.
My minivan would happily do the same thing (but without the telemetry).
And the distinction is what?
I'm not serious of course. There are huge swaths of the public whose eyes would glaze over if you tried to explain it, and that's my point.
Everything else that you might be reasonably misled by? Puffery and the official position is that you really should have known better.
Corruption pays
Probably all cars should have a black box, as both modern electronics and humans can do weird stuff.
Humans are simply incapable of paying attention to a task for long periods if it doesn't involve some kind of interactive feedback. You can't ask someone to watch paint dry while simultaneously expecting them to have a < 0.5 sec reaction time to a sudden impulse three hours into the drying process.
1. AEB brakes violently to a full stop. We experience shock and dismay. What happened? Oh, a kid on a bike I didn't see. I nearly fucked up bad, good job AEB
2. AEB smoothly slows the vehicle to prevent striking the bicycle, we gradually become aware of the bike and believe we had always known it was there and our decision eliminated risk, why even bother with stupid computer systems?
Humans are really bad at accepting that they fucked up, if you give them an opportunity to re-frame their experience as "I'm great, nothing could have gone wrong" that's what they prefer, so, to deliver the effective safety improvements you need to be firm about what happened and why it worked out OK.
Oh! And also, moving within the lane is sometimes important for getting a better look at what's up ahead or behind you or expressing car "body language" that allows others to know you're probably going to change lanes soon.
I commute mainly on the highway, about 45 minutes to an hour each way every day, and it makes a big difference for driver fatigue. I was honestly a bit surprised. Even though I'm steering, it requires less effort. I don't have my foot on the gas, and I'm not having to adjust my speed constantly.
Critically, though, I do have to pay attention to my surroundings. It's not taking so much out of my driving that I can't stay engaged to what's happening around me.
> I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.
Once you have something assist you with that, you'll notice how much "effort" you are actually putting towards it.
FSD doesn’t lull humans into a false sense of security, humans do. FSD doesn’t let you use your phone while it’s on. This alone is an upgrade over most human beings, who think occasional quick phone usage while driving is fine (at least for themselves).
I believe that if you replaced all human drivers in the US with FSD as it exists today, fatalities would go down immediately.
Humans are not a gold standard, and the current median human driver is easy to outperform on safety.
Blocking a technology is Luddism. Blocking a company is politics.
The much more likely hypothesis in my view is that he was helping Trump because of personal conviction (only in small parts motivated by naked self-interest).
You should expect rational billionaires to tend politically right out of pure self-interest and distorted perspective alone; because the universal thing that such parties reliably do when in power is cutting tax burden on the top end.
Sending a bunch of scriptkiddies around and having them cut government funding and gut agencies is not really how you make evidence "vanish", how would that even work?
And, lastly, jumping in front of an audience at every opportunity and running your mouth is the absolute last thing anyone would ever do if the goal was to avoid prosecution. But it is perfectly in line with a person who has a very big ego and wants to achieve political goals.
I scrutinise beliefs and assumptions even if they are convenient, and you should, too.
I don't believe that Musk's main motivation for participating in the 2024 election was to avoid prosecution, because his actions are not really compatible with this. There is a much more plausible alternative hypothesis, which his actions are very compatible with: that he preferred (possibly no longer) the Republican platform out of personal conviction rather than for non-prosecution reasons.
> Labor violations, taxes, National Highway traffic safety administration investigation Tesla
Let me say it like this: Billionaires generally don't have to care about minor infractions like this at all. The whole system is set up to shield them from liability, and wealth is an excellent buffer against effective prosecution regardless of who is president. There have been a plethora of infinitely more serious infractions with zero real consequences for the CEOs involved, and this is not because they participated in past presidential election campaigns. See: the VW diesel emission fraud or, much worse, leaded gas in the last century (and what the associated industry did to keep that going).
So this isn't so much of an assumption, as taking him at his word.
What is your actual point? What would he stand in front of a judge for, right now, if Harris had won?
You'd have to ask Musk what he feels so guilty about that he had to buy an election.
He used to be quite charismatic, I believed him up until about 2017 or so. Then I figured he was just a bit greedy and maybe money got to his head but still a respectable innovator. However during 2020 or 2021 (I don’t exactly remember) he started to get quite unpleasant and making obviously short-term decisions, such as relying only on cameras for self driving because of chip shortages but dressing it up as an engineering decision.
You will basically never hear another CEO of another publicly traded company say this. I just don't believe that the same person who cares so little about his stock price that he sends a tweet like that (and the stock dropped 10% on it) also is making fraudulent statements to inflate the price. A better explanation is that he just says what he thinks without regard for the stock price, which is also something you won't see any other CEO of a publicly traded company do.
> U.S. District Judge Beth Bloom, who presided over the case, said in an order that she did not find “sufficient evidence” that Tesla’s failure to initially produce the data was intentional.
(Democrats aren't left btw)
The problem is not that the republican party used to be a conservative right party.
What I’m saying is this is not a sports competition where Musk is automatically an opponent of the Democratic party because he supported Trump. He supported Trump in order to improve his chances with the legal system because he knew Trump would be willing to be so corrupt.
Another world might be imagined in which the Democratic party was taken over in 2016 but that is not the world we live in.
This is how conservatives keep people going 'both sides!' even though they manufacture whatever is required to be that way.
And, as always, let the downvotes rain. If you downvote, it would be nice if you could tell me where I am wrong; it might change my view on things.
All this demonstrates is the term “full self driving” is meaningless.
Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.
If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for that liability.
[1] https://www.sae.org/blog/sae-j3016-update
It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.
"Self driving" could totally mean the human's own self driving.
Having the SAE level is clearer.
There's plenty wrong about the FSD terminology and SAE levels would absolutely be clearer, but I doubt more than a tiny fraction of people are confused as to the target of 'self' in the phrase 'full self driving'.
Also, "All this demonstrates is the term “full self driving” is meaningless" proves my point that it is not misleading.
The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.
> Where did Tesla say FSD is SAE Level 5 approved?
They didn’t say that. They said it could do what a Level 5 self-driving car can do.
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”
> Tesla is full self driving with Level 2/3 supervision, and in my opinion this is not misleading
This is tautology. You’re defining FSD to mean whatever Tesla FSD can do.
[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.
But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)
It needs to have a crash rate equal to or ideally lower than a human driver.
Tesla does not release crash data (wonder why...), has a safety driver with a finger on the kill switch, and only lets select people take rides. Of course according to Elon always-honest-about-timelines Musk, this will all go away Soon(TM) and we will have 1M Robotaxis on the road by December 31st.
Completing a route without intervention doesn't mean much. It needs to complete thousands of routes without intervention.
Keep in mind that Waymos have selective intervention for when they get stuck. Teslas have active intervention to prevent them from mowing down pedestrians.
The problem here isn't that people think they don't need to pay attention because their car can drive itself and then crash. The problem is that people who know full well that they need to focus on driving just don't, because fundamentally the human brain isn't any good at paying attention 100% of the time in a situation where you can get away with not paying attention 99.9% of the time, and naming alone just can't solve this.
The unfortunate thing is that the state of the industry (or, my experience in it) currently is not set up to be able to do that cheaply nor at scale. Imagine you have tens of thousands of various unique problem scenarios to run through, and some might take several minutes of simulation to run the test. Even if your release cadence is slow, but especially if you have continuous deployment with dozens of micro-releases every day: how exactly do you cheaply scale such that simulation testing doesn't become a massive bottleneck?
Unpaid, unrewarded excellence.
Do we expect them to admit they were outright lying and wrong considering their leader is a pill popping Nazi salute making workaholic known to abuse his workers?
But today you just have a private dinner with the president and he'll wave it away.
Other car manufacturers would never get in trouble for this because it's not even possible for them to do it in the first place!
I would deeply encourage you to re-assess whatever led you to make this comment, because you have fallen wildly off the mark here. Corporations are not your friend.