Tesla blows past stopped school bus and hits kid-sized dummies in FSD tests

117 points by ndsipa_pomu | 134 comments | 6/16/2025, 9:58:01 AM | engadget.com

Comments (134)

AlecSchueler · 2h ago
Why are Tesla related posts still being flagged? Mr Musk stepped out of his governmental role, so criticism of his assets is no longer unavoidably political. My understanding was that, for the past few months, criticism of Tesla was seen as a political action and that many here don't want any inflammatory political discussions about the current US administration, but what's the current reason for flagging? This is surely tech/business news through and through.
enslavedrobot · 1h ago
If you follow Tesla for any length of time, you'll find so many disingenuous slanted articles about the company and the cars that you'll begin to wonder what the hell is going on.

This article is about "testing" conducted by a guy who is trying to sell competing software products and has produced videos in the past that weren't replicable and were likely staged. They never release unedited footage or car diagnostic data, just a produced video presented with the intent of damaging Tesla's reputation and boosting their own sales.

This happens all the time, from the New York Times' false claims that their review car ran out of charge, to the most recent Mark Rober video that cost the CEO of a lidar company his job.

The video in this article requires independent validation; it is not from a trusted source, as it is produced by an extremely biased and financially motivated Tesla competitor.

colpabar · 1h ago
The comments become unreadable because everyone just argues over Musk, so people flag the whole thing.
ndsipa_pomu · 1h ago
Ironically, most of the mentions of Musk in this thread are due to the flagging.
2rsf · 8h ago
> "requires a fully attentive driver and will display a series of escalating warnings requiring driver response."

I understand the reasoning behind it, but watching the video () of the test shows that the car did not warn the driver, and even if it had, it was going far too fast, leaving almost no time for a driver to respond.

Disclaimer- I have never used FSD before

() https://dawnproject.com/the-dawn-project-and-tesla-takedowns...

denniebee · 8h ago
> but watching the video () of the test shows that the car did not warn the driver

The warnings occur when you look away or don't touch the steering wheel for a while. Not saying that Tesla is without error (it isn't), but just clarifying what the warnings are for.

hulitu · 8h ago
> The warnings occur when you look away

So they are useless. My car warns me even if I don't look.

jlbooker · 4h ago
> So they are useless. My car warns me even if I don't look.

No, they serve a very specific purpose -- (attempting) to ensure the driver is at the controls and paying attention.

Don't confuse the FSD attention "nag" warnings with collision warnings. The collision warnings will sound all the time, with and without FSD enabled and even if you're not looking at the road. If you don't start slowing down quickly, Automatic Emergency Braking (AEB) will slam on the brakes and bring the car to a stop.

reaperducer · 4h ago
> So they are useless. My car warns me even if I don't look.

Heck, my car not only warns you, it slams on the brakes for you.

Scared the heck out of me when it happened, but it saved me from hitting something I didn't realize was so close.

locococo · 8h ago
Ignoring a stop sign, not even slowing down, that's a major safety flaw.

I am wondering if there is a safety certification body for self driving technology. If not, one is needed because consumers can't be expected to be aware of all the limitations of the latest update they have installed.

There must be basic safety standards these systems need to meet, a disclaimer can't be the solution here.

antennafirepla · 8h ago
There isn’t. We won’t regulate until there is public outcry over tens or hundreds of deaths.
addandsubtract · 6h ago
Or you pay Trump more money than Elon.
potato3732842 · 8h ago
The real problem is that it didn't recognize and stop for the stop signs on the school bus. The child is basically an afterthought designed to appeal to the emotion of those whom logic fails. Even if no kids materialized, the way a bus stop works (bus stops, then kids cross) means that detecting the kid really shouldn't be the primary trigger for stopping in this situation; the stop sign needs to be recognized and acted upon. Any ability to detect and stop for pedestrians is secondary to that.
b3orn · 8h ago
I don't agree with this. Not hitting pedestrians should not just be an afterthought. Of course the car should recognise the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.
alexey-salmin · 3h ago
Yes, but recognizing a pedestrian when he jumps in front of your car is useless -- you don't have time to stop anyway.

What you WANT to recognize is conditions when such an event is possible (obstructed vision) and to slow down in advance even if you don't see/detect any pedestrians at the moment.

This obviously includes the case with the school bus and the stop sign but, as you correctly point out, is not limited to that. There are more cases when a pedestrian, especially a child, can jump under your car from behind a big vehicle or an obstacle.

Recognizing these situations and slowing down in advance is a characteristic trait of a well-intentioned, experienced driver. Though I think that most of the time it's not a skill you have straight out of driving courses; it takes time and a few close calls to burn it into your subconscious.

BobaFloutist · 3h ago
At 25 mph, which I would hope would be the speed limit on roads next to schools, slamming on the brakes even seconds before colliding with children can make an enormous difference in how fast the car is going when it hits the kid.

Speed is the main factor in how severe a collision is (other than weight), and modern brakes are incredibly good.

Not to mention that the car, with its 360 degree sensors, could safely and efficiently swerve around the children even faster than it can brake, as long as there's not a car right next to you in another lane -- and even if there is, hitting another car is far less dangerous to its occupants than hitting the children would be to them.

These things should be so much better than we are, since they're not limited by unidirectional binocular vision, but somehow they're largely just worse. Waymo is, at best, a bit better. On average.

franktankbank · 1h ago
Kids don't get dropped off only at school. They get dropped off at their homes and can be on 55 mph roads. You are quibbling with the main point anyway. The whole reason for the stop signs on buses is because kids will be kids and really it's too late when they run out.
bryanlarsen · 3h ago
25mph is 36 feet per second, about the length of a school bus. The stopping distance at 25mph is 30 feet, assuming perfect reaction time and dry pavement. Human reaction time is about 750ms, so stopping distance is about 2 school bus lengths. You don't have seconds.

25mph is too fast for any street where kids may jump out behind parked cars. Not just school zones, but all residential streets. There's a knee at about 20mph in the traffic fatality stats where below that speed pedestrian collisions are unlikely to be fatal. Above 20mph fatalities become more common quite quickly.
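
A rough back-of-the-envelope check of those figures (illustrative only; it assumes roughly 0.7 g of braking on dry pavement and the 750 ms reaction time mentioned above):

    # Illustrative sketch: reaction distance + braking distance.
    MPH_TO_FPS = 5280 / 3600   # 1 mph = ~1.47 ft/s
    G = 32.2                   # ft/s^2

    def stopping_distance_ft(speed_mph, reaction_s=0.75, decel_g=0.7):
        v = speed_mph * MPH_TO_FPS
        reaction = v * reaction_s             # distance covered before braking starts
        braking = v ** 2 / (2 * decel_g * G)  # kinematics: v^2 = 2*a*d
        return reaction + braking

    for mph in (15, 20, 25):
        print(f"{mph} mph -> ~{stopping_distance_ft(mph):.0f} ft")
    # 25 mph -> ~57 ft, i.e. roughly 1.5-2 school bus lengths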

scienceman · 13m ago
For the knee to be useful you have to go below it -- thus 15 mph is a much better neighborhood and school speed limit.
mckn1ght · 2h ago
This is why school buses flash yellow warning lights before deploying the stop sign and opening the doors.

It should never be the case that someone is surprised by an instantaneous bus stop. There are plenty of context clues in advance, including the fact that the bus is around at all, which should already heighten attention.

cameldrv · 2h ago
There are a lot of videos Waymo has posted of split second swerves they’ve had to do in SF and Austin. It looked to me like a combination of hard braking and swerving could have avoided the collision. Now to be fair to Tesla, the dummies in this test didn’t look very realistic, but not even slowing down for the school bus shows that FSD is not close to being ready for unsupervised use.
potato3732842 · 6h ago
>I don't agree with this. Not hitting pedestrians should not just be an afterthought.

You're disagreeing with something I didn't say. There's a difference between afterthought and the primary initiator of the stop in a situation like this.

>Of course the car should recognize the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.

The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics. Avoiding errant pedestrians like in the video will likely only come as a byproduct of better situational behavior by self driving vehicles. The overwhelming majority of drivers know to drive below the speed limit if the situation is rife with pedestrians or otherwise sus, and are generally fine with missing/obstructed stop signs. I don't know what route self driving software will take to approximate such behavior, but it likely will need to.

ChoGGi · 6h ago
> The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics.

There are different degrees of failure as well: did the Tesla try to brake beforehand, or did it only apply the brakes after hitting the doll?

lawn · 7h ago
Bah. The car should slow down if the view is restricted or when it's passing a bus, especially a school bus, regardless of whether there's a stop sign or not.
ryandrake · 4h ago
Merely slowing down is not enough. If the car doesn't come to a complete stop and remain stopped while the bus has its stop sign extended, it's driving illegally.
instaclay · 3h ago
They're saying that if there's a school bus inside a narrow corridor, a prudent driver would slow down and use caution IN ANY CIRCUMSTANCE.

They're obviously not arguing that the car shouldn't stop with a sign deployed.

Arguing from a point of bad faith doesn't advance the discussion.

ryandrake · 2h ago
I'm saying slowing down is better, but not enough. If the manufacturer can program the system to slow down when it sees a school bus inside of a narrow corridor, then it can also program the system to (correctly) stop in that situation.
BobaFloutist · 3h ago
I want them to drive legally, but I also want them to be able to react to objects even if they don't have signs on them.

Signs are often obstructed by trees or are simply inadequate for safe driving, even when combined with the rules of the road. Any even "partially" automated driving should be able to trivially recognize when its view is compromised and proceed accordingly.

ryandrake · 3h ago
Totally agree. "Stop for a stopped school bus" and "Drive slow enough to stop in an emergency, in areas where pedestrians are obscured" are two separate, barely related problems that car companies need to reliably solve.
alexey-salmin · 3h ago
Regardless of one's stance on Tesla, it's sad to see this post flagged.
nunez · 9m ago
This is a Dan O'Dowd production. He's spent significant capital trying to take down Autopilot/FSD for years and has played dirty in the past.
Molitor5901 · 2h ago
I think the idea of self-driving needs to be strongly re-evaluated. There are countless videos of people in their Teslas driving down the road... from the back seat. FSD is simply not feasible right now, but it seems that when people let Tesla take the wheel, they are easily duped into assuming it will always work - when it doesn't.

Until there are federal standards and rigorous testing of self-driving vehicle technologies they should not be installed, or advertised.

angusb · 8h ago
This has done the rounds on other platforms. A couple of important points:

- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision

- The Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence of FSD beta vs robotaxi FSD

Nevertheless this is a very important test result for FSD Supervised! But I don't like that The Dawn Project are framing this as evidence for why FSD robotaxi (a different version) should not be allowed, without noting that they have tested a different version.

fabian2k · 8h ago
I don't see why Tesla would deserve the benefit of the doubt here. We cannot know how well the actual taxi software will work, so I think it is fair to extrapolate from the parts we can observe.
angusb · 8h ago
re. extrapolation: I agree with that, but remember there's sampling error. The crashes/failures go viral but the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but I think it does mean it's sensible to try to augment the data point of this video by imagining the scenarios where the self driving car performs more safely than the average human driver.
fabian2k · 8h ago
I absolutely do think that self-driving cars will save many lives in the long run. But I also think it is entirely fair to focus on the big, visible mistakes right now.

This is a major failure; failing to observe a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.

angusb · 7h ago
I agree it's a major mistake + should get a lot of focus from the FSD team. I'm just unsure whether that directly translates to prohibiting a robotaxi rollout (I'm open to the possibility it should though).

I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely includes some nonzero level of critical mistakes. Right now several people are mining Tesla FSD for any place it makes critical mistakes, and these are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves some possibility of it being significantly safer than the median driver while still allowing for videos like this to proliferate.

I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!

ethbr1 · 7h ago
Zero systemic, reproducible mistakes is the only acceptable criterion.

Do you really want to trust a heartless, profit-motivated corporation with 'better than human is good enough'?

What happens when Tesla decides they don't want to invest in additional mistake mitigation, because it's incompatible with their next product release?

Reubachi · 6h ago
Caveat/preface to prevent trolls: FSD is a sham and money grab at best, death trap at worst, etc.

But I've read through your chain of replies to OP, and maybe I can help with my POV.

OP is replying in good faith showing "this sampling incident is out of scope of production testing/cars for several reasons, all greatly skewing the testing from this known bad actor source."

And you reply with "Zero systemic reproducible mistakes is the only acceptable critera."

Well then, you should know that that is the current situation. In Tesla's testing, they achieve this. The "test" in this article, which the OP is pointing out, is not a standardized test by Tesla on current platforms. So be careful with your ultimatums, or you might give the corporation a green light to say "look! we tested it!".

I am not a Tesla fan. However, I am also aware that yesterday, thousands of people across the world were mowed down by human operators.

If I put out a test video showing that a human runs over another human with minimum circumstances met, i.e. rain, distraction, tires, density, etc., would you call for a halt on all human driving? Of course not; you'd investigate the root cause, which is, most of the time, distracted or impaired driving.

angusb · 6h ago
> Do you really want to trust...

No, but the regulator helps here - they do their own independent evaluation

> What happens when Tesla decides...

the regulator should pressure them for improvements and suspend licenses for self driving services that don't improve

ethbr1 · 6h ago
The regulator doesn't currently exist in a functional capacity.

https://techcrunch.com/2025/02/21/elon-musks-doge-comes-for-...

spwa4 · 7h ago
It should get a lot of focus from the regulator, not "the FSD team".
ethbr1 · 7h ago
Tesla and this administration operate under the Boeing model: surely the manufacturer knows best.
angusb · 7h ago
Agreed - I said "FSD team" to distinguish from "the crowds", but this was the wrong wording; it should be the regulator too.
locococo · 8h ago
No! Ignoring a stop sign is such a basic driving standard that it's an automatic disqualification. A driver that misses a stop sign would not have my kids in their car. They could be the safest driver on the racetrack; it does not matter at that point.
dzhiurgis · 6h ago
Also, they've repeatedly tested closer and closer distances until Tesla failed, aka p-hacking.
sokoloff · 59m ago
In the video (starting at ~13 seconds), the Tesla is at least 16 and probably 20 car lengths from the back of the bus, with the bus's red flashing lights on the entire time.

If the Tesla can't stop for the bus (not the kid) in 12 car lengths, that's not p-hacking, that's Tesla FSD being both unlawful and obviously unsafe.

reaperducer · 4h ago
> I agree with that, but remember there's sampling error.

Ma'am, we're sorry your little girl got splattered all over the road by a billionaire's toy. But, hey, sampling errors happen.

locococo · 8h ago
I think their test is valid and not in bad faith, because they demonstrate that Tesla's self driving technology has no basic safety standards.

Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.

angusb · 8h ago
> Your argument that a newer version is better

I actually didn't say that and am not arguing it formally - I said what I said because I think that the version difference is something that should be acknowledged when doing a test like this.

I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.

Topfi · 7h ago
> […] I think that the version difference is something that should be acknowledged when doing a test like this.

Did they anywhere refer to this as Robotaxi software rather than calling it FSD, the term Tesla has chosen for this?

angusb · 6h ago
I don't think there's an official divergence in terminology other than the version numbers (which have mostly stopped incrementing for FSD vehicle owners; meanwhile, there is a lot of work going into new iterations of the version running on Tesla's robotaxi fleet).
Topfi · 2h ago
Then I struggle to understand what they should have acknowledged here concerning the software used. That they do not have access to a version of FSD which currently isn't accessible to the public? I'd think that's self-evident for any organisation not affiliated with Tesla.
interloxia · 8h ago
It is also a failure that it just cruises on and continues to drive over the thing/child that was hit.
angusb · 8h ago
that should have been in my list, you're right
mxschumacher · 8h ago
It's always the next version that will go from catastrophic failure to being perfect. This card has been played 100 times over the last few years.
jantissler · 3h ago
Exactly what I wanted to add. Every single time there is hard evidence of what a failure FSD is, someone points out that they didn't use the latest beta. And of course they provide zero evidence that this newer version actually addresses the problem. Anyone who knows anything about software and updates understands how new versions can actually introduce new problems and new bugs …
locococo · 8h ago
Exactly this.
chneu · 8h ago
Every time Tesla's FSD is shown to be lacking, someone always says "well that's not the real version and these people are biased!"
ryandrake · 4h ago
Don't forget the standard "And the next version surely will be much better!"
bestouff · 8h ago
> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

That's why you slow down when you pass a bus (or a huge American SUV).

sokoloff · 8h ago
In this case, you stop for the bus that is displaying the flashing red lights.

Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.

ethbr1 · 7h ago
I'm honestly shocked that all versions of Tesla assisted/self-driving don't have a hardcoded stopped schoolbus exception to force slow driving.

Worst case, you needlessly slow down around a roadside parked schoolbus.

Best case, you save a kid's life.

ryandrake · 4h ago
If it merely slowed down, it would still be driving illegally. Everywhere I've driven in the USA, if there is a stopped school bus in front of you, or on the other side of the road, all traffic on the road must stop and remain stopped until the school bus retracts the sign and resumes driving.
ethbr1 · 3h ago
Point being, if it always slowed down (sign or no sign), then at least reaction time requirements would be lessened.
ryandrake · 2h ago
I mean, it can be programmed in any way, so why not program it to follow both rules: If the sign is extended, then stop. Else if the sign is not extended, slow down.
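
A minimal sketch of that two-rule policy (purely illustrative; the function and its inputs are made up for the example, not any real FSD or AEB interface):

    # Toy illustration of the two rules above; not any real Tesla API.
    def school_bus_behavior(bus_present: bool, stop_arm_extended: bool) -> str:
        if bus_present and stop_arm_extended:
            return "stop and hold until the stop arm retracts"
        if bus_present:
            return "slow down and be ready to stop"  # kids may still cross
        return "proceed normally"

    print(school_bus_behavior(bus_present=True, stop_arm_extended=True))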
pas · 6h ago
If I put a stop sign somewhere, is that legal? Or is there some statute (or at least local ordinance) that says that yellow buses on such-and-such a route can have moving stop signs?
sokoloff · 5h ago
Obviously, there is law that forms the basis for this. In Massachusetts, it's MGL c. 90 § 14

https://www.mass.gov/info-details/mass-general-laws-c90-ss-1...

addandsubtract · 6h ago
Yes, in the US, school buses come equipped with a stop sign that they extend when stopping to let kids on and off the bus. You (as a driver) are required to stop (and not pass) on both sides of the road while the bus has their sign extended.
q3k · 6h ago
It's almost as if we need human-level intelligence to be actually able to reason about the nuances of applicability of traffic signs :).
Sharlin · 8h ago
Exactly. If your view is obstructed by something, anything, you slow down.
Zigurd · 5h ago
The simulation of a kid running out from behind the bus is both realistic, and it points out another aspect of the problem with FSD. It didn't just pass the bus illegally. It was going far too fast while passing the bus.

As for being unrepresentative of the next release of FSD, we've had, what, eight or ten years of "it's going to work on the next release".

BobaFloutist · 3h ago
>(Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

Shouldn't that be the one case where self driving system has an enormous natural advantage? It has faster reflexes, and it doesn't require much, if any, interpretation or understanding of signs or predictive behavior of other drivers. At the very worst, the car should be able to detect a big object in the road and try to brake and avoid the object. If the car can't take minimal steps to avoid crashing into any given thing that's in front of it on the road, what are we even doing here?

enragedcacti · 3h ago
Some important counter-points:

- FSD has been failing this test publicly for almost three years, including in a Super Bowl commercial. It strains credulity to imagine that they have a robust solution that they haven't bothered to use to shut up their loudest critic.

- The Robotaxi version of FSD is reportedly optimized for a small area of Austin, and is going to extensively use tele-operators as safety drivers. There is no evidence that Robotaxi FSD isn't "supposed" to be used with human supervision; its supervision will just be subject to latency and limited spatial awareness.

- The Dawn Project's position is that FSD should be completely banned because Tesla is negligent with regard to safety. Having a test coincide with the Robotaxi launch is good for publicity, but the distinction isn't really relevant because the fundamental flaw is with the company's approach to safety regardless of FSD version.

- Tesla doesn't have an inalienable right to test 2-ton autonomous machines on public roads. If they wanted to demonstrate the safety of the robotaxi version they could publish the reams of tests they've surely conducted and begin reporting industry standard metrics like miles per critical disengagement.

arccy · 8h ago
Sounds like a Tesla problem for naming their crappy tech "full self driving".


WillAdams · 8h ago
Does the Tesla taxi option afford radar/lidar?

My understanding is that Tesla is the only manufacturer trying to make self-driving work with just visual-spectrum cameras --- all other vendors use radar/lidar _and_ visual-spectrum cameras.

ndsipa_pomu · 8h ago
They've painted themselves into a corner really: if they start using anything other than just cameras, they'll be on the hook for selling previous cars as being fully FSD-capable, and will presumably have to retrofit radar/lidar for anyone who was mis-sold a Tesla.
rsynnott · 4h ago
... Wait, so you think that Tesla have a child-killing and a non-child-killing version, but are only providing the child-killing version to consumers?

... eh? I mean, what?

ndsipa_pomu · 8h ago
> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

Not really - it's a case of slowing down and anticipating a potential hazard. It's a fairly common situation with any kind of bus, or similarly if you're overtaking a stationary high-sided vehicle as pedestrians may be looking to cross the road (in non-jay-walking jurisdictions).

burnt-resistor · 5h ago
Yes. And given the latency of cameras (or even humans), the inability to see around objects, and the fact that dogs and kids can move fast from hidden areas into the path, driving really slowly next to large obstacles until you can see behind them becomes more important.

One of the prime directives of driving, for humans and FSD systems alike, must be "never drive faster than the brakes can stop within the visible area". This must account for scenarios such as obstacles stopped, or possibly coming the wrong way, around a mountain turn.

BobaFloutist · 3h ago
And that should be a case where automated systems can easily beat people. It's "just" math, there's no interpretation, no reasoning, no reading signs, no predicting other drivers, just crunching the numbers based on stopping distance and field of view. If they can't even do this, what are they for?
ndsipa_pomu · 3h ago
> never drive faster than brakes can stop in visible areas

Here in the UK, that's phrased in the Highway Code as "Drive at a speed that will allow you to stop well within the distance you can see to be clear". It's such a basic tenet of safety as you never know if there's a fallen tree just round the next blind corner etc. However, it doesn't strictly apply to peds running out from behind an obstruction as your way ahead can be clear, until suddenly it isn't - sometimes you have to slow just for a possible hazard.
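
As a rough sketch of what that rule implies numerically (illustrative only; it assumes ~0.7 g braking and a 750 ms reaction time, which are my own round numbers, not anything from the Highway Code):

    # Illustrative only: given how far ahead you can see to be clear,
    # roughly how fast can you go and still stop within it?
    import math

    G = 32.2                  # ft/s^2
    FPS_TO_MPH = 3600 / 5280

    def max_safe_speed_mph(clear_distance_ft, reaction_s=0.75, decel_g=0.7):
        a = decel_g * G
        # solve d = v*t + v^2 / (2a) for v
        v = -a * reaction_s + math.sqrt((a * reaction_s) ** 2 + 2 * a * clear_distance_ft)
        return v * FPS_TO_MPH

    for d in (40, 80, 150):
        print(f"clear for {d} ft -> about {max_safe_speed_mph(d):.0f} mph max")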

handsclean · 7h ago
I’ve seen this “next version” trick enough times and in enough contexts to know when to call BS. There’s always a next version, it’s rarely night-and-day better, and when it is better you’ll have evidence, not just a salesman’s word. People deserve credit/blame today for reality today, and if reality changes tomorrow, tomorrow is when they’ll deserve credit/blame for that change. Anybody who tries to get you to judge them today for what you can’t see today is really just trying to make you dismiss what you see in favor of what you’re told.
aeurielesn · 7h ago
Sorry, I am failing to see how any of these three points are relevant or even important.
jofzar · 8h ago
Before anyone says it's the responsibility of the driver: that's only while there is still a driver.

https://www.reddit.com/r/teslamotors/comments/1l84dkq/first_...

ndsipa_pomu · 3h ago
I don't know why this is flagged unless it's just the Tesla/Musk association. I thought that self-driving vehicles are a popular topic on HN.
lawn · 2h ago
It's obviously because of Musk. Anything that paints him in a bad light is flagged ad nauseam.
sigmoid10 · 8h ago
Why on earth can't this be done by normal testing agencies? Why do things like "Tesla Takedown" have to participate in it? Even if the test was 100% legit, that connection to mere protest movements taints it immediately. It's like when oil companies publish critical research on climate change. Or Apple publishing research that AI is not that good, so their own fumblings should not be seen as a bad omen. This kind of stuff could be factually completely correct and most rational people would still immediately dismiss it due to conflict of interest. All this will do is flame up fanboys who were already behind it and get ignored by people who weren't. If the real goal is to divide society, this is how you do it.
sitkack · 8h ago
You mean the ones gutted by said owner of said company?

Comparing "Tesla Takedown" with ExxonMobile is way too quick, you should have said Greenpeace. I'd say that TT has to do this, Is part of the point.

cannonpr · 8h ago
In normal times, perhaps, today…

https://www.theverge.com/news/646797/nhtsa-staffers-office-v...

When regular, in-theory bipartisan mechanisms fail, protest is all you have left.

fragmede · 8h ago
> NHTSA staffers evaluating the risks of self-driving cars were reportedly fired by

Elon Musk, who also owns Tesla.


philistine · 5h ago
You seem to ignore the historical reasons why testing agencies exist in the first place. There wasn't a consensus back then that they even needed to exist. People like Ralph Nader needed to hammer the point again and again and again to will those standards into existence. The pressure groups fighting against Tesla are doing the same harsh difficult job that Nader did.

https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed%3A_The_Des...

randomcarbloke · 8h ago
Well quite; it's an indictment of America's institutions if investigations held in the public interest must be conducted by third parties and not the establishment.

What other hidden dangers slip by without public knowledge?

ethbr1 · 7h ago
If only there were some proven way to hold abusive power to account in the public consciousness. https://en.m.wikipedia.org/wiki/The_Jungle
ecocentrik · 7h ago
For the same reason that nonprofit consumer advocacy and safety organizations like Consumer Reports and the Center for Auto Safety exist. Lobbyists and wealthy private individuals exert a lot of influence over the operations of publicly funded oversight agencies. They have the power to censor these agencies or as we've seen recently, fire all their staff, defund them or close them completely.

As for the fanbois and f*cbois, they have always existed and will always exist. They are the pawns. Smart mature people learn to lead, direct, manipulate, deflect and ignore them.

ndsipa_pomu · 8h ago
In my view, the onus should be on the manufacturer to conduct tests and prove that the system is safe to use, and those tests should meet the requirements of the regulating authority (presumably NHTSA in the U.S.) and of the appropriate cities/states if they agree to allow limited public road usage.
bjord · 8h ago
In theory, that sounds great, but in practice, why should we trust manufacturers to reliably test their own products?
ndsipa_pomu · 8h ago
I agree, but it's fairly common practice. I think a "trust, but verify" approach should be used, with board members jailed for attempts to fool the regulators (cf. the VW emissions fraud).
const_cast · 37m ago
The problem is that the manufacturer of course has a very strong incentive to side with itself. People don't advocate against themselves, and companies are no different.

When J&J found out about potential asbestos contamination in their baby powder in the 70s, they managed to convince the FDA that they would research and handle it. It took until 2020 for it to come to light that they did not do that, and that, in fact, their baby powder was contaminated.

They ran multiple studies, and some of them even showed that the amount of asbestos in their product was dangerous. But those studies never saw the light of day, and the company acted in a self-preserving manner. It's a similar story with 3M and byproducts of Teflon.

But federal or state agencies have no allegiance to a company's bottom line. They don't have the same incentives to lie or downplay. So, I think, it only makes sense that they should be responsible for testing directly, not just supervising.

I also think we need to adopt some legislation so that we must test products before we release them. You may be shocked to know you're allowed to release new chemical products without proving their safety, and you can even release new food products without proving their safety. Most of these products end up being okay in the long run, but some we have to retroactively ban. It would be easier for everyone if we began in a banned state.

ethbr1 · 6h ago
The difference in staffing between trust-plus-verification and no-trust is minuscule, because any savings come from things that are trusted and unverified.

Why not take an objective, fact-based regulatory approach from the start?

orwin · 5h ago
I've read somewhere that the NHTSA people working on mandatory tests for self driving were fired by DOGE.
lawn · 8h ago
Tesla has been lying for years. Why would you trust them to conduct their own tests? That's completely backwards.

Independent tests are what's needed, and preferably done by agencies who won't get gutted because they find something that's inconvenient.

Eisenstein · 8h ago
Which testing agencies are doing these tests?
sigmoid10 · 8h ago
sjsdaiuasgdia · 8h ago
That's crash testing, which is about the safety of the people inside the vehicle.

This test is about whether Tesla's self driving technology is safe for the people outside the vehicle.

sigmoid10 · 8h ago
>That's crash testing, which is about the safety of the people inside the vehicle.

Crash testing is much more than that. Check out NCAP for example. They specifically include safety ratings for other, vulnerable road users (i.e. pedestrians). And the Model 3 currently sits at the top spot of their 2025 rankings in that category.

sjsdaiuasgdia · 8h ago
That's still after the person outside the vehicle has been struck.

This test shows the self driving software disobeys traffic laws (the school bus stop sign requires everyone else to stop), resulting in a pedestrian collision that would not have happened had traffic laws been followed.

potato3732842 · 8h ago
The same agencies that do crash testing generally do all manner of other tests. Rollover testing, brake distance testing, restraint systems, drive aids, car seat fitment, etc, etc.
fragmede · 7h ago
Or cigarette companies having doctors show that cigarettes don't cause cancer. It turns out that the bar for science is really, really high, and for stuff that's less obvious and takes time and lots of money and effort, it's really hard to show. You'd think I was a loon if I told you gravity wasn't a decided matter of physics, since we're all glued to this Earth every day, but we still don't know what dark matter is or what the ramifications for the theory of gravity are. Science still doesn't know how to test psychedelic medicine because it's obvious to the control group that they're on the placebo. That doesn't mean we should lower the bar for science, just that we should be aware of shortcomings in our beliefs.

So if you want to get into the details and figure out why a video from "Tesla Takedown" should or should not be believed, I'm all ears, but I'm some random on the Internet. I don't work at NHTSA or anywhere that could effect change based on the outcome of this video. It's not going to affect my buying decisions one way or another, but it'll only divide people who have decided this matters to them and can't get along with others.

ndsipa_pomu · 8h ago
It seems crazy to me that the U.S. allows beta software to be used to control potentially lethal vehicles on public roads. (Not that I have much faith in human controlled lethal vehicles)


anArbitraryOne · 6h ago
Hopefully the dummies were from the board of directors
vachina · 8h ago
Isn’t FSD modeled after real drivers?

In that case it could’ve learnt almost nobody ever fully stops at a stop sign, or in this case a bus stopped with a stop sign.

sokoloff · 8h ago
Blowing past the stop signals on a school bus is one of the few traffic violations for which I see overwhelming support for strict adherence to the law.

People are way more accepting/understanding of drunk driving than passing school bus stop signals.