Tesla blows past stopped school bus and hits kid-sized dummies in FSD tests

88 points by ndsipa_pomu | 6/16/2025, 9:58:01 AM | 104 comments | engadget.com ↗

Comments (104)

alexey-salmin · 5m ago
Regardless of one's stance on Tesla, it's sad to see this post flagged.
2rsf · 4h ago
> "requires a fully attentive driver and will display a series of escalating warnings requiring driver response."

I understand the reasoning behind it, but watching the video () of the test shows that the car did not warn the driver, and even if it had, it was going too fast, leaving almost no time for a driver to respond.

Disclaimer- I have never used FSD before

() https://dawnproject.com/the-dawn-project-and-tesla-takedowns...

denniebee · 4h ago
> but watching the video () of the test shows that the car did not warn the driver

The warnings occur when you look away or don't touch the steering wheel for a while. Not saying that Tesla is without error (it isn't), just clarifying what the warnings are for.

hulitu · 4h ago
> The warnings occur when you look away

So they are useless. My car warns me even if I don't look.

jlbooker · 59m ago
> So they are useless. My car warns me even if I don't look.

No, they serve a very specific purpose -- (attempting) to ensure the driver is at the controls and paying attention.

Don't confuse the FSD attention "nag" warnings with collision warnings. The collision warnings will sound all the time, with and without FSD enabled and even if you're not looking at the road. If you don't start slowing down quickly, Automatic Emergency Braking (AEB) will slam on the brakes and bring the car to a stop.
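
To sketch the distinction (a toy model; all thresholds and names here are hypothetical, not Tesla's actual logic): the attention nag and the collision stack are two independent loops, one watching the driver and one watching the road.

    ATTENTION_TIMEOUT_S = 10  # hypothetical: seconds of inattention per escalation step
    ESCALATION_STEPS = ["visual warning", "chime", "loud alarm", "disengage FSD"]

    def attention_monitor(eyes_on_road, hands_on_wheel, inattentive_s):
        """The 'nag': cares only about whether the driver is supervising."""
        if eyes_on_road or hands_on_wheel or inattentive_s < ATTENTION_TIMEOUT_S:
            return None  # attentive enough: no warning
        step = int(inattentive_s // ATTENTION_TIMEOUT_S) - 1
        return ESCALATION_STEPS[min(step, len(ESCALATION_STEPS) - 1)]

    def collision_monitor(time_to_collision_s):
        """FCW/AEB: runs regardless of FSD state or where the driver is looking."""
        if time_to_collision_s < 1.0:
            return "AEB: full braking"
        if time_to_collision_s < 2.5:
            return "collision warning"
        return None

    print(attention_monitor(False, False, 25.0))  # -> chime
    print(collision_monitor(0.8))                 # -> AEB: full braking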

reaperducer · 1h ago
> So they are useless. My car warns me even if I don't look.

Heck, my car not only warns me, it slams on the brakes for me.

Scared the heck out of me when it happened, but it saved me from hitting something I didn't realize was so close.

locococo · 4h ago
Ignoring a stop sign, not even slowing down, that's a major safety flaw.

I wonder if there is a safety certification body for self-driving technology. If not, one is needed, because consumers can't be expected to be aware of all the limitations of the latest update they have installed.

There must be basic safety standards these systems need to meet, a disclaimer can't be the solution here.

antennafirepla · 4h ago
There isn’t. We won’t regulate until there is public outcry over tens or hundreds of deaths.
addandsubtract · 2h ago
Or you pay Trump more money than Elon.
potato3732842 · 4h ago
The real problem is that it didn't recognize and stop for the stop signs on the school bus. The child is basically an afterthought, designed to appeal to the emotions of those whom logic fails. Even if no kids materialized, the way a bus stop works (bus stops, then kids cross) means that detecting the kid really shouldn't be the primary trigger for stopping in this situation; the stop sign needs to be recognized and acted upon. Any ability to detect and stop for pedestrians is secondary to that.
b3orn · 4h ago
I don't agree with this. Not hitting pedestrians should not just be an afterthought. Of course the car should recognise the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.
potato3732842 · 2h ago
>I don't agree with this. Not hitting pedestrians should not just be an afterthought.

You're disagreeing with something I didn't say. There's a difference between afterthought and the primary initiator of the stop in a situation like this.

>Of course the car should recognize the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.

The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics. Avoiding errant pedestrians like the one in the video will likely only come as a byproduct of better situational behavior by self-driving vehicles. The overwhelming majority of drivers know to drive below the speed limit if the situation is rife with pedestrians or otherwise sus, and they generally cope fine with missing/obstructed stop signs. I don't know what route self-driving software will take to approximate such behavior, but it likely will need to.

ChoGGi · 2h ago
> The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics.

There are different degrees of failure as well: did the Tesla try to brake beforehand, or only apply the brakes after hitting the doll?

lawn · 3h ago
Bah. The car should slow down if the view is restricted or when it's passing a bus, especially a school bus, regardless of whether there's a stop sign or not.
ryandrake · 10m ago
Merely slowing down is not enough. If the car doesn't come to a complete stop and remain stopped while the bus has its stop sign extended, it's driving illegally.
angusb · 4h ago
This has done the rounds on other platforms. A couple of important points:

- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision

- the Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence between FSD beta and robotaxi FSD

Nevertheless this is a very important test result for FSD supervised! But I don't like that the Dawn Project are framing this as evidence for why FSD robotaxi should not be allowed without noting that they tested a different version.

fabian2k · 4h ago
I don't see why Tesla would deserve the benefit of the doubt here. We cannot know how well the actual Taxi software will work, I think it is fair to extrapolate from the parts we can observe.
angusb · 4h ago
re. extrapolation: I agree with that, but remember there's sampling error. The crashes/failures go viral, but the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but I think it does mean it's sensible to try to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.
fabian2k · 4h ago
I absolutely do think that self-driving cars will save many lives in the long run. But I also think it is entirely fair to focus on the big, visible mistakes right now.

This is a major failure; failing to observe a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.

angusb · 3h ago
I agree it's a major mistake + should get a lot of focus from the FSD team. I'm just unsure whether that directly translates to prohibiting a robotaxi rollout (I'm open to the possibility it should though).

I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes, extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely includes some nonzero level of critical mistakes. Right now several people are mining FSD for any place it makes critical mistakes, and these are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves some possibility of FSD being significantly safer than the median driver while still allowing for videos like this to proliferate.

I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!

ethbr1 · 3h ago
Zero systemic, reproducible mistakes is the only acceptable criterion.

Do you really want to trust a heartless, profit-motivated corporation with 'better than human is good enough'?

What happens when Tesla decides they don't want to invest in additional mistake mitigation, because it's incompatible with their next product release?

Reubachi · 2h ago
Caveat/preface to prevent trolls: FSD is a sham and money grab at best, death trap at worst, etc.

But I've read through your chain of replies to OP and maybe I can help with my POV.

OP is replying in good faith, showing that "this sampling incident is out of scope of production testing/cars for several reasons, all greatly skewing the testing from this known bad actor source."

And you reply with "Zero systemic, reproducible mistakes is the only acceptable criterion."

Well then, you should know that is the current situation: in Tesla's own testing, they achieve this. The "test" in this article, as the OP points out, is not a standardized test run by Tesla on current platforms. So be careful with your ultimatums, or you might give the corporation a green light to say "look! we tested it!".

I am not a Tesla fan. However, I am also aware that yesterday, thousands of people across the world were mowed down by human operators.

If I put out a test video showing that a human runs over another human with minimum circumstances met (i.e. rain, distraction, tires, density, etc.), would you call for a halt on all human driving? Of course not; you'd investigate the root cause, which is, most of the time, distracted or impaired driving.

angusb · 3h ago
> Do you really want to trust...

No, but the regulator helps here - they do their own independent evaluation

> What happens when Tesla decides...

the regulator should pressure them for improvements and suspend licenses for self driving services that don't improve

ethbr1 · 2h ago
The regulator doesn't currently exist in a functional capacity.

https://techcrunch.com/2025/02/21/elon-musks-doge-comes-for-...

spwa4 · 3h ago
It should get a lot of focus from the regulator, not "the FSD team".
ethbr1 · 3h ago
Tesla and this administration operate under the Boeing model: surely the manufacturer knows best.
angusb · 3h ago
agreed - I said "FSD team" to distinguish from "the crowds", but this was the wrong wording; it should be the regulator too.
locococo · 4h ago
No! Ignoring a stop sign is such a basic driving standard that it's an automatic disqualification. A driver that misses a stop sign would not have my kids in their car. They could be the safest driver on the racetrack; it does not matter at that point.
reaperducer · 58m ago
> I agree with that, but remember there's sampling error.

Ma'am, we're sorry your little girl got splattered all over the road by a billionaire's toy. But, hey, sampling errors happen.

dzhiurgis · 2h ago
Also, they've repeatedly tested closer and closer distances until the Tesla failed, aka p-hacking.
enragedcacti · 3m ago
Some important counter-points:

- FSD has been failing this test publicly for almost three years, including in a Super Bowl commercial. It strains credulity to imagine that they have a robust solution that they haven't bothered to use to shut up their loudest critic.

- The Robotaxi version of FSD is reportedly optimized for a small area of Austin, and is going to extensively use tele-operators as safety drivers. There is no evidence that Robotaxi FSD isn't "supposed" to be used with human supervision; its supervision will just be subject to latency and limited spatial awareness.

- The Dawn Project's position is that FSD should be completely banned because Tesla is negligent with regard to safety. Having a test coincide with the Robotaxi launch is good for publicity, but the distinction isn't really relevant, because the fundamental flaw is with the company's approach to safety regardless of FSD version.

- Tesla doesn't have an inalienable right to test 2-ton autonomous machines on public roads. If they wanted to demonstrate the safety of the robotaxi version, they could publish the reams of tests they've surely conducted and begin reporting industry-standard metrics like miles per critical disengagement (sketched below).
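
For anyone unfamiliar with that metric: it's just total autonomous miles divided by the number of disengagements flagged critical. A toy sketch (the log format here is made up):

    # Hypothetical log: one record per drive, with miles driven and any
    # disengagements, each flagged critical or not.
    drives = [
        {"miles": 12.4, "disengagements": [{"critical": False}]},
        {"miles": 30.1, "disengagements": []},
        {"miles": 8.7,  "disengagements": [{"critical": True}]},
    ]

    total_miles = sum(d["miles"] for d in drives)
    critical = sum(1 for d in drives
                   for e in d["disengagements"] if e["critical"])
    if critical:
        print(f"{total_miles / critical:.1f} miles per critical disengagement")
    else:
        print(f"{total_miles:.1f} miles, zero critical disengagements")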

locococo · 4h ago
I think their test is valid and not in bad faith, because they demonstrate that Tesla's self-driving technology has no basic safety standards.

Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.

angusb · 4h ago
> Your argument that a newer version is better

I actually didn't say that and am not arguing it formally - I said what I said because I think that the version difference is something that should be acknowledged when doing a test like this.

I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.

Topfi · 3h ago
> […] I think that the version difference is something that should be acknowledged when doing a test like this.

Did they anywhere refer to this as Robotaxi software rather than calling it FSD, the term Tesla has chosen for this?

angusb · 3h ago
I don't think there's an official divergence in terminology other than the version numbers (which have mostly stopped incrementing for FSD vehicle owners; meanwhile, there is a lot of work going into new iterations for the version running on Tesla's robotaxi fleet)
interloxia · 4h ago
It is also a failure that it just cruises on, continuing to drive over the thing/child that was hit.
angusb · 4h ago
that should have been in my list, you're right
bestouff · 4h ago
> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

That's why you slow down when you pass a bus (or a huge American SUV).

sokoloff · 4h ago
In this case, you stop for the bus that is displaying the flashing red lights.

Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.

ethbr1 · 3h ago
I'm honestly shocked that all versions of Tesla assisted/self-driving don't have a hardcoded stopped schoolbus exception to force slow driving.

Worst case, you needlessly slow down around a roadside parked schoolbus.

Best case, you save a kid's life.
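
A minimal sketch of what such a hardcoded rule could look like, purely to illustrate the idea (the perception fields here are hypothetical; a real stack is far messier). The point is that a dumb, conservative cap layered over the planner is cheap insurance:

    SCHOOL_BUS_SLOW_MPH = 10.0  # hypothetical hard cap near any roadside school bus

    def speed_cap_mph(detections, planner_cap_mph):
        """Hardcoded override: never trust the planner near a stopped school bus."""
        for obj in detections:
            if obj["class"] == "school_bus" and obj["stopped"]:
                if obj.get("stop_arm_extended"):
                    return 0.0  # stop arm out: full stop, both directions
                return min(planner_cap_mph, SCHOOL_BUS_SLOW_MPH)  # worst case: needless caution
        return planner_cap_mph

    # A bus with its stop arm out forces a stop no matter what the planner wants:
    print(speed_cap_mph([{"class": "school_bus", "stopped": True,
                          "stop_arm_extended": True}], 40.0))  # -> 0.0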

ryandrake · 18m ago
If it merely slowed down, it would still be driving illegally. Everywhere I've driven in the USA, if there is a stopped school bus in front of you, or on the other side of the road, all traffic on the road must stop and remain stopped until the school bus retracts the sign and resumes driving.
pas · 2h ago
if I put a stop sign somewhere, is that legal? or is there some statute (or at least a local ordinance) that says that yellow buses on this and this route can have moving stop signs?
sokoloff · 1h ago
Obviously, there is law that forms the basis for this. In Massachusetts, it's MGL c. 90 § 14

https://www.mass.gov/info-details/mass-general-laws-c90-ss-1...

addandsubtract · 2h ago
Yes, in the US, school buses come equipped with a stop sign that they extend when stopping to let kids on and off the bus. You (as a driver) are required to stop (and not pass) on both sides of the road while the bus has its sign extended.
q3k · 2h ago
It's almost as if we need human-level intelligence to be actually able to reason about the nuances of applicability of traffic signs :).
Sharlin · 4h ago
Exactly. If your view is obstructed by something, anything, you slow down.
mxschumacher · 4h ago
it's always the next version that will go from catastrophic failure to perfection. This card has been played 100 times over the last few years.
locococo · 4h ago
exactly this
chneu · 4h ago
Every time Tesla's FSD is shown to be lacking, someone always says "well, that's not the real version and these people are biased!"
ryandrake · 17m ago
Don't forget the standard "And the next version surely will be much better!"
Zigurd · 2h ago
The simulation of a kid running out from behind the bus is realistic, and it points out another aspect of the problem with FSD. It didn't just pass the bus illegally; it was going far too fast while passing it.

As for being unrepresentative of the next release of FSD: we've had, what, eight or ten years of "it's going to work on the next release".

arccy · 4h ago
sounds like a tesla problem for naming their crappy tech "full self driving"


WillAdams · 4h ago
Does the Tesla taxi option include radar/lidar?

My understanding is that Tesla is the only manufacturer trying to make self-driving work with just visual-spectrum cameras --- all other vendors use radar/lidar _and_ visual-spectrum cameras.

ndsipa_pomu · 4h ago
They've painted themselves into a corner, really: if they start using anything other than just cameras, they'll be on the hook for selling previous cars as fully FSD-capable, and would presumably have to retrofit radar/lidar for anyone who was mis-sold a Tesla.
rsynnott · 35m ago
... Wait, so you think that Tesla have a child-killing and a non-child-killing version, but are only providing the child-killing version to consumers?

... eh? I mean, what?

ndsipa_pomu · 4h ago
> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

Not really - it's a case of slowing down and anticipating a potential hazard. It's a fairly common situation with any kind of bus, or similarly if you're overtaking a stationary high-sided vehicle as pedestrians may be looking to cross the road (in non-jay-walking jurisdictions).

burnt-resistor · 2h ago
Yes. And given the latency of cameras (or even humans), that neither can see around objects, and that dogs and kids can move fast from hidden areas into the vehicle's path, driving really slowly next to large obstacles until you can see behind them becomes all the more important.

One of the prime directives of driving, for humans and FSD systems alike, must be "never drive faster than your brakes can stop you within the area you can see". This must account for scenarios such as obstacles stopped, or possibly coming the wrong way, around a mountain turn.
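
As a worked example: "don't outdrive your sight line" is just kinematics; reaction distance plus braking distance must fit inside the visible gap. A rough sketch (the reaction time and deceleration figures are generic assumptions, not any particular car's):

    import math

    def max_safe_speed_ms(sight_m, reaction_s=1.5, decel_ms2=7.0):
        """Largest v with v*reaction_s + v**2 / (2*decel_ms2) <= sight_m."""
        # Positive root of v^2 + 2*a*t*v - 2*a*d = 0
        a, t, d = decel_ms2, reaction_s, sight_m
        return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

    # e.g. only ~15 m of clear view past a stopped bus:
    v = max_safe_speed_ms(15.0)
    print(f"{v:.1f} m/s ≈ {v * 2.237:.0f} mph")  # ~7.4 m/s ≈ 17 mph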

ndsipa_pomu · 8m ago
> never drive faster than brakes can stop in visible areas

Here in the UK, that's phrased in the Highway Code as "Drive at a speed that will allow you to stop well within the distance you can see to be clear". It's such a basic tenet of safety as you never know if there's a fallen tree just round the next blind corner etc. However, it doesn't strictly apply to peds running out from behind an obstruction as your way ahead can be clear, until suddenly it isn't - sometimes you have to slow just for a possible hazard.

handsclean · 3h ago
I’ve seen this “next version” trick enough times and in enough contexts to know when to call BS. There’s always a next version, it’s rarely night-and-day better, and when it is better you’ll have evidence, not just a salesman’s word. People deserve credit/blame today for reality today, and if reality changes tomorrow, tomorrow is when they’ll deserve credit/blame for that change. Anybody who tries to get you to judge them today for what you can’t see today is really just trying to make you dismiss what you see in favor of what you’re told.
aeurielesn · 3h ago
Sorry, I am failing to see how any of these three points are relevant or even important.
jofzar · 5h ago
Before anyone says it's the responsibility of the driver: that only holds while there is still a driver.

https://www.reddit.com/r/teslamotors/comments/1l84dkq/first_...

ndsipa_pomu · 5m ago
I don't know why this is flagged unless it's just the Tesla/Musk association. I thought that self-driving vehicles are a popular topic on HN.
sigmoid10 · 4h ago
Why on earth can't this be done by normal testing agencies? Why do things like "Tesla Takedown" have to participate in it? Even if the test was 100% legit, that connection to mere protest movements taints it immediately. It's like when oil companies publish critical research on climate change, or Apple publishing research that AI is not that good and that their own fumblings should not be seen as a bad omen. This kind of stuff could be factually completely correct and most rational people would still immediately dismiss it due to the conflict of interest. All this will do is fire up fanboys who were already behind it and get ignored by people who weren't. If the real goal is to divide society, this is how you do it.
sitkack · 4h ago
You mean the ones gutted by said owner of said company?

Comparing "Tesla Takedown" with ExxonMobile is way too quick, you should have said Greenpeace. I'd say that TT has to do this, Is part of the point.

cannonpr · 4h ago
In normal times, perhaps, today…

https://www.theverge.com/news/646797/nhtsa-staffers-office-v...

When the regular, in-theory-bipartisan mechanisms fail, protest is all you have left.

fragmede · 4h ago
> NHTSA staffers evaluating the risks of self-driving cars were reportedly fired by

Elon Musk, who also owns Tesla.


philistine · 1h ago
You seem to ignore the historical reasons why testing agencies exist in the first place. There wasn't a consensus back then that they even needed to exist. People like Ralph Nader needed to hammer the point again and again and again to will those standards into existence. The pressure groups fighting against Tesla are doing the same harsh difficult job that Nader did.

https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed%3A_The_Des...

randomcarbloke · 4h ago
Well, quite. It's an indictment of America's institutions if investigations held in the public interest must be conducted by third parties and not the establishment.

What other hidden dangers slip by without public knowledge?

ethbr1 · 3h ago
If only there were some proven way to hold abusive power to account in the public consciousness. https://en.m.wikipedia.org/wiki/The_Jungle
ecocentrik · 3h ago
For the same reason that nonprofit consumer advocacy and safety organizations like Consumer Reports and the Center for Auto Safety exist. Lobbyists and wealthy private individuals exert a lot of influence over the operations of publicly funded oversight agencies. They have the power to censor these agencies or as we've seen recently, fire all their staff, defund them or close them completely.

As for the fanbois and f*cbois, they have always existed and will always exist. They are the pawns. Smart mature people learn to lead, direct, manipulate, deflect and ignore them.

ndsipa_pomu · 4h ago
In my view, the onus to conduct tests and prove the system safe should be on the manufacturer, and those tests should meet the requirements of the regulating authority (presumably NHTSA in the U.S.) and of the appropriate cities/states if they agree to allow limited public-road usage.
bjord · 4h ago
In theory, that sounds great, but in practice, why should we trust manufacturers to reliably test their own products?
ndsipa_pomu · 4h ago
I agree, but it's fairly common practice. I think a "trust, but verify" approach should be used, with jail for board members who attempt to fool the regulators (cf. the VW emissions fraud).
ethbr1 · 3h ago
The difference in staffing between trust-plus-verification and no-trust is minuscule, because any savings come from things that are trusted and unverified.

Why not take an objective, fact-based regulatory approach from the start?

orwin · 1h ago
I've read somewhere that the NHTSA people working on mandatory tests for self driving were fired by DOGE.
lawn · 4h ago
Tesla has been lying for years. Why would you trust them to conduct their own tests? That's completely backwards.

Independent tests are what's needed, and preferably done by agencies who won't get gutted because they find something that's inconvenient.

Eisenstein · 4h ago
Which testing agencies are doing these tests?
sigmoid10 · 4h ago
sjsdaiuasgdia · 4h ago
That's crash testing, which is about the safety of the people inside the vehicle.

This test is about whether Tesla's self driving technology is safe for the people outside the vehicle.

sigmoid10 · 4h ago
>That's crash testing, which is about the safety of the people inside the vehicle.

Crash testing is much more than that. Check out NCAP for example. They specifically include safety ratings for other, vulnerable road users (i.e. pedestrians). And the Model 3 currently sits at the top spot of their 2025 rankings in that category.

sjsdaiuasgdia · 4h ago
That's still after the person outside the vehicle has been struck.

This test shows the self driving software disobeys traffic laws (the school bus stop sign requires everyone else to stop), resulting in a pedestrian collision that would not have happened had traffic laws been followed.

potato3732842 · 4h ago
The same agencies that do crash testing generally do all manner of other tests. Rollover testing, brake distance testing, restraint systems, drive aids, car seat fitment, etc, etc.
fragmede · 3h ago
Or cigarette companies having doctors show that cigarettes don't cause cancer. It turns out that the bar for science is really, really high, and for stuff that's less obvious and takes time and lots of money and effort, it's really hard to meet. You'd think I was a loon if I told you gravity wasn't a decided matter of physics, since we're all glued to this Earth every day, but we still don't know what dark matter is or what its ramifications for the theory of gravity are. Science still doesn't know how to test psychedelic medicine, because it's obvious to the control group that they're on the placebo. That doesn't mean we should lower the bar for science, just that we should be aware of the shortcomings in our beliefs.

So if you want to get into the details and figure out why a video from "Tesla Takedown" should or should not be believed, I'm all ears, but I'm some random on the Internet. I don't work at NHTSA or anywhere that could effect change based on the outcome of this video. It's not going to affect my buying decisions one way or another; it'll only divide people who have decided this matters to them and can't get along with others.

ndsipa_pomu · 4h ago
It seems crazy to me that the U.S. allows beta software to control potentially lethal vehicles on public roads. (Not that I have much faith in human-controlled lethal vehicles.)


anArbitraryOne · 2h ago
Hopefully the dummies were from the board of directors
vachina · 4h ago
Isn’t FSD modeled after real drivers?

In that case it could've learnt that almost nobody ever fully stops at a stop sign, or, in this case, for a bus stopped with its stop sign extended.

sokoloff · 4h ago
The stop signals on a school bus are one of the few traffic laws for which I see overwhelming support for strict adherence.

People are way more accepting/understanding of drunk driving than passing school bus stop signals.