Experts lay into Tesla safety in federal autopilot trial

24 points by duxup | 6 comments | 7/18/2025, 5:32:47 PM | arstechnica.com

Comments (6)

bryanlarsen · 1h ago
For context, in 2019 Tesla's Autopilot did not stop for any stop signs or red lights; it was just a lane-following, traffic-aware cruise control with an arguably deceptive name. It also had a mechanism meant to ensure that the driver stays alert while Autopilot is on. The trial is thus about that mechanism, not about self-driving.
edgineer · 58m ago
"it is my professional opinion that Tesla's Autopilot is defective because Tesla knowingly allows the car to be operated in operational domains for which it is explicitly not designed for."

Outrageous; compare to cruise control on other cars.

This is in line with the outrageous kill-switch-on-all-new-cars-after-2026 regulation included in the infrastructure and jobs bill passed in 2021.

I see there's a newer bill drafted called the No Kill Switches in Cars Act; I hope it gets moving.

2OEH8eoCRo0 · 7m ago
Cruise control on other cars isn't called Autopilot, and people know what to expect.
abbotcabbot · 2h ago
I don't really get the description of going through a stop sign and hitting pedestrians next to a stopped car. It sounds like two entirely independent errors in succession, since no right of way, etc., is really affected by the stop.
MBCook · 2h ago
It committed two extremely dangerous maneuvers in quick succession (one of them illegal), and it sounds like the injuries to the people involved would likely have been much less severe if the car had stopped as required by law.

Either way, it broke the law and killed someone. Does it matter if it was an act in 1 part or 5?

abbotcabbot · 1h ago
> Does it matter if it was an act in 1 part or 5?

If it made 5 errors, 4 of which were not deadly only due to random circumstances like no vehicle with the right of way passing, then that is worse than if it had committed one error that was deadly. I might fix the second implementation and scrap the first.