For context, in 2019 Tesla's Autopilot did not stop for stop signs or red lights; it was just lane-following, traffic-aware cruise control with an arguably deceptive name. It did, however, have a mechanism meant to ensure the driver stayed alert while Autopilot was on. The trial is thus about that mechanism, not about self-driving.
MBCook · 2h ago
Right. This is not Full Self Driving at issue.
constantcrying · 1h ago
But then what is the trial about? A driver neglected to stop for a stop sign and somehow Tesla is at fault?
Surely the driver had to know his car did not take over these functions, and if the software was not designed to do this, then it couldn't have malfunctioned.
edgineer · 3h ago
"it is my professional opinion that Tesla's Autopilot is defective because Tesla knowingly allows the car to be operated in operational domains for which it is explicitly not designed for."
Outrageous; compare to cruise control on other cars.
It's in line with the outrageous kill-switch-on-all-new-cars-after-2026 regulation, tucked inside the infrastructure and jobs bill passed in 2021.
I see there's a newer bill drafted called the No Kill Switches in Cars Act; hope it gets moving.
MBCook · 2h ago
Autopilot isn’t cruise control. It’s ADAS. Apples and oranges.
Other ADAS systems have much better driver monitoring. My 2016 Honda did. My 2021 Ford does.
The government has forced Tesla to strengthen their monitoring at least twice.
Tesla called it Autopilot. Tesla (seemingly) didn't care about reckless behavior. Tesla repeatedly made misleading claims about it being far safer than human driving. Tesla didn't even keep the data that would be needed to prove that for the first two years.
2OEH8eoCRo0 · 2h ago
Cruise control on other cars isn't called autopilot and people know what to expect.
abbotcabbot · 5h ago
I don't really get the description: going through a stop sign and then hitting pedestrians next to a stopped car. It sounds like two entirely independent errors in succession, since no right of way, etc., is really affected by the missed stop?
MBCook · 4h ago
It committed two extremely dangerous maneuvers in quick succession (one illegal), and the injuries to the people involved sound like they likely would have been much less if it had stopped as required by law.
Either way, it broke the law and killed someone. Does it matter if it was an act in 1 part or 5?
1970-01-01 · 2h ago
That is a yes for any legal definition of harm that results in death.
abbotcabbot · 3h ago
> Does it matter if it was an act in 1 part or 5?
If it made 5 errors, 4 of which were not deadly only due to random circumstances (like no vehicle with right of way passing), then that is worse than if it committed one error that was deadly. I might fix the second implementation and scrap the first.