> In this case, the driver had less than a second to react, and even if he had reacted, it might have made things worse - correcting partway, not enough to get back onto the road, and hitting the tree head-on instead.
As the article rightly notes, Tesla FSD is still a Level 2 driver assistance system, not a fully autonomous solution. From my background in Human-Machine Interface (HMI) design, I’ve long argued that expecting a human to instantly take over in a high-speed, high-stakes situation - especially to correct an unexpected or erratic system behavior - is a fundamental design flaw.
Human attention and reaction time simply don’t align with the demands of these edge cases. If the system requires a human to be both disengaged and instantly ready, it creates a paradox that undermines safety.
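To put rough numbers on that paradox, here is a back-of-envelope sketch; the speed and reaction-time figures are illustrative assumptions on my part, not data from this crash:

```python
# Back-of-envelope: distance travelled before a disengaged driver can
# plausibly intervene. All figures below are illustrative assumptions.

MPH_TO_MPS = 0.44704

def takeover_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Distance covered while the driver perceives, decides, and acts."""
    return speed_mph * MPH_TO_MPS * reaction_s

speed_mph = 55.0  # assumed two-lane-road speed
for label, reaction_s in [("attentive", 1.5), ("disengaged", 2.5)]:
    d = takeover_distance_m(speed_mph, reaction_s)
    print(f"{label}: {reaction_s:.1f} s -> {d:.0f} m with zero corrective input")
# attentive:  1.5 s -> 37 m
# disengaged: 2.5 s -> 61 m
```

At any plausible figure, the car has left the lane before the human "supervisor" has produced a single input.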
jqpabc123 · 21h ago
Musk's arrogant approach to FSD is to handicap his engineers.
Look out drivers in Austin. This is headed your way.
taylodl · 21h ago
While drivers should always be attentive, I think the greater concern with Tesla’s FSD isn’t just for those behind the wheel - it’s for the people outside the vehicle.
Pedestrians, cyclists, motorcyclists, and even children getting off school buses have been involved in tragic incidents. These are the people who can’t intervene when the system fails. That’s why I believe the conversation around FSD safety needs to focus more on system accountability and protection of vulnerable road users, not just driver vigilance.
desktopninja · 20h ago
Perhaps, like the Uber/Lyft light-up signage, some sort of visual cue should activate on the exterior of the vehicle when FSD is engaged. It could be subtle, like an extra set of LEDs illuminated on either the front or the back. The end effect might be that other drivers would make things more predictable for FSD to navigate safely.
But I, as one of those other drivers, should not have to change my driving habits just for elon to make more money.
dzhiurgis · 18h ago
That is possibly the worst idea imaginable. You’re changing the behaviour of the drivers around you just by your presence, which will fuck up all your training data.
bell-cot · 21h ago
The article's last photo is from the Tesla's front camera, a fraction of a second before FSD suddenly swerved off the 2-lane asphalt road.
Nothing in that photo hints at "why?". (I have no computer vision experience.) The article speculates about shadows on the road... maybe those suddenly appeared (the sun came out from behind a cloud) and the FSD image analyzer freaked out?
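To make that speculation concrete, here is a toy numpy sketch of the failure mode - a naive brightness-threshold detector, which is emphatically not how Tesla's neural-net perception works, just an illustration of why a sudden hard-edged shadow could read as an obstacle:

```python
import numpy as np

# Toy illustration (NOT Tesla's stack): a detector that flags pixels
# much darker than the road brightness it calibrated on earlier frames.
# A hard-edged shadow that suddenly appears trips it the same way a
# physical obstacle would.

rng = np.random.default_rng(0)
road = rng.normal(loc=120.0, scale=5.0, size=(40, 40))  # uniform asphalt

calibrated = float(np.median(road))  # road brightness "learned" pre-shadow
shadow_frame = road.copy()
shadow_frame[:, 20:] *= 0.45         # sun comes out, hard shadow across the lane

def naive_obstacle_mask(frame: np.ndarray, ref: float, drop: float = 0.5) -> np.ndarray:
    """Flag pixels darker than `drop` times the calibrated road brightness."""
    return frame < drop * ref

print("clear frame:  flagged pixels =", int(naive_obstacle_mask(road, calibrated).sum()))
print("shadow frame: flagged pixels =", int(naive_obstacle_mask(shadow_frame, calibrated).sum()))
# Nearly the whole shadowed half of the frame reads as an "obstacle".
```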
dzhiurgis · 18h ago
I had some “close calls” with Autopilot, but this is clearly the driver’s fault. Not only was he not paying attention, his hands were clearly busy elsewhere.
But I don’t wanna blame him entirely. Firstly, not everyone is capable of using systems like this, nor should everyone. And obviously the system shouldn’t behave like this at all (tho someone looking at the data was saying the driver deliberately swerved themselves).
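For what it’s worth, telling “driver swerved” apart from “system swerved” in the data is roughly a question of where the steering torque came from. A hypothetical sketch - every field name below is invented for illustration, not a real Tesla or EDR schema:

```python
from dataclasses import dataclass

# Hypothetical telemetry check. The fields and thresholds are invented
# for illustration; real Tesla/EDR logs are not public in this form.

@dataclass
class Sample:
    t: float                # seconds
    steer_torque_nm: float  # torque measured at the steering column
    fsd_cmd_deg: float      # steering angle commanded by the system
    wheel_angle_deg: float  # actual wheel angle

def likely_driver_input(s: Sample, torque_nm: float = 2.0, diverge_deg: float = 5.0) -> bool:
    """Heuristic: strong column torque while the wheel diverges from the
    system's command suggests the human, not the software, turned the wheel."""
    diverged = abs(s.wheel_angle_deg - s.fsd_cmd_deg) > diverge_deg
    return abs(s.steer_torque_nm) > torque_nm and diverged

log = [
    Sample(0.00, 0.1, 0.0, 0.2),   # hands resting, system tracking the lane
    Sample(0.25, 0.2, 0.0, 0.3),
    Sample(0.50, 3.8, 0.0, 12.0),  # sharp torque plus divergence: who swerved?
]
for s in log:
    print(f"t={s.t:.2f}s likely_driver_input={likely_driver_input(s)}")
```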