Tesla fucking up traditional driving controls only makes sense if their self-driving system works well enough that the driver has no need to touch the steering wheel except in rare cases. How good is Tesla’s full self driving these days?
It regularly kills people. It can’t be used on a lot of road types (but people use it on them anyway, because Tesla makes no effort to prevent it). It’s still marketed as Full Self Driving despite the fact that Tesla has stated on the record that it is, and I quote, “Not capable of driving itself.”
They’re trying to have their cake and eat it too. Any time it benefits them, they claim that their cars are completely autonomous vehicles powered by the most advanced AI. Any time they get their wrists slapped, they claim that it’s an assistive feature like cruise control that cannot and will not ever replace the human behind the wheel.
Could you link an article saying so? I couldn’t find anything with a quick google search about people being killed by Tesla FSD
Edit to add: this fairly recent article https://insideevs.com/news/655983/tesla-full-self-driving-beta-crash-stats-revealed/ says they’re pretty safe.
Maybe search for “killed while on autopilot”?
That’s all the people who were asleep on the highway or driving at very high speed in town
The recent versions don’t allow either of those behaviours now, so those crashes aren’t happening anymore.
Full self driving doesn’t do that
And the deaths I’m interested in are the ones being caused by FSD, not lane keeping and cruise control. Loads of brands do lane keeping and cruise control and implement it no better than Tesla does.
Just keep in mind that FSD is only as safe as they claim because it’s supervised.
I would hope that even a reasonably working system would be better with a human vigilantly watching it than a human driving regularly.
The system would have to be really bad to be worse than that.
No
But does FSD change the logic for the lane keeping and the speed & distance?
Isn’t one of the features “Navigate on Autopilot”?
It is quite different. Navigate on autopilot is lane keeping, cruise control, and automatic highway exits. FSD tries to do all driving tasks - turns at stop signs, at lights, keeping to the correct side on roads with no centre line, negotiating with oncoming traffic on narrow roads…
Yeah it adds more capabilities for sure. But if you are on a moderate to high speed road where autopilot works fine, then is the control logic any different?
Obviously there are various types of accidents that autopilot would never get the chance to cause, like maybe turning right at an intersection and hitting a pedestrian. But do they act differently on a main road, where Teslas have done things like run into tractor trailers?
The one that hit a tractor trailer was years ago. They are far better now, specifically they see low contrast stuff now and that’s on autopilot. The biggest difference to the user will be the ability to have hands off the controls.
It isn’t the same though. FSD is written completely differently to autopilot. It’s a different program.
Other accidents it won’t have on those roads include falling asleep and running off the road, or being surprised by someone braking ahead and running into them
I’m sure it will be worse than humans around animals on the road. I wonder if it will see a wombat before it hits it.
I don’t need to provide you with evidence that FSD has caused crashes. There’s plenty; if you can’t find it you’re not looking.
As to your point about accident statistics, that’s responding to a different point than the one I was making. I didn’t say that it kills people more often than they kill themselves (through dangerous, inattentive or reckless driving). I just said that it regularly kills people. There’s potentially some hyperbole there; you can quibble over definitions of “regularly” if you want to be a pedant, I really don’t care.
The point is that when it does go wrong, it often goes spectacularly wrong, such as this case where a Tesla plowed into a truck or this thankfully low speed example of a very confused Tesla driving into oncoming traffic.
Could a human make these errors? Absolutely. But would you, as a human, want to trust yourself to a vehicle that is capable of making these kinds of errors? Are you happy with the idea of possibly dying because the machine you’re in made one critical error? Perhaps an error that you yourself would not have made under the same circumstances?
A lot of people will answer “yes” to that, but for me personally any autopilot that requires constant supervision to make sure it doesn’t kill me is more of a negative than a positive. Even if you try to pay attention, automation blindness will inevitably kick in. And really what is even the point of self driving if you have to be paying attention? If it’s not freeing you up to focus on other things then it might as well not be there at all.
Full Self Driving is still in beta stage.
AI DRIVR has good content on Tesla FSD if you’re actually interested in knowing how good it is.
No, I’m actually interested to know. Do most Tesla owners activate self driving during their daily commute? Tesla doesn’t sell their vehicles here, so the only times I actually see a Tesla are at car shows.
We’ve had news stories - and a friend’s coworker too - of people sleeping on the highway portion of their commute. The friend’s coworker did it daily for months, setting an alarm when it was probably going to be ‘street’ driving time so he’d wake up and be ready.
That’s both extremely stupid and irresponsible but also quite impressive on Tesla’s part.
Being able to sleep (or not pay any attention to the road) is the entire reason I would get a self driving car (assuming it’s safe to do so). But aren’t you required to keep your hands on the steering wheel when engaging full self driving? And I think the car has a camera to monitor driver attentiveness too. Can you really fall asleep during a commute like that?
They say it’s beta but beta would imply that it’s at least somewhat close to ready, which it clearly isn’t even after being in “beta” for a long ass time.
What do you mean it clearly isn’t at least somewhat close to ready?
Even if it were ready, what proportion of buyers spend the extra $12k to get self-driving?
If FSD was truly autonomous, or an excellent level 2 system?
Truly autonomous, at $12k, it would have unlimited demand. Production would be the only constraint.
Edit: Tesla might even prioritize sales with FSD or only make FSD cars at that point and rake in the profits.
About 1 in 5, though recent changes to price and the widening of the full self driving beta will have changed that since the stats were released in 2022
Tesla say it crashes enough to deploy an airbag about one fifth as often as human drivers (once per 3,200,000 miles versus once per 600,000)
So safer than the average driver, presumably less safe than a safe driver
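For what it’s worth, the arithmetic on those quoted figures checks out; here’s a quick sanity check (the two mileage numbers are just the ones quoted above, everything else is arithmetic):

```python
# Sanity check on the claimed airbag-deployment rates quoted above.
fsd_miles_per_deploy = 3_200_000    # claimed miles per airbag deployment on FSD
human_miles_per_deploy = 600_000    # claimed miles per deployment, human drivers

ratio = fsd_miles_per_deploy / human_miles_per_deploy
print(f"{ratio:.1f}x as many miles per deployment")  # ~5.3x
print(f"deploys about {1/ratio:.0%} as often")       # ~19%, roughly one fifth
```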
Be wary of cherry picked data.
The average human driver has a car that’s five years older than the oldest Model 3. That means five years more age on various safety equipment, five years more primitive collision avoidance systems, cars without stability control, etc.
The autopilot system only engages in ideal circumstances. Poor visibility, poorly marked roads, bad weather: all high-risk scenarios that autopilot won’t touch, but which also cause a lot of human accidents.
I’m talking full self driving beta, not autopilot. FSD works on bad roads, car parks, any weather it can see in, including moderately heavy rain. It won’t work in heavy fog, but I won’t drive in that either. Autopilot has a long history of only working on highways which upped its safety, but also a history of working hands off and at any speed.
Also note that the initial beta was only open to the safest, most responsible drivers according to Tesla data (Tesla have a lot of data on their drivers; many opt in to sharing everything in the hope of hurrying along better automation), so the cars were very well supervised.
I’m really hanging out for insurance data once this system is out of beta
Even with FSD, I don’t think we can be anywhere close to a comparable cohort.
To expand on the safety equipment: I wager the average driver with their 12.5-year-old car also doesn’t have regen braking. So while 99% of Teslas likely have near-pristine brake systems due to their age and regen braking, the average driver is more likely to experience “surprise, your brakes are out!”
Also, particularly based on my time with rural folk with cars in the woods, I doubt that FSD, no matter how aggressive it may be, will be as daring as some of the dubious human operators in that “average” cohort.
Also, I’d wonder how Tesla would treat an FSD deactivation by driver intervention. If a crash is unavoidable and imminent, I’d imagine an aware driver might manage to yank the wheel in time to deactivate, but still end up in an airbag-deploying crash.
There’s also some potential slush around “accidents that activate airbags”. Different models have different sensitivities.
But all this falls second to a primary concern: never trust what amounts to marketing data from any company compared to something like NHTSA data.
Would be interesting if someone could do the legwork to manage a “like for like” safety comparison, accounting for:
- General age of the car
- Regenerative braking versus standard
- Stability control, collision avoidance, automatic braking and so forth
- Like-for-like driving conditions
- Data for Teslas covering human operation, autopilot and FSD, particularly cases where a human was operating but FSD had been on less than 10 seconds before impact
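To sketch what that comparison might look like, here’s a minimal stratified crash-rate calculation. Everything in it is hypothetical: the file name, every column name, and the dataset itself are invented for illustration, since no public data exists in this shape.

```python
# Hypothetical like-for-like crash-rate comparison. Each row of the
# (imaginary) dataset is one vehicle-year; all column names are assumptions.
import pandas as pd

df = pd.read_csv("vehicle_years.csv")  # hypothetical dataset

# Bucket the confounders listed above.
df["age_bucket"] = pd.cut(df["vehicle_age_years"], bins=[0, 3, 8, 15, 30])
strata = ["age_bucket", "has_stability_control", "has_regen_braking",
          "road_conditions"]

# "control_mode" would distinguish human / autopilot / FSD, ideally with a
# separate value for crashes where FSD disengaged <10 s before impact.
agg = df.groupby(strata + ["control_mode"], observed=True).agg(
    crashes=("airbag_crashes", "sum"),
    miles=("miles_driven", "sum"),
)
agg["crashes_per_million_miles"] = 1e6 * agg["crashes"] / agg["miles"]
print(agg["crashes_per_million_miles"])
```

Even that only works if someone other than Tesla holds the raw data, which is the NHTSA point again.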
That really doesn’t happen from wear. Brakes only surprise-fail on long descents where the driver doesn’t use engine braking, and if brakes fail like that you have the hand brake/e-brake.
EVs of course use regen braking almost always in that situation (though they can’t when their battery is full). My car expects to arrive at the coast at 20% battery; at the top of the coastal mountain range it’s at 15%, but by the beach it has regenerated back to 20%.
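As a back-of-envelope check on that (every figure below is my own guess for illustration, not something from the comment above), a long descent on a heavy EV recovering most of its potential energy lands in the same ballpark as that 15% → 20% swing:

```python
# Back-of-envelope regen estimate. All figures are assumptions, not real specs.
mass_kg = 2000          # assumed EV mass including driver
g = 9.81                # gravitational acceleration, m/s^2
descent_m = 1000        # assumed elevation drop from range top to the beach
regen_efficiency = 0.6  # assumed fraction of potential energy recovered
pack_kwh = 75           # assumed battery capacity

energy_kwh = mass_kg * g * descent_m * regen_efficiency / 3.6e6  # J -> kWh
print(f"~{energy_kwh:.1f} kWh recovered = {energy_kwh / pack_kwh:.0%} of the pack")
# ~3.3 kWh, about 4% of the assumed pack
```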
The rest I generally agree with. We need better data, especially data from someone other than Tesla.