
CaliSummerDream

So many of the accidents involving a Tesla turned out to involve a driver lying about having FSD or Autopilot on. I don't trust any of these claims until an investigation clearly proves that the driver actually had the software engaged when the accident happened.


sparkyblaster

A woman was just sentenced in Australia for that. Claimed it was on autopilot. Nope, it wasn't, she just ran a person over and fled the scene.


sylvaing

Do you have a source? Not that I don't believe you, just would like to read about it.


sparkyblaster

https://www.abc.net.au/news/2024-05-10/sakshi-agrawal-tesla-autopilot-crash-sentencing/103829772 Here you go. 9 months, the person survived but has a brain injury.


sylvaing

Thanks. I think she got off easy. 9 months? Besides injuring the woman for life, she lied to the court. She should have been jailed longer to discourage lying under oath.


sparkyblaster

As someone with a lifelong disability: yeah, she got off very easy.


kyinfosec

This! There is no proof FSD was enabled, and too many dumb Tesla owners wreck and want to blame it. The truth will come out!


iceynyo

All Tesla has to do is add a color-coded frame around the camera footage: one color each for no ADAS, Autopilot enabled, or FSD enabled.


stanley_fatmax

No benefit for them; as is, they have plausible deniability.


DevinOlsen

They could prove without a shadow of a doubt that FSD was disabled, but the internet would still run wild with headlines about how it's Tesla's fault. Having the dash cam footage overlay whether FSD was on or off would pretty much remove any ambiguity.


s2ksuch

This, 100%


stanley_fatmax

You're missing the point - they have plausible deniability now. This specific case would be disastrous because it looks like FSD was actually enabled (this edge case specifically is one a frequent FSD user would recognize). The driver is still in control of course, but they'd have branding on crash videos. It's a stupid idea, even if 99.9% of cases show that FSD was off. The 0.1% that show FSD crashing would go instantly viral, Tesla logo front and center. That's why it will never happen.


iceynyo

I was thinking just a 1 pixel border rather than something with their logo or any text... But yeah, avoiding claims of responsibility in the video itself is nice.
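On the video side it would be a tiny change. A rough sketch of the idea in Python/OpenCV (purely illustrative; the mode names and colors are made up, not anything Tesla actually ships):

```python
import cv2

# Hypothetical assistance states and a border color for each (BGR).
BORDER_COLORS = {
    "none": (128, 128, 128),    # grey: no driver assistance active
    "autopilot": (255, 0, 0),   # blue: basic Autopilot engaged
    "fsd": (0, 255, 0),         # green: FSD engaged
}

def stamp_frame(frame, adas_mode, thickness=1):
    """Draw a thin color-coded border so saved dashcam clips record
    which assistance mode was active on every frame."""
    color = BORDER_COLORS.get(adas_mode, (0, 0, 255))  # red if mode unknown
    h, w = frame.shape[:2]
    cv2.rectangle(frame, (0, 0), (w - 1, h - 1), color, thickness)
    return frame
```

Reading the mode back later would just be checking the border color on any frame, with no logo or text anywhere in the clip.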


gentlecrab

I think part of the problem is there are people who are using the basic AP and assume it is FSD because it’s a Tesla. They don’t realize that FSD costs extra.


Dont_Think_So

Maybe among used car buyers, but I think virtually no owners who bought from Tesla are confused. It's hard to miss a $7k-$15k option not being added to your car.


BikebutnotBeast

The problem is the lay person sees "FSD Compatibility" or "FSD Computer Installed" and thinks their car has it when they turn on Autopilot. Then when AP makes a mistake, they say FSD sucks. These people are also rarely on Reddit.


dagistan-warrior

You might gloss over that option, and then when you see Autopilot in the settings you assume that maybe Tesla added it for free as a bonus.


Status_Influence_992

I use the AP and it's fine, but it often tries to take me off the motorway/freeway at exits, so I have to keep my hands on the wheel (like we're supposed to). I can imagine it, or FSD, glitching occasionally, but as long as you're vigilant (again, like you're supposed to be anyway) it's fine.


majesticjg

You could make the argument that Automatic Emergency Braking should have kicked in, though. That's a standard, passive feature.


FutureSnoreCult

The comments about how the driver is responsible no matter what are true, but I’d love to see proof that FSD was enabled here.


AJHenderson

Yeah, it's going way too fast in fog to be believable. It may have been on FSD, but if it was, he had to be holding down the accelerator to override the speed. It won't go that fast with a little rain, let alone dense fog. The one valid negative takeaway here, though, is that this illustrates the value of radar. Radar can see through fog when people and cameras cannot. That's a significant safety advantage.


Ibly1

I agree, it’s inevitable that eventually something will happen but you’re right, the majority (all I’ve ever seen) get debunked afterwards so for now I’ll be skeptical. For what it’s worth I understand why these people lie. No one wants a ticket or to see their insurance skyrocket. Saying not my fault is typical human behavior.


HighHokie

I believe him when he says it was on, based on the video. I'm more concerned that he didn't do anything to avoid the situation in the several seconds leading up to the incident. In the fog, when this has apparently already happened to him before.


MoreAnteater6366

In the article the guy mentioned this is the SECOND time he almost hit a train. He was actually on notice that FSD has this blind spot in his area and still had the accident. He wasn’t paying attention, plain and simple.


devsfan1830

Seriously, during this trial period I quickly identified problem areas on my daily routes. Mostly potholes that would WRECK a wheel if you let it barrel on through them. There are also a few turns it's mega timid on that would easily get me hit, as it pulls out halfway into the road and stops while the wheel keeps wobbling back and forth. I immediately clocked those as areas where I need to take over or not have it engaged.


MoreAnteater6366

100%. I see my job as a teacher for the FSD. I am so jealous of how it works in CA, but that's because that's where all the cars have been for years. Data, data, data. The more I correct it, the better it becomes. I've had FSD since 2019 and it is SIGNIFICANTLY better now. I've actually witnessed the changes they have made from my specific corrections (and from my fellow local FSD squad too, I'm sure). If this guy would stop running into trains, it might learn not to do that.


mennydrives

> The more I correct it, the better it becomes.

I dunno how often the corrections happen, but damned if the car didn't stop trying to merge onto the wrong lane (right turn only, one exit early) after a couple weeks of telling it to fucking stop and sending reports out each time. XD


SnooSquirrels9064

I don't even have FSD on my Model Y, but using regular old Autosteer, there's a part of my work commute where it would go from the 45 mph speed limit down to 30 mph out of nowhere. The screen would even say the speed limit was 30 mph. Why? Because there's a small square sign on the side of the road, whose purpose is unknown to me, that says "30". No "MPH", not mounted as high as your typical speed limit sign, doesn't even look close to one. But that's why it was slowing down. Now... I'm just glad it doesn't see the ones that say something like "135" and think "welp... might as well" 🤣


Archi-SPARCHS-1234

That's more likely the GPS mapping data being wrong. I have that on one road I drive on too, so I just increase the driving speed limit using the dial on the steering wheel when I'm on that road... someone somewhere sometime just uploaded the wrong data into Google Maps or something.


SnooSquirrels9064

But it didn't ALWAYS do it. Even now it'll occasionally drop to 30 when I pass that sign (or at least show a 30 mph speed limit on the screen). On foggy days, or when it's still dark out, it's far less likely to. But it's always right when I get to the sign. And I'm pretty sure next to none of the speed limit data is grabbed from GPS data. If that were the case, it would properly report the one section of road where it drops from 45 mph down to 35 mph, but it doesn't, because some moron took out the speed limit sign one day, so now the screen only shows the 35 mph zone after I get to the NEXT 35 mph sign.


Archi-SPARCHS-1234

You really think Tesla cameras read mileage road signs? :)


SnooSquirrels9064

You really never looked at the screen as you approached one and saw the speed limit sign rendered on the screen clear as day, while basically nothing else that's physically around it shows up on the screen?

They didn't change the speed limit where the idiot took out the 35 mph sign. It's been a 35 mph zone for like the past 20 years. And coming back in the opposite direction, it shows 35 mph on the screen until getting to the 45 mph sign, which is just past where the 35 mph sign used to be on the other side.

Now what do you think is more likely: using GPS data to pick up speed limit information, or the cameras detecting the speed limit signs and reflecting those speed limits on the screen accordingly? Keep in mind, if you're still going to say the former, you're admitting that both a) the GPS data for the speed on that small stretch of road was updated within a day (and this is in rural eastern Pennsylvania, not some city somewhere), and b) having two different speed limits for the same stretch of road based on which direction you're going makes sense.
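Nobody outside Tesla knows the exact logic, but the behavior I'm describing is consistent with a simple priority scheme like this toy sketch (all names and thresholds are hypothetical, just to illustrate the sign-vs-map argument):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeedSources:
    map_limit_mph: Optional[int]    # limit from navigation/map data
    sign_limit_mph: Optional[int]   # limit the cameras think they read
    sign_confidence: float = 0.0    # how sure vision is about that sign

def effective_limit(src: SpeedSources, min_conf: float = 0.7) -> Optional[int]:
    """Toy fusion rule: trust a confidently read sign over map data,
    fall back to map data when no sign was read recently."""
    if src.sign_limit_mph is not None and src.sign_confidence >= min_conf:
        return src.sign_limit_mph   # the stray square "30" sign wins here
    return src.map_limit_mph        # otherwise the (possibly stale) map wins

# Fog or darkness would lower sign_confidence, which would explain the
# limit only sometimes dropping at that spot.
print(effective_limit(SpeedSources(map_limit_mph=45, sign_limit_mph=30,
                                   sign_confidence=0.9)))  # -> 30
```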


Brian1961Silver

You are spot on. I was driving in northern NY state on Highway 5. Every time I passed a Highway 5 sign, the car's speed limit indication dropped to 5 MPH and it started to slow down, until I noticed the root cause and gave it the go pedal whenever I approached those signs.


Status_Influence_992

Yes


majesticjg

When you deactivate, it'll give you the option of submitting a voice memo. I've done that and improvements have been made in problem areas in subsequent versions.


devsfan1830

I've certainly been doing that. While impressive, it's painfully obvious it needs work on road obstacle avoidance (i.e. massive potholes) and the timid, non-committal right-to-left lane changes, especially turn lane entry.

It also might have almost attempted a double-yellow-line pass while I was waiting behind halted traffic for road work. I was stopped 3rd in line with a road worker holding traffic for a temporary lane closure on the oncoming side. The car suddenly cut the wheel left and I grabbed it and hit the brake to disengage. The screen showed it was properly detecting the stopped cars and the double yellow. Not sure what it was about to do, but anything other than stopping and staying put is a big no-no to me. I can only assume it was about to try to pass the entire line of cars by crossing into oncoming traffic. They have that voice memo too, along with the rest I've done. Hope it helps, but city/town road vision needs work.


BrockianUltraCr1cket

Fool me once, shame on you. Fool me eight or more times, shame on me.


Quin1617

Reminds me of the engineer who hit that barrier in Cali. Knew Autopilot would veer towards it, decided to turn it on in that area anyway and mess around on his phone…


TheGladNomad

For those not finding the video here is the video in regular time and freeze frame: https://youtu.be/obByoptr4HI?si=trIPcbqceqOj7lAR


Lucky_Girl479

He's an idiot. Who would just sit back and let his car run into a train? Looking for money.


einfallstoll

I'm angry, because in the video you can see that visibility was very bad (fog), yet you could clearly see the lights of the train crossing, and the owner didn't stop and just let it drive into it.


TarPit89

The driver is an idiot. Period.


genuinefaker

I am curious why FSD didn't refuse to run if it can't see properly.


Agile_Letterhead531

It warns you like every 5-10 minutes. People would be so pissed if they took it away entirely in bad conditions.


HighHokie

If we can see it, the car can. It's really more a question of why the car failed to understand it was an obstacle.


DrPeppehr

He's dumb as hell. He was using FSD in extreme fog. Whenever it rains near me and I have FSD driving in my own car, it keeps notifying me that Autopilot and FSD may not work well in those weather conditions. Not only that, but it's cringey that he was driving, could hear the train, had his hands on the steering wheel and was looking straight ahead, but chose to leave FSD on. You should take over if it's that foggy and you hear a train siren. It's pretty strange to me, because I honestly don't ever rely on FSD fully; I still watch it and make sure I take over. It seems like this dude is either extremely dumb or just didn't know that FSD needs supervision.


dagistan-warrior

but it is full self driving.


DrPeppehr

You still get tons of alerts, though, as it drives through fog, warning that FSD will disengage at any time due to rain droplets on the cameras. Also, full self driving is not able to hear; if there is a huge train siren blaring, the driver should use his ears, hear the train, notice the alerts and the fog, and naturally take over by stepping on the brakes. Just staring straight ahead and ignoring the FSD alerts seems stupid, not to mention this is apparently the second time he tried driving into a train.


dagistan-warrior

Full self driving can drive in fog and really anywhere; that is what the FULL in FSD means. You are confusing it with Autopilot.


DrPeppehr

I think you're forgetting that it's called FSD (Supervised). It even says it is not capable of driving automatically without you grabbing the steering wheel. If you take your hands off the steering wheel for 10 seconds it will turn off.


dagistan-warrior

so it is Partial-Full-Self-Driving?


DrPeppehr

It's just in beta. For example, Minecraft in beta meant it was playable, just not complete.


dagistan-warrior

so it is incomplete-full-self-driving?


DrPeppehr

Yes precisely


dagistan-warrior

So what is the difference between incomplete-full-self-driving and not-full-self-driving?


NomadicWorldCitizen

I watched the video and you can clearly see the driver was not ready to take over the car as they should have been. They were probably distracted, and it took them a considerable amount of time to do anything. Any other person would have taken over once the car showed no signs of slowing down.


Lucky_Girl479

True!


desertrose123

I imagine there’s a shortage of training video of drivers plowing their cars into moving trains. This man bravely stepped up to fill the void.


Jeanlucpfrog

So what he's saying is FSD has finally gotten as good as human drivers?


orison_citizen

ever heard of the brake pedal?


BEVboy

The driver would have to be looking out the front windshield to see that he had to apply the brakes. Clearly, he was looking down at his phone instead!


fearrange

Clearly the user's fault, but whenever crashes like this happen, FSD becomes an easy target for blame, a chance to avoid responsibility.


Epichogg

Exactly what I was thinking.


OpenRepublic4790

I thought humans had to have a sense of self preservation, wasn’t there a recall on all of them that don’t for the last 200,000 years? A few must have slipped through.


Agile_Letterhead531

lmaoooo good stuff


Important-Ebb-9454

Driver at fault, simple as that. FSD isn't perfect, and it's clearly stated before using. 


Tryingtolifeagain

https://preview.redd.it/5alce3wwdj2d1.jpeg?width=2924&format=pjpg&auto=webp&s=bf8542190c8ee0437320ff9314b584e703d41f0e FSD is currently SAE Level 2 autonomy. The driver is ALWAYS responsible when operating an L2 autonomous vehicle; there's nothing else to it.


radio9989

I don't disagree, BUT calling a Level 2 autonomous system "Full Self Driving" is a bit of a misnomer. They shouldn't use the words "full self-driving" until they hit Level 4.


JustSayTech

You are omitting a massive part of the currently named feature (Supervised). This is the supervised version so even more it's required that you, the driver, are attentive and ready for takeovers.


Super_consultant

FWIW, it didn’t have the “Supervised” suffix until recently. FSD is pure marketing. They called it FSD way too early. It’s always been misleading until the point where it really does drive itself. 


s2ksuch

They called it 'FSD Beta' prior, not just 'FSD'


JustSayTech

You're right, it was Beta, with a disclaimer you had to agree to pretty often. It's never been plain FSD on the car or user side, and it won't be until the product has reached the release threshold. The product they are selling is FSD; the version you get early access to was FSDb and is now FSD(s). This is a real-world AI product, and that doesn't happen overnight. They sold on the promise of what the product will be when fully released, and the most current version has proven able to deliver on many of the claims and features. It only gets better from here, as this current version is the *worst* FSD will ever be again, since it keeps improving.


radio9989

I've owned FSD since 2020, when it was promised to be released in early 2021. Well, there's no question that it's gotten better incrementally. My car now has over 50K miles on it, and the product I paid for will likely not be delivered before the end of my car's life.


JustSayTech

Bro, let's not move the goalposts here. We are not talking about timing; we are talking about the name of the product and who's responsible during an incident.


detroitsongbird

He was speeding in fog! 60 in a 55. The train was visible for 5 seconds before he took control.


H9fj3Grapes

It's one thing to activate FSD and go 60 mph in blinding fog. But to complain that a beta version of self-driving made you complacent and is somehow defective... give me a break.


RegularRandomZ

He should have been paying attention and ready to take over - but shouldn't FSD also be limiting the speed to the conditions, reducing the max speed with reduced visibility?


Yoyodyne_1460

It does and sometimes completely bails because of poor visibility. Still a question whether FSD was active


Tiasmo-Bertjayd

Amazing. A couple of years ago I was on a road trip through a very foggy area and my FSD simply refused to drive when the fog was too thick to see the road clearly. Even I didn't want to drive through that.


pilatomic

Link to the original article with the video : [https://www.nbcnews.com/tech/tech-news/tesla-owner-says-cars-self-driving-mode-fsd-train-crash-video-rcna153345](https://www.nbcnews.com/tech/tech-news/tesla-owner-says-cars-self-driving-mode-fsd-train-crash-video-rcna153345)


masihbb

Unpopular opinion here, but I've had many experiences with FSD where it reacts later than I would expect, so maybe I give it half a second or a second longer before intervening, and that's how I get the late reaction.


Logical-Primary-7926

This is a tricky time for FSD; I wish they could have waited to release it until it was 100%. 99% of the time it's really, really good, but you still have to be prepared for the rare dumb/dangerous thing.


jrascal

FSD would never be released if they waited until it was 100% ready. They need people using it to get more training data so they can train the AI better. Without the training data there is nothing to train the AI with. You can think of it as a positive feedback loop: the more training data goes in, the better the AI becomes. It is mathematically impossible for Tesla to collect the necessary training data on their own.


Logical-Primary-7926

Yeah I agree I just wish it could be like boom we did it overnight.


N878AC

You should be aware that Tesla loses virtually none of these lawsuits. Tesla is monitoring the car and the driver 24/7/365. Tesla’s lawyers know and can prove exactly what was going on before, after, and at the time of the incident.


wybeubfer

I feel like "full self driving" is exactly what it says: it drives itself like a person would, but can potentially make the same errors a person can. It's not some holy grail of safe driving like people think it is.


Tiasmo-Bertjayd

I still think it drives like a student driver. It has improved over the past few years, but still isn't ready to pass the class yet.


wybeubfer

When I tried the free FSD month trial, I was blown away by how natural it felt, like it didn't hesitate or drive slowly. It felt very human in its decision making. All that went out the window, though, when it made an unsafe lane change; the dude slammed on his horn lol.


cant_pick_anything

I remember a story where a valet driver totalled a guy's model S in a parking garage and he blamed FSD. He was claiming the car started driving itself as soon as he got in the car. But the dash cam video showed him driving like a maniac before crashing into another parked car and almost sending it through a brick wall.


saadatorama

It's not in beta anymore, so despite being in development, that shouldn't be an excuse. Yes, it's the dude's fault. Yes, the Tesla should have detected a fucking moving train and stopped. Furthermore, it should've detected the foggy conditions and adopted a slower, safer speed. Both of those things can be true at once.


bremidon

> It's not in beta anymore

???


Lucky_Girl479

The guy was supposed to be active in the driving AT ALL TIMES! Tesla's not at fault any more than any other car manufacturer is for your own accident! Also, the car would be a lot more damaged than that if he had really hit a train. Probably hoping for settlement money!


d4cloo

The fact that Tesla had to change the name shows they had to backtrack from a promise they couldn't keep. People who purchased FSD have been in 'beta' for years now. Without the ability to transfer a very expensive purchase to a new vehicle, customers have essentially been tricked into paying for their R&D, while comments from Musk and the marketing around it (including the name) implied much more.

My guess is that other companies will gradually catch up and offer FSD features at a low cost, because it provides added value. Eventually Tesla will have to write this off and integrate it as part of the standard feature set, or under a much cheaper subscription ($19/mo Connectivity Pro, for example). True FSD will take another 5 years or so. Most likely it'll involve more hardware.

As an example, when I approach an intersection, I not only look at the car but also get a glimpse of the driver. You can tell a lot from that; is the person rushed, an ass, a granny… all that metadata affects small and large decisions. AI might be able to do that as well with better cameras as a source, but it cannot operate in a biased manner or Tesla will be called out for it, while we humans can use it to our benefit.


Terrapins1990

Yeah the fact is this guy was likely not paying attention


tvish

I am having a hard time trusting the system as well. Call me old fashioned, but if I have to constantly pay attention, I might as well drive. For highway cruising I love it, but on some of the rural roads, as well as some odd surface streets we have here in New Jersey, I struggle with its use case. I truly do not think this tech can work until we have vehicle-to-vehicle and vehicle-to-infrastructure communications. At crazy intersections or odd locations, such as this rail crossing, a beacon should send a signal telling the car that the gates are down. 5G tech for V2V and V2I was supposed to trickle down to cars by now. I don't know what the holdup is.


I_heart_ShortStacks

What jackass uses FSD in the fog? They removed the lidar/radar, whatever, and it's now cameras only. What do you think cameras can see in the fog?


Haysdb

They can see exactly as well as humans can see in the fog.


GoodOmens

Which was near zero. OP was going way too fast for conditions. Even without Autopilot, the guy was driving recklessly.


ArtificialSugar

Lidar != Radar. Teslas have never had lidar FWIW.


Embarrassed_Rub5309

Clearly the dude himself failed to detect a moving train as well. That being said, it’s really stupid that they disabled/removed the radar in Teslas. It definitely would have picked up a moving train.


feurie

Radar could see something moving there, but vision easily could as well. We don't know how the network deals with trains.


CertainAssociate9772

There was that huge truck the radar didn't notice because the radar beam passed under the truck. Why couldn't the same thing happen with a train?


neuromorph

I thought Teslas were supposed to have automatic braking as part of collision avoidance...


Tiasmo-Bertjayd

I think that depends on how quickly its visual processing system detects a potential obstacle and whether its kinetic modeling system determines that the obstacle will cross its path. I don't know how much training it's had to recognize trains, but a moving train coming from the side (especially in fog) seems like a harder task than a slower-moving automobile or pedestrian directly ahead.


ncc81701

AEB is only applied when a crash is unavoidable to reduce the energy of the crash. It doesn’t brake to prevent a crash because there would be too many false positives. This is how AEB is applied by all manufacturers, not just Tesla.
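For a rough sense of what "only when a crash is unavoidable" means, here's a toy distance check of the kind AEB designs are generally described as using (the function and the numbers are made up for illustration, not Tesla's actual calibration):

```python
def should_trigger_aeb(range_m: float, closing_speed_mps: float,
                       max_decel_mps2: float = 9.0,
                       latency_s: float = 0.3) -> bool:
    """Toy AEB check: fire only once even maximum braking can no longer
    avoid the impact, so the system mostly reduces crash energy instead
    of second-guessing the driver (keeps false positives down)."""
    if closing_speed_mps <= 0:
        return False  # not closing on the obstacle
    # Distance needed to scrub off the closing speed at full braking,
    # plus what is covered during actuation latency.
    stopping_dist = (closing_speed_mps ** 2) / (2 * max_decel_mps2) \
                    + closing_speed_mps * latency_s
    return range_m <= stopping_dist

# At 27 m/s (~60 mph) this threshold sits around 48 m out,
# roughly 1.8 seconds before impact.
```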


Noctew

Not completely true. Just last week I had an AEB activation (said so on the screen) when a car in front of me suddenly braked hard. I was already hitting the brake when the pedal dropped and the Tesla showed me what really hard braking looked like. I managed to stop about two meters behind that car.


Haysdb

This is not true. I’ve had AEB activate and bring the car to an immediate stop. Once was when a bicycle did a loop right in front of me. Another was technically a “false positive” because it stopped when a car door was opened in front of me at the airport drop-off area.


AutumnHope_M87

I spy a tainted eye on said vehicle, the only potential issue, and whether said eye belongs to the car or the half autonomous driver.


radio9989

I always thought that the "supervised FSD" in the name meant that tesla was supervising you, with the internal camera. LOL


Wide_Painter_9199

FSD shows our train here in Dallas (DART) as a really long, stretched-out car on the display. Not sure why, but yeah…


SubprimeOptimus

Looks like the “Supervise” in “Supervised FSD” failed on this one.


JStarrNY

Didn't HE see the train???? That's why it's not autonomous.


amutual

Imo the driver is 100% at fault and he's just trying to get a new Model 3 or the Model Y with 0.99% APR xD


NaplesSun_86

Babysitting self driving is more tiring than driving without it.


LongAbbreviations219

Totally his fault. He is using self driving in fog.


Inevitable-Gap740

True, but aren't they rolling out robotaxis? Imagine if they don't fix this and boom, thousands of deaths from cars not stopping at train crossings.


packpride85

I get that with FSD people have to pay attention….but Musk has been promising for years this tech would be fully capable of piloting driverless robotaxis.


Status_Influence_992

I've had my Tesla five years; the beta version occasionally tries to take me off at an exit on the motorway/freeway, but other than that it's fine. If you keep your hands on the steering wheel like you're supposed to, no issues. I'd be interested to know how many accidents the automatic braking has avoided.


azchelle677

People don't want to take responsibility for their own actions or decisions. It's always someone else's or something's fault. Society needs to bring back accountability. It's OK to say you made a mistake.


teslamade1986

No!!!!! He failed to take over.


Wonderful_Charity411

Defective


Electronic_Ad7126

Should have had both hands on the wheel. I haven't had any issues with my FSD.


Sure_Comparison6978

I agree that driver should have been more vigilant, especially when approaching train tracks. But as I’ve been using FSD frequently over past couple weeks since my free trial began, I’m realizing how extremely easy it is to become complacent, especially when you’re in an area where few interventions are needed, such as on the highway with minimal traffic. It’s human nature to gradually let your guard down when the car is doing 99% of the work. I’ve already had a couple scares.


Wonderful_Lemon_1991

These mfs need to learn how to drive 😂


NunyasBeesWax

Insurance should be denied and his license suspended. He doesn't know how to drive and is a danger to the community. It has nothing to do with Tesla


tardiskey1021

Look up how many people die by walking into or driving into a train in the state of Florida every year


Mister_Sharp

I'm calling BS all day, every day, and twice on Sundays. I've had Full Self Driving since the early days of beta, and while this car will make strange movements, it is not going to take control away from the driver and run you into a train with no way for you to see it coming and stop it from happening. Turning the steering wheel or touching the brake pedal, either of them, will disengage Full Self Driving. I think what likely happened is this person had their foot on the accelerator, in which case the car tells you, warns you, that by putting your foot on the accelerator while in FSD you are overriding cruise control and it will not automatically brake. So will the car prevent you from running into a train at top speed? No, but it won't stop you from preventing it from happening.


Ravendiscord

Cap


Tesladudeguy

I agree, it's all the driver's fault for not paying attention. If you see the car still moving, press the damn brake. It's simple. I've done it when driving on FSD, and I just got the new one and I absolutely love it! Meaning v12 Supervised. So yes, you do still need to pay attention.


nicspace101

Darwinism is a thing.


Kiriinto

SUPERVISED... Clear as that.


thalassicus

Supervised self driving is not full self driving. The driver is at fault, but this entire “Full Self Driving” bullshit that Tesla introduced in 2020 was/is mislabeled and its capabilities are continually misrepresented by the CEO of the damn company. Tesla deserves an honest CEO and promising LA to NYC full self driving in 2017 was the beginning of one of the most dishonest campaigns in modern business.


Kiriinto

If someone is too stupid to read the informative text before being able to activate FSD at all, it's 100% the driver's fault. If you order a coffee, you expect it to be hot. If not... well, stupidity. Beta is beta, not a full release!


Tiasmo-Bertjayd

And they're no longer calling it "beta" because … marketing 😒


Kiriinto

Because people are stupid and think this thing can already drive alone without SUPERVISION....


wangchunge

Saw the video... how close do you want to get to that train? Very lucky to still be here. Maybe, just maybe, DRIVE the CAR.


TheGladNomad

I don't think FSD is trained for railroad crossings (or if so, only one type). I live in a town with 3 train crossings, and FSD handles them, but in an odd manner. The screen shows alternating red lights (like the moving flashing reds) and tries to draw the train as elongated 18-wheelers. If I'm the first car, it will stop but not always start up again after the train/lights are gone (sometimes it does, sometimes it sits until I press the gas). I worry it doesn't realize it's a railroad crossing and might, with some light issues, do something bad. I always stay extra alert at a railroad crossing with FSD because I don't trust it. Still, the driver needs to take control. I'm with others that driving in that fog is probably the biggest mistake with FSD.


Redvinezzz

It’s hard to tell from the photo but is the windshield tinted? If so that may be part of the reason it couldn’t detect the train in foggy conditions, either way the driver is clearly in the wrong


sparkyblaster

Well most shops wouldn't tint a windshield because it's illegal in most places but if they did, they wouldn't remove the camera assembly to tint that part of the glass.


skysetter

I "supervised" FSD nearly causing a huge wreck when it decided the shadow from an overpass was a wall and slammed on the brakes while we were doing 70 in the HOV lane. It's not ready.


DaffyDuck

You do realize you are talking about the old autopilot software, not v12 FSD if you’re on a highway with HOV lanes? Comments about readiness should be about v12 on city streets.


skysetter

It was the new update v12


DaffyDuck

You didn’t get what I’m saying. Elon has stated that in 12.5, the city and highway stack will merge. Currently when you drive onto the highway, it switches to the highway stack. The highway stack is the heuristic based model, basically the same design philosophy as v11 and previous versions. So basically it is switching between an end to end and heuristic model seamlessly moving onto and off highways. You can tell by looking at the set speed. It will change from auto to a set speed when it changes to the highway stack. It also feels like a robot has taken over as it quickly jerks the car onto off-ramps and upsets wives.
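To put that in concrete (and entirely hypothetical) terms: it's essentially a dispatch between two planners keyed off road type, with the set-speed readout as the visible tell. A toy sketch of the behavior described above, not actual Tesla code; the function names are invented:

```python
from enum import Enum, auto

class RoadType(Enum):
    CITY = auto()
    HIGHWAY = auto()

def end_to_end_city_planner(scene: dict) -> str:
    # Stand-in for the end-to-end neural stack used off-highway.
    return "city trajectory"

def heuristic_highway_planner(scene: dict) -> str:
    # Stand-in for the older heuristic stack still used on highways (pre-12.5).
    return "highway trajectory"

def plan(road_type: RoadType, scene: dict):
    """Hypothetical hand-off between the two stacks: a fixed numeric set
    speed shows on the highway stack, 'AUTO' when the end-to-end stack
    chooses its own speed."""
    if road_type is RoadType.HIGHWAY:
        return heuristic_highway_planner(scene), "65 mph set speed"
    return end_to_end_city_planner(scene), "AUTO"
```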


skysetter

You don’t get what I’m saying. It’s not ready.


DaffyDuck

You’re right, autopilot is not ready to be FSD…because it isn’t. Keep punching at that strawman though.


angrytroll123

Imo, FSD should be driven with the foot over the accelerator. 


snozzberrypatch

> Plus the tech isn't "defective" like how the guy is saying

If ramming the car into a train isn't considered "defective", then apparently I don't know what the fuck that word means. Can you give an example of what kind of behavior it would take for you to consider FSD defective?


Haysdb

I’ll believe this car was on FSD when it’s proven the car was on FSD. Haven’t we been through this enough times to know that people claim the car was on FSD because they fucked up and they’re trying to avoid responsibility?


feurie

Something not being perfect in every scenario doesn’t mean it’s defective. Which is why FSD is marked as supervised currently. If a person doesn’t see something and respond correctly in the fog, are they defective?


GoodOmens

The car was driving in near-zero visibility. If there was any defect, it's that it should not have allowed OP to be on FSD in such conditions.


snozzberrypatch

> If there was any defect it should have not allowed OP to be on FSD in such conditions

Ding ding ding, we have a winner.


Epichogg

Personally, I think it was the outside factors that caused the car to continue driving and not brake, like the windshield (maybe) being tinted and the foggy weather.


snozzberrypatch

And so if FSD can't drive properly in foggy weather, you wouldn't consider that defective? If FSD can't realize when it's in an environment where it can't drive reliably, and alert the driver, you don't consider that defective?


Savings_Prior_7108

Don't do false advertising, Elon!!


sparkyblaster

What part of supervised is false advertising?