This was observable in a few situations in videos. FSD shows rather slow reaction times to suddenly appearing objects. FSD can predict traffic and paths really well given a far enough viewing range to assess the situation 2-3 seconds before reaching that actual point or before the object arrives at the car's location. They need to bring this down to the 0.5-second region to significantly surpass human capabilities.
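A rough back-of-the-envelope sketch of why that latency matters (the numbers are illustrative, not measured: ~7 m/s² braking deceleration is a typical dry-road assumption):

```python
# Distance covered before the car stops: travel during the system's
# reaction latency plus braking distance at an assumed deceleration.
def stopping_distance_m(speed_kmh, reaction_s, decel_ms2=7.0):
    v = speed_kmh / 3.6                      # km/h -> m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

# At 50 km/h, cutting reaction time from 2.5 s to 0.5 s saves
# roughly 28 m of travel before the car comes to rest.
saved = stopping_distance_m(50, 2.5) - stopping_distance_m(50, 0.5)
print(round(saved, 1))
```

At urban speeds the latency term dominates the braking term, which is why shaving seconds off the reaction window matters more than slightly harder braking.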
There is so much wrong here. IIHS says it's a pedestrian and vehicle avoidance system, not a static/moving-object system. You were testing Autosteer, not the "FSD system". If you read what AEB and Autosteer do, they definitely don't tell you "the car should steer and brake without [your] input".
@STerkskz yes I did. He's wrong. He clarified in the video, before people could say this nonsense... FSD probably is better, but not for the reasons explained. FSD can probably detect objects that aren't usual... he's talking about what they have specifically coded the emergency braking to avoid.
@@yahyahassen9807 the author of the video states he's testing the "FSD system", which is a lie. Then the author pits the legacy Autosteer and AEB system, which Tesla is no longer actively improving, having recognized it could never succeed against static objects that are neither cars nor pedestrians. The author additionally seems to believe the car will swerve to avoid accidents, something neither AEB nor Autosteer could do. The author disingenuously implies FSD has failed when it would thrive under these circumstances. Now cars with or without FSD have an FSD-based AEB with more power and capability. Software updates are awesome.
You need to try dropping a cardboard box out of the back of the car you are following. This happened to me in real life, but it was a carpenter's saw horse, which I couldn't avoid!
do you think that the proximity of the tree's shadows had any effect on the detection? Have you considered suspending things in the proper size/shape which would not damage the car and see how it behaves - like a balsa wood cutout or stuffed animal as target?
Exactly. Not taking up for the car, but it doesn't do well with shadows. Plus they are on a corner going around a curve with a white truck sitting right there; it's focusing on too many things and going too fast.
You need to run this test again, but this time with child simulations using ballistics gel wrapped in saran wrap and done in such a way that it (hopefully) doesn't destroy the tesla. I think the image of that going splat on the front of the Tesla would be enough to get attention, especially if you are approaching it as "we're trying to gather data to understand what's going on here".
Time to get some of those 39" tall real dolls (My Size, or something was their name) and see if Tesla is ready to start running over kids as well as roll through stop signs.
can you include a camera showing the pedals? Ideally a continuous clip that shows the pedals and the object, just to be sure you're not pressing any pedals, because pressing the accelerator can override emergency braking.
I got a Mazda CX30 GT Sport Tech this year, and I can tell you its emergency braking is so much better; I don't think I would be able to crash the car even if I tried. With one exception: the other night I had to defrost the windscreen, and I should have gotten the frost off the cameras too. The result was the dash flashing at me that all safety systems were offline. Next time we have a heavy frost I will know to make sure it's fully defrosted before I drive off.
Technically, the human operator is driving the car. Otherwise, the “FSD” would have to be turned off immediately. The fanboys that think this system is anywhere close to being autonomous are insane …
one problem I see is you are using AUTOPILOT on a two-lane highway. Autopilot is made for the freeway and does not respond properly in two-lane traffic like FSD beta does. I'm sure it would work on a freeway, and I'm sure FSD beta would do fine, maybe not for the bucket, but for some of the other objects and definitely the truck. I'm a beta tester and my 2022 Model 3 will stop if a truck or pedestrian pulls or steps in front of the car. Also, you need to have a destination set on the screen; it seems to me that when I use mine with a destination plugged in, everything works better. It probably shouldn't matter, but that's how mine seems to work.
From a real world perspective, these requirements are too complex for the average person to rely on safely. The type of road shouldn’t dictate whether or not the car steers around deadly obstacles.
This is very weird. Especially the last example with the truck pulling out. I’ve had the exact scenario with a car pulling out in front of me more than once (not very good drivers in my area). It pulled out even later than yours did. I didn’t even have time to react and my Model Y activated the emergency brakes and slowed me enough to avoid hitting the car pulling out in front of me. My Model Y has FSD but not beta - so the same software as your model 3 (in theory).
@@stevecool4381 I'm not sure. I'm sure there is a lot that's different. I was on a 4-lane road in a different area of the country. It's happened more than once, so I need to pay better attention to the conditions next time 👍🏻
Since a lot of people seem to want to criticize Tesla for their naming and word choice, we should use their actual words: there is no "FSD" currently. That phrasing was only ever used when talking about the end goal, and the closest thing they currently have is the "FSD beta" program. The thing they sell is the "Full Self-Driving Capability" package, which includes features like:
- Navigate on Autopilot
- Auto Lane Change
- Autopark
- Summon
- Full Self-Driving Computer
- Traffic Light and Stop Sign Control
And the coming-soon stuff. None of which are shown or used in this video (except of course that the car has that specific computer hardware). You only used Traffic-Aware Cruise Control, Autosteer, and the emergency features, all of which are included on every car. So you calling it "FSD" multiple times and saying stuff like "even other cars are better that don't have self-driving" is unbelievably misleading, since the features that package adds were not relevant at all here. And the features that were relevant in this test are pretty basic. I think Autosteer is literally not allowed to leave its lane, right? Sadly I'm unable to find any screenshots right now of the explanation text that is shown when you enable it. But of course the results of the testing were utterly disappointing. It's sad to see that the public features are still that dumb. In that respect the video is helpful.
The only constant is that type of roadway. Judging by how narrow the roadway is and how dense the objects (aren’t) it’s calculating that it’s safer to go through it at that speed than off the roadway. Try an open parking lot
Tesla doesn't want anybody that isn't mostly promoting them to use the beta software. They openly restrict usage for anybody that tries to be fair with the testing
That's why I trust LiDAR more than a pure vision system for now. LiDAR may not be good at recognizing the type of object, but it'll at least tell you the object exists. In pure vision, it's all a guessing game where the AI guesses the likelihood of objects, and if it fails, it'll learn and hopefully do better next time. Eventually, AI should get better at that guessing game, to the point of beating humans. But I always prefer a system that actively detects objects and surpasses human ability over one that is merely getting better at guessing than humans.
One thing that is never going to happen is swerving around the object across a double-yellow lane marking, especially with what I call dumb Autopilot and FSD. The only object I would have expected it to brake for was the truck, but the original non-beta FSD was not meant for streets anyway and will be improved when the beta is complete.
Correct. Insurance companies will never fault you for staying in your lane. Your chances of further injuries and damage go up like 3x as soon as you leave your lane, either left or right.
If you run tests like this again, be careful of oversteering. Say you have the collision point (CP) and the near-miss path around that CP; anything after the CP could be a part of that new path. You have some reasonable reactions, but there is absolutely no reason to have anyone standing past the CP. Cases: 1. Pathing around the object is not possible, and the front of the car strikes the object, projecting it toward a bystander. 2. Overcorrection, or a tire fails, and the Tesla no longer adheres to the intended near-miss path (it may skid or continue into a bystander). 3. Failure to handle the steering wheel (missing the critical turn back toward the road and thus ditching the car off the road, toward bystanders). One simple solution is setting up cameras on tripods (removing human life from the area) while fully controlling the road for these tests. (With respect to a partially open road: I was part of controlling traffic on a two-lane mountain road while wearing cycling gear, with no traffic cones or flares, while we waited for EMS. One crashed cyclist who couldn't be moved, one person stabilising the cyclist, and eventually four people conducting traffic through the area: two above the crash, one at the crash, and one below it, due to sight lines and poor driver reactions. Drivers who are used to bombing down their favourite mountain/backroads at 120% of the speed limit are not looking for unusual circumstances. They are going to keep driving and not really register someone trying to flag their attention, especially Americans in Silicon Valley. The best approach is to expect cars to ignore any directions. They will drive through the area, collide, and then ask why the road wasn't fully closed for testing.) (I hope you had additional spotter cars to clear a larger time gap than what was shown here.)
I'm quite curious whether the emergency braking is still limited to between 7 and 29 mph. If so, could you retry this test on a straight road going under 28 mph? I feel Autopilot needs the advantage of a straight road for static objects. I know this test reflects real-world situations, and you'd definitely want your car to emergency brake at high speeds and in all cases.
The answer is yes. FSD Beta has better object recognition than Tesla Autopilot, which is, by the way, just a plain old dumb cruise system, the same as every other cruise system out there.
I have a 2023 Model 3 RWD and I'm so afraid to use basic Autopilot that I hardly ever use it. Simply put, I don't trust it or the emergency braking. I always have my hands on the steering wheel and one foot on the pedal. My $40k Tesla is basically an expensive electric go-kart.
The fact that it is detecting an object in the road and alerting you, but doing nothing significant to avoid the situation on its own... what?! I could understand if it doesn't detect something; then there's nothing to act on. This shows it knows and does not act appropriately.
No, it's just that the system he's testing was never designed to avoid/steer around objects, and he's being intentionally misleading. The two solid blue lines indicate he's in Autosteer, the most basic form of Autopilot. All that does is keep you in your lane and brake for cars ahead of you. That's all Autosteer does. Auto emergency braking is part of the basic Autopilot package that includes Autosteer, but AEB will only brake for pedestrians, cyclists, cars, etc. You know, things that are actually common on the road. It is NOT trained to stop for office chairs in the road, grills, or any other random object that Aiaddict has lying around. And considering he worked at Tesla on the Autopilot team, I'd expect him to know this. Would you really want a system that would auto emergency brake for any object? Any false positive would have your car constantly braking, which is why it's only trained to identify things that actually matter, like people, cars, and other road users. The only system Tesla is testing that has any form of object avoidance is FSD beta, and even that is limited in what it will avoid (open car doors, some animals, etc.), and only at low speeds. Aiaddict also previously tested this, so he should be aware of that too. Making a video like this to intentionally mislead people into thinking Autopilot is some incapable system that isn't safe is ridiculous.
@@Brendon471 Regardless of the incorrect "Autosteer" claims made, what are your thoughts on the emergency braking system? It should be able to detect at least a trash can and a vehicle, but it failed to brake.
It's still worrying that people actually think Level 2 should stop for static objects. That's the driver's job. The reason AEB and AES exist is so that suddenly moving obstacles, ones that might be dangerous to the car or endangered by it, can be avoided (small animals will be driven over because that's considered safer).
The pickup truck could be because of the missing radar. Cameras are just very bad at detecting such dangerous situations; LIDAR should improve this detection rate dramatically in the next few years. But it's still Level 2 for now.
This has always been the most concerning failure of Tesla's software. It just tries to plow right through so many things that it doesn't expect to be there. FSD Beta regularly tried to drive me through a train. I can understand struggling to recognize some small obstructions, as it must be near-impossible to train it to distinguish every object, but it's particularly concerning that it fails to see large obstructions, and that it fails to react to the ones it does see. The latter seems improved on FSD Beta but these are my greatest concerns for FSD ever reaching full functionality.
OK, so not FSD beta. Ballsy to run over the bucket :) I'm guessing they don't avoid items because it would otherwise get spurious detections and violently swerve when nothing was there, or give rise to phantom braking. If you can, test the same with the beta. It might be more elaborate.
I agree, I do want to re-run this test using the Beta stack. I also want to use big objects that won't damage the vehicle, so I can get closer to them and see if it can brake, even if late.
@@AIAddict can you say what your role was at tesla? was it peripheral to the autopilot stuff like coding on the simulator or some of the core ML stuff?
I'm glad Tesla is still in its infancy with FSD. Things like this once again reaffirm my belief that FSD has a LONG LONG way to go when it comes to full self-driving LOL
Dude is on BASIC AUTOPILOT and using a Tesla that has abandoned sensors for cameras (aka Tesla Vision). You can't test a feature without using the correct tools. In addition, I've had automatic emergency braking kick in for me because of a dude trying to brake-check me. I've also heard from other Tesla owners whose Teslas braked for them, saving their lives. The guy who said "Haters are gonna hate and fanboys are gonna be fanboys. Don't be discouraged." sounds like a gasoline-car fanboy himself, driving around in a car with zero safety capabilities. At least Tesla is trying and releasing updates, making the car safer every month.
My assumption is that the only acceptable tolerances on their vision-only FSD suite to avoid excessive phantom braking (due to false positives like road markings and shadows) that smaller static objects are simply ignored. Hopefully the beta version can do better because that's obviously not acceptable for anything beyond L2.
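A toy illustration of that tradeoff (the class names and confidence values here are invented, nothing from Tesla's actual stack): a single global braking threshold can't suppress shadow false positives without also dropping small real obstacles.

```python
# Each detection: (label, classifier confidence). Values are made up.
detections = [
    ("shadow",       0.35),  # false positive
    ("lane marking", 0.40),  # false positive
    ("trash can",    0.55),  # real but small/unusual object
    ("pedestrian",   0.92),  # real, well-represented class
]

def brake_targets(dets, threshold):
    """Return the labels the system would brake for at this threshold."""
    return [label for label, conf in dets if conf >= threshold]

# A low threshold phantom-brakes for shadows; a high one misses the can.
print(brake_targets(detections, 0.30))
print(brake_targets(detections, 0.60))
```

This is why production systems typically use per-class thresholds and tracking over time rather than one cutoff, but the basic tension between phantom braking and missed obstacles remains.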
@05:58, I think this is an important real-world test that Tesla's AI fails to even remotely consider. I give Tesla an anticipation score of 0%, where I would be wondering and concerned (is there a human in the driver's seat, and are they going to pull out without looking?). If I were on a bike, I'd rate it a 75% chance that an occupied 'parked' car would pull out. (Avoidance as a cyclist: normally riding at 30+ km/h on a flat grade, I'd slow to 24 km/h while moving more toward the centre of the road, assuming I can see ahead. That gives me more braking time and room to ditch into the oncoming lane.) (Otherwise, driving this, I would be covering the brake after flashing my high beams, expecting to brake within 100 ms or less.)
Thanks for the video. The best thing Tesla could have done for the owners community was fire you. Thankfully you are free of the non-compete contract and can publish all of your findings freely without fear of reprisal! Hope you make millions off these videos sharing your actual experience.
I just got FSD beta today. So sad that I paid for this. It can't even drive in a straight line. It disengages constantly and tried to kill me 3 times. I was a believer and a sucker for so long. :-(
interesting, what year is the Model 3 you were testing? I wonder if the camera-only setup is why that's happening, or if it's the one with the camera and radar combo operating the self-driving/Autopilot.
2021 Model 3 Long Range with Radar equipped. Regardless of having radar equipped or not, Tesla has turned it off completely in the software. However, they do still rely on radar for TACC and Summon
@@AIAddict interesting, did not know that, thanks for the update. That's really bad; they need to fix that. I feel like other makers would stop for those obstacles, like Subaru's EyeSight system, etc.
I performed this exact same test at four different Tesla dealers on real roads with real stopped cars in front, back in 2019. I tested about 30 times, and both forward collision warning and emergency braking did NOT work; I had to steer into the adjacent lane every single time. I went on a Tesla forum, and all the Tesla fans and owners did NOT want to hear the truth. Their cars "self-drive". I performed the same test on cars from Nissan and Toyota. The other brands worked as expected the very first time. So I ended up buying a Nissan. I guess after 4 years, Tesla still cannot get this basic thing right.
There is a similar test with other cars: ruclips.net/video/bRhoUqD6Opc/видео.html These are the cars that were tested: Tesla Model S with AP1 and AP2, Tesla Model 3, Tesla Model X, Volkswagen Golf GTE, Audi e-tron 55 quattro, Jaguar I-Pace. You can see that the ADAS systems have limitations in most of the cars. The Audi works only up to 25 mph. The Model 3 at 25 mph overshoots slightly. The Model S works nicely; a very old Model S sees something but fails. The Jaguar I-Pace and VW Golf GTE didn't work at all. So newer Teslas are better than the competition, but it is still a work in progress for all brands. And the test was made for slow situations of 25 mph, and not on current models.
Ahh, I see. So instead of confronting the real world, Tesla's AI thought of deleting the object from its virtual world. I guess everyone processes stress and trauma differently. Bet a different Tesla with a different personality won't fall for this.
oh, you are the one who got fired from Tesla lol. No, Autopilot is not the same as FSD beta; they are trying to merge these two features into one in FSD beta v11. You were just a data labeler, not an FSD system engineer; you don't understand the system. You are not running the latest FSD beta, so this doesn't count as a valid test.
@@AIAddict There is "Full Self Driving" text on your video thumbnail. That means you're testing FSD. Tesla's latest FSD stack is FSD beta; if you're just testing an old version of the technology, what's the point? You could argue that Chrome is slower than Safari based on testing a 5-year-old Chrome against the latest Safari, but that's not the same, is it?
@@AIAddict Also, the FSD stack with Autopilot is not the same as FSD beta. You should know that, or maybe you couldn't have known it, because you never worked on the FSD system directly.
Mines Road? Seems like seeing an object would be the easier part of a camera-only system, and determining the object's distance and trajectory would be the harder part that requires all the fancy compute and ML, and yet here it just flat-out fails.
If you put a labor union contract in the road, the Tesla will slam the brakes and deploy the air bags.
It's Tay!
Of course it will. Because it wants to protect every employee from the lack of productivity that comes with unions. In the end, Tesla employees are way better off without one, because productivity will lead to a higher stock price and you can benefit from that through stock-based compensation. Unless you get fired for stupid behavior, that is.
Bro you can't be leaving comments on videos like you're some kind of normal YouTuber or something
Funniest thing I've read today.
@@whynotstartusingyourbrain8726 ...and other jokes you can tell yourself
I applaud your bravery in these tests. You really waited for the last second each time to give the system a chance to stop. Thank you for the video!
You have a good friend to casually run into the street in front of a moving vehicle it's increasingly clear won't stop, and give a chair a spin, or knock over a pallet!
You're doing exactly what I would expect an excellent and dedicated engineer to do. You have always stayed focused on the task at hand and truthfully reported everything you have observed, and even after you were let go you have continued doing this work on your own. Much respect.
Thanks for this! Haters are gonna hate and fanboys are gonna be fanboys. Don't be discouraged. You're bringing light about the FSD situation and informing thousands about how not-ready this technology is.
I appreciate your kind words
Bringing light to FSD 😅. The car being tested is not even in FSD. I'm glad Tesla fired your lazy ass from their team. Go work for GM and Ford.
@@AIAddict And in the interest of your own physical safety, an (unsolicited) tip: when performing these kinds of tests, hold the steering wheel without wrapping your thumbs around it. This is common practice in off-road driving, so that if a rock or rut causes the wheels, and thus the steering wheel, to turn suddenly and violently, you don't risk a thumb injury. I think that scenario definitely applies here. 😅
I'd say it's worth preventing that risk so you can keep posting these very useful tests while reducing the risk to your person (insofar as possible) 😅
This isn’t FSD software. This is BASIC AUTOPILOT.
They also abandoned sensors for cameras so I'm not surprised at the results
I applaud your bravery, have always joked about doing these types of tests. Glad someone was brave enough to try it.
This was actually no clickbait. I'm impressed!
I’d be interested to see the test performed with beta version
Same, Tesla can re-activate it for me and we can see
Shame tesla doesn't allow testing with anybody that is not promoting them actively
@@sanisidrocr I have beta and could do some of these
@@AIAddict re-activate it? Are you the one who got it taken away after hitting something? Sucks man. I like the tests but it is beyond the way they want people using it. They already have so many people complaining about "normal" drivers testing software on public roads....
@@newberry940051 well, my Autopilot stops for chairs I place in front of it; mind you, it's late 2023, almost 2024. Make sure next time you're using a Tesla equipped with Full Self-Driving, as it's not fair to use Autopilot and label it as Full Self-Driving. And remember, detecting an object and acting upon it are different things. Truthfully, Tesla has always said this feature is still actively being improved, so expecting it to be good right out of the box is quite stupid. And it's going to be fully autonomous pretty quickly, considering where it's at today and how much it's improved, with Tesla removing 300,000+ lines of code and now using the neural network to train the car even better.
Even when it thinks the chair is a PERSON it's like "oh well BEEPBEEPBEEPBEEP" & just decides to run him over.
As much as I'm a believer in the tech, I call BS on Elon removing radar from the 3 and the Y. Radar cars perform better. Period.
Tech does not require belief. My toaster works whether I believe in it or not. The way some people talk about this stuff sounds like a doomsday cult. Like you're all preparing for self-driving rapture.
So, what did they get the award for, if it never tried to stop? I'm not a Tesla fan; I consider myself rather critical of them, but it's really strange that it didn't do anything. Have you tried a different roadway? Just out of curiosity. But I guess sometimes it detected the objects and did nothing, so wow, that's pretty useless.
The specifics of the award were for its detection and avoidance of “vehicles and pedestrians”. However, as shown, it did not react to a vehicle pulling out.
How bad must the others be? It would be great to have the next up competitor run these tests too
It's miles better than what's in all other cars, that's why. Makes you worry about being in another car, doesn't it?
@@nicholasthon973 Doesn't make me worry, because I'd have to be either blind, paralyzed, or unconscious before I would rely on any self-driving software operating a vehicle that wasn't on a track or cable. And I agree, this award is premature by several years, at least judging by these results.
wow man, your testing of the capabilities of a tesla is great, best I've seen on yt so far!
it's so important that you do this, even if Tesla will hate you for it or start arguing about why it's not realistic or whatever. But they have to face the truth that something is not working properly, not good enough.
But, please:
Make something that wakes Tesla up, really: do this test again, but this time don't use everyday items.
Take balloons (to make the object so light that you can actually run it over without damaging your car) and build a puppet the REAL size of a 2-year-old child, with arms and legs and a head made of balloons, put children's clothes on it and some wool for hair, and place it as if it were crawling on the ground, maybe like a 2-year-old after falling down. (Maybe you should pull on a thin cord so that the balloon toddler moves A BIT.)
And film closely the moment when and how the balloon toddler gets run over, to shock everybody, so that Tesla will speed up getting better, and maybe also focus on this instead of creating unimportant new things like funny sounds or whatever.
It's just disappointing that not even the expensive cars can already perform perfect automatic emergency braking! (no matter which system they use)
Incorporating this into part 2. Great ideas all around here!
Dang you have really thought about this. Great idea. 👍
I think that behind the hype you don’t really know what Tesla is… Don’t expect that a thief will suddenly start obeying the law and caring about safety. Tesla is a fraud.
You could potentially do a cardboard cutout of a child or an adult or a barbecue 🙂. The car will not be damaged by a cardboard cutout.
@@AIAddict this will definitely result in media attention and hopefully will force Tesla to put more effort into fixing this. Doing great work!
Camera-only is a really bold move. They should probably have stuck with LIDAR, which I imagine is a lot safer and much easier for the machine to interpret. Detecting an object in a LIDAR scan is probably downright trivial compared to computer vision. You just check the distance to the car and that's it. You should be able to tell us more about this, I guess.
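A minimal sketch of that "just check the distance" idea, assuming an idealized, pre-filtered point cloud in the vehicle frame. All thresholds here are made-up illustrative numbers, not anything from a real perception stack:

```python
def obstacle_ahead(points, lane_half_width=1.5, max_range=30.0, min_height=0.3):
    """Return True if any LIDAR return lies in the corridor ahead.

    points: iterable of (x, y, z) tuples in the vehicle frame,
    with x forward, y left, z up, all in metres.
    A return is treated as an obstacle if it is in front of the car,
    within the lane corridor, and higher than typical ground clutter.
    """
    for x, y, z in points:
        if 0.0 < x <= max_range and abs(y) <= lane_half_width and z >= min_height:
            return True
    return False

# Clear road: only low ground returns
ground = [(5.0, 0.2, 0.05), (12.0, -1.0, 0.02)]
# Same scene plus a chair-sized return 10 m ahead in our lane
chair = ground + [(10.0, 0.1, 0.6)]

print(obstacle_ahead(ground))  # False
print(obstacle_ahead(chair))   # True
```

Of course, a real system still has to segment ground, handle noise, and decide what to do about the detection, which is where the hard part actually lives.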
Even with lidar you're still left with the problem of prediction. For humans it's just intuitive. It's very hard to teach machines how dumb people can be, leaving trash cans in the middle of the road.
DUDE, just please be careful!
Performing those stunts on public roads, what could actually go wrong?
This is the reason Tesla fired his ass in the first place 😂.
Sad to see that there is no emergency braking applied at all. Happy to see that it is not violating the double line while avoiding the obstacles.
@@logicthought24 no humans harmed or involved in this test..
Tesla vehicles used to have radar along with their vision. The radar would detect and brake for any object in the road. But Tesla felt they were hitting a glass ceiling depending on radar and decided to drop the radar completely and go all in on their Tesla Vision. However, by doing that, they had to go many steps back and work on getting through those edge cases that the radar would have no problems detecting. This is where Tesla is right now.
I wonder if this Tesla still has radar?
It is also not obvious to me if the FSD stopped using radar for vehicles equipped with radar.
My Tesla vehicle does have radar equipped but seems to only use the radar hardware for cruise control and summon.
@@AIAddict Teslas that still have the radar hardware have it deactivated now. It doesn’t work.
Wow. Scary! I hope the authorities around the world are paying attention.
I sent a link to this post to our son who is a Waymo programmer. He lives in SJC.
I would love to meet your son. He sounds like a really cool person.
This was observable in a few situations in videos. FSD shows rather slow reaction times toward suddenly appearing objects. FSD can predict traffic and paths really well given a far enough viewing range to assess the situation 2-3 seconds before reaching that actual point or before the objects arrive at the car's location. They need to bring this down to the 0.5-second region to significantly surpass human capabilities.
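Some back-of-envelope arithmetic on why that reaction window matters. The 40 mph speed is my own illustrative assumption, not a figure from the video:

```python
def reaction_distance(speed_mph, reaction_s):
    """Distance in metres travelled before braking even begins."""
    metres_per_second = speed_mph * 0.44704  # mph -> m/s
    return metres_per_second * reaction_s

# Distance eaten up by the reaction time alone, before any braking
for t in (2.5, 0.5):
    d = reaction_distance(40, t)
    print(f"{t} s reaction at 40 mph: {d:.1f} m before braking starts")
# 2.5 s reaction at 40 mph: 44.7 m before braking starts
# 0.5 s reaction at 40 mph: 8.9 m before braking starts
```

So cutting the reaction window from 2.5 s to 0.5 s buys roughly 36 m of extra braking room at that speed, which is the whole game for a suddenly appearing obstacle.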
Wow, these are better tests than 99% of Tesla YouTube. Congrats!
Been waiting for this one
Hope you like it!!
There is so much wrong here. IIHS says it's a pedestrian and vehicle avoidance system, not a static/moving object system. You were testing Autosteer, not the "FSD system". If you read what AEB and Autosteer do, they definitely don't tell you "the car should steer and brake without [your] input".
He literally said he helped program it ...
@yuahyahassen9807 you didn't understand his comment
@STerkskz Yes I did. He's wrong. He clarified in the video that before people say this nonsense... FSD probably is better, but not for the reasons explained. FSD can probably detect any object that's not usual... he's talking about what they have specifically coded emergency braking to avoid.
@@STerkskz not everyone has fsd
@@yahyahassen9807 The author of the video states he's testing the "FSD system", which is a lie. Then the author pits the legacy Autosteer and AEB system, which Tesla is no longer actively improving after recognizing it can never succeed, against static objects that are neither cars nor pedestrians. The author additionally seems to believe the car will swerve to avoid accidents, something neither AEB nor Autosteer can do. The author disingenuously implies FSD has failed when it would thrive under these circumstances.
Now cars with or without FSD have an FSD-based AEB with more power and capability. Software updates are awesome
Consequences of Musk’s genius idea to remove radar. -anything to cut costs.
If a pedestrian wears a shirt with a picture of a red traffic light on it, will the Tesla think it's a real light and stop?
good video. they need to be aware of these issues. who better to call them out than someone who they fired. keep it up
You need to try dropping a cardboard box out of the back of the car you are following. This happened to me in real life, but it was a carpenter's saw horse, which I couldn't avoid!
Will 100% incorporate this into the next video!
1:54 so did they kick you off the Beta?
Thank you for these in-depth tests. I'm terrified of the average driver, and even more so when that driver starts depending on this to drive for them.
Same, when I let friends drive my car, they activate this stuff and then let go of the wheel. Absolutely too complacent
@@AIAddict Holy shit don't they watch your channel?! I'm gonna say it's probably time to stop letting others drive the car, sadly.
do you think that the proximity of the tree's shadows had any effect on the detection?
Have you considered suspending things in the proper size/shape which would not damage the car and see how it behaves - like a balsa wood cutout or stuffed animal as target?
Even if this were the case, it's still obviously unacceptable.
Exactly. Not taking up for the car, but it doesn't do well with shadows, plus they are on a corner going around a curve and a white truck is sitting right there. It's focusing on too many things and going too fast.
Brave man. Thanks for testing that out for us
No problem 👍
I was on my motorcycle and a Tesla just plowed right into me no braking at all. This video explains why. 😱😱
You need to run this test again, but this time with child simulations using ballistics gel wrapped in saran wrap and done in such a way that it (hopefully) doesn't destroy the tesla.
I think the image of that going splat on the front of the Tesla would be enough to get attention, especially if you are approaching it as "we're trying to gather data to understand what's going on here".
Great idea, currently looking into ballistics. Maybe Adam Savage could help. Everyone should tag him in this comment
@@AIAddict Put bull bars on it, with spikes and razorwire.
I'm lowkey scared it will crash like the Teslas in recent news. Stay Safe and Thank you for making this content.
You've got nerves of steel! 😅 thank you for all the tests!
Some parts of this video scared the heebie-jeebies out of me. 😨 You were so close to hitting them. 🤯 That's insane. Thanks for uploading. Upload more. 👍
People use it every day on the highway and there is no problem, maybe sometimes phantom braking. Thanks for the effort. I think you use it too.
I agree. I just subscribed to FSD and went on a 200 mile road trip and the car performed flawlessly. I was super impressed.
Wow, 12K for a feature that is dangerous. If you can't rely on it, what is the point?
This is what I have seen with AI applied to the "physical" world: very promising, a lot of potential, but so far it doesn't work.
Sums it up nicely, I'd say.
Time to get some of those 39" tall real dolls (My Size, or something was their name) and see if Tesla is ready to start running over kids as well as roll through stop signs.
Just ordered some!!!
This was an incredible video, glad someone is doing this!!
You need a real crash dummy like what IIHS uses so the radar and cameras can properly detect it
Can you include a camera showing the pedals? Ideally a continuous clip that shows the pedals and the object, just to be sure you're not pressing any pedals. Because you can press the accelerator to override emergency braking.
I can definitely incorporate that into the part 2 video
Thanks for making this!
I got a Mazda CX-30 GT Sport Tech this year, and I can tell you its emergency braking is so much better; I don't think I would be able to crash the car even if I tried. With one exception: the other night I had to defrost the windscreen, and I should have gotten the frost off the cameras. The result was the dash flashing at me that all safety systems were offline. Next time we have a heavy frost I will know to make sure it's fully defrosted before I drive off.
WHY can these systems get on the road without passing a driving test!
I'd rather have a child drive.
Technically, the human operator is driving the car. Otherwise, the “FSD” would have to be turned off immediately.
The fanboys that think this system is anywhere close to being autonomous are insane …
Excellent video!
One problem I see is you are using AUTOPILOT on a two-lane highway. Autopilot is made for the freeway and does not respond properly in two-lane traffic like FSD Beta does. I'm sure it would work on a freeway, and I'm sure FSD Beta would do fine, maybe not for the bucket but for some of the other objects and definitely the truck. I'm a beta tester and my 2022 Model 3 will stop if a truck or pedestrian pulls or steps in front of the car. Also, you need to have a destination set on the screen; it seems to me when I use mine, if I have plugged in a destination, everything works better. It probably shouldn't matter, but that's how mine seems to work.
From a real world perspective, these requirements are too complex for the average person to rely on safely. The type of road shouldn’t dictate whether or not the car steers around deadly obstacles.
My car always stops for beer kegs!
So does my stomach 😋
So does mine, and it’s Level-0 autonomous and still somehow knows a good thing when it “sees” it …
Can you do it again with the FSD Beta? I wanna see if there are any differences
I’ll try and track down someone who is willing to let me do this with their beta build lol
This is very weird. Especially the last example with the truck pulling out. I’ve had the exact scenario with a car pulling out in front of me more than once (not very good drivers in my area). It pulled out even later than yours did. I didn’t even have time to react and my Model Y activated the emergency brakes and slowed me enough to avoid hitting the car pulling out in front of me. My Model Y has FSD but not beta - so the same software as your model 3 (in theory).
interesting, was the lighting any different?
@@stevecool4381 Maybe you should check your system because my car works fine in both of your last situations.
@@stevecool4381 I'm not sure. I'm sure there is a lot that's different. I was on a 4-lane road in a different area of the country. It's happened more than once, so I need to pay better attention to the conditions next time 👍🏻
Your commentary is hilarious when it’s about to hit an object
Since a lot of people seem to want to criticize Tesla for their naming and word choice, we should use their actual words:
There is no "FSD" currently. That phrasing was only ever used when talking about what the end goal is, and the closest thing they currently have is the "FSDbeta" program.
The thing they sell is the "Full Self-Driving Capability" package which includes features like:
- Navigate on Autopilot
- Auto Lane Change
- Autopark
- Summon
- Full Self-Driving Computer
- Traffic Light and Stop Sign Control
And the coming soon stuff.
None of which are shown or used in this video (except of course that the car has that specific computer hardware).
You only used Traffic-Aware Cruise Control, Autosteer, and the Emergency features, all of which are included on every car.
So you calling it "FSD" multiple times and saying stuff like "even other cars are better that don't have self-driving" is unbelievably misleading, since the features that package adds were not relevant at all here.
And the features that were relevant on this test are pretty basic.
I think Autosteer is literally not allowed to leave its lane, right? Sadly I'm unable to find any screenshots right now of the explanation text that is shown when you enable it.
But of course the results of the testing were utterly disappointing. It's sad to see that the public features are still that dumb.
In that point the video is helpful.
@@Grauenwolf More like I currently don't know of any actions that would make me call it fraudulent marketing to begin with.
Thank you for the video. This will make musk very upset.
The only constant is that type of roadway. Judging by how narrow the roadway is and how dense the objects (aren’t) it’s calculating that it’s safer to go through it at that speed than off the roadway. Try an open parking lot
Try this with fsd beta!
Tesla doesn't want anybody that isn't mostly promoting them to use the beta software. They openly restrict usage for anybody that tries to be fair with the testing
That’s why I trust LiDAR more than a pure vision system for now. LiDAR may not be good at recognizing the type of object, but it’ll at least tell you that the object exists. In pure vision, it’s all a guessing game where the AI guesses the likelihood of objects, and if it fails, it’ll learn and hopefully do better next time. Eventually, AI should get better at that guessing game, to the point of being better than a human. But I always prefer a system that actively detects objects and surpasses human ability over one that just actively guesses better than a human.
I fully agree
bro risked his own life in the name of science
😉
What model year is your car? I wonder if it has the front facing radar.
2021 Model 3 Long Range with Radar
One thing that is never going to happen is swerving around the object across a double yellow lane marking, especially with what I call dumb Autopilot, and FSD. The only object I would have expected it to brake for was the truck, but the original non-beta FSD was not meant for streets anyway and will be improved when the beta is complete.
Correct. Insurance companies will never fault you for staying in your lane. Your chances of further injuries and damage go up like 3x as soon as you leave your lane, either left or right.
If you run tests like this again, be careful of oversteering. Say you have the collision point (CP) and the near-miss path around that CP; anything past the CP could be part of that new path. You have some reasonable reactions, but there is Absolutely No reason to have anyone standing past the CP.
Cases:
1. Pathing around the object is not possible, and the front of the car strikes the object, projecting it toward a bystander.
2. Overcorrection, or a tyre fails, and the Tesla now isn't going to adhere to the intended near-miss path (it may skid or continue into a bystander).
3. Failure to handle the steering wheel (missing the critical turn back toward the road and thus ditching the car off the road, toward bystanders).
One simple solution is setting up cameras on tripods (thus removing human life in the area) while fully controlling the road for these tests.
(With respect to a partially opened road: I was part of controlling traffic on a two-lane mountain road, wearing cycling gear with no traffic cones or flares, while we waited for EMS. One crashed cyclist who couldn't be moved, one person stabilising the cyclist, and eventually four people conducting traffic through the area: two above the crash, one at the crash, and one below the crash, due to sight lines and poor driver reactions. Drivers who are used to bombing down their favourite mountain/backroads at 120% of the speed limit are not looking for unusual circumstances. They are going to keep driving and not really register someone trying to flag their attention, especially Americans in Silicon Valley. The best approach is to expect cars to ignore any directions. They will drive through the area, collide, and then ask why the road wasn't closed fully for testing.) (I hope you had additional spotter cars to clear a larger time gap than what was shown here.)
Full self driving in five years though 🤡
not even 50
I’m quite curious if the emergency braking is still limited to between 7 and 29 mph. If that's the case, could you retry this test on a straight road going under 28 mph? I feel Autopilot needs the upper hand of a straight road for static objects.
I know this test reflects real world situations and you’d definitely want your car to emergency brake at high speeds and in all cases.
Would FSD BETA avoid those same objects though? Just to see the whole picture, what if any progress with BETA vs the version 2016..
The answer is yes. FSD Beta has better object recognition compared to Tesla Autopilot, which is, by the way, just a plain old dumb cruise system, same as all the other cruise systems out there.
I have a 2023 Model 3 RWD and I'm so afraid to use basic Autopilot that I hardly ever use it. Simply put, I don't trust it, nor the emergency braking… I always have my hand on the steering wheel and one foot on the pedal. My $40k Tesla is basically an expensive electric go-kart.
You should create objects from soft material.
The fact it is detecting an object in the road and alerting you but doing nothing significant to avoid the situation on its own... what?!
I could understand if it doesn't detect something there is nothing to act on. This shows it knows and does not act appropriately.
It lacks radar and thus cannot be 100% sure it can safely stop or swerve I'm guessing
This is a very astute comment
No, it's just that the system he's testing was never designed to avoid/steer around objects and he's being intentionally misleading. Based on the two solid blue lines, that indicates he's in Autosteer -- the most basic form of Autopilot. All that does is keep you in lane and will brake for cars ahead of you. That's all Autosteer does. Auto emergency braking is part of the basic Autopilot package which includes Autosteer, but AEB will only brake for pedestrians, cyclists, cars, etc. You know, things that are actually common on the road. It is NOT trained to stop for office chairs in the road, grills, or any other random object that Aiaddict has lying around. And considering he worked at Tesla on the Autopilot team, I'd expect he knows this.
Would you really want a system that would auto emergency brake for any object? Any false positive would have your car constantly braking which is why it's only trained to identify actual things that matter like people, cars and other road users. The only system that Tesla is testing that has any form of object avoidance is FSD beta, and even that is limited in what it will avoid (open car doors, some animals, etc.) and even that's only at low speeds. Aiaddict also previously tested this, so he should be aware of that too.
To make a video like this to intentionally mislead people into thinking that Autopilot is an incapable system that isn't safe is ridiculous.
@@Brendon471 Regardless of the incorrect "auto steer" claims made, what are your thoughts on the emergency braking system?
It should be able to detect at least a trash can and a vehicle but failed to brake.
It's still worrying that people actually think Level 2 should stop for static objects. That's the driver's job. The reason AEB and AES exist is so that suddenly moving obstacles that might be dangerous to, or endangered by, the car can be avoided (small animals will be driven over because it's considered safer).
The pickup truck could be because of the missing radar. Cameras are just very bad at detecting such dangerous situations; LIDAR should improve this detection rate dramatically in the next few years. But it's still Level 2 for now.
Thank you for this!!!;)
This has always been the most concerning failure of Tesla's software. It just tries to plow right through so many things that it doesn't expect to be there. FSD Beta regularly tried to drive me through a train. I can understand struggling to recognize some small obstructions, as it must be near-impossible to train it to distinguish every object, but it's particularly concerning that it fails to see large obstructions, and that it fails to react to the ones it does see. The latter seems improved on FSD Beta but these are my greatest concerns for FSD ever reaching full functionality.
OK, so not FSD Beta. Ballsy to run over the bucket :) I'm guessing they don't avoid items because it would otherwise have spurious detections and violently swerve when nothing was there, or give rise to phantom braking. If you can, test the same with the Beta. It might be more elaborate.
I agree, I do want to re-run this test again using the Beta stack. I also want to put big objects that won’t damage the vehicle so I can get closer to the objects to see if it can brake, just late.
@@AIAddict can you say what your role was at tesla? was it peripheral to the autopilot stuff like coding on the simulator or some of the core ML stuff?
I'm glad Tesla is still in its infancy with FSD. Things like this once again reaffirm my belief that FSD has a LONG LONG way to go when it comes to full self-driving LOL
Robotaxi's lmao
Cool testing video. Thanks!
Hi Peter, glad you enjoyed it!
Excellent job!
Why are you testing on a curve in the road?
Dude is on BASIC AUTOPILOT and using the Tesla that has abandoned sensors for cameras (aka Tesla Vision). You can't test a feature without using the correct tools. In addition, I've had automatic emergency braking engage for me because of a dude trying to brake-check me. I've also heard from other Tesla owners whose Teslas braked for them, saving their lives. To the guy who said "Haters are gonna hate and fanboys are gonna be fanboys. Don't be discouraged.": sounds like a gasoline-car fanboy himself, driving around in a car with zero safety capabilities. At least Tesla is trying and releasing updates, making the car safer every month.
I LOVE THIS!
My assumption is that the tolerances on their vision-only FSD suite are tuned to avoid excessive phantom braking (due to false positives like road markings and shadows), so smaller static objects are simply ignored. Hopefully the beta version can do better, because that's obviously not acceptable for anything beyond L2.
now see if it stops for things like leaves, or anamorphic chalk drawings
@05:58, I think this is an important real-world test that Tesla's AI fails to even remotely consider. I give Tesla an anticipation score of 0%, where I would be wondering and concerned (is there a human in the driver's seat, and are they going to pull out without looking?), rating it, if I were on a bike, a 75% chance that the occupied 'parked' car would pull out.
(Avoidance as a cyclist: likely riding around 30+ km/h at 0% grade, I'd slow to 24 km/h while moving more to the centre of the road (assuming I have vision ahead). This way I have more braking time and give myself room to ditch into the oncoming lane.) (Otherwise, driving this, I would be covering the brake after flashing my high beams, expecting to brake within 100 ms or less.)
Thanks for the video. The best thing Tesla could have done for the owners community was fire you. Thankfully you are free of the non-compete contract and can publish all of your findings freely without fear of reprisal! Hope you make millions off these videos sharing your actual experience.
You’re not wrong. I finally feel the ability to speak freely. I was definitely having my tongue tied by management.
I wonder how this test would turn out today as opposed to when this video was made. Potentially there's been quite some improvement.
Wow, wasn't expecting such poor results. Do you think the software has gotten better since then? Maybe do another one of these vidoes?
You guys are crazy.
Could you maybe do a redo on this? I wanna know if they've made any software updates to this. Subscribed!
I just got FSD beta today. So sad that I paid for this. It can't even drive in a straight line. It disengages constantly and tried to kill me 3 times. I was a believer and a sucker for so long. :-(
This is what I’ve tried to tell the world. I’m sorry :((
please do it with fsd beta
Interesting, what year is the Model 3 you were testing? I wonder if a camera-only setup is why that's happening, or if it's the version that uses the camera and radar combo for self-drive/Autopilot.
2021 Model 3 Long Range with Radar equipped.
Regardless of having radar equipped or not, Tesla has turned it off completely in the software. However, they do still rely on radar for TACC and Summon
@@AIAddict Interesting, did not know that, thanks for the update. That's really bad; they need to fix that. I feel like other makers would stop for those obstacles, like Subaru's EyeSight system, etc.
the way this guy steers aggressively back to the right lane seems so dumb lol
I performed this exact same test at four different Tesla dealers on the real roads with real stopped cars in front in 2019. I tested about 30 times and both forward collision warning and emergency braking did NOT work, and I had to steer to the adjacent lane every single time. I went on to Tesla forum and all the Tesla fans and owners did NOT want to hear the truth. Their cars "self-drive". I performed the same test on cars from Nissan and Toyota. The other brands worked as expected the very first time. So I ended up buying a Nissan. I guess after 4 years, Tesla still cannot get this basic thing correct.
Swerving the car to the left in that curve with no shoulder to the left of the road increases the risk for the car to roll over that cliff.
There is a similar test with other cars too: ruclips.net/video/bRhoUqD6Opc/видео.html
These are the cars that were tested:
Tesla Model S with Ap1 and ap2
Tesla Model 3
Tesla Model X
Volkswagen Golf GTE
Audi E-tron 55 Quattro
Jaguar iPace
You can notice, that ADAS systems have limitations in most of the cars.
Audi works only up to 25 mph.
M3 at 25 mph overshoots slightly.
MS works nicely.
Very old MS sees something, but fails.
Jaguar iPace and VW Golf GTE didn't work at all.
So, newer Teslas are better than the competition, but it is still a work in progress for all the brands. And the test was made for slow situations at 25 mph.
The test was not made on current models.
Ahh I see, so instead of confronting the real world, Tesla's AI thought of deleting the object from its virtual world. I guess everyone processes stress and trauma differently. Bet a different Tesla with a different personality will not fall for this.
are these features fixed/improved as of august 2024?
could you also add the replay of the footage of all cameras next time? Would be interested to see.
I was really hoping you'd actually hit the objects lol
Yeah the orange bucket on run #1 was the only object we willfully hit
Oh, you are the one who got fired from Tesla lol.
No, Autopilot is not the same as FSD Beta.
They are trying to merge these two features into one in FSD Beta v11.
You were just a data labeler, not an FSD system engineer; you don't understand the system.
You are not running the latest FSD Beta, so it doesn't count as a valid test.
I specifically say in the test multiple times I am not testing FSD Beta and to not confuse it with the normal Autopilot / FSD stack which was tested.
@@AIAddict There is “Full Self Driving” text on your video thumbnail. That means you’re testing FSD. Tesla’s latest FSD stack is FSD Beta; if you're just testing an old version of the technology, what's the point? You can argue Chrome is slower than Safari based on testing a 5-year-old Chrome against the latest Safari, but that's not the same, is it?
@@AIAddict Also, the FSD stack with Autopilot is not the same as FSD Beta. You should know that, or maybe you couldn't have known it because you never worked on the FSD system directly.
Nah bro. It's working as intended. 🤣
Mines Road? Seems like seeing an object would be the easier part of a camera-only system, and determining the object's distance and trajectory would be the harder part that requires all the fancy compute and ML, and yet here it just flat-out fails.