Actually it needs more than that. You'll need movement to estimate distance. So if you place something while the car is parked and off, it cannot know the distance.
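A toy illustration of why motion matters for a single camera (purely a sketch; the focal length, baseline, and disparity numbers below are made up, and this is not Tesla's actual method): two frames taken a known distance apart act like a stereo pair, so depth follows from triangulation.

```python
# Depth from motion via triangulation: depth = f * baseline / disparity.
# All numbers are hypothetical; this is an illustration, not Tesla's pipeline.

def depth_from_motion(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate depth (m) from how far a feature shifts between two frames."""
    if disparity_px <= 0:
        # No shift between frames means no motion -- and no depth estimate,
        # which is exactly the parked, car-off case described above.
        raise ValueError("object must shift between frames to estimate depth")
    return focal_px * baseline_m / disparity_px

# Car moves 0.5 m between frames; a feature shifts 125 px on a camera
# with a 1000 px focal length -> the object is about 4 m away.
print(depth_from_motion(1000.0, 0.5, 125.0))  # 4.0
```

With zero baseline (car parked and off), there is no disparity and the estimate is undefined, which matches the comment's point.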
Having switched from a 2020 M3LR to a 2023 MYP earlier this week, the change was dramatic. In my garage, the M3 would work perfectly in telling me when to stop pulling forward. It also showed the clearances on both sides; not sure why people say it didn't work to the side. The dings were generally accurate: if you heard the ding, you really needed to pay attention. It was pretty accurate in all areas.

The MY is, in my opinion, horrible in this regard. The forward sensors say to stop WAY earlier than the M3. If I followed them (like I followed the M3's), the car would end up around a foot further back than before and not far enough in to put down the garage door. I've figured out that when the red stop line on the display reaches the passenger compartment, it's about right. Additionally, I've had it tell me to STOP when pulled off to the side of the road with nothing in front of me at all. Finally, I nearly backed into a four-foot retaining wall (on purpose, as a test) without the car reacting at all.

I hope this will improve, but at present it's so inaccurate as to be pretty much unusable. The difference in the areas I have a lot of experience with (e.g. my driveway and garage) was night and day, with the M3 being pretty much flawless and the MY pretty much useless. This aspect was far and away my biggest disappointment with the new car, and I believe they made a huge mistake removing those sensors. I still love the cars, but in my opinion this was a definite step backward.
Cost-cutting move. I am disappointed in this change as well because we have a tight garage and need to use all of it to put the garage door down. Now I will have to just guess. Things should not get worse with newer models. My 8-year-old SUV can tell me when to stop much more accurately than the Model Y I will be taking delivery of later this week.
I’d vote for radar, as I can accept a parking experience that I’ve used my own gray matter for the past several years. But collision avoidance in situations where visual tools work poorly is more important to me. That is when one would want collision avoidance the most…poor visibility. Radar isn’t perfect, but it is better than cameras. Coupled with cameras, it’s even better.
Hopefully Toyota or the other Japanese makers will catch up, so we don't have to experience this nonsense with no stalks, no USS, shitty headlights, touch buttons for the horn, etc.
The car can count tire rotations once an object leaves the field of view. Probably the only reason that they haven’t activated some features is that they are collecting data to convince themselves that the algorithms are working properly.
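The tire-rotation idea above can be sketched like this (a hypothetical toy, not Tesla's firmware; the tire diameter and wheel-sensor resolution are assumed values):

```python
import math

# Dead reckoning: once an obstacle leaves the camera's field of view, the
# car can keep updating the remaining gap by counting wheel-speed sensor
# ticks. All constants here are assumptions, not Tesla's actual values.

TIRE_DIAMETER_M = 0.70      # roughly a 27.6" tire (assumed)
TICKS_PER_REVOLUTION = 96   # typical ABS tone-ring resolution (assumed)
M_PER_TICK = math.pi * TIRE_DIAMETER_M / TICKS_PER_REVOLUTION

def remaining_gap(last_seen_gap_m: float, ticks_since_lost: int) -> float:
    """Distance left to the obstacle after it dropped out of view."""
    return last_seen_gap_m - ticks_since_lost * M_PER_TICK

# Obstacle was 1.5 m away when it left the frame; 40 ticks later the car
# has rolled about 0.92 m, leaving roughly 0.58 m of gap.
print(round(remaining_gap(1.5, 40), 3))  # 0.584
```

The weakness is the one the thread discusses: the estimate is only as good as the last sighting, and anything placed there while the car sat still was never seen at all.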
Management normally asks developers to create a proof-of-concept build to validate an idea before making a decision. I was expecting that the first software release for non-USS cars would have this algorithm, but instead, it was unusable. Did they not do a proof-of-concept then? Among all the Tesla decisions, this one seems the most half-baked. Still don't know how it's supposed to work with a smooth wall, or in the dark, and how it's supposed to bend light to see in front of the car.
It seems to me that it would be even better to have both systems working together, drawing on each other's strengths, to get the most complete and accurate accounting of the obstacles near the vehicle. I said (and still believe) the same thing about removing the radar systems. I get that vision alone is sufficient in most situations, but added information is only a good thing when making safety decisions.
@@shocktreatmentgaming nah, the main issue with FSD is how it drives, not what it sees. I think we will see a huge change with V12 because what the car does in every situation won't have to have been manually programmed for every possible scenario
The problem with radar is that it was introduced into cars for highway cruise control, where the object in front is very likely another car moving slowly relative to yours. Objects with high relative speed (i.e. everything stationary) are filtered out as uninteresting, which makes radar difficult to rely on in urban environments unless completely reprogrammed. Anyone who has tried radar-only adaptive cruise control on a twisty road will know how unworkable it tends to be. Fine on the motorway though.
In our garage with a lot of stuff, our 2022 MY ultrasonics aren’t enough. I ceiling-mounted a motion-activated dual laser pointer that shines into the cabin. I aligned it with the sliding console lid trim and can park the car very accurately. I also put a wheel chock to block a front wheel if I go too far for some reason. The laser pointer I got was from Amazon for about $35 CDN. Lotsa love to all ❤❤❤ PS Thanks Chris and Steph too 🙏
Vision park assist is a game changer in terms of avoiding curb rash on the wheels. It's much easier to avoid damaging the rims with vision, which adds side distance monitoring and alerting.
@@dragsteraa Seems like a lot of work for a party trick. Doesn't seem very useful for normal drivers. Just like Tesla Vision, it's mostly pointless unless you're like 16 and need help. People that have Teslas are 30-60 years old; I think the average age is 54. We don't need a 360 bird's-eye camera to pull into a parking spot...
@@Buggabones The most difficult case is backing into a parking space next to a curb (parallel parking). The other, easier cases are nosing or backing into a garage or other parking space.
When you're measuring around the 11:22 mark, the car seems to (correctly) want to stop where the pavement break is, not where the door itself is. Subtract the 6 to 8 inches of door inset, and measure from the rear bumper, with a 6" buffer. The car is accurate to within 6 inches of where you want it to stop, and as you said, it gave you a *good* place to stop, before you hit the door. Seems like the right thing to do.
As one of the lucky people that Tesla screwed over by taking our money for a load of features they then removed from the car before delivery I’ve had plenty of experience of Vision park assist by now. The bottom line is that it assists you in every situation where you don’t need help and as soon as the situation gets difficult (weather, darkness, confined space) it taps out and either says ‘degraded’ or just tells you to stop. I hope it’s going to improve because there is NO WAY summon (etc) can safely move a car around without a driver with the accuracy the system currently has.
I should add (to balance the negativity) that I think the concept has great potential and COULD turn out to leave USS in the dark ages. The question, as always with Tesla, is whether they will get there fast enough and be able to do it with the hardware on the car that I have. It leaves a sour taste that we’ve been ripped off with something that doesn’t really work to fund the R&D for people buying some future ‘2025’ or HW5 vehicle to benefit from.
Isn't it the same with Autopilot? The faster I drive, the more I "really" need it as a safety system, but then it doesn't work anymore at 180 km/h. And when my sight is limited during a rainy/foggy night, Tesla Vision has the same limitations.
This is the exact impression I got from this video. Someone at Tesla has really overthought reversing and entering into tight spaces. Every car I’ve owned in the past 10 years has had factory ultrasonic sensors that were dialled in to cover virtually 100% of parking scenarios. That’s all you need Tesla. I need to park my car in tight spaces without hitting anything. There is no need to reinvent the wheel here.
I prefer the USS, but having the backup camera sort of makes it a moot point. Yes, vision is inaccurate, but you can see what you're backing into on the screen. Whenever my car starts beeping, I never look to see how many inches I have; I look directly at the screen. However, not having a camera for the front bumper is probably the weakest link in Tesla Vision.
Great comparison thanks. Our cars in Australia 🇦🇺 still have ultrasonic sensors. However as anyone should know, ultrasonic sensors on any car are FAR from 100% reliable, especially with posts etc. So often I have seen people whinge about this when they hit one, when in fact the problem is the nut behind the wheel. Somehow I survived driving with absolutely no kind of parking assistance without hitting anything for forty odd years, before I had a car with a rear camera, and then a few more years still I owned a car with USS. KIND OF giving away my age there 😅
13:50 Are those infrared LEDs above the rearview mirror new? I haven't noticed them before. I'm thinking they're used for driver monitoring for FSD, but they're on now, when FSD is not engaged. Interesting.
11:15 I think you're missing that you can put a trailer hitch on the car, which will extend the car. I think 15 inches could actually be the length of the trailer hitch :)
In machine learning, this class of problem is called "out of distribution" failure. The sensor could be detecting the objects, but the model doesn't know how to classify the object. Basically, the training data didn't have data like it, so the model can't handle it. Training neural networks is mostly about the quality and quantity of the training data. If you have 1 million photos of cars and roads, but none with a specific object, it will still fail. The main way to fix the failure is to add more diverse types of objects and retrain the model. This is what computer scientists call "data distribution".

The way neural networks work today (Aug 2023), there isn't a way to analyze the model weights and prove definitively the model is doing what we "believe" it's doing. The fancy name for this is model interpretability. Since we can't interpret the model weights today, people just add more data. The catch is, adding another million pictures of cars doesn't actually make it better. What you need is high-quality photos of things a kid might put in your driveway, or things the model hasn't been trained on.

Even if you were to send Tesla all of the data from the FSD system in your tests, they would need to spend months to figure out exactly what the sensors detected and why the models failed.
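The OOD failure described above can be illustrated with a toy classifier (hypothetical labels, logits, and threshold; real perception stacks are far more complex): a model only ever outputs classes it was trained on, and one common, imperfect mitigation is to treat low-confidence predictions as "unknown".

```python
import numpy as np

# Toy illustration of out-of-distribution (OOD) failure. A model trained
# on a fixed label set still emits *some* label for an unseen object.
# Thresholding the max softmax probability is a simple, imperfect heuristic.
# Everything here is hypothetical, not Tesla's actual pipeline.

def softmax(logits):
    e = np.exp(logits - logits.max())  # shift for numerical stability
    return e / e.sum()

def classify(logits, labels, threshold=0.7):
    probs = softmax(np.asarray(logits, dtype=float))
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return "unknown (possible OOD input)"
    return labels[best]

labels = ["car", "pedestrian", "trash bin"]

# In-distribution: one logit dominates -> confident, sensible output.
print(classify([8.0, 1.0, 0.5], labels))   # car

# OOD input (e.g. a step stool): logits are flat, no trained class fits.
print(classify([1.2, 1.0, 1.1], labels))   # unknown (possible OOD input)
```

Note that without the threshold, the second input would still be labeled "car" at about 37% confidence, which is exactly the silent failure mode the comment describes.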
Thanks for your video it helped me understand the difference between the New Tesla Vision and the older system. The only thing I realized before seeing your video was that with Tesla Vision I do not have Summon and Enhanced Summon and Park Assist that I/we paid $6 grand for, and have no idea when we’ll get the SW update to provide us these advertised features. Thank you.
Parking a Model X forward into a garage with white walls and no "stuff" lying around, it thinks I hit the wall while still a foot away. Another disadvantage of the missing USS on the MX is that the doors only open a little bit when you approach the car, open from the app, or open from inside, and no longer as far as possible. With previous MXs you could walk up to the car, the driver-side door would open automatically as far as possible, you'd get in, push the brake, and the door would close again. With the current MX, you have to pull open the slightly opened driver-side door.
Thanks for taking the time to compare these features. No doubt HW4 cameras bring better visuals, but they cannot stand alone. The ultrasonic sensors win on proximity, even if you see the object from afar you do not know how close you are to the object when you or the cameras cannot see it. A real world scenario is when you're approaching a smaller retaining wall not visible to the driver and the cameras when you reach a certain distance. I was considering buying my first Tesla, but will wait until this is sorted.
We have a 2022 Shanghai Model Y with radar parking sensors in the bumpers, but it does not function as your older car. Our system does provide side distances as well as front and rear, presumably combining information from cameras and the radar.
I have the same model, I thought (but may be incorrect) actually that with the software update for Tesla Vision, it actually stopped using the ultrasonic sensors. In any case I recently hit a small wall (about 45cm - 1.5 foot) with the left front corner of the car, and AFAIK didn't get any warning 😥
@@HermanBovens I wonder how we can test whether the ultrasonics are still being used? Yesterday I measured the actual distance to a small barrier around a lamp post, about 0.5m high, made of 5cm tubing, and the reported distance on the screen was within 2cm of the measured distance. I measured it as this video has made me paranoid about hitting stuff!
This is usually how I reason when I park: if I'm having trouble parking in a spot because it's tight, then regardless of the car's systems, any damage to my car is statistically more likely to be caused by another car than by me trying to park. Conclusion: if the space is too narrow, find a bigger one and avoid damage from a third party, because that's the more likely risk. Better accuracy from the car's systems, i.e. USS, won't change this fact. PARK SAFER, NOT CLOSER!
That exact thing happened to me. My wife put the bag of swim stuff in front of the car and I couldn’t see it from where I was, I tried to summon and it wouldn’t come.
It should be able to display the 360 overview, as Elon promised several years ago now; that would be much better in those tight parking situations. My 2011 car has this, so why Teslas don't is quite strange.
Both systems should react EXACTLY the same as long as the object is detected (e.g. visualized as a trash bin) because the reaction (after detection) is software driven and completely independent of hardware. Both should react precisely the same to the "garbage bin". The only difference should be in the detection phase. As expected, the ultrasonic system is more accurate for close distances for a flat surface. No surprise there. I'm very impressed with the fact that Tesla Vision performed so well. It makes me less annoyed that I will never get V4 hardware. 🙂
13:45 Very interesting to see the IR scanning of the interior camera at the top. I heard the new system looks for where your eyes look and how much you blink and such, and other cars also have these types of sensors for this detection. Sad to see another nail in the "HW3 can get retrofit" coffin, but at least this detector suite can be used to enable hands-free driving in the right conditions if done correctly.
I mean, a parking sensor is there to show you the distance to an object, not just say "stop" even though there is tons of room. If you want to know whether you can make the turn or have to reverse one more time while you park, Tesla Vision will be useless, because it just tells you to stop with half a car length of space. I find it really disrespectful to the customer to just remove USS from a car when you could easily integrate Tesla Vision features, like detection on the sides, into the USS system to make a more powerful system than either could be alone. Nevertheless, I appreciate your work and thanks for all your videos!
For tight garages, nothing beats a parking pad glued to the floor and rubber bumpers on the walls. Some people in the comments are too stressed about things they have absolute control over. For novel situations and areas, I prefer a little conservatism.
Everyone is overcomplicating this. They are simply using their "Drivable Space" vs "Non-Drivable Space" logic + the Occupancy network for distance estimation. That's why it shows up until the curb and not the poles. Distance estimation is worse, but reading weird shapes is better. Simple
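A toy sketch of that drivable-space idea (entirely hypothetical grid and resolution, not Tesla's actual occupancy network): classify the space ahead into cells and report the gap to the first non-drivable one. An object the classifier misses simply never appears in the grid, which is why it can flag a curb yet sail past a pole.

```python
# Toy 1-D occupancy strip ahead of the bumper, 10 cm per cell.
# Entirely hypothetical; real occupancy networks are 3-D and learned.

def gap_ahead(occupancy):
    """Return distance in cm to the first non-drivable cell, or None."""
    for i, occupied in enumerate(occupancy):
        if occupied:
            return i * 10  # each cell is 10 cm deep
    return None  # nothing flagged in range

# A curb is detected 7 cells (70 cm) ahead; a thin pole just past it was
# missed by the classifier, so it never appears in the grid at all.
grid = [False] * 7 + [True] + [False] * 12
print(gap_ahead(grid))  # 70
```

The grid resolution also caps how precise the reported distance can ever be, which fits the "distance estimation is worse" observation.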
Good video. I think there are a few salient points.

First, although the Tesla with ultrasonic sensors did not do as well as expected, it would be easy to conclude that Tesla Vision is almost as good as ultrasonic-equipped cars. This would be true for Teslas, but certainly not for cars that are not Teslas. I've had cars going back over a decade that perform better than a Tesla with ultrasonic sensors. This points to Tesla's inability to write software that works well for the scenarios noted.

Second, Tesla should remove phantom features from their sales tool. Autopark, Summon and Smart Summon are not available on Tesla Vision vehicles. Implying they are, only to have some legal disclaimer that wipes out the marketing lie, is really a testament to their ethics in my mind. I'm surprised that the FTC hasn't gone after them.

Third, Tesla is really a reckless company when it comes to changes. Unlike most responsible companies, Tesla's approach to changes is absurd. The "right" way to behave when they made the choice to remove ultrasonic sensors (and radar) would have been to provide feature parity before the change was made. Instead, their approach was to remove them and then make empty promises that the features would return "soon." They haven't. It's clear what happened here: Tesla wanted to save money and removed parts, and they didn't really care what impact it would have on customers. I believe their approach has impacted the safety of the vehicles as well; it would be interesting to see a reputable company re-test Tesla Vision cars to see if they arrive at the same conclusion as it relates to safety (collision avoidance, for example).

Someone else posted that one should buy something for the features it currently has, versus those that are promised. Bingo. Reliable software that does what it is supposed to do is not Tesla's strong suit. The software that really matters (drivetrain control) works well, I believe. But anything outside of that is a crapshoot. People cling to the Elon Musk puffery for some strange reason. Anyone who believes what he says is probably in line to buy prime ocean-front Nevada real estate, knowing that the "big one" is going to hit California and make it sink. The Tesla software playbook is vintage early software companies: promise functionality to make the sale and hope that it can someday be delivered.
I think it would be cool to back towards the garage door and then open the door. Maybe both while in the car and with the car asleep to see what it says. USS should obviously see the door disappear, but does vision? My guess is that vision would incorrectly say the door is still there if you open the door while the car is asleep and don’t move.
Maybe put some foam on the crate before you assault your clear coat so much but looking at the side of the car and the name of the channel - Great Job!
We have one Tesla with and one without. I personally find the vision-based assist to be worse than nothing, basically just giving you incorrect information 100% of the time.
Same here, I turned off the parking beeps as they just distracted me. I get being conservative, but I can tell when I'm three feet from something on my own. Inaccurate parking sensors are worse than not having them. People say Teslas have great tech, but honestly it has some important gaps in things almost any other car made in the last five years does well.
They removed USS too soon. They could have run with both, training Tesla Vision against the USS readings on the car. Then, once the two performed similarly, they could stop putting in the USS. I think they jumped the gun here.
Ultrasonics do warn on the sides, at least on my 2021. There are ultrasonic sensors pointing straight left/right just in front of the front wheels and just behind the back wheels. So you do get ultrasonic alerts for the sides of the car, but only at the front and back edges. That is enough: you don't need ultrasonics, for example, halfway down the side of the car, because by that point you've already driven past the sensors on the front/rear and they've already warned you as you drove past the obstacle(s).
I'm confused by this test... It's my understanding Tesla disabled the ultrasonics on existing vehicles that still had them, as long as they're on HW3. I.e., aren't both these cars just relying on Tesla Vision at this point, regardless of whether the sensors are installed? Or are they still utilized, but only for parking purposes?
Thank you for your thoughts and the testing. My experience with a 2023 Model X is very different: when I approach obstacles, the car absolutely overreacts and often tells me to stop 1 meter (3.3 feet) or more in front of them, which isn't really helpful at all. On the other hand, it often fails to detect lamp posts completely.

Also: I really, really dislike the curb detection as it is. Tesla should display curbs differently (maybe a gray line?) or not at all. What's the point if I need to park so that the front of the car is above the curb (a very, very typical situation here in Germany!) and the car warns me about the curb but fails to display the wall/tree/post behind it?

The test with the small object in front of the car isn't so funny when you realize that a typical parking situation in Europe involves people in front of or behind you moving their cars (parking in a line at the side of the street), and often the distance between bumpers is only 50 to 60 cm (roughly 1.6 feet). Then I come back to the car and it shows the OLD distances in front. They can be too close or too wide, depending on what happened while I was away from the car.

What they should have done is add a nose camera, which, frankly, every other car in this price range has. Also: the Model X previously had ultrasonics on all four doors (and of course front/back and even the roof), so my 2018 Model X detects obstacles all around the car, and very precisely, but the new one is just a mess when it comes to parking. Please realize: European parking lots are a lot smaller. Sometimes when I drive from floor to floor, the distance between the front of the car and the wall is only 30 cm (~one foot), so if the sensors are off by around a foot, that's all it takes.
The experience was better with the USS. I wish they would bring them back. The cameras give warnings, but they're not accurate enough in tight situations. IMO, that was a bad move by Tesla.
To come to a conclusion: Tesla Vision is just garbage for parking. All the features about surroundings are nice to have, but they could probably just add them on top of the USS. They won't solve it, especially when you have, for example, a white wall where the vision system will have problems finding the right reference points.
Thanks for making the video! I was looking for videos about the visual parking vs the normal sensors that you would get in a 2021 and below. As a 13-year-old myself, I will probably get a normal gas car, then I'll move to a Tesla!
So you're 13 now, but by the time you're old enough to drive, Teslas will be much more affordable than gas cars. They're already at price parity today. They're also safer. Park assist will be even better and FSD with hardware 4 will be in full effect. It's interesting you used the term 'normal' gas car. Sooner than you think EVs will be the new normal. They're already the norm for me and I am 55 having driven gas cars for decades before getting a Model Y. Curious, why might you still be considering a gas car over a Tesla as your first car?
You made the point. USS is far more reliable than Tesla Vision (as of now; we hope for the best in the future). In every test it is accurate to within 1-2 inches, whereas vision is far off. That's hopefully protective, but too broadly conservative when you have to park in tight spaces as in Europe. In crowded cities it gets very hard, with no aid. USS used to work like a charm whatever the weather, night and day. The side detection does not really help; there is no measurement, and the line is always red, even when far from the curb.
Would be cool if Tesla would support a measuring label (like barcode but with larger bars) for more accurate distance measurement. Land Rover uses it for towing a trailer. Tesla can use it for more accurate distance measurement in tight garages and tight parking spots.
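The marker idea works because a known physical size pins down scale for a single camera. A sketch under the pinhole camera model (the focal length and marker size below are invented for illustration):

```python
# Pinhole model: distance = f * W / w, where f is the focal length in
# pixels, W the marker's real width, and w its apparent width in pixels.
# Hypothetical numbers; not any manufacturer's actual implementation.

def distance_from_marker(focal_px: float, real_width_m: float, pixel_width: float) -> float:
    """Estimate camera-to-marker distance in meters from apparent size."""
    return focal_px * real_width_m / pixel_width

# A 0.25 m wide marker seen as 125 px wide by a camera with a 1000 px
# focal length is about 2 m away.
print(distance_from_marker(1000.0, 0.25, 125.0))  # 2.0
```

The wide bars matter because the accuracy of the estimate depends on how precisely the marker's pixel width can be measured; bigger features survive blur and low light better than a dense barcode would.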
As Sandy Munro says, FLIR sees everything, even if you're driving through thick fog/smoke and bad weather. Highly consequential when looking at camera based FSD safety and precision!
There's no reason the 2 can't work together. Ultrasonics for the front and rear distances, vision for sides. I think the vision system could do a better job of outlining the drivable space, like FSD does, which would make the dings a bit more clear on what they're for (curb vs object just past the curb) But, there's plenty of room for improvement for the Tesla Vision system as well.
I haven't been able to find this information: Will "older" cars with Ultrasonic Sensors also get Tesla Vision? Since they have the same cameras. Or are they software-limiting it to cars without? A combined effort (USS+Vision) could yield great results!
Are you sure a recent software update didn't change from ultrasonics to Tesla Vision? Also, regarding the measurements, the rear bumper sticks out further than the camera, so I think you should have measured from the bumper. Great experiments.
Like, why not combine the two? The car's expensive already. Clearly Tesla Vision and ultrasonics each have their pros and cons, and I believe this could actually be perfect if combined.
Curious, couldn't they strategically use UV LEDs that illuminate so vision cameras can see, yet to humans it would appear as if no light is being emitted, allowing vision to see well even in the dark?
I essentially agree, but at the other end of the visible light spectrum. Infrared illumination has been effectively used for vehicle reversing camera illumination for at least 20 years now
Objective test... well done. Still room for improvement, but somewhat useful. With the new Highland refresh I expect Tesla Vision will improve, as it doesn't come with USS or a front bumper camera either.
You are measuring the distance to the garage incorrectly on the vision-only car. The measurement should be from the bumper to the garage, not the inset area of the hatch lid; the bumper is the closest part of the car to the door. I have measured my 2022 Model Y with ultrasonics and it is surprisingly accurate.

However, I backed into a Tesla urban charging space once where the chargers are not directly behind the car. What's behind the car is a "Tesla charging only" sign on a skinny pole. The ultrasonics cannot see a skinny pole at all, so one can back into it.

One more thing: I get side warnings on my ultrasonic sensors. Every time I leave my condo garage they go off, because the door is rather narrow and there are bollards mounted on raised concrete blocks on both sides, which show up as cones on the screen.
You know what I don’t understand? With all this tech, why does the Tesla even allow you to hit something? In the garage door example, it would allow you to keep reversing into and through the door. Shouldn’t the vehicle refuse to move knowing there is an object in its direct path?
It's a difficult balance. What if it thinks there's something there but there's not and it stops you for no reason? At some point the human input has to override the computer decision. But the car does have the ability to stop you from hitting things at least temporarily. Look up the obstacle aware acceleration ability which can stop you from running into things. The car can also slam the brakes for you if you're about to back into another car or pedestrian or something without noticing
Lucky us in Australia, we still get USS on our Model Y. Our 2023 Model Y delivered 2 weeks ago has USS. It would appear from your video that a hybrid solution using both USS and vision would be best.
Thanks for the video and the testing. Seems to me that they should have left both radar and USS in the cars and done vision also. Each of the technologies has its strengths and weaknesses: fog, low light, an unlit garage, etc. I would not care about the side view that much... If I want to get close to a curb, I just LOOK at the screen and see the camera feed, or use the mirrors, just like I now use my side mirrors to see where the curb is.
It'd be interesting to see both cars going towards each other and comparing the numbers on both; that way we'd be able to see the measurements they each give for the exact same distance.
In Germany/Europe we have really tight parking spaces from time to time; without USS it will be a nightmare, since you need every cm to get the car into the parking space.
Exactly this. I don’t know what parking in the US is like, but TV park assist does not function at distances small enough to be useful parking in Europe. Nobody needs sensors to park a car in a massive space, you need them in tight spaces and that’s exactly when TV gives up.
@@Andy-sj2hl We have a lot of trucks here. Our parking spots are massive. One of the first things my friend pointed out when she traveled here is how big our parking spots are.
USS are completely useless in that case as well, as everything below 30cm is STOP.
@@Buggabones You didn't mention what country you are referring to.
@@482jpsquared First line of the dude I replied to.. US
Thanks for proving what everybody against the vision sensors ignored: The ultrasonics are not perfect and miss a lot of small objects in front of the car.
I was quite surprised by the step stool. The reason I chose the items I did was to show 1. an item they'd both miss and then 2. an item that could differentiate their abilities... I was wrong :)
I don’t agree, it only shows that Tesla hasn’t delivered what they promised. As usual, buy a product for what it delivers today not for what it maybe can do in the future.
What didn’t they deliver? I do not know much. I’m just curious
Or what did they promise?
@@Harrythehun sorry buddy, but Tesla didn't promise you anything. USS are a joke, get over it and move on.
When I got my Model S a few months ago, Vision Park Assist did not work yet. A few weeks later, it was installed through OTA update. I took my family out one evening and kept hearing the fast beeps while trying to parallel park. I was still far from the vehicles in front and behind me, so I had no idea what the beeps meant. Then I heard the awful noise of wheels scraping the sidewalk. I was immediately both impressed and heart-broken that Tesla Vision was working and that I had curb-rashed my front wheel. So I can attest that the side view of park assist works, and I have the scars to prove it.
I got that today I think? OTA
When I park with something in front, it always tells me to stop one or two feet away. The difference with that scenario and yours is that yours can leverage the side cameras. Mine has nothing but the “memory” the car had as it was pulling up to the object, and algorithms calculating distance (poorly).
I recommend you turn on all the cameras while parking. The side view cameras give you an excellent view of curbs while parallel parking.
I just have my 2020 M3 park for me haha
Vision being off by 2 feet could certainly be an issue in a robotaxi situation. A lot of downtown areas require tremendous precision during parking maneuvers, etc.
That's not an issue, as they will probably never even get level 3 self driving.
@@TimEndzeit lol @ever
@@TimEndzeit did you watch the latest video Elon released on 8/25?
@@TimEndzeit You have no idea at what point Tesla really is.
@@achilles-live so far it is really at level 2. What they can do does not matter if they don't want to take liability for millions of possible FSD drivers and accidents that might happen.
You could test with hard flat surfaces and uneven soft surfaces. Ultrasonics don't like soft sound absorbing surfaces. Vision shouldn't care.
They need to add a camera in the bumper to really provide accurate object detection
I think that will happen with the Highland.
Actually needs more than that.
You'll need movement to estimate distance. So if you place something while the car is parked and off, it cannot know the distance.
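The point above is motion parallax: a single camera gets distance by combining two views with a known baseline from odometry. A toy triangulation sketch (illustrative only, not Tesla's actual method; note the math degenerates when the baseline is zero, i.e. no movement):

```python
import math

def range_from_motion(bearing1_deg, bearing2_deg, baseline_m):
    """Range to a static object from the second viewpoint, given two
    bearings (measured from the direction of travel) observed baseline_m
    apart. With baseline_m = 0 the bearings coincide and no distance
    can be recovered -- hence "you'll need movement"."""
    a = math.radians(bearing1_deg)
    b = math.radians(bearing2_deg)
    # Law of sines in the triangle (viewpoint 1, viewpoint 2, object)
    return baseline_m * math.sin(a) / math.sin(b - a)

# Object first seen 45 degrees off the nose, then dead abeam (90 deg)
# after rolling 1 m forward: it must be exactly 1 m away.
print(range_from_motion(45, 90, 1.0))
```

This is why an object placed in front of a parked, powered-off car is invisible to a vision-only distance estimate until the car moves.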
@@FutureSystem738 oh nice! If I buy a HW4 Model X, do you think we can retrofit bumper cameras?
Highland is here and no camera in the front bumper. But you get lower quality headlights and no stalks. Tesla is the best
Having switched from a 2020 M3LR to a 2023 MYP earlier this week, the change was dramatic.
In my garage, the M3 would work perfectly in telling me when to stop pulling forward. It also showed the clearances on both sides, not sure why people say it didn't work to the side. The dings were generally accurate, if you hear the ding you really need to pay attention. It was pretty accurate in all areas.
The MY is, in my opinion, horrible in this regard. The forward sensors say to stop WAY earlier than the M3. If I followed it (like I followed the M3) the car is around a foot further back than it was before and not far enough in to put down the garage door. I've figured out that when the red stop line on the display reaches the passenger compartment it's about right. Additionally, I've had it tell me to STOP when pulled off to the side of the road with nothing in front of me at all. Finally I nearly backed into a four-foot retaining wall (on purpose, as a test) without the car reacting at all.
I hope this will improve, but at present it's so inaccurate as to be pretty much unusable. The difference in the areas that I had a lot of experience with (e.g. my driveway and garage) was night and day with the M3 being pretty much flawless and the MY being pretty much useless. This aspect was far and away my biggest disappointment with the new car, and I believe they made a huge mistake removing those sensors. I still love the cars, but in my opinion this was a definite step backward.
Cost cutting move. I am disappointed in this change as well, because we have a tight garage and need to use all of it to put the garage door down. Now I will have to just guess. Things should not get worse with newer models. My 8-year-old SUV can tell me when to stop much more accurately than the Model Y I will be taking delivery of later this week.
I'd want both...... some level of sensor diversity is reallllllly useful
I’d vote for radar, as I can accept a parking experience that I’ve used my own gray matter for the past several years. But collision avoidance in situations where visual tools work poorly is more important to me. That is when one would want collision avoidance the most…poor visibility. Radar isn’t perfect, but it is better than cameras. Coupled with cameras, it’s even better.
Hopefully Toyota or other Japanese will catch up, so we don't have to experience this nonsense with no stalks, no USS, shitty headlights, touch buttons for horn etc.
The car can count tire rotations once an object leaves the field of view. Probably the only reason that they haven’t activated some features is that they are collecting data to convince themselves that the algorithms are working properly.
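The tire-rotation idea above is odometry-based dead reckoning: once the obstacle drops below the camera's field of view, keep the last measured range and subtract the distance traveled. A minimal sketch (class and method names are hypothetical, not Tesla's code):

```python
class ObstacleMemory:
    """Dead-reckon the range to an obstacle after it leaves the camera's
    field of view, by subtracting wheel revolutions times tire
    circumference from the last camera-based range estimate.
    Hypothetical illustration, not Tesla's implementation."""

    def __init__(self, tire_circumference_m=2.0):
        self.circ = tire_circumference_m
        self.range_m = None  # unknown until the camera has seen it

    def camera_fix(self, range_m):
        # Last range estimated while the obstacle was still visible
        self.range_m = range_m

    def wheel_ticks(self, revolutions):
        # Obstacle now hidden below the bumper line; update by odometry
        if self.range_m is not None:
            self.range_m -= revolutions * self.circ

mem = ObstacleMemory(tire_circumference_m=2.0)
mem.camera_fix(1.5)     # last seen 1.5 m ahead
mem.wheel_ticks(0.25)   # crept forward a quarter revolution = 0.5 m
print(mem.range_m)      # → 1.0
```

The obvious failure mode, noted elsewhere in this thread, is that the memory goes stale: if the object moves while out of view (or while the car is asleep), the dead-reckoned range is simply wrong.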
Management normally asks developers to create a proof-of-concept build to validate an idea before making a decision. I was expecting that the first software release for non-USS cars would have this algorithm, but instead, it was unusable. Did they not do a proof-of-concept then? Among all the Tesla decisions, this one seems the most half-baked. Still don't know how it's supposed to work with a smooth wall, or in the dark, and how it's supposed to bend light to see in front of the car.
@@lsauve does it not have headlights to see?
@@BenefitOfTheDoubtInquiry not on the sides 😂
@lsauve agile at its worst... what about the yoke/turn signal? This one was a bad move too
WHAT?!? Every other manufacturer can collect enough data before the car leaves the factory. This is awful performance by Tesla.
It would be nice if you could enter in a distance you wanted it to move so you could place the vehicle with extreme accuracy.
It seems to me that it would be even better to have both systems working together, drawing on each others' strengths, to get the most complete and accurate accounting of the obstacles near the vehicle. I said (and still believe) the same thing about removing the radar systems. I get that the vision alone is sufficient in most situations, but added information is only a good thing when making safety decisions.
Merging the data (radar and vision) is very complex, they found. Elon is always going to remove parts that have overlapping functions.
@@shocktreatmentgaming nah, the main issue with FSD is how it drives, not what it sees. I think we will see a huge change with V12, because what the car does in every situation won't have to have been manually programmed for every possible scenario
I have heard RF sensors are even better
The problem with radar is that it was introduced into cars for highway cruise control, where the object in front is very likely another car traveling slowly relative to the car you're in. Objects with high relative speed (i.e. everything stationary) are filtered out as uninteresting, which makes these radars difficult to rely on in urban environments unless completely reprogrammed. Anyone who has tried radar-only adaptive cruise control on a twisty road will have experienced how unworkable it tends to be. Fine on the motorway, though.
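The filtering described above can be sketched in a few lines: a stationary object closes at exactly ego speed, so a legacy cruise-control radar drops any return whose range rate matches minus the ego speed. Illustrative toy code only (real radar trackers are far more involved):

```python
def drop_stationary(detections, ego_speed_mps, tol_mps=0.5):
    """Keep only detections whose over-the-ground speed exceeds tol_mps.
    range_rate_mps is negative when the target is closing; a stationary
    object has range_rate ≈ -ego_speed, i.e. ground speed ≈ 0, and is
    discarded as "uninteresting" -- bridges, signs, parked cars."""
    return [d for d in detections
            if abs(d["range_rate_mps"] + ego_speed_mps) > tol_mps]

ego = 25.0  # m/s, roughly 90 km/h
returns = [
    {"name": "overhead gantry", "range_rate_mps": -25.0},  # stationary
    {"name": "car ahead",       "range_rate_mps": -5.0},   # doing 20 m/s
]
kept = drop_stationary(returns, ego)
print([d["name"] for d in kept])  # → ['car ahead']
```

That same filter is exactly what makes such a radar blind to a stopped vehicle in your lane, which is why it can't simply be repurposed for urban driving or parking.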
@@EwanM11 then they should just use it on the motorway instead of limiting the maximum speed where cruise control works
This is one of those trust exercises. Lie on the ground and tell your wife to drive toward you.
With Stephanie, I'm pretty sure the probability of accident is less than 0!
and he lives in Michigan in the middle of nowhere!!!
@@davinci05131979 and if she did run him over she could blame “sudden unintended acceleration”
In our garage with a lot of stuff, our 2022 MY ultrasonics aren’t enough. I ceiling-mounted a motion-activated dual laser pointer that shines into the cabin. I aligned it with the sliding console lid trim and can park the car very accurately. I also put a wheel chock to block a front wheel if I go too far for some reason. The laser pointer I got was from Amazon for about $35 CDN. Lotsa love to all ❤❤❤ PS Thanks Chris and Steph too 🙏
That's a great solution for that one location, but for me, I would like Tesla Vision to navigate into parking spaces outside of my property.
Vision ultrasonic is a game changer in terms of avoiding curb rash on the wheels. Much easier to avoid damaging the rims with vision that adds the side distance monitoring and alerting.
for Tesla yes, other brands do this with Ultrasonic sensors on the side.I'm not very impressed with Tesla Vision and Tesla Ultra Sonic Sensors.
Other cars have things like 360° camera and even transparent view of the car, where you can literally see what's around and under the car.
@@dragsteraa Seems like a lot of work for a party trick. Doesn't seem very useful for normal drivers. Just like Tesla Vision, it's mostly pointless unless you're like 16 and need help. People that have Teslas are 30-60 years old; I think the average age is 54. We don't need a 360 bird's-eye camera to pull into a parking spot...
@@Buggabones The most difficult issue is backing into a parking space next to a curb (parallel parking). The other, easier cases are nosing or backing into a garage or other parking space.
vision ultrasonic?
When you're measuring around the 11:22 mark, the car seems to (correctly) want to stop where the pavement break is, not where the door itself is. Subtract the 6 to 8 inches of door inset, and measure from the rear bumper, with a 6" buffer. The car is accurate to within 6 inches of where you want it to stop, and as you said, it gave you a *good* place to stop, before you hit the door. Seems like the right thing to do.
Ive been wondering this for a while!!! Thanks Chris
As one of the lucky people that Tesla screwed over by taking our money for a load of features they then removed from the car before delivery I’ve had plenty of experience of Vision park assist by now. The bottom line is that it assists you in every situation where you don’t need help and as soon as the situation gets difficult (weather, darkness, confined space) it taps out and either says ‘degraded’ or just tells you to stop. I hope it’s going to improve because there is NO WAY summon (etc) can safely move a car around without a driver with the accuracy the system currently has.
I should add (to balance the negativity) that I think the concept has great potential and COULD turn out to leave USS in the dark ages. The question, as always with Tesla, is whether they will get there fast enough and be able to do it with the hardware on the car that I have.
It leaves a sour taste that we’ve been ripped off with something that doesn’t really work to fund the R&D for people buying some future ‘2025’ or HW5 vehicle to benefit from.
Isn't it the same with Autopilot? The faster I drive, the more I "really" need it as a safety system. But then it doesn't work anymore at 180 km/h. And when my sight is limited during a rainy/foggy night, Tesla Vision has the same limitations.
This is the exact impression I got from this video.
Someone at Tesla has really overthought reversing and entering into tight spaces.
Every car I’ve owned in the past 10 years has had factory ultrasonic sensors that were dialled in to cover virtually 100% of parking scenarios.
That’s all you need Tesla. I need to park my car in tight spaces without hitting anything.
There is no need to reinvent the wheel here.
I prefer the USS but having the backup camera sort of makes it a mute point. Yes, vision is inaccurate but you can see what you're backing into on the screen. Whenever my car starts beeping, I never look to see how many inches I have, I look directly at the screen. However, not having a camera for the front bumper is probably the weakest link in Tesla Vision.
Yep, and there are rumors of a front camera (we know the Cybertruck has it), so let's hope it's true!
@@DirtyTesla That won't help existing car owners though?
It’s a moot point.
@@davidkaplan5517 thank you. I did not realize I used the wrong word.
@@rubberchicken98 honestly I learned that like fairly recently lol
Great comparison thanks. Our cars in Australia 🇦🇺 still have ultrasonic sensors. However as anyone should know, ultrasonic sensors on any car are FAR from 100% reliable, especially with posts etc. So often I have seen people whinge about this when they hit one, when in fact the problem is the nut behind the wheel.
Somehow I survived driving with absolutely no kind of parking assistance without hitting anything for forty odd years, before I had a car with a rear camera, and then a few more years still I owned a car with USS. KIND OF giving away my age there 😅
“Honey come quick, the neighbor is running over random things in the road again” 😂
13:50 are those infrared LEDs above the rearview mirror new? I haven't noticed them before. Thinking they are used for driver monitoring for FSD, but they are on now, when FSD is not engaged, interesting
Yeah I wonder if that's a HW4 thing.
11:15 I think you're missing that you can put a trailer hitch on the car, which will extend the car. I think 15 inches could actually be the length of the trailer hitch :)
in machine learning, this class of problem is called "out of distribution" failure. The sensor could be detecting the objects, but the model doesn't know how to classify the object. Basically, the training data didn't have data like it, so the model can't handle it. Training neural networks is mostly about the quality and quantity of the training data. If you have 1 million photos of cars and roads, but none with a specific object, it will still fail. The main way to fix the failure is to add more diverse types of objects and retrain the model. This is what computer scientists call "data distribution".
The way neural networks work today (aug 2023), there isn't a way to analyze the model weights and prove definitively the model is doing what we "believe" it's doing. The fancy name for this is model interpretability. Since we can't interpret the model weights today, people just add more data. The catch is, adding another million pictures of cars doesn't actually make it better. What you need is high quality photos of things a kid might put in your drive way or things the model hasn't been trained on.
Even if you were to send Tesla all of the data from FSD system in your tests, they would need to spend months to figure out exactly what the sensors detected and why the models failed.
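The out-of-distribution failure described above can be demonstrated with a toy nearest-centroid "classifier" (the 2-D features and centroid values are made up for illustration): without an explicit reject rule, the model confidently labels any input, even one nowhere near its training data -- which is roughly what happens when a camera net meets a step stool it was never trained on.

```python
import math

# Toy feature centroids "learned" from training data (made-up values)
CENTROIDS = {"car": (2.0, 2.0), "road": (-2.0, -2.0)}

def classify(features, reject_beyond=3.0):
    """Nearest-centroid classification with a crude OOD reject rule:
    if the input is far from every training centroid, admit ignorance
    instead of forcing the nearest label anyway."""
    dists = {label: math.dist(features, c) for label, c in CENTROIDS.items()}
    best = min(dists, key=dists.get)
    if dists[best] > reject_beyond:
        return "out-of-distribution"
    return best

print(classify((2.1, 1.9)))   # → car (close to training data)
print(classify((10.0, 0.0)))  # → out-of-distribution
```

Without the `reject_beyond` check, the second input would still come back as "car" -- the model has no built-in notion of "I've never seen anything like this," which is the commenter's point about needing more diverse training data.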
Always living up to the channel’s name 💕
Thanks for your video it helped me understand the difference between the New Tesla Vision and the older system. The only thing I realized before seeing your video was that with Tesla Vision I do not have Summon and Enhanced Summon and Park Assist that I/we paid $6 grand for, and have no idea when we’ll get the SW update to provide us these advertised features. Thank you.
Parking a Model X forward into a garage with white walls and no "stuff" lying around, and it thinks I hit the wall while I'm still a foot away.
Another disadvantage of the missing USS on the MX is that the doors only open a little bit when you approach the car, open from the app, or open from inside the car, and no longer as far as possible. With previous MXs you could walk up to the car, the driver side door opens automatically as far as possible, you get in, push the brake, and the door closes again. With the current MX, you have to pull open the slightly opened driver side door.
dude is out there hitting objects with his car for science
Thanks for taking the time to compare these features. No doubt HW4 cameras bring better visuals, but they cannot stand alone. The ultrasonic sensors win on proximity, even if you see the object from afar you do not know how close you are to the object when you or the cameras cannot see it. A real world scenario is when you're approaching a smaller retaining wall not visible to the driver and the cameras when you reach a certain distance. I was considering buying my first Tesla, but will wait until this is sorted.
You are a brave scientist, Chris. I could see the reluctance on your wife's part to drive close to you. She is a keeper!
We have a 2022 Shanghai Model Y with radar parking sensors in the bumpers, but it does not function as your older car. Our system does provide side distances as well as front and rear, presumably combining information from cameras and the radar.
I have the same model, I thought (but may be incorrect) actually that with the software update for Tesla Vision, it actually stopped using the ultrasonic sensors.
In any case I recently hit a small wall (about 45cm - 1.5 foot) with the left front corner of the car, and AFAIK didn't get any warning 😥
@@HermanBovens I wonder how we can test whether the ultrasonics are still being used? Yesterday I measured the actual distance to a small barrier around a lamp post, about 0.5m high, made of 5cm tubing, and the reported distance on the screen was within 2cm of the measured distance.
I measured it as this video has made me paranoid about hitting stuff!
This is usually how I reason when I park: if I'm having trouble parking in a spot because it's tight, then regardless of the car's systems, any damage to my car is statistically more likely to be caused by another car than by me trying to park.
Conclusion: if the space is too narrow, find a bigger one and avoid getting damage from a third party, because that's more likely.
Better accuracy of the car's systems, i.e. USS, will not change this fact.
PARK SAFER, NOT CLOSER!
That exact thing happened to me. My wife put the bag of swim stuff in front of the car and I couldn’t see it from where I was, I tried to summon and it wouldn’t come.
It should be able to display the 360 overview, as Elon promised several years ago now; that would be much better in those tight parking situations. My 2011 car has this, so why Teslas don't is quite strange
Rumored for Highland. Need a front bumper camera.
The only thing scarier than you being run over was denting your garage door. :)
Both systems should react EXACTLY the same as long as the object is detected (e.g. visualized as a trash bin) because the reaction (after detection) is software driven and completely independent of hardware. Both should react precisely the same to the "garbage bin". The only difference should be in the detection phase.
As expected, the ultrasonic system is more accurate for close distances for a flat surface. No surprise there.
I'm very impressed with the fact that Tesla Vision performed so well. It makes me less annoyed that I will never get V4 hardware. 🙂
Dont worry, by the time HW4 is actually working and doing better than HW3, they will announce HW5.
Not having USS is such a deal breaker, I will just get a used 3/Y
Can Tesla change those calm sounds? we are almost hitting objects and all we hear is " ding". BOOM!!! ...
I cannot believe they took away the ultrasonics, and that the cameras don't use unfiltered IR... software-controlled vision... come on, T-Engineers...😅❤
13:45
Very interesting to see the interior camera scanning at the top. I heard the new system watches where your eyes look and how much you blink and such, and other cars also have these types of sensors for this detection.
Sad to see another nail in the "HW3 can get retrofit" coffin, but at least this sensor suite can be used to enable hands-free driving in the right conditions if done correctly.
I mean, a parking sensor is there to show you the distance to an object, not just say "stop" even though there is tons of room. If you want to know whether you can make the turn, or whether you have to reverse one more time while you park, Tesla Vision is useless for this because it just tells you to stop with half a car length of space left. I find it really disrespectful of the customer to just remove USS from the car, when they could easily have integrated Tesla Vision features, like the detection on the sides, into the USS system to make a more powerful system than either of the two could be alone. Nevertheless I appreciate your work and thanks for all your videos!
thanks for making this video!
It needs light for the street, but uses infrared to monitor occupants
Bless you sir for all your hard work.
For tight garages, nothing beats a parking pad glued to the floor and rubber bumpers on the walls. Some people in the comments are too stressed about things they have absolute control over. For novel situations and areas, i prefer a little conservatism.
13:24 -- What are those two flashing lights up top??
Everyone is overcomplicating this. They are simply using their "Drivable Space" vs "Non-Drivable Space" logic + the Occupancy network for distance estimation. That's why it shows up until the curb and not the poles. Distance estimation is worse, but reading weird shapes is better. Simple
Good video. I think there are a few salient points. First, although the Tesla with ultrasonic sensors did not do as well as expected, it would be easy to conclude that Tesla Vision is almost as good as ultrasonic-equipped cars. This would be true for Teslas, but certainly not for cars that are not Teslas. I've had cars going back over a decade that perform better than a Tesla with ultrasonic sensors. This points to Tesla's inability to write software that works well for the scenarios noted.
Second, Tesla should remove phantom features from their sales tool. Autopark, Summon and Smart Summon are not available on Tesla Vision vehicles. Implying they are, only to have some legal disclaimer that wipes out the marketing lie, is really a testament to their ethics in my mind. I'm surprised that the FTC hasn't gone after them.
Third, Tesla is really a reckless company when it comes to changes. Unlike most responsible companies, Tesla’s approach to changes is absurd. The “right” way to behave when they made the choice to remove ultrasonic sensors (and radar) would have been to provide feature parity before the change was made. Instead, their approach was to remove it and then make empty promises that the features would return “soon.” They haven’t. It’s clear what happened here. Tesla wanted to save money and removed parts, and they didn’t really care what impact it would have on customers. I believe their approach has impacted the safety of the vehicles as well; it would be interesting to see a reputable company re-test Tesla Vision cars to see if they arrive at the same conclusion as it relates to safety. (Collision avoidance, for example.)
Someone else posted that one should buy something for features that it currently has, versus those that are promised. Bingo. Reliable software that does what it is supposed to do is not Tesla’s strong suit. The software that really matters (drivetrain control) works well I believe. But anything outside of that is a crapshoot.
People cling to the Elon Musk puffery for some strange reason. Anyone who believes what he says is probably in line to buy prime ocean-front Nevada real estate knowing that the “big one” is going to hit California thus making it sink. The Tesla software playbook is vintage early software companies-promise functionality to make the sale and hope that it can someday be delivered.
I think it would be cool to back towards the garage door and then open the door. Maybe both while in the car and with the car asleep to see what it says. USS should obviously see the door disappear, but does vision?
My guess is that vision would incorrectly say the door is still there if you open the door while the car is asleep and don’t move.
How well does it work with a dirty rear camera? Winters are going to be terrible for the back up camera and estimating distances.
It doesn’t. You see a lot of the ‘Park Assist is degraded’ message.
My wife's Porsche has a bumper cam. I wish Tesla would add that.
Maybe put some foam on the crate before you assault your clear coat so much but looking at the side of the car and the name of the channel - Great Job!
Now if they can get FSD to stop changing lanes back and forth like pong.
The distance count down is not to the object it’s to the stop line, so it’s always adding a buffer.
We have one Tesla with and one without. I personally find the vision-based assist to be worse than nothing, basically just giving you incorrect information 100% of the time.
Same here, I turned off the parking beeps as they just distracted me. I get being conservative but I can tell when I'm three feet from something on my own. Inaccurate parking sensors are worse than not having them. People say Tesla's have great tech, but honestly it's got some important gaps that almost any other car made in the last five years can do well.
They removed USS too soon. They could have run with both, training TV against the USS readings on the car. Then, once they had the two performing similarly, they could have stopped putting in the USS. I think they jumped the gun here.
By the way, side USS is possible; in Europe, €20k cars have that, with a 360-degree top view
I get the side lines with my 21 and 22 M 3 LR cars.
Never saw those 2 infrared lights there on top of the rear view mirror 🧐
..and I don’t remember someone ever talking about it.. 🤔
Ultrasonics do warn on the sides, at least on my 2021. There are ultrasonic sensors pointing straight left/right just in front of the front wheels and just behind the back wheels. So you do get ultrasonic alerts for the sides of the car but only on the front and back edges. That is enough: you don't need ultasonics, for example, halfway down the side of the car because by that point, you've already driven past the sensors on the front/rear and they've already warned you as you drove past the obstacle(s).
I'm confused by this test... It's my understanding Tesla disabled the ultrasonics on existing vehicles that still had them, so long as they are on HW3... i.e., aren't both these cars just relying on Tesla Vision at this point, regardless of whether the sensors are installed? Or are they still utilized, but only for parking purposes?
FSD beta doesn't use USS anymore (it used to) but USS are still used for park assist
Thank you for your thoughts and the testing. My experience with a 2023 Model X is very different: when I approach obstacles, the car absolutely overreacts and often tells me to stop 1 meter (3.3 feet) or more in front of them, which isn't really helpful at all. On the other hand, it often completely fails to detect lamp posts.

Also: I really, really dislike the curb detection as it is. Tesla should display curbs differently (maybe a gray line?) or not at all. What's the point if I need to park so that the front of the car is above the curb (a very, very typical situation here in Germany!) and the car warns me because of the curb but fails to display the wall/tree/post behind it instead?

The test with the small object in front of the car isn't so funny when you realize that a typical parking situation in Europe includes people in front of you or behind you moving their cars (parking in a line at the side of the street), and the distance between bumpers is often only 50 to 60 cm (roughly 1.6 feet). Then I come back to the car and it shows the OLD distances in front, which can be too close or too wide, depending on what happened while I was away from the car.

What they should have done is add a "nose" camera, which, frankly, every other car in this price range has. Also: the Model X previously had ultrasonics on all four doors (and of course front/back and even the roof), so my 2018 Model X detects obstacles all around the car, very precisely, but the new one is just a mess when it comes to parking. Please realize: European parking lots are a lot smaller. Sometimes, when I drive from floor to floor, the distance between the front of the car and the wall is only 30 cm (about one foot), so if the sensors are off by around a foot, that's all it takes.
That hum in the garage is interesting, why was it so loud? Also congrats on 100k!
The experience was better with the USS. I wish they would bring them back. The cameras give warnings, but they're not accurate enough in tight situations. IMO, that was a bad move by Tesla.
I did similar tests with a Model 3 last May. Same kind of results: the distances are underestimated, which is at least erring on the safe side.
Given the profit margin on these cars, it’s too bad that they don’t use both vision and USS.
I agree
I was unaware that the new Y's have the IR driver monitor camera
To come to a conclusion: Tesla Vision is just garbage for parking. All the features about surroundings are nice to have, but they could just have added them on top of the USS. They won't solve it, especially when you have, for example, a white wall where the vision system will struggle to find the right reference points.
Measure from the concrete threshold in front of the garage. Vision also measures to parking border lines, something the ultrasonic can't do.
Thanks for making the video! I was trying to find videos about the visual parking vs the normal sensors that you would get in a 2021 and below! As a 13-year-old myself, I will probably get a normal gas car, then I'll move to a Tesla!
So you're 13 now, but by the time you're old enough to drive, Teslas will be much more affordable than gas cars. They're already at price parity today. They're also safer. Park assist will be even better and FSD with hardware 4 will be in full effect.
It's interesting you used the term 'normal' gas car. Sooner than you think EVs will be the new normal. They're already the norm for me and I am 55 having driven gas cars for decades before getting a Model Y.
Curious, why might you still be considering a gas car over a Tesla as your first car?
@@RobertLoPinto I might get a gas car because if I go car shopping In like 2 years. Tesla might have different prices depending on the model
You made the point: USS are far more reliable than Tesla Vision (for now; we hope for the best in the future). In every test, USS was accurate to within 1-2 inches, whereas vision was far off. That's hopefully protective, but far too conservative when you have to park in tight spaces as in Europe. In crowded cities it gets very hard, with no aid. USS used to work like a charm whatever the weather, night and day.
The side detection does not really help; there is no measurement, and the line is always red, even when far from the curb
Very interesting test.
Would be cool if Tesla supported a measuring label (like a barcode but with larger bars) for more accurate distance measurement. Land Rover uses one for towing a trailer. Tesla could use it for more accurate distance measurement in tight garages and tight parking spots.
As Sandy Munro says, FLIR sees everything, even if you're driving through thick fog/smoke and bad weather. Highly consequential when looking at camera based FSD safety and precision!
get that man a set of mud guards!
Why did you blur the screen when you stood in front of the car?
Great job.
I suspect that the front ultrasonics are calibrated for the optional front license plate holder, which adds a couple inches when mounted.
The curbs are really accurate! but other stuff need more training.
19:13 was it giving distance to the ground objects? You didn't measure to the bag on the ground.
There's no reason the two can't work together: ultrasonics for the front and rear distances, vision for the sides. I think the vision system could do a better job of outlining the drivable space, like FSD does, which would make it clearer what the dings are for (curb vs. object just past the curb)
But, there's plenty of room for improvement for the Tesla Vision system as well.
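To illustrate the fusion idea a few commenters keep raising: a minimal sketch (hypothetical function, not Tesla's actual stack) of a conservative fusion rule that trusts whichever sensor reports the closer obstacle, so the car warns based on the worst case either sensor sees:

```python
from typing import Optional

def fuse_distance(uss_cm: Optional[float],
                  vision_cm: Optional[float]) -> Optional[float]:
    """Conservatively fuse two distance estimates (hypothetical sketch).

    Either sensor may return None when it has no reading (e.g. USS
    missing a flat object, vision blinded by rain or darkness).
    For safety, report the *smaller* distance, i.e. the closer obstacle.
    """
    readings = [d for d in (uss_cm, vision_cm) if d is not None]
    return min(readings) if readings else None

# Example: USS says 30 cm, vision says 45 cm -> warn at 30 cm.
print(fuse_distance(30.0, 45.0))
```

The real benefit would be that each sensor covers the other's blind spots: if USS misses the folding chair but vision sees it, the fused estimate still flags it.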
Ummm… I use Autopark all the time on my M3SR+. So what is this I'm hearing that it doesn't park itself? Because it definitely does.
My experience with my 2023 Model Y is that Tesla Vision does NOT work in the rain.
I haven't been able to find this information:
Will "older" cars with ultrasonic sensors also get Tesla Vision, since they have the same cameras?
Or are they software-limiting it to cars without USS?
A combined effort (USS+Vision) could yield great results!
I believe they turned off the ultrasonics on my FSD 2020 Model Y
Are you sure a recent software update didn't switch you from ultrasonics to Tesla Vision? Also, on the measurements: the rear bumper sticks out further than the camera, so I think you should have measured from the bumper. Great experiments.
USS isn't used in FSD Beta anymore, but it is still used for park assist if your car has the sensors.
Like, why not combine the two? The car's expensive already.
Clearly with/without Tesla vision and ultrasonic have their pros and cons, and I believe this can be actually perfect if combined.
I'm surprised the USS didn't pick up the smaller items when driving forward. Seems like they pick up curbs and parking blocks.
They definitely pick up the curbs (shown later) but I think that chair was just way too flat.
Curious: couldn't they strategically use UV LEDs so the vision cameras can see, yet to humans it would appear as if no light is being emitted, allowing vision to work well even in the dark?
I essentially agree, but at the other end of the visible light spectrum. Infrared illumination has been used effectively for vehicle reversing cameras for at least 20 years now.
Tesla was not thinking about consumer experience when they removed USS. They were thinking about parts shortage and cost cutting.
Disappointing move!
Great video. Could we get an updated video with the new software?
Literally just bought the 2023 Model 3 and am regretting it massively, as we bought it for the safety.
Safety is still sky-high on these vehicles. You made the right choice. Check out the new high-fidelity park assist in the holiday update.
There are ultrasonic sensors on the side of the 2021 non-vision Model Y. They're next to the wheel.
Objective test... well done. Still room for improvement, but somewhat useful. With the new Highland refresh, I expect Tesla Vision will improve, as it doesn't come with USS or a front bumper camera.
I think if you had the front license plate mounted on the USS car, the measurement would be dead on.
You are measuring the distance to the garage incorrectly on the vision only car. The measurement should be from the bumper to the garage. Not the inset area of the hatch lid. The bumper is the closest part of the car to the door. I have measured my 2022 Model Y with ultrasonics and it is surprisingly accurate. However, I backed into a Tesla urban charging space once where the chargers are not directly behind the car. What’s behind the car is a “Tesla charging only” sign on a skinny pole. The ultrasonics cannot see a skinny pole at all so one can back into it. One more thing. I get side warnings on my ultrasonic sensors. Every time I leave my condo garage they go off because the door is rather narrow and there are bollards mounted on raised concrete blocks on both sides which show up as cones on the screen.
Your ultrasonic sensors are on the bumpers, so measure from them.
Also, isn't it normal for it to tell you to stop earlier, depending on the inertia of the car?
For USS I was measuring exactly from the sensor (and from a few other spots, just to check).
You know what I don’t understand? With all this tech, why does the Tesla even allow you to hit something? In the garage door example, it would allow you to keep reversing into and through the door. Shouldn’t the vehicle refuse to move knowing there is an object in its direct path?
It's a difficult balance. What if it thinks there's something there when there isn't, and it stops you for no reason? At some point human input has to override the computer's decision. But the car does have the ability to stop you from hitting things, at least temporarily. Look up Obstacle-Aware Acceleration, which can stop you from running into things. The car can also slam the brakes for you if you're about to back into another car or a pedestrian without noticing.
Lucky for us in Australia, we still get USS on our Model Y. Our 2023 Model Y, delivered two weeks ago, has USS. It would appear from your video that a hybrid solution using both USS and vision would be best.
Thanks for the video and the testing.
Seems to me that they should have left both radar and USS in the cars and done vision as well. Each of the technologies has its strengths and weaknesses: fog, low light, an unlit garage, etc.
I would not care about the side view that much. If I want to get close to a curb, I just LOOK at the screen and the camera feed, or use the mirrors, just like I use my side mirrors now to see where the curb is.
What about phantom braking? Is it more of an issue with vision only? Thank you.