This video was sponsored by Brilliant - Sign up at brilliant.org/AIDRIVR/ for a free 30-day trial + 20% off for the first 200 people!
I should have listened to your warning. FSD visuals forever ruined for me. TSLA please fix.
The Tesla footage with top down real life footage is awesome. can't wait to see this more in future videos!
Yeah! Great work!
Don't be fooled by sponsored videos. The thing sucks when the weather isn't perfect, and especially at night. A few cameras are not enough.
@@sunrosec Nope. Don't spread b.s. It works just as well at night as it does in the day. The only time I've had FSD degrade is in very heavy fog that is dicey for a human to drive in.
@@sunrosec This isn't even sponsored by Tesla. How in any conceivable way would a completely different company sponsoring this lead to bias?
@@sunrosec The video is sponsored by Brilliant ... not Telsa. What's the point?
glad to be stuck with you! 😂 thanks for staying!
WOW... just WOW! The drone footage perfectly above the car... even just that is amazingly done, but OVERLAYING FSD visuals... BRILLIANT! Great job! I'm sure it was a TON of work.
Glad you enjoyed it! Took a while to get right but I know what to do to make it easier next time
Can't wait until you get FSD 12 :) You always go into so much detail covering every aspect.
Wow, these visuals of the car's perception superimposed over reality are stunning! No one else conveys this like you do. Keep it coming! I cannot wait to see v12 continue to improve this. I'm waiting on my v12 download!
Dude this video is KILLER. I love these. I'm not even a real Tesla fan or Elon-stan, I just genuinely enjoy the tech and your testing of it. Can't wait for more!
Wow, this is the gold standard of high fidelity parking assist videos. No one else has come close. Thanks.
Except for Rivian, they "hit it spot on". But that's a problem...
You're joking right?
And it is totally useless. There are a shitload of other solutions that actually work. Rolls-Royce representatives were once asked why they do not invest in and develop driving aids and self-driving for their cars. They politely replied that they solved this problem 100 years ago. It's called a personal driver: he gets you anywhere safely in any conditions, parks and picks you up, does not need lanes or other road infrastructure to operate, and works 100% anywhere in the world.
@@MonkeySmart12325 Sorry that you find it totally useless. I find it somewhat useful, although it could be better of course. I think parking assist is a more viable option for 99+% of owners rather than hiring a driver.
@@BobbieGWhiz You did not get the point. The key problem with Tesla is that they try to reinvent the wheel instead of using it. Instead of making sure the solutions they have work 100% of the time, they deliver half-baked solutions that work occasionally, or only if certain favorable conditions are met.
That top-down editing was really helpful and truly shows just how accurate the Vision system can be, while also pointing out exactly where it currently has weaknesses!
It almost looks like even Tesla isn't using this kind of methodology to check the performance.
I really appreciate when someone puts this much work into such a short video, excellent job! Although I would watch 1 hour videos from you anytime :)
Fastest notification click ever, this is the test I’ve been waiting for. Aerial view compared to the 3D reconstruction, this is the only thing I wanted to see and nobody did this test, THANK YOU!
3:59 This is likely less about "memory" than about uncertainty over whether the curb is an object that might move. So as the curb passes out of view, FSD has to ascribe some chance that the curb has moved since it was last seen. FSD clearly has a degree of "object permanence", but much less certainty in predicting what occluded objects are likely to do.
Good point! Haven't thought of that myself
I'm pretty sure the reason for this is that the car doesn't have a bumper camera looking forward, from afar you have the windshield camera accurately calculating, but as soon as you get close it loses visibility and is just an estimation.
@@Juan.Irizar Lack of camera is _why_ the curb goes out of view. This happens to humans all the time, but we know that curbs do not move and have a pretty accurate estimation of where it is and how close we are. Eventually FSD will be able to do this as well, but right now once something is out of view of any cameras, it becomes increasingly uncertain about whether it is there and where it is, which is what we are seeing in the visualization.
@@GreylanderTV Why not just assume that anything that's not a human or doesn't have wheels has practically zero chance of moving? Why does it have to assign a chance of moving away to every single object it sees? The only objects that move so easily during the same drive are lighter objects like plastic bags and paper, which are not a threat to the car anyway.
@@baab4229 Of course. But this is not something you just program into a neural network. The NN must _learn_ even the most basic facts about the 3D visual world. This takes time and data.
FSD has the inference-time compute power of something like a squirrel brain, while the training server clusters have the computing power of maybe a chimp, which is what is responsible for the _learning._ Think about how long it takes a human baby just to learn object permanence (about 8 months). Then 2-3 years to get all the basic physics of what objects are and how they behave.
Also note that "practically zero chance" is not _zero_ chance. You don't want self driving cars rolling dice on getting into accidents. But as you can see from the video, it completed the maneuver, so it was certain enough that it would not hit the curb. Just not as certain as a human, so we see the curb "smear" in the visualization.
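That growing uncertainty about an occluded static object can be sketched as a toy model. Everything here is an illustrative assumption (the function name, the constants, and the exponential form), not anything from Tesla's actual stack:

```python
import math

def position_uncertainty(seconds_occluded, base_sigma=0.02, growth_rate=0.5):
    """Toy model: position uncertainty (meters, 1-sigma) of a static object,
    growing the longer the object has been out of every camera's view."""
    return base_sigma * math.exp(growth_rate * seconds_occluded)

# While visible the curb is pinned down tightly; after a few seconds
# occluded, a planner would have to leave a wider safety margin around
# the last-known position -- which is roughly the "smear" in the visualization.
for t in [0, 1, 2, 4]:
    print(f"occluded {t}s -> sigma ~ {position_uncertainty(t):.3f} m")
```

Under a model like this, "completing the maneuver anyway" just means the inflated margin still cleared the curb.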
Happy you're sticking with the YouTube game. Congrats on 100K. It's going to be a great year for your channel given what's happening with FSD.
Amazing work as always you do. Thank you for rethinking the way of showing us the features.
Agreed. amazing test of the feature.
Can’t wait for your videos on FSDv12. You make some of the best videos to assess FSD performance.
Appreciate it! Tell Elon to send it to me pls
Your editing in this video is crazy good- well done.
I appreciate it!
you always have a fresh way of looking at this stuff. Production value is insane. Thanks
I appreciate that, thank you!
Damn, this overlay footage is amazing. So great to see it like this. Keep up putting out these amazing and original videos!
Thanks for another good video. Comparing the screen to an actual drone shot is genius
4:42 isn’t that because the front camera is above the windshield meaning it has a very significant blind spot in front of the bumper? The lack of ultrasonic sensors makes it even worse. This is a hardware design issue and something the software simply can’t fix.
Literally all i can say is, thank you. This is the perfect video to display this feature!!! Amazing, will gladly stick here with you!
Thank you for watching and sticking around!
Why isn’t Tesla’s AP team calling you to provide you with V12, like they are with Omar from the Whole Mars Catalog? Your analysis of FSD is light years better than anyone else on social media.
I've been staring at my phone waiting for the call that hasn't come :[
Really impressive overlay of the real with the virtual, makes the comparison easy to see.
This is by far the best analysis of the new park assist I have seen. Thanks so much for sharing.
Thanks!
Thank you so much Robert! Truly appreciate it
@4:05 the car thinks the shadow of the curb is the curb when it "reupdates"
Again and again you create the best content regarding a topic. Everybody just places boxes here and there. Your idea and implementation was brilliant. Thanks 👌👌
Your testing methodology, editing, and analysis is simply the best there is! Super looking forward to V12 testing!
“This is the worst it’s going to be”, if that was true we’d have working auto wipers by now.
Soooo glad you’re not quitting! Keep up the great work! ❤
My Tesla cameras are affected by rain, snow, ice, dirt, salt, condensation in tunnels, and even bright sunshine. That doesn't leave many days with weather where the cameras are usable. So as impressive as the video here looks, I'm wondering if the tech is effectively unavailable most of the time in half the countries on the planet. Tesla is removing the manual controls that are still vital for driving the car yourself, while the camera-only solution means the car won't be self-driving either for much of the time people would usually keep the car.
Your editing on these videos is very impressive.
7:19 “worst this is ever going to be.” Generally yes, though the history of isolated regressions has been acute enough for some people to disagree.
IKR. The next update could be a buggy mess!
The memory issue is spot on. I pulled out of my driveway and had my garbage can at the curb. I brought it in the garage and moved the car. It thought the garbage can was still there.
Bro, did you wake up one morning and go “I’m going to make the best freaking tesla videos in the world”. Drone overlay with high fidelity was genius.
I am soo happy I found your channel 1 year ago !
These videos are so well planned and executed! Thx!
The real "high fidelity" tech on display here is your drone footage overlay! Amazing stuff!
crazy good visualization method
removing ultra sonics and radar was a mistake
Great work on this video. The time and effort you spent truly shows. I hope the tesla team sees it
the drone overaly is sick nice work
Holy cow! The top down viewpoint super imposed on top of the virtual one is so frikin cool! Keep up the good work!
Second thought, definitely hellishly difficult I’m sure, but imagine if someone could do that with the regular FSD view.
I’ve given it a shot before, I can make still frames look pretty good but while the car is moving it’s nearly impossible
@@AIDRIVR Sounds about right. Just keep doing what you're doing then! You are easily my favorite content creator when it comes to FSD beta. I love your videos and I look forward to you getting V12
@@patrickday3393 Me too my friend, me too
Ultrasonic and 360 camera on my Volvo works very well. And yes, it can detect curbs, and the camera makes it even better. That outperforms this from Tesla.
Why?
Because it is not guesstimating. It is closer to reality. And ultrasonic also works in pitch black, and even in very low contrast, like a snow wall against snow. And they also work even when dirty.
Love the setup. So cool. Awesome video!
People tell me to drive forward into spots for best Sentry recordings, but the cameras are so much better when reversing in (and easier when exiting).
Glad you're sticking around 👍👍👍👍
That is a BRILLIANT way to show how accurate it is. Thanks for all that hard work. I have ultrasonic but it's not that good for me where I live as I really need more accuracy. If the new vision can get accuracy to within 10cm then that will be amazing.
Phillip.
Man, your voice is just so calm, one of the chillest YouTubers I follow. I didn't feel the need to skip the sponsor segment because it didn't disturb me, unlike on some other YouTubers' videos. Keep up the amazing work and don't you dare leave YouTube!
Congrats to the 100k. Keep up the good work. I love your channel.
That was amazing. Really cool how you lined up the visualizations. Well done.
As always, your use of visualization is consistently innovative and super helpful. Well done, man
Great visuals. You always kill it with the unique viewpoint.
Excellent concept and execution here. Ive never seen this done before. Really helps us to see what its doing. Keep it up!
When I test drove a Model 3, trying to park in our garage was just horrendous. The visualisation was off by so much and I had to be waved in (since I did not know the dimensions of the car and did not want to damage it). I wonder how this would do in this scenario (nosing into a narrow garage), since the ultrasonics on my current car feel way more reliable
I park in a narrow garage every day. It's awful. I can't rely on it at all. I'm stuck using my mirrors... makes me so upset every time. I truly can't believe I'm stuck with this car for 3 years.
@@geekgrrle Getting rid of ultrasonics was a mistake, and yet they doubled down on it
0:42 - I have noticed that in my Model S, ground markings when driving (like for lanes that are merging to the left) are no longer shown. I wonder why that is?
Congrats on the 100k subs! Awesome work on the overlay too!
"With time it will get better, hopefully" is the most used sentence in these videos. What about the windscreen wipers? Are they usable now? Or will they get better?😂
Sucks that everyone is quitting, but it's great to be stuck with you!
Great work! Love the effort you’ve gone into on this video! 👍👍👍
I agree there are so many areas this fails at, I just reviewed the Highland and the parking was a 6/10
Version 11 tests are not pointless. I think they are an opportunity for even more videos, because you can set up some version 11 tests specifically for comparison with version 12 after its release last November.
Yeah this is a good point. See if you can get some footage of tests that you can compare against directly when you get V12
Excellent video. No problem on the lack of FSD videos. Looking forward to future version 12 vids.
Drone stabilization is really impressive
Whoa....There are 2 S's in stabilization? That is cool and good to know. Thanks!
Yep, my English sucks @@jaystarr6571
Great footage and edit, thanks - happy to wait for V12
The lack of sensors means this will never get enough accuracy to be useful. Cameras only isn't up to the task, and this is why Tesla is at the back of the pack when it comes to FSD.
Tesla is the king 🤴 duder
Not trying to make an ass out of myself here. I'm kind of a Tesla fanboy, since it's the only information I have been exposed to. But what other production car can I go purchase right now that gives me access to FSD? FSD is in my dream car, and up until now that has been Tesla. You seem pretty confident... change my mind.
I hope this improves quickly, it's disconcerting when it doesn't display some vehicles and everything jiggles around
6:08 - _"In my opinion all it really needs is a memory of what it's already seen"_
Easier said than done... what you said brought me back to 2018-2019 when I was working with computer vision and "all we really needed was a memory of what our models had already seen". We never solved that problem with the model(s) we were using ;)
This is actually a pretty cool system! With a few tweaks to how it's rendering the environment, you could make it work pretty flawlessly. One thing that stuck out is how it handles static objects. If it knew something was a line or a curb within a certain perimeter of the car, it would not need to keep updating it, but could instead just use reference frames. The cone is a good example: it knew a cone was there, but once it lost sight of it, it deleted it from its surroundings. If it locked in the position once it hit a high enough certainty level, objects like that would not be an issue.
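The "lock it in once certain enough" idea could look something like this toy tracker. This is a hypothetical sketch; the class, threshold, and numbers are made up for illustration and have nothing to do with Tesla's actual implementation:

```python
from dataclasses import dataclass

CONFIDENCE_LOCK_THRESHOLD = 0.95  # made-up cutoff for "certain enough"

@dataclass
class StaticObject:
    kind: str          # e.g. "cone", "curb"
    position: tuple    # (x, y) in the car's local frame, meters
    confidence: float = 0.0
    locked: bool = False

    def observe(self, position, confidence):
        """Update from a fresh detection; once locked, keep the estimate
        fixed (dead reckoning) instead of re-estimating every frame."""
        if self.locked:
            return
        self.position = position
        self.confidence = confidence
        if confidence >= CONFIDENCE_LOCK_THRESHOLD:
            self.locked = True

cone = StaticObject("cone", (2.0, 0.5))
cone.observe((2.1, 0.5), 0.97)   # confident detection -> position locks in
cone.observe((9.9, 9.9), 0.50)   # later noisy / out-of-view update is ignored
print(cone.position)
```

With a tracker like this, losing sight of the cone would no longer delete it; the locked estimate would persist until the car's own motion carries it out of relevance.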
It's all fun and games until someone puts an RC cone in the parking lot... /s
That overlay footage was og. Great work/effort my man
It's unfortunate that the fidelity and accuracy decreases as you approach objects, given that the modeling and visualization are ostensibly for parking in tight spaces.
This is an outstanding video. Well done mate.
Other cars don't just have the stitched top-down view. You can also switch to viewing individual cameras, without distortion or fantasies. And also - yes, there are dedicated curb-detection ultrasonic sensors that are pointing *down* at the curbs and measure distance between your wheel and the curb at sub-centimeter precision.
Imagine if Tesla actually had accuracy like that!
Great video. Side by side comparison is so informative.
I have been driving my Model 3 Highland since late November.
I love the car on so many levels - but I have to say, the parking assist is pretty terrible. When I park in my driveway it always insists I am driving into the walls, despite still having 30-50 cm of clearance. That is when parking front-first, usually.
Generally parking front first gives very inaccurate estimations.
Absolutely stellar job with the camera work!
Much lower ultrasonic sensors paired with this would better handle the cone scenario and maybe even curbs. Plus, I really don't think this system alone would detect a kid playing hide and seek in front of the car.
Damn dude. Great production quality. Very helpful
This is an absolutely astonishing video! Great work!
Amazing video and explanation. Great work
Super useful original and unique content. Subscribed.
Cool view! How did you get the top down view of the real video?
Yes. More cameras and sensors all around the EVs would greatly help. Look into gasless and oiless self running generators. Already proven and patented for many years. Now, combine the self running generators and solar in EVs and everything else. No more plugging in, no more charging stations and no more limited range.😎🇺🇸 Please look up the gasless and oiless self running generators, patents and technologies, thoroughly. Not just a little or not at all. We already have the tech needed to go far beyond our many issues and limitations.
The issue with objects disappearing when you get too close is most definitely because it's vision only; if they merged the two technologies it would've been much more accurate. But yeah, I believe they can still make a lot of improvements to this over time
BIG YIKES with that Rivian crash for sure
@ 1:14 I didn't take your warning seriously, I naively thought I knew every minute detail of the visualisations but had never noticed that... Do you know if it's also true for the non-FSD visualisation? Also congrats on 100k, best FSD channel imo, well earned
excellent comparison and content thank you!
Wow, love the video editing you did here overlaying the drone footage, fantastic! I do worry about how fixable the close-up imaging will be vs. when they had ultrasonic sensors.
This video is awesome! truly my fav FSD youtube channel
I enjoyed your analysis of Tesla Parking Assist.
It's so crazy!!😆😆
And I’m curious about the equipment used to film real life footage. (3:00)
What equipment did you use and how did you secure it on top of the vehicle?
I'm so confused about whether self driving will be real or not. I've seen enough long-form essays disparaging the idea from people with sound reason that my confidence in the idea is shaken. I'm glad you're holding the line, and keeping an objective view on the matter. Cheers!
Love your content man. Congrats on the new play button !!!
The drone view footage combined with Tesla vision was amazing! I would love to see more of this. With pillars, cones, small things, sharp edges, steep roads?? An extensive review would be awesome!
Tesla could also record everything and make a high fidelity 3D model of cities. Or even let us contribute kinda like Waze gps when there’s an accident or road maintenance, etc.
Imagine if someone drew a realistic hole on the road; it could prevent every Tesla from passing by
Okay, so you're going to continue to innovate beautifully, it seems. Chapeau!
Damn fine use of your magic, "How dey do dat?", cameras! Kudos and Glory to ya!
😎👍
That overlay turned out Sick!
5:03 "edge case" i see what you did thar
That's some impressive editing work.
Great video, one of the best on this specific topic. 👏
4:18 Better memory here would be totally unnecessary if it just had a front-facing camera down low... like the Cybertruck. It's amazing Tesla is just totally ignoring this.
Very nice. Did you use a skydio drone?
It shows some fine lines in the cars, which might mean they trained it on lidar data.