Maaan I'm loving these daily long videos! Keep them up!!
Ah man! Glad you like them!
Hey there, I don't own a Tesla (yet). It looked to me like the red hands were caused by the fact that your beautiful car was crossing lanes and had stopped kind of between the lanes at the time it happened.
Looking forward to ordering a Juniper soon. Fan of these FSD videos, keep on supervising ^^ German LEO (Customs) here. Have a nice one
You will love your new Tesla!!! Thank you for watching and commenting!
I’m looking forward to it.
Awesome!
You are in my part of the woods. Nice.
Awesome!
@johnchristopher7697 Just got v13 today. Loving it.
@ Awesome!
SW engineer here. I think the red hands may have happened while stopped because the state of the car's parallax calculations wasn't being held (cached) while not moving, due to an overly aggressive state cache-flushing policy, which is a bug they will fix later. While driving, this state always updates, so no red hands at 80 mph. Impossible to say for sure, though, since I have no visibility into the system.
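To make the speculation concrete, here's a purely illustrative sketch of the kind of bug being described (all names and thresholds are invented, not Tesla code): a cache of depth/parallax estimates whose flush policy evicts stale entries, which is harmless while moving (fresh updates keep arriving) but empties the cache while stationary, since no new parallax can be computed without motion.

```python
class ParallaxStateCache:
    """Toy cache of depth estimates keyed by camera region.

    Illustrative only: models a flush policy that evicts entries
    older than `max_idle_s` whenever the vehicle is stationary.
    """

    def __init__(self, max_idle_s=2.0):
        self.entries = {}          # region -> (depth_estimate, timestamp)
        self.max_idle_s = max_idle_s

    def update(self, region, depth, now):
        self.entries[region] = (depth, now)

    def flush_stale(self, vehicle_speed_mps, now):
        # Overly aggressive policy: while stopped, evict anything older
        # than max_idle_s -- but while stopped, nothing new replaces it,
        # so the cache empties and stays empty.
        if vehicle_speed_mps < 0.1:
            self.entries = {
                r: (d, t) for r, (d, t) in self.entries.items()
                if now - t <= self.max_idle_s
            }

    def get(self, region):
        entry = self.entries.get(region)
        return entry[0] if entry else None


cache = ParallaxStateCache(max_idle_s=2.0)
cache.update("front_center", 12.5, now=0.0)

# While moving, the policy never flushes: state survives.
cache.flush_stale(vehicle_speed_mps=35.0, now=5.0)
assert cache.get("front_center") == 12.5

# While stopped, the stale entry is evicted and never replaced.
cache.flush_stale(vehicle_speed_mps=0.0, now=5.0)
assert cache.get("front_center") is None   # state lost -> possible takeover
```

Under this (speculative) model, the fix would be to hold the last-known state while stationary instead of flushing it.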
Very interesting. And you could be right. When I get the service techs inspection I will definitely let y’all know! Thanks for watching and commenting!
V13 is like baby AGI. Just so fascinating to see.
It’s been amazing! Thank you for watching and commenting!
I think you found the Bermuda triangle of Spring Texas 😂🤣
Hahaha! That parking lot......there is something going on there 🤣
One of your new subscribers. Love your videos!❤️ Very much into Tesla but have to pay off my car first. I might be one of your youngest, I'm 18. I don't live in your area I live in Georgia
That’s awesome young man! Thank you so much! And glad to have you as a subscriber! Don’t hesitate to let me know if you have any questions.
I've found FSD v13 to have a weakness when it comes to exiting parking lots. And I had one of those bizarre turn-then-U-turn sequences leaving a local Arby's the first night I had 13.2.1.
Yeah, I have heard of folks having interesting things take place in parking lots. Mine has been really good, but I am in Houston, TX, where Cybercabs are likely to roll out first, or at least right after Austin. So they are really dialing in Texas to get ready. Thanks for watching and commenting!
Hi JC, you might want to check the recording from the front camera around the time of the "red hands." It could give you some helpful clues.
I'm getting it looked at on 12/27. Hopefully they can fix it. I also got a new software update; who knows, maybe that will help.
now you're just making me hungry (lol)
🤣🤣🤣
JC, your wife's car looks clean to me. I suggest you don't say anything about it, unless you want to clean it😁
🤣🤣🤣
💜💜💜
❤️❤️❤️
hi
thanks for the videos
For the front-facing camera, could you please angle it so your rear view is maximized? Thanks!
Funny thing I get comments about the opposite. I may try to switch up from time to time. Hard to please everyone as you might imagine 🤣
@johnchristopher7697 Indeed, and thank you
I have a theory for the red wheel. Maybe the FSD computer is thinking too much in some places, tripping itself up and overloading the computer... only the Tesla AI team can answer. 🤔
That’s a great thought and I am eager to see what they say. I’ve got appointment on 12/27. Thanks for watching and commenting!
Maybe the car is throwing red hands when it's stopped because it's bored and wants you to play with it 😂
Hahahaha! Maybe I should bring one of my dogs along for it to play with 🤣
Your video is so good. I feel very comfortable because you describe the things around you so lovingly and clearly, as if I am watching a beautiful scenery film with wonderful narration. Your wife's Mercedes, your cousin's Infiniti, or any other cars just aren't attractive to you anymore, because you have the great FSD.
Thank you so much! And yes I love driving with FSD! And thank you so much for watching! ❤️
I thought that was the sun, but yes, it happens at strange moments, and the visualization doesn’t seem to be weird. When you regain control, the visualization is fine. Hopefully, the camera will be fixed with the appointment with Tesla.
Maybe you could display the front camera feed through the camera preview while driving?
Thank you!
Was it just the camera filming, or did it almost make you rear-end that truck and then throw up the red hands? Seemed like a close call. I've been watching your videos for a while because I love them, and that's the only time I've wanted to yell "Watch out! Stop!" Keep doing your drive-arounds, it's fun to see the area. I used to live in the Clear Lake area.
Thanks for watching and commenting! You are correct that it definitely came in a little hot, since that queue was longer than normal. It never felt out of control at all, but it could have been, since we got up so close behind the truck in order to keep our rear end out of the other lane.
@johnchristopher7697 Maybe not even just because it was so close, but did the car sense that it was in an unsafe position because the rear end was technically still sticking out into the other lane? Just a thought on that circumstance. The parking lot really made no sense, other than that it was the end of the navigation.
Can we have the map on full screen with the FSD view at the side? It's a lot better like that.
I have had several people tell me the exact opposite: that they like that I don't do that, and it's one of the reasons they prefer my videos. One day I'm going to figure out how to grow 20 arms and legs 🤣🤣🤣 But thank you for watching and commenting!
@johnchristopher7697 Well, maybe you can keep swapping them over; that way it's the best of both worlds.
I like your long drives at the moment. I've been interested in Tesla since I saw the Model S arrive here in the UK and got fascinated with FSD, and years later I have watched it improve and improve until now, so keep up the long trips.
Also, for these long trips, maybe you could talk about your experience as an ex-Army guy and give some background to help viewers get to know you better.
2025 Y owner here, looking forward to 13 version.....
It will be soon, since you have an AI4 Tesla!
Seemed like kind of a close call with that Ram when it gave the red hands, but maybe it was just the camera.
We were good, but I agree we got up very close to the truck so as to keep our rear end out of the other lane.
I think you got the first red hands because that massive truck was taking up most of the camera's vision and looked like a grey wall. Also, you were partially in the lane, so the AI system was unsure of the situation.
That definitely could have been the case. I will definitely let y'all know what the service techs say. Thanks for watching and commenting!
v12.5.6.4 seems impressive, but I get a little uneasy at times.
I’m actually on v13 now!
It could be the sun was bouncing off the Dodge Ram that was in front of you when you were at the traffic light. I don't know, it could be.
That’s a good point! At this point I’m just going to wait to see what the service techs say. Thanks for watching and commenting!
The first time I watched this, I thought you were being too overconfident by sitting backward in the passenger seat :-)
🤣🤣🤣🤣🤣🤣🤣🤣
You sound like Elon to me :-)
Sound like Elon? Elon is from South Africa, not south Alabama 🤣
Retired software engineer here, former founder of an AI startup, 25 years building large and small applications for the enterprise, with a machine learning and AI research background.
Here's what's likely happening with "red hands" takeover notifications.
You have to remember that the current system is now mostly an end-to-end neural network. Meaning the photons coming into the cameras, coupled with the acceleration and velocity data captured by the vehicle, are fed as input into the neural model, which is itself actually a composition of different models that specialize in different parts of the driving task: judging the lanes, judging the velocity of the vehicle, interpreting street signs and lights, and issuing controls to the vehicle to perform various maneuvers.
That end-to-end processing of the network models, however, needs to be interrupted at critical points in the transport activity, such as when outlying conditions occur (sunlight glaring into a camera) or when the system discovers that the driver is inattentive to the road. These particular interruptions have to sit higher in the operation stack than the neural network itself, as they have to override what the neural network would otherwise want to do... This is almost certainly heuristic code, and it's also almost certainly the source of the erratic nature of these interruptions.
This heuristic code likely accepts an interrupt driven by the data coming from the neural network itself as it's driving, such that when the network has a low confidence, a low probability of being able to make a certain maneuver, that low probability is flagged as a reason for the higher-level heuristic code to take over and request that the driver take over. As part of the process of going to fully end-to-end neural networks, that high-level heuristic code is itself going to have to be replaced by a sub-model; in this case, that sub-model would look at the low-probability signal being sent by the bulk of the model and then make an additional determination of action based on the current situation. For example, ultimately whether or not there's a driver looking at the road won't matter, because there will be no condition in the heuristic code that triggers an interrupt of the model's driving signals to the car. Instead, the vehicle will simply learn to conduct itself so that it does not require the supervision of the driver.
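The heuristic supervisor being described could be sketched something like this (a toy illustration under my own assumptions; the names, inputs, and thresholds are all invented, not Tesla's actual code):

```python
from dataclasses import dataclass


@dataclass
class DrivingOutput:
    steering: float        # planned steering command from the network
    confidence: float      # model's self-reported probability, 0..1


def supervise(output: DrivingOutput,
              driver_attentive: bool,
              camera_saturated: bool,
              min_confidence: float = 0.6) -> str:
    """Heuristic layer sitting above the end-to-end network.

    Returns "drive" to pass the network's command through, or
    "takeover" to raise the red-hands driver-takeover alert.
    """
    if camera_saturated:           # e.g. sun glare blinding a camera
        return "takeover"
    if not driver_attentive:       # cabin camera says eyes off road
        return "takeover"
    if output.confidence < min_confidence:   # low-probability maneuver
        return "takeover"
    return "drive"


assert supervise(DrivingOutput(0.1, 0.9), True, False) == "drive"
assert supervise(DrivingOutput(0.1, 0.3), True, False) == "takeover"   # low confidence
assert supervise(DrivingOutput(0.1, 0.9), True, True) == "takeover"    # glare
```

Each `if` clause is one of the hand-written conditions the comment argues will eventually be replaced by learned sub-models.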
Similarly, for a condition such as sun glare, a solution can be developed where the camera data is used to create a high-dynamic-range set of frames when a certain level of intensity is registered coming into the camera, enabling a high-contrast set of frames to be rendered which can then be used to issue higher-probability predictions... Though the human eye has a significant ability to see variation in brightness, digital sensors can approximate a significant percentage of this range by using bracketed frames at different exposures... There's no reason why this cannot be invoked in real time when brightness oversaturation is detected at the sensor of a forward camera.
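As a highly simplified sketch of the exposure-bracketing idea (not a real HDR pipeline, which would weight by estimated radiance; all values and the per-pixel selection rule here are my own toy assumptions): merge several exposures of the same scene, preferring the best-exposed reading at each pixel so detail survives both glare and shadow.

```python
def merge_brackets(under, normal, over):
    """Each input is a list of pixel intensities (0..255) from the same
    scene at a different exposure. Per pixel, pick the value from the
    frame whose reading is closest to mid-range (best exposed)."""
    merged = []
    for u, n, o in zip(under, normal, over):
        # Distance from mid-gray (128): smaller means better exposed.
        best = min((u, n, o), key=lambda v: abs(v - 128))
        merged.append(best)
    return merged


# A glare-saturated normal exposure: the underexposed frame recovers detail.
under  = [40, 60, 110]
normal = [255, 255, 220]   # blown out by sun glare
over   = [255, 255, 255]
assert merge_brackets(under, normal, over) == [40, 60, 110]
```

The point is only that saturated pixels in one bracket can be backfilled from another, which is what would let the network keep making predictions under glare.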
With this solved, yet again two conditions in the higher-level heuristic code that would trigger the red hands can be removed. I don't know how many other conditional clauses they have in that particular agent, but the idea is that they're going to get rid of them all by replacing them with some sort of autonomous sub-model solution.
As it stands, it seems there are only a small number remaining. Consider that there was a time when FSD would immediately freak out if it found itself on even slightly snowy roads, as it lacked the necessary training data to understand how to conduct itself on potentially slippery roads. But with that training data now captured, it conducts itself quite admirably on snowy roads, eliminating the need for a condition that would trigger an interruption when a certain level of jerk was exceeded by the vehicle, which is likely how it was determining the slipperiness of the road surface.
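The jerk condition mentioned above is easy to sketch (thresholds and sampling are invented for illustration): jerk is the rate of change of acceleration, and a spike between consecutive samples can indicate wheel slip on a low-friction surface.

```python
def jerk_exceeded(accel_samples, dt, limit=8.0):
    """accel_samples: longitudinal acceleration in m/s^2, sampled every
    `dt` seconds. Returns True if any finite-difference jerk (m/s^3)
    exceeds `limit` in magnitude."""
    for a0, a1 in zip(accel_samples, accel_samples[1:]):
        if abs((a1 - a0) / dt) > limit:
            return True
    return False


# Smooth braking: jerk stays small, no trigger.
assert jerk_exceeded([0.0, -1.0, -2.0, -2.5], dt=0.5) is False
# Sudden slip: acceleration jumps sharply between samples.
assert jerk_exceeded([0.0, -1.0, -6.0, -2.0], dt=0.5) is True
```

In the comment's framing, once the model learned slippery-road behavior from data, a hand-coded trigger like this becomes unnecessary.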
All that said, there is one more outlying possibility, and that is hallucination... The model could be outputting prediction data that triggers an interruption by the higher-level heuristics but is actually not correlated with ground truth... However, this type of artifact would have to be replicated by both cores in the inferencing computer, since the final decision is determined by agreement between both cores coming to the same probabilistic calculations. So it's extremely unlikely that hallucination is at work.
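The dual-core agreement argument could be sketched like this (an illustrative model of redundancy voting; the function, thresholds, and tolerance are my own assumptions, not the actual FSD computer logic):

```python
def agreed_decision(core_a_prob, core_b_prob, threshold=0.5, tol=0.05):
    """Each redundant inference core reports the probability that a
    takeover is needed. Accept "takeover" only when both cores cross
    the threshold AND their estimates agree within `tol`; otherwise
    fall back to normal driving."""
    both_trigger = core_a_prob > threshold and core_b_prob > threshold
    consistent = abs(core_a_prob - core_b_prob) <= tol
    return "takeover" if (both_trigger and consistent) else "drive"


assert agreed_decision(0.90, 0.92) == "takeover"   # both cores agree
assert agreed_decision(0.90, 0.20) == "drive"      # lone-core glitch ignored
```

Under this scheme, a hallucination would have to occur identically on both cores to produce red hands, which is why the comment rates that explanation as unlikely.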
Finally, it could just be that there's a bug in the code: some aspect of that high-level heuristic set of conditions is being triggered when it shouldn't be, because various input parameters are out of bounds in such a way as to trigger an interruption. Again, as the heuristics in these conditions are replaced by trained sub-models, those types of artifacts will go away.
The good news is that all of these, in my view, are things that can be solved pretty trivially at this point. Yes, they'll require retraining the model several more times, but that's what Cortex is for, and the monthly cadence of new updates is going to ensure that they run through the remaining issues fairly quickly.
Wow! I’m going to have to bookmark this to come back to and read in full with more time. Thank you so much!!!🫶❤️
Would you be interested in doing a video review and installation of aftermarket products? If so, could you please provide your email address so we can talk further?
We could definitely talk. Are you on X? You can contact me there. Thanks for watching!
For some reason, I am unable to respond. I am available on X as well.