He was an Honorable Mention for the Darwin Awards. … Isn't it sad that people blame the self-driving car a little too much sometimes, indirectly defending some pretty braindead people? The car hit the top of the truck (the "softer" side). Imagine if it had crashed into the bottom side (hard axle and frame)?
@@Eduardo-yh3rv ya but if this happened in a car without autopilot, obviously you'd call the driver dumb for just driving into a truck. But in a Tesla, people jump to blaming the autopilot, because people think it's supposed to just work, when Tesla themselves say to pay full attention no matter what, because you still have 100% control even when autopilot is on.
0:26 Yes, but what was "In" the truck? Look how much the truck moved when the "little" (by comparison) Tesla hit it. My guess is that the truck was empty or hauling something light (maybe boxes of snack-chips, for example). If my guess is right, then the Tesla crashed into an aluminum-walled air[box]. Also, notice the smoke from the brakes about 200 ft before impact (basically right when it passes the driver by the shoulder). … The Tesla was computing, "Oh, sh___, OH sh___, OH SH___! STOP! 😱" I'm not surprised the guy wasn't harmed, in my opinion. [Edit]: word-swapped "semi" for "truck"... smaller than a semi, just a regular box-truck.
First of all. That explanation of how sound travels through liquids/solids more effectively than air, was the best example I've seen yet. Pool balls! Who could've known?!
One small quibble: The video makes it sound like Tesla is unique in using machine learning to develop autonomous vehicles. This seems to be a misconception that a lot of people have. All of these companies are extensively using machine learning. In fact, the models built on top of more data sources, such as lidar, will tend to be much more robust to new environments and unusual cases.
Fun fact - not Tesla, Elon. A number of Tesla SD engineers have been practically begging him on their knees to let them use LIDAR in the car. He won't, because he mouthed off about it a few years ago, and including it now would make him look like a jackass who didn't know what he was talking about. People have been fired over it, and others have quit.
@@breckercanfield9957 NDA'ed, sorry. But if you want to put the pieces together yourself, consider a few things. Genuine SD experts and researchers, barring a small few outliers, pretty universally agree that LIDAR is needed for safe SD implementations at this time. Elon has very publicly rubbished LIDAR on a number of occasions, for example claiming anyone using it is "doomed" in April 2019, though his comments go back a lot further; easy enough to find those. Elon has also talked - bragged, really - about how involved he is with designing Tesla products, and this is backed up by numerous accounts from employees, ex-employees and leakers about his micromanaging style; it's well known that if he says no, it doesn't go. And finally, their SD team has an odd amount of turnover - not alarmingly high, but odd - and there are numerous leaker accounts of Elon firing people arbitrarily because they disagree with him.
Tesla wouldn't drive into a wall; it has front-facing radar. As for Elon and Tesla engineers, it was Mobileye (Intel) that created the FSD camera concept. As for LIDAR, it has a lot of limitations and is really only used on premapped high-resolution roads. Lidar systems don't really think for themselves, and since most conditions are constantly changing, LIDAR has a hard time adapting. One of the other big issues with using Lidar is power consumption. Lidar may be good for ICE cars, but it would eat a lot of range in BEVs. Lidar also doesn't work so well in rain, fog, etc., but the biggest issue is cost: Lidar costs about $10-20k, cameras about $50-100. Getting back to the accident in this video: Tesla has all the data but is prevented from releasing it unless the owner gives the OK. Maybe the driver doesn't want it released, to hide his mistake, but the truth is well known... it's just a matter of time before we all find out. So far in most of these cases, it has been driver error that was at fault. The truck was overturned in the lane for a long time before the car hit it.
The car probably mistook the white surface for a sun flare or reflection. What Tesla is trying to achieve is just above human-level driving. Elon disses lidar in the sense that the other manufacturers are putting too much trust in these systems without improving the decision-making AI.
@@calimio6 above human-level driving? I don't think that's their goal, maybe their marketing. A person would have seen that truck easily, for example. The problem is that humans make mistakes or lose attention. Cars still can't drive like a person and probably never will. What they actually want is for it to be safe and predictable; a self-driving car won't win a race against a race driver or avoid some specific accidents a human would. Humans will still be better than self-driving. Again, the problem is that statistically humans fail and some are idiots.
As I listened to the possible theories and explanations, I assumed I knew the direction his answer was headed, but after he said that, I realized he may not be able to outright state the obvious answer without opening himself to possible litigation from said car companies, tech manufacturers or negligent parties involved. Therefore, he was forced to end with "I can't tell you" as a way to claim no bias or blame was being directed at anyone mentioned.
I did some research on this previously after I had seen it on another youtube channel. The theory that makes sense to me is that the cameras confused the white roof of the overturned trailer with the white of the sky behind it. There have not been a lot of deaths with Teslas in autopilot mode, but the very first one in 2016 (Joshua Brown) was a similar case. A white semi truck was making a turn right in front of a Tesla Model S. The Tesla did not see this white trailer, possibly because, again, it was confusing the large white surface on the horizon with the sky, and it drove right under the trailer, shearing off the roof and killing the driver. This happened again in 2019 with a Model 3. The Tesla drove under a white trailer and decapitated the driver. In all of these cases the drivers obviously had become too comfortable with their Teslas in autopilot mode, and I mean, who could blame them... it had worked for hours and hours of driving before. Complacency and inexperience take lives every day on our highways. Every day I see people tailgating each other on the freeway, never expecting a car up front needing to come to an emergency stop, and bam... you get a 5-car pile-up.
The car slammed on, then didn't, then did. You see the smoke off the tyres. I think the guy at the side of the road had something to do with that though.
@@TheSarcasticEngineer I heard that radar has trouble with stationary objects, which is why Tesla removed it from the full self-driving suite. That might explain the lack of response to a visible object.😇
@@vandal2569 I'm not sure if that's true. But regardless, you need to be better than 99%. A 1% failure rate is way too high when we're talking about an activity that's as common as taking a trip in a car and where the consequences are death and permanent injury.
@@waldolemmer The problem is that seed generation pulls from many entropy sources. For example, voltage drops and increases are tied strongly to quantum effects in molecules and are, in fact, random, and even a sensor can output values beyond its practical accuracy. Sure, that data is garbage in value, but it can be used as an unpredictable source of randomness. Linux primarily relies on keyboard press timings, mouse movement, and some controller timings (like IDE/SATA etc.), and those results are often given beyond practical accuracy. The primary difference between the Tesla AI and a human brain is that the AI is trained for just one specific situation and nothing more. E.g., imagine you have sheets of paper with handwritten numbers 0-9, and you teach an AI to recognize those numbers. It will work great. The problem comes the moment someone writes "A". Then someone else writes a Chinese character. The AI will forcefully try to choose a number from 0-9, while a human reading it will say "hold on, something is wrong". The Tesla AI did see the object in front but couldn't properly notice something was wrong and didn't try to slow down like a human would. Current neural networks etc. lack a giant unsolved quality that people do possess: abstract thinking.
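The forced-choice problem described above can be sketched in a few lines. Everything here (the logits, the class count, the rejection threshold) is invented purely for illustration; this is not Tesla's stack, just a toy softmax classifier that must pick a digit even for nonsense input:

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution over the classes."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# A digit classifier's output layer covers only the classes 0-9, so even a
# nonsense input (a letter "A", a Chinese character) still yields SOME digit.
nonsense_logits = [0.1, 0.2, 0.1, 0.1, 0.3, 0.1, 0.1, 0.1, 0.1, 0.1]
probs = softmax(nonsense_logits)
prediction = probs.index(max(probs))      # forced guess: always a digit 0-9

# One common mitigation: reject low-confidence predictions instead of acting.
REJECT_THRESHOLD = 0.9                    # illustrative value, not a real tuning
is_confident = probs[prediction] > REJECT_THRESHOLD
```

With these near-uniform logits the top class gets only ~12% probability, so a rejection threshold would flag the input as "something is wrong" instead of committing to a digit.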
@@invisible468 The car hit the brakes a moment before the crash, then released them and hit. Comments around here are saying auto braking engaged to avoid the pedestrian, and the driver saw the pedestrian and overrode the brakes, just in time to hit the truck.
@@MoJo01 There is collision avoidance on most cars nowadays, but on the "self driving" one it doesn't work? This is also 100% a sensor failure on the car's part.
How can someone rely on a car's sensors to avoid accidents? Such a fool! Even in 2077, with all the advanced technologies on cars, people are still crashing all the time!
This is actually the problem with mostly autonomous systems, like a Tesla or a jet airliner: they are so automated that when the automation fails, the pilot has to make a split-second correction or it's death. That's what happened with those Boeing jet airliners.
Something to note: LiDAR has a crucial weakness in that on really hot days the road can heat up the air enough to make it refract light. This can then fool the LiDAR sensor, causing unwanted reactions (or lack of wanted reactions). In addition, the same thing can happen with water puddles and other reflective surfaces. You can see how that could be a problem. Elon Musk is correct in that a car relying solely on LiDAR is doomed. However, it is an extra tool in the bag and can help decide when one sensor is feeding the computer bad data.
which is why he could reopen the grille on the Teslas, position a LIDAR and RADAR system behind it, keep all the cameras and ultrasonic sensors, and have extra redundancy plus an additional system to help improve sensor resolution for what the camera has difficulty identifying.
@@mohitnaik4245 to an extent, all titles need to be clickbait. An interesting title and thumbnail draws views. What they did here was reasonable, nothing egregious. Donut has done this with multiple videos that didn't reach the algorithm.
You can only figure that out if you can figure out why human beings do the same. People still hit objects, animals, and other people even with all driver assists available as an advantage. Unfortunately, that is way too common, and it is now the norm.
Person driving their Tesla, while looking at their phone, talking to their insurance: "Yep, it was the car's fault, it was in autopilot"😠. Don't buy that BS for a second🤦♂️
A lot of vehicles have been able to record what mode they were in during crashes for decades now. I would guess a Tesla, connected to a fleet, would be able to report what mode the car was in at the time of the crash. I find it funny when stupid people don't know how smart their cars are and get an insurance fraud court date in the mail.
Even if the autopilot was on, the manual says you need to pay attention and have at least one hand on the wheel. So the insurance would just point out they should have been paying attention.
The overturned semi might've been totally invisible to the radar if the AI wasn't intelligent enough to understand what it was looking at. What might've happened looks similar to when an aircraft's radar gets "notched." It's a very specific phenomenon that might've had a similar effect here. Very interesting vid! :)
@@riverrainwater4890 in the fatal accidents they think the drivers were asleep when it happened. In the video if you look closely you can see what looks like tire smoke just before impact.
@@danielm6049 I was going to say the same thing. Either the driver or the car slammed on the brakes; it was just going too fast, it seems. I'm thinking it's oversaturation of stationary signals being ignored. Also, there could be a faux stealth effect, as stated before. Lots of nooks for radar to get dissipated by under a truck.
Hi guys. New to the channel. Trying to understand the differences between IndyCar, F1, the 500, et cetera, and all kinds of other motorsport things... I'm old but I'm still damn curious. Only caught the 2 hosts so far; you guys are a lot of fun to watch and listen to, well edited with good professional sound too... and you're not annoyingly crass like some other channels I won't mention! Thank you very much, keep up the good work, and I am now a subscriber... all the best!
@@user-ox5ir2rd6g not really, it’s good if you need to grab something for a couple seconds. A crash like this only happens if you aren’t paying attention at all for long periods of time.
@@thisisntsergio1352 Soundaktors are only one type of system. It makes the already existing sound more resonant. There are a few others which involve digitizing the sound entirely and putting it through the speakers.
4chan has known for years. That's why it's not typed captchas anymore: we figured out that only one word had an answer; the other one was us teaching it how to read handwriting. Look up "Opperation Renigger"
@@tye3648 *clears throat* may I direct you to Mercedes Benz's Concept semi truck that was fully autonomous, and they built it before Tesla even became as famous as it is now?
"Oh, this is a perfect example, Susan. See this crashed truck in our lane? Yeah, my Tesla is going to stop by itself, watch. No, wait, I think we have to be closer. No, don't worry, it works- I've tried it before in the parking lot with Mark. Ok, it should stop now. Ok, now wai... AAAAARGGGH..."
AI is just as flawed as the people who program it. Computers just can't compete with us at things involving split-second decision making. Look at Formula 1 drivers and fighter pilot aces: AI can't touch them, by a pretty large margin.
Musk wants these cars to be fully autonomous. Let’s not start the conversation of “oh why didn’t the driver do this or that”. Remember Tesla is charging for full self driving in their vehicles. That’s unacceptable to advertise and sell as a feature if it’s not ready.
@@maxlu2456 it's not legal to have a fully autonomous car on the road, so it is still the driver's fault for not braking. Yeah, the system should have worked, but it just shows that you can't trust these systems.
@@hairstonjr_5337 you do know that even if self driving is 100% capable and doable, it would still fall under the driver's liability, right? We aren't talking about fault here. Technically the driver is at fault for almost everything. This is about Tesla and their roadmap to fully autonomous driving.
Hey Jeremiah, just a little mistake: radars do not work excellently in all conditions. For example, in rain, fog or snow radars work poorly and in most cases do not find any objects, either moving or stationary. That is why in the maritime industry vessels use both X-band and S-band radars, the difference being wavelength (3 cm and 10 cm waves) and frequency. 😉
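The band/wavelength pairing mentioned above follows directly from λ = c/f. A quick sanity check, using roughly typical center frequencies for the two marine bands (the exact frequencies vary by radar model; these are approximations):

```python
# Verify the ~3 cm (X-band) and ~10 cm (S-band) wavelengths via lambda = c / f.
C = 3.0e8  # speed of light in m/s (rounded)

x_band_freq = 9.4e9   # X-band marine radar, roughly 9.4 GHz
s_band_freq = 3.0e9   # S-band marine radar, roughly 3 GHz

x_band_wavelength = C / x_band_freq  # about 0.032 m, i.e. ~3 cm
s_band_wavelength = C / s_band_freq  # about 0.10 m, i.e. ~10 cm
```

The longer S-band wavelength is less attenuated by rain and fog, which is exactly why ships carry both: X-band for resolution, S-band for weather penetration.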
I work for a major automotive supplier in a radar department. 80% of what Donut said here is either incomplete or incorrect about radars. Should have done their homework on this one. But to cut a long story short: radars are calibrated. They are not supposed to look mindlessly into the air. Take a look at radar-guided cruise control, and I don't think you will see any accidents like this. So no, a radar will NOT mistake an overhang for a truck, especially 77 GHz models, barring a gross mistake on the development side. Static objects are an issue if the vehicle is stationary. 77 GHz models can easily make distinctions between bikes, trucks, shopping trollies, etc. They do have their limitations, which is why a progression to lidar + cameras is necessary. There is talk about integrating small lidar modules in the headlights, tail lights, etc.
@@elvo6217 but he literally didn’t say that they don’t work you fucking moron, he said they need to be calibrated, they don’t just hook up and work magic. Either you’re a kid or you are at a below average reading comprehension level.
My thought on why it crashed: The truck was turned over with its smooth top pointing at an angle that didn't return any radio waves. For the camera... maybe again it was the smooth angle, and the vision system was confused. As for the person: on their phone.
People who buy cars with an "auto-pilot" will most assuredly (sometime) allow it to drive itself. It will not be like autopilot on a plane. It will be like the fools we see getting tickets for riding in the backseat as their self-driving chauffeur drives them to their destination.
“There are actually videos out there of self-driving cars saving people” *For some reason I thought of a Tesla model x dragging a person and shouting “medic!”*
The guy that crashed should’ve interfered by stepping on them dang brakes. Tesla can’t be sued cause paying attention is literally something Tesla tells you before you even buy a car
That's not how public liability works though. Tesla should not be selling cars with this feature, plain and simple. The driver is ultimately responsible for engaging the system and then allowing his attention to lapse; however, such a lapse in attention would not be possible in a car without autonomous features. It's partly Tesla's fault for marketing a system that's in beta, and then attempting to absolve themselves of any liability with a crappy warning screen nobody reads. There's no excuse for a system like this on the road. This shit needs to be banned pronto.
@@Patrick-857 "however such a lapse in attention would not be possible in a car without autonomous features." - I'm calling hot bullshit on that one, good sir. People have their attention lapse ALL THE TIME on the road. It's a miracle there aren't more accidents. Oh wait... "36,835 fatalities in 2018" according to nhtsa www.nhtsa.gov/press-releases/2019-fatality-data-traffic-deaths-2020-q2-projections#:~:text=Traffic%20deaths%20decreased%20nationwide%20during,traveled%20increased%20by%20nearly%201%25.
@@TitusGargilius Nice one idiot, I'm talking about the kind of lapse in attention that involves engaging autopilot and taking your hands off the wheel and eyes off the road. That doesn't happen without a massively oversold and underdeveloped autopilot system. Stop defending this ridiculous technology.
We all know that totally autonomous self-drive isn't quite ready to go prime-time. So the real question is, what was wrong with the software that it failed to stop or divert around the large fixed unmovable object? This should be an easy test-case for self-drive systems, that should be fixable with a software update.
I am not putting any blame on the car; it is all human error to fully trust technology to that extreme. Watch the original Total Recall and see how the Johnny Cab worked out there for a laugh, but in the real world, use common sense when in a vehicle.
Uncle Jerry ought to randomly pop in advising everyone on the Donut Media team - I love that guy!!! 😎❤️🤪 - (Sometimes I rewatch his spots even if he’s promoting a sponsor. - cheesy but true.)
@@scanspeak00 And that we don’t know about. I bet the autopilot would have reacted to the presence of the pedestrian which was close enough to be hit !
I guess autopilot decreases safety then, because people allow themselves to be more readily distracted... What is the point of it if it can't handle the 1% or 0.1% of situations?
You were basically spot-on with the "it didn't know what the semi was" thing. I doubt Tesla engineers had trained the model to recognize an overturned semi truck. This is why it would probably be helpful for them to be smart and, instead of using a huge 3D LiDAR system like other manufacturers, just replace the radar with a unidirectional 2D LiDAR sensor, similar to what newer Roombas use. Then it could've seen that there was a large stationary object directly in front of the car and braked, without having an ugly thing on top of the car.
The answer is simple. The guy behind the wheel wasn't paying attention. He could've avoided the truck had he been paying attention. But he wasn't. So this ain't the car's fault, but the driver's.
@@willgotrocks5348 But keep in mind that if a plane's autopilot systems made basic mistakes like that, they probably wouldn't be trusted at all. Imagine if your autopilot lost altitude or some shit.
@@theshunt545 Well, there was an experiment on cars' built-in collision avoidance systems by an independent company some time ago. One of the tests put a Tesla Model 3 against a pedestrian. Rather than stopping, it just continued to drive like nothing happened. Even the Honda CR-V's sensors worked slightly better: it stopped when it hit the pedestrian, rather than crashing into it and driving on.
@@ivanbima5877 My position is against self-driving, to be clear. I think the human disposition used for driving will be hard to mimic, and counterproductive in terms of energy (for the datacenters involved).
@@theshunt545 The Tesla actually did see it, hence the sudden braking and puff of dust on the road when passing the pedestrian. But as the human was stationary and the car never hit them, the auto braking ceased. I have had this happen on my Model X with a bicycle once.
That's what I thought: you are supposed to stay aware of your surroundings when driving these cars, because they aren't perfect yet, if they ever will be.
I work as a full-time Software Quality Assurance Engineer, and I can tell you that one of the 7 Principles of Testing is "exhaustive testing is impossible", meaning that you can never test all scenarios. You can only evaluate risks, improve testing strategies, and continuously improve quality even after a software product is fully deployed and in the maintenance phase. To give you an example: I follow air crash investigations, and very often they stress-tested planes by causing 3 malfunctions at 10,000 feet, but never below 200 feet. When this happened in real life below 200 feet, other issues appeared and completely messed up the software functionality; other parts of the software code overrode it and the plane crashed.
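A toy illustration of why exhaustive testing is impossible: even a made-up system with a handful of discrete sensor states has more input combinations than most test budgets cover, before you even account for timing and sequencing. The sensor count and state count below are invented purely for the example:

```python
from itertools import product

# A made-up system: 5 sensors, each reporting one of 10 discrete states.
SENSORS, STATES = 5, 10

# Exhaustive testing needs every combination of sensor readings...
total_cases = STATES ** SENSORS          # 10^5 = 100,000 scenarios

# ...so in practice we test risk-based slices instead, e.g. fixing one
# dimension (say, only the "high altitude" band of sensor 0).
tested = sum(1 for combo in product(range(STATES), repeat=SENSORS)
             if combo[0] == 0)

coverage = tested / total_cases          # only 10% of the space covered,
                                         # and real inputs are continuous
```

Add a sixth sensor and the space grows tenfold; add event ordering and it explodes combinatorially, which is exactly why risk-based selection replaces exhaustive enumeration.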
It's not driverless technology yet, there's even a warning when you take your hands off the wheel. The driver wasn't paying attention, whether or not this was a Tesla, that accident still would have happened
I like that diplomatic save that you added in the end there. Love your content, very informative! Useful when the guys are talking about car facts and people disagree
"Captchas help to train AI Systems of self driving cars"
My next captcha:
Select all tiles containing: Flipped over white semi trucks
So me proving I'm not a robot has just been making better robots!🤯
@@zacheryshreves7516 😱🤯
@@zacheryshreves7516 I'm not mad about it 😂
@@zacheryshreves7516 bamboozled 🤯
I bet the driver was sleeping, even if the ai didn’t catch that, nobody who’s awake doesn’t react to a flipped over semi even with autopilot on. That car didn’t slow down or even swerve a little bit. They need a couple of those cameras on the inside of the car.
4:37 disliked, unsubbed, uncommented, reported, undonated, uninstalled youtube, sold pc
I loik cereal
Yea it got me too
I hope that’s sarcasm 😂
Oh hi Exyl what ye doin in here?
Lololol
11:10 that reliable system should have been the human driver, but they weren’t paying attention
Exactly 💯
yeah it just scares me to think that future brands are gonna remove steering wheels and pedals for the full auto pilot experience like imagine an EMP wave and all the cars just stop working and you can't use your brakes, this is a very unlikely situation but it's kinda an example
@@tankbg1311 well tesla has been aware since they first started making cars, I don't know how they'll survive an EMP but I know they're working on technology to protect it, I believe many of their vital components are protected by a shield but I don't know how well it protects the computers
Human drivers are a lot of things. Reliable is not one of those things.
@@whatareyoudoingyouidiot342 😂😂😂😂
As an engineer, I have to give props for how well you explained this complicated topic to people ❤
It's because he's an engineer too. I noticed he explains things very well in his videos and looked him up because I knew he had to have a science/engineering background.
*The model 3* : hi Ron
*The truck* : hi Billy
*that hurt*
@@valerie_screws_around legend
The one other car: oh hi Marc
This man knows his vintage RUclips. Kudos
@@lukestreet8473 hi
Autopilot be like: Not today you ain't fooling me with that fake truck
Oh it ain't a fake
(Stolen joke:) It was roadrunner (the guy standing on the road) who painted the underside of the truck to look like the rest of the road.
Tesla after crash: lol that's craaazy
Truck was mind gaming the mind game
U guys know what autopilot means ?
@@Krystalmyth XD
The real question is why didn't the "driver" see the big huge truck
Because he saw only a uniform area in a similar grey as the sky, maybe? And the car did the same? (and saw nothing in radar)
He was sleeping
The driver bet on the autopilot and slept, or was really engaged in other stuff
obviously the driver didn't have lidar on his head, duh.
yes the driver is responsible 1000%
As a committed "Old Guy" (not an Old Guy who has been committed, I hasten to add, just one who has a commitment to being old, (glad I got that straight!)). I find these videos really interesting, the presenters are obviously very clever because they are able to inform about complex subjects in an easy to digest way. Laughed out loud at the "Large and not so bright "joke! Keep up the great work.
That "like your mom" out of nowhere really caught me off guard lol
I had to rewind lol
Lmao same here
What do you mean it showed a cow was that not enough for you?
He said it so slick that it went over people's head.
@@GryphusOneYT 0
Pro tip: Watch where you're going.
Tesla Autopilot: Screw your human common sense. My processor operates at 50 quadrillion peta-bytes per nano-second. I know what I'm doing....
Ok boomer
A novel idea....
Exactly. It's called auto pilot, not "set it and forget it."
@@danesebruno but many of these autopilot lovers will rely on it and be distracted
The real question is “what was the driver doing?”
@@WindFireAllThatKindOfThing so he pulled back hard on the handbrake, just the wrong one.
@@WindFireAllThatKindOfThing That is a horrible decision by Tesla to give it so much functionality. When the car isn't moving you can pull out a phone/tablet/laptop that would play your porn or anything else just as well. It is like flexing with a self driving ability it doesn't have. In my opinion the interior should be barebones so you can actually focus on the important part: driving.
@@assassin3g I think he was making something called a joke?
The real answer is “your mom!”
@@WindFireAllThatKindOfThing Both the car and the driver had their eyes on that screen. That's what happened here.
"LIDAR is a fool's errand."
*Makes cars that drive straight off cliffs, straight into concrete barriers, straight into other vehicles.
Tesla's are a fool's coffin
*You talk with your ass without any source to back your claims*
"Shit! Just flipped my truck on the highway my boss is gonna kill me... ayo is that a Tesla"
"Wait I'm in China......" (cuz the location of the footage is China)
how TF did you get nolan’s face on your username???
Lmaoooo
Plot twist: the guy standing on the side of the road was actually road runner from looney toon who painted the underside of the flipped over truck to look like the road behind it.
howd you know???
SUCH AN UNDERRATED COMMENT
That MUST have been the reason!!!
Meep meep!
Or he was just the truck driver (maybe)
"Figuring Out Why This Tesla Hit That Truck"
Donut: ...idk
Pretty much, it should be called... How autonomous vehicles sense the road.
😂🔥
That's donut media for you, lazy, false facts and annoying hosts
Well if you read between the lines: wrong sensor setup at tesla
@@FlorianMickler Tell me more please
Just to say a big thank you. Everything is so slick, and so well written. You make the complex subject so easy to understand. Bless you all. Freddy
Back when I still flew planes, turning the auto-pilot on DID NOT mean you could sit back and relax! You STILL needed to monitor what was going on around you. Why do people think that computer driven cars are any different?
it's less relevant and helpful in cars though....the airplane rarely if ever needs to change course, whereas a car is almost constantly turning, so why put that type of system on a car because it is nearly self defeating
Because people are stupid and trust Tesla way too much. They should change it from Autopilot to driving assistance
They will be different and completely autonomous, just not yet. It's exciting tech that is moving quickly though, and I'd wager it'll be safer and more reliable than a human driver or pilot in just a few years.
Because I don't want to fuckin drive I want to pregame in my own car to the party
Because it takes many hours of flying to get your pilot's license, but you can get a driver's license from a box of cereal.
tip: you should drive while driving.
Thanks man, when I turn 18 this knowledge will help me a lot and maybe save my life. You are a hero
@@fallendown8828 Fuxking right!😆
Best answer 👍👍👍
Thanks, now I've stopped rolling into everyone
Sensible
5:07 "My insurance rates are about to go up? That's a scam... I don't have insurance" *internal screaming*
Gabi that sponsors Donut : Am I a joke to you?
@꧁ RITA - F**UСК МЕ ꧂ I don't feel good about your username.
Not just Tesla; I put the cruise control on my Toyota Corolla and started watching TikToks. Next thing I know, I'm heading into a ditch, flipped over, car totaled. Totally unacceptable. It's fantastic we're calling out car companies who have the audacity to expect people to actually pay attention while driving. I sympathize with Gen Z that driving a car manually is simply too difficult to comprehend 😭
Made me laugh.
"like your mom"
oh damn we doing that now?
I laugh so hard
My self esteem was crushed, my mothers as well. I had to dislike and unsub.
I’m so upset right now 😔😔
@@GardenGuy1942 bruh u serious ? He was joking for god sake. I mean like r u that sensitive ? No offense just asking ...
@@car-enthusiast3141 joke?
Title : "Finding out "
At the end of the video : " I don't know "
👏👏👏👏
Honestly, was like, great but I have learnt nothing
Every donut video ever
@@FerencKranicz haha lol
@@davidbiendasaleh8880
Was expecting more info but no..
@@nokiaairtel5311 bruhhh💔 12 minutes down the drain
if you let the car decide a life or death scenario, you shouldn't own one.
He was an Honorable Mention for the Darwin Awards.
… Isn't it sad that people blame the self-driving car a little too much sometimes, indirectly defending some pretty braindead people? The car hit the top of the truck (the "softer" side). Imagine if it had crashed into the bottom side (hard axle and frame)?
Fr
Every time you get in a car, it's a life or death scenario....
@@Eduardo-yh3rv Ya, but if this happened in a car without autopilot, obviously you'd call the driver dumb for just driving into a truck. In a Tesla, people jump to blaming the autopilot because they think it's supposed to just work, when Tesla themselves say to pay full attention no matter what, because you still have 100% control even when autopilot is on
I'd love to die unknowingly. I can't wait to put my faith in a car bc whatever
"Why This Self-Driving Tesla Car Hit That Truck"
11 mins later*
"I have literally zero clue?"
Me: clicks on all the wrong images on recaptcha
self driving cars when they see another car: hmm yes this is *B O A T*
Cough cough “challenger”
😂😂
Must be a dodge
"the driver wasn't harmed"
*Looks at clip again*
Damn they really are a safe car.
Exactly what i was thinking
0:26 Yes, but what was "In" the truck? Look how much the truck moved when the "little" (by comparison) Tesla hit it. My guess is that the truck was empty or hauling something light (maybe boxes of snack-chips, for example). If my guess is right, then the Tesla crashed into an aluminum-walled air[box]. Also, notice the smoke from the brakes about 200 ft before impact (basically right when it passes the driver by the shoulder). … The Tesla was computing, "Oh, sh___, OH sh___, OH SH___! STOP! 😱"
I'm not surprised the guy wasn't harmed, in my opinion.
[Edit]: word-swapped "semi" for "truck"... smaller than a semi, just a regular box-truck.
It is
Those truck cargo containers are made of thin aluminum or fiberglass. It cushioned the impact
Apparently not safe enough to avoid the truck.
First of all. That explanation of how sound travels through liquids/solids more effectively than air, was the best example I've seen yet. Pool balls! Who could've known?!
Better than they teach in school
@@sahlsiddiqui9944 way better
really? Based on that it travels better through concrete than through open air?
Sound traveling better in water than air is obvious for anyone who tried having a conversation with a friend under water in a swimming pool!
@@klaushipp1207 Yes but the transition between the two materials can cause reflections.
You need to match the impedance to get good transmission.
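The impedance point above can be made concrete with the standard textbook formula (not from the video) for sound hitting a boundary between two media at normal incidence:

```latex
% Fraction of acoustic intensity reflected (R) and transmitted (T) at a
% boundary between media with acoustic impedances Z_1 and Z_2:
R = \left(\frac{Z_2 - Z_1}{Z_2 + Z_1}\right)^2, \qquad
T = 1 - R = \frac{4\,Z_1 Z_2}{(Z_1 + Z_2)^2}
% Example: air (Z_1 \approx 415\ \mathrm{rayl}) into water
% (Z_2 \approx 1.48\times 10^6\ \mathrm{rayl}) gives R \approx 0.999,
% i.e. almost all of the sound reflects at the surface.
```

This is why sound carries well inside water or inside concrete, but crosses the air-to-solid boundary poorly unless the impedances are matched.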
The fact of the matter is, drivers need to stay aware at all times. Autopilot of any sort promotes distraction. Love you, Donut!
Jerry is looking more and more like Shrek when they wished him into a human.
I can’t unsee it now.
4:39
Bro this comment is GOLD , If this was reddit i would award you! 😱😂
He was fine af tho
Next video I'll be doing my Shrek impression the entire time - Jerry
I haven't been gotten by a "your mom" joke since I was in highschool. Well played.
I concur. I liked this video just because of the joke. 🤣
This turned dark real fast
when joe mama
It was slipped in there pretty well huh.
Joe momma
He gotta chill with that mom joke 😂😂😂😂😂
I squared up so fast then I remembered it was just a video
@@younghex9577 😂😂😂😂
@@younghex9577 they had u reaching for it
@@younghex9577 ong though whole mood changed
One small quibble: The video makes it sound like Tesla is unique in using machine learning to develop autonomous vehicles. This seems to be a misconception that a lot of people have. All of these companies are extensively using machine learning. In fact, the models built on top of more data sources, such as lidar, will tend to be much more robust to new environments and unusual cases.
Tesla: "LiDAR is a fool's errand."
*slams into a painting of a tunnel on a wall*
Fun fact - not Tesla, Elon. A number of Tesla SD engineers have been practically begging him on their knees to let them use LIDAR in the car. He won't, because he mouthed off about it a few years ago, and including it now would make him look like a jackass who didn't know what he was talking about. People have even both been fired and quit over it.
Lol, the 'ol Wiley Coyote trick
@@Churbas source?
@@breckercanfield9957 NDA'ed, sorry. But if you want to put the pieces together yourself, consider these things - genuine SD experts and researchers, barring a small few outliers, pretty universally agree that LIDAR is needed for safe SD implementations at this time. Elon has very publicly rubbished LIDAR on a number of occasions, for example claiming anyone using it is "doomed" in April 2019, though his comments go back a lot further; easy enough to find those. Elon has also talked - bragged, really - about how involved he is with designing Tesla products, and this is backed up by numerous accounts from employees, ex-employees and leakers about his micromanaging style; it's well known that if he says no, it doesn't go. And finally, their SD team has an odd amount of turnover - not alarmingly high, but odd - and there are numerous leaker accounts of Elon firing people arbitrarily because they disagree with him.
Tesla wouldn't drive into a wall, it has front facing radar. As for Elon and Tesla engineers, it was Mobileye (Intel) that created the FSD camera concept. As for LIDAR, it has a lot of limitations and is really only used on premapped high resolution roads. Lidar systems don't really think for themselves, and since most conditions are constantly changing, LIDAR has a hard time adapting. Another big issue with using lidar is power consumption. Lidar may be good for ICE cars, but it would eat a lot of range in BEVs. One of the biggest issues is cost. Lidar costs about $10-20k, cameras about $50-100. Lidar doesn't work so well in rain, fog, etc., but cost is the biggest factor.
Getting back to the accident in this video, Tesla has all the data but is prevented from releasing it unless the owner gives the OK. Maybe the driver doesn't want it released, to hide his mistake, but the truth is well known....it's just a matter of time before we all find out. So far in most of these cases, it has been driver error that was at fault. The truck was overturned in the lane for a long time before the car hit it.
spoiler alert: they did not figure out why this tesla hit that truck
Actually, they did. The driver wasn't paying attention.
What tesla should do is tell us what the data said but nobody knows yet.
@@MrTeff999 That's not what was asked. They asked why the Tesla failed to brake, not why the human failed to brake.
The car probably mistook the white surface for a sun flare or reflection. What Tesla is trying to achieve is just above human level driving. Elon disses lidar in the sense that the other manufacturers are putting too much trust in these systems without improving the decision-making AI.
@@calimio6 Above human level driving? I don't think that's their goal, maybe their marketing. A person would have seen that truck easily, for example. The problem is that humans make mistakes or lose attention. Cars still can't drive like a person and probably never will. What they actually want is for it to be safe and predictable; a self driving car won't win a race against a race driver or avoid some specific accidents a human would, so humans will still be better than self-driving. Again, the problem is that statistically humans fail and some are idiots.
Spoiler: He doesn't have an answer. He's just stalling.
you are a hero
I still watched it to make sure you were right. You were right.
I mean, he said why it might have happened, but there's no way he could know exactly what happened without the car's data
@@uhohmemebiggestboy212 coz they can’t hack a Tesla 😂
Clickbait: still no answer.
The answer to the car crash is simple
the guy who founded the company is a con man who promises more than he will ever deliver
"Remember kids: AI are meant to assist, not to take over" that guy who created rules for robots.
Rename it to Auto Assist and not Auto Pilot
@@leshiro5574 assist pilot sound better for me
is that why they advertise it as autopilot? Specifically meant to take over?
Tell that to google
Actually his quote fits better for Neurolink.
Watch a 12 minute video to arrive at “who knows?” 😭
Thanks, your sacrifice saved me 12 minutes
@@armink9686 It's actually a very interesting video overall tho.
As I listened to the possible theories and explanations, I assumed I knew the direction where his answer was headed, but after he said that I realized he may not be able to outright state the obvious answer without opening himself to possible litigation from said car companies, tech manufacturers or negligent parties involved. Therefore, he was forced to end with "I can't tell you" as a way to claim no bias or blame was being directed at anyone mentioned.
still worth it
@@MrMakosi very informative but the title is clickbait
I did some research on this previously after I had seen it on another youtube channel. The theory that makes sense to me is that the cameras confused the white roof of the overturned trailer with the white of the sky behind it. There have not been a lot of deaths with Teslas in autopilot mode, but the very first one in 2016 (Joshua Brown) was a similar case. A white semi truck was making a turn right in front of a Tesla Model S. The Tesla did not see this white trailer, possibly because again it was confusing the large white surface on the horizon with the sky, and it drove right under the trailer, shearing off the roof and killing the driver. This happened again in 2019 with a Model 3. The Tesla drove under a white trailer and decapitated the driver. In all of these cases the drivers obviously had become too comfortable with their Teslas in autopilot mode, I mean who would blame them....it had worked for hours and hours of driving before. Complacency and inexperience take lives every day on our highways. Every day I see people tailgating each other on the freeway, never expecting a car up front needing to come to an emergency stop and bam...you get a 5 car pile up.
@@michaeldautel7568 it looked like the car braked at the last second
@@michaeldautel7568 This video clearly shows the top of the semi being hit. Or are you referencing one of the other accidents?
@@MorbidOrangeCat you are right sorry☹️
The car slammed on, then didn't, then did. You see the smoke off the tyres.
I think the guy at the side of the road had something to do with that though.
@@TheSarcasticEngineer I heard that radar has trouble with stationary objects, which is why Tesla removed it from the full self driving suite. That might explain the lack of response to a visible object.😇
Any system that works 99% of the time, but fails in dangerous edge cases...isn't ready to drive vehicles on public roads
You can't have a 100% chance, that's impossible
@@vandal2569 I'm not sure if that's true. But regardless, you need to be better than 99%. A 1% failure rate is way too high when we're talking about an activity that's as common as taking a trip in a car and where the consequences are death and permanent injury.
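The scale argument above is easy to check with back-of-the-envelope arithmetic. A minimal sketch, where the trip count is an assumed round number for illustration only, not an official statistic:

```python
# Why "99% reliable" is not good enough for driving: multiply an assumed
# daily trip count by the failure rate and look at the absolute numbers.
trips_per_day = 1_000_000_000  # assumed: ~1 billion car trips per day

failures_per_day = trips_per_day // 100            # 1% failure rate
failures_at_five_nines = trips_per_day // 100_000  # 0.001% failure rate

print(failures_per_day)        # dangerous failures per day at 99% reliability
print(failures_at_five_nines)  # still thousands per day at 99.999%
```

Even at five-nines reliability the absolute failure count stays large, which is why "better than 99%" is the floor, not the target.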
“It’s literally how bats work”
I mean, yes?
Why did this garner 100+ likes?
@@TheCarpenterUnion Because 100+ people liked the comment, there's no correlation whatsoever
@William Schwartz "Stevie Wonder coulda done better!"
-Jezza
@William Schwartz stevie wonder isnt blind. It's a myth
Are you sure ?
A: How many turned-over trucks has CAPTCHA asked you to identify? ;O)-
To be fair tho.
Absolutely! AI is only as smart as the unwitting dupes that trained it
@@luislongoria6621 not true. AI cannot be random, humans can. Humans being able to be random is what makes them more intelligent than AI.
@@waldolemmer except they can be predicted, because RNG isn’t entirely random.
@@waldolemmer The problem is that seed generation draws on many entropy sources. For example, voltage drops and increases are tied strongly to quantum effects in molecules and are, in fact, random; even a sensor can output values beyond its practical accuracy, and sure, that data is garbage in value, but it can be used as an unpredictable source of randomness. Linux primarily relies on keyboard press timings, mouse movement, and some controller timings (like IDE/SATA etc.), and those results are often given beyond practical accuracy.
The primary difference between the Tesla AI and the human brain is that the AI is trained for just a specific situation and nothing more. E.g., imagine you have sheets of paper with handwritten numbers 0-9, and you teach an AI to recognize those numbers. It will work great. The problem comes the moment someone writes "A". Then someone else comes and writes a Chinese character. The AI will forcefully try to choose a number from 0-9, while a human reading it will say "hold on, something is wrong". The Tesla AI did see an object in front but couldn't properly notice something was wrong and didn't try to slow down like a human would. Current neural networks etc. lack a giant unsolved quality that people do possess - abstract thinking.
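The forced-choice failure described above can be sketched in a few lines of Python. This is a toy illustration with made-up scores, not Tesla's actual network: a softmax classifier over the digits 0-9 must always answer with a digit, even for an input it was never trained on.

```python
import math

def softmax(logits):
    # Normalize raw scores into a probability distribution over 10 digit classes.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for an input the classifier never saw in training
# (say, the letter "A"). No class fits well, but the distribution still
# sums to 1 and argmax still returns a digit -- the model has no way to
# say "this isn't a number at all".
logits = [0.1, 0.2, 0.05, 0.1, 0.15, 0.1, 0.2, 0.3, 0.05, 0.1]
probs = softmax(logits)
prediction = max(range(10), key=lambda i: probs[i])
print(prediction)  # forced to answer with some digit 0-9
```

Adding an explicit "none of the above" class or an out-of-distribution detector is one standard mitigation, but it has to be designed in; vanilla classifiers don't get it for free.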
“Drivers” fault 100%, you know the person was either sleeping or on the phone
Anthony Tah May or trying to get someone else to pay his car note....
@@invisible468
The car hit the brakes a moment before the crash, released them, and then hit. Comments around here say auto braking engaged to avoid the pedestrian, and the driver saw the pedestrian and overrode the brake just to hit the truck.
@@MoJo01 There is collision avoidance on most cars nowadays, but on the "self driving" one it doesn't work ? This is also 100% a sensor failure on the car's part.
But the driver is driving the car; he should have just hit the brakes
just wait for the sensors on a self driving semi truck to go out and take out 20 cars at once
I expected some tesla hating, but your video was really balanced and informative.
Kudos
Great to see this guy being appreciated. After all the "hate" he got after his first appearance.
I'm still not a fan tbh
He got hate? I like this guy a lot
I really like him. He's smart and super knowledgeable, and seems like he loves what he does. He's great
@@ajrestivo Oh, if you saw the comments on his first video, it's almost brutal
Eh, still not a fan.
How can someone rely on a car's sensors to avoid an accident? Such a fool! Even in 2077, with all of the advanced technologies on cars, people are still crashing all the time!
Especially when having sex in a tank
@@zachpare206 we all know you make sure the tank is stationary before layin’ pÿpe.
Wow! How is it like coming back from 2077?
There will never be a non accident year with cars
@@zachpare206 cant forget with a neural link
The real question is why didn't the "driver" see the truck and hit the brakes
He did, you can see the smoke when he slammed on the brakes
@@ToddTurd obviously not soon enough
Probably distracted
This is actually the problem with mostly autonomous systems, like a Tesla or a jet airliner: they are so automated that when the automation fails, the pilot has to make a split second correction or it's death. That's what happened with those Boeing jet airliners.
This will be the problem with cars with autopilot: people won't pay attention and will be hooked on their phones, relying on the car to do everything.
It’s ‘Stereoscopic Vision’ for anyone curious.
You guys make sick videos and I really appreciate them. Much respect.
“Hello? My insurance rates are about to go up”
“That’s a scam, I don’t have insurance”😂
That a fine
Lame ass joke
Something to note: LiDAR has a crucial weakness in that on really hot days the road can heat up the air enough to make it refract light. This can then fool the LiDAR sensor, causing unwanted reactions (or lack of wanted reactions). In addition, the same thing can happen with water puddles and other reflective surfaces. You can see how that could be a problem.
Elon Musk is correct in that a car relying solely on LiDAR is doomed. However, it is an extra tool in the bag and can help decide when one sensor is feeding the computer bad data.
And then there's the whole "useless in fog, rain or snow" thing on top of your most excellent point above...
An educated comment those are rare...🤙🏽🤙🏽👍🏽keep em coming.
which is why he could reopen the grille on the Teslas, position a LIDAR and RADAR system behind it, keep all the cameras and ultrasonic sensors, and have extra redundancy plus an additional system to help improve sensor resolution for what the camera has difficulty identifying.
I was so early this video was called How Autonomous Cars "See" the Road
I still have the original notification up
me too
Please stop the clickbait, or you've lost my respect @donut
@@mohitnaik4245 how is this clickbait?
@@mohitnaik4245 To an extent, all titles need to be clickbait. An interesting title and thumbnail draw views. What they did here was reasonable, and nothing egregious. Donut has done this with multiple videos that didn't reach the algorithm
You can only figure that out if you can figure out why human beings do the same. People still hit objects, animals, and other people even with all driver assists available as an advantage. Unfortunately, that is way too common and it is now a norm.
"...something larger and not so bright, like your mom..."
*SMASHES thumbs up*
This guy just revived the your mom joke!! Was looking for the comment 😂
@@lemsdk17 stolen but jeah
@@RecklessGenesis Aren't all your mom jokes invented by now, so whenever one is used it's stolen? I like the performance and presentation 😄
A good “your mom” jokes always makes me laugh
Person driving their Tesla, while looking at their phone, talking to their insurance: "Yep, it was the car's fault, it was in autopilot"😠.
Don't buy that BS for a second🤦♂️
I’ll put money on the fact that it wasn’t on
They can go into the system and get a crash report that tells them if it was in autopilot or not
A lot of vehicles have been able to record what mode they were in during crashes for decades now. I would guess a Tesla, connected to a fleet, would be able to see what mode the car was in at the time of the crash. I find it funny when stupid people don't know how smart their cars are and get an insurance fraud court date in the mail.
Even if the autopilot was on, the manual says you need to pay attention and have at least one hand on the wheel. So the insurance would just point out they should have been paying attention.
His/her*
The overturned semi might've been totally invisible to the radar if the AI wasn't intelligent enough to understand what it was looking at.
What looks like might've happened is similar to an aircraft's radar when it gets "notched." It's a very specific phenomenon that might've had a similar effect here.
Very interesting vid! :)
Didn't expect you here lol
You mean kind of like all the angular shapes of older stealth aircraft? The underside of a trailer is pretty complicated.
I still wonder why the driver didn’t just brake, he had many seconds.
@@riverrainwater4890 in the fatal accidents they think the drivers were asleep when it happened. In the video if you look closely you can see what looks like tire smoke just before impact.
@@danielm6049 I was going to say the same thing. Either the driver or the car slammed on the brakes; it was just going too fast it seems. I'm thinking it's oversaturation of stationary signals being ignored. Also there could be a faux stealth effect like stated before. Lots of nooks for radar to get dissipated by under a truck.
Hi guys. New to the channel. Trying to understand Differences between Indy car, F1, the 500, et cetera and all kinds of other motorsport things... I'm old but I'm still damn curious. Only caught the 2 hosts so far you guys are a lot of fun to watch and listen to, well edited and good professional sound too...and you're not annoyingly crass like some other channels I won't mention! Thank you very much, keep up the good work and I am now a subscriber ...all the best!
Meanwhile, it’s “ *driver assist* “ N O T “ *driver asleep* “
Damn, what a shitty driver assist
dont brand it as auto pilot then
where's my Uber.
@@GonzoGonzoGonzo fair enough.
@@user-ox5ir2rd6g not really, it’s good if you need to grab something for a couple seconds. A crash like this only happens if you aren’t paying attention at all for long periods of time.
The tesla be like:
I am speed
😂
Yessir
no
@@sbrbrz777 ?
Then it was like:
I am dead.
Yo, Donut Media- Team!
Could you cover the sound generators in modern cars that make their exhaust and engine notes appear louder to consumers?
They did that already about a year ago, Nolan covered it on wheelhouse.
Soundaktors?
@@thisisntsergio1352 Soundaktors are only one type of system. It makes the already existing sound more resonant. There are a few others which involve digitizing the sound entirely and putting it through the speakers.
@@abalakrishnan4152 see: ecoboost mustang
@@abalakrishnan4152 this. The Golf R literally vibrates the windshield glass to create sound.
Captcha: Click the bus
me: clicks anything but a bus
Captcha: Want to learn
Also Captcha: You're wrong
"Figuring out why this Tesla hit the truck"
Watched 11 minutes to hear "i can't tell you" at the end.
At least I've learnt something from those Captchas
What did you learn from Captcha? I didn't finish the video.
Omg .. time to fast forward to see if he shows the hit
4chan has known for years. That's why it's not typed captchas anymore, because we figured out that only one word had an answer; the other one was us teaching it how to read handwriting. Look up "Operation Renigger"
Thanks to tell .. im at 2 min so time to leave
Car guys: "This is why we like our steering !!"
Truck drivers: No fucking robot can do this lol
Bold statement.
@@billygoat7094 Tesla Semi Truck Drivers: ....
@@tye3648 *clears throat* may I direct you to Mercedes Benz's Concept semi truck that was fully autonomous, and they built it before Tesla even became as famous as it is now?
@@tye3648 where's my truck?
"Oh, this is a perfect example, Susan. See this crashed truck in our lane? Yeah, my Tesla is going to stop by itself, watch. No, wait, I think we have to be closer. No, don't worry, it works- I've tried it before in the parking lot with Mark. Ok, it should stop now. Ok, now wai... AAAAARGGGH..."
Did the driver & occupants survive?
Susan or Karen?
9:55 "Slurrrp" 🤣🤣🤣
It truly warms my heart to know that i'm still better than an AI at doing something. We should all cherish these moments while they last.
I mean, the driver was in the car, and they also didn't see the truck, so it's still not worse than them I suppose
@@malachi3438 Exactly. One car with bad ai, and one car with a shitty driver. Both equally missed it
AI are just as flawed as the people who program them.
Computers just can't compete with us at things involving split second decision making
Look at Formula1 drivers and fighter pilot aces. AI can't touch them by a pretty large margin
@@manz7860 you're living in the past
@@hulknwill the car had two bad AIs.
The thing is, why didn't the driver just brake? He can override the autopilot, so it's his fault he wasn't even paying attention to the road
Musk wants these cars to be fully autonomous. Let’s not start the conversation of “oh why didn’t the driver do this or that”. Remember Tesla is charging for full self driving in their vehicles. That’s unacceptable to advertise and sell as a feature if it’s not ready.
@@maxlu2456 It's not legal to have a fully autonomous car on the road, so it is still the driver's fault for not braking. Yeah, the system should have worked, but it just shows that you can't trust these systems
@@hairstonjr_5337 you do know that even if self driving is 100% capable and doable , it would still fall under the drivers liability right? We aren’t talking about fault here. Technically the driver is at fault for almost everything. This is about Tesla and their roadmap to fully autonomous driving.
@@maxlu2456 It's still the driver's fault... hence why they still need insurance...
@@maxlu2456 nobody ever advertised self driving as 100% ready and the cars will even tell you to keep your hands on the wheel and pay attention.
Hey Jeremiah, just a little mistake: radars do not work excellently in all conditions. For example, in rain, fog or snow radars work poorly and in most cases do not find any objects, either moving or stationary. That is why in the maritime industry vessels use X-Band and S-Band radars, the difference being in wavelength (3cm and 10cm waves) and in frequency. 😉
I work for a major automotive supplier in a radar department. 80% of what Donut said here about radars is either incomplete or incorrect. They should have done their homework on this one. But to cut a long story short, radars are calibrated. They are not supposed to look mindlessly into the air. Take a look at radar guided cruise control and I don't think you will see any accidents of this type. So no, a radar will NOT mistake an overhang for a truck, especially 77GHz models, barring a gross mistake on the development side. Static objects are an issue if the vehicle is stationary. 77GHz models can easily make distinctions between bikes, trucks, shopping trollies, etc. They do have their limitations, which is why a progression to lidar + cameras is necessary. There is talk about integrating small lidar modules in the headlights, tail lights, etc.
You say radars don't work and proceed to prove your case that radars do work
@@elvo6217 how
@@elvo6217 but he literally didn’t say that they don’t work you fucking moron, he said they need to be calibrated, they don’t just hook up and work magic. Either you’re a kid or you are at a below average reading comprehension level.
@@iwantsexseemyvideo1512 Haha , dont we all
My thought on why it crashed: The truck was turned over with its smooth top pointing at an angle to not return any radio waves. For the camera.. Maybe again it was the smooth angle and the person confused it.. For the person: On their phone.
“That’s a scam, I don’t have insurance”
Modern problems require modern solutions
Vino
"Why did this Tesla crash"
*because the driver wasn't paying attention*
Precisely
then it shouldnt be called autopilot
@@jangofett1599 well why is it not called full self driving
@@jangofett1599 That's why the Tesla autopilot keeps telling the driver to put his/her hands on the steering wheel every time..
@@jangofett1599 Pretty sure a plane's autopilot can hit a mountain.
Shoutout to the old title, “How autonomous cars see the world” 👁👄👁
Wtf I started seeing the video with the old title and then suddenly changed
And the old thumbnail
Same
They've also changed the thumbnail thrice
Not gonna lie, the switch worked on me. I saw the old title and thumbnail and didn't click; this one is more clickbaity and got me to click lol
Why This Self-Driving Tesla Car Hit That Truck?
"I dont know."
Nice, good work.
What we see: a black Ford Focus.
What Tesla sees: trash can
Wish I could like this one more time
I drive a black Ford Focus and I am not amused at all. I love my car, if you wanted to make a trash joke you could've mentioned a Fiat instead
@@RageGamingCroatia way to take a joke to the heart, move along.
same thing
Facts 🤣
"My insurance is about to go up,? That's A Scam I don't have insurance" 😂😂😂😂the best😆😆
me deliberately solving captcha incorrectly is proving fruitful.
You caused this. The moment you clicked that empty space when it asked "click all spaces with an overturned truck".
Elon's livelihood is in your hands, Max. With great power comes great responsibility.
@@Got2bescene i literally loled to this comment
People who buy cars with an "auto-pilot" will most assuredly (sometime) allow it to drive itself. It will not be like autopilot on a plane. It will be like the fools we see getting tickets for riding in the backseat as their self-driving chauffeur drives them to their destination.
4:40 dammit, I wasn't ready for that. I spit my coffee all over the place.
ikr lmao
Lol
“There are actually videos out there of self-driving cars saving people”
*For some reason I thought of a Tesla model x dragging a person and shouting “medic!”*
LMAO
@꧁ RITA - F**UСК МЕ ꧂ wtf is that nickname ?
@@iwantsexseemyvideo1512 wtf xD
@@yacinealg152 what is wrong with the both of them?
@@thedogethatgames look at the nickname lmao
The guy that crashed should’ve interfered by stepping on them dang brakes. Tesla can’t be sued cause paying attention is literally something Tesla tells you before you even buy a car
That's not how public liability works though. Tesla should not be selling cars with this feature, plain and simple. The driver is ultimately responsible for engaging the system and then allowing his attention to lapse; however, such a lapse in attention would not be possible in a car without autonomous features. It's partly Tesla's fault for marketing a system that's in beta, and then attempting to absolve themselves of any liability with a crappy warning screen nobody reads. There's no excuse for a system like this on the road. This shit needs to be banned pronto.
@@Patrick-857 especially when musk's own wife was seen on social media doing stupid shit in autopilot
@@Patrick-857 "however such a lapse in attention would not be possible in a car without autonomous features." - Im calling hot bullshit on that one good sir. People have their attention lapse ALL THE TIME on the road. its a miracle there arent more accidents. Oh wait...
"36,835 fatalities in 2018" according to nhtsa
www.nhtsa.gov/press-releases/2019-fatality-data-traffic-deaths-2020-q2-projections#:~:text=Traffic%20deaths%20decreased%20nationwide%20during,traveled%20increased%20by%20nearly%201%25.
Exactly. You still have a body that you can move around with, but you can't brake by yourself. Ironic side of "Not Ready Human".
@@TitusGargilius Nice one idiot, I'm talking about the kind of lapse in attention that involves engaging autopilot and taking your hands off the wheel and eyes of the road. That doesn't happen without a massively oversold and underdeveloped autopilot system. Stop defending this ridiculous technology.
The $20k is a great idea. That can pay for insurance and maintenance for quite a while. Respect!
"something larger and not so bright, like your mum"
Dude, my mum's right behind me, hit the brakes!
theory: the driver was just stupid and blamed their dumbness on autopilot
Is that true
EXACTLY!!!
@@dianagranados8715 Yes
if he doesn't blame it on autopilot, his insurance probably won't reimburse the damage
We all know that totally autonomous self-driving isn't quite ready to go prime-time. So the real question is: what was wrong with the software that it failed to stop or divert around a large, fixed, immovable object? This should be an easy test case for self-driving systems, one that should be fixable with a software update.
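To put the "easy test case" in concrete terms: at its core, automatic emergency braking is just a closing-speed-versus-distance check. Here's a minimal sketch with made-up thresholds; it is nothing like Tesla's actual code, just an illustration of why a stationary obstacle should be the simple case if the sensors report it at all.

```python
# Hypothetical time-to-collision (TTC) braking check. Thresholds and
# function names are invented for illustration, not any vendor's code.

def time_to_collision(distance_m: float, closing_speed_ms: float) -> float:
    """Seconds until impact, assuming a constant closing speed."""
    if closing_speed_ms <= 0:
        return float("inf")  # not closing in on the obstacle
    return distance_m / closing_speed_ms

def should_emergency_brake(distance_m: float, ego_speed_ms: float,
                           obstacle_speed_ms: float,
                           ttc_threshold_s: float = 2.0) -> bool:
    closing = ego_speed_ms - obstacle_speed_ms
    return time_to_collision(distance_m, closing) < ttc_threshold_s

# An overturned semi is simply a stationary obstacle (speed 0). At highway
# speed (~31 m/s) and 50 m away, TTC is about 1.6 s, so the car should brake:
print(should_emergency_brake(50.0, 31.0, 0.0))  # True
```

The hard part in practice isn't this arithmetic; it's getting the perception stack to report the obstacle in the first place.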
Accident seems like it could have easily been avoided if the driver were paying attention and pressed the brakes.
Can’t wait to watch the autonomous trucks in the Rockies, during the Winter.
Would LIDAR have avoided the accident? Yes. Always put on more sensors than you think you need; I'm sure Boeing can attest.
Wouldn't staying AWAKE and actually DRIVING have avoided the accident too?
@@DL30Creations falling asleep isn't a design fault on the part of the car!
I am not putting any blame on the car it is all human error to fully trust technology to that extreme. Watch the original Total Recall and see how the Johnny Cab worked out there for a laugh, but in the real world use common sense when in a vehicle.
You won’t even be able to drive there in a car with lidar. Those cars are very limited where they can go
Ok hear me out maybe instead of using "auto pilot" just drive the car yourself? Crazy I know!
Jerry: (teaching pretty well)
Also Jerry out of no where: LIKE YOUR MOM
rekt
Uncle Jerry ought to randomly pop in advising everyone on the Donut Media team - I love that guy!!! 😎❤️🤪 - (Sometimes I rewatch his spots even if he's promoting a sponsor. - cheesy but true.)
Nah most likely he was distracted and just blamed it on the autopilot to save his ass
The computer logs would have determined if autopilot was active.
@@scanspeak00 And that we don't know about. I bet the autopilot would have reacted to the presence of the pedestrian, who was close enough to be hit!
I guess autopilot decreases safety then, because people allow themselves to be distracted more readily... What is the point of it if it can't handle the 1% or 0.1% of situations?
You were basically spot-on with the "it didn't know what the semi was" thing.
I doubt Tesla engineers had trained the model to recognize an overturned semi truck. This is why it would probably be helpful for them to be smart and, instead of using a huge 3D LiDAR system like other manufacturers, just replace the radar with a single-plane 2D LiDAR sensor, similar to what newer Roombas use. Then it could've seen that there was a large stationary object directly in front of the car and braked, without having an ugly thing on top of the car.
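For what it's worth, the check a single-plane 2D LiDAR would enable is pretty simple: look for a cluster of short-range returns inside the lane corridor ahead. A minimal sketch, with made-up beam counts and lane widths, and not any manufacturer's real logic:

```python
import math

# Hypothetical sketch: deciding from a single-plane 2D LiDAR scan whether a
# large stationary object blocks the lane ahead. All thresholds are invented.

def obstacle_ahead(scan, max_range_m=30.0, lane_half_width_m=1.5):
    """scan: list of (angle_rad, range_m) pairs, angle 0 = straight ahead.

    Returns True if enough returns fall inside the lane corridor ahead.
    """
    hits = 0
    for angle, rng in scan:
        if rng >= max_range_m:
            continue  # nothing close on this beam
        # Convert the polar reading to longitudinal/lateral offsets.
        ahead = rng * math.cos(angle)
        lateral = rng * math.sin(angle)
        if ahead > 0 and abs(lateral) <= lane_half_width_m:
            hits += 1
    return hits >= 5  # a big object reflects many adjacent beams

# A wall-like return ~20 m ahead spanning the lane (like a truck on its side):
wall = [(math.radians(a), 20.0) for a in range(-4, 5)]
print(obstacle_ahead(wall))  # True
```

The point of the sketch: a range sensor doesn't need to *classify* an overturned truck, only to notice that many beams stop short directly ahead.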
The answer is simple. The guy behind the wheel wasn't paying attention. He could've avoided the truck had he been paying attention. But he wasn't. So this ain't the car's fault, but the driver's
Well it's the car's fault too. Can't market your shit as self-driving if it can't detect an overturned semi.
@@theashennamedjerry3203 do pilots still need to pay attention while the plane is on autopilot? Yes.
@@willgotrocks5348 can't argue with that.
@@willgotrocks5348 But keep in mind that if a plane's autopilot made basic mistakes like that, it probably wouldn't be trusted at all. Imagine if your autopilot lost altitude or some shit
Does anyone see that there's a person standing in the middle of the lane trying to stop them?
wow youre really observant
Saw it, yes; obviously the Tesla didn't :)
@@theshunt545 Well, there was an experiment on cars' built-in collision avoidance systems by an independent company some time ago. One of the cars tested was a Tesla Model 3 facing a pedestrian dummy. Rather than stopping, it just continued to drive like nothing happened. Even the Honda CR-V's sensor worked slightly better: it stopped after hitting the pedestrian, rather than crashing into it and driving on
@@ivanbima5877 To be clear, my position is against self-driving.
I think the human faculties used for driving will be hard to mimic, and counterproductive in terms of the energy the datacenters involved will consume.
@@theshunt545 The Tesla actually did see it, hence the sudden braking and puff of dust on the road when passing the pedestrian. But as the human was stationary and the car never hit them, the auto braking ceased. I have had this happen on my Model X with a bicycle once.
Are you saying that Tesla can't detect a brick wall in front of the car? Because that's what the truck became.
The Tesla didn’t crash into that truck. A person driving the car did.
That's what I thought, you are supposed to keep aware of your surroundings when driving these cars because they aren't perfect yet if they ever will be.
@@werty_9189 and let’s be quite honest, self driving cars will not have steering wheels.
@@barnettzack That's much farther off than full self driving
This is why the last thing you will ever hear in life will be: You're in a Johnny Cab!
@@WindFireAllThatKindOfThing sorry I can only give your comment one thumb up. 👍
MAAAAD props for putting a time bar on the ad man. That shiz just earned you a subscriber.
that was my favorite part of the video too.
The ads are actually interesting and funny
I work as a full time Software Quality Assurance Engineer and I can tell you that one of the 7 Principles of testing is: "Exhaustive testing is impossible", meaning that you can never test all scenarios, you can only evaluate risks, improve testing strategies and continuously improve quality even after a software is fully deployed and in maintenance phase.
To give you an example: I follow air crash investigations, and very often planes were stress-tested by simulating three malfunctions at 10,000 feet, but never below 200 feet. When this happened in real life below 200 feet, other issues appeared and completely messed up the software's functionality; other parts of the code took over and the plane crashed
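The "exhaustive testing is impossible" principle is easy to put numbers on. A toy back-of-the-envelope in Python, with completely made-up category counts, shows how fast the scenario space explodes:

```python
# Toy illustration of why exhaustive testing is impossible: the scenario
# space grows combinatorially. All category counts below are made up.
from math import prod

scenario_dimensions = {
    "weather": 5,         # clear, rain, fog, snow, glare
    "obstacle_type": 50,  # cars, trucks, overturned trucks, debris, ...
    "speed_band": 10,
    "road_geometry": 20,
    "lighting": 4,
    "sensor_faults": 8,
}

combinations = prod(scenario_dimensions.values())
print(combinations)  # 1600000 -- from just six coarse dimensions
```

Six coarse dimensions already give 1.6 million combinations, before you even consider timing, other drivers, or combined faults; hence risk-based testing instead of exhaustive testing.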
Bro, if the driver could see the flipped truck, why wouldn't they switch lanes?
Day 307 of asking James to do an Up to speed on his Dad
good morning
I caught him early
Whoop, whoop. Pagani gaming has arrived!
Damn dude you're really fast
pagani the legend
the guy in the tesla is probably binging on his McDonald's meal in the back
It's not driverless technology yet; there's even a warning when you take your hands off the wheel. The driver wasn't paying attention. Whether or not this was a Tesla, that accident still would have happened
it could have been avoided in many different ways
It probably wouldn't have, because the driver was relying on Tesla's technology, whereas in another car he would be forced to pay attention
@@seabasso6849 Thank you, exactly! Here you go 👑
“That’s a scam, I don’t have insurance.”
Honestly same
I like that diplomatic save that you added in the end there. Love your content, very informative! Useful when the guys are talking about car facts and people disagree