Hey Cindy, as your next video you could do one on Tesla battery technology or different battery technologies, another on the engineering design and capabilities of Starship, or one on OpenAI, thanks.
it’s an extra 10,000 for self driving on a Tesla. When I get mine one day... I would rather get the longer range than self driving, or just save 10,000 bucks on a Standard Range Model 3. I am more than capable of driving it myself. Heck, I already feel like I am on autopilot when driving to work 😂.
Elon Musk and Tesla are jumping to the vision mode early due to the limited supplies / computer chip shortage affecting their ability to add radar to their vehicles now (and in turn to keep prices lower)
I remember when webcams were 0.3 megapixels and cost $100. Almost everyone had one. Today a 5 MP camera is considered crap, and a $100 cellphone can have 12 MP. Lidar may be expensive today, but it will follow the same cost curve as cameras. The prices have already dropped many times. Having $1,000 worth of lidar on a car is the reality today; in the future it will be $100. It is, after all, superior to cameras and hence safer for people. I am sorry, but cameras will end up as dash cams.
12k of RAM was once said to be enough on a PC. Today 6 GB (6,000,000k) is a weak computer. Musk is onto something, but mostly hyping how his car can do without it. In the end he knows he is selling enough EVs to get a fair share of the market. It also won't take him long to say "hey, I was wrong, the next Tesla 10 will have lidar." Right now Musk sells what he already has in production. It's enough as is.
I think LiDAR, as with all tech products, will get cheaper every year as it scales. I really want to buy a Tesla for my next car as they are currently a far superior product compared with the competition. We shall see in 4 years who offers the safest and most relaxing driving experience. I think that Elon will change his opinion once the price/performance of LiDAR improves.
Elon chose not to use lidar, and to drop radar, to reduce cost. If the camera AI cannot recognize what's in front, thinks it's safe, and proceeds at full speed, it will make any accident or collision deadly. A safety-related system without built-in secondary redundancy verification is just stupid.
What very few understand is that level five autonomous technology is half of what one needs to compete in autonomous taxis. The other half is a huge fleet of durable, energy-efficient, relatively low-cost EVs, and Tesla is the only one who has that. It does not matter if Waymo and Cruise both achieve level five three years before Tesla does, Tesla will still end up owning the autonomous taxi industry.
I have an upmarket dashcam that is supposed to warn me if I am drifting out of a lane and if I am too close to a car. On average it warns me I am drifting out of my lane twice in 10 kilometers, both times as I pass a right-hand turn lane. The warning about traveling too close varies with the colour of the car. No way would I want to sit back and use this as part of a self-driving system.
@Green Mamba Games yes BUT only if it was JUST A PC which ain't true, it's also the fastest 5 seater car, it's the new drag race champion, it's electric ⚡
Even overhyped AI can't remotely compete with the human brain's pattern recognition ability. That's why we can drive safely even with our poor eyesight. Autonomous vehicles have to compensate by relying on other sensor inputs - and LiDAR provides the data overlay. Tesla cannot afford to retrofit its lineup so they are doubling down, but new EV makers (incl legacy brands) don't have this issue. Sometimes it's strategic not to be the first - which is how Apple typically wins in tablets, watches etc.
@@UnknownPerson-wg1hw I am an AI investor, so not betting against it. Still, AI isn't magic - it relies on data inputs, and removing an important LiDAR data overlay greatly diminishes output predictions. As for AI - which is still really an expert prediction system - it is orders of magnitude away from the human brain's pattern recognition. For instance, potholes vs paint patterns on a road. Read Kurzweil.
Two cameras lined up can estimate relative distance. Plus, lidar can be unable to measure in fog, rain and snow; a good AI with cameras plus radar can go even further than lidar.
I drive my Tesla almost every day and I don't depend on Autopilot in my area, which is pretty much a hood (the cities of San Leandro and Oakland, California). We have too many ghetto drivers who drive recklessly all the time here, so you really need to keep your eyes open and watch out for other cars. We have people who try to race, run red lights and cut off others while looking at their phones at the same time. The drivers in the hoods are the ones you always want to watch out for. I would drive a tank or a heavy-duty Hummer if I had to, just to protect myself from these crazy drivers.
3:10 the car on the bottom is a Model X, not a Model Y. You can tell from the two door handles being right next to each other, plus the shape, headlights, and wheels being the wrong ones for a Y.
LIDAR is great for an emergency braking system. It's completely useless for actual self-driving. It can't read signs, it can't see road lines, it can't understand lights, it can't recognize people. It's a great shortcut if your end goal is a Level 1, maybe Level 2 system, where the most you really want your car to do is hit the brakes and not crash into things. If your goal is full self driving, Level 4 or 5, then you're going to have to develop intelligent camera systems anyway, and by the time the camera AI is good enough for Level 4 or 5, LIDAR will already be redundant.
Less so, as post-processing can be applied to video that we cannot do ourselves. E.g. Tesla will apply a post-process to reduce gain in high-brightness situations. To do this ourselves we'd need to squint, pop on sunglasses or flip down the visor, all of which take far longer than a fraction of a second and even then aren't as good as post-processing.
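For anyone curious what that kind of gain post-processing amounts to, here is a minimal Python sketch, assuming a simple global-gain model (Tesla's actual pipeline isn't public, and the target value here is made up):

```python
import numpy as np

def reduce_gain(frame: np.ndarray, target_mean: float = 110.0) -> np.ndarray:
    """Scale down an over-bright 8-bit frame so its mean brightness hits a target."""
    mean = frame.mean()
    if mean <= target_mean:              # already dark enough, leave it alone
        return frame
    gain = target_mean / mean            # < 1.0 for washed-out frames
    return np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)

# Example: a washed-out frame (mean ~230) gets pulled down toward ~110 in one step,
# no squinting, sunglasses, or visor required.
bright = np.full((4, 4), 230, dtype=np.uint8)
print(reduce_gain(bright).mean())
```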
Normal cameras might be sufficient to make self-driving vehicles just as good as human drivers. LIDAR and RADAR can make them even better than humans, especially in low-light and foggy conditions.
No, a machine with human-like vision would drive much more safely. People don't cause accidents because poor vision makes them inevitable. They cause them because they don't pay attention, they take unnecessary risks, they fall asleep, they are drunk. I don't have any statistics to back it up, but these factors probably cause 80% of the accidents.
@@anonanonkiewicz1921 okay fair enough, and there is also reaction time and the fact that there are blind spots behind pillars etc; cameras can have full 360-degree vision. But still, thick fog and low-light situations are just as much of a problem for normal cameras as for human vision. LIDAR or RADAR could improve on that
I'll give you one example: the cheetah is the fastest land animal and it uses legs to run. The wheel does not appear in nature. By Elon's logic we need to solve leg-based locomotion; the wheel is a fool's errand and anyone depending on wheels is doomed.
What I don’t get is that everyone with a recent iPhone Pro or iPad Pro has a lidar sensor. I mean it only has a depth range of 5 meters, but still, the sensor cannot cost more than $10. Why wouldn’t similar tech work in cars?
The biggest point about using cameras is that it provides future revenue. Lidar may be the easier technology to use for self-driving, BUT cameras can watch the world around the car and not only the road. This way Tesla would have huge amounts of data on everything that happens close to roads, and it would be updated automatically. Just imagine what you could do with all that data. Cameras provide massive amounts of data Lidar can never capture. 24/7 surveillance and the customer is paying for the equipment. Let's see how they use all this data...
Elon Musk does not hate LIDAR. He even said this publicly during their Autonomy Day presentation event, as well as on a few other occasions. He and his engineers at SpaceX developed their own LIDAR system, so they know its pros and cons well. The only reason LIDAR is even being used is because there isn’t yet a pure vision system that can do what humans can with their two eyes and brain. So solving pure vision only FSD is the ultimate goal, which is why Elon/Tesla are taking the challenge head on instead of relying on expensive cheat devices/hardware add-ons that don’t actually help achieve the main goal in the long run. LIDAR is like taking a powerful pain killer to make the pain in your body go away, but what you really need is to fix what is causing the pain in the first place.
If people can set up lidar on their current cars, it will take a large cut from Tesla. It seems Elon doesn't want to spend the money at the moment, but once the tech is streamlined and improved, people will buy it. Especially for drones to walk dogs, or possibly drones with lidar for search and rescue.
I don't know enough about how lidar works to be entirely sure about this, but I would argue that as an emit-and-receive system, lidar works best when there are no other lidar scanners around at the same time, because they interfere with each other. So its usefulness lessens with every other car using it. That would be a good reason to invest in receive-only sensors like cameras.
TLDR: It's expensive, so they're going to remove it to make more money, and smear its use by competitors to stop people choosing them instead for having more comprehensive sensors.
*Note: at 3:05 a Model X was accidentally shown rather than a Model Y.* The first 1,000 people to use my link will get a free trial of Skillshare Premium Membership: skl.sh/newsthink06221
Thanks!
😂😂
I am more excited about Tesla's future than my own
💀
If you invest in Tesla, like I do. It’s exciting to follow Tesla.
I’m more excited about Tesla's future than mine
At my age, I may never get to see much but I got tons of kids that will!
@@SyncedJay me too. My life currently sucks but seeing some progress in the world gives me some hope
Well my 2021 Model 3 can’t see the lane markings at night on dark stretches of interstate highway. It claims multiple cameras are fogged or blocked. There’s nothing blocking them - they just can’t see in the dark.
Tbf, if they're painted on, lidar or radar wouldn't see them either
saw a video of a Tesla just driving on the wrong side of the road, because at some point there was no lane marking and the car just went "fuck it"
@@SirEdubardo I mean, there are always going to be some cases. And trust me, you only see it when it fails
If you're feeling clever, this could likely be remedied by hooking up some cheap IR lighting to the exterior of the car. Either LEDs manufactured specifically for outdoor night-vision cameras or just an IR LED cluster you built/wired yourself.
Grab a TV remote, aim it at one of the cameras (assuming you have a means to see what the camera can see), push some buttons on the remote, and test whether the cameras have IR filters installed (maybe stating the obvious, but you'll see light coming from the remote; test on a cell phone camera if you need a demo). Knowing Musk, probably no filters...
I have yet to witness this issue on my 2021 model 3. I have driven on highway at night with autopilot for many miles now. Curious as to why this is happening
Elon doesn’t “hate” Lidar. Spacex uses it. He just doesn’t think it’s needed for driving.
Yeah, such a misleading and clickbaity title. They use lidar for docking Dragon with ISS.
He literally said it's a fool's errand. He definitely hates it
@@okiedokie2557 LOL, go watch the whole conversation on Tesla Autonomy Day.
The "self-driving cars are 10 years away" hype is dead. Everyone knows level 5 isn't happening anytime soon and Musk doesn't want someone out hyping him. So he's not just saying LIDAR has hit a wall, but that Tesla's system has too.
i disliked the video lol
why use lidar, when you can ask human captcha solvers to select the zebra crossings and traffic lights :))
I am not a robot
My country doesn't have that rule. The road is for everyone.
Smoothest ad transition in the game.
Ikr
They almost got me I paused the video after 2 seconds. i'm out.
i hated the transition
Cindy's a Pro through and through!
I love it when she makes her cameo appearances.
Sponsorblock extension
I think that completely removing radar is rather foolish. Machines are meant to surpass our abilities, not match them. If I cannot see a deer on the side of the road with visible light, but a heat camera could, i want that heat camera.
All consumer cameras, like the one on your phone, have an IR blocker; basically it blocks IR light to produce a human-looking image. Tesla has removed that blocker so the camera can see in IR, aka the heat signature. You can test it with your phone: open the camera app and point it at the front of a TV remote, then click a button on the remote and you'll see a light through the phone camera but not with your eyes. Cameras + Neural Nets is the ONLY way for autonomy.
I was in the alarm industry for decades and used every evolution of detection sensor across the electromagnetic spectrum, and I knew the problems with each. The other guys just wired them up without a clue how they worked.
ruclips.net/video/TTXCcacdqz0/видео.html
This video basically explains why Tesla removed radar: it confuses the OTHER sensors, because if one sensor is saying "OBJECT IN FRONT" and another is saying "OBJECT CLEAR", which sensor do you believe?
@@TypicalBlox the one that sees it, obviously. As the video you linked suggests, the computer should have cross referenced the two images. If tesla plans to make cars capable of solving this problem with cameras alone, then the system was already capable of avoiding this incident. Radar is only an addition, not a subtraction
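As an aside, "which sensor do you believe" doesn't have to be an all-or-nothing choice. A common textbook approach is to weight each sensor's estimate by how noisy it is; the sketch below is purely illustrative, with made-up variances, not Tesla's actual fusion logic:

```python
def fuse_ranges(r_cam: float, var_cam: float, r_radar: float, var_radar: float):
    """Inverse-variance weighting of two range estimates for the same object."""
    w_cam, w_radar = 1.0 / var_cam, 1.0 / var_radar
    fused = (w_cam * r_cam + w_radar * r_radar) / (w_cam + w_radar)
    fused_var = 1.0 / (w_cam + w_radar)   # fused estimate is less noisy than either input
    return fused, fused_var

# Camera thinks the car ahead is at 48 m (noisy), radar says 45 m (precise):
# the result lands near the radar value instead of discarding either sensor.
print(fuse_ranges(48.0, 4.0, 45.0, 0.25))
```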
Except you only have 2 eyes. A Tesla car has more all around it, and they all receive input and can act on that input faster. Also, cameras can see more than a human could. Trust me. It’s surpassing your abilities.
"Apple's rumored smartcar is said to be looking for the sensor". Actual line from this video.
Never heard something so speculative.
Yeah, the video is pretty good but tries to dramatize stuff too much
Yeah, you gotta give em one thing: that smartcar is already pretty clever if it can look for sensors all by itself.
@@seybertooth9282 When I saw that part I thought "oh, how does that work" before remembering there's a chip shortage
Maybe it can use LIDAR to find it.
Please let people see this comment. The reason Lidar is so hated by Elon is that he has already sold one million+ "self driving" cars that do not have LiDar built in, just cameras. He hates it because he didn't foresee it being so useful and it's too late for him to use it
idc as long as it works well n affordable
Lidar gives an exact measurement of distance. Camera vision has to interpret depth and distance. Tesla also uses ultrasonic sensors for close maneuvering like parking, because cameras are not enough. Using cameras and software alone will probably be as or more expensive than using cameras and lidar together. You wouldn't want to be in an aircraft that uses cameras only. Most moving robots use lidar -- and an autonomous vehicle should probably use it, too. It's okay for Musk to make fun of lidar as long as he doesn't have a working autonomous vehicle.
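To make "interpret depth" concrete: with two cameras, depth follows from the disparity between matched pixels as depth = focal_length × baseline / disparity. A minimal sketch with made-up numbers (real systems also need calibration, rectification, and pixel matching):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres of a point seen by two rectified cameras."""
    if disparity_px <= 0:
        return float("inf")   # zero disparity means effectively at infinity
    return focal_px * baseline_m / disparity_px

# e.g. 1000 px focal length, 12 cm baseline, 4 px disparity -> 30 m away.
# A 1 px matching error at that range shifts the estimate by several metres,
# which is exactly the precision gap versus lidar's direct measurement.
print(stereo_depth(1000.0, 0.12, 4.0))
```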
Ultrasonic sensors are crap, which is why they are no longer used in any security system.
@@toriless lidar isn’t ultrasonic.
@@toriless lidar aint ultrasonic dude 🤦🏽♂️ Its the damn light
Musk just wants to differentiate Teslas. The reality is cameras + computer vision is still not as scalable and reliable as old-school LiDAR. For instance, a LiDAR car can be driven in any country without "training". Meanwhile, a Tesla that drives in Asia would have to be fed with "training data" in order for it to adapt to the new environment.
Time has proven you right, I would say.
If they’re using lidar as a base to test and “train” their systems, that would suggest to me that lidar is superior to cameras in self driving vehicles. 🤔
Elon's case was that it was expensive. If they can match the performance with cameras then it will be an overall more affordable package. It's like atomic clocks. Extremely accurate. But quartz is more than sufficiently accurate to tell time and significantly cheaper.
@@ArianrhodTalon for safety, is it ever enough? I can see computer vision approaching the human safety level, but how close? That alone will never surpass human driver safety in the end
@@kenshee627 That will be up to Tesla to prove that their technology can match up. They chose to go down this route, but it will be the consumers that will do the voting with their dollars.
Not really. Deriving depth data and mapping that data onto a 3D representation of the surroundings is much easier to do with Lidar than vision, which makes it seem like the surefire move to go 100% self-driving. Tesla, however, is ahead of the game in that regard and, due to their software advantages, have already more or less figured that tech out, as they have shown multiple times. LIDAR has its own shortfalls that vision doesn't, and thus needs vision to solve self-driving. Those include Lidar data not giving any tangible information other than depth, leading to many restrictions regarding weather usability, for example (a snowflake has the same "signature" and opacity as a brick wall). Thus, as said in the vid, even if you understand how far everything is away from you in a Lidar dot map, you still need to figure out what is what and how to navigate those obstacles correctly using vision. The car still needs to understand that, for example, a plastic bag is a plastic bag and not a stone, which Lidar cannot do. So focusing on vision 100%, rather than wasting time building and implementing additional sensors, is the way to go if you truly believe you are capable of reaching that level of autonomy. Using Lidar as a training device is more or less there to verify the findings of the vision system and doesn't mean that it is superior per se. As Musk said, if you can drive pretty well with 2 slow cameras as we do with our eyes, then it should be possible with 8 super-high-refresh cameras.
@@Omakhara well, you only give the example of LIDAR. I didn't mean only LIDAR; my concern is that Elon only wants pure vision, which means removing radar, as is happening in the new models now. There are newer versions of radar which can overcome weather conditions as well now, and that could be very helpful in most of the accidents which have happened with Teslas
And sure, there are some cases which both vision and other sensors can't overcome, but there are a lot more which vision cannot overcome alone
"It's too expensive" says the car buyer while considering ev's future and before even considering the fact that the tech is getting better and more affordable with time. Saying "it's too expensive" even goes against Tesla's own style. Given the fact that Tesla bets on a future where they can build batteries with very high energy density.
it being too expensive was just the last deterrent in a long list of very good reasons why LiDAR is dumb and should only be used as a validation tool. Unless you want to carry a full-on gaming PC in the car 24/7 just for processing the not-so-high-def LiDAR data
@@Supreme_Lobster didn't Elon Musk say something like the Tesla Model S can run Cyberpunk with PS5-like performance? So I'm pretty sure Teslas already have the processing power lidar requires.
People will pay if it's worth it. A level 5 system with no limitations is easily worth $200k+ excluding the vehicle cost. But Tesla's $10k cruise control is already pretty expensive for the benefits and a $30k LIDAR system isn't going to give you much more benefit. The issue isn't cost, it's that none of the technology bridges the gap that makes a human unnecessary.
Too expensive is indeed a bad argument, not only for advancing technology but economies of scale as well
You can get LIDAR for under a grand; his cars were 50 to 100K (pre-COVID car price insanity) and even then they are always falling apart. I know of no happy Tesla owner and many angry ones. Look it up. I will still be driving my Toyota in decades. Hopefully his company will be like Theranos by then.
Tesla uses LIDAR to tune their vision software, they compare their functionality against LIDAR to make sure the results match and are reliable.
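A rough idea of what "comparing against LIDAR" could look like in practice: treat the lidar range as ground truth and score the vision estimates against it. The numbers below are invented for illustration, not Tesla's validation data:

```python
import numpy as np

# Per-object distance estimates from a vision stack vs. lidar ground truth (metres).
vision = np.array([30.2, 45.1, 12.8, 60.5])
lidar = np.array([29.8, 44.0, 12.9, 66.0])

abs_err = np.abs(vision - lidar)
print("mean abs error [m]:", abs_err.mean())             # average disagreement
print("worst case     [m]:", abs_err.max())              # outliers matter most for safety
print("share within 5% of lidar:", np.mean(abs_err / lidar < 0.05))
```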
the only good use of LiDAR: as a validation tool
yes, that is what the video said
If it’s good for validation, then it’s good for training. Unless Lidar affects the processing speed or is too slow to work in real time, which I don’t think it is, then it doesn’t hurt to do so. Musk is prob going to be wrong here.
So because ... lidar is ... better?
@@justmoritz because radar is slower but I think both of them are similar
how can you rely on one system only? You need redundancy to be safe
There is redundancy. They have 8 cameras and 2 separate computers.
@@crusherman2001 Well but if there's a flaw in the camera system, more cameras wouldn't solve the problem I guess
@@danielwaldner7121 Even with Radar and Lidar, a camera malfunction would still stop the system working
@@jackwhitlock1 In most new cars with radar and camera systems, both systems can initiate emergency braking; the CAS cameras and radar mostly act independently of each other.
If they disagree which do you trust? Cameras....LIDAR just has very poor resolution, latency and data density.
I still don't know why he (Elon) says it's a "fool's errand", isn't it better to have more sensors than less? Especially with the new solid-state LiDAR systems.
No. The data from the sensors can interfere with each other. When you have two or more sensors telling you two different things, which one do you believe?
He’s saying it’s a fools errand because it gives you a false sense of progress. It’s easy to build a demo with LiDAR, that’s why there’s dozens of startups doing it. But you need to solve the computer vision problem too for true self driving. You need to be able to understand that that thing on the street is a plastic bag and not a raccoon. LiDAR will give you a depth map but you need to know what the items are at those depths anyway. And vision has been shown to be pretty accurate for distance already.
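The "plastic bag vs. raccoon" point can be made concrete: even with a perfect depth value, the braking decision hinges on a class label that only vision provides. A toy sketch with hypothetical classes and thresholds:

```python
# Classes the vision stack is allowed to drive over or through.
HARMLESS = {"plastic_bag", "leaf", "steam"}

def should_brake(label: str, depth_m: float, braking_distance_m: float = 35.0) -> bool:
    """Brake only for objects that are both close and not known-harmless."""
    return depth_m < braking_distance_m and label not in HARMLESS

print(should_brake("plastic_bag", 10.0))   # False: depth alone only says "something is close"
print(should_brake("raccoon", 10.0))       # True: same depth, different label, different decision
```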
@@minhazm And yet somehow Google has made it work.
Also costs. Less parts is better than more parts. Less complexity and cheaper cars to produce.
@@SdoubleA No, they didn't. That's why Waymos availability is very limited.
Thank You Cindy!
I don't have LIDAR in my head and I can drive just fine.
But imagine how much better of a driver you'd be if you had LiDAR.
@@JJs_playground but it's expensive
@@apacheattackhelicopter8778 sure, but if it starts being built at scale it'll get less expensive. And there are companies working on solid-state LiDAR, so you don't have those spin-y looking things on top of your car.
@@JJs_playground homie. just slap some cameras on your car.
You probably also have walked into a glass door
This to me is just... astronomically dumb. When human lives are on the line, redundancy is a must. That's why we have seat belts and air bags. If one fails, temporarily or permanently, the other system can take on the main load, until maintenance can occur. Right now if the one system fails, the entire car fails.
The real reason is: it's hard. That's the reason why they have now even dumped radar. It's not about the price of lidar units, as they have been and are still going to radically drop in price. However, sensor data fusion is really hard and Tesla just doesn't have the expertise.
I believe it is about the price. If you have two Teslas to choose from, one is 40k USD, the other is 50k USD, has a big bump on its roof and lower range, and is said to be 10% safer, IMHO most consumers would choose the cheaper, prettier and more efficient option, not the more expensive and safer one.
@@DB-gh4nj But the best part is no part. Radar is not nearly as hard to work with in automation systems as video. I didn't work with radar, but I did some work in computer vision, including depth and thermal cameras.
All of the sensors are pretty much the same from a difficulty perspective: the less feature extraction you have to do, the easier it is. And you try to extract "radar data" (distances) from cameras, so I would say cameras are harder than radar, since you have to have feature-extraction software.
@@Tondadrd Well, obviously you have no clue about lidar. You don't necessarily need 360° spinning lidars. You can use solid-state lidars which don't need to be mounted on the roof. Additionally, the price drop of lidars was and will be enormous. So neither the form factor nor the price will play a role in the future
@@Tondadrd well, it isn't really hard to extract depth information from cameras either. I actually have quite a lot of experience with that. I mean, especially with multiple cameras it gets easy, but you can even do it over a time series with just one camera. But that was not the point. The hard part isn't using radar data or camera data or lidar data but combining those. The data fusion is hard. What do you do when your radar says there is something but your camera says there is not, and vice versa? That is the really hard part
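One standard way to handle "radar says something, camera says nothing" is to treat each detection as evidence and update a probability using each sensor's hit and false-alarm rates. The rates below are assumptions for illustration, not real sensor specs:

```python
def fuse_detections(prior: float, radar_hit: bool, cam_hit: bool) -> float:
    """Naive-Bayes fusion of two binary detectors into P(object present)."""
    # Assumed sensor models: (P(detection | object), P(detection | no object)).
    models = {
        "radar":  (0.90, 0.10),   # works in fog, but clutter causes false alarms
        "camera": (0.95, 0.02),   # sharp-eyed, but blind in bad visibility
    }
    p_obj, p_no = prior, 1.0 - prior
    for name, hit in (("radar", radar_hit), ("camera", cam_hit)):
        p_hit_obj, p_hit_no = models[name]
        p_obj *= p_hit_obj if hit else 1.0 - p_hit_obj
        p_no *= p_hit_no if hit else 1.0 - p_hit_no
    return p_obj / (p_obj + p_no)

# Radar fires, camera stays quiet: the conflict resolves to a probability, not a coin flip.
print(round(fuse_detections(prior=0.01, radar_hit=True, cam_hit=False), 3))
```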
I have a pure-vision one, just got it. Some features are off, like stop-light stopping on Autopilot, etc. Waiting to see how it'll update. Pure vision has 3 more cameras by the mirror looking at the road.
at this point, the majority of the problems are not with perception but with decision making. LiDAR does nothing for decision making other than making it more difficult and slower (because more data takes more time).
Nonsense, Apple is already putting lidar on their devices and I'm sure Apple is developing an even better/cheaper and more efficient/powerful lidar sensor.
@@alanmay7929 so?
Look, if you knew every single measurement of the road you are on, you wouldn't be a better driver, at all.
LiDAR only solves the problem of not wanting to solve the Vision problem.
LiDAR does not do anything for decision making, it's just a sensor.
@@Supreme_Lobster every tech has its pros and cons; there is no perfect one.
@@alanmay7929 no, but LiDAR is counterproductive in every measurable dimension
Musk, I'm a fan of everything you do, brother, but I think partial lidar assist would up your effectiveness, especially in fog and snowfall, even if it's just forward-facing
Musk is an idiot lol
Elon musk doesn't like it because someone who actually knows about tesla vehicles told him that. I mean he pretty much has nothing to do with the vision, tech or really anything other than being in the spotlight.
4:44 Marques is on Skillshare!!
I want top level safety using multiple redundant ways to detect and verify surroundings. Without that I won't let a computer make decisions as it can potentially kill me. I want cameras, lidar, radar, the whole nine yards. And if the computer is absolutely certain that no matter what happens, snow, rain, dark, it will be bullet proof, then I will happily hand over the steering wheel.
Agree, can you imagine flying on an airliner that solely uses visible-light cameras, even for landing at night or in fog? Why not leverage technology and its abilities to reduce risk in areas where visible-light imaging is weak? Musk has a real blind spot here in trying to create a system that acts like a human vision replacement.
Tesla spotted with lidar is hyped gossip. Musk himself said they use lidar to train/calibrate the vision system. Road infrastructure is designed for human vision, and Musk may be right saying he is hardware-ready for level 5, but in the much longer run he may be forced to correct himself if systems with other modalities prove themselves safer still. The issue is that once cars become much safer, any one accident will be much rarer and therefore more scrutinized (more like an airplane crash today, with some kind of black-box cloud record of the crash). The issues involved by then can be very subtle and the solutions are now hard to foresee, certainly past the boundaries of human driving capability and perception, which puts Musk's argument in question already today.
Will the car be banned from making calls when it’s supposed to be concentrating on driving?
It's a computer, it can focus on multiple things at once
@@brixan... pretty sure it was a joke
@@cuppajoe2 I agree
I wonder how Teslas will handle fog, heavy rain, glaring sun, or extreme hail / snow.
Probably safer than humans do, as, when presented with such conditions, we humans will generally say "Screw it. Off we go!" whereas Tesla's system might say "Screw it. The conditions are too dangerous. Wait a bit and try again.".
You don't have those in Cali so we will never know
In this context, I think those things would cause as many problems for LIDAR as for cameras.
@@russellthorburn9297 ye sure but worse than lidar thats the point
@@moon-pw1bi Not really, LIDAR could get screwed over quite badly by rain or hail interfering with the light bouncing off. Vision would likely be better in this scenario. Fog would be debatable.
Lidar is a technology that calculates the distance between the sensor and objects, and eventually maps out 3D models.
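Concretely, lidar ranging is a time-of-flight measurement: the sensor times a light pulse's round trip and halves it. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target from a lidar pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A return arriving after 200 nanoseconds corresponds to a target roughly 30 m away.
print(tof_distance(200e-9))
```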
Idk Elon I was driving the other day and my Tesla said one of the cameras was blocked probably by condensation or something. It didn't go off for miles.
He's obviously not stupid, but my god... absolutely dumb to not integrate lidar. Teslas get into accidents all the time because the cameras get blinded by bright sun or cannot see clearly. There was a deep dive recently into the safety of Teslas and camera-based self-driving systems - the results were not great.
In extremely sunny and bright areas, the sun glare has literally blinded the cameras, and we've seen videos of Teslas just full-on slamming into things that were bright white, or on sunny days, because they couldn't see them
Now, how can lidar distinguish a stationary bridge vs a flipped truck in front of you while you are traveling at 60 km/h?
Microwave imaging can "see" through things other wavelengths can't, and vice versa
But human drivers can't.
Since while driving, knowing the heuristic directions, the rules, and obstacle avoidance is enough.
Having the map is a plus.
Having a view of every nook and corner is not necessary.
So what happens in low visibility? Say heavy rain or fog?
Lidar cannot see in rainfall
^False; depending on what type of LiDAR setup you're using, it could be perfectly fine.
Pure vision doesn't work in fog or heavy rain. Pretty big step backwards just to save a grand or two.
How do you know. It hasn't even come out yet.
@@ztechrepairs I guess we'll wait and see how they program for it, but the cameras aren't that great a quality to begin with. ruclips.net/video/sbKc3nxNYtA/видео.html
LIDAR confirmed doesn't work in fog and rain. You're assuming cameras can't use something like night vision for those conditions.
Vision does work in those conditions as it’s done it before. Lidar can’t work in those conditions. Cameras can.
Humans use vision to drive in fog and rain just fine
1:06 “You will see.” Just got chilled there
I will never let a car without lidar or radar drive me faster than 30mph.
Cindy you missed this.
Elon has said he's not against LIDAR in general. They use LIDAR on the SpaceX Dragon capsule to autonomously dock with the International Space Station! He said that LIDAR in self-driving cars doesn't make sense. I really think he knows what he's talking about!
Just a question. Camera data processing involves a lot of computation, especially since a lot of ML is needed, like CNNs, while radar signal processing is relatively less complex. Then why does Tesla still believe that cameras are the future? What advantage do we get by using sensors that need more complex signal processing?
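To put rough numbers on that: a single convolutional layer already costs orders of magnitude more multiply-accumulates than an FFT over a radar chirp. A back-of-the-envelope sketch with made-up layer sizes (not Tesla's actual network):

```python
import math

def conv_macs(h: int, w: int, c_in: int, c_out: int, k: int) -> int:
    """Multiply-accumulates for one stride-1 convolution layer."""
    return h * w * c_in * c_out * k * k

def fft_ops(n: int) -> float:
    """Rough operation count for an n-point FFT (n log2 n)."""
    return n * math.log2(n)

# One modest 256x256 conv layer vs. a 4096-sample radar range FFT.
print(f"conv layer: {conv_macs(256, 256, 64, 64, 3):.2e} MACs")   # ~2.4e9
print(f"radar FFT : {fft_ops(4096):.2e} ops")                     # ~4.9e4
```

The trade-off the question points at is exactly this: cameras carry far richer information, but extracting it costs far more compute than reading a range off a radar return.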
If Lidars can map out the surrounding area, why does Google have to map out the routes STILL? Interesting thought...
Google Maps uses satellites to produce the top-view image, but that's it. Someone has to verify on the street if that's really where the street is. That's where Street View cars came in.
It's going to take them a long time to geofence the entire world. Tesla Vision is easier to adapt to changes too.
@@thestudentofficial5483 Yeah, probably on garbage collection day.
@@pwells10 Hehehehe, they already have. You obviously have not tried any of their advanced features.
This means that Tesla is not really in the car business but the AI business. Cars are just the proof of concept.
👏
LIDARs are excellent in the development stage to test and teach cameras - to check whether they correctly determined objects and the distances to them
You couldn't find a picture of a model y?
Good thing that everything Elon promises turns out exactly how he promised it, he’s got the cleanest record out there.
Ehm
No he doesn't but it is also true that he has actually done quite a few things that people said was impossible. Like land and reuse rocket boosters ten times.
Lol poor take
LIDAR is just for the AI to have labeled data, similar to touching and object to confirm what you are seeing.
If they can make LIDAR work in rain and snow, then I’ll believe it.
True. It cant do that.
@@curious_one1156 Around here, that leaves about 20% of the time left.
It still can, actually. It's just less effective since there will be tons of unnecessary data, such as atmospheric (rain, snow, etc.) lidar returns, that the computer needs to process and separate from real object data. That's where 3D sensor fusion (combining all sensors) comes in handy.
3:16
SEXY
Yes
The Tesla Model 3 was originally called Model E but the name was changed because Ford holds the rights to the name.
lmao on the Singapore version of the Tesla site they do not show the Model Y so it just says: "SEX"
I am always laughing on the "S3XY" line up. XD
And when one or more cameras are obscured by slush, rain, mud, frost...then what? Back to driver control.
There are eight cameras, how many eyes do you have?
If the cameras are obscured by mud or frost or rain, neither radar nor lidar will save you; a car that uses Lidar/Radar uses them *on top* of computer vision. CV is still needed to see the road and know where to go. If the car can't see the road, the car still can't drive, lidar or no lidar.
@@tribalypredisposed it's accurate, Tesla isn't using it because it's expensive, Tesla is using cheap parts
@@AkshayKumar-vg2pi , there are a lot of reasons to not use LIDAR, and price is just one of them. Durability is another, the issues LIDAR has in rain and snow is another, and the most important one is because they do not need LIDAR to achieve level 5 autonomous. Tesla's goal is to accelerate the transition to EVs, and keeping costs down is important. Beyond that, a basic principle of good engineering is that no part is the best part. Reducing the total number of parts has always been something Tesla engineers strove for, so adding LIDAR when it is not required is not going to happen.
I want sam’s concept car, it looks amazing 🤩
How does Tesla keep the camera lens clean enough to always see the surroundings especially in bad weather and dirty roads?
In addition to that, LIDAR makes the car look hideous, like a 1980s-to-early-1990s police car.
I would bet that if "integrate Lidar" was part of the vehicle design specification rather than something bolted on, it could be made to look good. The Apple Car sensor backpack already looks acceptable. If it was incorporated into the roofline, it would look like cool alien technology from a science fiction movie.
Solid state lidars are much more compact than the traditional mechanical LIDAR designs. Multiple solid state lidar units can be spread around the vehicle and hidden (e.g. under headlights)
Lidar is on the iPhone 12 Pro.
@@nathanlewis42 It is a SPAD operating in the near infrared; automotive LiDAR operates at longer wavelengths using InGaAs semiconductors. They are both time-of-flight, but there is a huge difference in capabilities.
3:08 that's a Model X, not a Model Y.
As of June 2021, Tesla's codebase still uses if-else statements to handle corner cases.
And you know this how?
Assuming this is true, what's wrong with using conditionals? if (ai_thinks_this) { doThis() } else { doOtherwise() }
@@silicitimmm There are much more sophisticated techniques in software design for handling these kinds of scenarios that are far more scalable. I highly doubt this is actually true, considering Tesla is a multi-billion-dollar company with practically infinite resources as far as I'm concerned.
@@silicitimmm It's true because I wrote it; my boss approved it and the team knows about it. There's nothing significantly wrong with it - it's just man-made software, and computer vision and artificial intelligence can never be 100% intelligent. No published paper on AI in history has ever been perfectly accurate, so human intervention in corner cases is inevitable. That's the limitation of not receiving physical signals from the real roads, which LIDAR can easily take care of. And Musk's interpretation of how human eyes work is simply wrong and too naive.
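To make the exchange concrete, here is one hedged sketch of the kind of "more scalable" structure the earlier reply alludes to: a registry of corner-case handlers instead of a growing if/else chain. Everything in it (the case names and handler behaviour) is invented purely for illustration, not taken from any real codebase.

from typing import Callable, Dict

handlers: Dict[str, Callable[[dict], str]] = {}

def corner_case(name: str):
    # Decorator that registers a handler in a lookup table, so adding a new
    # corner case does not touch the main control flow.
    def register(fn: Callable[[dict], str]) -> Callable[[dict], str]:
        handlers[name] = fn
        return fn
    return register

@corner_case("faded_lane_lines")
def handle_faded_lines(scene: dict) -> str:
    return "follow_lead_vehicle"

@corner_case("construction_zone")
def handle_construction(scene: dict) -> str:
    return "hand_back_to_driver"

def plan(scene: dict) -> str:
    handler = handlers.get(scene.get("corner_case", ""))
    return handler(scene) if handler else "normal_driving_policy"

print(plan({"corner_case": "construction_zone"}))  # hand_back_to_driver
print(plan({}))                                    # normal_driving_policy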
It's not a question of whether it can be done - humans do it, so it can be done. The issue is whether you can fit the amount of CPU horsepower needed to do it into a car. I'm not going to say one way or the other right now, but for sure it will be the case some day.
Yes, you can. In fact, having more sensors requires way more processing power, especially if you're using LiDAR. Processing power is definitely not an issue - Comma.ai is literally doing partial self-driving using an off-the-shelf Android phone.
@@harsimranbansal5355 If you were just joking, ignore the rest of this.
It seems you don't understand the actual problem space. That's fine - most people haven't worked in AI and just believe the weird stuff RUclips/media claim about it.
LiDAR is not relevant. It doesn't help at all in the actual problem space. Its use is not for driving the car; it's for a safety override in case the AI that drives the car gets it wrong. It's fine for test vehicles, but in the real world it doesn't work because of all the false positives you get. If you're interested in the topic, here are some suggestions of things to get under your belt to get started on the subject.
Look up the difference in CPU power between a computer reproducing a picture from a camera and the power/time needed to understand what the picture is actually of.
Look up how LiDAR actually works. To get around the false-positive problem, they are feeding the LiDAR output into an AI so the car can at least get down the road. This is because LiDAR takes fancy pictures with extra info and does nothing to help with the understanding problem. So, self-defeating, really.
Look up the impossible tasks of driving (for humans or AI). For example, as the roads are laid out, it is impossible not to have crashes.
Look up Tesla's CPU capacity, which is 100% used, and compare that to the 80/20 rule. Sure, you could say that 14% of trips will not end in crashes, but that's nowhere near good enough. If you brought the maximum speed of everything down to, say, 3 mph, maybe they could get the crash rate below 1% with existing hardware in a reasonable time frame. Anyway, once you get that stuff under your belt, you'll at least start to get an idea of why a Tesla sometimes wants to head for a guard rail, and why fixing that isn't as simple as saying "tell it not to crash into guard rails."
Hey Cindy, for your next video you could do one on Tesla battery technology or different battery technologies, another on the engineering design and capabilities of Starship, or one on OpenAI. Thanks.
LIDAR is an easy way for a startup to call itself an autonomy company and get a bunch of funding.
super smooth sponsor transition! loved it! well done!
It's an extra $10,000 for self-driving on a Tesla. When I get mine one day, I would rather get the longer range than self-driving, or just save $10,000 on a Standard Range Model 3. I am more than capable of driving it myself. Heck, I already feel like I'm on autopilot when driving to work 😂.
This is fatally untrue. No matter how good you think you are, an AI will always do better.
Elon Musk and Tesla are jumping to vision-only early because the limited supplies / computer chip shortage is affecting their ability to add radar to their vehicles now (and, in turn, to keep prices lower).
I remember when webcams were 0.3 megapixels and cost $100, and almost everyone had one. Today a 5 MP camera is considered crap, and a $100 cellphone can have 12 MP. LIDAR may be expensive today, but it will follow the same cost curve as cameras - the prices have already dropped many times. Having $1,000 worth of LIDAR on a car is the reality today; in the future it will be $100. It is, after all, superior to cameras and hence safer for people. I'm sorry, but cameras will end up as dash cams.
12 KB of RAM was once said to be enough on a PC; today 6 GB (6,000,000 KB) is a weak computer. Musk is onto something, but mostly he's hyping how his cars can do without it. In the end, he knows he is selling enough EVs to get a fair share of the market. It also won't take him long to say, "Hey, I was wrong - the next Tesla 10 will have LIDAR."
Right now Musk sells what he already has in production. It's enough as is.
I think LiDAR, like all tech products, will get cheaper every year as it scales. I really want to buy a Tesla for my next car, as they are currently a far superior product compared with the competition. We shall see in four years who offers the safest and most relaxing driving experience. I think Elon will change his opinion once the price/performance of LiDAR improves.
What if it’s really rainy? Really sunny? Snowing? Your car gets muddy and the cameras are covered?
exactly
Weird to think LIDAR is in phones now.
Sort of, it is AI LIDAR, not true LIDAR which scans.
Elon chooses not to use LIDAR, and dropping radar is about reducing cost. If the camera AI cannot recognize what's in front of it, thinks the way is clear, and proceeds at full speed, the resulting accident or collision becomes deadly. A safety-related system without built-in secondary redundancy verification is just stupid.
If Elon says something is stupid, then clearly it isn't.
Yeah, Tesla's cameras already suck. LIDAR would genuinely be very useful.
What very few understand is that level five autonomous technology is half of what one needs to compete in autonomous taxis. The other half is a huge fleet of durable energy efficient relatively low cost EVs, and Tesla is the only one who has that. It does not matter if Waymo and Cruise both achieve level five three years before Tesla does, Tesla will still end up owning the autonomous taxi industry.
Tesla is not the only one that has that
"durable energy efficient relatively low cost EVs"
Replacing the human driver is the real deal. That is the real cost.
Correction: Elon Musk is not a fan of LIDAR being used in cars.
The car already does well, and it keeps getting better.
Yes! They use LIDAR on the SpaceX Dragon capsule to autonomously dock with the International Space Station!
@@JayPatel-ug1nh Exactly!
3:12 Both are Model X
You’re right - my bad!
I have an upmarket dashcam that is supposed to warn me if I am drifting out of my lane or if I am too close to a car. On average it warns me that I am drifting out of my lane twice every 10 kilometres - both times as I pass a right-hand turn lane. The "travelling too close" warning varies with the colour of the car. No way would I want to sit back and rely on this as part of a self-driving system.
At least we can play cyberpunk!
@Green Mamba Games yes BUT only if it was JUST A PC
which ain't true, it's also the fastest 5 seater car, it's the new drag race champion, it's electric ⚡
Even overhyped AI can't remotely compete with the human brain's pattern-recognition ability. That's why we can drive safely even with our poor eyesight. Autonomous vehicles have to compensate by relying on other sensor inputs - and LiDAR provides that data overlay. Tesla cannot afford to retrofit its lineup, so they are doubling down, but new EV makers (including legacy brands) don't have this issue. Sometimes it's strategic not to be first - which is how Apple typically wins in tablets, watches, etc.
You are literally betting against AI technology, which is a backwards thing to do right now.
@@UnknownPerson-wg1hw I am an AI investor, so not betting against it. Still, AI isn't magic - it relies on data inputs, and removing an important LiDAR data overlay greatly diminishes output predictions. As for AI - which is still really an expert prediction system - it is orders of magnitude away from the human brain's pattern recognition. For instance, potholes vs. paint patterns on a road. Read Kurzweil.
Two cameras lined up can estimate relative distance. Plus, LIDAR can fail to measure in fog, rain, and snow; a good AI with cameras plus radar can even go further than LIDAR.
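The two-camera claim above is just the standard stereo relation: depth = focal length × baseline / disparity. A minimal sketch with made-up numbers, not tied to any particular car:

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    # Smaller pixel disparity between the two views means a farther object.
    return focal_px * baseline_m / disparity_px

# e.g. a 1200 px focal length, cameras 0.3 m apart, 9 px of disparity -> ~40 m
print(stereo_depth(1200.0, 0.3, 9.0))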
What happens if there's fog? Some places have dense fog. Would the car keep up if you were caught driving in a bad snowstorm or heavy rain?
His ego will completely destroy him.
What about rain, snow or sand on those cameras? Radars etc don't get distorted like a camera lens from stuff like that.
Rain and ice can have a massive impact on Radar
Never corner yourself with just one tech.. prices will change.. no harm having both..
I drive my Tesla almost every day, and I don't depend on Autopilot in my area, which is pretty much the hood (the cities of San Leandro and Oakland, California). We have too many ghetto drivers who drive recklessly all the time here, so you really need to keep your eyes open and watch out for other cars. We have people who try to race, run red lights, and cut others off while looking at their phones at the same time. The drivers in the hoods are the ones you always want to watch out for. I would drive a tank or a heavy-duty Hummer if I had to, just to protect myself from these crazy drivers.
You better pre-order a cybertruck!
And now for the REAL reason Elon hates Lidar: Lidar stole his girlfriend in college. He never got over it.
NOW THAT SOUNDS RIGHT.
3:10 The car on the bottom is a Model X, not a Model Y. You can tell from the two door handles being right next to each other, plus the shape, headlights, and wheels are wrong for a Y.
They mentioned their mistake in a pinned comment.
Good one. You cover a wide range of interesting topics. Tesla is a good company with innovative cars. Keep it up.
Elon's high horse is going to make his cost-saving measures expensive in the long run.
LIDAR is great for an emergency braking system. It's completely useless for actual self-driving. It can't read signs, it can't see road lines, it can't understand lights, it can't recognize people. It's a great shortcut if your end goal is a Level 1, maybe Level 2 system, where the most you really want your car to do is hit the brakes and not crash into things.
If your goal is full self-driving, Level 4 or 5, then you're going to have to develop intelligent camera systems anyway. By the time the camera AI systems are good enough for Level 4 or 5, LIDAR will have long since become redundant.
wdym lol? it can read signs and lines
It's brave, considering that in low visibility the car will be just as visually impaired as we are.
Less so, as post-processing can be applied to video in ways we can't match ourselves. E.g., Tesla will apply post-processing to reduce gain in high-brightness situations. To do that ourselves we'd need to squint, pop on sunglasses, or flip down the visor - all of which take far longer than a fraction of a second, and even then aren't as good as post-processing.
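As a minimal sketch of the "reduce gain in software" idea mentioned above (the target brightness level is an assumption for the example, not anything Tesla has published):

import numpy as np

def auto_reduce_gain(frame: np.ndarray, target_mean: float = 110.0) -> np.ndarray:
    # If the frame is over-exposed on average, scale it down before later stages see it.
    mean = frame.mean()
    if mean <= target_mean:
        return frame
    scale = target_mean / mean
    return np.clip(np.rint(frame.astype(np.float32) * scale), 0, 255).astype(np.uint8)

glare = np.full((4, 4), 240, dtype=np.uint8)  # simulated blown-out frame
print(auto_reduce_gain(glare).mean())          # pulled back toward ~110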
Normal cameras might be sufficient to make self-driving vehicles just as good as human drivers. LIDAR and radar can make them even better than humans, especially in low-light and foggy conditions.
No, a machine with human-like vision would drive much more safely. People don't cause accidents because lack of vision makes them inevitable; they cause them because they don't pay attention, take unnecessary risks, fall asleep, or drive drunk. I don't have any statistics to back it up, but these factors probably cause 80% of accidents.
@@anonanonkiewicz1921 Okay, fair enough - there's also reaction time, and the fact that there are blind spots behind pillars, etc.; cameras can have full 360-degree vision.
But still, thick fog and low-light situations pose the same problem for normal cameras as for human vision. LIDAR or radar could improve on that.
I'll give you one example: the cheetah is the fastest land animal, and it uses legs to run. The wheel does not appear in nature. By Elon's logic, we need to solve leg-based locomotion; the wheel is a fool's errand, and anyone depending on wheels is doomed.
I'm glad LiDAR has finally grown on Musk. Now that it is affordable, it makes WAY more sense.
What I don't get is that everyone with a recent iPhone Pro or iPad Pro has a LIDAR sensor. I mean, it only has a depth range of about 5 meters, but still, the sensor can't cost more than $10.
Why wouldn’t similar tech work with cars?
Because LIDAR in phones can't see more than about 5 meters, while LIDAR in cars needs to see 200+ meters.
Anyone see something weird at 3:16?
s3xy
The biggest point about using cameras is that they provide future revenue. LIDAR may be the easier technology to use for self-driving, BUT cameras can watch the world around the car, not only the road. This way Tesla would have huge amounts of data on everything that happens close to roads, and it would be updated automatically. Just imagine what you could do with all that data. Cameras provide massive amounts of data LIDAR can never capture: 24/7 surveillance, and the customer is paying for the equipment. Let's see how they use all this data...
The fact is, Luminar is only selling equipment to customers that are really interested in the technology, since their parts are in high demand.
Elon Musk does not hate LIDAR. He even said this publicly during their Autonomy Day presentation event, as well as on a few other occasions. He and his engineers at SpaceX developed their own LIDAR system, so they know its pros and cons well. The only reason LIDAR is even being used is because there isn’t yet a pure vision system that can do what humans can with their two eyes and brain. So solving pure vision only FSD is the ultimate goal, which is why Elon/Tesla are taking the challenge head on instead of relying on expensive cheat devices/hardware add-ons that don’t actually help achieve the main goal in the long run. LIDAR is like taking a powerful pain killer to make the pain in your body go away, but what you really need is to fix what is causing the pain in the first place.
Let's hope no one makes a "shot on iPhone" meme out of this in the future.
The fly doesn't use LIDAR, the lion doesn't use LIDAR, the falcon doesn't use LIDAR, humans don't use LIDAR.
Never mind LIDAR - how about that Mustang sketch!!!
If people can set up LIDAR on their current cars, this will take a large cut from Tesla. It seems Elon doesn't want to spend the money at the moment, but once the tech is streamlined and improved, people will purchase it.
Especially for drones to walk dogs. Another possibility would be using drones with LIDAR tech for search and rescue.
I don't know enough about how LIDAR works to be entirely sure about this, but I would argue that, as an emit-and-receive system, LIDAR works best when there are no other LIDAR scanners around at the same time, because they interfere with each other. So its usefulness lessens with every other car using it. That would be a good reason to invest in receive-only sensors like cameras.
TL;DR: It's expensive, so they're removing it to make more money, and smearing its use by competitors to stop people from choosing them for having more comprehensive sensors.
These videos never disappoint