Why EVERYONE Needs To Be Tested On FSD and Similar Tech BEFORE They’re Allowed To Use It!

  • Published: Feb 5, 2025

Comments • 118

  • @karlInSanDiego
    @karlInSanDiego 9 months ago +24

    The science on driver inattentiveness showed long ago that the less you engage a driver, the more inattentive they become. Semi-autonomous driving modes CREATE distracted, unaware drivers, no matter their skill level, concern, or belief that they maintain vigilance better than others can. It's not Autopilot or FSD as a product name that is the problem here. It's a fundamental issue with the disengagement that autonomous systems create.
    There is a minimal amount of engagement needed to maintain attention, and there is a maximum amount of stimuli a person can tolerate before their abilities can't keep up. If you saturate someone's senses and demand too much engagement from them (a race car driver, or a truly defensive motorcycle rider going quickly in traffic), the rate of exhaustion increases. The overstimulated brain is firing hard, but it's burning out fast too.
    If you understimulate the brain, by removing steering, braking, and throttle operation from the driver, it is extremely hard to maintain vigilance, yet we hope that said driver is fully aware of the position of their vehicle, all the lanes around them, all of the drivers around them, as well as other vulnerable road users, emergency vehicles, and unexpected road obstacles.
    In a BEST case scenario for Level 3 driving, one that assumes the autonomous system is CAPABLE of maintaining near perfection AND has the time to notify the driver to take back control because it somehow worked out that it can no longer devise its own strategy to avoid a problem, there will be a disconnected, largely unaware driver being asked to quickly ascertain what the problem is and how to safely avoid it without causing a collision with others at the same time. That will lead to tragedy.
    The toys you all are playing with on public roads are just that, toys. Tesla's system has required extreme vigilance since launch, because it does fault, and CAUSES rather than prevents collision scenarios. Fred Lambert, long-time Tesla apologist, though he would now dispute that, famously penned an editorial on Electrek entitled "Tesla FSD Beta tried to kill me last night". Yet Fred can't stop himself from trying to will FSD into working for him. People have died when they weren't vigilant and ready to OVERRIDE Autopilot, let alone FSD, and the two are now merged.
    Licensing and driver training to use lazy self-steering crutches are not the solution. The reasonable case for autonomous driving may be limited to chauffeuring people who are blind or otherwise incapable of safely operating a vehicle, in which case there is no driver to take over. Level 3 driving assistance is the most dangerous type that could have been proposed, strictly because you cannot reliably disengage a driver and then expect them to re-engage when the self-driving system may or may not realize that the driver is actually needed; a disengaged driver is in no position to notice that an error is occurring. "Supervised" self-driving is an oxymoron, but at a deeper level than you've described in this video. Self-driving itself is the root cause of drivers' inability to take over, not the label we place on the product or the licensing or training we undergo. Until we accept that, we'll be dissatisfied with attempts to fix the problem of "why can't drivers stay attentive when they switch on the lazy controls?"
    I think it's important for folks to understand that these questions were asked and answered by science a long time ago. Here's a quote and a reference,
    "removing all the driver’s responsibilities except following the detection of an infrequent obstacle makes the disengaged driver’s task very much like a vigilance
    task [6]. The new driver role requires sustained attention for the occurrence of an infrequent, unpredictable event over long periods of time, essentially the definition of a vigilance task."
    IMPACT OF PHYSICAL DISENGAGEMENT ON DRIVER ALERTNESS: IMPLICATIONS FOR PRECURSORS OF A FULLY AUTOMATED HIGHWAY SYSTEM
    1998 IEEE
    sci-hub.se/10.1109/ITSC.1997.660544
    Humans struggle to perform vigilance tasks successfully.
    The less often you need to intervene on behalf of a Level 3 system, the less likely you will be to do so in time. Your comfort level and reliance on it will be commensurate with its seemingly outstanding, improved performance. Therefore, as AVs appear to get better and safer, we will see higher mortality and more crashes with them, as people are surprised when they fail and are unable to react quickly enough to prevent crashes. I really wish we were honestly assessing the costs (increased risk, added VMT) vs. the benefits of AVs instead of letting the tech dorks lead the discussions.
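
    To make that concrete, here is a back-of-the-envelope sketch in Python. The numbers are invented purely for illustration (they are NOT from the 1998 study above); the assumption being modeled is that the driver's probability of making the "save" decays as interventions become rarer.

    ```python
    # Hypothetical illustration: a Level 3 system that fails less often does not
    # automatically produce fewer crashes IF driver vigilance decays as saves
    # become rarer. Every figure below is an assumption, not measured data.

    def crashes_per_million_drives(failure_rate: float, save_prob: float) -> float:
        """Expected crashes per million drives: the failures the driver misses."""
        return 1_000_000 * failure_rate * (1 - save_prob)

    # (failure_rate, save_prob): as the system improves 10x, assume vigilance
    # erodes and the human save rate collapses from 90% toward 10%.
    for failure_rate, save_prob in [(0.01, 0.9), (0.005, 0.5), (0.001, 0.1)]:
        crashes = crashes_per_million_drives(failure_rate, save_prob)
        print(f"fails on {failure_rate:.1%} of drives, driver saves {save_prob:.0%}: "
              f"{crashes:,.0f} crashes per million drives")

    # Output: 1,000 then 2,500 then 900. A system that fails ten times less
    # often barely moves the net crash count, and the middle case is worse.
    ```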

    • @howiemeltzer7040
      @howiemeltzer7040 9 months ago +1

      Nailed it!

    • @jimmurphy5355
      @jimmurphy5355 9 months ago

      There will be a crossover point where it won't matter much whether the driver stays vigilant, as long as vigilance is rarely required.

    • @karlInSanDiego
      @karlInSanDiego 9 months ago

      @@jimmurphy5355 You've completely missed the point. You'll never prevent the rare fatal mistakes if you are not ready at all times to react to them. There is no point in an autonomous car that is 99% flawless, or even 99.5% flawless. You'll still have to save one out of 100 drives, or one out of 200 drives, but you'll never know when that save is needed. You cannot remain vigilant enough to enact the save. And even if the rare individual can, the overwhelming percentage of unengaged drivers will not, resulting in one out of 100 cars crashing every day due to AV error, and in a complete distrust of AV technology by consumers, other road users, government regulators, and insurance companies. An imperfect AV is a liability, not an asset.

    • @ChrisBigBad
      @ChrisBigBad 8 months ago

      Yes.
      Also: the newer drivers who get their licenses now will want to use these systems with basically zero experience actually piloting their own vehicle. Now ask them to prevent a serious problem that would have experienced drivers sweating at the drop of a hat, after 2 hours of staring out the windscreen.

  • @AerialWaviator
    @AerialWaviator 9 months ago +4

    Another excellent TE video.
    Just a reminder that while aircraft "autopilot" is often used as an analogy for automation in road vehicles, nearly all aircraft autopilots are NOT able to take off or land an airplane. That requires additional certified hardware and automation beyond a typical autopilot system, often referred to as an instrument landing system (ILS), and its use is only allowed at airports that have been approved for it, not just any random airport.

  • @milohobo9186
    @milohobo9186 9 months ago +3

    More driver education seems like a very good idea. Maybe a special test for larger vehicles like SUVs and large pickup trucks, a special one for high-performance vehicles like sports cars, special ones for motorcycles, and ongoing continuing education for such things as well. Imagine the things people could learn to better improve their understanding of cars, roads, and safety! Defensive driving and train safety would be a must, but also getting the opportunity to learn about electric cars, the symbols on those 18-wheelers, and more!

  • @HWKier
    @HWKier 9 months ago +8

    After participating in the one-month Tesla FSD trial, my impression is that the biggest risks occurred when I disabled the system inappropriately. Several times I took over when I probably shouldn't have, right in the middle of a tricky situation, then barely made it out of the tight spot because my reflexes were too slow.

    • @esprit1st75
      @esprit1st75 9 months ago +4

      If you are in a situation where your reflexes are too slow, I would argue that you probably took over too late. Obviously I don't know the exact circumstances so I'm not judging.

    • @linusa2996
      @linusa2996 9 months ago +1

      @esprit1st75 Reaction time here means how long it takes to react to the autopilot chime, analyze why the autopilot turned off, figure out what you're supposed to do about it, and act on it. That time starts at about 0.4 seconds if you are a Michael Schumacher or a Niki Lauda. For the average human it's more like 0.7 to 1.5 seconds.

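      To put those reaction times into distance, a quick sketch (plain kinematics; the 110 km/h speed is only an example):

      ```python
      # Distance covered before the driver even begins to act, for the reaction
      # times quoted above. The speed is illustrative.

      def reaction_distance_m(speed_kmh: float, reaction_s: float) -> float:
          """Metres travelled during the driver's reaction time."""
          return speed_kmh / 3.6 * reaction_s

      for reaction_s in (0.4, 0.7, 1.5):  # F1-grade reflexes vs. average humans
          d = reaction_distance_m(speed_kmh=110, reaction_s=reaction_s)
          print(f"{reaction_s:.1f} s at 110 km/h: {d:.0f} m before any braking starts")

      # Output: 12 m, 21 m and 46 m of blind travel, respectively.
      ```
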
    • @esprit1st75
      @esprit1st75 9 months ago

      @@linusa2996 Well, then we have to talk about what you are supposed to do when supervising an autopilot. If you are supervising the system, you don't have to figure out the situation from scratch; that's what you are describing. As the supervisor, you know where you (the car) are supposed to be, you know whether the car is actually doing what it is supposed to do, and you monitor the situation and traffic around you. But that is exactly the problem this video is about: people are not doing that. (Again, not judging @HWKier.) People are on their phones, or doing other stuff, because they think the system is going to work perfectly. Imagine there were a test where they sat in a simulator and went through situations where the car accidentally drives into a barrier, or ignores/misinterprets an obstacle and doesn't react. I bet you they would pay more attention in the real thing, because they probably don't want to die.

  • @grejen711
    @grejen711 9 months ago +8

    Having "workedwithcomputers" for nearly 40 years now I have a pretty extensive experience with the general public thinking computers, and networks, and AI, and technology of all sorts, are something they are not and are capable of magic because these system seem magical to them. Them, that are not like us. Us that have a mighty need to actually know how stuff works.
    It's quite maddening sometimes, bewildering most of the time.
    I like to say - "To err is human, to really ##$#up you need a computer." Until semi autonomous becomes fully in control (a while yet I'm sure) I absolutely agree it should be a requirement for people to get a supplement to operate a vehicle in autonomous mode. Yeah that Boeing 737 reference was spot on!

    • @FameyFamous
      @FameyFamous 9 months ago +2

      The 737 Max crashes were not caused simply by the failure to disclose the new automation as part of pilot training. Another problem was that the system was activated by data from a single probe which occasionally fails. Now, the system is safer because pilots are trained properly and because it uses 3 probes.
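
      For anyone curious what multiple probes buy you, here is a generic majority-voting sketch (illustrative only; this is NOT Boeing's actual MCAS logic, and the tolerance value is made up):

      ```python
      # Generic 2-of-3 sensor voting: if at least two probes agree, trust them
      # and ignore the outlier; if no two agree, fail safe and alert the pilots.

      def fused_aoa(readings: list[float], agree_tol: float = 2.0) -> float | None:
          """Return a trusted angle-of-attack from three probes, or None."""
          a, b, c = sorted(readings)
          if b - a <= agree_tol:
              return (a + b) / 2   # low pair agrees; c is the outlier
          if c - b <= agree_tol:
              return (b + c) / 2   # high pair agrees; a is the outlier
          return None              # no agreement: disable the automation

      print(fused_aoa([4.2, 4.5, 4.3]))   # healthy probes -> 4.25
      print(fused_aoa([4.2, 4.5, 47.0]))  # one stuck probe -> 4.35, outlier ignored
      ```

      A single-probe design has no way to tell "stuck vane" from "real stall", which is exactly the failure mode described above.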

    • @ScorchedEarthRevenge
      @ScorchedEarthRevenge 9 months ago +3

      That's my experience too. Though I find that software developers not familiar with autonomous driving overestimate how close we are to actual FSD, by failing to factor in the open environment cars operate in. The edge cases are virtually infinite and can never be solved in sufficient quantity to match human performance, unless a new form of ML is invented to solve the interpolation-versus-extrapolation problem.
      Tesla's approach to the problem is a joke that will only ever lead to slightly better Level 2 systems.

    • @grejen711
      @grejen711 9 months ago

      @@FameyFamous Training, and even simple awareness of the system's existence, would have saved lives.

    • @grejen711
      @grejen711 9 months ago

      @@ScorchedEarthRevenge Overestimating AI and other systems has been an issue since the 1960s. I'm not sure anything much better than Level 2 is even doable unless the system can trust that there are no unpredictable humans on the road. Matching human performance should not be the goal. Human performance is dismal, but morally acceptable because... we're human. Taking humans out of the problem entirely would eliminate enough edge cases to make the autonomous systems almost perfect. But we'll probably still be more accepting of human frailties (and outright criminality) than of any software glitch, no matter how rare!

    • @ScorchedEarthRevenge
      @ScorchedEarthRevenge 9 months ago

      @grejen711
      Nobody said it was "the goal". Why would there be only one goal? Human level is the bare minimum, and we're nowhere near it.
      The number of edge cases is virtually infinite, and ML models have to be explicitly trained for each one. It's totally impractical. Humans can extrapolate/generalise their knowledge. They don't need to be specially trained not to crash into an object they don't recognise.
      Taking humans off the road reduces the edge cases, but unless you're talking about a closed-loop system, not nearly by enough to matter.

  • @esprit1st75
    @esprit1st75 9 months ago +11

    Thank you Nikki for another great video! BTW, people never mention that pilots have to do either an annual check (commercial pilots transporting people) or a biennial flight review every other year, even as a private pilot. It's not just hop in and go after passing a one-time pilot test. Never mind that pilots constantly monitor an autopilot even though there are no intersections or stop signs in the skies. No cellphones, movies, or napping allowed either.

    • @anderspedersen6750
      @anderspedersen6750 9 months ago +1

      And type ratings for certain planes (been a while, but anything over 12,500 lbs??).

    • @esprit1st75
      @esprit1st75 9 months ago +2

      @@anderspedersen6750 Correct. Although if flying commercially you need a check ride in every model you fly anyways.

    • @anderspedersen6750
      @anderspedersen6750 9 months ago +1

      @@esprit1st75 I guess the auto-drive functionality would be more akin to a complex endorsement for a pilot.

    • @esprit1st75
      @esprit1st75 9 months ago +1

      @@anderspedersen6750 oh yes, definitely. But my point is that pilots get checked regularly as well. So if you don't stay proficient you need to train more.

    • @JasonTaylor-po5xc
      @JasonTaylor-po5xc 9 months ago

      To be fair, pilots are trained on specific aircraft because they are all very different. Also, it's their job and they have to take off and land several times a day for their entire career. They also have an age limit, which forces them into retirement. Granted, most make a nice salary, so not complaining about that.

  • @peterprokop
    @peterprokop 9 months ago +4

    I am a very experienced driver. I have honed my driving skills over decades and developed a specific driving style that involves above-average distance keeping and a lot of anticipation of what other drivers will probably do: often slowing down well in advance because of a high probability that a slower car will pull into my lane, or because a car is moving in a way that indicates its driver is not aware of my presence, which is especially critical when riding a motorcycle.
    I also have a fairly good knowledge of AI and self-driving technology, but not much practical experience. However, the little experience I have has taught me that I don't like the driving style these systems lead to.
    I use simple cruise control a lot, but it already has its weaknesses. It generally accelerates faster than I would manually in "resume" mode, and keeping a constant speed is also often not the most comfortable choice. With manual throttle control I slow down and accelerate more "organically", leading to a more comfortable ride; and the more congested the road becomes, the more stressful it becomes to use cruise control, until there is a point where manual throttle control is the better option.
    Now extend that to "adaptive cruise control", where the car automatically slows down and keeps its distance from the car in front of you. Typically you can set how much distance you want to keep in relation to your speed, but I found every setting except the most aggressive unusable, because the distance kept was so large that other cars often intruded into it, sometimes even passing me on the right. The downside of the "aggressive option", however, was that the system always braked later and harder than I would have, causing stress. I also experienced about one failure per hour of driving where I had to brake manually because the system somehow failed to detect a car in my lane, usually involving curves and cars not driving in the center of their lane. (A rough sketch of the time-gap rule behind those distance settings follows below this comment.)
    Coming to "lane keeping" next: there are two different kinds of systems. One basically alerts you when you leave your lane without signaling, and can apply some opposite steering torque, making you "bounce off" the lane marker. If you just let that happen, you will bounce a few times between the lane markers until the angle becomes too big for the system to bounce you back, and you will leave the lane.
    A real "lane keeping" system will try to keep you in the middle of the lane, making the car actually follow the road. Combined with adaptive cruise control, you basically have an "autopilot" that will drive you from intersection to intersection.
    If you add automatic road sign and traffic light detection, you can even maneuver through some intersections automatically.
    For full self-driving, however, you also need to handle a lot of special situations: road construction, sites of accidents or broken-down cars, ill behavior of other drivers or road users like pedestrians. And add situations of bad weather: heavy rain, icing conditions, dust or fog, blinding sunlight and dead insects on the windshield, or those moments when you pass a truck in heavy rain and its wheels dump tons of water on your windshield, but you drive on because you know you will be through it in a second or so.
    All these situations, and many more, need to be handled by a full self-driving system. A lot of people are selfish when it comes to driving, some to an outright aggressive level: they pass you in an unsafe manner, intrude on your safety distance, take your right of way.
    Fully self-driving cars without manual driving capabilities, or with no driver on board, are also quite defenseless in case of aggression or attack, and some people seem to have fun blocking or even vandalizing autonomous cars.
    My point is: every level described above introduces new failure modes, and a good driver must be able to put himself into the minds of other road users. This is why I believe that full self-driving cannot be done by narrow AI, but requires a fairly sophisticated general AI system able to understand the world on a human level; otherwise it will always have a hard time navigating it properly.
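
    As promised above, here is a rough sketch of the time-gap rule most adaptive cruise control systems use; the gap values are illustrative, not any particular manufacturer's settings:

    ```python
    # ACC typically keeps a constant TIME gap, so the gap in metres grows
    # linearly with speed. That is why relaxed settings leave room that other
    # cars cut into, as described above. Gap values are invented examples.

    def following_distance_m(speed_kmh: float, time_gap_s: float) -> float:
        """Gap in metres maintained to the lead car for a given time-gap setting."""
        return speed_kmh / 3.6 * time_gap_s

    for label, gap_s in [("aggressive", 1.0), ("medium", 1.5), ("relaxed", 2.5)]:
        d = following_distance_m(speed_kmh=120, time_gap_s=gap_s)
        print(f"{label} ({gap_s} s) at 120 km/h: {d:.0f} m of open road ahead")

    # Output: 33 m, 50 m and 83 m; the relaxed setting is a standing
    # invitation for other traffic to merge into the gap.
    ```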

    • @jimmurphy5355
      @jimmurphy5355 9 months ago

      If you have any way to do an extended drive with Tesla's re-branded "Supervised FSD", based on the AI-trained neural network, you might be surprised. The latest system is still far from perfect, but also far better than their previous system. It still accelerates a bit too hard for my taste when resuming speed or starting from a stop, and I could not find a way to tune that. But the lane keeping is really good, and it behaves a lot like a human driver. One tiny detail I noted: it scooches away from trucks when it is passing them on the highway, giving them a wider berth rather than strictly centering in the lane. It does the same when passing bicycles on surface streets. It's acting, in many situations, pretty close to the general AI you mention.

    • @jimmurphy5355
      @jimmurphy5355 9 months ago

      @@mikewallace8087 Yep. A human would not stop for that. AI needs to get so good that it doesn't either, and I have no idea how close we are to achieving that. What Tesla's system can do now is pretty impressive, but it's nowhere near a state that would make a car without controls feasible. BTW, I drove about 3,500 miles using the latest FSD variant recently. Tesla's one-month free trial coincided with a planned road trip to see the 2024 eclipse, and I combined that with lots of other sightseeing on the way to and from Texas (I live in the SF Bay Area). It was almost good enough to entice me to pay $99/month to keep it working after the free month expired. Almost... If they lower the price to $49/month I'll go for it. It's good enough to be useful and stress-reducing on both highways and surface streets. It doesn't always drive the way I would, but neither do my wife or son. I ride with them and don't stress out just because they pass (or don't pass) someone when I would have (or wouldn't have), or yield (or don't yield) for drivers coming out of a parking lot, etc. I just kick back, let the car do its thing, and only intervene if the car really needs help. That's surprisingly infrequent with the version I just tested.

    • @rp9674
      @rp9674 9 months ago

      Full autonomy is hard, but presumably not impossible.

    • @rp9674
      @rp9674 9 months ago

      Perfect drivers will still get into accidents because of other drivers; the same applies to fully autonomous cars.

    • @jimmurphy5355
      @jimmurphy5355 9 months ago +1

      @@rp9674 Accidents that happen while cars are driving themselves will be (and are becoming) hot news, simply because they are "new." I fear that hype about autonomous driving failures will delay the adoption of technology that would reduce overall accident rates, despite still having some accidents.

  • @Peter-vn8ue
    @Peter-vn8ue 9 months ago +2

    FSD will not be 100 percent successful until every vehicle on the road has it.
    There is a misconception among drivers that FSD is already here and that activating it absolves them of any driver responsibility.
    Your FSD system is not going to save you from a vehicle with no Level 2 safety systems (at a minimum) coming out of a side street or cross-traffic intersection, failing to stop, and hitting your vehicle side-on, right in the middle. So until all the vehicles on the road have at least Level 2 safety systems, accidents will continue to happen.
    It's why Tesla added "Supervised" to the name of their system: the driver should know they still need to supervise the vehicle through driver input.

  • @nozomikurai952
    @nozomikurai952 9 months ago +1

    I am reminded of the 24-hour Tesla Model Y test drive. I went for a short trip to a nearby city to test the autonomous self-driving. It kept trying to drive in the left lane, even when the left lane was under construction, then attempted to drive me into oncoming traffic in a construction zone that had turned a two-lane one-way highway into a single lane with oncoming traffic. The salesperson shrugged it off, so I figure this functionality shouldn't be trusted. My PHEV's system doesn't work in my new neighbourhood, so I guess I will have to suck it up and do the driving myself. I could wait to win the lottery and get myself a driver. That's sure to happen before "full self driving" ends up being reliable.

  • @Mwesi
    @Mwesi 9 months ago +7

    Hey Smarty Pants: yes I took apart a VCR at the age of 4. 😮 I got into quite a lot of trouble.

    • @firestar4430
      @firestar4430 9 months ago +3

      Anyone could take it apart! The real question is did you manage to put it back together? 🤣

  • @mutantbob
    @mutantbob 9 months ago +12

    I want to see numbers on-screen showing the per-capita adverse incidents for semi-autonomous vs. traditional driving. Accidents involving robot cars are far more interesting to news outlets than the more numerous accidents of normal cars.

    • @jamesheartney9546
      @jamesheartney9546 9 months ago +2

      It's not a simple thing to measure. Semi-autonomous driving systems work better in some situations (say, long trips on limited-access roads, or within slow-moving traffic jams, all in good weather during the day) than in others (complex surface streets with lots of pedestrians and cyclists, rain, snow, fog, at night). As a result, most of the semi-autonomous miles will be driven in the easier environments, so you can't make a one-to-one comparison. (Elon Musk has taken advantage of this to make extravagant, misleading claims about the safety of Autopilot and FSD.)
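
      Here is the mileage-mix problem in miniature; every number below is invented purely to show the mechanism:

      ```python
      # If assisted miles cluster on easy highway driving, a raw per-mile
      # comparison can flatter the assisted system even when it is worse than
      # human drivers in BOTH environments (Simpson's paradox). Invented data.

      miles   = {("assist", "highway"): 9e6, ("assist", "city"): 1e6,
                 ("human",  "highway"): 3e6, ("human",  "city"): 7e6}
      crashes = {("assist", "highway"): 18,  ("assist", "city"): 12,
                 ("human",  "highway"): 5,   ("human",  "city"): 70}

      def rate_per_million(mode: str, env: str | None = None) -> float:
          keys = [k for k in miles if k[0] == mode and env in (None, k[1])]
          return 1e6 * sum(crashes[k] for k in keys) / sum(miles[k] for k in keys)

      print(f"raw: assist {rate_per_million('assist'):.1f} "
            f"vs human {rate_per_million('human'):.1f} crashes per million miles")
      for env in ("highway", "city"):
          print(f"{env}: assist {rate_per_million('assist', env):.1f} "
                f"vs human {rate_per_million('human', env):.1f}")

      # Raw: assist 3.0 vs human 7.5, so assist "looks" 2.5x safer.
      # Highway: assist 2.0 vs human 1.7. City: assist 12.0 vs human 10.0.
      # Stratified by environment, the assisted system is worse in both.
      ```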

  • @bikeforever2016
    @bikeforever2016 9 months ago +1

    The problem with AI is using people as the source of training data. So they behave like people. Surprise?

  • @robinplaysgames137
    @robinplaysgames137 9 months ago +5

    Also, pilots have to do training every 2 years and have an instructor sign them off as still able to fly. I really wish driving was similar here in the USA.

    • @scottmcshannon6821
      @scottmcshannon6821 9 months ago +1

      Here in the US, drivers are not really even licensed. They are given a license on their 16th birthday as a present; licenses are not earned.

    • @rp9674
      @rp9674 9 months ago

      They also use the autopilot regularly because it's safer; that is, flying is already almost fully autonomous.

  • @kdenyer1
    @kdenyer1 9 months ago +2

    I think all the software should take a driving test first.😂😂😂😂😊

  • @Kirmo13
    @Kirmo13 9 months ago +14

    It's not weird to want to know how things work. Heck, I bet that in this community that's the norm

    • @JasonTaylor-po5xc
      @JasonTaylor-po5xc 9 months ago

      For me, it's more of an aversion to the government being more involved. If this were optional, with additional perks, I could see it (e.g., disabling the nag). We already have laws in place that require drivers to know how to operate their specific vehicles. No need to get an endorsement to prove I can operate my minivan or Corolla. If I caused an accident because I didn't know how to operate the windshield wipers, I would be cited and held accountable.

    • @Tullo_x86
      @Tullo_x86 9 months ago

      @@JasonTaylor-po5xc What about perks like "the semi-autonomous mode can be enabled at all"?

    • @JasonTaylor-po5xc
      @JasonTaylor-po5xc 9 months ago

      @@Tullo_x86 That's not how endorsements work. For example, when I lived in Florida, I had a glasses required endorsement - if I ever got pulled over not wearing my glasses, I could get fined. Of course, cops don't know ahead of time if I should be driving with glasses or not. Technically, that's how licenses work too - if I get caught driving without a valid license, the penalties are high but the car still works. There are a few "gates" to reduce access to cars to unlicensed people, but they are easy to overcome.

  • @eranschau
    @eranschau 9 months ago +2

    Just like everyone who uses the self checkout at the grocery store should also be required to take a proficiency test first. But maybe I'm just getting crotchety in my old age.

    • @jamesheartney9546
      @jamesheartney9546 9 months ago

      Or (hear me out) self-checkout systems that are not usable without explicit training should NOT BE DEPLOYED. If the store wants to hire me to do their checkout job, they can pay me for it.

  • @stevewausa
    @stevewausa 9 months ago

    Great analysis as usual, you took the whole thing apart and explained how it (often doesn’t) work.

  • @JayCreates
    @JayCreates 9 months ago +2

    Good points, well made. I just hope the industry has the maturity to tackle this imposing problem.

    • @JayCreates
      @JayCreates 9 months ago

      Oh, it will be the government too. Well, that will be changing soon, so...

    • @rp9674
      @rp9674 9 months ago

      Develop better: real, full autonomy.

  • @teardowndan5364
    @teardowndan5364 9 months ago +1

    If semi-autonomous features require additional training beyond what is required to pass a driver's license test, maybe the semi-autonomous features are designed wrong. Make the automation conform to the drivers, at least to the extent that it is safe to do so, instead of making drivers conform to the automation. If your automation requires that the driver keep hands on the steering wheel, then simply make it so lane-keeping requires at least minimal input in the correct direction to actually keep the lane; otherwise, let the car drift across lanes when it is safe to do so, then escalate to disabling the accelerator and coasting to a stop on the shoulder if there are no more lanes left to drift into. This way, people won't be able to disengage from actually driving if they want to get where they want to go. Much simpler than all of those fancy behavior-monitoring and driver-annoyance systems. (A sketch of this escalation logic follows below this comment.)
    What I have been saying for years is that most people cannot remain attentive to the road unless they are actively engaged. I know I have minimal ability to focus on the road while riding as a passenger: no automation beyond cruise control for me until we get to L4+, where I will be able to simply take a nap or do whatever else between A and B. Google suspended its road testing when its employees couldn't babysit their AI driver much beyond two weeks before they became too comfortable with it to pay attention anymore.
    If a special license is required to operate a semi-autonomous vehicle, I bet most people and rental companies will opt out of that automation. It will also create a nightmare for people borrowing or lending vehicles.
    If you want to be pedantic about what is or isn't a charger: phone and laptop "chargers" aren't chargers either. They are just 5/9/12/16/20/48V, mostly dumb, AC-DC power adapters. The actual charging circuitry, which takes the 5/9/12/16/20/48V DC and converts it to whatever voltage/current the battery needs at whatever SoC and temperature it is at, is in the phone/tablet/laptop/whatever itself.
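
    As mentioned above, a minimal sketch of that escalation idea; the states and behavior are invented for illustration, not any shipping system's logic:

    ```python
    # Escalation ladder for lane-keeping that refuses to do the whole job:
    # assist while the driver steers correctly, stop correcting when they
    # don't, and finally coast to a stop when there is nowhere safe to drift.
    from enum import Enum, auto

    class Mode(Enum):
        ASSISTING = auto()         # driver supplies correct steering input
        DRIFTING = auto()          # no input: let the car drift where safe
        COASTING_TO_STOP = auto()  # no safe lanes left: cut throttle, pull over

    def next_mode(driver_input_ok: bool, safe_lanes_left: int) -> Mode:
        if driver_input_ok:
            return Mode.ASSISTING
        if safe_lanes_left > 0:
            return Mode.DRIFTING
        return Mode.COASTING_TO_STOP

    print(next_mode(True, 2))    # engaged driver -> Mode.ASSISTING
    print(next_mode(False, 1))   # hands off -> Mode.DRIFTING (uncomfortable nudge)
    print(next_mode(False, 0))   # still hands off -> Mode.COASTING_TO_STOP
    ```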

  • @williammurphey1090
    @williammurphey1090 9 months ago +3

    You and most automobile pundits are misusing the term "autopilot". It is a term of art from aviation that is a shortened version of the term "automatic pilot". It has NEVER been a term referring to autonomy, it is not "Autonomous pilot". Tesla's use of the term Autopilot for its lane keeping cruise control is an accurate description of its capabilities. Please use the terms correctly. (40-year career in aviation as an airline pilot and check airman.)

    • @rp9674
      @rp9674 9 months ago +2

      The typically discussed problem is Tesla FSD (Full Self-Driving), which is really only partial self-driving. I think Tesla Autopilot is just adaptive cruise + lane keeping + automatic braking.

  • @JorgTheElder
    @JorgTheElder 9 months ago +1

    I agree on the need for testing and training, but not at all on the naming rules. The names of products are marketing. Expecting a simple two-word name to correctly describe the capability of a product is not reasonable, in my opinion. For example, "Auto-Pilot" is an incredibly generic term. Most autopilot systems do nothing but keep altitude, heading, and speed, and know nothing else about the environment they are in. They are literally the aircraft equivalent of cruise control. It is 100% up to the pilot to make sure that continuing like that is the right thing to do. I feel the same way about "Full Self Drive Beta". It is common knowledge that a beta product is not finished and is not ready to be turned loose. It has also been reported that the on-screen prompts you get when you sign up for and turn on FSD Beta explicitly state that the driver is still needed. That is training. If people accept those prompts, they accept responsibility. Go ahead, add some test questions that make you explicitly state that you understand that FSD Beta still requires you to be ready to take over. I don't think it will change anything, as it appears the information is already being provided before you can enable the feature. People just don't like personal responsibility. Formal training and testing with a focus on the driver's responsibilities are the only answer.

  • @williamelkington5430
    @williamelkington5430 9 months ago

    Thanks very much for this video. What I think is this: You are 100 percent correct. And your recommendations make very good sense and are welcome.
    I'll augment these simple statements with a few additional points.
    First, the traffic conditions in which automobile drivers operate are much more complex than the traffic conditions in which pilots operate. The risks of a mishap are much greater, because of the much higher traffic density, the unpredictability of other drivers, the unpredictability of the traffic dynamics, the unpredictability of road conditions (e.g. slipperiness, construction impediments, and random objects that appear magically in the roadway), and the unpredictability of the weather.
    Second, not all driver automation systems are of equal quality and dependability, since there is no regulatory body (whether government or industry) that has the authority to define the product development, testing, verification, and validation requirements for such systems. (Software or firmware can be both very complex and very hidden. Observing such systems, understanding them, testing them, and verifying their quality and reliability can therefore be impossible without the authority to impose development and testing standards on them.) In other words, the further we go into driver autonomy, the more complex semi-autonomous/autonomous systems will be, and the more we will need regulatory authority like the FAA's. NHTSA does not have this authority. Only Congress can confer this sort of authority on NHTSA or some other automotive agency.
    Third, in addition to operator testing, the FAA requires training, specifically simulator training. So as we move into autonomy, we should also develop a simulator capability that all drivers have access to in order to improve their skills, without first putting other drivers and pedestrians at risk in the learning/competency-acquisition process.

  • @CCARL11
    @CCARL11 9 months ago

    Very intelligent analysis and suggestions on how to make the increasing technology in new cars safer. Knowing the industry, they will more likely increase insurance rates based on what technology your car has; e.g., if you have a self-driving feature, your insurance rate will be 20 to 30 percent higher than for a car without that feature.

  • @UshasRides
    @UshasRides 9 months ago

    Brilliant video. I concur… there should be a special test section for EVs.

  • @davidmccarthy6061
    @davidmccarthy6061 9 months ago +1

    A good concept, but how do you make it worthwhile in America? We don't have the most stringent driver training/testing anyway, and it varies by state. I guess just an extra exam and "stamp" on your license, like we have for motorcycles.

  • @Tullo_x86
    @Tullo_x86 9 months ago

    You got a hearty chuckle out of me with "when I were a lad" 😄

  • @scottmcshannon6821
    @scottmcshannon6821 9 months ago

    You said it yourself: people get bored. Almost anyone can pay attention long enough to pass a test; the real test is when an issue comes up after 3 months and the driver stopped paying attention 2 months and 2 weeks ago. The test would probably end up helping fewer than 10% of drivers.

  • @GraemeLeRoux
    @GraemeLeRoux 9 months ago

    This is a very good and thoughtful video. It makes several points which lawmakers should be thinking about. However, I must take issue with your use of the term ‘accident’. A crash is *never* an accident; rather, it is the result of a chain of mistakes and/or errors. It was in recognition of this that, some years ago now, authorities and the media here in Australia stopped referring to ‘road accidents’. Gradually this has resulted in a subtle change in public attitude; people accept that collisions, loss of control, etc. are an effect that has been caused by something, at least intellectually. Australians still die, suffer injuries, and do a lot of damage on our roads, but most of us don't think of this as an ‘accident’, nor do our courts.

  • @JohnRoss1
    @JohnRoss1 9 months ago

    I think the driver-assistance features should be rated against the driving behavior of an impaired driver, based on blood alcohol (or drugs) and age.

  • @esthermofet
    @esthermofet 8 months ago

    Valuable Style Points have been awarded for the Invader Zim references!

  • @nettlesoup
    @nettlesoup 9 months ago

    This is all very good analysis, and it will be important for those systems that remain in the "driver assist" realm for the foreseeable future.
    However, if/when one or more systems advance to being as good as, or better than, [pick your percentage] of current human drivers in a region, then those are no longer driver assists, but robot drivers. *They* will need to pass a test (or preferably 20 tests, with different examiners, in each state) each time a new version of the software is released. Then the software can be released widely.
    Some manufacturers have stated that they have no desire to reach this goal. They are content being Level 2.

  • @wiltaylor
    @wiltaylor 9 months ago +4

    I just don't see the point. If you don't want to drive, there are Uber and public transport! Leave the driving to the drivers!

    • @jakthebomb
      @jakthebomb 9 months ago +1

      It is so nice when stuck in bumper-to-bumper traffic. Just turn it on and you don't have to stop, go, stop, go. Also, on long road trips it takes some of the stress away.
      I agree that it is kinda pointless using it on city roads. Highways are where it is really useful.

    • @patdbean
      @patdbean 9 months ago +3

      What about those who cannot drive? Those who are registered blind/partially sighted.
      Taxis are far too costly, and mass transport is cheap enough but far too unreliable.

    • @wiltaylor
      @wiltaylor 9 months ago

      @@patdbean I get that, but as Nikki mentioned with pilots: do you really want someone behind the wheel who cannot take over, or even monitor the vehicle, should it go wrong?

    • @patdbean
      @patdbean 9 months ago

      @@wiltaylor If it is true Level 5, then "by definition" there need be NO wheel. If you need a wheel, then it is only Level 4 at best. In these comments people talk as if human drivers were infallible. We kill 1 million people a year on the roads worldwide, about one for every 60 million miles driven.

    • @patdbean
      @patdbean 9 months ago +1

      @@wiltaylor But the whole definition of Level 5 automation is that it does not need a driver, or even a wheel for that matter. If you still need a driver, then by definition it is only Level 4.
      And if you still have a driver, then where is your cost saving? You still have to pay him; he is not going to say "OK, don't pay me for this shift, because I never had to touch the wheel", is he?

  • @MarkLLawrence
    @MarkLLawrence 9 months ago +1

    It would definitely be better to weed out those people who refuse to listen to the limitations semi-autonomous vehicles currently have, and who blame the cars for incidents where they themselves should have been paying attention.

  • @jakthebomb
    @jakthebomb 9 months ago +2

    While I see their perspective, as someone who has driven for 15 years without a single accident or ticket on record, I would be annoyed at having to take a new test. I bought my 2024 Tesla Model Y about two weeks ago, have used FSD a few times, and have taken over because I didn't feel safe with its maneuvering.
    Stupid people will be stupid regardless of how many tests you give them. Adding extra blockers seems pointless.

    • @aritakalo8011
      @aritakalo8011 9 months ago +3

      "Adding extra blockers seems pointless."
      No it isn't, since it will cut down in accidents. This is classic Perfect solution fallacy. "Well that won't eliminate every single bad case, so it shouldn't be done".....
      so it isn't for example worth it for eliminating 75%, 50% or 25% of the cases. Your inconvenience is the minor cost of most likely saving thousands of lives. I think that cost is well worth paying. One can argue about exactly how strict the test should be and cost benefit analyse how much it prevents. However "it won't remove every single bad thing in the world (in it's category)" isn't a good argument. Good enough is sometimes good enough. Saved lives are saved lives, even if one can't save every single life.
      Or you can just drive private roads. Then you don't need a license. Well you need whatever license the private road owner demands. However public good roads, you get the license. It isn't for your benefit and your original just manual car license was inconvenience also. This is just new inconvenience, since world changed.
      Eventually it might just become part of the original inconvience aka semi-autonomous system understanding and proper procedure just becomes standard part of driving license overall or say an optional extra category almost every one chooses just to take as standard.

    • @jakthebomb
      @jakthebomb 9 months ago

      @@aritakalo8011 What exactly would a test solve?
      This is almost as ridiculous as the safety tests required for legal drone flying. Not to mention, how do you enforce who has or hasn't passed the test? If I passed the test but my roommate didn't, how would the car know to disable the feature when she drove?
      There are far bigger fish to fry when it comes to driving, like better handling of repeat drunk-driving offenders, speed limiters, or cars that can self-report reckless drivers.
      Having used FSD for the past two weeks, I cannot see a single reason for a test. If your car is doing something unsafe, it is on the driver to take over. That is basic knowledge, and our existing driving tests cover everything a driver needs to know.
      At the end of the day, Tesla pushed an unproven product into production cars. They pushed a false sense of security, which led to early adopters putting too much faith in a flawed system. A test won't change that.

    • @aritakalo8011
      @aritakalo8011 9 months ago

      @@jakthebomb The legal tests needed for drone flying aren't at all stupid. A drone is hundreds of grams of weight, if not kilos, potentially 150 meters up in the air. If that malfunctions and drops on someone, it can do nasty damage. Hence it is a good idea to have licensing along the lines of "you know not to do totally stupid things, like flying drones over crowds or into airport airspace, and so on". Also, nobody can claim they didn't know when it is time to punish the rule breakers, since the whole point of the test and course was that they would know.
      It isn't that one should catch every single rule breaker or unlicensed driver. How do you catch people who drive a heavy truck with only a light passenger-car license? You don't. However, the consequences of getting caught are real enough that it works as deterrence.
      You do it as you do all licensing: occasional random stops (like, say, random breathalyzing, or random checks for the driving license in the first place).
      A car has a semi-autonomous system and gets stopped? The police look at the vehicle registration, realize it's registered as having a semi-autonomous system installed, and check for the semi-autonomous endorsement on the driver's license.
      You never catch everyone driving without a license either. However, in case of accident or harm, it is another aggravating factor and punishment to add to the pile as deterrence. Plus, as said, random checks, and checks during other stops, create a real enough chance of getting caught to work as a deterrent.
      Since most people are mostly law-abiding when there is a basic level of rules and enforcement, people will get the course and the endorsement, because that is the rule in place. They might never get checked for it, but they will have heard the news story about the guy who was in a collision and was punished for driving semi-autonomous without a license.
      If your roommate doesn't have the license, your roommate doesn't drive that car. It is just a thing to consider when getting a semi-autonomous-enabled vehicle: it limits the pool of potential drivers to licensed ones. Just like not everybody can drive a big parcel truck, and so on.
      TLDR: just like any other rule. Generic enforcement means spot checks, and checking for it afterwards when some bad thing has happened. Plus, if you don't have the license, you don't drive. Too bad, so sad.

  • @Koulis_
    @Koulis_ 9 months ago

    By these criteria, people should just be tested on their level of idiocy.

  • @JasonTaylor-po5xc
    @JasonTaylor-po5xc 9 months ago

    I treat Autopilot as just a slightly more advanced cruise control. I'm still responsible for the car, just like in ACC. I don't need an endorsement for cruise control. However, I'd be willing to get an endorsement if that would remove the 30-second nag.

  • @gary_sustainableplanet
    @gary_sustainableplanet 9 months ago

    What you propose would make the roads safer, and perhaps reduce confusion amid heavy media reporting of accidents that happen with driver-assistance systems active. What's missing is an analysis of cost vs. benefit. Once you try to quantify that, and compare it to the cost/benefit of other potential regulatory changes, it might not be worth pursuing. How many annual fatalities (ones NOT caused by deliberate abuse of these systems) might such a testing system avoid? A few dozen? A hundred? That's about 0.2% of the ~38,000 annual accident deaths. For the huge time and effort (for regulators, and drivers' time in taking a test), is there no other change that would save more lives? In the US, driver testing is regulated by the states, so all 50 states would need to gear up to devise such a test and put it in place. How many years might that take? Self-driving tech is fast-evolving. Well-intentioned, detailed regulation of fast-moving technology is ripe for becoming irrelevant or a hindrance. Automakers already make an effort to monitor driver attention. If drivers actively seek to defeat these safeguards, that's on the drivers. In weighing cost vs. benefit, note your proposal would address only the subset of the problem caused by ignorance.

  • @robertkirchner7981
    @robertkirchner7981 9 months ago

    I think it could get complicated if every new update or release required a new type-certification for the driver(s).

    • @kensmith5694
      @kensmith5694 9 months ago

      That would only be true if the updates were forced on people. There are lots of folks still using Windows 10, so they don't need to get recertified on how to make Windows 11 and Windows 12 work.

  • @grantrandall1674
    @grantrandall1674 9 months ago

    I can't see additional retrospective driving tests ever being mandatory, although existing driver instruction and testing should incorporate aspects of autonomous driving, and manufacturers may be required to introduce their customers to the autonomous systems, such as Tesla doing extended handovers on the road.
    What you discuss only applies to the lower levels of autonomy. Level 5 means the car can effectively go empty, or with unlicensed passengers, in which case only the manufacturer is to blame should the autonomy cause an accident.

    • @aritakalo8011
      @aritakalo8011 9 months ago

      Why not? A driving test is already mandatory; this would just be a new part of the already existing test. Any "but muh freedom and inconvenience" gets countered with: you are using public roads. If you only want to drive private roads, go ahead, don't get the autonomous-vehicle category/endorsement on your license. But if you want to travel on shared public roads with a semi-autonomous system active, you get the license or you don't travel. One's right not to get a license ends where one is given the right to share roads with others who didn't personally approve of sharing them. Instead, that permission-giving is outsourced (as flawed as it is, but is there any other system people can propose?) to a government licensing agency and testing procedure.

    • @grantrandall1674
      @grantrandall1674 9 months ago

      @@aritakalo8011 I meant an additional test for licensed drivers. Yes, I agree driving tuition and testing need to incorporate aspects of autonomous driving controls, particularly enhancing the understanding of the levels of autonomous driving. I'll amend my original post to reflect this.

  • @vernonhampton6973
    @vernonhampton6973 9 months ago +1

    ...or teach someone how to properly drive in the first place, with a manual gearbox...

  • @jameshiggins-thomas9617
    @jameshiggins-thomas9617 9 months ago

    Any system which works "most of the time" and requires "immediate" intervention at those other times will be problematic. The better it is at "most", the worse that intervention requirement becomes. It is inhuman to expect constant monitoring of a system that usually works, whether that system is a machine or even a person or persons. This is the automation equivalent of the "uncanny valley" of cartoons. If there were time available to step in, a human could do it. But you need to (a) recognize that you need to react; (b) understand the current situation and choose an action; and (c) take appropriate action. *Before* you hit that pylon or drive off a cliff. That's simply too much to ask.
    On the other hand, once the system actually does better than a human overall (less damage, less death), then even that risk would be a good trade for society, though probably not financially (lawsuits against systems and their companies will be held to a much higher standard than we are). Getting there, though, is, as they say, interesting times.

  • @bobnelsonfr
    @bobnelsonfr 9 months ago

    I have to wonder if humans are capable of managing semiautonomous driving. Anyone who has worked in quality assurance knows that people cannot - are literally incapable of - staying attentive to a repetitive situation. We get bored. That is our nature. Despite our efforts, our attention drifts.
    Either a car is fully autonomous or it must have a full-time driver.

  • @scottmcshannon6821
    @scottmcshannon6821 9 months ago

    99.9% of all "accidents" are caused by the loose nut behind the steering wheel.

  • @linusa2996
    @linusa2996 9 months ago

    Recently the USAF has been testing an AI pilot, and it has beaten every pilot it has flown against. The reason it wins is that it's not afraid to die.

    • @linusa2996
      @linusa2996 9 months ago

      @@mikewallace8087 The AI was going for head-on shots down to 100 m, and the only reason there was no collision is that the human pilot tried to avoid it.
      The only pilot who beat one of these was a non-Air-Force pilot: a South Korean sim pilot, who won because he too made head-on shots.

    • @rp9674
      @rp9674 9 months ago

      " if he dies, he dies" - Drago.

    • @linusa2996
      @linusa2996 9 months ago

      @mikewallace8087 It's whichever gets to a shooting position first; otherwise it's a midair.

  • @bestboff1
    @bestboff1 9 months ago

    The technology will put your brain to sleep. Just think about all the drunks who will risk it. I use it on long-haul highway driving only.

  • @robertsyourrelative
    @robertsyourrelative 9 months ago +1

    This is probably the most inane commentary I have seen on this channel. If the state or country has a fairly pathetic level of test performance required to authorize actual manual driving, then what could make you think that testing could in any way alter the behavior of the bad drivers who, without autonomy, would probably be smashing their cars on a regular basis? Get real!
    Why don't you research the actual accident rates of those using FSD regularly vs. the entire community? In such an analysis, the blame for each accident would also have to be accounted for. Age might also play a part. I certainly feel much safer using FSD on busy streets than trying to see what all of the idiots are doing 360 degrees around my car.

    • @transportevolved
      @transportevolved 9 months ago

      Perhaps you could refrain from shouting at your device…

    • @robertsyourrelative
      @robertsyourrelative 9 months ago

      Shouting, what shouting?
      Maybe you are confused because "FSD" is capitalized. Or do you consider a single exclamation mark to constitute a raised voice? It does not.

    • @ryanfraley7113
      @ryanfraley7113 9 months ago

      No offense, but just because you feel safer using FSD, that doesn't insulate you from harm from the dumb person you're pointing out, who may crash into your car anyway, for reasons. Ask countries like Germany that have much safer roads than the US because they have much stricter driver training.

    • @robertsyourrelative
      @robertsyourrelative 9 months ago

      @@ryanfraley7113 No offense taken. Yes, other countries do have much more stringent driver testing than most US states. They also have smaller, more complex roads, smaller, less powerful cars, and the much safer roundabouts instead of traffic lights. While using FSD here, I was rear-ended by a Toyota because the driver wasn't paying attention. The only solution to inattentive drivers is to have Level 5 FSD (a long time coming) or totally foolproof driver-attention monitoring (probably impossible). Nevertheless, I have still not seen the statistics for FSD accidents compared with non-FSD accidents where the FSD driver was at fault.

  • @rogermartinez78
    @rogermartinez78 9 months ago +1

    Elon has a point, but he is being over-optimistic; after mid-century, and closer to 2100, his dream will come true.

    • @rp9674
      @rp9674 9 months ago

      Hopefully sooner, but maybe not

  • @robfelts8076
    @robfelts8076 9 months ago

    Not gonna lie. I had to look up loquacious. 😅

    • @firestar4430
      @firestar4430 9 months ago

      Need to rewatch Goblet of Fire 😊

  • @Derpy1969
    @Derpy1969 9 months ago +1

    We all have drivers licenses. We all know how to drive. Why do we need to learn how to use FSD? Either the car can drive itself or it can’t.

  • @rp9674
    @rp9674 9 months ago

    I'm not a stockholder, and I don't like Musk, but full autonomy will save lives and provide mobility and safety to people. Better drivers would be nice; it hasn't happened yet. Public transportation is nice, but it doesn't work well for low population density; it still has its place, though.
    The transition phase to full autonomy will be rough.

    • @rp9674
      @rp9674 9 months ago

      If we give up this technology, someone else will run with it and own us. We gave up chips and batteries to China; let's stop making the same mistake.

    • @rp9674
      @rp9674 9 months ago

      I'm hesitant to get into the mostly-autonomous controls, where you have to be more alert than when driving manually; it's a difficult transition point.

  • @Roddy451
    @Roddy451 9 months ago

    Algorithm

  • @LewdCustomer
    @LewdCustomer 9 months ago

    This test idea is a baloney one. This isn't Baby America. You're absolutely safer when using ADAS correctly. Stop being frightened and learn about ADAS.

    • @transportevolved
      @transportevolved 9 months ago +1

      For the record, those of us on the team who drive cars with ADAS use it.
      But clearly folks aren’t aware… and people are being hurt and worse.

    • @AnonymousFreakYT
      @AnonymousFreakYT 9 months ago +1

      You said it yourself "…when using ADAS correctly." The whole point of this video is that people *don't* use it correctly far too often. That yes, if people *DID* use it correctly, things would be fine.

    • @ryanfraley7113
      @ryanfraley7113 9 months ago

      @LewdCustomer The test idea is a good one even if you take ADAS out of the picture. If you live in the USA or Canada, you live in a country that gives out licenses like candy to teenagers, as opposed to mandating rigorous driver training and testing. More and more, drivers should be forced into rigorous recertification. ADAS is fine for an open road where there isn't jack you know what for traffic. In cities with crowded roads and bad traffic? Hard pass. That's why we need better public transport, so we can give people more options and move people more efficiently.