Discount Waymo Gets Stuck in Crosswalk

  • Published: Oct 4, 2024

Comments • 98

  • @Bobrogers99
    @Bobrogers99 5 months ago +16

    Two situations where a human driver wouldn't have been stuck, and examining them would be a good opportunity for the Waymo team to update their programming.

    • @D00MINIK
      @D00MINIK 5 months ago +1

      At least the second one was resolved by a human. When it said "Our team is working to get you back on track" and then "You're back on your way," a human had taken over control.

  • @JJRicks
    @JJRicks 5 months ago +7

    "we're cookin' " oh yeah bois we cookin

  • @technotion_
    @technotion_ 5 months ago +6

    Has anyone done a full walkthrough of the Waymo menus recently? I saw you change the temp and would be interested in a breakdown of how all the features look and feel on the screen.

    • @technotion_
      @technotion_ 5 months ago +2

      Interesting design choice from the Waymo team to change "personal belongings" to "Phone Keys or Bag"

    • @KevinChen5
      @KevinChen5  5 months ago +1

      Don’t think this has been done after recent updates. Is there anything in particular you would want to see?

    • @technotion_
      @technotion_ 5 months ago +7

      @@KevinChen5 I would be interested in a walkthrough of all UI elements pre-drive, during the drive, and post-drive. How much can you do on the screen without the app? What can you do only on the screen and not in the app? Also, what does the "pull over" button do? Really appreciate your content, Kevin!

  • @MelissaAndAlex
    @MelissaAndAlex 5 months ago +4

    The guy honking 😂

  • @Aimon30boy
    @Aimon30boy 5 months ago +3

    Can you try FSD in that same scenario where it got stuck and people went around you, if you're able to? I think it would be really interesting to see how they behave differently.

  • @JJRicks
    @JJRicks 5 months ago +3

    8:20 excellent deduction sir, dedication to the content 🫡

  • @AbhinavGupta777
    @AbhinavGupta777 4 months ago

    When the car first stops at the intersection of Townsend & 4th St, look at the traffic lights: the one on the left side of the street is green, while the one on the right side (which isn't as clear to me) appears to be glitching between yellow and red.

  • @johnfurr6060
    @johnfurr6060 5 months ago +1

    I like the visualization and route planner a lot! I use Tesla FSD quite a bit, and the new version really shortened the route planner (blue noodle); I'd really appreciate seeing it again to anticipate the car's moves better.

  • @fecorekasi
    @fecorekasi 5 months ago +2

    Driving stack needs time to reboot :D

  • @distractos
    @distractos 5 months ago

    About the Pittsburgh left: here in Boston we typically wait or do a slow roll to let the left-turn traffic go through incrementally, for a smoother flow of traffic. Otherwise, they may have to wait for a full light cycle before moving, which would slow down traffic on our narrower streets. It's possible that's what the person at 11:23 was doing.

  • @TeslaDo_d
    @TeslaDo_d 5 months ago +9

    100% a human intervened remotely to continue at stuck positions. Don't be fooled. They are not L5 autonomous.

    • @communityband1
      @communityband1 5 months ago +1

      Waymo has never claimed to be Level 5. Level 5 does not and will not ever exist unless AI literally surpasses humans in every way, and even then, it's very possible it won't take the form of cars driving themselves. Waymo is Level 4.

  • @manwingchi9156
    @manwingchi9156 4 months ago

    If it charges you by mileage, then it makes perfect sense that it went in a loop.

  • @germainmachleidt2321
    @germainmachleidt2321 5 months ago

    Funny how the BMW driver changes their mind about going straight 7:09 ... 😄

  • @Arron_Mottram
    @Arron_Mottram 5 months ago

    10:25-11:20 You can see on the display how two pedestrians who crossed the road stopped to watch the autopiloted car drive on

  • @mikafiltenborg7572
    @mikafiltenborg7572 5 months ago +6

    Tesla's FSD competition is coming All over them self 😂

    • @xploration1437
      @xploration1437 5 months ago +2

      Waymo is trash

    • @communityband1
      @communityband1 5 months ago

      One thing people don't understand is that Waymo and FSD operate under two completely different rule sets. FSD is designed to take chances when it isn't sure whether it's correct or safe. Tesla sets the threshold _pretty_ high for safety, but they absolutely let the software take risks when they aren't 100% sure it's safe or aren't 100% sure they are identifying objects correctly. FSD is optimized to create an experience that the human driver will enjoy, and it relies on the human driver to intervene if it makes a critical mistake. Waymo on the other hand does not. If it has any doubt at all, it does not act and instead seeks remote assistance.
      Something in this situation seemed to create a seed of doubt for the Waymo. We can't say for certain, but it seems a fair guess that it fell within that safety tolerance gap that FSD and Waymo have between them. If and when Tesla decides to truly make FSD autonomous, operating with no human driver inside, they will become much, much more cautious and risk averse than they are today. Comparing them right now is apples to oranges, because FSD is not behaving at all like it will behave if it absolutely can't screw up.
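
      (A rough, purely illustrative Python sketch of the policy difference described in the comment above. The thresholds, names, and the "remote assistance" hook are hypothetical, not any company's actual code.)

      # Illustrative only: two hypothetical policies for acting under uncertainty.
      from dataclasses import dataclass

      @dataclass
      class Maneuver:
          description: str
          confidence: float  # planner's estimated probability the maneuver is safe (0.0-1.0)

      def driver_supervised_policy(m: Maneuver, threshold: float = 0.95) -> str:
          """Act once confidence clears a high bar; a human driver is the backstop."""
          return "EXECUTE" if m.confidence >= threshold else "YIELD_TO_DRIVER"

      def driverless_policy(m: Maneuver, threshold: float = 0.999) -> str:
          """With no human inside, any residual doubt means hold and ask for help."""
          if m.confidence >= threshold:
              return "EXECUTE"
          return "HOLD_AND_REQUEST_REMOTE_ASSISTANCE"

      turn = Maneuver("unprotected left across traffic", confidence=0.97)
      print(driver_supervised_policy(turn))  # EXECUTE
      print(driverless_policy(turn))         # HOLD_AND_REQUEST_REMOTE_ASSISTANCE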

    • @xploration1437
      @xploration1437 5 months ago

      @@communityband1 you missed the real difference. Waymo is a silly map-based system, whereas Tesla is not. Tesla deals with real-time driving situations based on cameras alone. Waymo is covered in many very expensive sensors. Waymo won't make it much longer.

    • @communityband1
      @communityband1 5 months ago

      @@xploration1437 That is incorrect. Waymo is not "map-based." It operates much the same way FSD does, creating a dynamic map of the environment on the fly and reacting to it. The distinction here is that Waymo ALSO has a detailed 3D map of the environment stored in memory that it then compares this live data to. It is essentially memory, and it's similar to the way humans operate. We remember things about the places we've driven. We remember where potholes are. We remember where lanes divide and merge. We remember where stop signs are supposed to be, even if they've been knocked down. We remember where curbs are, even if they're invisible because of snow. City traffic operates much better because the vast majority of drivers have a memory of the roads and how things work. Waymo's maps are made by the vehicles themselves, and the concept is more like the way humans operate than FSD is, not less.
      Waymos don't _need_ the maps to be accurate to operate. When the dynamic map created on the fly doesn't match the stored map, the vehicles simply drive the same way FSD does. The thing is though, that can be extremely rare, and this is where autonomous vehicles can have a big advantage over humans. They can share their discoveries. Instead of everyone encountering a new pothole the first time, you have the potential for just one vehicle to discover it and add it to the saved memory for all vehicles to know about. We all encounter construction zones on a fairly regular basis. But we're almost never the _first_ ones to encounter them.
      Tesla's lack of sensors is not an advantage. First of all, Tesla is by no means better at image recognition than Waymo. Waymo comes from Google, and Google leads the world in this area, thanks to billions of users feeding them unlimited data. So if Waymo finds it advantageous to use multiple types of sensors for redundancy, you shouldn't be quick to dismiss it. But the real problem for Tesla is one that FSD users have pointed out. They have blind spots, and they can be blinded. We recently saw a video where rain droplets on the rear camera blinded FSD to traffic from behind, and it may have been a factor in the vehicle nearly causing an accident before the driver intervened. FSD also has blind areas, such as just in front of the car. If a person is on the ground close to a Tesla before FSD starts, it can be impossible for FSD to know they're there. I bring up this particular example because a scenario like this is what caused Cruise to be taken off the roads in California. Having more sensors and having sensors of different types is what enables Waymo to have the high confidence it needs to operate. And this may ultimately force Tesla to match, because if an accident EVER happens due to sensor limitations that Tesla knows about and which their competitors solved by not being so cheap, Tesla's liability would be massive.
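
      (A small illustrative sketch of the "stored map as memory" idea described above; the data model, names, and mismatch handling here are invented for illustration and are not Waymo's actual stack.)

      # Illustrative only: compare live perception against a stored prior map and
      # fall back to live-only driving wherever they disagree.
      PRIOR_MAP = {
          ("4th St", "Townsend St"): {"stop_sign": False, "traffic_light": True},
      }

      def report_discrepancy(intersection, live):
          # In a fleet, one vehicle's discovery can update the shared map for all.
          print(f"map update candidate at {intersection}: {live}")

      def plan(intersection, live):
          prior = PRIOR_MAP.get(intersection)
          if prior is None or prior != live:
              # Prior missing or stale (construction, knocked-down sign, snow-covered curb):
              # trust the live sensors and flag the difference for the shared map.
              report_discrepancy(intersection, live)
              return {"source": "live only", "observation": live}
          # Prior and live data agree: drive with the extra confidence of "memory".
          return {"source": "live + prior", "observation": live}

      print(plan(("4th St", "Townsend St"), {"stop_sign": True, "traffic_light": True}))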

    • @xploration1437
      @xploration1437 5 months ago

      @@communityband1 your ignorance is really showing.

  • @JD-kf2ki
    @JD-kf2ki 5 months ago

    If it were Tesla's FSD, it would've been long gone from that intersection already.

  • @davidwarren78
    @davidwarren78 5 months ago +3

    Mate, what do these rides normally cost?

    • @KevinChen5
      @KevinChen5  5 months ago +3

      $20 normally, or $15 with the weekday daytime promotion. Screenshot: files.kevinchen.co/a/v1/icgnm7io475dl3crl8l46vgy/IMG_3250.png

    • @davidwarren78
      @davidwarren78 5 months ago +1

      @@KevinChen5 thank you. Something to consider, as some of your viewers may not be in the USA and have no idea about Waymo. I'm in Australia, never heard of Waymo, somehow found your videos and went "wtf". Keep them coming.

    • @brightya
      @brightya 5 months ago +1

      @@davidwarren78 It may only have become legal for Waymo to drive in California a few weeks ago.

  • @designbymickey
    @designbymickey 5 months ago

    I can't imagine what it would feel like to be in this car, in both a positive and a negative sense. This video really shows some interesting flaws. Completely random: I'd love to ask you some questions in an interview (I'm a UX designer and I'm doing a thesis project for my master's degree!)

  • @hybridsnowleopard
    @hybridsnowleopard 5 months ago

    This thing witnessed a fraudulent repo recently.

  • @ZoomZoomMX3
    @ZoomZoomMX3 5 months ago

    The way it turns the wheel when the car is not moving is destroying the car's steering system

    • @I_dont_want_an_at
      @I_dont_want_an_at 5 months ago

      maybe, maybe not. Either way, this thing is a piece of crap. If it EVER does things this dumb, it shouldn't be out there. it proves it lacks reliable intelligence

    • @xploration1437
      @xploration1437 5 months ago

      Tires, not steering.

  • @guocity
    @guocity 5 months ago +1

    Does Waymo avoid potholes?

  • @technotion_
    @technotion_ 5 months ago

    Interesting bug. It not connecting to support or turning on the hazards for that long leads me to believe that despite being stopped in the road for 3+ minutes, the driver thought everything was operating perfectly fine. I also saw that stop sign flash on the viz before entering the intersection; does that have anything to do with this?

    • @findlisa5
      @findlisa5 5 months ago

      Right, I think the Waymo flashed the stop sign on its display because it recognized the red and white "DO NOT ENTER" sign on the right as a stop sign. The Waymo knew that it was, nevertheless, legal to continue straight on Townsend St (because of its maps), and it merged the presence of a stop sign with that of a functioning traffic light and tried to obey both simultaneously. It stopped for the stop sign initially, by making a complete stop even at a green light, then waited for the red light AND waited its turn at what it considered a busy 2-way stop sign during the green light. Waymo needs to work with SF traffic engineers and get the "DO NOT ENTER" sign at 4th and Townsend to face the correct direction as well as any other signs like this one.

    • @technotion_
      @technotion_ 5 months ago

      @@findlisa5 Arguably Waymo should be able to tell a stop sign from a DO NOT ENTER sign, but generally I agree that more cooperation between Muni and Waymo is needed.

    • @afelso
      @afelso 5 months ago

      Stop sign came on as soon as the Waymo detected the tram. And this somehow messed up the driver.

    • @findlisa5
      @findlisa5 5 months ago +1

      @@technotion_ Good point. Let me correct that. I think the Waymo saw the "DO NOT ENTER" sign, and wrongly interpreted it to pertain to traveling NW on the NE side of Townsend St, which the Waymo was doing at the time. That was incorrect, since that road usage is lawful. Instead, that sign pertains to going SW on the NW side of 4th St, which is NOT lawful road usage.
      I think the problem is that the right lane on Townsend St. is so far from the right side of the road that any person or machine could read signs meant for drivers who are heading SW on 4th St. It looks like Waymos do not try to see which way a sign is pointed; instead, they just try to read those signs. If a Waymo can read a sign, it figures that sign pertains to its driving, which here was wrong, but was ultimately fixed by a hazard-light-related reset.

    • @findlisa5
      @findlisa5 5 months ago +1

      @@afelso I thought that at first too, but I think that is wrong. Note that the Waymo display shows a red stop sign starting at 3:54, before the tram on the right even arrives at the intersection. The hardest part is that it gets genuinely goofed up, because there are times later when everything is gone (trams and all other traffic), yet the Waymo still doesn't go. It seemed like a reset must have occurred after turning on the hazard lights, and that did clear the problem, whatever it was. My guess is that this allowed the Waymo to override what it saw as a "DO NOT ENTER" sign.
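
      (A toy Python sketch of the fix being described in this thread, i.e. only honoring a detected sign if it actually faces the vehicle's direction of travel. The geometry, angles, and names are made up for illustration.)

      # Illustrative only: ignore signs whose faces don't point back at our approach.
      def sign_applies_to_us(vehicle_heading_deg, sign_facing_deg, tolerance_deg=45):
          # A sign applies to us only if its face points roughly opposite our heading,
          # i.e. toward traffic approaching it, rather than toward a cross street.
          diff = (sign_facing_deg - vehicle_heading_deg) % 360
          return abs(diff - 180) <= tolerance_deg

      # Vehicle heading 315 degrees. A sign whose face points 135 degrees is aimed
      # straight back at us and applies; one facing 45 degrees is posted for the
      # cross street and should be ignored rather than read and obeyed.
      print(sign_applies_to_us(315, 135))  # True
      print(sign_applies_to_us(315, 45))   # False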

  • @I_dont_want_an_at
    @I_dont_want_an_at 5 months ago

    if it EVER does things like this, it shouldn't be out there without a human driver. It proves it lacks reliable intelligence. Maybe you can argue it always makes safe mistakes, never dangerous mistakes. But really...

  • @khotan
    @khotan 5 months ago +10

    I am sure FSD 12.3.3 doesn't have this issue.

    • @JJRicks
      @JJRicks 5 months ago +8

      yea it'll drive you into a wall

    • @blessguy5330
      @blessguy5330 5 months ago +3

      @@JJRicks I don't know about that, but it is still a little sketchy. A huge improvement, but still nothing to compare Waymo to. 😂

    • @thewatcher5822
      @thewatcher5822 5 months ago +7

      @@blessguy5330 I actually think Tesla has solved autonomy. It is not perfect (hence being supervised), but all the pieces are now in place; Tesla just has to put them together. I wouldn't be surprised if Tesla is operating Level 4 somewhere next year.
      Waymo is good, but has huge limitations.

    • @ddud4966
      @ddud4966 5 months ago

      @thewatcher5822 Not even just Tesla: ever since 2023 or so, anyone can come along with a big driving dataset and a bunch of compute and train a world model for e2e driving. Tesla is leading when it comes to these world models, but it's probably just a few years' gap before there is some open-source project on GitHub that basically does the same thing. Not only laying the groundwork for driving, but building something that accrues to future robots that will go into any random kitchen and make you a sandwich and then make your bed.
      Hand-coding some C++ driving planner like a Waymo driver in 2024 is just farcical, doomed to be replaced like the old days of computer vision where people hand-tuned feature detectors and all that crap.

    • @eli128
      @eli128 5 months ago +1

      Tesla and Waymo both live in the same period, where they're forced to reckon with the fact that you can now train these giant foundation world models that automatically learn all the structure needed to drive end-to-end anywhere on earth, which you couldn't do only a couple of years ago. Tesla threw out their entire 300k-LOC C++ stack, and Waymo will have to do the same, because hand-writing a driving robot in 2024 is just utterly farcical now. It's like old-school computer vision, where you're still trying to hand-tune feature detectors in the 2010s; no one does this. And before long there will be some free GitHub project that basically does the same thing.
      The only thing they're missing now is real agency: it's all purely predictive, and there is no way for the thing to really formulate a plan the way a Go-playing bot does. Maybe that's good enough for driving, but I'm not sure it will get us cooking and cleaning robots.
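
      (A heavily simplified sketch of what "end-to-end" means in this thread: one learned network mapping camera pixels straight to controls, instead of a hand-written perception/planning pipeline. Layer sizes and shapes are arbitrary and do not reflect any company's architecture.)

      # Illustrative only: a toy end-to-end driving policy.
      import torch
      import torch.nn as nn

      class ToyDrivingPolicy(nn.Module):
          def __init__(self):
              super().__init__()
              self.encoder = nn.Sequential(           # stand-in for a learned "world model"
                  nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
                  nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1), nn.Flatten(),
              )
              self.head = nn.Linear(32, 2)            # -> [steering, throttle]

          def forward(self, frames):                  # frames: (batch, 3, H, W)
              return torch.tanh(self.head(self.encoder(frames)))

      policy = ToyDrivingPolicy()
      controls = policy(torch.rand(1, 3, 96, 96))     # two control outputs in [-1, 1]
      print(controls.shape)                           # torch.Size([1, 2])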

  • @ovalwingnut
    @ovalwingnut 5 months ago +1

    What a load of carp.

    • @beehappy7797
      @beehappy7797 5 months ago +1

      Musk is laughing all the way to the bank.

  • @beehappy7797
    @beehappy7797 5 months ago

    One or two of these are fine, but too many will create traffic jams across the city. There is no indication that they will improve significantly anytime soon.

  • @technotion_
    @technotion_ 5 months ago +3

    The Pittsburgh left at 11:24 was completely unacceptable in my book, because a pedestrian was clearly in the crosswalk when it turned. That is blatantly against the law.

    • @blessguy5330
      @blessguy5330 5 months ago +1

      The turn was more than 70% complete when that guy jumped out.

    • @technotion_
      @technotion_ 5 months ago +1

      @@blessguy5330 He didn't exactly "jump out"; a pedestrian walking on the sidewalk with possible intent to cross should be treated that way, IMO. It's a dangerous game between acceptable and unacceptable in these situations, and I don't think autonomous systems should be playing those games with our most vulnerable road users, especially for something like this, which is nothing more than a time saver.

    • @zachb1706
      @zachb1706 5 months ago +3

      The guy was nowhere near the car; it's a perfectly safe manoeuvre.
      If autonomous vehicles have to follow the letter of the law, then they will fail. 1. Humans don't even follow these laws; many are excessive, and humans won't want an autonomous vehicle that's slower than they are. 2. These laws are written with humans in mind, not autonomous cars with 360° surround view, insane reaction times, and super-accurate predictions of where things will move. Things like rolling stops will become standard in autonomous cars because they don't need to stop.

    • @blessguy5330
      @blessguy5330 5 months ago

      @@technotion_ Yes, we have very different opinions, because the car was already making the turn when that guy decided to step onto the crosswalk, since the car was far away.

    • @ArielChelsau
      @ArielChelsau 5 months ago +1

      This is unreasonable. The car has to be assertive in order to move efficiently; without that, they would take forever to get anywhere. The point of transportation isn't safety, it's getting from A to B.

  • @sonicmoore6196
    @sonicmoore6196 5 months ago

    LOL Waymo is trash! 😂

  • @ZoomZoomMX3
    @ZoomZoomMX3 5 months ago +2

    Get this garbage off the roads.
    Pay a driver if you can't drive.

    • @itsyo42
      @itsyo42 5 months ago

      Hahahahaha

    • @JJRicks
      @JJRicks 5 months ago +1

      Cry about it

    • @quickpstuts412
      @quickpstuts412 5 months ago

      😂

    • @MaxBrix
      @MaxBrix 5 months ago +2

      Most people can't drive and do not pay a driver to do it. They just drive bad.

  • @zepplin839
    @zepplin839 5 months ago

    Self Driving DOA?

  • @xploration1437
    @xploration1437 5 months ago +1

    Waymo is trash

  • @kenion2166
    @kenion2166 5 months ago

    I feel like Waymo has made no improvements in the last few years; it looks and feels exactly the same. The steering wheel also still jerks all over the place...