The people in the car are the most protected, so the car should sacrifice itself and its passengers: they are the most likely to survive (the car also has the most control over itself), and the car can't control the other people. We must also remember that 10% of accidents may not be driver error, but that doesn't mean the drivers couldn't have avoided them. Your chance of dying by your own hand is way higher than by the car's (which could be as low as 1 in a billion).
Driverless cars should be programmed to communicate with each other. This will dramatically reduce vehicular collisions. They should also be programmed not to crash into or run over any obstacle, including humans.
If the car can be safer overall by a large enough margin, it could have protecting passengers as its primary aim and still be involved in fewer fatal accidents with pedestrians and other cars than current driven-by-occupant cars.
People are 99.995% safe; it's way harder to get cars just as safe.
@@ChickenPermissionOG HAH hahaha hahahaha no people are not even close to that safe. Hahaha
How would a robot choose between hitting an oncoming car or driving off the bridge when a crash is imminent? It happened to me, and I chose to hit the oncoming car; it was a light wreck and everyone was OK, but if the car had been self-driving, would it have driven my car off the bridge? Driverless cars will never be a thing, and if they are, the lawsuits would dismantle the whole system in months.
hey why don't you program it to hit the brakes? you don't have to hurt anyone.
alternatively, swerve into the one person, then swerve back to hit the crowd, and then hit a wall. kill everyone.
Such an honor to be on the slate of speakers with you. I have thought often about your talk.
I thoroughly enjoyed your TED talk as well, Miss Wu.
Pre-programmed ethics will impact everyone even if you don't own a car. Think about getting into a driverless taxi or Uber. Those would no longer be programmed to save the rider at any cost; they would favor less total harm over your life.
Simple answer: driverless cars should initially prioritize the driver, in order to replace the entire market with them over 20 years (they could even offer both priority options for the altruistic market). Then we would eventually update all the software on the cars to be utilitarian. This will greatly reduce deaths even at first, since the cars will still be much better drivers. Also, we will eventually have to make it illegal to drive regular cars, for safety.
So you're forcing people to drive driverless cars? Wow, you're the definition of a dictator. Get out of America
5:50 explains exactly what happens when you reject objective morality.
Why are people laughing about the eject button? That actually sounds like a good idea... although... it's maybe something that should also be triggered automatically. Not sure...
I know. It's a safe way to ensure that the only damage is to the car.
Because it's a thought experiment about when there are no safe exits and the car has to make a tough choice; it's not about making up some tech that can save you.
All this talk about driverless cars just assumes they would be so much safer than normal cars. I feel like isolated accidents would decrease, but the risk of something catastrophic happening to EVERYONE's car at the same time would increase. I've seen enough cases of companies being hacked and overloaded, which shuts down the service for everyone. If that happens to whatever network these cars are connected to, then all the car accidents which happen over the span of a year could happen at once.
There's no way I would buy a car that someone else can drive while I'm in the damn seat lol
It could be decided that the auto-pilot would only be accessible via a wired connection (i.e. not wirelessly) and, therefore, wouldn't communicate with the other cars surrounding it and would rely exclusively on its sensors and AI. So if you wanted to hack in, you'd have to physically plug into the car. They could also look at how the auto-pilot for planes works. I've yet to hear of anyone hacking into that and crashing a plane.
Blockchain technology would make that a near impossibility
Not if each individual car has its own AI. You would have to hack into every car. You're assuming that all the cars are running on one system.
Zoey Chevalier that's when all cars are connected to one another. This was not the topic of this video. He just assumed a singular self driving car 😑
This dilemma is one that humanity must go through with many of the new technologies being introduced in society. Then comes the old Star Trek question: what does one do? Does one want to benefit the many or the few? And that creates division with the many! Interesting dilemma.
Just add random() to that algorithm ;)
#yolo
Self-driving AI should not be able to harm a human. I think it should prioritize the protection of its owner, but only so long as it does not make a decision that would harm another person. In the example, the car should just keep going forward; while that kills the most people, it means the car has not made a decision on the matter. Of course, if there is a viable option to save everyone, it should take that.
The chance of dying if a car hits you is much higher for the pedestrians than for the one person in the car. My car should harm me rather than run over two other people crossing the street on the green signal. If the signal is red, the car should run them over even if there is a pregnant woman or two kids. They should not cross the street when the red light is on.
But then again, is the owner behind the driverless car at fault for the accident? Or is it the company that created the car? If it's the company, then you can expect prices of driverless cars to skyrocket. If not, then you just give corporations a massive loophole to take zero accountability when someone is harmed by their product. Excellent future.
Why doesn't the car just brake? Your best bet for survival in any imminent accident is to drive dead straight while braking hard.
Autonomous cars don't drive sleepy, rushed, drunk, visually impaired, aged, with unpredictable speed changes, or with road rage.
They drive with discipline and focus. They drive smoothly and predictably.
It is the non-automated cars that are the danger. There are so many morons driving vehicles.
AI will be trained on billions of cases recorded from drivers, and in edge cases like this it will be trained on the win-win situations, not lose-win, lose-lose, or win-lose.
These cars are fantastic. And they will improve exponentially over time.
Agree.
The literal problem will more likely be circumvented by superior average speeds achieved without deadly top speeds, larger distances between cars, "training" AIs for chain reactions that minimize any damage (aiming to not hit anything), and even softer, outer-cushioned cars. You could even reach a point where the AI still faces the dilemma of hitting one or more people, but the odds of that resulting in a fatality or grave injury have been reduced as close to negligible as possible. The evaluation will probably ultimately be set by legal standards, which I guess will be "utilitarian"/numeric: minimize the number of people harmed and the severity of harm, with more people harmed still being better than one person killed, anyway. Not a simple "equation" I can come up with here, but something along those lines.
How does legislation actually deal with the situation of someone being put by a sadistic villain in front of a lever that controls which direction a car will go at a fork, with different numbers of people tied to the rails?
Totally agreed, was thinking the same.
Minimizing the number of people harmed seems to be the best solution, but consider the case where you're the passenger and your car decides to swerve right, because on the left there are more pedestrians, and so it kills only two instead of maybe five. But now those two people are your wife and your child, who were out shopping and simply in the wrong place at the wrong time. Could you live with yourself after your car drove your family dead while you were inside it? I wish you all the best with the rest of your life after the accident.
Self-driving cars, trucks, buses, and other vehicles should be implemented everywhere around the world. We need more self-driving vehicles with minimal input from the driver. Technology will control everything, including our lives. Machines need minimal input and maximum output from computers.
What about people who enjoy driving? That is important as well. I hope driving will still be allowed, as those who love cars are more careful than average drivers.
@@SR2brother that's not important. The only important thing is the algorithms, computers, sensors, software and hardware, artificial intelligence, deep learning, machine learning, internet of things, big data, cyber-physical systems, edge computing, cloud computing, high-speed internet, etc. related to self-driving cars. Why do you care more about people who love the driving experience anyway? They're outdated, don't want change, only care about themselves, selfish. They should accept technological changes for automobiles, government regulation, and more internet-connected devices.
It looks easy to solve:
First, the car has a set of rules to save the passenger without harming anyone (everybody wins).
If that fails, minimise deaths, and treat the passenger as the bad guy; no one in the street asked you to drive there (see the sketch below).
Problem solved, that's it.
If you have a family in the car vs. 1 guy on the street, hit the empty wall, hope for the best, and make better car parts next time, or yes, take it in for maintenance as required, you a.h. family 🤣.
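In code, that priority order could look something like the minimal sketch below. The option names and numbers are hypothetical, just for illustration, not from any real car's software:

```python
# A rough sketch of the rule hierarchy described above (hypothetical names):
# Rule 1: pick any option that saves the passenger without harming anyone.
# Rule 2: if no such option exists, minimize total deaths, passenger included.

def choose_action(options):
    """Each option is a dict like:
    {"name": "swerve", "passenger_deaths": 0, "other_deaths": 2}"""
    # Rule 1: an option where nobody at all is harmed wins outright.
    safe = [o for o in options
            if o["passenger_deaths"] == 0 and o["other_deaths"] == 0]
    if safe:
        return safe[0]
    # Rule 2: otherwise minimize total deaths, counting the passenger as one more life.
    return min(options, key=lambda o: o["passenger_deaths"] + o["other_deaths"])

if __name__ == "__main__":
    options = [
        {"name": "straight", "passenger_deaths": 0, "other_deaths": 5},
        {"name": "swerve",   "passenger_deaths": 0, "other_deaths": 2},
        {"name": "wall",     "passenger_deaths": 1, "other_deaths": 0},
    ]
    print(choose_action(options)["name"])  # -> "wall"
```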
bit of a tangent, but them rascal robots in the Will Smith film really needed that zeroth law...
YouTube subtitles could not recognize the names of his colleagues. Could someone repeat them?
Just ban cars and put all resources into public transportation 👀
To me there is no dilemma. I DON'T WANT A DRIVERLESS CAR!
These were my answers to the Ethical Machine that was presented. :-)
I thought the animals were forced into the situation, so they should get extra protection. Kids have fewer social bonds and less competence is lost, so fewer people are affected. The people in the car have obviously chosen to be there, so all else being equal they should pay the price, but not so much that the total loss of life is increased. Jaywalkers probably contributed to the accident, so they deserve less protection.
Would I buy such a car? No, but I think car ownership goes away with self-driving cars; you order one for each trip. I have not refused to ride with known bad drivers, so I would take a ride with these cars as well. :-)
The auto industry taught people to HONK when they lock and arm their cars.
Some of us care too much about those with hypertension, PTSD, cardiac disease, migraines, and more to do that. We care that people can sleep, not be woken up, and have peace of mind during the day. Please join us.
We lock with a light flash only.
Thank you.
The fact is, in most cases all you can really do is hit the brakes if something happens in front of you.
Americans are not ready to give up cars.
@jan simonides lol...
This seems cool, but all this is doing is taking more jobs from people. First there were robots that do the making of cars etc., now they're talking about robots at fast food places, and that is just the beginning, because now they are taking over the driving industry. They also have trucking companies doing the same thing. Before you know it there will be no more jobs for people. Even at warehouses there are robots that pick the product. Just saying.
I know I want to be a trucker and am begging to get into the industry, and it's sad that you have to actually worry about a robot taking over your job. What has this world come to? I don't know if I should become a trucker. I'm good at driving, that's all I like, and the robots might take it away from me.
Is English his native language?
PEOPLE, WE NEED TO INSTALL AN EJECTOR SEAT ON THE DRIVERLESS CAR, SO THAT IN CASE OF DANGER IT EJECTS THE PASSENGER!!!
john cena
The car should save the driver, else nobody will buy this shit! Other consequences do not matter. I hate this subject.