WHO WOULD YOU KILL? | Moral Machine

Video Statistics and Information

Captions
Hello everybody, my name is Markiplier, and welcome to the Moral Machine. Now, what is the Moral Machine? Well, it's in reference to autonomous driving and the moral ambiguity of what the AI should do while it's in charge of your car. And you may be thinking that autonomous driving is something down the road and in the future, but in reality, it's already here. Tesla just released a video showing their car fully driving from point A to point B through crowded streets, real-life scenarios, in broad daylight, without any interaction from the person in the car. And you may be thinking, "Oh, that's just a tech demo," but even today, there are more and more cars with autopilot features that take them down the highway... all by themselves.

So there is a dilemma in this autonomous driving. If you're driving down the road, you have a choice. If someone leaps out in front of your car and you were going to hit them and definitely kill them, you have a choice to swerve out of the way or continue forward. Now, this seems like a no-brainer, to me at least, because my car is built to save my life; the human body in front of me is built to not withstand a car. So for me, it's an obvious choice: I would swerve. But what would the car do? And that's the question that we have to ask here. What should the car do? Should the car protect its occupant, its owner, above all else, or should it do calculations to determine who should die?

Now, in another scenario, imagine that you're driving down the highway and there's a group of children in front of you, and you are guaranteed to kill every single one of them if you keep driving down the road. But you're on a bridge, and if you drive off the road, you will fall to your death. The choice becomes harder. Now, what would the computer do? Would it calculate that five is more than one and kill you? In my mind, that's still an easy choice; I would rather die than five children. But it's not up to me if there's complete autonomous driving. How do lawmakers make that decision? What if there's someone in front of you, and you're not on a bridge, but swerving means that you go into oncoming traffic? In that case, you might have to actually keep going forward. Even further, how many people would there have to be in front of you to outnumber the people you would kill by swerving into traffic? These are the choices that an autonomous car would have to make, and these are the choices that we, as drivers, have to face. So let's decide.

So this one's simple. What should the self-driving car do? Should it kill three occupants, an entire family, or these two people here? What would the car decide? There's no punishment for being one or the other, so these are the scenarios I just presented to you, but now I actually have to make a choice for this. And this is guaranteed death. Guaranteed death; this isn't like, "Oh, the car might actually--" "Airbags might deploy, it might be okay." No, this is guaranteed death. You're careening down the road at 100 miles an hour; there's no way you're going to survive that obstacle. See, in my mind, in this scenario, it doesn't matter that it's one elderly woman, one criminal, and one boy. Three people versus two people. And it even tries to paint the other people as superior people: one male athlete, one female executive. With the introduction of autonomous driving, does it come down to a numbers scenario? I can't stay here forever; I have to make a choice, and I have to go with... *Think* *Mumble* Three is greater than two.
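(The rule Mark applies here reduces to a bare headcount comparison. Here is a minimal Python sketch of that purely count-based rule; the function name and the two-option framing are assumptions for illustration, not anything from the video or the Moral Machine site.)

def choose_outcome(stay_deaths: int, swerve_deaths: int) -> str:
    """Return whichever action kills fewer people.
    Ties default to staying on course (no intervention)."""
    return "swerve" if swerve_deaths < stay_deaths else "stay"

# The first quiz scenario: swerving kills the three occupants,
# staying on course kills the two pedestrians.
print(choose_outcome(stay_deaths=2, swerve_deaths=3))  # -> "stay"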
And that's it. That's it; there's no punishment for it. Now here's another scenario. What should the car do? Should it swerve to avoid the two people in front of it, and also kill everyone else here who is obeying the walk signal, or should it continue on and kill these two people? In my mind, this is a little easier than the other one. Five is greater than two, and they're not looking at the walk signal. I mean, seems obvious to me.

Ooooh. Oh no. Should it kill five people, or *voice crack* five animals? Three cats and two dogs! Oh, I hate to say it, but I think I should kill the dogs and cats! And that's horrible to say out loud! But these are the choices!

Oh, now it's perfectly even. What should a self-driving car do? Should it try to save its occupants, or should it kill the people walking? The strange thing is that if I was in that car--if I'm gonna put it in a literal scenario: it's me and Tyler in that car, we're driving down the road, not doing a thing, the car is taking us where we want to go. I feel like if it was two versus two, if I was in that car, I would rather it kill us than two other people. And that's a big thing for me to say to Tyler, y'know, like, "Hey, sorry buddy, looks like we're gonna die." "Oh well, nice knowin' ya." Like, that's, uh, that's not an easy thing for me to say. But if I'm in that car, I'm okay with assuming the risk that I might die in this autonomous driving car. I think it should kill the occupants.

Here's another one! This is the same scenario, but the thing is, that lady is not obeying the traffic signal. Should it swerve out of the way and kill its occupant? Definitely do that. In this scenario--in this specific scenario, how would the car know that the person is crossing the street at the wrong time? In all honesty? The lady is not crossing at the right time; she is literally crossing when she's not supposed to. The car should continue.

Oh, this is just-- this is the same scenario, but just two men versus two women. It's the same choice.

This is another scenario: we've got a man, a homeless person, a criminal, and a woman, versus two female doctors, a man, and a woman--but they're crossing at the wrong time. The signal is red; they are not supposed to cross. If they're not supposed to cross, they're not supposed to be there. But does that deserve death?! Does that... deserve... death? I don't kno-o-o-o-w... *despondent sigh* In this scenario, I have to go with what I said before: people should pay attention and follow the instructions. But should that cost them their life?! I think so, in the case where the only determining factor is whether someone is obeying the traffic signal or not. *despondent growl* It's not good. It's not good.

Ooh god. Oh noo. Oh no! They're both doing fine! Should it kill... an old lady and a guy, or a woman and a boy? In my mind, and this is me being logical about it, the old lady is... an old lady, and the kid is a kid. I'd save the kid.

This is four versus five. This is a numbers determination; you gotta kill the lesser amount. There's kids in both scenarios; gotta kill the lesser amount.

Alright, now here *despondent chuckle* here, here! Here... is like a perfectly s-- a perfectly symmetrical equation. There's a female doctor and a girl, and a male doctor and a boy. Doesn't matter the gender; it's an adult and a kid. An adult and a kid in both scenarios.
Should the car continue its path, or swerve to avoid? *highly despondent sigh* This one is probably the toughest decision in this... scenario. And I say that knowing there's ten of thirteen left! So honestly, who knows what questions lie ahead? But what I see here is... it breaks down to this: the autonomous car can scan the road ahead better than we ever could. My dad, when he was teaching me to drive, taught me to always look ahead down the road, see routes of escape, see what would happen if there was an accident, and know where you're going to go. But an autonomous car can do that with a greater attention span than I ever could. So it would know that there's four people on the road; it would know there's two people in those scenarios. But what I see here, fundamentally, is the car on the left taking an action to try to avoid an accident. Even if it results in the same loss of life, it's taking an action. In the second scenario, in this scenario here, it's continuing on without taking an action. If it does something, scenarios change, and who knows what could happen if it swerves out of the way? But if it continues on the path, it's guaranteed. In this scenario, guaranteed death, yes; I understand that it's guaranteed death in both scenarios. But what I see fundamentally is the car taking action on the left versus the car continuing on the path. I would rather it take an action than not, 'cause when you deviate, you change possibility. That's the only thing that I could say out of this. I don't think either is a right choice here, because if it's determined just on loss of life alone, there is no way to say which one's better, so I'm going to fundamentally define it by action versus inaction.

And this is a similar scenario; this is just four people versus four people. It just so happens that some are athletic and some are not. Maybe they could run away faster! I have no idea, but I have to determine the same outcome that I used in the last one. It's action versus inaction. *laughing* And maybe they're sturdier??

This is not good. Oooh boy. Five people versus five animals. And again, in this scenario, *pained* I'd pick the animals! And it's coinciding with the action/inaction thing that I was talking about.

Ooooh. Now it's completely equivalent. Oh, no it's not! They're not obeying the traffic laws! Ha HA ha ha! They're flouting the law by crossing on a red signal. So... in this scenario... it's hard to say. It's hard to say. It is really, really, REALLY hard to say. But, judging by my previous logic, they are NOT crossing at the right time, and there is a definitive obstacle on the other side that results in guaranteed death. I gotta, I gotta go with this one. Oh geez.

Let's see--most saved character: this dude! Most-- K-OOOHHHH! *agonized* NOOOOO! *terrible realization* Oh no! *verified realization* Oh no! *despondent realization* Oh no...

So, this is my breakdown: saving more lives matters; protecting passengers matters a little more. *laughing* Upholding--upholding the law matters a LOT to me! Avoiding intervention... Now they're saying that avoiding intervention matters a lot to me, but I don't think that's entirely true. I was judging it based on what the law was saying, in terms of going forward and going back.
Protecting passengers I seem to have a bit of a bias for, but I totally understand that the overall feel is that I should protect, y'know, everybody. In my mind--in my mind, in real-world application, in reality here, I fully believe that the car should take action to protect the people in front of it, because the car is built with airbags and humans are not. In reality, I believe that, but in this specific scenario, where death is guaranteed, you have to weigh different outcomes. But in reality! In reality! I feel like it should be biased to take action more than inaction. So it's a little not accurate to my interpretation of what it is, because I fully believe that in real-world scenarios it should definitely take action and hit something else to protect its passengers. But! I definitely believe in abiding by the law and following the signals, and if people pay attention, then things should run smoothly. But I know in reality that people don't do that.

Also, apparently, I have a--I have an extreme gender preference? I'm pretty sure I killed some dudes. I'm pretty sure I killed a lotta dudes. Species preference: hoomans versus pets. Age preference: young versus old. Fitness preference: large people versus fit people. *indignantly* Now I don't think that's true at all either! Now this is a really weird split for me, because I am completely on one side or another, according to this quiz. I am completely about saving more lives, I'm completely about upholding the law, I apparently have a complete preference for feh-males--which, again, I'm pretty sure I killed a lot of guys there. I'm completely preferable to hoomans, I'm completely preferable to the younger, and I'm completely preferable to large people. Which, again! *maybe suffocating?* Pretty sure I killed fit people? And then, in social value, it does not matter. I might actually be a robot myself.

So, anyway! That is the Moral Machine. Now, what would you guys think in these scenarios? What would you have the car do? You can decide for yourself through the link in the description below, and I actually want to hear your results, so let me know what you think! Let me know what you got! Let me know if it deviated from this scenario here. The results may surprise you. So, anyway! Thank you everybody so much for watching. And as always! I will see you... in the next video! Buh-bye!
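(Taken together, the rules Mark articulates over the run amount to a rough priority order: save more lives first, use law-abidance as the tie-breaker, then fall back on action over inaction. Below is a minimal Python sketch of that ordering; the Group type, its field names, and the decide function are my own illustration of his stated reasoning, not code from the Moral Machine itself.)

from dataclasses import dataclass

@dataclass
class Group:
    size: int           # how many die if this group is hit
    law_abiding: bool   # crossing on the correct signal?

def decide(stay: Group, swerve: Group) -> str:
    """Return 'stay' or 'swerve'; the named group is the one the car hits."""
    # 1. Saving more lives matters: kill the lesser amount.
    if stay.size != swerve.size:
        return "stay" if stay.size < swerve.size else "swerve"
    # 2. Upholding the law matters: if counts tie, spare whoever is
    #    obeying the walk signal.
    if stay.law_abiding != swerve.law_abiding:
        return "swerve" if stay.law_abiding else "stay"
    # 3. Still tied: prefer taking an action over staying the course.
    return "swerve"

# The "perfectly symmetrical" doctor-and-child scenario: equal counts,
# both groups crossing legally, so the action-over-inaction rule decides.
print(decide(Group(2, True), Group(2, True)))  # -> "swerve"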
Info
Channel: Markiplier
Views: 4,550,173
Rating: 4.8994923 out of 5
Keywords: markiplier, who would you kill, moral machine, choices, game choices, all endings, artificial intelligence, self driving cars, autopilot, tesla, autonomous driving, self driving, ai, morality, moral game
Id: bThajoCIJng
Length: 15min 49sec (949 seconds)
Published: Wed Nov 23 2016