181. Issues of Utilitarian Ethics | THUNK

Video Statistics and Information

Captions
Have you ever noticed how Godzilla always seems to target things like power lines and dams? He's a real utility monster.

We're going to be talking a little about ethics. I'm sure you know a thing or two about the subject, but let's take it from the very top. We see people acting in certain ways, and we feel that those actions are good, or bad, or neither. Sometimes we disagree with each other about those evaluations, or we feel conflicted about a scenario that pits our moral instincts against each other. Still, it feels important that we be able to tell good from evil, and those inconsistencies lead us to wonder: what is it, exactly, that makes something right or wrong? How can we explain the rightness or wrongness of an action, especially to someone who disagrees? That's the role of ethics: philosophers attempt to develop some rigorous foundation for our feelings about morality, to try and figure out the rules that govern right and wrong. A number of promising theories have been advanced over the past few millennia about what those rules are, or maybe even what they ought to be.

One example, utilitarianism, suggests that what our moral intuitions are pointing at is some sort of maximum ultimate well-being for everyone: that all of morality can be boiled down to increasing welfare and decreasing suffering. Many people find utilitarianism compelling as a model of morality because it agrees with our intuitions in many situations and provides unambiguous criteria to resolve sticky disputes and conflicts. Torturing people needlessly results in lower collective well-being, so it's evil. Check. Feeding the hungry results in higher collective well-being, so it's good. Check. Pulling a switch so a trolley kills only one person instead of five results in some negative consequences, but on balance it produces higher collective well-being, so it's ultimately good. Sure, that sounds totally plausible.
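The decision rule being applied in each of those checks can be made explicit: among the available actions, pick whichever yields the greatest total well-being summed over everyone affected. Here is a minimal sketch of that rule, with invented utility numbers for the trolley case (the video assigns no actual values):

```python
# Minimal illustration of the utilitarian decision rule:
# choose the action whose total utility, summed over everyone affected, is largest.

def best_action(actions):
    """Return the action with the greatest summed utility."""
    return max(actions, key=lambda a: sum(a["utilities"]))

# Trolley case with invented numbers: model each death as -100 utility.
actions = [
    {"name": "do nothing",  "utilities": [-100] * 5},  # trolley kills five
    {"name": "pull switch", "utilities": [-100] * 1},  # trolley kills one
]

print(best_action(actions)["name"])  # prints: pull switch
```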
Some have taken this correlation with our moral intuitions as proof that utilitarian ethics is objectively correct, and that if someone has moral inclinations that point in other directions, they're simply wrong or mistaken. Although this sort of all-in commitment to the utilitarian framework has its advantages, there are several scenarios where utilitarian ethics, taken at face value, seems to mandate behavior that doesn't quite align with our moral intuitions, or, worse yet, behavior that seems outright evil. As a theory focused entirely on maximizing a single variable, utilitarianism makes no allowance for other things like rights, justice, equality, or virtue, unless they serve some instrumental purpose for increasing utility. That can make it very uncomfortable in certain thought experiments. Let's take a look at a few.

The utility monster. This is Felix. Felix is just like you, but he has this particular psychological quirk: he enjoys things a lot more than most people. If you would enjoy some ice cream, Felix would groan in ecstatic delight at each lick. If you would love having a puppy to play with, Felix would experience rapturous transcendence, weeping at how happy the puppy made him. Anything you could possibly imagine appreciating, Felix likes it better. Now, that's all well and good, but there's a finite amount of resources on this planet, only so much ice cream and so many puppies to go around. A strict utilitarian would have to weigh the ultimate enjoyment produced by each of these resources as they're consumed, and come to a somewhat irritating conclusion: whatever Felix wants, Felix gets, even if that means you don't get any. If that ice cream you were going to enjoy can be safely stored for Felix to enjoy later, you shouldn't get a single scoop. If you had some time to play with your puppy but Felix is free right now, you'll just have to wait until he's done. In fact, what are you doing right now? Because if it's bringing you less pleasure than Felix eating ice cream, morally speaking you should probably be slaving away in the ice cream factory along with everyone else. This is an example of Robert Nozick's famous utility monster argument, highlighting how utilitarianism violates one of our key intuitions about morality: a sense of fairness or equality. It seems to favor individuals who are capable of greater enjoyment, even at the expense of the welfare of others. If a serial killer is having a good enough time, utilitarianism would seem to suggest that we line up and let them do their thing.
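To see why a strict welfare-maximizer always feeds Felix first, consider a toy greedy allocator. The enjoyment multipliers and resource values below are invented for illustration; the point is only that a large enough multiplier swallows every resource:

```python
# Toy model of Nozick's utility monster (all numbers invented):
# a strict welfare-maximizer gives each scarce resource to whoever enjoys it most.

people = {"you": 1.0, "felix": 1000.0}  # Felix enjoys everything 1000x more

def allocate(resources, people):
    """Greedily award each resource to the person with the largest utility gain."""
    allocation = {name: [] for name in people}
    for resource, base_utility in resources.items():
        winner = max(people, key=lambda name: people[name] * base_utility)
        allocation[winner].append(resource)
    return allocation

resources = {"ice cream": 5.0, "puppy time": 8.0, "free afternoon": 3.0}
print(allocate(resources, people))
# {'you': [], 'felix': ['ice cream', 'puppy time', 'free afternoon']}
```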
The experience machine. Another of Nozick's famous thought experiments should be familiar to anyone who's seen The Matrix. If we developed sufficiently advanced virtual reality, something that could simulate any subjective experience (winning a marathon, discovering a cure for cancer, whatever), would hooking everyone up to the machine be morally justified? Would it be mandatory? If the only metric for morality is maximizing some subjective experience like pleasure or happiness, it seems we should be doing everything in our power to get people into the machine. After all, if the value of climbing Everest can be fully reduced to the subjective experience of having climbed Everest, there's no reason to have anyone out there risking their lives to climb it; they can feel the same way safely and more reliably in the machine. So even if nobody ever actually climbed Everest, even if humanity had done nothing of substance besides inventing the experience machine, it wouldn't be any worse of a world. In fact, it would be a utilitarian utopia, optimized to generate as much positive subjective experience as we could handle. Unless, of course, you think that a world of real achievements might be more desirable, which would seem to indicate that there's something besides subjective experience that matters.

Organ harvesting. Organ donation is a great way to do a lot of good in the world after you're dead: your liver, your heart, everything that keeps working after your untimely demise can give another person a new lease on life. But the organ donation procedure isn't really proactive. What if, instead of waiting for you to kick the bucket, the hospital were to just, you know, reallocate organs according to utilitarian principles? I mean, how much pleasure do you think you can manage in the remainder of your lifetime? Is it more pleasure than two or three other people would have if their lives were saved by your gift? A strictly utilitarian approach, one which made no allowances for petty things like rights or autonomy, would mandate that we dice up healthy people whenever there were at least a couple of others who could be saved. At the very least, if you were to wander into a hospital for a checkup, so long as you had two working kidneys, you'd probably wake up without one. Some have argued that the fear of getting diced up would outweigh the benefit of such a policy, but if they could keep a secret murder-and-organ-reallocation conspiracy under wraps, would that necessarily be a good thing?

The repugnant conclusion. When we talk about utilitarian ethics, we tend to focus on one part of the equation: increasing well-being. That's where a lot of the support for it as a standard of morality comes from, because it usually endorses courses of action that end up improving people's lives. But there is another possible approach, notably posited by Derek Parfit: rather than trying to improve existing human lives, we could just make more humans. A lot more humans, in fact. In Parfit's framing of the utilitarian calculus, it seems that the maximum utility we should be pursuing is the most populous society we can manage, even if life in that society is just barely worth living. Take these two groups of people, A and A-prime. The height of each bar is the well-being of the people in the group, and its width is the number of people living in it. Group A has a small number of people at a high standard of living, while A-prime has that population plus another population enjoying themselves less. Not bad, mind you, but not quite as stellar. It would be weird to say that if someone wasn't going to enjoy themselves as much as the happiest people, they should never have existed in the first place, even if their lives are worth living. So it makes sense to say that A-prime is better than A, or at least not worse. Now let's compare A-prime to group B: same number of people, slightly lower maximum happiness, but higher average happiness overall. That looks even better. But where has that brought us? We've found that a larger population at a slightly lower standard of well-being is superior to a small population of very happy people. We can repeat the same procedure over and over until we reach what Parfit called the repugnant conclusion: a massive population of people who are just barely eking it out. By the structure of the above argument, utilitarianism would seem to suggest that that's more desirable. I'm sure all the parents in the audience are reassured, but that seems wrong somehow. Repugnant, even.
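Parfit's bar diagram is easier to follow with concrete numbers attached, so here is a toy calculation (all figures invented; the video assigns no actual values). Under total utility, each A-to-A-prime-to-B step looks like an improvement, and iterating that step marches straight to the repugnant conclusion:

```python
# Invented numbers for Parfit's mere-addition argument.
# Total utility = sum over groups of (population * average well-being).

def total_utility(groups):
    return sum(n * w for n, w in groups)

A       = [(100, 100)]             # group A: 100 very happy people      -> 10,000
A_prime = [(100, 100), (100, 60)]  # A plus 100 less-happy extra people  -> 16,000
B       = [(200, 85)]              # same 200 people, higher average     -> 17,000

for name, groups in [("A", A), ("A'", A_prime), ("B", B)]:
    print(name, total_utility(groups))

# Repeat the step: double the population, shave per-person well-being by 10%.
# Each step multiplies total utility by 2 * 0.9 = 1.8, so it always "improves".
n, w = 200, 85.0
steps = 0
while w > 1:  # stop when lives are just barely worth living
    n, w = n * 2, w * 0.9
    steps += 1
print(f"after {steps} steps: {n:,} people at well-being {w:.2f} (total {n * w:,.0f})")
```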
In scenarios like these, where utilitarian models of morality seem to conflict with our intuitions about right and wrong, some philosophers choose to double down on utilitarianism, saying that the model is so good that even when it points at seemingly immoral conclusions, we should bite the bullet and simply accept those conclusions as correct. Utility monsters should be fed; organs should be harvested; our squeamishness about following the recommendations of utilitarian ethics is simply due to our imperfect moral apparatus. Still, there's a key difference that's worth acknowledging: unlike other phenomena, there's no objective measure we can hold up to models of morality to evaluate their truthfulness. If we disagree about which scientific model better represents the world, we can turn to data and experimentation to resolve our differences, or at least to figure out what sort of information might lend credence to one idea or another. But the best we can hope for out of a system of ethics is internal consistency and some sort of correlation with our moral intuitions, which may be biased or incorrect. There's no science that can be done to empirically verify a system of ethics without begging the question, without deciding beforehand what framework for morality is correct. In that light, we should probably be extraordinarily careful about prioritizing the dictates of ethical systems over our moral instincts. It's good to try to develop rigorous systems, both to explain those instincts and to give us useful tools for navigating dilemmas, but we should be mindful that those intuitions are what compelled us to develop ethics in the first place, and that methodological tidiness is a poor substitute for truth, especially when you're talking about right and wrong. Do you think we should give precedence to systems of morality over our moral instincts? Please leave a comment below and let me know what you think. Thank you very much for watching; don't forget to like, subscribe, and share.
Info
Channel: THUNK
Views: 4,669
Rating: 4.96 out of 5
Keywords: THUNK, philosophy, morality, ethics
Id: QA2JwEhJss8
Length: 11min 33sec (693 seconds)
Published: Wed Feb 05 2020