by ertgbnm on 10/12/2023, 1:57:25 PM
by ameliaquining on 10/12/2023, 1:01:37 PM
The problem here isn't with the main character's moral philosophy, but with his decision theory. He'd be dealing with exactly the same predicament if the mugger were threatening to harm him.
The solution is indeed "don't give in to muggers", and it's possible to define this in a workable way. Suppose the mugger can either choose A (don't try to mug Bentham) or force Bentham to choose between B (give in) and C (don't give in). A is the best outcome for Bentham, B is the best outcome for the mugger, and C is the worst for both. The mugger is therefore only incentivized to force the choice if he expects Bentham to go for B; if he expects Bentham to go for C, it's in his interest to choose A. Bentham should therefore commit to a policy of always choosing C whenever C is worse for the mugger than A; if the mugger knows this and responds to incentives (as we see him doing in the story), he'll choose A, and Bentham wins.
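The incentive argument above can be sketched as a tiny game, with made-up payoff numbers (any values with the same ordering work: A best for Bentham, B best for the mugger, C worst for both):

```python
# Toy model of the mugging game, with illustrative (made-up) payoffs.
# Each outcome maps to a (bentham_utility, mugger_utility) pair.
# A: no mugging attempt; B: Bentham gives in; C: Bentham refuses.
PAYOFFS = {
    "A": (0, 0),
    "B": (-10, 10),   # best for the mugger
    "C": (-20, -5),   # worst for both
}

def mugger_choice(bentham_policy: str) -> str:
    """The mugger forces the choice only if the outcome he expects
    under Bentham's policy beats not mugging at all (outcome A)."""
    expected = PAYOFFS[bentham_policy][1]
    return bentham_policy if expected > PAYOFFS["A"][1] else "A"

# If Bentham's policy is to give in (B), mugging pays, so the mugger mugs.
assert mugger_choice("B") == "B"
# If Bentham's policy is to refuse (C), mugging is worse for the mugger
# than walking away, so he never attempts it, and Bentham gets A.
assert mugger_choice("C") == "A"
```

Note the policy of choosing C is never actually executed if the mugger responds to incentives; its whole value is in being credibly committed to in advance.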
And none of this has anything to do with utilitarianism, except in the respect that utilitarianism requires you to make decisions about which outcomes you want to try to get, just like any other human endeavor.
by lcnPylGDnU4H9OF on 10/12/2023, 12:46:30 PM
Bentham brought up a good point:
> Fair enough. But, even so, I worry that giving you the money would set a bad precedent, encouraging copycats to run similar schemes.
I don’t understand how it was logically defeated by escalation as in the story. Would it be wrong for a utilitarian to keep arguing against this precedent, on the grounds that the decision to be mugged reduces overall utility, since now anyone sufficiently convincing can effectively steal money from utilitarians? (I guess money changing hands is presumed net neutral in the story?)
by AndrewDucker on 10/12/2023, 12:36:56 PM
The point here is largely that reality (at our level) is not something which can be simply solved by the application of a couple of rules, from which Right Action will thenceforth necessarily flow.
Reality is a big, complex, ball of Stuff, and any attempts to impress morality upon it will be met with many corner cases which produce unwanted results unless we spend our time engaged with dealing with what initially look like tiny details.
by erostrate on 10/12/2023, 6:03:16 PM
I used to be a utilitarian, but it made me morally repulsive, which pushed my friends away from utilitarianism. I had to stop since this had negative utility.
More seriously, any moral theory that strives too hard for abstract purity will be vulnerable to adversarial inputs. A blunt and basic theory (common sense) is sufficient to cover all practical situations, and will prevent you from looking very dumb by endorsing a fancy theory that fails catastrophically in the real world. [1]
[1] https://time.com/6262810/sam-bankman-fried-effective-altruis...
by uKVZe85V on 10/13/2023, 2:26:54 PM
Fun fact: I asked ChatGPT to help me translate that to French for private use, pasting there the first part of the conversation.
It started answering, then within seconds my question was replaced with "This content may violate our content policy. If you believe this to be in error, please submit your feedback — your input will aid our research in this area."
Then, seconds after, the still-appearing answer was replaced with the same message.
Doh! The content filter got tripped because "obviously" it's not a philosophical thought experiment about utilitarianism but an evil text about mugging someone, which is an illegal activity. What a time to be alive!
by paulhart on 10/12/2023, 11:00:48 AM
I love this for three reasons:
1: the dig at Effective Altruism;
2: I went to UCL back in the days when you could hang out with the Bentham In A Box;
3: One of my (distant) colleagues is a descendant of Bentham.
by jl6 on 10/12/2023, 12:23:45 PM
Is there perhaps more than a finger's worth of utility in deterring such muggings by refusing the initial deal?
by throwaway101223 on 10/12/2023, 1:57:46 PM
> Here's the thing: there is, clearly, more utility in me keeping my finger than in you keeping your measly ten pounds.
How is this clear? This is one of the things I find strange about academic philosophy. For all the claims about trying to reach a more rigorous understanding of knowledge, the foundation at the end of the day seems to be nothing more than human intuition. You read about something like the Chinese Room or Mary’s Room thought experiments, which seem to appeal to immediate human reactions: “We clearly wouldn’t say…” or “No one would think…”
It feels like an act of obfuscation. People realize the fragility of relying on human intuition, and react by trying to dress human intuition up with extreme complexities in order to trick themselves into thinking they’re not relying on human intuition just as much as everyone else.
by jjk166 on 10/12/2023, 4:28:15 PM
The problem here stems from trying to have some universal utility values for acts. You can't say cutting off a finger is fundamentally worse than losing 10 pounds, even if it frequently would be. I wouldn't give up one of my fingers for 10 pounds, and I think most sane people wouldn't either, but here the mugger is willing to do that. So in this particular instance, the mugger is valuing the utility of keeping his finger at 10 pounds, and thus the decision on whether or not to give it to him is a wash. The moment you start dictating what the utility values are of consequences for other people you get absurd outcomes (e.g. some of you may die, but it's a sacrifice I'm willing to make).
by alphazard on 10/12/2023, 2:09:21 PM
The most pressing problem facing utilitarians has never been choosing between principled vs. consequentialist utilitarianism. It's how to take a vector of utilities, and turn it into a single utility.
What function do I use? Do I sum them? Take the mean? Root-mean-square? Why does your chosen function make more sense than the other options? And can I even perform arithmetic on utilities from two different agents? Isn't that like adding grams to meters?
by earthboundkid on 10/12/2023, 4:05:23 PM
Utilitarianism is supposed to be a strawman theory that you teach in the first week of class in order to show the flaws and build a real theory of ethics the remaining 14 weeks of the semester. SMDH at all these people who didn't get that basic point.
by superb-owl on 10/12/2023, 4:31:37 PM
Maybe morality can't be quantified.
https://blog.superb-owl.link/p/contra-ozy-brennan-on-ameliat...
by kubb on 10/12/2023, 11:04:02 AM
It’s amazing how contrived and detached from reality the counterexamples to utilitarianism have to be in order to attack even its most basic forms. It really makes you think that utilitarianism is a solid principle.
by Veedrac on 10/12/2023, 5:58:38 PM
> If I find an unmuggable version of utilitarianism with more explanatory power, I'll let you know.
Functional Decision Theory
by nisegami on 10/12/2023, 11:30:52 AM
I feel like this statement hides something critical, "Here's the thing: there is, clearly, more utility in me keeping my finger than in you keeping your measly ten pounds."
My point is: is that actually so clear? Or is the utility function being presumed here inadequate?
by brindlejim on 10/12/2023, 9:29:22 PM
Now imagine that instead of a mugger you have an AI researcher who both believes that AGI will destroy the world and is intent on being in the room when it is created. The self-mugging of an AI safety-ist.
by firecall on 10/13/2023, 1:02:21 AM
I feel mugged having attempted to read that nonsense ;-)
by JR1427 on 10/12/2023, 12:46:16 PM
But the mugger could have avoided making the deal with the thug, so I don't see how that deal changes much.
by 1970-01-01 on 10/12/2023, 2:29:04 PM
Reads like a ChatGPT argument with an idiot savant, with emphasis on the idiot.
The underlying assumption is that Bentham is a true act utilitarian yet simultaneously has 10 pounds in his pocket that he can stand to lose without much harm. If he truly were an act utilitarian, the utility of the 10 pounds remaining in Bentham's possession must be so high that it outweighs the mugger losing their finger, otherwise Bentham would have already spent it on something similarly utility maximizing. Clearly that 10 pounds was already destined to maximize utility such as staving off Bentham's hunger and avoiding his own death or the death of others.
Meanwhile the utility of the mugger's finger is questionable. The pain of losing the finger is the only real cost. If he is just a petty criminal, the loss of his finger will probably reduce his ability to commit crimes and prevent him from inflicting as much suffering on others as he otherwise would have. Maybe losing his finger actually increases utility.
Bentham: "I'm sorry Mr. Mugger but I am on my way to spend this 10 pounds on a supply of fever medication for the orphanage and I am afraid that if I don't procure the medicine, several children will die or suffer fever madness. So when faced with calculating the utility of this situation I must weigh your finger against the lives of these children. Good day. And if the experience of cutting your finger off makes you question your own deontological beliefs, feel free to call upon me for some tutoring on the philosophy of Act Utilitarianism."
Any other scenario and Bentham clearly isn't a true Act Utilitarian and would just tell the Mugger to shove his finger up his ass for all Bentham cares. Either strictly apply the rules or don't apply them at all.