The Watchmen Dilemma

Night Ripper
Reply Mon 21 Jun, 2010 12:55 pm
@Huxley,
Huxley wrote:
You could walk on and mind your own business (non-aggression), or you could call the police and yell out. If you walk on you didn't commit rape, but you certainly allowed it to continue to happen.


A rapist has already initiated force, so I can retaliate on behalf of the victim. I don't need to just walk on. Also, the fact that I didn't try to stop something doesn't put the blame on me. The rapist is the guilty party. They are the ones who committed the act. Let's not lose sight of that.
Huxley
Reply Mon 21 Jun, 2010 01:04 pm
@Night Ripper,
Night Ripper wrote:

A rapist has already initiated force, so I can retaliate on behalf of the victim. I don't need to just walk on. Also, the fact that I didn't try to stop something doesn't put the blame on me. The rapist is the guilty party. They are the ones who committed the act. Let's not lose sight of that.


I'm not. Maybe I wasn't clear. Would it, in your opinion, be immoral to just walk on in such a situation?
Night Ripper
Reply Mon 21 Jun, 2010 01:09 pm
@Huxley,
Huxley wrote:

Night Ripper wrote:

A rapist has already initiated force, so I can retaliate on behalf of the victim. I don't need to just walk on. Also, the fact that I didn't try to stop something doesn't put the blame on me. The rapist is the guilty party. They are the ones who committed the act. Let's not lose sight of that.


I'm not. Maybe I wasn't clear. Would it, in your opinion, be immoral to just walk on in such a situation?


In a situation where it would take me little or no effort to help someone? It would be immoral. In the situation of turning away people that would die at sea, the stakes are much higher. It's not just a matter of turning my head and whistling. If they refuse to be quarantined, they are putting me at risk, and in that case it's not immoral to refuse help. It's perfectly moral to refuse help when the cost to oneself is great enough.
Huxley
Reply Mon 21 Jun, 2010 01:24 pm
@Night Ripper,
OK. That makes more sense to me now.
Jebediah
Reply Mon 21 Jun, 2010 01:27 pm
@Huxley,
Huxley wrote:

I admit that I stand somewhat leery of hypotheticals such as the ticking time bomb and those similar to it.

What's stopping someone in your scenario from setting up another quarantine separate from the other one for the new arrivals?


You should say "What stopped them," because that is how they used to do it. Simply put, you have a very large city and people traveling to it from all over; designing a perfect system is logistically impossible. Yes, it would be better, but this is a real-world scenario where you have to make that choice.


Quote:
Let's try an abstraction, instead. Suppose you live in a two-option world, where every moral problem has two possible solutions. The better choice is not known with certainty by the actor, but is defined by the world (there exists a definite solution to the problem; it is presently unknown). The actor is left to develop a calculus for decision making. Certainly we want the better of the two options, but the calculus for determining which is which is left for us to develop, and at present that calculus remains underwhelming. Further, by stipulation of the hypothetical, a moral problem arises in which we know both options to be bad; we just do not know which option is worse. In such a scenario, judging phenomenologically (rather than from a God's-eye perspective), is there a right answer to such a moral problem?


Why wouldn't you take what you believe to be the lesser of two evils?

Night Ripper wrote:

Jebediah wrote:
They would die at sea if they left.


So what? How am I guilty for their deaths? If someone comes to my door and I turn him away and he freezes to death, I didn't kill him. I have no problem using force against people that use it against me first. I just don't initiate force.

http://en.wikipedia.org/wiki/Non-aggression_principle

Jebediah wrote:
But it's ok to kill innocents if they accepted the risk?


People have the right to risk their own lives and lose them.


Your common sense is calling, Night Ripper.
Night Ripper
Reply Mon 21 Jun, 2010 01:33 pm
@Jebediah,
Jebediah wrote:
Your common sense is calling, Night Ripper.


That's not very productive or meaningful.
Huxley
Reply Mon 21 Jun, 2010 01:40 pm
@Jebediah,
Jebediah wrote:

You should say "What stopped them," because that is how they used to do it. Simply put, you have a very large city and people traveling to it from all over; designing a perfect system is logistically impossible. Yes, it would be better, but this is a real-world scenario where you have to make that choice.


They could pull out of the harbor, with food given to them, for the period the best medical practitioners thought necessary, and if they're not sick they could be allowed in (say, 500 yards out, something like that). It would be a separate quarantine, then.

Quote:

Quote:
Let's try an abstraction, instead. Suppose you live in a two-option world, where every moral problem has two possible solutions. The better choice is not known with certainty by the actor, but is defined by the world (there exists a definite solution to the problem; it is presently unknown). The actor is left to develop a calculus for decision making. Certainly we want the better of the two options, but the calculus for determining which is which is left for us to develop, and at present that calculus remains underwhelming. Further, by stipulation of the hypothetical, a moral problem arises in which we know both options to be bad; we just do not know which option is worse. In such a scenario, judging phenomenologically (rather than from a God's-eye perspective), is there a right answer to such a moral problem?


Why wouldn't you take what you believe to be the lesser of two evils?


That's what we're both doing in this scenario.
engineer
Reply Mon 21 Jun, 2010 01:43 pm
@Jebediah,
Jebediah wrote:

Why wouldn't you take what you believe to be the lesser of two evils?

Because which is the lesser of two evils depends on your calculus and your certainty. Back to the Watchmen dilemma: you think there is going to be a catastrophic nuclear war, and you compute the probability of that at, say, 80%, with the likely result the death of all humanity. If you choose to kill 20 million people, the possibility of all-out nuclear war goes down to 20%. There is a chance that your scheme causes the war you were trying to prevent, but still, the odds as best you can figure go from 80% down to 20%. Do you pull the trigger? Do you trust your calculations on such a complex problem that much? What percent reduction is worth 20 million lives? How about 80% down to 60%, doubling the likelihood of humanity's survival (as you calculate it)? What if it was just one life? What if that life was your mother, child or spouse? What about 80% to 79%? Would you sacrifice 0.2% of the humans on the Earth so that the other 99.8% gained a 5% (20% to 21%) improvement in their odds (once again, as computed by you)? When someone kills his neighbors because he claims Satan was going to use them as a conduit to destroy all mankind, do we applaud him for saving us all with an acceptable sacrifice or consider him horribly deranged? Do you trust your view of the world and your sanity so much that you would push the button?
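The expected-value arithmetic engineer is gesturing at can be made explicit with a toy calculation. This is only a sketch of the tradeoff as stated in the post: the probabilities are the post's hypotheticals, and the world-population figure is an assumed round Cold-War-era number, not a real estimate.

```python
# Toy expected-value sketch of the Watchmen tradeoff described above.
# All numbers are the post's hypotheticals, not real estimates.

WORLD_POP = 5_000_000_000  # assumed rough Cold-War-era world population

def expected_survivors(p_war: float, sacrificed: int) -> float:
    """Expected survivors, assuming war kills everyone and peace spares the rest."""
    remaining = WORLD_POP - sacrificed
    return (1 - p_war) * remaining

# Doing nothing: 80% chance of annihilation.
do_nothing = expected_survivors(0.80, 0)

# Acting: kill 20 million up front, war risk drops to 20%.
act = expected_survivors(0.20, 20_000_000)

print(f"do nothing: {do_nothing:,.0f} expected survivors")
print(f"act:        {act:,.0f} expected survivors")
```

The raw expectation comes out heavily in favor of acting, which is exactly why the post's real question is not the arithmetic but whether you trust the probability estimates (and your own sanity) enough to plug them in.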
Jebediah
Reply Mon 21 Jun, 2010 02:01 pm
@engineer,
engineer wrote:

Jebediah wrote:

Why wouldn't you take what you believe to be the lesser of two evils?

Because which is the lesser of two evils depends on your calculus and your certainty. Back to the Watchmen dilemma: you think there is going to be a catastrophic nuclear war, and you compute the probability of that at, say, 80%, with the likely result the death of all humanity. If you choose to kill 20 million people, the possibility of all-out nuclear war goes down to 20%. There is a chance that your scheme causes the war you were trying to prevent, but still, the odds as best you can figure go from 80% down to 20%. Do you pull the trigger? Do you trust your calculations on such a complex problem that much? What percent reduction is worth 20 million lives? How about 80% down to 60%, doubling the likelihood of humanity's survival (as you calculate it)? What if it was just one life? What if that life was your mother, child or spouse? What about 80% to 79%? Would you sacrifice 0.2% of the humans on the Earth so that the other 99.8% gained a 5% (20% to 21%) improvement in their odds (once again, as computed by you)? When people kill their neighbors because they claimed Satan was going to use them as a conduit to destroy all mankind, do we applaud them for saving us all with an acceptable sacrifice or consider them horribly deranged? Do you trust your view of the world and your sanity so much that you would push the button?


Yes, that's a good answer to the question I asked. You wouldn't take the option you believed to be the lesser of two evils in the Watchmen scenario unless you also believed you had a very good reason to think it was the lesser of two evils. The more uncertainty there is, the more you would hesitate. If I remember correctly, the character in the book was a genius who had studied the situation for years, not a deranged person.

But, it's a framework question you had (like all of these hypotheticals). If you think about how we make moral decisions, it's important to have a "not too heavy, not too light" kind of framework. You need to define a space in which you think about the problem. Pure utilitarianism is pointless--operating by it would not be utilitarian. But we can't go just by rules. You have to keep in mind why we have them in the first place. Otherwise you start throwing people in jail for being addicted to drugs, and other ludicrous things like that.

That's kind of what bugs me about this debate. It should be ultra basic. Any criticism should be along the lines you propose--"yes, but in the real world it's not so simple". But instead we have "so what? I left him to die in the cold, that's all" and "it would be better if humanity was wiped out".
Night Ripper
Reply Mon 21 Jun, 2010 02:13 pm
@Jebediah,
Jebediah wrote:
That's kind of what bugs me about this debate. It should be ultra basic. Any criticism should be along the lines you propose--"yes, but in the real world it's not so simple". But instead we have "so what? I left him to die in the cold, that's all" and "it would be better if humanity was wiped out".


These are thought experiments. If you say, "so what, it will never happen," then you are missing the point. These particular situations will never happen, but analogous ones will, to a lesser extent. The point is to strain our principles so that we can condense them down to what really matters.

Getting back to the issue. Life, in and of itself, is pointless. What makes human life worth living are our acts of self-creation, the codes we give ourselves, the values we hold, relationships we make. If you treat people as obstacles, kill a few to save a lot, then you reduce them to sacks of meat. You're not saving humanity. You're saving humans.
Jebediah
Reply Mon 21 Jun, 2010 02:13 pm
@Huxley,
Huxley wrote:

Jebediah wrote:

You should say "What stopped them," because that is how they used to do it. Simply put, you have a very large city and people traveling to it from all over; designing a perfect system is logistically impossible. Yes, it would be better, but this is a real-world scenario where you have to make that choice.


They could pull out of the harbor, with food given to them, for the period the best medical practitioners thought necessary, and if they're not sick they could be allowed in (say, 500 yards out, something like that). It would be a separate quarantine, then.


What food? What about the storm that's about to dash them into the rocks? Do we have to keep playing this silly game? Do you really believe that there has never been and never will be a scenario in which there is a choice between two bad things? Is your answer to the trolley problem "make the trolley fly away"?

Sorry, I'm getting frustrated... if you believe in a purely rule-based system, then why talk about ethics at all? It's all laid out for you already, and you can't question it. Because if you can question it, then you need some grounds for questioning, which could, for example, be "but it causes the death of 2 billion extra people".
Jebediah
Reply Mon 21 Jun, 2010 02:16 pm
@Night Ripper,
Quote:
These are thought experiments. If you say, "so what it will never happen" then you are missing the point. These particular situations will never happen but to a lesser extent analogous ones will. The point is to strain our principles so that we can condense them down to what really matters.


I agree

Quote:
Getting back to the issue. Life, in and of itself, is pointless. What makes human life worth living are our acts of self-creation, the codes we give ourselves, the values we hold, relationships we make. If you treat people as obstacles, kill a few to save a lot, then you reduce them to sacks of meat. You're not saving humanity. You're saving humans.


If that were true your view would make more sense. But my action would only affect me--it wouldn't reduce all of the people I saved into sacks of meat. They would still be able to have a life worth living.

Night Ripper
Reply Mon 21 Jun, 2010 02:28 pm
@Jebediah,
Jebediah wrote:
If that were true your view would make more sense. But my action would only affect me--it wouldn't reduce all of the people I saved into sacks of meat. They would still be able to have a life worth living.


If I found out I was one of those that survived because of the death of millions, I wouldn't be thankful. I'd rather have died. I'm going to die anyway, and I value justice more. I'm sure a few people that don't value justice would selfishly be thankful that others were murdered, but I don't think I will be too sad they're going to be killed with the rest of humanity. So, in the end, the decent people welcome their fates honestly and the cruel people die. Works for me.
Jebediah
Reply Mon 21 Jun, 2010 02:33 pm
btw Engineer (since I can't edit my last post).

I think our instinct errs quite far on the side of "don't do this bad thing, even if it has an overall good effect (i.e. don't consider the distant effects)". And it's a good thing that we have that instinct, a wonderful thing. We also have the opposite instinct of "do this thing that seems good, without thinking about the possible negatives". Both are necessary instincts, but both have caused a lot of harm. Being human is all about using our intellect to rise above basic instinct. And it seems like the philosophical discussion of ethics is too.

I think your point is promoting vigilance in the wrong direction. We already have that instinct.
Huxley
Reply Mon 21 Jun, 2010 02:33 pm
@Jebediah,
Jebediah wrote:

What food? What about the storm that's about to dash them into the rocks? Do we have to keep playing this silly game?


Well, note the number of qualifiers one has to place on a scenario in order to create one with only two solutions. I think that's important, personally, because it highlights how rarely there are situations in which there are only two solutions.

I abstracted in the above into the world in which there are only two solutions to get at the meat of the problem, and we both agreed that we would choose the lesser of two evils in such a world.

Quote:

Do you really believe that there has never been and never will be a scenario in which there is a choice between two bad things?


Never? No.

I think they're rare, though.

Quote:

Is your answer to the trolley problem "make the trolley fly away"?


No, I have yet to come up with an alternate solution to that one. So, I pull the lever there. Though I stipulate that one should then try to warn the group in danger -- at the least you can run towards them.

Quote:

Sorry, I'm getting frustrated... if you believe in a purely rule-based system, then why talk about ethics at all? It's all laid out for you already, and you can't question it. Because if you can question it, then you need some grounds for questioning, which could, for example, be "but it causes the death of 2 billion extra people".


My intent is not to frustrate you, so... sorry for that.

I think that's a fine response to the question, myself. I don't hold, however, that the hypothetical is so cut-and-dry that the response is the only response. I came up with another solution to the problem -- disarm the button and try and stop the war through peace activism. I think activism works. It may fail, but it has a reasonable chance of success.

I mean, if we wanted to abstract, and say "You exist in the world where your life consists of one choice, and then you die. The options of this one choice are either kill 10 million in another world, or kill 20 million in that other world" then, sure, I choose the 10 million choice. I don't see that level of certainty in the Ticking Time Bomb, or in the Watchmen, or even in the trolley problem (though it comes much closer than the other two).


EDIT: I just realized I didn't answer your initial question. We debate ethics because the solutions aren't known with certainty, and because it's worthwhile to model and contemplate ethical problems in preparation for the ethical problems we actually have to face.
Jebediah
Reply Mon 21 Jun, 2010 02:36 pm
@Night Ripper,
Night Ripper wrote:

Jebediah wrote:
If that were true your view would make more sense. But my action would only affect me--it wouldn't reduce all of the people I saved into sacks of meat. They would still be able to have a life worth living.


If I found out I was one of those that survived because of the death of millions, I wouldn't be thankful. I'd rather have died. I'm going to die anyway, and I value justice more. I'm sure a few people that don't value justice would selfishly be thankful that others were murdered, but I don't think I will be too sad they're going to be killed with the rest of humanity. So, in the end, the decent people welcome their fates honestly and the cruel people die. Works for me.


How many is a "few people"? I think you'll find that the vast majority would be happy to be alive. A million years of evolution is on the side of my estimate...
Jebediah
Reply Mon 21 Jun, 2010 02:42 pm
@Huxley,
Huxley wrote:
Never? No.

I think they're rare, though.

...

No, I have yet to come up with an alternate solution to that one. So, I pull the lever there.


That's a relief to me...whenever we have this debate I have nightmares (figuratively speaking) of people from this forum being elected president and refusing to distribute the emergency vaccine because it kills a tiny percentage of those vaccinated, and then everyone I know dies from a disease.

Huxley wrote:
Well, note the number of qualifiers one has to place on a scenario in order to create one with only two solutions. I think that's important, personally, because it highlights how rarely there are situations in which there are only two solutions.


I think, though, that this is really about how we make decisions. And that's a neuro-psychological question that I can't answer. I do know, though, that a bias can tip the whole process towards a certain (sometimes bad) conclusion. I think you have to firmly bound the problem--so that if we are deciding something we can sort of bounce off of "but this seems instinctively wrong" and then over to the other side to "but the consequences..." and achieve some sort of reflective equilibrium safely. If you don't have one of those boundaries, what happens in the decision-making process?
Jebediah
Reply Mon 21 Jun, 2010 02:51 pm
Also, one other thing about this debate. As an analogy, imagine you are cooking noodles, but the only thing you acknowledge is that it's "bad to not cook them long enough." I then argue that it's possible to overcook noodles. If you say that we don't need to think about overcooking noodles because we can follow the instructions on the box exactly, or have some deus ex machina device that tells us exactly when they are done, you are actually agreeing with me 100%. Because any such method or device would be utterly pointless if it was impossible to overcook noodles. I think that's very much comparable to the objections made to these hypotheticals... they acknowledge the value of consequentialism.

Huxley wrote:
I think that's a fine response to the question, myself. I don't hold, however, that the hypothetical is so cut-and-dry that the response is the only response. I came up with another solution to the problem -- disarm the button and try and stop the war through peace activism. I think activism works. It may fail, but it has a reasonable chance of success.


Certainly, it's very good that the Soviet Union and the USA never decided on "save humanity by wiping them out first".
Huxley
Reply Mon 21 Jun, 2010 03:32 pm
@Jebediah,
Ah. Well, I wouldn't want people to get in a moral rut such as that, I agree.

My model isn't as much between the intrinsic value of rules and the effects of actions, I think. Maybe this is the disconnect in communication.

I often view ethics as a set of teleological principles (virtues) with rules and an understanding of a general content which binds those rules together (the meaning behind the name of a virtue, where the rules are explicit examples of said virtue). I agree that moral extremism needs balancing in our reasoning, but I think this can be accomplished by adopting multiple virtues (as a metaphor: think of a system of differential equations, where events outside the moral reasoner discontinuously vary the initial conditions. Like that, but presently with more uncertainty). Further, I acknowledge that ends are important, but I think that ends are motivators, not justifiers. Rather, the value of an action is found in the conjunction of the action itself and the motivation. It's a virtue-theory deontology (I state this mostly because I emphasize consistency in moral reasoning, adherence to your moral law out of respect for it, and because one should acknowledge that others are moral reasoners as well). What's more, I'll acknowledge that it's not perfect. It's just a model.

So, I'm not striking out consequentialism, as if we should just ignore the possible effects of our actions. I do put less emphasis on it, however, than other parts of moral reasoning. I do so in particular because I don't think it answers many moral questions, but that's just on the basis of trying to utilize it in my own life. I know others that do, and seem to get along fine. I don't discount it entirely, I just don't use it as much.

Quote:
Also, one other thing about this debate. As an analogy, imagine you are cooking noodles, but the only thing you acknowledge is that it's "bad to not cook them long enough." I then argue that it's possible to overcook noodles. If you say that we don't need to think about overcooking noodles because we can follow the instructions on the box exactly, or have some deus ex machina device that tells us exactly when they are done, you are actually agreeing with me 100%. Because any such method or device would be utterly pointless if it was impossible to overcook noodles. I think that's very much comparable to the objections made to these hypotheticals... they acknowledge the value of consequentialism.


Hmm? I think that one goes both ways. Suppose, in cooking noodles, one states that the only thing that matters is that the noodles need to be good, in the end. But to do that, you need to understand what makes them good. In this scenario one would act such that they can periodically check the pasta to a standard of taste.

In the above you answer "push the button." This is because of a rule you have formulated for moral reasoning: namely, to take numbers into account in choosing the moral action. As such, in consequentialism one acknowledges the value of deontology (and I would say vice versa, as you point out. I don't think I've advocated for one to the exclusion of the other, though I certainly emphasize one more than the other in my moral reasoning -- as I would expect most people to do).
Night Ripper
Reply Mon 21 Jun, 2010 05:34 pm
@Jebediah,
Jebediah wrote:
How many is a "few people"? I think you'll find that the vast majority would be happy to be alive. A million years of evolution is on the side of my estimate...


Luckily, morality isn't a popularity contest.