
Re: Editorial: "De Blasio's Subway Follies"

Posted by Nilet on Wed Jul 26 09:26:10 2017, in response to Re: Editorial: "De Blasio's Subway Follies", posted by New Flyer #857 on Wed Jul 26 08:26:25 2017.

They are divinely ordered.

That's just factually untrue.

You can't build a moral system on a falsehood.

Such is the only way in which they do not change with the times. Even the most essential rights you can think of and the most basic tenets of "basic morality" can be changed with enough revolution and popular opinion shifts.

On the contrary: the unchanging will of God changes with every revolution, every generation, even every fad, while people's fundamental desires tend to be fairly universal.

So morality to you is just a matter of what is found practical given the current climate? So that would all be changeable then, right?

Just the opposite. That no one wants to be shot is a constant; that a universal no-shooting rule is the best way to fulfill this universal desire is a constant; as such, "do not shoot" is a constant no matter the climate. Depending on what happens, violations of that rule may become commonplace but they will never become right.

There could conceivably arise a situation in which many/most people want to be shot. Would that change "morality?" And at what point would the change be made?

A situation in which everybody wants to be shot seems a bit absurd. If this absurd situation did arise, it would not change the fundamental moral calculations, but it would change their outcome.

Which is to be expected. Autonomy is paramount; you have the right to decide what you want, and what you want is the basis for determining whether you gain or lose through an action. A normal person is harmed by being killed, but a terminally ill person may actually desire a quick and painless death, in which case that person would gain by being killed. Morality evaluates gains and harms, where a "gain" is when you get your way and a "harm" is when you don't. The underlying rules that evaluate gains and harms don't change, but if a person's desires change, then what constitutes a "gain" or a "harm" for them changes too.

So if we posit an absurd scenario in which most people genuinely want to be shot, then in that scenario the "no shooting" rule won't apply— not because the rules of morality have changed but because the rules of morality are a function whose input has changed.
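
To make the "function" point literal, here is a minimal sketch in Python; the rule and the names in it are invented purely for illustration. The rule itself never changes, only its input does:

    # A fixed moral rule written as a function. The rule is constant;
    # its output depends entirely on its input: what the target of the
    # action actually desires. (Names and logic are illustrative only.)
    def permissible(action, desired_by_target):
        """Autonomy is paramount: an action done to someone is
        permissible only if that person desires it. The rule depends
        only on consent, not on which action it is."""
        return desired_by_target

    print(permissible("shoot", desired_by_target=False))  # normal world: False
    print(permissible("shoot", desired_by_target=True))   # absurd world: True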

Now here morality for you is based strictly on outcome (ends). The goal is a "better group."

Not necessarily. That each of us wants what we want is sort of a given; humans have desires. That each of us takes action to pursue our desired ends is also sort of a given; humans have always pursued their goals and doubtless always will.

Morality is a tool for achieving our goals. More specifically, morality arises out of three recognitions: (1) agreeing to a system of universal rules diminishes conflict with other people pursuing their goals, which improves your overall ability to achieve your own goals even when the system occasionally stands in your way; (2) if the system can be modified to increase the extent to which people in general achieve their goals, that modification statistically improves the odds that you personally will achieve yours; and (3) the incentives provided by (1) and (2) apply to everyone, which means the system is sustainable: everyone has reason to agree to it and reason to maintain it.

The goal isn't a "better group." At its core, morality is selfish— which means that people can support it out of self-interest.

This again implies that the world is oscillating between absolute good and absolute evil. These terms demand clarification.

They're your terms. What do you mean by "absolute good" and "absolute evil?"

Well yeah, perceptions, but I mean... how about reality?

What you desire is subjective and can only be determined by you.

And who told them? I'm not saying people should not be able to stand up for themselves against the government, but this question needs to be asked if we are to reach any greater understanding of "basic morality."


I'm not sure what you mean by this. Your goals are yours; nobody but you can possibly decide what you want.

You know what makes you happy or miserable; you know what you enjoy or find unpleasant; you know what kinds of experiences and sensations you want and don't want, and you can choose your goals accordingly.

You can even choose goals that will make you miserable, because you desire misery, or feel you deserve misery, or are simply confused about what makes you happy. That's not within the domain of morality, however; morality asks what you want, but doesn't care why you want it. Self-inflicted misery is not a moral issue.

This is not mathematically sound. A system providing the greatest statistical gain still leaves plenty of room for significant numbers of people to be hurt overall by it.

Yes, but what are they going to do? The system which produces the least harm creates the smallest group of outcasts trying to tear it down. Most (if not all) of the people "hurt" by a perfect moral system would be hurt more by any other system, so if they're rational, they wouldn't even try to tear it down.
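
To put numbers on that (payoffs invented purely for illustration): suppose ruleset A is the least-harm system and B is some alternative. Even the group "hurt" by A scores lower still under B, and A also maximizes everyone's expected payoff before they know which group they'll land in:

    # Invented payoffs for three groups under two rulesets. A is the
    # least-harm system; B is an alternative. The "outliers" are hurt
    # by A, but hurt even more by B.
    #          name       share  A  B
    groups = [("majority", 0.60, 9, 4),
              ("middle",   0.30, 7, 6),
              ("outliers", 0.10, 2, 1)]

    for name, share, a, b in groups:
        print(f"{name:>8}: A={a}, B={b} -> prefers {'A' if a >= b else 'B'}")

    # Population-weighted average: your expected payoff if you don't
    # yet know which group you'll belong to.
    avg_a = sum(share * a for _, share, a, _ in groups)
    avg_b = sum(share * b for _, share, _, b in groups)
    print(f"expected payoff per person: A={avg_a:.2f}, B={avg_b:.2f}")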

Are you advocating utilitarianism?

Utilitarianism is a distant cousin which shares a few fundamental building blocks, but it's not the same thing.

I'd think of morality as more akin to traffic control at an extremely complicated intersection. Each car would rather go than yield and rather yield than crash. However, if each car tried to go, most would crash. The problem is entirely practical in nature— there exists a ruleset that, if followed, would produce zero crashes and an aggregate minimum of yielding. If the ruleset were proposed, each car would have reason to agree to it because each would say: "Statistically, this outcome is better for me than any other outcome would have been." If the ruleset were enacted, no car would have reason to break it because violations are incredibly likely to result in the violating car crashing. Yes, some cars will be forced to yield while another ruleset would have allowed them to go, but they'd still agree to the "perfect" ruleset because the one that lets them go is unattainable— only the perfect ruleset can legitimately achieve the universal assent needed to be enforceable.

Human interactions have far more complexity than any intersection, the range of our desires is much greater than a simple hierarchy of three possible actions, specific desires vary between individuals and over time, and morality must guard against cheating, but it's still a fundamentally practical problem.
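
The analogy can even be run as a toy model. A deliberately tiny sketch, with payoff numbers that are pure assumptions chosen only to respect the stated ordering go > yield > crash:

    # Tiny model of the intersection: N cars over repeated rounds.
    GO, YIELD, CRASH = 1, 0, -10   # arbitrary payoffs; only the order matters
    N_CARS, N_ROUNDS = 4, 1000

    def free_for_all():
        """Every car tries to go every round, so every car crashes."""
        total = N_CARS * N_ROUNDS * CRASH
        return total / (N_CARS * N_ROUNDS)   # average payoff per car-round

    def round_robin():
        """A simple 'perfect' ruleset: exactly one car goes per round,
        in rotation; the rest yield. Zero crashes by construction."""
        total = 0
        for r in range(N_ROUNDS):
            for car in range(N_CARS):
                total += GO if car == r % N_CARS else YIELD
        return total / (N_CARS * N_ROUNDS)

    print("free-for-all, avg payoff per car:", free_for_all())  # -10.0
    print("round-robin,  avg payoff per car:", round_robin())   #  0.25

Under the round-robin, each car yields three rounds out of four when a ruleset rigged in its favor would have let it go, yet it still comes out far ahead of the free-for-all; and since a rigged ruleset could never win the other cars' assent, the round-robin is the best outcome each car can actually get.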

The first, yes. The second, not necessarily.

The system which produces the best aggregate outcome for its members is statistically better for each person to join. You may still lose, but no other system will give you better odds. And a system made imperfect for your sake is unlikely to garner the universal support it needs to be meaningful.
