Resolving Moral Conflicts

  • June 12, 2014 · 7:00 PM

One of the criticisms often raised against individual morality systems is the absence of a satisfactory mechanism for resolving conflicts that arise when two moral principles apply to a given situation but yield different “oughts”, “ought nots”, or simply actions. Dilemmas, such as those posed by the trolley problems, illustrate the issue. Universalist systems, with their unwavering commitments to principles, are especially plagued by this problem. Supporters of utilitarian systems are likely to call one’s attention to the “greater good” criterion as a general solution. Individual judgments are likely to vary because of variations in commitments to the specific principles, both moral and non-moral, that people use to live their lives. Personal needs, desires, and beliefs (reality judgments) further complicate the judging process.

There is also the problem of different morality systems yielding different favoring reasons for what appear to be identical (relevantly similar) situations. Conflicts can arise because the conflicting principles apply to different aspects of the proposed action. Disagreements can also center on differences in the sources of normativity (reasons, norms) from which legitimacy is derived. The capacity to effect the action may also affect the possibility of applying the principle. Then there is the prediction problem: the probability that the context will change and that unforeseen or unintended consequences will follow.

Discussion Focus

    a. Principles (oughts and ought nots) - Universalism and particularism
    b. Greater Good – Utilitarianism’s conflict resolver
    c. Internalization of moral principles – moral commitment
    d. Role of emotions – guilt, shame, et al., as motivators for moral behavior


Some Terms

   a. Antinomy - a contradiction between two beliefs or conclusions that are in themselves reasonable; a paradox.

   b. Universalism (generalism, objectivism) 1) The practice of identifying abstract concepts by which to judge the value of an action -- a principle. 2) Abstraction - subsumption of particulars under a general principle. 3) Limits (such as applicable contexts) are provided as a priori contributory reasons stored as metadata.

   c. Counter-example/counter-factual – description of a situation for which a principle is supposed to apply but fails to yield the expected judgment. Used widely by skeptics to call into question claims of universality.

   d. Eliminativism – rejecting an entire principle/theory because of one successful counter-example/challenge.

   e. Particularism – principles serve as general guidelines, with final judgment made a posteriori based on the particulars of the context; triage

   f. Consequentialism - whether an act is morally right depends only on its consequences (as opposed to the circumstances, the intrinsic nature of the act, or anything that happens before the act). The end justifies the means? The valuation of the consequences and the distribution of that value are important aspects of this system. Also addressed under teleology. See < SEP/Consequentialism > for a plethora of variations.

   g. Contractualism - morality consists in what would result if we were to make binding agreements from a point of view that respects our equal moral importance as rational autonomous agents. An act is wrong if its performance under the circumstances would be disallowed by any set of principles for the general regulation of behavior that no one could reasonably reject as a basis for informed, unforced, general agreement. (T. M. Scanlon, What We Owe to Each Other, 1998).

   h. Utilitarianism - The main idea is that morality may be viewed as the set of decisions that would be made by an impartial benevolent observer - an observer who is aware of all the conflicting interests in a given situation, and of the consequences that different policies would have for those interests, and who is equally sympathetic towards all of the parties involved. The governing conception of utilitarianism is thus an imaginary construction (as is the governing conception of contractualism).
The moral point of view is a sort of God’s eye view, but independent of any belief in an actual God. It is the point of view that we would take if we could be fully aware of all the consequences of our actions, and could be equally sympathetic towards all those affected.

   i. Deontological ethics – duty-bound; the focus is on features of the rules and actions rather than on consequences. When duties conflict, choices must be based on other considerations, such as loyalty (to whom or to what) or the utilitarian greater good.

   j. Hedonism - the view that pleasure is the only intrinsic good and pain the only intrinsic bad.

More Terms

   a. Culture critique – advertising tells us what we should be and what we are.

   b. Discourse ethics – objectivity/impartiality; Practical reasoning alone or in pairs/groups

   c. Divine command

   d. Egoism – self-interest; it is difficult to find a moral reason for acting without some other enabling reason, such as respect from others or self-esteem

   e. Naturalism – science only tells us what is, not what ought to be

   f. Perfectionism – equality issues and each person is valued equally; How to develop outlook? How to achieve balance?

   g. Pragmatism – imminent threat; tolerant and pluralistic

   h. Rationalism – Mostly an exercise in formal logic and truth determination.

   i. Moral relativism – rejects the claim that moral rules and principles are absolute and universal, holding for all persons, places, and times

   j. Subjectivism – emotivism; factual claim and attitude; fictions, conventional wisdom; folk psychology;

   k. Virtue ethics – focus on character of person…acts secondary to the way we do things;

   l. Kant’s Categorical Imperative – Constructing a principle from a neutral (objective) stance, so as not to favor one’s self, while ignoring consequences. See the end of this page.


Some Resolution Tools/Principles

   a. Greater good (utilitarian)

   b. Cost-benefit – benefits exceed costs – 1) personal 2) societal

   c. Survival of the species – 1)contemporary 2) unknown future challenge

   d. Instrumentalism – using another person to satisfy one’s own needs/desires

   e. Regret and/or feelings of guilt/shame

   f. Golden rule

       1) Negative valence: Do not do unto others that which you would not have others do unto you.

       2) Positive valence: Do unto others as you would have them do unto you.

   g. Parable of the Good Samaritan

Moral Emotions – J. P. Tangney et al., “Moral Emotions and Moral Behavior,” Annual Review of Psychology, 2007 (see last two paragraphs, p. 4)

   a. Guilt(-proneness) – Generally associated with a specific act rather than with one’s self, is public, involves respect for others, and is generally supportive of good moral behavior, reparative actions, and amends. Embarrassment sometimes has a similar affective effect, but not as strong.

   b. Shame(-proneness) – Represents a threat to one’s self/identity, leads to defensive or hiding behavior, and is not useful in supporting good moral behavior. Often associated with an abusive history leading to maladjustment.

   c. Some positive-valence emotional affects include pride, righteous anger/contempt/disgust, elevation, gratitude, and the emotional process of empathy.

Limits (of Ethics) – J. Baggini & P. S. Fosl, The Ethics Toolkit, 2007

1. Akrasia – weakness of will; acting against one’s own better judgment
2. Amoralism – principles outside of morality, such as survival, happiness, etc.
3. Bad faith (deceiving others, misrepresenting ourselves) and self-deception
4. Casuistry (appealing to non-moral principles to justify doing what one wants to do anyway) and rationalization
5. Fallenness – our sinful nature
6. False Consciousness – applying value to something without justification; Plato’s Cave, conventional wisdom, ideologies
7. Free will and determinism
8. Moral luck – the moral assessment of an act depends partly on factors beyond the agent’s control (e.g., violating a principle and not getting caught)
9. Nihilism – annihilation of existing order
10. Pluralism – conflicting principles that defy valuation
11. Power – moral principles are about exerting power
12. Radical particularity – invoking the Hitler story to win a point; sameness is baseless
13. The separateness of persons – a gain to one is not equivalent to a loss to another; marginal value is used to counter the foregoing claim.
14. Skepticism – moral beliefs have a purely subjective or internal basis, usually in feeling, and no objective or external dimension can prescribe behavior
15. Standpoint – view from the power elite vs. that from the victim elite
16. Supererogation – praising one for exceptionally moral acts, but questioning the prudence of her excessive generosity; praising someone for a good act, but withholding judgment and not condemning someone who doesn’t perform such acts.
17. Tragedy – Scenarios (aka trolley dilemmas)

Kant’s Categorical Imperative

In his book/treatise, Groundwork of the Metaphysics of Morals, Kant provides a basis for morals:

"I ought never to act except in such a way that I can also will that my maxim should become a universal law."

Kant calls this principle the Categorical Imperative, for one must follow it in all circumstances (i.e., categorically); it is distinct from Hypothetical Imperatives that one needs to follow only if they further some end that one wants to achieve. To act according to the Categorical Imperative, one must formulate a maxim, which is a statement of one's intended action and the reason one would follow that action.

This of course raises the question of what actions are from duty and are therefore constitutive of a good will. Kant states that only actions that occur from the "representation of the law in itself" count. One must rationally determine [a priori] what the moral law is in a particular circumstance, and act just because of the moral law. To do so, one adheres to the following principle:

“It [The Categorical Imperative] is concerned, not with the matter of the action and its presumed results, but with its form and with the principle from which it follows; and what is essentially good in the action consists in the mental disposition, let the consequences be what they may. This imperative may be called the imperative of morality.”

-- Groundwork of the Metaphysics of Morals, Immanuel Kant, 1785.


* * * ADDENDUM * * *

Components of Moral Action – C. E. Johnson, Meeting the Ethical Challenges of Leadership: Casting Light or Shadow, 2011; Chapter 7 – Ethical Decision Making and Behavior

There are a number of models of ethical decision making and action. For example, business ethics educators Charles Powers and David Vogel identify six factors or elements that underlie moral reasoning and behavior and that are particularly relevant in organizational settings. The first is moral imagination, the recognition that even routine choices and relationships have an ethical dimension. The second is moral identification and ordering, which, as the name suggests, refers to the ability to identify important issues, determine priorities, and sort out competing values. The third factor is moral evaluation, or using analytical skills to evaluate options. The fourth element is tolerating moral disagreement and ambiguity, which arises when managers disagree about values and courses of action. The fifth is the ability to integrate managerial competence with moral competence. This integration involves anticipating possible ethical dilemmas, leading others in ethical decision making, and making sure any decision becomes part of an organization’s systems and procedures. The sixth and final element is a sense of moral obligation, which serves as a motivating force to engage in moral judgment and to implement decisions.

James Rest of the University of Minnesota developed what may be the most widely used model of moral behavior. Rest built his four-component model by working backward. He started with the end product—moral action—and then determined the steps that produce such behavior. He concluded that ethical action is the result of four psychological subprocesses: (1) moral sensitivity (recognition), (2) moral judgment, (3) moral focus (motivation), and (4) moral character.

Component 1: Moral Sensitivity (Recognition)

Moral sensitivity (recognizing the presence of an ethical issue) is the first step in ethical decision making because we can’t solve a moral problem unless we first know that one exists. A great many moral failures stem from ethical insensitivity. The safety committee at Ford Motor decided not to fix the defective gas tank on the Pinto automobile (see Chapter 2) because members saw no problem with saving money rather than human lives. Wal-Mart was slow to respond to concerns raised by employees, labor groups, environmentalists, and others about wage violations, sexual discrimination, poor environmental practices, and other issues. Many students, focused on finishing their degrees, see no problem with cheating. (You can test your ethical sensitivity by completing the “Self-Assessment: Moral Sensitivity Scenarios.”)

According to Rest, problem recognition requires that we consider how our behavior affects others, identify possible courses of action, and determine the consequences of each potential strategy. Empathy and perspective skills are essential to this component of moral action. If we understand how others might feel or react, we are more sensitive to potential negative effects of our choices and can better predict the likely outcomes of each option.

A number of factors prevent us from recognizing ethical issues. We may not factor ethical considerations into our typical ways of thinking or mental models. We may be reluctant to use moral terminology (values, justice, right, wrong) to describe our decisions because we want to avoid controversy or believe that keeping silent will make us appear strong and capable. We may even deceive ourselves into thinking that we are acting morally when we are clearly not, a process called ethical fading. The moral aspects of a decision fade into the background if we use euphemisms to disguise unethical behavior, numb our consciences through repeated misbehavior, blame others, and claim that only we know the “truth.”

Fortunately, we can take steps to enhance our ethical sensitivity (and the sensitivity of our fellow leaders and followers) by doing the following:

  a.  Active listening and role playing

  b.  Imagining other perspectives

  c.  Stepping back from a situation to determine whether it has moral implications

  d.  Using moral terminology to discuss problems and issues

  e.  Avoiding euphemisms

  f.  Refusing to excuse misbehavior

  g.  Accepting personal responsibility

  h.  Practicing humility and openness to other points of view

In addition to these steps, we can also increase ethical sensitivity by making an issue more salient. The greater the moral intensity of an issue, the more likely it is that decision makers will take note of it and respond ethically.  We can build moral intensity by doing the following:

  a.  Illustrating that the situation can cause significant harm or benefit to many people (magnitude of consequences)

  b.  Establishing that there is social consensus or agreement that a behavior is moral or immoral (e.g., legal or illegal, approved or forbidden by a professional association)

  c.  Demonstrating probability of effect, that the act will happen and will cause harm or benefit

  d.  Showing that the consequences will happen soon (temporal immediacy)

  e.  Emphasizing social, psychological, or physical closeness (proximity) with those affected by our actions

  f.  Proving that one person or a group will greatly suffer due to a decision (concentration of effect)

Finally, paying attention to our emotions can be an important clue that we are faced with an ethical dilemma. Moral emotions are part of our makeup as humans. These feelings are triggered even when we do not have a personal stake in an event. For example, we may feel angry when reading about mistreatment of migrant workers or sympathy when we see a picture of a refugee living in a squalid camp. Moral emotions also encourage us to take action that benefits other people and society as a whole. We might write a letter protesting the poor working conditions of migrant laborers, for instance, or send money to a humanitarian organization working with displaced persons.

Anger, disgust, and contempt are other-condemning emotions. They are elicited by unfairness, betrayal, immorality, cruelty, poor performance, and status differences. Anger can motivate us to redress injustices like racism, oppression, and poverty. Disgust encourages us to set up rewards and punishments to deter inappropriate behaviors. Contempt generally causes us to step back from others. Shame, embarrassment, and guilt are self-conscious emotions that encourage us to obey the rules and uphold the social order. These feelings are triggered when we violate norms and social conventions, present the wrong image to others, and fail to live up to moral guidelines. Shame and embarrassment can keep us from engaging in further damaging behavior and may drive us to withdraw from social contact. Guilt motivates us to help others and to treat them well. Sympathy and compassion are other-suffering emotions. They are elicited when we perceive suffering or sorrow in our fellow human beings. Such feelings encourage us to comfort, help, and alleviate the pain of others. Gratitude, awe, and elevation are other-praising (positive) emotions that open us up to new opportunities and relationships. They are prompted when someone has done something on our behalf, when we run across moral beauty (acts of charity, loyalty, and self-sacrifice, for example), and when we read or hear about moral exemplars (see Chapter 3). Gratitude motivates us to repay others; awe and elevation encourage us to become better persons and to take steps to help others.

In sum, if we experience anger, disgust, guilt, sympathy, or other moral emotions, the chances are good that there is an ethical dimension to the situation that confronts us. We will need to look further to determine if this is indeed the case.

