Tag Archives: Ethics

A Place for Philosophy

In 2004, the city of Memphis celebrated the “50th Anniversary of Rock ‘n’ Roll,” attributing the roots of rock to Elvis Presley’s 1954 recording of That’s All Right. Some have rightly taken issue with this; black artists were playing what sounded like rock music for years before Elvis. Perhaps the origin of rock music is better attributed to Jackie Brenston’s Rocket 88, recorded in 1951. This seems to be just one instance in music history in which white artists capitalized on the talent of lesser-known black artists. Still today, some argue that white artists are more successful at pushing boundaries simply because they are white.

Just as the achievements of black artists sometimes go unacknowledged, philosophers are often not acknowledged for their past and ongoing contributions to furthering knowledge (scientific or otherwise). Moreover, scientists rely on philosophical thinking in their work but often don’t acknowledge it as such. In what follows, I hope to convince the reader that philosophical thinking has a central place in modern science and academia and that we ought not marginalize philosophy in favor of pure scientism.

Science as we know it today developed out of philosophy. Galileo and Isaac Newton were known as “natural philosophers.” It was not until the modern period that science began to separate from philosophy. Even after this separation, there was plenty of interaction between the two disciplines. Calculus was formalized by a philosopher (Leibniz), and many philosophers have made contributions to modern mathematics. However, science does not just owe a tip of the hat to philosophy for its origins; many modern scientific theories, such as string theory, rely on reasoning that goes beyond physically observable phenomena–what I think of as philosophical reasoning.

This past week, I attended Denver’s third Nerd Nite. Kayla Knopp, Rachel Miller, and Lane Nesbitt, clinical psychology students at the University of Denver, spoke about the way in which both biology and culture influence our understanding of sexual behavior. Although they didn’t state their conclusions explicitly (more on this below!), I take one of their main conclusions to be that (C) the set of all morally permissible sexual activities is larger than the set of all culturally accepted sexual activities. After what I think was an honest assessment of their talk, I took the argument for (C) to be the following:

P1. In nonhuman animals, there exists a set of sexual activities that our culture would find unacceptable if practiced by humans (e.g., one member of this set is deer threesomes!).
P2. We have evolved to enjoy (some) activities in this set.
P3. Since these activities are part of our natural evolutionary heritage, we ought to explore them (i.e., they are not immoral).
C. The set of all morally permissible sexual activities–i.e., the set of culturally acceptable activities plus the set of “natural-yet-culturally-unacceptable” activities that we ought to explore–is larger than the set of all culturally accepted sexual activities.
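Stated this way, (C) is just a claim about a proper superset relation. As a toy sketch (the element labels are my own placeholders, not examples from the talk):

```python
# Toy sketch of conclusion (C): the set of morally permissible activities is
# the culturally accepted set plus the "natural-yet-culturally-unacceptable"
# activities we ought to explore. Element labels are placeholders of my own.
culturally_accepted = {"a1", "a2", "a3"}
natural_but_unaccepted = {"b1", "b2"}

# Union the two sets, per the definition in (C):
morally_permissible = culturally_accepted | natural_but_unaccepted

# (C) says the accepted set is strictly contained in the permissible set:
assert culturally_accepted < morally_permissible
assert len(morally_permissible) > len(culturally_accepted)
```

The strict containment holds only if the second set is nonempty and disjoint from the first–which is exactly what premises P1–P3 are supposed to establish.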

To be clear, I agree with (C); it is the argument for (C) that I take issue with. The problem with this argument–as many philosophy students may pick up on–is that P3 is fallacious: just because an act is natural doesn’t mean that that act is right, moral, or ethical. There are many examples that illustrate this fact. For one, we have evolved to eat meat—i.e., it is natural—but it’s not at all clear that eating meat is ethical; there are very strong arguments that support the claim that eating meat is unethical. To consider a more relevant example, sex between adult and juvenile animals occurs naturally among certain types of weasels and primates. But, among humans, such acts are considered immoral by most (and for good reason, since these relationships in humans often cause significant harm to at least one party, they are often non-consensual, etc.).

I notice this fallacy often in scientific discussion, and it is sometimes made by scientists. But philosophers have been warning us about making this mistake for so long that they’ve given it a name: the appeal to nature fallacy. Generally speaking, my (admittedly anecdotal) experience suggests that philosophers are more careful with the form of their reasoning than scientists are. As further evidence for this claim, philosophers are notorious for writing their arguments and conclusions explicitly and very carefully; they sometimes even number their premises in the way that I did above. Philosophers follow this practice because they think it removes a lot of potential ambiguity and misinterpretation. Scientists follow this practice less often, but I think that, in many instances, doing so would make their work clearer. After all, if explicitly outlining arguments were a standard scientific practice, then I wouldn’t have had to create my own (possibly misinterpreted!) outline of the sex-talk argument above. The point here is that, insofar as scientists use arguments in their work, they are doing philosophy in some sense, and that scientists may have something to learn from philosophers about how to present their arguments more effectively.

This week’s Radiolab provides another example of how philosophical thinking fails to get the credit that it deserves. It’s worth looking at the description of the episode:

The “mind” and “self” were formerly the domain of philosophers and priests. But in this hour of Radiolab, neurologists lead the charge on profound questions like “How does the brain make me?”

The connotation here (if I’m not being overly sensitive to the science-philosophy tension) seems to be that, for centuries, philosophers have spun their wheels with the question of personal identity, but that no serious work had been done until brain scientists caught on to the question. This kind of thinking is problematic for a number of reasons. For one, in an incredibly complicated area such as personal identity, just formulating the right set of questions is difficult. In many areas, philosophers deserve credit for formulating the right questions. Second, as philosopher Alva Noë has argued, neuroscience operates under the philosophical assumption that consciousness and the self are cognitive phenomena that are to be found by studying neural activity in the brain. This assumption is not empirical or “scientific”; it has roots in Cartesian philosophy (and I don’t think it is at all obviously true). Noë’s position, which builds upon work by Martin Heidegger and Maurice Merleau-Ponty, is that we are missing something of fundamental importance if we do not try to understand the self in terms of our embodied existence. Finally, the idea that discoveries in neuroscience (and many other disciplines) are reliable has been called into question by epidemiologist John P. A. Ioannidis. Ioannidis, in the celebrated paper “Why Most Published Research Findings Are False,” argues that, in “hot” fields where there are small studies, small effect sizes, or great financial interest, many findings are likely to be false (arguably, neuroscience falls into each of these categories). While Ioannidis is not a philosopher, his arguments attack the (poor) statistical reasoning of some scientists. Since statistics is really a formalization of inductive reasoning, it can be understood as a sub-discipline of philosophy. Thus, I think Ioannidis’s criticism of scientific findings is largely a philosophical criticism.
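The core of Ioannidis-style reasoning can be captured in a short back-of-the-envelope calculation: the probability that a “statistically significant” finding is actually true (the positive predictive value, or PPV) depends heavily on a study’s power and on the prior odds that the hypotheses being tested are true. The sketch below uses the standard PPV formula, but the specific numbers are my own illustrative assumptions, not figures from his paper.

```python
# Back-of-the-envelope sketch of the Ioannidis-style argument: the positive
# predictive value (PPV) of a "significant" finding, ignoring bias.
# The input numbers below are illustrative assumptions, not figures from
# "Why Most Published Research Findings Are False."

def ppv(prior, power, alpha):
    """Probability that a statistically significant finding is true."""
    true_positives = power * prior          # truly-true hypotheses detected
    false_positives = alpha * (1 - prior)   # false hypotheses passing the test
    return true_positives / (true_positives + false_positives)

# A well-powered study in a field where half of tested hypotheses are true:
print(round(ppv(prior=0.5, power=0.8, alpha=0.05), 2))  # → 0.94

# A small, underpowered study in a "hot" field probing unlikely hypotheses:
print(round(ppv(prior=0.1, power=0.2, alpha=0.05), 2))  # → 0.31
```

On these assumptions, fewer than a third of “significant” findings in the second scenario are true, even before accounting for bias–which is the philosophical point about inductive reasoning, not a fact about any particular field.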

There are many other examples of the importance of philosophy in scientific practice: disputes in the foundations of statistics, the interpretation of probability in modern physics, the definition of life in biology, and the understanding of measurements and mathematical models are all inherently philosophical. We ought to acknowledge the important role that philosophy plays.





1 Comment

Filed under Uncategorized

An Open Letter to Dr. McCallin Re: Pay and Benefits Equality Act of 2014 (HB14-1154)

Dear Dr. McCallin,

I am a part-time instructor in the Colorado Community College System (CCCS) and a PhD student at a nearby university. I want to thank you for your thoughtfulness and dialogue with respect to the Community College Pay and Benefits Equality Act of 2014 (HB14-1154).

This is a tough issue, and, as I’m sure you know, one that is receiving national attention. Colorado has the opportunity to lead the nation in providing better conditions for one of the most vital groups on campus–the group with the most direct impact on students–educators.

I’m sure you know the financial situation of the CCCS better than just about anyone. I trust that if it is possible to allocate future funds to pay educators a living wage, without putting the CCCS itself in a dire position, then you will use those funds to move away from an unjust and unsustainable model–one that leaves many adjuncts in dire positions, treats a large percentage of educators as disposable labor, and uses them as a means to the end of financial (but not social) health.

Fortunately, my income from the CCCS is supplemental. If I cannot secure one of the few (much desired) full-time positions in the CCCS, I’m confident that I have the skills and motivation to be successful somewhere else. But that would be a shame. I love teaching in the CCCS more than anything else that I do. Much of the private sector–the big banks and insurance companies in particular, with their unethical practices–doesn’t deserve the skills that I may be able to provide. The CCCS does. Those who serve our community as educators need some sort of reasonable financial incentive to stay. I promise you, it will pay dividends.




The Paradox of Choice and the Death of God

Although some choice is undoubtedly better than none, more is not always better than less. –Barry Schwartz

God is dead…how shall we comfort ourselves, the murderers of all murderers? –Friedrich Nietzsche

My least favorite aisle in the grocery store is the cereal aisle. It’s not because I don’t love cereal (I do); it’s because there are too many choices. I know that I probably ought to eat one of the hundreds of low sugar, high fiber cereals–perhaps one of the many kinds of oatmeal. But the hundreds of different chocolate cereals are so damn good. At least for a time. Then I get tired of the sweet stuff and wish that I had bought something else. The problem is that I take too much time deciding, and, in the end, I am usually disappointed with my choice.

Psychologist Barry Schwartz describes some research that suggests that having too many choices–like I have in the cereal aisle–is actually bad for one’s well-being. His claim is that we feel less satisfied with the choices that we make because we are haunted by the idea that one of the many other options that we did not choose may have been better. Although we tend to think that the more choice we have the better, Schwartz is convincing in arguing that the abundance of choice, even in trivial domains like the grocery store, can make us less happy (he calls this the paradox of choice).

The problem seems to be much more pronounced when we think about the choices related to important existential questions in our lives. Choices about what school we ought to go to, what career path we ought to take, where we ought to live, whom we ought to date or marry, whether and when we ought to have children, etc., can be extremely difficult. To spend a lot of time thinking about these questions is to acknowledge that a lot is at stake; the idea that the chosen career path or mate was the wrong choice is enough to cause us serious psychological harm.


But were these existential choices ever easy? It seems as though people have long been faced with important choices about where to work or whom to marry. There are good reasons, however, to think that the nature of the choices that we face today is different because there really are so many more options available to us. This is not just an empirical claim about the number of potential careers or mates in the world (although it is probably true that there are more possible careers and mates); rather, it’s a claim about a worldview that has slowly pervaded our culture. This new worldview, the philosophical foundation of which arguably began with Nietzsche’s claim that “God is dead,” seems to be one in which old models for living are being erased or left behind. For example, getting married used to be a thing that you did as soon as you could; the same with having children. It didn’t even seem like a choice. Everyone just did it. Schwartz writes:

A range of life choices has been available to Americans for quite some time. But in the past, the “default” options were so powerful and dominant that few perceived themselves to be making choices. Whom we married was a matter of choice, but we knew that we would do it as soon as we could and have children, because that was something all people did. The anomalous few who departed from this pattern were seen as social renegades, subjects of gossip and speculation.

Today it is hardly the case that those who don’t have children are renegades. Many people make a deliberate choice to not have children. But the problem that comes along with shaking the old existential models–e.g., grow up, get a job, get married, have a child (or several)–is that there aren’t corresponding replacement models for existential decisions; instead, we are faced with a myriad of ways to live our lives, and, prima facie, many of these ways seem equally good (note that this is not to say that some aren’t bad…). If this analysis is accurate, then it seems as though we are faced with the same dilemma that I faced in the cereal aisle: knowing that there were other options available to us makes the option we chose less satisfying. Perhaps it would have been better to have children, or to be a teacher rather than a researcher, one might reason. If we have destroyed the old existential models (perhaps for good reason), then how are we supposed to live? Or, equivalently, if God really is dead, then how shall we, the “murderers of all murderers,” comfort ourselves?


Rationing by Price

Last month, hepatitis researchers made an astounding breakthrough in the fight against hepatitis C. A drug named sofosbuvir is claimed to cure hepatitis C in over 90% of patients with very few side effects. Given that hepatitis affects millions of people in the US and globally–many times more people than are affected by AIDS–this is great news. However, sofosbuvir’s manufacturer, Gilead Sciences Inc., is expected to sell the drug for $1,000 per pill. Since the treatment is expected to take several weeks, experts estimate that a full treatment–that is, a cure of hepatitis C–will cost, on average, $84,000.

This hefty price should raise a number of questions about access to healthcare. For one, even acknowledging the fact that Gilead Sciences Inc. has invested large amounts of money into developing this drug, we should ask ourselves whether this is a fair price for a disease that causes suffering to hundreds of millions of people. I think that, independent of which ethical theory you subscribe to (e.g., utilitarianism, deontology, virtue ethics), there are strong arguments to be made that this is not a fair price. But, here, I’d like to argue for another claim that is often overlooked by proponents of free market healthcare and those who criticize healthcare rationing: charging $84,000 to cure a disease is rationing healthcare (by price), and further, rationing healthcare by price is at least as unethical as rationing by age, pre-existing condition, or employment type.

The verb ration means to “allow each person to have a fixed amount of a particular commodity.” So, to ration healthcare means something like “to allow each person to have a fixed amount–and perhaps that fixed amount is none–of a particular medical treatment.” In recent debates about the Affordable Care Act (Obamacare), opponents of the act claimed that it will lead to rationing of healthcare, particularly to seniors. They argue that government bureaucrats will be the ones deciding whether the costs of a particular treatment (perhaps tens of thousands of dollars) will outweigh the benefits of the treatment (perhaps only months of additional low quality life). It may be the case that government healthcare will lead to forms of rationing; perhaps some government decision makers will decide that a $50,000 treatment for an expected six months of extra low quality life is not worth the cost, when that same cost can be spent in much more efficient ways (perhaps by extending a higher quality life by many, many years). But opponents of this kind of rationing are silent about the form of rationing that is standard practice. We already ration healthcare in the US. When a cure for a condition like hepatitis C costs $84,000, we are rationing by price: if you do not have the money, your allotment of sofosbuvir is zero. When an insurance company decides that it won’t allow those with pre-existing conditions into the risk pool–a practice allowed before Obamacare–we are rationing by pre-existing condition (got cancer? Oh well!). The question we ought to ask is not, “Is it permissible to ration healthcare?” since some form of rationing scarce healthcare resources will always take place; rather, we ought to ask, “What form of rationing is most permissible?” Those who think that rationing by market forces is most permissible seem to implicitly assume that those who don’t have the money for a treatment must have not worked hard enough to deserve it.
Even a basic understanding of the kind of systematic inequality and lack of social mobility that exists in the US seems to suggest that this is seriously mistaken.

Proponents of rationing by price may rightly ask, “Does the existence of a cure imply that the cure ought to be available to everyone?” To this question, I ask another: if you have the ability to easily save someone’s life, say, by administering a medical treatment, but don’t, are you responsible to some degree for that loss of life?


January 12, 2014 · 11:54 am

Reasons and Public Policy

In January 2013, members of New Trier Public Schools in Illinois hosted a forum to present information about guns and public safety. During the talk, the crowd became defensive about the message; finally, an army veteran began to speak about our constitutional rights. He said:

I went to war for your ability to have the First Amendment, to say what you stood up there and said today, to write what you want to write in your newspaper, and have whatever opinion you want to have. You can practice whatever religious freedoms you want. I would like you to answer the question, since you just [mentioned] one of the rights that I went to war over to defend, that is inalienable, to every American citizen. If this discussion was going on, about your First Amendment rights, would you still have the same opinion that we don’t need that any more either?

Goodman, a speaker from the New Trier Public Schools reasoned with the veteran as follows: when the second amendment to the constitution was adopted in 1791, Congress and the people had particular reasons for supporting it; we ought to reevaluate whether these reasons are relevant today.

I have a feeling that, to many Americans–call them constitutionalists–the idea of reevaluating the reasons for the second amendment is foreign. The Constitution, the Bill of Rights, the Declaration of Independence and other government documents written at that time, so think constitutionalists, state the inalienable natural rights of American citizens. Such rights just are, and no reasoning ought to take them away. In fact, audience members during this debate between Goodman and the veteran can be heard saying things like “eternal truth…true for all time” in the background.

I think this kind of reasoning put forth by constitutionalists is highly problematic (and, to be sure, it’s not just limited to debates about guns). The Constitution, the Bill of Rights, the Declaration of Independence, etc., are documents written by people. The people who wrote them, as smart as they were, were not infallible, and did not have any special connection to God (if He exists…) or any special deep insight into the nature of the universe that other scholars lacked. As evidence for this claim, we can point to the fact that Thomas Jefferson, while writing that “all men are created equal,” owned slaves. While we might defend Jefferson as being blameless on the grounds that, as a product of his time and culture, he did not consider enslaved Africans to be “men”, this is no reason to believe that Jefferson was infallible, since we all (hopefully) believe that Africans are “men” and that enslavement is wrong. Thus, if the individuals who wrote these documents are fallible, then we have no prima facie reason to take them as conveying the absolute truth. Instead, as Goodman points out, we should probably look at the reasons why they were written and consider whether those reasons are relevant today.

One might object at this point (as the veteran does during the debate) that the reasons for holding onto the second amendment have not changed. This claim cannot be true of all of the reasons. Here are some reasons that people in 1791 supported the second amendment:

  1. citizens ought to be able to form a militia
  2. citizens ought to be able to participate in law enforcement
  3. citizens ought to be able to suppress slave revolts
  4. citizens ought to be able to protect themselves from tyranny

At the very least, we no longer believe the third reason to be legitimate. Further, given the nature of current military weapons, the fourth reason seems silly: no weapon available to a civilian can offer protection against the kinds of weapons that the government possesses. Given that these two reasons are outdated, we can conclude that the legitimacy of a reason for a law depends on the time and context in which the reason was given. So, it seems to follow that we ought to focus on our context, consider how we ought to interpret the second amendment, and think about what reasons we might have for passing certain laws that limit it.

The purpose of this post is not to defend or denounce second amendment rights. Instead, the purpose of this post is to shed light on how important reasons are. It’s not enough to rally around a veteran simply because he is a veteran or point to inalienable natural rights; veterans can be wrong and debates about whether such rights exist are highly philosophical and far from settled. Instead, we ought to decide what values are most important and what policies would best promote those values. This process involves assessing reasons.


The Environmental Impacts of Meat-Eating

Climate change—which is thought to be anthropogenic (human-caused) by the vast majority of climate scientists—is projected to have serious impacts on the environment that we depend on, and thus, on our health. For example, it is projected that climate change will increase the frequency of major storms like Hurricane Katrina and Super Storm Sandy; cause sea level rise, and thus serious suffering, in many areas including very rich (New York City) and very poor (Bangladesh) areas; cause water scarcity (this is particularly relevant to Colorado); and increase the prevalence of many diseases and disorders, including malaria, diarrhea, and asthma. An incredibly important question that we now face is, what, if anything, can we do to slow the effects of climate change?

On my most optimistic days, I tend to think that there are things that we can do to slow the effects of climate change (on pessimistic days, I tend to think we’re in serious trouble no matter what). I do my best to take up some of these actions: for example, I try to compost and recycle what I can; this semester, I didn’t buy a parking permit to coerce myself into biking to campus more often; for recent home renovations, I tried my best to buy environmentally friendly items like Energy Star appliances and recycled materials. While I think that these actions are beneficial if repeated on a large scale, there’s one action that, if we’re serious about slowing the effects of climate change, we ought to take: reduce our consumption of meat, or cease eating it altogether.

I know how bad that sounds. For most Americans, including me, eating meat is a deeply ingrained piece of our culture. Even if one is convinced that meat-eating is, in most cases, wrong (and I’m thoroughly convinced that it is), ceasing something so central to our culture is difficult. But we ought not let what is difficult stand in the way of what is right. After all, no one would be convinced that, since ending institutional slavery in the US in the 19th century was difficult, we might as well not have bothered. Instead, we ought to try, as much as we can, to do what is right, independent of how hard it is. That being clear, now comes the difficult part: arguing convincingly that eating meat is wrong.

There are very convincing reasons to believe that eating meat is wrong, especially in the fashion that we produce it, because it causes extreme suffering. My goal is not to make this argument here; rather, I hope to argue that it is wrong because the negative environmental impacts of producing meat are great. Consider some facts about the way that we produce food:

(1) It is estimated that one pound of beef uses between 2,500 and 5,000 gallons of water; a pound of chicken requires 815 gallons of water. If you know anything about water issues, especially in the western US, you see how much of an issue this is. By contrast, a pound of rice requires 400 gallons of water, a pound of potatoes requires 30 gallons, and a pound of lettuce requires 15 gallons.

(2) It is estimated that one pound of beef requires almost a quarter of a gallon of oil. A full-sized cow requires almost 300 gallons.

(3) The emissions produced by an 8 oz. steak are equivalent to the emissions produced by driving 14 miles.

(4) Since most of the cows that we eat are fed an unnatural diet of corn and grains (rather than grass), they tend to be very gassy (that’s right—they fart a lot). The methane released from cows is thought to be a significant contributor to climate change.

(5) The waste from concentrated animal feeding operations produces some nasty waste, which includes antibiotics, hormones, chemicals, and ammonia and heavy metals. This waste is known to pollute waterways and drinking water.
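To put the water figures in fact (1) into perspective, a quick calculation helps. The sketch below uses the numbers quoted above, taking the midpoint of the 2,500–5,000 gallon estimate for beef; the comparison itself (pounds of other food producible with one pound of beef’s water) is my own framing.

```python
# Quick arithmetic on the water figures quoted above (gallons per pound).
# The beef value is the midpoint of the 2,500-5,000 gallon estimate.
water_per_pound = {
    "beef": (2500 + 5000) / 2,  # midpoint of the estimated range
    "chicken": 815,
    "rice": 400,
    "potatoes": 30,
    "lettuce": 15,
}

# For each food, how many pounds the water behind one pound of beef
# could produce instead:
beef_water = water_per_pound["beef"]
for food, gallons in water_per_pound.items():
    print(f"{food}: {beef_water / gallons:.0f} lb")
```

By this rough accounting, the water required for a single pound of beef could instead produce about 125 pounds of potatoes or 250 pounds of lettuce.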

These facts lend a lot of support to the following claim: meat production is detrimental to the environment, is a contributor to climate change, and thus is detrimental to human health. Since it is plausible that we ought not support what is detrimental to human health, it follows that we ought not eat meat (or, at least, that we ought to greatly reduce our consumption).

What do you think of this argument? Are you convinced of the conclusion? If not, then it must be that either some of the claims are false or the argument form is bad. Which is it? Feel free to share by emailing me at bzaharat@mines.edu.


Without God

Without God, Are All Things Permissible?

“But what will become of man then…without God and immortal life? All things are lawful then, they can do what they like?” –Dmitri, The Brothers Karamazov

The question posed by Dostoevsky’s Dmitri is one that motivates many to believe that morality is inextricably linked to God. They believe that, if there are things called moral truths, then they must be given and enforced (either on earth or in the afterlife) by God. Just as civil laws are given by lawgivers (e.g., the state), moral laws are given by God. Further, if there is no God to hold us accountable for following these laws then we are free to harm others as we please. Since this conclusion is undesirable, so the story goes, in order for there to be a moral law, there must be a God.

It’s worth observing that there is a logical error in this argument: the fact that a claim is undesirable does not imply that the claim is false. For example, it is undesirable that many people suffer and die unnecessarily, but it is still true. We will overlook this logical problem and consider the essence of the argument: that, without God, all things are permissible.

There are (at least) two plausible answers to the claim that, without God, all things are permissible. The first answer, which is similar to an argument given by Socrates (in Plato’s Euthyphro), attempts to show that, even if there is a God, moral laws do not depend on God, and thus, not all things are permissible. The argument goes as follows: if God believes that an entity or action X is good, it is either because (1) X possesses some properties—properties that are independent of God—that make it good, or (2) X is good only in virtue of the fact that God believes that X is good. But, if (2) is true, then it seems as though moral laws are completely arbitrary. For, if X is good only because God likes it, it follows that, if God happened to like senseless torture, then senseless torture would be good. But, barring an extreme and unjust moral relativism—one that a benevolent God ought not allow—senseless torture is not good. So (1) must be true. But, if that is so, then we do not need God to study what is moral; rather, we just need to study the properties of X that make it good. So, Dmitri’s worries are in vain—without God, there is still the possibility of moral goodness.

The second answer, given by 20th century existentialist philosopher Jean Paul Sartre, accepts the notion that there is no God, and the notion that without God, there is no objective moral law or punishment. But, for Sartre, it does not follow that all things are permissible in the sense that Dmitri meant it! The absence of God comes with extreme responsibility. For Sartre, rather than conforming to the moral law, with every action, you are writing your own part of the moral law. So, if you decide to treat others with disrespect, cause suffering, kill, cheat, etc., you are expressing to others that it is permissible to do these things. Dmitri’s worries about morality are important, but not just because the absence of God implies the possibility of chaos and anarchy; instead, they are important because they summon one to think very seriously about who they want to be, and what kind of picture they want to paint of humanity.

So, what do you think? Is it possible to have morality without God? If so, is it because morality isn’t dependent on God, as Socrates argues? Or do we have a deep responsibility for writing the moral law through our actions, as Sartre argues?

