Friday, March 30, 2012

Video gaming

Almost 60% of Americans (183 million) play video games; an incredible 97% of Americans aged 12 to 17 play video games. For some, video games become an addiction: 5 million Americans play 40 or more hours of video games per week.

Five popular myths about video games are:

1.    They’re only for kids. The most popular video game is solitaire. People of all ages play a wide variety of video games on everything from smartphones to specially designed equipment.

2.    They’re only for boys. In the 12 to 17 age bracket, almost as many girls (94%) play as boys (99%); 40% of all gamers are female.

3.    They’re a boutique business. Not true – video games represent a $25 billion industry.

4.    They cause violent behavior in young people. The research is inconclusive.

5.    They stunt the social skills of young people. This depends on the type of game played, amount of time spent gaming, and other activities in which the person engages. (Robert A. Lehrman, “Five myths about video games,” Christian Science Monitor)

The average American gamer is about 37 years old and has played for 12 years. Video gaming is not just an activity for kids, nor one that people outgrow. Why?

Yale professor Paul Bloom, author of "How Pleasure Works," points out that Americans find many products of the imagination – games, movies, TV – more interesting than real life.

"Why would individuals ... watch the television show 'Friends,'" he quotes one psychologist as saying, "rather than spending time with actual friends?"

Among other things, Dr. Bloom says, the adventures of fictional characters are usually "much more interesting" than ours. (Robert A. Lehrman, “Video game nation: Why so many play,” Christian Science Monitor, March 18, 2012)

Video gaming engages the player, stimulates the imagination, requires decisions, sometimes involves human interaction, and perhaps hones motor skills. These are all good things.

There is even an Anglican video game, The Anglican Cathedral of Second Life. I’ve not played, but have read some positive reviews.

Simplistic condemnations of video gaming ignore its pervasive reality and its multidimensional nature. As with most things, video gaming may be positive or negative in a person’s life. An addiction to video gaming, for example, that causes a person to lose her/his job, interferes with important personal relationships, and is otherwise destructive is obviously unhealthy. Similarly, some future research will probably show that for some gamers, extended playing (the research will need to quantify this) of excessively violent games (also needing specificity) causes, with measurable frequency and in whole or in part, antisocial or criminal behaviors. I know from my work with moral injury and the morality of war that warriors must learn to kill. Research has consistently shown that without highly focused training, only about 10% of soldiers in combat actually shoot at the enemy. The others do not fire their weapons or intentionally aim to avoid hitting the enemy. These statistics hold even for troops receiving enemy fire.

God intends life to be good and for humans to enjoy pleasurable activities. Video gaming can contribute to that pleasure while also developing life skills. Rather than maintaining its distance from video games, the Church could profitably engage with gaming and encourage responsible gaming. When I picture Jesus living in the twenty-first century, I can easily see Jesus and his friends playing video games.

Wednesday, March 28, 2012

Trayvon Martin, Robert Bales, and human nature

Is human nature basically good or bad?

Answering that question requires at least three decisions. First, one must define the terms good and bad. Both have had multiple definitions, yet assessing human nature requires clarity of terms.

Good, it seems to me, denotes that which gives or enriches life, and bad denotes that which takes away or diminishes life. From a Christian perspective, those definitions cohere with the idea that God is the creator who intends life for the living, an idea that resonates deeply throughout the Scriptures.

Second, one must define human nature. Is human nature strictly a result of one’s genes, i.e., one’s physiology? Probably not. Physicists pondering the nature of the cosmos increasingly identify emergent properties, a whole greater than the sum of its parts. In other words, atomistic reductionism of the kind favored by many atheists (and others) fails to account for the complexity of existence (which is very different than arguing for God's existence).

By human nature, I mean a person’s basic orientation, i.e., does a human tend to give and to enrich life or to take life and to diminish it? Observationally, I note that some humans frequently practice reciprocal altruism (loving their neighbors with the hope of being loved in return, i.e., choosing what they think are win-win behaviors). Other humans more frequently engage in a winner-take-all competition that may enrich their life but does so at the expense of others.

Third, which of those orientations is most prevalent, i.e., is human nature basically good or bad? On the one hand, I find assessments about human progress extremely difficult (cf. Ethical Musings: Is progress possible?). On the other hand, this is not an easily avoided question. One’s perspective on human nature colors how you expect others to treat you, whether you think most people tell the truth, whether you presume most people try to do what they perceive is right, etc.

The historic Christian answer, that original sin taints every human, seems inadequate. Christian theologians and biblical scholars debate whether the idea of original sin accurately identifies a Scriptural theme. Furthermore, observation reveals considerable disparity in human behavior and orientation, disparities not aligned with Christian or even religious commitment, i.e., some bad people self-identify as Christian and some good people self-identify as non-religious.

The killing of Trayvon Martin highlights the danger of presuming that people are bad and out to take advantage of one. A neighborhood watch volunteer shot the young, unarmed man for being the wrong race in the wrong place. The bad – the neighborhood watch volunteer, unwilling to trust the local community to provide policing and expecting others to be bad – killed the good, Trayvon Martin.

The killing of Trayvon Martin reminds us that stereotyping people is always morally risky and often morally wrong. The best foundation for moral judgments (and such judgments are inescapable, Jesus’ purported warning against them notwithstanding) is to look at each person as an individual, assessing that person’s actions as good or bad and beginning the relationship with an expectation that the other is good. Hope for the best from another, even while prudently preparing for the worst.

We expect military leaders to be persons of strong moral character. A U.S. Army staff sergeant faces 17 counts of murder for shooting innocent and unarmed Afghans in a recent rampage. Extenuating circumstances – the man’s fourth deployment in a decade to a war zone, financial pressures, frustrated career ambitions, and alcohol – do not alter the facts: a trained warrior wantonly killed innocent people. (This is not to presume the guilt of the accused, who, under law, is presumed innocent until proven guilty. If the Army arrested the wrong soldier, another soldier is guilty; some American warrior killed those Afghans.)

If the accused is guilty, was he a bad person? Or, more worrisome, was he like you and me: good, but when under a certain level of stress and placed in certain circumstances, acted in an uncharacteristic manner? If the latter, under what level of stress and in what circumstances do you act in an uncharacteristic (i.e., bad) manner?

Monday, March 26, 2012

Class divisions

Libertarian pundit Charles Murray (famous – or infamous, depending upon one’s perspective – for his book The Bell Curve) has recently published a new book, Coming Apart, in which he focuses on white America and argues that a new divide separates and increasingly isolates the upper and upper-middle class from the working class.

Murray defines the upper classes as:

We have 20 percent in the upper-middle class as I've defined it, managerial jobs, professional jobs, college education. Within that group there is the very successful, the top 5 percent of the 20 percent alright, who run the country. Now some of them run the country in terms of their local city, they're influential wherever they live. Some of them just run the country -- period -- if you're talking about Washington, D.C., if you're talking about financial centers in New York, Hollywood, that kind of thing. They are different, they have become different over the last several decades in all sorts of ways. They have essentially a very distinctive culture. They get married a lot later than the rest of the country, they have somewhat different child-rearing practices.

The new upper class devotes incredible amounts of effort to raising their kids but that also includes incredible amounts of effort in getting their kids into the right preschool in some elite communities which I think is going a little bit too far. And they also have given rise to what are called "helicopter parents" because they hover. So there are lots of good things about the way the new upper class raises kids. Pregnant women, if you're a member of the new upper class, and you're a woman, and you have a whiff of pregnancy not a drop of alcohol, not any exposure to secondhand smoke, no drugs, and they take care of themselves magnificently while the child is in utero. That's good! The lengths to which they go is sometimes kind of extreme. I could form a mosaic of these distinctive cases and preferences but you know what? An awful lot of the people who watch the NewsHour know exactly what I'm talking about already.

The average American watches TV about 35 hours a week. Among the new upper class you have sort of two basic attitudes toward TV. One is you still have one, but you use it to watch the NewsHour and "Masterpiece Theatre" and maybe "Downton Abbey." The other says that we don't even have a TV anymore -- that kind of attitude. Well, do I think watching 35 hours of TV a week is a terrific thing to do? Not particularly. But do I think you're shutting yourself off from a lot of American culture if you are so completely isolated from what goes on, on popular TV? Yeah, you are! And if you don't see the movies that other people see, if you don't eat at the same kinds of restaurants, if you don't engage in the same kinds of interest and sports and the rest of it, none of these are terrible things, it's not good vs. bad. It is isolation however of the new upper-class from the mainstream of American culture.

He defines the working class as:

When I'm talking about the white working class, here's what I'm defining: high school degree, no more, and working in a blue-collar job or a low-skilled service job. When I'm talking about the white, upper-middle class, I'm talking about people who work in the professions or managerial jobs and have at least a college degree.

(To read more, cf. Paul Solman and Elizabeth Shell, “Charles Murray on Downton Abbey, Smoking During Pregnancy,” at Making Sense, March 21, 2012)

Where does Murray place you, in the upper classes or the working class? To find out, take his 25-question quiz. His quiz will also give you his appraisal of how well, if at all, you bridge the gap between the classes.

I’ve not read his book, Coming Apart. However, I do find his suggestion of a new class divide in the United States persuasive. I’m unwilling to impose blanket value judgments on the habits of each class. Obviously some habits – watching 35 hours of TV per week or obsessing about a child’s admission to the right pre-school, for example – are unhealthy. Similarly, some habits – getting together with friends or maintaining a reasonable weight – are healthy. But as my examples illustrate, each class has some healthy and unhealthy habits.

What does concern me, more than assessments of particular habits, is the growing polarization that I observe and experience in American society. A social fabric that lacks elasticity will tear sooner under the normal stresses and strains of change. When the pace of change accelerates (as has happened), potential tears become more precipitous with larger consequences.

One of the important functions of religion has been to create community, adding elasticity and strength to the social fabric. Murray’s work suggests, and other research confirms, that religion is increasingly a source of polarization rather than community.

Reversing that trend is easier said than accomplished. Not only have people seemed to become more intransigent but they also identify several issues as litmus tests of people with whom they can cooperate or exist in community (e.g., abortion, gun control, and gay marriage). The answer is not necessarily compromise on deeply held convictions but learning to respect and to celebrate diversity (cf. my previous post on civility, Further thoughts on civility).

Saturday, March 24, 2012

The U.S. social safety net

A couple of statistics illustrate why some of the poor (e.g., those active in the Occupy movement) and some of the middle-class (e.g., those active in the Tea Party movement) are angry about federal expenditures. In 2009, federal benefit programs gave the average American a whopping $6,583. Those benefits have increased by 69% since 2000. In some areas of the country, government benefits are 20% ($1 of every 5) of income. Almost half of all Americans lived in households that received government benefits in 2010.
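For readers who want to check the arithmetic behind those figures, the reported 69% increase implies a 2000 baseline of roughly $3,900 in average per-person benefits (this baseline is inferred from the two reported numbers, not stated in the Times article):

```python
# Back-of-the-envelope check of the benefit figures quoted above.
# The 2000 baseline is inferred from the reported numbers, not given in the article.

benefits_2009 = 6583          # average federal benefits per American in 2009 ($)
growth_since_2000 = 0.69      # reported 69% increase since 2000

implied_2000 = benefits_2009 / (1 + growth_since_2000)
print(f"Implied 2000 average: ${implied_2000:,.0f}")  # about $3,895
```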

Meanwhile, the least affluent households – the bottom fifth in terms of income – received only 36% of all government benefits in 2007, a sharp decline from the 54% they received in 1979. (Binyamin Appelbaum and Robert Gebeloff, “Even Critics of Safety Net Increasingly Depend on It,” New York Times, February 11, 2012)

Some of the shifting pattern is understandable. Government benefits include those for veterans (more likely to go to the middle class) and all Social Security and Medicare payments (for which almost everyone is eligible). However, the safety net’s purpose, in my estimation and that of many citizens, is to care for the most vulnerable, the poorest, and not to replace reliance on individual initiative and achievement for the majority of people.

This chart shows where government spending goes and how that spending has increased over the last fifty years:

Several trends in that chart greatly concern me. First, spending on education is decreasing while spending on law enforcement (police, fire, and jails) continues to increase. Putting more people in prison is not the answer to the nation’s fiscal woes (cf. Ethical Musings: Musing about prison).
Second, defense spending has increased steadily and sharply since 2000. Yet the world is not safer because of the wars in Iraq and Afghanistan, the principal reason for the increase in defense spending (cf. Ethical Musings: Swords into Plows).
Finally, spending on entitlements has increased more rapidly, far outpacing all other expenditures. The problem of soaring Medicare expenditures is especially troubling. The answer is not to eliminate Medicare, which is probably the most cost-effective means of providing healthcare, but to better control healthcare expenditures in total (cf. Ethical Musings: More thoughts on healthcare reform and Ethical Musings: Some events and thoughts worth pondering).
Surveys consistently report that a majority of Americans are willing to pay higher taxes but expect the government to practice better fiscal accountability. The huge deficits the nation faces bode ill for the future. This generation is borrowing from future generations to fund its lifestyle, similar to what happened in Greece, Italy, and elsewhere.
Exploiting future generations is simply not a Christian approach to obtaining the good things that life has to offer. Perhaps we need to have lower expectations. We certainly need a fairer tax system in which those with the largest income pay a larger proportion in taxes. Eliminating most (if not all) deductions and credits would level the playing field. Having everyone with income pay some percentage in taxes, small or great, would help everyone to feel invested in the nation and its economic system.
A strong social safety net expresses the nation’s concern for the most vulnerable and least fortunate. Few people can afford to pay the cost of care when they or a loved one suffers from a catastrophic illness or health problem. Most workers will face a period of unemployment at some time in their working lives. Ensuring a decent, minimal standard of living for the elderly is a mark of a civilized society. Unfortunately, the U.S. social safety net has lost sight of those objectives and now attempts to provide benefits to the majority rather than concentrating on those most in need.

Friday, March 23, 2012

Human cloning - part 3

The argument against cloning humans centers on four objections.

First, all people, including children, are of value in and of themselves. As God continues to participate in the creation of each new human, God creates each person as an individual, intrinsic good. In contradistinction, cloning reduces people from an end to a means. In other words, cloning changes humans from an intrinsic good into an instrumental good, a commodity.

On a micro scale, children become commodities when cloned to assuage guilt, to achieve immortality, or as a warehouse to provide spare parts that match the recipient’s blood and immune system to avoid rejection by the host. On a macro scale, people become commodities when cloned en masse for menial labor, warfighting, or any other purpose. On both a micro and macro scale, regarding people as commodities robs them of their God-given dignity and worth.

Second, human community is biologically and theologically essential for shaping human identity and for respecting the dignity of all people. Simple biological and historical observation suggests that the nuclear family, regardless of the number of adult partners, has most frequently been understood as consisting of one man and one woman, with or without children. Theologically, the book of Genesis is more explicit, recording God’s concern that man should not live alone and thus explaining the creation of woman and the institution of marriage. Jesus apparently quoted those verses with approval emphasizing the permanent, monogamous nature of marriage.

Unfortunately, circumstances such as divorce or sexual orientation may make that ideal impossible to realize. Nevertheless, the nuclear family in all of its many permutations remains the basic building block for community. The extended family expands the nuclear family to include multiple generations and multiple branches of a family within the same generation. Community is created as nuclear and extended families coalesce into the larger whole of a clan, a tribe, or even a nation state. Thus, kinship has been the biological and historical basis for community.

From a Christian theological perspective, community is the body of Christ, the Church. In Christ, all, the living and dead, are connected. When asked about his family, Jesus replies that his mother, brothers, and sisters are his disciples. In our highly mobile and sadly divided society, Christian community has the potential to unite people in ways that both complements and transcends biological kinship.

On the one hand, cloning is inconsistent with biological kinship as the basis for human community. The clone, intended as a replica or extension of an individual, is not a complementary addition to a family or to a larger community. On the other hand, since cloning reduces the clone to the status of a commodity, cloning is also inconsistent with the creation of Christian community.

Third, biological processes have their own integrity as integral elements of God's design for creation. The biological process of heterosexual reproduction ensures genetic diversity. As scientific research has repeatedly confirmed and as our legal codes enshrine, the biblical prohibition against incest protects Homo sapiens from the potentially devastatingly adverse consequences of inbreeding.

Human evolution depends not only upon cooperation, achieved through family and community, but also upon competition. Heterosexual reproduction utilizes competition both in the selection of a mate and in the selective joining of one particular sperm and one particular egg out of the many possibilities. God used competition within the evolutionary process to create humans and, perhaps, will use it to create whatever if any life form develops out of Homo sapiens in the future.

The book of Genesis tells the story of people who sought to become like gods by building a tower that reached to the heavens. To frustrate that intent, God supposedly caused people to speak in a proliferation of languages, making them unable to understand one another and thus frustrating their desire to become like gods. Cloning seems another attempt to become like gods when we have not yet learned to live as moral human beings.

This moral failure is the fourth objection to cloning. Literally millions of children cry out for justice. They lack adequate medical care; they are unwanted, neglected, mistreated, hungry, and enslaved. Each of these children is a living argument against cloning. Jesus loved the little children and taught that the kingdom of God belongs to such as them. Until the children already in the world are truly loved, how can one realistically expect that clones will be better treated?

Scientific knowledge, per se, is neither good nor evil, right nor wrong. Nuclear technology, for example, has been used both to heal and to threaten life. Learning how to clone is morally neutral. Employing that knowledge to benefit humanity through cloning animals is potentially beneficial, though not without its hazards. But there is no valid reason to clone or to attempt to clone humans. Indeed, there are persuasive arguments against human cloning. As the noted ethicist Paul Ramsey wrote, “The good things that men do can be made complete only by the things they refuse to do.”

Wednesday, March 21, 2012

Human cloning - part 2

What about cloning humans? Surely, God is the author of life – but God often acts through human agency. People feed the hungry and heal the hurting, acting as God's hands, feet, and voice. Since the groundbreaking genetic research of Austrian monk Gregor Mendel in 1866, humans have assisted in the creation of plants and animals, shaping their genetic makeup through selective breeding and more recently through genetic modification. That work culminated in 1996 with the birth of the first cloned animal, the sheep Dolly. Why not clone humans?

Four major arguments in support of cloning humans have been advanced.

First, cloning offers people a form of immortality as they “live on” through a genetically identical clone. As already implied, this argument is specious. Clones will not, cannot, be replicas of the original. In no way could a clone ever embody the consciousness of its parent because it would never have the same experiences or make all of the same choices. Eternal life requires the persistence of personality, not simply a similar or even identical physical body. (Identical physical bodies are highly unlikely because of the effects of diet, climate, and other factors on physiological development.)

Second, cloning offers childless couples another option for having a child. Current options include pregnancy through natural means, in vitro fertilization, various fertility drugs, adoption, etc. This argument has the greatest prima facie cogency of the four, though it is seriously flawed.

A child produced through cloning would not be the product of the couple’s love but a shadow of the partner or person who donated the DNA. The clone is more accurately described as a sibling of the DNA donor than as the donor’s child. Couples desperate to have a child, and those who intentionally or unintentionally pressure couples to have a child, need to remember that having a child is not the summum bonum, the supreme good. Human relationships should be founded upon mutual love and commitment rather than a need to produce progeny.

Some cloning advocates suggest that the use of a fertility drug that results in multiple births when a fertilized egg splits several times before development begins is in fact a form of cloning. That analysis is incorrect. The significant difference between such multiple births and cloning is that the fertilized egg has received half of its DNA from the mother, half from the father. This situation is far more analogous to the birth, through entirely natural processes, of identical twins or triplets, than to intentional cloning.

Third, cloning advocates contend that cloning offers couples who have suffered the death of a child, or whose child has a terminal disease, the hope of being able to recreate their deceased or dying child. This concept is akin to suggesting to bereaved parents that they can “always have another,” words intended to comfort. In fact, bereaved parents rightly hear in that message loud, albeit unintended, notes of cruelty. A new son or daughter, cloned or otherwise, can in no way replace, physically or emotionally, one who has died.

Fourth, cloning offers a form of eugenics, as people seek to improve the human race by choosing which people to replicate based on intelligence, beauty, strength, etc. Narcissistic self-cloning is simply a subset of cloning for eugenic reasons: the donor has decided that clones of himself or herself will make the world a better place.

Not surprisingly, despots like Hitler have seen great potential in eugenics (remember the movie, The Boys from Brazil?). Conversely, novelists like Aldous Huxley in Brave New World and many ethicists are more fearful, recognizing that desirable traits may include minimum intelligence and a proclivity for tedious, highly repetitive work making clones like worker bees in a hive, existing only to support a ruling class.

Who decides what are desirable traits? Who decides the right percentage of the population that should have each trait? God, no respecter of persons, has created all people to be of equal value. Eugenics, however practiced, inherently presumes that some people are more valuable than others.

Although the cost and difficulty of cloning place any possibility of wide scale eugenics far in the future, twentieth century experiments with centrally planned economies as in the former Soviet Union conclusively demonstrate that humans lack both the wisdom and foresight to effectively manage any eugenics program. Sin, apparently endemic to the human condition, also dramatically limits our ability to enter optimistically into any eugenics program. Only the truly arrogant can believe that humans have the wisdom and the right to play God.

The third post in this three-part series presents the arguments against cloning humans.

Tuesday, March 20, 2012

Money and relationships

USAA Magazine (Spring 2012, 18-23) has some interesting statistics collected from several recent surveys of the U.S. population:

·         3 in 4 single Americans are turned off by excessive debt

·         Cost of a typical date: $69

·         91% of women say they would marry for love over money

·         On average, a U.S. wedding costs $26,500 and the honeymoon $4,466

·         There were 2.1 million marriages and 900,000 divorces in the U.S. in 2010; 1 in 10 marriages ends in divorce before the tenth anniversary

·         1 in 3 Americans say that the recession has stressed their marriage

·         22% of divorcees say money caused the split; 30% of people admit lying to their partner about money (I wonder how many don’t admit to it!)

·         Projected cost of raising a child born in 2010 to age 18 is $226,920

In the course of my active ministry, I conducted hundreds of couples counseling sessions and dozens of couples’ enrichment groups. Contrary to popular thinking, money is not the cause of most, or even many, relationship problems (obviously, having at least a minimally sufficient income helps).

The real cause of problems, suggested in the statistics above, is the inability of couples to communicate. I’ve known couples with six-figure incomes who fought frequently and inconclusively over money; I’ve known couples with very low five-figure incomes who found much happiness and love in their life together. If people can communicate and commit to the relationship, then they find common values and common goals, and make a shared life not only work but also become a source of mutual flourishing.

Monday, March 19, 2012

Human cloning - part 1

Somatic cell nuclear transfer is the technical name for the process most frequently associated with the word cloning. In this process, a cell, usually an unfertilized egg, is readied in several ways, the most important of which involves removing the cell’s DNA. DNA – deoxyribonucleic acid – contains the genetic code, the instructions that guide the replication and differentiation of cells as they develop into an animal.

Cloning is initiated by inserting the DNA from the animal to be cloned into the prepared cell. This process frequently fails, generally requiring two hundred or more attempts before a prepared cell with its new DNA will multiply and differentiate, producing a clone. A significant number of the failures produce animals with abnormalities, perhaps acceptable with animals but a frightening prospect with humans. Although a human has yet to be cloned, the technology that now exists would seem to pose no barrier to human cloning. The Internet even has a one-page recipe for human cloning.

Science fiction depictions of cloning full-grown animals or humans, as in the movie Multiplicity, are just that – science fiction. Cloning by any method cannot yield even an egg, much less a child, that will grow into an exact replica of its DNA donor. Identical twins have identical DNA and are alike in many ways, yet are distinct and unique individuals. This is because humans, and the same is true of other animals, are determined by factors in addition to heredity. Environmental factors – diet, climate, illness, parenting – all influence the maturation process. Humans also have some measure of free will that also affects their development.

So even if a human were cloned, the resulting individual would not exactly duplicate the DNA donor. Presuming that DNA were available from Michelangelo, Lincoln, Hitler, Einstein, or Schweitzer, there is no assurance that a clone would have the same abilities or values and every likelihood that the clone would be significantly different than the original.

The value of cloning animals derives from their contribution to human welfare; that is, although animals are God's creatures and therefore intrinsically valuable, God also intended them as instrumental goods, i.e., a means to an end. We eat fish, meat, eggs, and other animal products; we wear skins and wool; animals in dozens of other ways enrich human life. Humans have animal pets; animals do not have human pets.

Genetic modification of animals can enhance their value, e.g., as cows are bred to produce more milk, chickens bred to lay eggs with less cholesterol, etc. Cloning animals has the potential to produce large numbers of animals that share the same beneficial traits.

The dangers of cloning animals are twofold: first, as the genetic similarity of a breed increases, so does the susceptibility of that breed to being wiped out in large numbers by disease; second, the consequences of humans and other life forms consuming genetically modified animal products are largely unknown.

The morality of cloning animals largely hinges on one’s view of the intrinsic value of animals and on the answer to a utilitarian question: which is greater, the benefit to human well-being from cloning animals or the dangers to human well-being? Further research will probably provide an answer to that question.

The second part of this three-part post explores arguments in favor of cloning humans; the third part delineates reasons against human cloning.

Saturday, March 17, 2012

Additional musings on death

Peter Goodwin, a family physician and leading advocate of assisted suicide in Oregon, recently committed suicide. Goodwin had been instrumental in Oregon’s fight to legalize assisted suicide. He had a terminal illness, a prognosis of a rapidly diminishing quality of life during his projected six remaining months, and chose to die with dignity. (Stephen Miller, “Right-to-Die Advocate Ends His Life,” Wall Street Journal, March 13, 2012.)

What would you choose if you, God forbid, found yourself in a similar situation? Goodwin’s act was not rash or ill-considered. No hope of an effective treatment existed; each day brought further deterioration to his condition and quality of life. Human actions sometimes prolong life in immoral ways, e.g., wrongly spending enormous sums to extend a comatose person’s “survival” for a day or a week.

Simplistically saying that only God determines when a person dies clearly does not mesh with contemporary reality. Similarly, humans can now create life, e.g., through in vitro fertilization.

For more thoughts on death, read Ethical Musings: Musings about death.

I support organ donation. I’ve prepared the necessary legal documents to authorize donation of my organs upon my death. The gift of organs can be a gift of life to another person. Goodwin’s choice of death and organ donation represents a net gain of life and healing.

However, I recently read a column by Dick Teresi, “What You Lose When You Sign That Donor Card” (Wall Street Journal, March 13, 2012) that raised some concerns. First, society would benefit from having a clear definition of death, a task easier said than done. Teresi’s column unhelpfully muddies the debate by vaguely referring to brain waves emanating from comatose individuals. He does not specify the type of brain waves; he ignores cases like that of Karen Quinlan, whom doctors declared brain dead in spite of some continuing brain activity, who was kept “alive” for years, and whose autopsy, when she was finally declared physically dead, showed significant, long-term brain deterioration.

Second, people should consult with knowledgeable medical experts when facing choices about prospects for reversing terminal diseases, selecting treatment protocols, opting to refuse treatment, exploring the advisability of assisted suicide, and deciding whether a person is brain dead. People should not depend upon religious leaders for answers to these questions. The ethical issue is living abundantly and promoting flourishing, a goal that is not always incompatible with accepting death. Choosing to die with dignity may be the best way to end one’s life abundantly and with maximal flourishing.

Thursday, March 15, 2012

Importance of character

Noted scholar James Q. Wilson recently died. In remarking on his death, a number of commentators called attention to Wilson’s 1985 essay, published in Public Interest, “The rediscovery of character: private virtue and public policy” (available here).

Wilson analyzes achievement in the public schools, welfare and the dissolution of the family, and crime rates. He concludes that public policy alone cannot explain what he observes. Instead, the character of the citizens has changed.

What economics neglects is the important subjective consequence of acting in accord with a proper array of incentives: people come to feel pleasure in right action and guilt in wrong action. These feelings of pleasure and pain are not mere "tastes" that policy analysts should take as given; they are the central constraints on human avarice and sloth, the very core of a decent character. A course of action cannot be evaluated simply in terms of its cost-effectiveness, because the consequence of following a given course--if it is followed often enough and regularly enough--is to teach those who follow it what society thinks is right and wrong.

In other words, character is critical. Morality matters. That concept is not popular in an era of Freakonomics in which many people want to explain human behavior in terms of self-interest and gain. What Wilson’s essay repeatedly underscores is that people do not always act in their own self-interest. Wilson sadly suggests that Americans today are more self-centered than ever before.

From a Christian perspective, unbalanced self-centeredness is sin. Thus, I found Wilson’s essay especially timely during this season of the Church year in which Christianity encourages self-examination and adopting a spiritual discipline.

The traditional approach of adopting a single discipline for the entirety of Lent does not suit everyone every year. Nor is there any reason not to adopt an additional discipline partway through Lent or to begin a new one if you have, for any reason, put aside the one you first adopted.

So, here are some Lenten questions for assessing your own character/morals:

·         Toward what goal or aim is your life directed?

·         What right actions give you pleasure? What wrong actions leave you feeling guilty? How do these feelings shape your character in a positive way, making you more of the person God created you to be? How do these feelings and behaviors help you move toward your goal or aim?

·         Conversely, what right actions leave you feeling guilty? What wrong actions give you pleasure? How do these feelings negatively shape your character, making you less of the person God created you to be? How do these feelings and behaviors move you away from your life’s goal or aim?

·         Is there one habit – just one – that you can change from negative to positive that will transform you (at least a little bit) into more of the person you want to become?

If so, cultivating that habit may be a very worthwhile Lenten spiritual discipline.

Saturday, March 10, 2012

Musings about death

To be born is to die. Or, in words from Ecclesiastes, “There is a time to be born and a time to die.” That truism may seem blindingly obvious. However, many people, as they approach death, seem to live in denial of death’s inevitability.

Over the last forty years, a change seems to have occurred in the medical profession. When I was in seminary, almost forty years ago, the conventional wisdom was that physicians generally insisted on employing every possible means to prolong life. Their attitude suggested that a patient’s death represented professional failure.

Yet a recent survey of medical doctors showed that 64% of physicians, compared to 20% of the general public, had advance directives in case of medical disability or incapacity. Unlike TV programs, in which CPR enables a person to return to a normal life about 70% of the time, in real life only 8% of people who receive CPR survive longer than a month; only 3% return to anything that resembles a normal life. Doctors, perhaps more than any other group, know the limits of modern medicine and recognize that treatments that may extend life briefly usually entail a greatly diminished quality of life. (Ken Murray, “Why Doctors Die Differently,” Wall Street Journal, February 25, 2012.)

Abundant living sometimes necessitates choosing the best balance between quality and quantity of life. No one answer fits everybody; longer is not automatically better. This highly personal choice is a matter of prudential wisdom exercised in consultation with knowledgeable professionals (who provide the best possible summary of facts and probabilities), loved ones (whose love makes life valuable and suffering worth enduring), and spiritual guides (who affirm the worth and humanity of the dying, helping them find courage with which to face death).

Facing death squarely – not yielding an inch prematurely while not futilely tilting at windmills – helps people to die with dignity and love (hence the popularity of the hospice movement).

Facing death squarely can also save the healthcare system huge amounts of money. Approximately 25% of Medicare expenditures fund care during a person’s last year of life. The percentage of chronically ill persons treated by ten or more doctors increased from 30% to 36% between 2003 and 2007. In other words, in spite of doctors having changed their personal attitudes about the benefits of extreme measures to prolong their own lives, the healthcare system continues to treat chronically ill persons as if death were not inevitable and prolonging life at any cost were worthwhile. (Cf. a Robert Wood Johnson Foundation study report.)

In fact, the U.S. spends 50% more on healthcare than any other developed nation, with poorer results. (Ezekiel J. Emanuel, “Spending More Doesn’t Make Us Healthier,” New York Times, October 27, 2011.) Acknowledging death’s inevitability is not defeatism but realism, not pessimism but essential for prudential management of scarce financial resources. Expensive medical procedures that do not significantly extend life with an acceptable quality of living (e.g., an extra week either in great pain or in a drug-induced stupor that is a side effect of pain control medication) simply do not make sense or cents.

Thinking about one’s death is a good annual Lenten discipline. Death can come to anyone, anytime. Being prepared for the unexpected helps one to value the present more and expresses love for the bereaved (this love may take the form of financial planning that includes insurance, end-of-life instructions, advance healthcare directives, a will that identifies guardians for minor children, and perhaps even funeral or memorial service planning). Jesus’ exhortation to John to care for Jesus’ mother upon Jesus’ death represents a form of planning for one’s death (John 19:26-27).

Discussing the possibility of dying – when not a preoccupation – is not morbid but prudential. As with one’s individual thoughts, conversations about death can help people focus on the present, cherish times together (whether long or short), and discuss issues of mutual importance and concern that often are ignored in the press of the urgent.

Death is too important to ignore. Yet that is what happens all too often, at great cost to society, to one dying, and to those left behind.

Thursday, March 8, 2012

Food stamp statistics - part 2

More than one in seven Americans uses food stamps – unless one lives in Mississippi (or four other states and the District of Columbia), where more than one in five people rely on food stamps. (Phil Izzo, “More Than 1 in 7 Use Food Stamps in U.S.,” Wall Street Journal, March 2, 2012.)

Those statistics are alarming. Are U.S. residents really so poor that they cannot afford to feed themselves without government assistance? Other reports that I read indicate widespread malnutrition, much of it the result of people making unwise choices about what they eat. Still other reports indicate that families do indeed struggle to feed themselves, pay the rent, pay for necessary healthcare, and pay other bills.

In my twenties, my wife and I lived for several years on an income that was less than half of the official poverty level. We survived without trauma, partially through good financial management and partially through luck. We had no phone, no TV, and no car. We used public transportation and bicycles. We cooked meals without using prepared foods. We ate relatively little meat and almost no sweets. We bought few clothes and made some of our own furniture. We had health insurance through the schools we attended. (We were both full time students but did not have subsidized housing or other benefits that might have pushed us above the poverty level.) We were lucky because we had no major health issues, were not victims of violent crime, etc. We knew that our poverty would be short-lived. Post-graduation, we expected our incomes to rise with employment. We did not apply for any form of government assistance because we felt an obligation to be self-sufficient as much as we could. In retrospect, the years were hard but formative and without regret. The availability of government assistance provided assurance that if all else failed, we would not starve.

I wonder to what extent people accept government assistance because they want a standard of living they cannot afford, rather than accepting the assistance only when unable to survive on their own. Many of the things Americans take for granted – autos and TVs, for example – are really luxuries, not necessities.

Writing government policy, premised on one size fits all, is challenging (I’ve had to do it when in the Navy). Even so, I suspect that government assistance is perhaps too readily available, not encouraging (forcing?) sufficient self-reliance. When 1 in 7 qualifies for food stamps, something is clearly wrong.

However, that is not the whole story. When 1 in 7 people accepts government assistance, that also suggests self-reliance and hope are waning – warning signs that democracy is in trouble. Democracy does not require financial equality. However, democracy does require that most people be able to participate in the political process as relative equals. That is no longer the case in the United States. I think this is part of the unfocused and sometimes misdirected anger of both the Tea Party and Occupy Wall Street movements.

Pervasive government assistance – whether food stamps, other anti-poverty programs, or middle-class tax credits – bears a striking resemblance to the subsidized grain programs that Rome’s pre-imperial elite used to pacify the city’s poor. That story had a bad ending: empire replaced democracy.

Is the United States moving in a similar direction?

Avoiding that fate requires reintegrating the wealthy and poor into the social fabric. For the poor, that means building self-reliance and hope for a better life in which present sacrifices are realistic steps to a better future. For the wealthy, that means more commitment to society, i.e., higher tax rates and more direct involvement (serving in the military rather than only as political leaders).

As a Christian, I am committed to political self-determination (i.e., democracy), self-reliance (this expresses human dignity), and healthy interdependence (this includes both a social safety net and recognizing that no person is an island). Unlike the partisan voices that characterize so much of contemporary political and religious discourse, Christianity insists on balancing all three of those values.

Tuesday, March 6, 2012

Jobs and defense spending

A defense industry trade association, the Aerospace Industries Association, has calculated that more than one million U.S. jobs might be lost if defense spending cuts reach the one-trillion-dollar level. The study, reported in the Wall Street Journal, does not indicate the time frame for the one trillion in cuts. (Nathan Hodge, “Tank Plant Takes Cover amid Military Cuts,” March 2, 2012.)

The article reflects the excessive emotion entangled in defense spending debates. First, the U.S. does not spend a trillion dollars per year on defense, and the article never specifies the period over which the cuts would occur.

Second, the one million jobs include not only an estimated 350,000 jobs directly affected but also jobs lost among suppliers, stores in which defense workers shop, etc. In fact, a cut in defense spending will reallocate employment away from defense-related industries and into other industries, because the money not spent on defense will instead fund other government programs, be retained by taxpayers rather than given to the government, or (to the extent that the government deficit-finances its procurement) free up funds for other investments.

The pain of economic adjustment is real. However, money spent on defense procurement is worthless unless the equipment sees actual combat use or is necessary to deter a war. In other words, post-adjustment the nation gains by spending less on defense and more in other ways.

Third, the world has probably seen its last major tank battle. Buying equipment not required for actual defense needs in no way enhances national defense. Separating economic arguments from defense issues is imperative if the U.S. is to break free of the grip of the legislative-military-industrial complex that puts its own interests ahead of actual national interest. Similarly, pushing other nations to buy high-priced equipment they do not need (e.g., the Saudi jets that sit rusting, unused, on Saudi airfields) represents a bankrupt foreign policy designed to benefit narrow special interests at the cost of the public good.

Sunday, March 4, 2012

When we encourage Bible reading

The volume and variety of responses to my last Daily Episcopalian post, Encourage People to Read the Bible? Maybe not (also posted at Ethical Musings on Feb 19, 2012), suggest that I wrote about a vital and controversial issue. An essential follow-on question is: How should Christians read the Bible? The answer to that deceptively simple question may help to identify differences between the norm and how Christians actually read, or recommend reading, the Bible.

For at least a century, The Episcopal Church (like most other Churches) has insisted that its seminarians learn the historical-critical method for reading and understanding the Bible. An implicit, if not explicit, premise of seminary biblical studies and other courses is that the historical-critical method is the preferred, if not the recommended or even the normative, approach to reading the Christian scriptures.

Yet, after graduating from seminary, many clergy default (revert?) to other ways of reading and interpreting scripture. Exegesis employing the historical-critical method is time-consuming, hard work for which many parish clergy feel under-prepared and of whose necessity or utility they are unsure. Historical-critical exegesis can also challenge some long-held and popularly cherished interpretations, e.g., that the story of Jesus feeding the multitude reflects post-resurrection theology rather than factual history. Consequently, clergy tend to use scripture in daily morning and evening prayer (whether privately or as a public service), formation programs for children and youth, and adult studies in a manner that presumes that readers/hearers will understand the text’s meaning with little or no effort.

Presuming that casual reading of scripture (i.e., devotional reading not complemented by historical-critical study) can be uplifting and formative, while preaching requires solid exegesis, entails an oxymoronic dichotomy. On the one hand, scripture’s meaning is apparent and easily grasped when encountered in the context of a prayer office (apart from preaching). On the other hand, scripture’s meaning requires solid exegesis – even for a text that is part of the daily office lectionary – when expounded in preaching. A cynic might characterize this apparent inconsistency as clerical hypocrisy indicative of a lack of integrity, or as clerical hubris indicative of believing that laypeople lack the ability or faith commitment to master and use the historical-critical method.

My ruminations repeatedly prompted reflections on how other “people of the Book” (a Muslim phrase that includes Jews, Christians, and Muslims) read their scriptures. Unlike some people who attempt to straddle religious traditions, I’m very clear about my identity as a Christian. I’m a committed Christian, not a Jew or Muslim. On the other hand, unlike some Christians who think that we can learn nothing from other religions and non-Christians, I’ve often found that examining my beliefs and practices from multiple perspectives brings clarity and fresh insights.

Islam is riven by a sharp divide over how to read the Koran. Most Muslims today, as has been normative for centuries, read and interpret the Koran in the context of its history of interpretation. Various schools of jurisprudence (a term that reflects Islamic emphasis on the Koran, God's recitation to Mohammed, containing God's commands for people) provide the continuing conversations that help Muslims rightly understand what God's timeless words mean in the present.

In sharp contrast to that approach, Salafists believe that only the Koran and Hadith (the compilation of Mohammed’s words and actions not included in the Koran) are useful in understanding how people today should obediently submit to God. Salafist schools often teach only the Koran; well-meaning but ignorant instructors sometimes teach highly individualized interpretations as definitive. Unsurprisingly, these groups interpret Islam in ways that occasionally diverge radically from mainstream Islam.

For example, the Koran teaches that men and women should dress modestly. The Koran also instructs women to cover themselves with an outer garment when they leave their house. However, neither passage directs a woman to cover herself completely. Radical Islamists often require that women cover themselves completely based on Mohammed instructing his wives to hide behind a curtain. In keeping with longstanding Islamic tradition and jurisprudence, most Muslims believe that this latter guidance applied only to the Prophet’s wives, not to all women.

About 85% of Muslims are Sunnis, who have no authoritative clergy. Denying the value of centuries of Islamic juridical scholarship has multiplied individual interpretations and had the unanticipated result of producing extremist movements that include al Qaeda and the Taliban.

Turning back to Christianity, I find the analogues strikingly clear and horrifying. A few terrorist groups self-identify with Christianity, e.g., Operation Rescue, which targets abortion providers and bombs abortion clinics. These allegedly Christian groups, like their Muslim counterparts, justify their crimes with idiosyncratic readings of scripture. Mercifully, scripture study leads few Christians to become violent terrorists.

However, appallingly large numbers of self-identified Christians inflict terrible emotional and spiritual damage on others because they, like Muslim Salafists, reject their religion’s mainstream normative approach to reading and interpreting scripture in favor of individual interpretation guided by the Holy Spirit. These Christians include those who argue that women should be subordinate to men, all homosexual behaviors are sinful, effective child discipline requires generous and frequent doses of corporal punishment, and caring for the environment is unimportant.

No analogy is perfect. Christianity has had a dynamic, evolving approach to interpreting its scripture. Thankfully, the Church no longer regards allegory as a key interpretative principle. Yet from the second century forward, allegory figured prominently in reading and interpreting all of scripture. Similarly, after bruising controversies (e.g., with Galileo), the Church began to move away from a literal reading of the text toward a more complex reading informed by multiple disciplines (history, linguistics, psychology, science, philosophy, and so forth), tradition (i.e., a continuing conversation among God's people), and reason (to include experience).

I’m not arguing that scripture and its interpretation are properly the exclusive prerogative of the clergy. In any case, widespread literacy and access to the Bible and other materials prevent that from happening again. Nor do I want to adopt something akin to the Roman Catholic Church’s teaching magisterium.

I am arguing that Christians rightly use the historical-critical method to read and interpret scripture. Engaging in that endeavor requires effort and education; it also entails dialogue with the Christian community, directly (e.g., conversation) and indirectly (e.g., reading commentaries). I wonder what the Church might look like today if substantive biblical study using the historical-critical method replaced the pabulum that widely passes for religious education. Every parish could, indeed should, regularly offer substantive Bible study for all ages that teaches and uses the historical-critical method, empowering people to read and seek to understand scripture.

Judaism teaches that God gave the scriptures, particularly the Torah, to Israel. The scripture does not belong to an individual but to Jews collectively. Interpretation, therefore, belongs to the community rather than to individuals. Rabbis are not priests but Jews who have received an education in Torah, devoted themselves to the study of Torah, and to whom the Jewish community grants authority to teach because of that education and devotion. Judaism reads and interprets its scriptures through an ongoing dialogue among living rabbis – with scripture, with the rabbinical tradition of interpretation, and with one another. This communal interpretive process explicitly recognizes that Jews today read the scriptures within a very different context than the one in which Israel received its scriptures from God.

Episcopalians, thanks be to God, are not Baptists or Pentecostals. Unlike many in both of those traditions, we believe in the importance of an educated clergy. We don’t ordain the uneducated, naively trusting God to guide them when they teach and preach. It’s time that we also believed in an educated laity. Only then will we honor both their calling as God's ministers and the Christian heritage of reading scripture informed by multiple disciplines, tradition, and reason.

Friday, March 2, 2012

Christian realism

New York Times columnist David Brooks and his family recently hosted a Chinese foreign exchange student. The student gave him a tie as a gift. To his surprise, the tie was not manufactured in China but was one that Indiana Governor Mitch Daniels had taken to China as a gift on a recent trip to promote trade. Apparently, the tie had been sold and then re-gifted. (David Brooks and Gail Collins, “David’s Awesome Tie Story,” New York Times, February 15, 2012.)

Perhaps pessimism about eventual Chinese dominance is less warranted than many pundits think. Globalization creates interesting, unpredictable linkages. China, for example, may have more to lose from waging a war to gain world dominance than U.S. advocates of increased defense spending believe.

Collins and Brooks used the tie story as a springboard for Brooks to complain about the lack of big solutions proposed by President Obama. I wonder if Brooks is wrong to want a politician – of whatever political party – whose agenda is one of major solutions to major problems.

Big solutions will require Congressional action in a way that is very unlikely to occur, given the Senate's deep divisions and rules that allow a minority to obstruct passage of most legislation.

Small steps in the right direction(s) are doable, and much better than nothing. Perhaps Obama has learned a constructive lesson from his first three years - pushing for major change (e.g., healthcare) is too costly, too polarizing, and unlikely to happen again. Healthcare may be the signature issue of his presidency.

As Christians, God calls us to live in the world of the possible guided by the desirable, i.e., the ideal that God envisions for the world. Utopianism is unrealistic. Similarly, realism unguided by Christian ideals is no longer Christian. Looking for small steps of the possible, incremental progress toward a better world, is perhaps the essence of Christian realism.