Thursday, October 29, 2015

Tiny house trend

Tiny houses are dwellings of between 100 and 400 square feet. Tiny homes are currently experiencing a surge of popularity in the US.

By way of contrast, from 1973 to 2013, the average size of a new home in the US increased by 1,000 square feet while the space per person doubled as the number of people per family shrank. (Mark J. Perry, "Carpe Diem," American Enterprise Institute, February 26, 2014)

Tiny houses appeal to people for several reasons:
  1. Many tiny homes are mobile, allowing people to keep the same house if they relocate.
  2. Tiny houses have relatively tiny prices, generally under $100,000 and often less than $50,000. Growing numbers of people find 30-year mortgages burdensome, limiting their options fiscally and in other ways.
  3. Tiny houses represent a smaller environmental footprint and thus contribute less to climate change, global warming, pollution, and other ecological harms.
  4. Tiny houses force occupants to focus on developing a less material lifestyle, allowing more time for relationships, self, etc.

I like the tiny house trend. All four reasons commonly cited as explanations for why tiny houses appeal to people are ethically commendable. However, two other factors are also important.

First, the tiny house trend encourages people to consider how much space one really needs in a domicile. I have repeatedly observed people spending money on large houses. Had many of these people made alternative choices about the size of their dwelling, they would have had the resources (money and time – big houses not only cost more to buy but also require time and money to maintain) to achieve some of their other goals in life. Perhaps I'm sensitive to this issue having spent most of my childhood in a large New England colonial that emotionally owned my father.

Two extremes can help bracket the average amount of square footage per person that is ethically justifiable. On the one hand, the Soviets estimated that each person should have 100 square feet of living space. That feels too small for me, and probably does for most people. On the other hand, the National Association of Home Builders has calculated that the average US house grew from 1,400 square feet in 1970 to 2,700 square feet in 2009. The Census Bureau reported that the average US household had 2.58 persons in 2010. This means that, on average, a person in the US has approximately 1,046 square feet of living space.
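
For readers who want to check the arithmetic, here is a minimal sketch in Python; the constants simply restate the NAHB, Census Bureau, and Soviet figures cited above:

    # Back-of-the-envelope check of the living-space figures cited above.
    avg_house_sqft = 2700      # average US house size, 2009 (NAHB)
    avg_household_size = 2.58  # average US household size, 2010 (Census Bureau)
    soviet_norm_sqft = 100     # Soviet estimate of per-person living space

    sqft_per_person = avg_house_sqft / avg_household_size
    print(int(sqft_per_person))                           # 1046 square feet per person
    print(round(sqft_per_person / soviet_norm_sqft, 1))   # 10.5 times the Soviet norm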

The tiny house articles/reports I have seen generally indicate one or two persons living in the tiny house. A family of three or four will undoubtedly find 100 square feet too small. Conversely, two persons living in 3,500-plus square feet – as my partner and I did in Raleigh – is excessive and morally wrong.

Regrettably, people who want less house sometimes have few good options because US house builders traditionally prioritize size over quality. I bought the smallest house in Raleigh that offered the quality I sought. Thankfully, housing in Hawaii reflects a strong Asian influence. My Honolulu condo, with less than 1,200 square feet, is mid-size for here. I know we will have enough space; I wonder if we will have too much space. Extra space inevitably entails extra costs in time, money, and environmental impact.

Second, the tiny house movement has the potential to combat excessive economic inequality. Wealthy people who choose a modest dwelling instead of a gargantuan mansion take steps to reduce the geographic and cultural barriers that create unethical class distinctions. Disavowing the pursuit of opulence in favor of a more modest lifestyle encourages oneself and others to develop an abundant life consonant with Ethical Musings' themes and values. Finally, the tiny house movement, if adopted by organizations and persons concerned about social justice, has the potential to reduce poverty, homelessness, and several other social evils.


The tiny house movement is no panacea, nor do I advocate that everyone join it. However, the tiny house movement can be a catalyst for ethical reflection and for making choices that lead to more truly abundant living.

Monday, October 26, 2015

Mercy trumps judgment

Over the last several weeks, a Roman Catholic synod of selected cardinals, bishops, and archbishops from around the world has met in Rome. The synod addressed several controversial topics related to Roman Catholic teaching about the family, e.g., same-sex marriage, the ban on divorced persons who have remarried without an annulment of their prior marriage receiving Holy Communion, and who has the authority to forgive a woman for the sin of having had an abortion.

Those topics reveal the real gulf that separates Roman Catholicism from progressive Christianity.

Strictly theological issues that people were once willing to fight to the death to defend (or oppose) are now sidelined, e.g., the correct formulation of the relationship between Jesus and God. Revealingly, the Episcopal Church and the Orthodox Church are moving toward agreement to adopt the version of the Nicene Creed that states the Holy Spirit proceeds from the Father rather than from the Father and the Son. The conversation has attracted little notice and, apart from a handful of theological zealots on both sides, has not generated much of a response beyond a yawn.

The sidelining of theological issues represents the inroads of scientific thinking and post-modernism. People are increasingly aware that theological statements are at best human efforts to encapsulate the ineffable and infinite in finite, human terms of reference.

Instead, the real issues that separate people today are ethical and ecclesiological. The latter, which include the ordination of women, clerical celibacy, and open communion, constitute barriers that artificially limit participation and exclude people. In a world increasingly committed to justice, all of these policies evoke strong opposition.

Ethical issues, among which are same-sex marriage, abortion, and the status of divorced persons, also represent sharp divides between contemporary followers of Jesus. One of the attendees, Cardinal Robert Sarah, who is from Guinea and leads the Vatican’s Congregation for Divine Worship, told the synod, “What Nazi-Fascism and Communism were in the 20th century, Western homosexual and abortion ideologies and Islamic fanaticism are today.” The Cardinal's comment, an obviously heated rhetorical excess, underscores the breadth of these ethical divides.

Why is there this tremendous difference between the disinterest and lack of passion about strictly theological issues and the concurrent interest and energy invested in ethical and ecclesiological issues?

I have two answers. First, in the twenty-first century, diminishing numbers of people accept revelation as a definitive, exclusive source of knowledge. Revelation in this context also connotes the Roman Catholic Church's teaching magisterium, which posits the Church's authority to speak definitively on behalf of God. Spiritual insights stand alongside knowledge and information gained from other perspectives, such as the scientific and historical. Unlike strictly theological issues, these complementary perspectives can have much to say about ethical and ecclesiological issues.

For example, a person's sexual orientation is not a matter of choice but birth. Defining the precise moment at which life begins is impossible; no evidence exists for ensoulment (God putting a soul into the new cell at the moment an egg is fertilized). Ergo, abortion is not and cannot be, by definition, murder. Women and men differ because of their sex but are in all other respects the same. Arguing that an ordained male can represent Christ in a way impossible for a female to emulate is nonsensical, trivializing the priestly role as one defined by gender.

Second, institutions and humans too often prefer judgment to mercy. Pope Francis is calling the Roman Catholic Church to return to Jesus' thematic emphasis that mercy trumps judgment. Progressive Christians – at least theoretically – try to incorporate that theme in their ethics and ecclesiology. Twentieth century German-American theologian Paul Tillich argued that Christianity has a continuing need for reformation (his Protestant principle) precisely because of our human proclivity for judgment and our expanding knowledge base.


I'm not optimistic about the odds of the Roman Catholic Church undergoing significant internal reformation in my lifetime. I see its current struggles as a reminder that I too live in a glass house: instead of throwing stones, may God always help me to embody mercy rather than judgment.

Thursday, October 22, 2015

Upholding the law

While walking in Waikiki recently, I heard a police officer say to someone with whom he and another officer were having a conversation, "I choose not to enforce that law." Presumably, the officer found the law in question either morally objectionable or too difficult to enforce. The latter option seems unlikely. If the law were too difficult to enforce, I think that the officer would have phrased his comment differently, suggesting, for example, that the law is unenforceable.

The incident prompted some musings about a question I had never before considered: Should police officers choose which laws to enforce?

In fact, no jurisdiction enforces all of its laws. First, laws everywhere are too numerous for every police officer to know every detail of every law. Furthermore, court decisions can add specificity or alter the meaning of laws; legislative bodies also enact new laws. Thus, even if a police officer learned all of the details of every law in her/his jurisdiction, the officer would continually need to monitor court cases and legislation in order to have a current knowledge of the law. Time spent this way would be time away from actual policing. The law's complexity and changing content explain why most lawyers choose to specialize in a particular aspect of the law. A police officer can only enforce laws of which s/he is aware.

Second, violations are too numerous for the police, with their current workforce, to enforce every law constantly. A police officer positioned at a traffic light might spend most of the day writing tickets (running a red light, failure to wear a seatbelt, using a handheld mobile device while driving (illegal in Hawaii), speeding, jaywalking, littering, etc.). Assigning that type of duty to an officer precludes the officer from dealing with more pressing issues (e.g., responding to an alleged assault or theft) and might give citizens the feeling of living in a police state. Few citizens would support hiring enough police officers to enforce all laws constantly, even if that goal were a possibility.

Third, each level of law enforcement consequently already exercises some discretion about which laws to enforce, when, and under what circumstances. Judges have perhaps the least discretion. Facing an overwhelming caseload, a judge may routinely dismiss certain types of cases; judges frequently push attorneys to reach a plea agreement before appearing in court and to submit the agreement to the judge for review and approval, perhaps only a cursory review. Prosecutors, with limited staffs and budgets, have great latitude in deciding what charges to file against each defendant. And police officers make daily choices about the best use of their time and efforts. Some crimes attract such media attention, or the public regards them as so infamous, that law enforcement has little room for discretion, e.g., child molestation and murder. For other crimes, public attitudes and media attention may actually inhibit aggressive enforcement, e.g., the use of cameras to catch drivers who run red lights. But much of the time, each echelon of a police department's hierarchy will exercise some discretion in deciding which laws to enforce.

In sum, police discretion is unavoidable and reminds us that we do not live in a police state. Thus, the real question is not whether the police should exercise discretion but how the police can best exercise that discretion ethically. Here are four guidelines.

First, police officers should prioritize public safety. Laws against murder, assault, kidnapping, and so forth are obvious examples. Less obvious examples, in which the police must exercise some measure of discretion, are traffic laws – such as those requiring drivers to stop at red lights and prohibiting texting while driving – that keep people safe on the road. Still less obvious examples are laws that establish sanitary regulations, building codes, and other rules intended to ensure public health and safety.

Second, police officers should make protecting property less of a priority than protecting public safety. People are more important than property. Tangentially, a woman who had a valid license for carrying a concealed weapon recently drew her weapon and fired at the vehicle of shoplifters fleeing a Home Depot store. The woman's shots thankfully did not injure anyone, including innocent bystanders. The attempted use of deadly force to stop thieves who pose no physical threat to anyone is morally outrageous.

Third, the police should assign their lowest priority to laws intended to make life more enjoyable and to violations that neither endanger public safety nor destroy property. For example, governments may have a valid moral interest in banning the homeless from sleeping overnight in a public park or in prohibiting begging. Those activities, however, do not inherently endanger public safety or property, and the police may appropriately assign enforcement a low priority.

Fourth, the police should exercise their discretion and refuse to enforce unconstitutional laws. Nineteenth and twentieth century Jim Crow laws that mandated treating people differently based upon race are historical examples of clearly unconstitutional laws. I suspect that the officer whose remark prompted this post was referring to laws limiting the number of consecutive hours that a person may spend on Waikiki's streets, parks, and beaches. The city intends those laws to keep homeless people from establishing temporary dwellings in the midst of the city's tourist and business districts. The laws have ignited much controversy because the city has not made a commensurate provision for assisting and housing its homeless population.

Reflecting about police discretion in deciding which laws to enforce has triggered three final thoughts.

First, communities and citizens can beneficially engage in public discourse with the police on these issues. Good policing is partially a function of police responsiveness to public concerns. Public discourse and openness about the use and shape of police discretion also exert a healthy influence on the police to maintain honorable and morally defensible policies. Police officers who walk a beat can get to know the local community in urban areas in a way that police officers who cruise in patrol cars can never achieve.

Second, New York City at the end of the twentieth century aggressively enforced laws against relatively minor infractions (loitering, graffiti, public drunkenness, shoplifting, vandalism, etc.). The purpose of this campaign was to interdict offenders before they committed more serious crimes. Crime rates in New York City declined, but crime rates across almost all of the US declined during that same period. Researchers have not been able to determine whether New York City's much touted enforcement efforts were actually effective. Good research has the potential to improve policing by informing the public and law enforcement about which crimes lead to more serious or repeat offenses and where the police can best utilize their limited resources.


Finally, reducing crime is no panacea and is sometimes more costly than addressing the underlying problem(s). For example, many drug addicts commit minor crimes to obtain money with which to purchase drugs. Incarcerating addicts reduces crime rates but at a significant cost to society. Mandating treatment and improving addiction treatment options are less costly than warehousing addicts. In Hawaii, displacing homeless persons from one area of an island to another area on the same island achieves little. A community helping homeless people transition back into society produces a win for the homeless, the community, and business. Besides, helping people is the right thing to do!

Monday, October 19, 2015

Do you have a moral obligation to die?

Legislation, most recently in California, legalizing the assisted suicide of a terminally ill person has prompted some musings about whether a person ever has a moral obligation to die.

Most persons can envision a situation in which they believe that a person has a moral obligation to put oneself heroically in harm's way for the sake of others. We celebrate the passengers of United Flight 93 who, following the hijacking of their plane by the 9/11 terrorists, refused to allow the terrorists to use the plane as a weapon of mass destruction. The passengers who decided to attempt to regain control of Flight 93 almost certainly recognized that their effort might result in one or more of their deaths. Similarly, some persons – for example, those in the military and police – have sworn their willingness to go into harm's way for the public good. This post does not address these issues.

The indigenous cultures of some Native American peoples, especially in the Arctic, anticipated that an elderly person would decide when s/he had become a burden to the community, that is, when the person no longer was a net contributor to communal well-being. Once s/he reached that conclusion, the culture expected the person to make their farewells and then one night to slip away, to die quietly in the dark and cold. This ethic seemed especially understandable for people whose life in the harsh Arctic allowed only a slim margin of safety and few extras.

Life in the twenty-first century is more complex. In retirement, our culture no longer expects that people will contribute to the communal good. Instead, we hope that retirees will have the opportunity to enjoy life more freely and fully than they could during their working years. Many retirees, I hasten to add, substantively and generously contribute to the common weal through gifts of time, talent, and treasure.

The fortunate few live long, prosper, do good things, enjoy their golden years, and then die peacefully in their sleep sometime after marking their centenary.

Unfortunately, increasing numbers of people – some of whom have lived long, prospered, done good things, and enjoyed extended golden years – die after a lengthy decline and protracted, costly medical treatment. Approximately 28% of Medicare spending, for example, goes to patients during their last six months of life. Spending those funds on younger patients would unquestionably achieve, on average, greater results in terms of improved quality and length of life. In other words, many elderly persons, because of their medical condition, place disproportionately large demands on the community. Do these persons have a moral obligation to commit suicide?

My answer has four parts.

First, the idea that God determines when each person will die is incredible (i.e., unbelievable and untenable) in the twenty-first century. If a merciful God actually held each life in God's hands, then literally millions of people, young and old alike, would not have died from excruciatingly painful and often easily preventable causes. Furthermore, arguing that God determines when everyone dies implicitly denies that one human ever has any responsibility in the death of another human. Although the biology of life is very poorly understood, death results from natural and not supernatural causes.

Second, killing another human is morally wrong. Even morally justifiable killing (a police officer shooting a mass murderer as the only way to prevent more deaths, for example) is an evil, albeit morally justifiable. Under no imaginable circumstances in the twenty-first century is killing an elderly person because s/he represents too large a drain on community resources morally justifiable.

Third, death is the natural end of life. Cherishing life is understandable and right. But no life, regardless of medical or other scientific progress, will endure forever. Accepting the reality of death can bring a person to the final stage of growth. Living in denial of death prevents one from appreciating life's transitory nature and rightly valuing each moment.

Fourth, individuals have an inherent but not absolute right to decide when to die. No person is an island. Our relationships with family and friends are integral to our human identity. The young adult who does not have a terminal disease yet who commits suicide acts as if s/he has an absolute right to decide when to die, abrogating the rights of others to be in relationship with that person. Such suicides point to undiagnosed or untreated mental illness and broken relationships.

States are slowly recognizing this inherent but not absolute right to decide when to die by legalizing assisted suicide for the terminally ill. These laws both respect the sanctity of life (nobody has the right to kill another) and recognize that death is the natural end of life. These laws also value the quality of life as much as its duration. Only the individual can determine when the quality of his/her life has diminished to the point where that life is no longer worth sustaining.

Individuals already have the option, in most jurisdictions, of refusing medical treatment if it appears that the treatment is likely to result in a significantly diminished quality of life. Describing this refusal as a moral obligation to die may paint an extreme picture. However, the phrasing dramatically and insistently underscores that healthcare is not an unlimited social good. Resources expended on one person thus become unavailable to help others.

In other words, many of the healthcare choices commonly viewed either as exercising one's right to care or as governed by the principle that every effort should always be made to preserve life are actually utilitarian decisions. In these utilitarian decisions, an individual should weigh not only what is good for the person but also what is good for loved ones and the larger community.

Illustratively, presume that you are 95 and your doctor recommends that you receive a heart transplant. Important factors in whether to accept or to reject that recommendation include not only the projected effect of the transplant on your quality and length of life but also how your having a transplant would alter the options available to other, younger and perhaps healthier, patients who need a heart transplant or other costly healthcare.

My choices may be mine to make, but that does not mean those choices have no consequences for others. Living abundantly occurs only in community; living abundantly is possible only when I value the lives of others as highly as I value my own life.

Thursday, October 15, 2015

Columbus Day

My friend, Chuck Till, recently wrote on his blog that the US should re-envision Columbus Day as Indigenous Peoples' Day. He carefully caveated his remarks, noting that he intended no insult either to Italian-Americans (who provided the impetus for creating the holiday in 1934) or to Christopher Columbus who, whatever else he may have been, was assuredly an intrepid sailor. As a sailor, I echo Chuck's respect for Columbus. As a child, and now as an adult, I could never fathom the disdain that parts of my family had for one of my great aunts who had married an Italian-American. Diversity enriches rather than harms life.

First, I agree with Chuck. Replacing the Columbus Day holiday with Indigenous Peoples' Day is a good move. Each year fewer people and businesses seem to observe the holiday. The European exploration and conquest of the Americas had at least as many negatives as positives, e.g., the coerced relocation of African slaves and the subjugation/extermination of indigenous peoples through war, famine, and disease. An annual commemoration of Indigenous Peoples' Day is a step that the US can take to present a more comprehensive, balanced, and accurate interpretation of its history than the one widely taught in the public schools, i.e., heroic explorers and highly moral, courageous settlers striving to create the Promised Land out of virgin wilderness.

Second, I want to reply to a query from an Ethical Musings reader some months ago about whether the US could make amends to Native Americans. Nobody can undo the past. Nobody can make amends to injured people now dead. Nor does anyone have the wisdom to set right present wrongs caused by yesterday's actions. In other words, I very much doubt that we can make amends to Native Americans for wrongs done during the past four centuries.

However, the question of how to make amends – analogous to campaigns to win apologies to the present generation from governments and others for sins committed by ancestors – focuses attention in the wrong place.

To the extent that we celebrate the conquest of the Americas, regardless of what we say or imply about indigenous peoples, we perpetuate the lie that the invaders were superior.

To the extent that an Indigenous Peoples' Day contributes to ending that lie, the holiday will be constructive. To the extent that an Indigenous Peoples' Day increases awareness of the wrongs done, intentionally (e.g., enslavement) and unintentionally (e.g., spreading disease to people who lacked immunity in ways not then understood), the holiday will also be constructive. To the extent that an Indigenous Peoples' Day motivates persons, organizations, and governments to promote wholeheartedly and effectively equal respect for all, the holiday will be constructive.

Instead of apologies, let us stop perpetuating the evil. No way to amend the past exists. However, we can amend how we treat people who suffer in the present from a legacy of past mistreatment such that we alter the future for them and for us. Affirmative action is sometimes one positive means of accomplishing this goal. More fundamental is valuing others – their culture, their ethics, their humanity – as much as we value ourselves.

I don't pretend to have remedies to the problems that plague indigenous peoples. I do recognize that treating them as second-class citizens (the reservation system) has produced a slew of evils (high rates of alcoholism, crime, and poverty, to name just three) and no visible benefits. Native American reliance on revenues from licensed gambling operations has perpetuated rather than ended this second-class status, generating new revenues without significantly improving quality of life for most Native Americans.


If debating whether to establish an Indigenous Peoples' Day will focus national attention and concern on these continuing problems, then that debate will have achieved more than has the commemoration of Christopher Columbus for the past eighty years.

Monday, October 12, 2015

Seeking peace in Syria

Headlines have chronicled several recent developments in Syria:
  • US efforts to train fighters from rebel groups opposed to Syria's current government have failed;
  • Russia has employed aircraft and cruise missiles against ISIS;
  • US munitions, intended for rebel groups, appear to have fallen into the wrong hands;
  • ISIS' grip on significant portions of Syria and Iraq remains strong in spite of an extensive bombing campaign by the US and its allies that in the last year exceeded the amount of ordnance dropped by air in either Iraq or Afghanistan during a five-year period.
In sum, efforts to displace Assad have stalled and efforts to eliminate ISIS as a major force in the Middle East have failed. Indeed, the US and its allies appear to be achieving results that are the opposite of their goals: instead of contributing to the establishment of peace, well-intentioned but misdirected efforts are exacerbating violence and instability and harming thousands.

Concerned individuals and groups can contribute to building peace in Syria and the Middle East by advocating that governments adopt policies and programs designed to bring security and stability, diminish violence, and improve the quality of life for people who might otherwise join the flood tide of Middle Eastern refugees.

Among the positive actions that the US and its allies might take in the Middle East are:
(1)   Weighing non-combatant safety and security more heavily in decisions to authorize airstrikes (even persons opposed to any use of military force should be able to support this diminution of violence);
(2)   Creating a Syrian "no fly" zone to limit the ability of Assad's regime to harm or intimidate its citizens;
(3)   Debunking the myth that better training or arms will compensate for the widespread corruption and lack of commitment among Iraqi security forces;
(4)   Adequately funding and safeguarding refugee camps in and around Syria;
(5)   Supporting quality of life and self-determination efforts of people in the Middle East.


Advocating these or other moves need not presume either the expertise or prerogative to prescribe solutions. Instead, concerned individuals and groups can best function as catalysts who try to keep governments energetically focused on building secure, stable communities, diminishing violence, and improving people's quality of life.

Thursday, October 8, 2015

Anonymity and legacy

Perhaps it's being in transition. Perhaps it's moving some distance from family. Perhaps it's reading Victor Hugo's monumental Les Misérables. Perhaps something else has been the catalyst, but recently I've been musing some about legacies and the anonymity with which most people live and die.

Consider two individuals who were not anonymous and who were near contemporaries two millennia ago. First, Julius Caesar imposed himself on the shaky structures of the Roman republic, transforming it into an empire. Some of his writings survive, his image is memorialized in sculpture and on coins, and the month of July is named in his honor. The twenty-first century world is certainly different because of Caesar, but without knowing what this century would be like had Julius Caesar never lived, it is difficult to specify the differences attributable to him.

Second, Jesus of Nazareth left no writings, and no actual likeness of him survives, if one was ever made. Yet the world is assuredly different because of Jesus. In some way – scholars debate virtually every detail – Jesus' relationship with his closest followers so moved them that after his death they formed what began as a new Jewish sect and quickly morphed into a new religion, Christianity. Claims that Jesus rose from the dead are the most facile explanation of what happened. Most non-Christians reject that claim. And among Christians, diverse, contradictory explanations of Jesus' alleged resurrection have contended for adherents, persisting in spite of efforts by an orthodoxy established in the fourth century to suppress all competing views as heresy. Other explanations of Jesus' life-altering effect on his original followers usually emphasize his personal charisma.

The world is better and worse because of Jesus. By at least one historian's count, religion caused approximately ten percent of all wars. Presume Christianity caused a substantial portion of those wars. Christianity also has contributed to the evils of colonialism, racism, sexism, etc. Conversely, Christianity has inspired great altruism that has stopped wars, fed the hungry, cared for the sick, motivated educational and charitable organizations, inspired support for human rights, etc. Assessing the magnitude of the evil attributable to Christianity seems a simpler task than assessing the magnitude of the good attributable to it. Much of the evil is both visible and specific: the number of people killed or injured, the amount of property damaged, etc. Of course, the injury to minds, with the follow-on second or third order effects, is impossible to quantify. Conversely, measuring the number of lives saved or bettered by a physician who cares for the sick because of Jesus is a much more difficult calculation: the number of the physician's patients may be known, but the percentage who would have died if not treated by that physician is indeterminable. Furthermore, the good done to minds, with follow-on second and third order effects, like the evil done to minds, is unquantifiable.

I expect to die in anonymity, even as I have happily chosen to live in anonymity. My writings, much as I might occasionally wish to the contrary, will soon pass into oblivion, even on the internet. The few extant likenesses of me (photographs, sketches, digital images, etc.) will soon disappear, lose any tag that identifies the likeness with me, or pass into the hands of people who never knew me and have no interest in preserving my memory. I will happily give my allotted 15 minutes of fame to any successful claimant.

Children are the most common way in which people hope to leave a mark upon the world. Jesus sired no known offspring. Julius Caesar's biological children all died at a relatively young age; they are no more than footnotes to his life. Christianity remembers Jesus' parents; Caesar's parents are forgotten. In both cases, neither man would have changed the world had it not been for his parents. In common with an increasing number of people in the developed world, I will leave no progeny.

Consequently, the relative handful of people I have known in my life (they total in the thousands, but on a globe populated by seven billion people, this is a relative handful) constitute the most probable way in which my living will have made a difference. Nobody has the wisdom and knowledge to identify, much less quantify, the good – and the inevitable even if unintentional evil – that I have done. Incidentally, some cultures have employed the idea of an all-knowing being who rewards the good and punishes the bad (e.g., God, according to some Islamic, Christian, and other theological traditions; Santa Claus in folklore) to motivate good behavior and dissuade putative miscreants.

If biologists are correct and genes have an inherent drive to perpetuate themselves, then humans – an arguably unique species because of our limited autonomy and spirituality – have a similar, inherent drive to perpetuate ourselves through some form of legacy.

What is the legacy you wish to leave?

Do you wish the world to remember you as a statesperson, military leader, author, inventor, artist, or something else?

Do you wish the world to remember you personally or simply to leave the world a different, hopefully better, place because you lived?


Do you wish your legacy to be like that of Jesus, where the individual is forgotten (Christian theologians describe this as kenosis, self-emptying), and the lives of others changed for the good (the abundant life that so many of those who live in Jesus' name continue to experience)?

Monday, October 5, 2015

The power of illusion

In a couple of previous Ethical Musings posts (Fear of failure and living abundantly and Further ruminations on the fear of failure), I reflected on some of my observations of military and government bureaucracy. In this post, I explore the effect that those security measures (armed patrols, sentry stations that permit public access, and different levels of force protection at different gates to a single military facility) have on outsiders.

To many casual observers, the measures appear to improve the military installation's security. That is, the measures create an illusion of security. Dedicated, putative miscreants (imagine a hardcore terrorist, for example) can easily replicate my observations. Such individuals would presumably also observe the times and patterns (if any) of patrols, the type of weapons carried, potential fields of fire, whether any entering vehicles are ever searched, and so forth. At best, these security measures reduce the already very low probability of a non-dedicated putative miscreant harming someone (imagine an unhappy but mentally healthy teen). However, I know from multiple conversations over many years with many people that the visible security measures, no matter their actual effectiveness, give the majority of people an illusion of security.

Similarly, some knowledgeable counterterrorism officials believe that the Transportation Security Administration (TSA) is almost completely a waste of federal money. Air travel is safer because flight crews and passengers are committed to never again allowing a passenger plane to become a weapon of mass destruction. (To learn more, read my book, Just Counterterrorism.) Yet, the TSA gives many passengers the feeling that air travel is safe. We humans tend to prefer the illusion of security to the reality that life is inherently vulnerable and that all of us share a common fate; the only unknown is when we will die, not whether we will die.

Illusion – the belief that something is true – is powerful. Norman Vincent Peale (the power of positive thinking), Robert Schuller (the power of possibility thinking), and numerous others have capitalized on this power by creating popular self-help programs.

Conversely, an illusion may become self-limiting. A person may begin to believe that he/she is incompetent at a particular task or at living in toto; or a person may hear from others, and then from his/her own conscious mind, that he/she is inferior, second-rate, or of less value than others.

Illusion detached from reality is indicative of mental illness (imagine the person who, believing he/she can fly, leaps off a tall building). Alternatively, the person so mired in an ugly reality that she/he has no vision of a better future is condemned to a miserable subsistence that can never become abundant living (imagine an incest survivor trapped in endlessly reliving memories of those horrific experiences).

Spiritual leadership consists in large measure of being a catalyst to help people imagine a better future for themselves, a future grounded in reality yet one that pushes the individual to become more alive and more loving, and to realize his/her potential more fully. A mentor, friend, parent, or stranger may provide this leadership through words, actions, or even a chance encounter. Similarly, what a person reads, hears, or sees may also be a source of spiritual leadership, transforming the person's life. Religious traditions value their scriptures because so many persons have found engaging those scriptures to be a catalyst that opened a new perspective on life.


How does illusion function in your life? What are your illusions? Are your illusions grounded in reality? How do your illusions limit your growth or serve as a catalyst to help you become more fully human?

Thursday, October 1, 2015

Aloha to the Anglican Communion

The Hawaiian word aloha, since the nineteenth century, has come to have three meanings in English. Each meaning is applicable to the future of the Anglican Communion.

First, and most consistent with the word's Polynesian roots, aloha may mean love, peace, or compassion. Members of the Anglican Communion, all members of Christ's body, appropriately have feelings of love, peace, and compassion for one another. The conflicts of the last two decades within the Communion have tested, strained, and, sometimes, broken those bonds. However, genuine aloha should set the tone for relationships between the churches, leaders, and individual members of the Anglican Communion.

Second, aloha also means hello. The Archbishop of Canterbury, the Most Rev. Justin Welby, has convened a January 2016 summit of the primates of the Anglican Communion's constituent churches. He has also invited the head of the Anglican Church in North America (ACNA) to attend part of the meeting. Heretofore, the ACNA has been excluded from Anglican meetings. Geography, historical ties to the Church of England's missionary efforts, and ongoing communion with the see of Canterbury – not a group's use of the word Anglican or theological/liturgical claims of being Anglican – have defined who is and is not Anglican.

Times have changed. Canon Giles Fraser of St. Paul's Cathedral in London contends in a column in The Guardian that the internet and hypertext sealed the fate of a hierarchy being able to define a group's theological identity. In my experience, few people in the pews of US Episcopal congregations or those of the Church of England understand, much less care about, the Anglican Communion. Anglicanism has always been a muddled approach to Christianity, as Andrew Gerns at the Episcopal Café has editorialized. So, let's say hello to a new model of being Christian together, one that forsakes structural and doctrinal unity for promoting communication, broadening horizons, honoring differences, seeking commonalities, and together incarnating God's love as and when possible.

Third, aloha also means goodbye. It's time to bid farewell to efforts to develop an Anglican covenant and perhaps to the Lambeth convocations of bishops. The former is, in nautical terminology, dead in the water: Archbishop Rowan Williams' commendable efforts to preserve the Anglican Communion by establishing minimal doctrinal and structural unity failed. The latter, the Archbishop of Canterbury convening a gathering of all Anglican bishops once every ten years, was during the nineteenth and twentieth centuries the best mechanism for preserving ties within the Anglican Communion. However, as much as individual bishops value their Lambeth experiences (and many do), new options now exist for creating ties between members of the Communion that would involve significantly more people at a much lower cost, e.g., the multiple ways to establish relationships at all levels using the internet.


Instead of wasting time and energy bemoaning the demise of the old, saying aloha – both goodbye to the old and hello to the new – represents a constructive step forward.