What You Should Think About Greed

December 1, 2007

“Greed is a bottomless pit which exhausts the person in an endless effort to satisfy the need without ever reaching satisfaction.”

–Erich Fromm

Roaming the blogosphere provides observers with no shortage of comedic sights. People just making their first efforts at articulating political thought may give off that adorable vibe one senses in little children trying to get their hands around an adult tool. Then again, if the message is sufficiently hateful, they may instead give off that disturbing vibe one senses in little children trying to get their hands around a loaded firearm.

A prime example of this phenomenon is the vast array of instances in which the Gordon Gekko “greed is good” speech is cited as a compelling defense of cutthroat capitalism. Anyone of sound mind who actually watched Oliver Stone’s Wall Street couldn’t come away seeing Gekko as a hero. No doubt he was a complex and charismatic character, and in the context of the dramatic moment, his speech was an appeal to idealize cutthroat capitalism. Yet in the broader context of the film, the character was clearly a villain willing to inflict enormous pain and suffering on others simply to add even more money to his own uselessly vast personal fortune.

As a character, Gordon Gekko was an exceptional man. For example, he referenced Sun Tzu in a business context without completely mangling the application of that ancient wisdom. Ordinarily, when an executive or manager invokes that sagacious general, it is just before pulling a monumentally stupid move. The Art of War has much to teach about abstract strategy. Yet it also has much to teach about overseeing armies of slave-conscripts in the midst of deadly struggle. Since that text became fashionable with the MBA crowd, it has done far more to promote the view that free workers engaged in constructive endeavors should be treated like slave-conscripts than it has done to refine the abstract strategic thinking of business leaders.

Likewise, if his words are to be believed, Gordon Gekko was right to call for sweeping reform of management at Teldar Paper. When a private enterprise is taking huge losses while continuing to pay out large salaries to an unproductive legion of executives, change is warranted. Yet his actions represent the exception that proves the rule, not the rule in action. In reality there are plenty of bloated Teldar-style corporate bureaucracies and very little pressure to link pay with performance. Insofar as reality’s Wall Street drives business trends, it promotes the search for cheap labor abroad rather than real restraint in the realm of executive compensation packages.

If popular beliefs about the benefits of cutthroat capitalism were remotely true, then bloated executive payrolls at unprofitable companies would be rare and short-lived. Instead of the tight correlation between productivity and executive compensation that capitalism is thought to promote, year after year in the U.S., executive pay raises consistently outpace overall economic growth, sometimes by a substantial multiple!

Architects of modern capitalism often spoke of “enlightened self-interest.” In a world where capital market activity involves long-term investors taking an active interest in the management practices of companies they partially own, there is at least the theoretical possibility that greed will tend to promote good business practices. We do not live in such a world.

Much of the activity in capital markets today is short-term speculation. Investors seek to cash in on growing trends or public reaction to world events rather than build wealth by supporting wise executive stewardship. Even institutional investors, like mutual fund managers, are no longer likely to hold long-term positions of any consequence. Though he is held in much esteem amongst financiers and tycoons, virtually no one today emulates Warren Buffett’s practice of actively monitoring the management practices of companies supported by his investments. He has become a living relic of a bygone era.

The end result is an economic dystopia alluded to in Oliver Stone’s film. Corporate raiders gain control of a business, loot it for saleable assets, trim workforces, further cut payrolls by replacing experience and skill with untrained novices, declare an impressive short term profit, then laugh all the way to the bank. It is a strong driving force in the “race to the bottom” — a steady decline in the quality of goods and services. While financial insiders get new piles of money to place alongside the piles of money they already possessed, thriving businesses are gutted and abandoned.

Depending upon economic elites to be consistently enlightened in their pursuit of self-interest makes no more sense than depending upon hereditary aristocrats to be consistently enlightened in the exercise of despotic power. The core idea behind the establishment of the United States was that exploitation of the many by the few could be disrupted through periodic redistribution of power according to the results of a nationwide public process. While this approach to political power is widely embraced, it is hard to imagine many Americans endorsing a similar approach in which wealth would be periodically redistributed.

In fairness, there are some good consequences to economic inequality. Then again, there is a vast range of possibilities between absolute economic equality and cutthroat capitalism. Right now, only the empty promises of trickle-down economics answer any demand for social justice. Just imagine the outrage if a group of hereditary political aristocrats answered calls for democratization with flimflam about how their immense personal privilege will eventually spill over to empower ordinary citizens.

Wealth and power are not the same things, but no society that has accommodated extreme concentrations of wealth has found ways to prevent them from also serving as extreme concentrations of power. Some would defend a greed-based competitive paradigm with assertions about human nature. “Communism didn’t work because people are too greedy” goes the propaganda point woven deeply into the fabric of American culture.

If we actually governed ourselves with total deference to human nature, then we might also argue that medicine doesn’t work because people are too vulnerable to sickness, or that policing doesn’t work because people lack a natural tendency to comply with statutes. Government is not about shrugging and letting the worst of human nature determine the course of human events. Government is about taking collective action to restrain the worst of human nature and promote the advance of civilization. In short, it is a means to transcend “the law of the jungle.”

Because of the strong connection between wealth and power, a system that is dedicated to rewarding the effective actualization of personal greed is a system that empowers the selfish, the dishonest, the shortsighted, and even the larcenous among us. It should come as no surprise that the end result is a tendency for public officials to wallow in the spoils of legalized bribery while looting the public treasury to subsidize their coconspirators in the military-industrial complex.

I do not contend that an absolute equality of wealth is either attainable or desirable as a matter of public policy. Yet I do contend wholeheartedly that anarcho-capitalist ideology is an immensely harmful set of ideas rooted in falsehoods. These falsehoods are sustained by popular personalities who manage to combine shoddy analytical skills with an excellent ability to manipulate the emotions of their gullible admirers. Even as American civil rights, environmental conditions, economic vitality, public morale, and global prestige are being devoured by the politics of selfishness, millions of voting citizens will perpetuate their habit of endorsing a candidate based chiefly on false promises of fiscal restraint.

I have no trouble agreeing with the consensus of religious teachings that greed, far from being good, should be regarded as anathema. Yet I also acknowledge that, in specific contexts, subject to reasonable restraints, greed can be useful. We do not need to take “self-interest” entirely out of the American way of life, but it is long past time to restore “enlightened” so as to strike an optimal balance. This balance permits collective action to solve real problems facing real people in our own times while also permitting individual action to produce individual rewards. Decades of policy from dozens of prosperous nations prove that this balance can be achieved. All we need in our own society is the will to make it happen.


What You Should Think About Existentialism

November 30, 2007

“In the struggle between yourself and the world, side with the world.”

–Franz Kafka

In some circles, existential philosophy has the reputation of an angry teenager. Yet that reputation is different in crucial ways from the “nobody understands me” cliché of adolescence. The trials and tribulations of a typical experience with puberty are made to seem more intense by a host of physical and chemical changes. Little by little, young people must cope with a range of adult issues for the very first time. Like teenage acne, teenage angst is unsightly yet also perfectly natural and well understood by adult outsiders, most having endured it themselves.

By contrast, many critiques of existentialism do not stem from any sort of genuine understanding. It is one thing to have passing encounters with notions like individualism and uncertainty. It is a much different thing to delve into the profundities of the human condition without any ideological safety blanket. Many are the clumsy critics, mauling great works of existentialist thought with interpretations bereft of nuance. Rather than embark on a lifelong journey of learning and personal growth, they wallpaper over great mysteries with conformity and faith.

Understanding existentialism begins by understanding the futility of asserting useful absolute knowledge. We can only be ourselves. Even much of what we know of ourselves comes through flawed perceptions and imperfect communications. All the knowledge we possess of entities beyond ourselves is also a product of those perceptions and communications. Then there is the ever-present prospect of faulty inference.

To uphold any teaching as beyond dispute is to assert inhuman perfection exists within human belief. Yet this process does not end where it begins. Accepting the general limits of human understanding is a major step toward transcending the limits of any specific tradition or doctrine. Insofar as existentialists have any particular aim, it is to liberate the human mind from the circumstance of life as a moral marionette. However uncomfortable a question with no answer may be, it has clear advantages over dedicated entanglement in the threads of popular false narratives.

When existentialist ideas were emerging in the 19th century, even ivory towers were populated predominantly by people convinced that questions of morality yielded to certain answers rooted in traditional beliefs. To people firmly anchored in a particular religious or cultural worldview, it is unpleasant to confront the suggestion that life is packed with unknowns and unknowables. From Apollo’s chariot to literal interpretations of Genesis, it seems human nature to favor even outright implausible narratives over comfortable coexistence with the unknown.

Much of existentialist thought is concerned with philosophical deconstruction. This is no haphazard obliteration of all that has come before. Martin Heidegger, among others, favored the term “Abbau.” Perhaps the best metaphor for this process involves the architecture of a growing city. To deliberately level the entire place would be enormously harmful. Yet selective demolition of edifices that are not useful in the present is an essential activity that clears space for new projects that serve new needs.

Abbau offers us a minor paradox in that it is at once destructive and constructive. Just as decrepit brick buildings are best dismantled to make way for towers of glass and steel, invalid or obsolete ways of thinking are best abandoned so as to make way for more realistic and useful beliefs. It is a creative form of destruction, as the absence of dogma and falsehoods is itself a phenomenon worthy of creation. It also facilitates further constructiveness to the degree that accepting uncertainty establishes a foundation for later acceptance of novel information.

Existentialists are often accused of discarding all of tradition in order to embrace amoralism or nihilism. Yet this accusation can only be born from some simple-minded interpretation of philosophy. If anything, existentialists encourage the pursuit of knowledge about other moral and philosophical beliefs. After all, it is dogmatic thinking that causes the vast majority of human thought to be discarded as heterodox. It becomes much less difficult to assimilate the vast diversity of worthwhile human wisdom after recognizing the profound limitations of all human wisdom, including those beliefs one holds most dear.

Centuries earlier, the dawn of modern astronomy prompted ecclesiastical authorities to persecute, even kill, people guilty of no greater heresy than challenging official church doctrine on the nature of heavenly objects. Thus it should come as no surprise that existentialist writings condemning absolute faith in religious morality provoked, and in some circles continue to elicit, incendiary hostility from devout worshipers. The rise of secular governance, especially Western civilization’s embrace of free speech as a human right, protected men like Friedrich Nietzsche and Søren Kierkegaard from state-sanctioned reprisals for controversial publications.

Those two individuals have a peculiar part to play in the story of existentialism’s rise. Both struggled with inner demons even as they displayed outright genius in the analysis of human morality. If there is any real link between nihilistic brooding and existentialist philosophy, it is not in the actual message of existentialist philosophers but rather in the darkest moments of human drama endured by its pioneers.

Neither of them actually espoused nihilism. Kierkegaard was a devout Christian who observed (as is also apparent today) that a vast gulf divides the teachings of Jesus from the deeds of those most vocal about acting in his name. Yet Kierkegaard was also an existentialist. He granted that his faith was a personal choice rather than a logical conclusion, and he never lost touch with an inner struggle between faith and doubt.

By contrast, Nietzsche leveled many powerful broadsides at the core of religion. His command of religious history conspired with a rapier wit to make his works especially provocative. Even as he wrote about the folly of being certain in beliefs, his literary voice conveyed a merry prankster’s boldness. Traditional thinkers were insulted simply by seeing sacred teachings linked to the ancient myths from which they were so clearly derived. Adding ridicule to the mix helped to shake some readers out of mental malaise even as it afflicted some critics with obsessive hostility.

To some degree there is a link between Buddhism and existentialism. Some Buddhist teachings promote stark honesty regarding the human condition. Others emphasize the importance of arriving at beliefs as a continuous process of searching for personal enlightenment transcendent of any established doctrine.

Yet existentialism is no religion. In fact, it actively discourages the kind of orthodoxy that comes with most organized religious activity. The central lesson existentialism teaches regarding religion is that whatever wisdom priests and scripture may contain should be given due consideration right alongside wisdom that contradicts the assertions of clerics and holy texts. The search for insight is also a search for the will to let go of the false security provided by attachments to tradition, faith, conformity, nationality, etc.

Existentialism does not offer a path to the easy satisfaction of transcending doubts. This is good, because that easy satisfaction is the progenitor of dangerous zeal. By acknowledging that the human condition simply does not permit an absolute escape from the unknown, existentialism offers a means to become comfortable with abundant mystery. It shines light on the illusory nature of the comforts of dogmatic belief. By acknowledging the real limits of human knowledge, the stage is set for a rebellion against tradition.

Through this process of rebellion, guided by awareness of human limitations, it becomes possible to constantly refine one’s own beliefs, moral and otherwise. Few people find it controversial to assert that lifelong learning is better than settling for an outlook firmly fixed long before life’s end. Yet few also understand just why and how an adaptive personal approach to morality has more to offer than an inflexible doctrinal approach.

Existentialist philosophy offers a long, and occasionally absurd, journey to the frontiers of human understanding. Still, it seems unsound to me to avoid this journey. Attributing infallibility to any particular tradition or teaching can only retard personal moral growth. If there actually was a creative thought process driving the birth of the universe or the development of its inhabitants, it seems clear that this process left human beings with the capacity to think for ourselves. With or without a God watching over us, it seems better to exercise that capacity for moral reasoning than to settle for uncritical adherence to beliefs promoted by others.


What You Should Think About Pacifism

November 29, 2007

“From pacifist to terrorist, each person condemns violence — and then adds one cherished case in which it may be justified.”

–Gloria Steinem

Even in more tranquil times, there is no shortage of commentary meant to remind non-violent citizens that legions of trained killers stand at the ready to provide security for the nation. No doubt much of human history reveals that force of arms provides a means to keep a hostile enemy out of a nation’s heartland. Yet more circumspect analysis also demonstrates that force of arms provides a means to produce hostile enemies. Could it be that there is more to achieving a security goal than having the most guns or the best fortress?

The bizarre state of the world in the aftermath of America’s “headless behemoth” foreign policy provides a new perspective on some old ideas. From the earliest clashes in military history, there have been questions about the justification for war. No one remotely acquainted with the realities of warfare could carry on without any doubts about the endeavor, even if military culture vigorously promotes thoughtlessness in this arena.

To be fair, soldiers in the thick of it are more effective if no weighty political cogitations distract from the urgent business at hand. Yet this same culture so useful in the field also has drawbacks. Once the fog of war has cleared and some opportunity for reflection presents itself, this mindset creates difficulty reconciling doubts raised by the experience of waging war with political justifications for the violence.

Since ancient times, it has been common for a head of state to have extensive personal experience with military service. Thus the entire history of governance is heavily influenced by, if not a “might makes right” attitude, at least a “having might is more important than being right” attitude. In Europe (sans Switzerland and a few other pockets of exceptional thoughtfulness), from the Middle Ages to the middle of the 20th century, it was accepted that a genuinely defensive stance was inadequate. Responsible governance was presumed to include cultivating enough military might to fight alongside allies, lend credibility to aggressive posturing, and project force to distant lands.

Even today, blatantly stupid ideas like “war is good for the economy” or “war is essential to driving technological progress” are widely believed. Centuries upon centuries of social paradigms make it such that questioning or contradicting these unsound assumptions is regarded as a sign of weakness. It may be that the negative response is as much primal as it is cultural. Yet it surely is not intellectual.

There may be a subset of human beings who are best able to achieve their potential in some context provided by war. Yet to promote war as a means of promoting human achievement is downright senseless. Many of those who have achieved great things in a wartime context were just as capable of achieving great things in some peaceful pursuit. More to the point, surely that portion of humanity inclined to thrive in warfare is not a strong majority. Then, even if I were mistaken about that point, how much innocent blood may be spilled in the name of creating a militant environment for human achievement? Could the inspirations of war ever exceed the lost loves and labors of lives cut short by the consequences of combat?

War for war’s sake is only a good thing to the degree that someone has developed a profoundly misguided notion of “good.” Yet there remains the matter of defense. Wherever there is prosperity or power available for the taking, there is the risk that aggression will occur. George Orwell is often credited with asserting, “we sleep safe in our beds because rough men stand ready in the night to do violence upon those who would do us harm.” To someone just beginning to attain the first glimmers of enlightenment, such a statement seems to suggest that peace and prosperity rest on an essential foundation created by awesome military forces ready to lay waste to prospective national enemies.

That assessment comes from an ignorance of the interconnectedness of all things. Did a sniper stuff the pillows on which this peaceful sleep occurs? Did a gunboat pilot assemble the frame of the bed? Was the mattress put together by an artillery crew? Was the heating and plumbing that makes our homes comfortable invented by a team designing killing machines? Were our city streets planned and paved with the oversight of combat-hardened generals? To turn the simple-minded interpretation of Orwell on its head: dedicated warriors eventually find safe places to sleep away from the battlefield because most everyone else stands ready to perform constructive and creative activities on their behalf.

For too long, the darkness of tribalism and barbarism has lingered in our modern institutions. In the halls of power, even from the lips of those who avoided service themselves, characterizations of military forces as “the backbone of our society” are sincere. Yet they are also archaic and misguided. If we accept that military organizations are the essential core of strength our society possesses, then we define our greatness chiefly by our power to kill and destroy. I would think even an overwhelming majority of military personnel would hope for a more noble perspective from national leaders. Alas, this affliction remains severe in the United States, and it is hardly absent from other nations in the modern world.

Even amongst warriors, the trait of being peace-loving is correctly regarded as a virtue. Yet when it comes to absolute pacifism, hawks, chicken hawks, and plenty of doves all seem willing to agree that it is foolish. Personally I agree that there are plausible scenarios in which defense of others or defense of self justifies actions intended to neutralize a real and imminent threat. Yet no small part of the pacifists’ wisdom is understanding how incredibly rare these situations are if you do not make it your business to instigate or escalate hostilities.

An absolute pacifist runs the risk of doing wrong by failing to take the most effective course of action in protecting the innocent. Everyone else runs the risk of doing wrong by performing willfully destructive actions that do not serve any protective purpose. Which is the greater risk?

In the personal context, fluid situations and instantaneous needs can lead to situations where thoughtful reflection is not an option. Within limits both reasonable and practical, there should be some tolerance for honest mistakes. In an international context, however fluid the situation, opportunities for contemplation are usually abundant. To go to war when the underlying facts are not subject to thorough investigation or the stated cause(s) are unreasonable or the overall plan is unrealistic is to perpetrate the very worst sort of mistake. Only a team of lazy minds paired with dark hearts could let the desire to order an army to do violence take priority over the moral imperative to avoid unnecessary warfare.

Perhaps absolute pacifists are fools. Yet if we see clearly, then we see that life makes fools of us all. There is much more to be learned from the fool who thinks differently than from the fool who echoes our own thoughts. When we cut through useless divisiveness, we are left recognizing that abhorring violence is innately rational, perhaps even innately good. While we who are not absolute pacifists set about establishing the grounds on which we would support acts of violence, there is much benefit to be found in considering the very best arguments against those acts. If we cannot even face the questions of those who condemn all violence, how can we possibly believe our own justifications for it are legitimate?


What You Should Think About Skepticism

November 24, 2007

“Just think of the tragedy of teaching children not to doubt.”

–Clarence Darrow

By the time the ancient Greeks took to formalizing thoughts on belief, they also managed to formalize thinking on doubt. An influential thinker from Elis named Pyrrho managed to witness firsthand many conquests of Alexander the Great. Some might argue that this association caused later scholars to place undue emphasis on Pyrrho’s legacy. Yet it was at least worthy of some note.

The man wrote no great philosophical work, but as with Socrates his students would boast of their association and labor to recall Pyrrho in his own words. This leads to historical accounts that blend earnest recollection with distortions meant to serve the agenda of philosophers promoting their own ideas. Still, it seems clear that the heart of Pyrrho’s teachings was that uncertainty is sound and right in ways that certainty cannot be.

His aim has been characterized as “emotional tranquility,” and he advocated suspension of belief. To him an ideal state of mind, given the term ataraxia, involved having no beliefs. Of course there is an amusing contradiction here. How does one pursue this ideal of having no beliefs without holding the belief that it is an ideal state of mind?

One account mocks Pyrrho as requiring the constant attention of handlers to prevent him from walking off cliffs or stepping in front of horsecarts due to an inability to place stock in his own perceptions. In reality the man was almost certainly more reasonable. If his teachings were not as severe as the most extreme account, then they too may be thought of as a reasonable response to the problems of argumentative acrimony and divisive conflict.

In a world where life could be taken at the whim of a leader, flexibility in belief offers some survival value. In a world where political disagreements may tear a society apart, flexibility in belief may insulate an individual from the anxiety and pain of being an active partisan. Yet there seems to have been more to Pyrrho’s teachings than this. He laid the foundation for recognizing just how loosely beliefs may be anchored in reality.

From the ancient past to the modern world, this complex relationship has been the subject of much discourse. Robert Nozick was never shy about considering “brain in a vat” scenarios. Since all we know of the universe comes to us through our perceptions, there is no way to establish with absolute metaphysical certainty that our experiences are not part of some simulation that eclipses an underlying reality in which the individual is actually a brain in some alien stimulation and life support system.

It is fair to argue that what we know, and even beliefs about what we are, follow from the imperfect results of perception and analysis. Our senses and our minds can “play tricks on us,” leading to beliefs that depart considerably from what is real. Yet this sort of thinking can be taken too far. I favor use of the phrase “skeptics’ infinite regress” to describe this phenomenon in which a retreat from the very prospect of useful knowledge is fueled by contemplation of possibilities that are supported chiefly by appeals to the limitations of evidence . . . as opposed to something like evidence itself.

Walking off cliffs or into traffic because we doubt the reality of those phenomena seems as sure to be foolish as the term “foolish” is sure to be meaningful. I believe the greatest extremes of skepticism can rightly be pigeonholed as philosophical novelties rather than essential insights. Yet the broader phenomenon is clearly not useless. Just as an inability to believe would be crippling, so too would be an inability to doubt.

Belief and doubt are both fundamental phenomena that shape the way thinking beings relate to the world around them. The ultimate value of this thinking is heavily influenced by the degree to which belief and doubt are used to seek truth (or the degree to which they are used to avoid it). For example, someone with a deep emotional connection to a particular political perspective on a scientific question may level doubt at even the most rational analysis while eagerly offering up belief whenever it provides an opportunity to confirm a predisposition.

To use the term “skeptic” while engaged in irrational defense of a long-held viewpoint is somewhat misleading. This behavior is less an exercise of true skepticism than it is an exhibition of passionate belief. Mainstream perspectives on the 9/11 attacks, global warming, evolutionary biology, etc. are met with a great deal of “skepticism” but very little of the judicious manifestations of doubt that come with real skeptical thought. Zealous adherence to contradictory beliefs masquerades as much more reasonable than it actually is.

The best application of skepticism is not in challenging beliefs one opposes, but instead in challenging beliefs one holds dear. A modern day skeptic does not cower behind a wall of media carefully selected to reinforce a single ideological perspective. Rather, the exercise of skepticism in our times involves reaching out to a diverse assortment of sources in the careful search for the genuine insights people with different opinions might possess. Faith in falsehoods follows from fixation on the findings of one faction.

You do not need to feel every individual raindrop to know that a storm is wet. Yet you also should not believe it is raining just because someone pissing on your leg tells you that it is so. To my personal chagrin, contemporary philosophical literature seems to trend toward novelties like the “brain in a vat” scenario more than it delivers practical wisdom like the appropriate uses of, and limits on, skepticism as practiced by a functional human being.

It can be argued that disengaging from reality has its uses. While subjected to torture, clinging to fantasy may be useful as a survival mechanism. Embracing it as a form of entertainment may also be satisfying. In times of great emotional distress, it can be argued that belief that could not withstand skepticism is a more desirable alternative than accepting great loss or crumbling in the face of great peril. Yet for any situation where measured philosophical discourse is appropriate, it seems clearcut that nothing offers better outcomes than making the best possible effort to engage with reality.

The limits of perception and cogitation provide us all with good reason to question our own beliefs. That so many of our beliefs are filtered through many others’ perceptions and cogitations makes this sort of questioning all the more worthwhile. It is the true skeptic who seeks out the best challenges while constantly learning from encounters with unexpected information. There are also terms for people who wallow in a single group’s orthodoxy and self-congratulations, but to me it seems misleading when they call themselves “skeptics.”


What You Should Think About Abundance

November 22, 2007

“Abundance never spreads; famine does.”

–Zulu proverb

To describe economics as “the study of scarcity” is reasonable enough, as far as gross oversimplifications go. On the other hand, to follow such an assertion with arguments about the world itself being nothing but a set of scarcities is just plain wrong. In many instances demand for a good or service does exceed supply. Yet there are also instances where it does not. As most of the nation overeats alongside family and friends on this Thanksgiving Day, it is hard to overlook one form of American abundance.

Agricultural outcomes in the United States have about as much to do with market economics as a commercial airliner’s safe landing has to do with the shifting winds. By that I mean disasters may occur naturally, but on balance it is a planned activity. Agricultural policy does not so much influence as directly shape the menus in our restaurants and the inventory on supermarket shelves.

Not only does this engineered abundance have trade advantages — it also addresses vital security and public health concerns. Despite all our science and technology, harvests can still be fickle. If an unexpected blight or a bad turn in the weather devastates output from a particular region or with a particular crop, others will have the strength to pick up the slack. However vulnerable depending on foreign oil may make us, it would be an even greater vulnerability to become dependent on foreign food.

This is not to say that agricultural imports lack a legitimate and vital place in modern agricultural policy. Imported foods contribute to dietary diversity, which tends to be a healthy phenomenon. Given our own surpluses, we retain the option to turn inward in time of emergency. Also, save for art and media, food commerce may be the most culturally influential form of trade. All in all, trading food with our neighbors in the world is good for us, good for them, and good for our relationships as well.

Still, because of the safety provided by ample production, it has long been American policy to support domestic food abundance. Everyone has the potential to benefit. In regions where delicacies are produced, less is consumed locally as staples can be shipped in from afar for less than the value of goods from a fresh harvest. This is a benefit for people rich enough to incorporate delicacies into their daily existence. For everyone else, it means that food prices tend to be low and agricultural price shocks caused by nature can be avoided through reasonable dietary change.

Today there is new thinking on American agricultural abundance. Serious policy analysts do not dignify anarcho-capitalist twaddle about going unplanned and exposing our national stomach to the full force of the elements. However, there is much talk of revising planning guidelines in order to address the obesity epidemic. Policies established in the first half of the 20th century are still shaping the food intake of Americans in the 21st century. In a complex “chicken and egg” relationship, growing consumer interest in a healthy diet is accompanied by growing expert interest in agricultural policy reforms.

These reforms would shift focus away from heavily processed items and promote health by making whole foods more available to consumers. If the alliance of officials and corporations pushing for reform has their way, the national strategy that coordinates planting, harvesting, livestock feed, livestock slaughter, etc. will maintain or even elevate the level of satisfaction provided by supermarkets and fast food chains while inverting a key relationship. Under standards established in the 1950s, raw materials suitable for heavily processed products tend to be highly abundant while raw materials suitable for service after minimal processing remain scarce.

This situation drives corporate activity such that fattening foods are where the easy money is. Turning that trend upside down rewards companies that were innovative in the pursuit of brand identities related to healthier eating while removing the incentive for entrenched entities to maintain a keen focus on stimulating demand for unhealthy food. Getting out of this rut is only possible because a long-standing policy of abundance places government in the proverbial driver’s seat. Just as American nutrition improved with the original wave of coherent national agriculture policy, pending reforms offer up the prospect of a new wave of improvements to be followed by gains in childhood development and general national health.

In debates about health care policy, there tends to be an implicit assumption that the United States is incapable of doing what dozens of other civilized nations have already done — pursue systematic abundance to produce consistently superior outcomes than the nation endures at present. It is very much a failure to see the forest for the trees. It may not be possible to engineer a surplus of every possible health care good or service, just as it is not possible to maintain a nationwide surplus of every possible fruit and vegetable. Yet it is possible to undertake sensible coordination of activity and deliver a general abundance to the benefit of all.

Agricultural policy first worked its magic in the United States through an emphasis on cereals and dairy products. Once upon a time, a good bowl of Wheaties swimming in fresh milk was the epitome of health food. By the same token, in a nation where millions have no involvement with physicians and millions more only turn to modern medicine after experiencing a health crisis, just delivering universal preventative care would be an enormous step forward. No doubt there is more to public health than getting virtually everybody to participate in routine checkups, but that measure alone would enable tremendous gains. It would actually conserve medical resources by decreasing the extent of time potentially crippling problems go untreated. If labor productivity gets a boost as well, that is hardly cause for complaint.

Yet too much of today’s thinking is afflicted with misconceptions. It is a misconception to think that sound planning could not generate a useful degree of medical abundance. It is also a misconception to think that providing ever-greater advantages to America’s top income quintile will somehow cause their abundance to become some sort of universal phenomenon. The past three decades of American economic history constitute one monolithic denial of trickle-down theories. On the other hand, it does seem to be the case that hardships are not so clearly self-contained amongst America’s poor.

Infectious microbes are no less at home in the bodies of the rich than they are in any other human beings. Contagion has no respect for net worth. Also, since most of America’s economic dynasties rest on corporate ownership or other forms of working investment, the rich suffer from degraded returns on those investments even as the poor suffer more directly whenever preventable illness leads to lost productivity.

Then there is the question of atmosphere. Wealthy Americans do not take an oath to avoid ever having any empathy with working class citizens. Defense of the “right” of the rich to not support universal health care is also an attack on the “right” of those same people to be better protected from disease, enjoy superior returns on domestic investments, tap into a more fit labor pool when launching new ventures, and live in a generally happier society. There is no choice that is not a choice.

As we settle down to dinner this day, we do well to honor the tradition of expressing thanks for our blessings. Though not perfect, the existing collaboration of farmers, corporations, and bureaucrats has accomplished much in the feeding of our nation. Even with all its flaws, a “do more with less” approach to working the land seems to have succeeded in principle. Given the resources and financial inputs devoured by our existing health care institutions, it would seem American endeavors in that area are presently guided by a “do less with more” paradigm.

In truth, that reality is an unpleasant side effect of being misguided by a popular paradigm that forbids even discussing the pursuit of abundance in our capacity for healing. Yet I believe it is a pursuit well worthy of the effort, including the effort to dispel false narratives about unresolvable scarcity. Should we, as many other nations already have, manage to achieve useful abundance in medicine, then our nation will enjoy a new form of strength. Universal health care would be a particular relief to the poor, but its indirect benefits would substantially improve circumstances for the rich too. Were it part of the American way of life, then on future days of Thanksgiving we might all know one more blessing to be counted . . . or at least one more useful abundance to take for granted.


What You Should Think About Freewill

November 13, 2007

“Life is like a game of cards. The hand that is dealt you is determinism; the way you play it is free will.”

–Jawaharlal Nehru

Many people turn to philosophy in a search for purpose. “Why are we here?” has driven much thinking throughout, and no doubt before, history. Yet there are even more fundamental questions to ask. Understanding humanity’s purpose, or lack thereof, must follow from some understanding of what it means to be a human being. “What are we?” is a question with answers much less self-evident than superficial analysis would suggest.

There can be no doubt that making decisions, and taking action based on the results of decisions, is part of the human experience. Then again, had apples the capacity to appreciate their existence, growing ripe and falling from trees would be part of the quintessential apple experience. As scientific inquiry reveals more and more about the physical workings of the brain, mystery yields to empiricism. Can it be that we are nothing more than chemistry and biology at work?

Truth be told, even in times of abundant mystery there was no evidence substantiating the existence of supernatural components to human existence. The cosmos as it is at present results from the cosmos as it was before. The future will follow from the cosmos as it is now. Understanding all of the processes as they unfold would require an unfathomably complex information system. For that matter, computing all the electrical and chemical activity in a human brain while also modeling the environment from which all stimulus emerges is an impractically complex challenge.

This brings us to the heart of a concept like freewill. It may be reasonable to believe in physical predestination, the notion that the properties we are born with and the experiences we have in life are the only forces that shape human behavior. However, given that those causes defy comprehensive understanding, an analysis of our decision-making processes takes on a form necessarily different from an analysis of the growth and descent of a falling apple.

Too often, predestination is used as a shield against the notion of personal responsibility and other ethical considerations. Someone who is quick to anger may walk away from a violent outburst thinking, “well, I have a short fuse, of course that was going to happen.” Even worse, forethought along similar lines may motivate people to act more boldly on malicious or selfish tendencies. Yet without the vast inaccessible collection of knowledge required to be certain in calculating an individual’s destiny, we may instead look to one of the most useful forms of uncertainty.

Even if one grants the notion that freewill is an illusion, it is an illusion that adds to the meaningfulness of being human. Uncertainty is resolved as potential gives way to action or inaction. Self-awareness creates conditions in which reflecting on the outcomes of behavior can change the behavior itself. A strictly natural view of the universe holds that all such reflection is inevitable and predictable, at least in theory if not in practice. Being aware of ourselves yet not possessing perfect self-knowledge or comparable knowledge of our environment, we perceive contemplation and introspection to be spontaneous.

With the natural uncertainties of unresolved human decisions, the value of activities like reflection, meditation, and analysis becomes real. As alternatives to acting on impulse, they do not free us from metaphysical predestination, but they can liberate us from practical folly. Recognizing the usefulness of freewill as a concept provides a foundation for recognizing the usefulness of much of philosophy and psychology. Ethics becomes especially crucial once one accepts that carefully weighing a decision to act, while itself also a decision, offers up the potential to better control the outcomes of our actions.

Even so, recognizing the human responsibility to restrain the worst of our impulses and pursue the best of our aspirations can be taken too far. Neurological damage, chemical addiction, or even a genetic predisposition to mental illness can create conditions that inhibit sound decision-making processes. In some instances the mysteries shrouding human motives are less significant than identifiable externalities that also have predictive value in assessing human behavior.

This insight goes beyond justification for providing treatment to people with psychological problems. Action on a societal level can result in significant change. Strong anti-poverty programs do not ensure that any particular individual will never commit a theft. Yet property crimes and crimes of violence both tend to decline as poverty is alleviated (and rise in response to a surge of poverty). It may be unfair to characterize advocates of cutthroat economics as culpable in the same way that actual thieves and murderers are. Yet, as far as useful illusions go, social responsibility seems to deserve a place alongside personal responsibility.

Perhaps the most important consideration in assessing predestination and freewill is the role of uncertainty. Rare is the tyrant lacking a conviction born out of certainty that his personal destiny involves taking bold action as a national leader. Even worse than shrugging at the impulse to do wrong is surrendering to a sense of fate driving some grand plan forward. If indeed a particular grand plan is worthy of pursuit, then it will stand up under the rigors of methodical analysis. In fact, thoughtful reflection on any worthwhile agenda should only serve to refine understanding of related goals and methods.

Predestination is a useful concept when it comes to the philosophy of the natural universe. It also has some application when it comes to the formulation of social and economic policies. However, it has very limited usefulness when applied to personal perspectives on individual behavior. Limitations in that understanding remind us that freewill is crucial to understanding our own behavior. It is through processes that create the perception of freewill that we are able to escape our most destructive tendencies. Be they delusions of grandeur or compulsions to do harm, the perspective freewill offers is a means to transcend it all.

Yet is it really just a perspective . . . just a perception? If one grants that view, so much else must be written off as illusion. Emotions and reasoning may be the way we experience neurochemical processes, but those experiences have a reality of their own. Living within our mortal limitations, it is this reality that defines the human condition. In one sense we are a small part of a complex chemical reaction that has been ongoing for billions of years. Yet in another very real sense, we are the sum of the choices we make.


What You Should Think About Military Service

November 11, 2007

“I am a soldier. I fight where I am told, and I win where I fight.”

–General George S. Patton

Though I would not compare it with something as tragic as rising body counts, another unpleasant side effect of watching our nation wage a pointless war is the rise of contempt for personnel serving in its armed forces. Many is the time I’ve read a compelling blog entry or heard a persuasive public speech only to see its appeal marred by hostile language directed at a broad category of Americans, most of whom are decent human beings willing to make a particularly difficult sort of personal sacrifice on behalf of the entire nation.

Of course there are some soldiers driven to war by hatred of others, and that is not laudable behavior. Yet most uniformed military personnel, even those serving on the “wrong” side of historic conflicts, believe that they are doing work that is vital to protect their homeland from some sort of real threat. Generally speaking, the British did not open fire on Continentals because they hated American upstarts. Confederates had much more pressing reasons to shoot at Yankees than fears that slave labor might end. Even Wehrmacht forces defending the beaches of Normandy believed their actions were necessary for the protection of Germany.

Generally war does not break out unless warmongers are involved. The modus operandi of a successful warmonger is to portray a potential enemy as the source of an urgent threat to a people’s way of life, rally a nation behind an agenda of aggression, then denounce dissenting voices as traitors to that nation. The entire process is a political exercise that combines the highest levels of effectiveness with the lowest levels of ethical conduct.

Misunderstandings, even skirmishes, can occur naturally as a result of complex international relationships. Full-out warfare only occurs because leader(s) make it their business to advance an agenda of belligerence. Only after such historically malicious behavior has taken place do conditions exist where dutiful military personnel are called upon to kill people and break things. Much of this destructive activity is itself unethical, but then again it is also unethical to knowingly and willingly take an oath only to break it when faced with a moral quandary.

Perhaps more to the point, from a soldier’s perspective, ordinary wartime activities are not unethical. A policy of military aggression is normally presented as something urgently needed to insure a society’s survival. Fending off foreign invaders may actually be a matter of survival. Either way, out on the battlefield a “kill or be killed” mentality may be rooted in actual circumstances. Even when it is unethical to go to war, once deployed the ethics of following the chain of command, including orders to fight, become crystal clear.

Any nation with substantial resources invites disaster by refusing to maintain some sort of military capability. Perhaps there will come a time when this is no longer true, but on Earth in the early 21st century it clearly is true. Of course, there are many ways to go about achieving this goal. I’ve always admired the Swiss approach — neutrality in foreign matters, but if anyone tries to take their land and their homes by force, a nation full of trained and equipped militia will fight relentlessly to make the invaders’ incursion as brief and painful as possible. Insofar as “peace through strength” makes any sense at all, I believe it is in military doctrines like those that shape Swiss policy.

Unfortunately, rank and file soldiers have little say in the policies of a democracy, and even less if they should happen to inhabit an undemocratic society. Where a person is born does not change the merits of acting in defense of family and friends. As such, the political and military institutions of most societies expose soldiers to the possibility that they may be called to serve without an actual grave threat to home and homeland. When moral imperatives come into conflict like this, how should one respond?

For an overwhelming majority of military personnel, there is nothing to do but follow orders. This goes beyond aversion to potentially severe punishments brought on by willful dereliction of duty. Below the highest echelons of command, the ability to carry out orders as given is required to ensure combat effectiveness. A nation might as well have no defenses as to let them be contingent on whether or not front line troops are personally inclined to fight on a particular day.

It may be wise to question policy, and surely it is wise to temper a combatant’s fervor with enough reflection that every act of violence is also an exercise in reluctance. Yet both the duty of a citizen to protest bad policy and the duty of a human being to avoid victimizing others must take a back seat to the duty of uniformed military personnel to cause death and destruction on command. Exceptions to this come only in the form of truly extreme abuses, like an order to massacre harmless civilians or an order to torture a defenseless captive. If it is plausible that a properly issued command serves a legitimate military purpose and there is no opportunity for discussion, then it is right to take action.

Yet there is another realm of exceptions beyond orders calling for the deliberate abuse of non-combatants. At the highest levels of military command, it is inevitable that national policy will become part of the discussion. Even if President Bush never actually listens to contrary opinions from the top brass, his public insistence that he defers to the judgement of commanders in the field invites efforts to establish healthy channels of feedback percolating up through the chain of command. Heck, even in a dictatorship, senior military personnel should be able to speak candidly with the potentate. Anything less creates a disconnect from reality that exposes combatants in the field to greater danger while diminishing the prospects for success of the overall military mission.

When I encounter schadenfreude regarding the hardships faced by today’s active duty military personnel, I see that as an ugly sentiment. No doubt there is an element in the services that did volunteer just so that they could “go kick some raghead ass.” Yet I am confident an overwhelming majority of those unfortunate Americans find themselves risking real danger for reasons that are more wholesome than sinister. On this Veterans Day I want to extend my thanks for their contribution to the strength of the United States of America.

I would rather my nation be strong than be weak. Yet I would also rather my nation would be right than be wrong. The only component of military condemnation I can support is criticism of senior military commanders associated with existing Iraq policy. Except for Gen. Shinseki, I am not aware of any top tier officers willing to end their careers in order to speak truth to power regarding the planning and conduct of that disastrous misadventure.

Going to war without question because your commanding officers do not want to hear your questions, or even because you’ve been led to believe the war effort really is sound defense policy, is merely the fulfillment of duty for almost all positions in the armed forces. On the other hand, supporting a war effort when your position carries with it the duty to raise concerns, being educated and informed enough to know better than the civilian leadership in this matter, is disgraceful conduct.

I can appreciate that speaking out means being denied access to a defense industry gravy train that could ensure a general’s grandchildren’s grandchildren are born wealthy. That form of corruption comes with a powerful lure. Yet officers atop the military hierarchy should have greater moral fiber than to think so selfishly. Brave men and women serving so far from home will pay for that enrichment with spilled blood, lost limbs, addled minds, or even the ultimate price.

To those vaunted few with the position to speak truth to power regarding the folly of ongoing Iraq policy, I urge you to take this time to reflect on your duty to front line combatants as well as your duty to the nation as a whole. Think about the virtues attributed to you at ceremony after ceremony throughout your career. Ask yourself when is the last time you showed real courage and made real sacrifices for the good of the country. It is not too late for you to get back on the right path.

As for everyone else out there who does serve in some branch of the U.S. armed forces yet doesn’t have the ear of POTUS or VPOTUS or SecDef, I thank you for your service. Your work is among the most difficult work imaginable. You honor this nation by doing your military duty, often under downright traumatic circumstances. I only wish more of us could manage to honor you by doing our civic duty to support better political leadership.