Our ‘Horrible’ Economy

Everyone says that our economy is “horrible.” Gas for our cars is headed toward $5 a gallon. The stock market has turned seriously downward. Unemployment is edging upward (although the unemployment rate is still holding about even). Yet two basic points make it difficult for this amateur to see just why our current economy is so horrible. For me, the most important issue in any economy of any democratic republic is job growth. In 1976, I wrote speeches for Senator Henry M. “Scoop” Jackson (D., Wash.), in which the three main priorities of his campaign were identified, in this order, as “Jobs, jobs, jobs!”

The second-most-important thing for a democratic society, to my mind, is that there be consistent economic growth. The reason is that the most destructive of all human social passions is envy. If there is no growth, the only way people have to vaunt themselves is to tear others down. (In nearly all static societies, envy sparks immense social frictions.) In a growing economy, by contrast, people have a chance to stop comparing themselves with their neighbors (and tearing them down). Instead, they can work as hard as they can to meet their own goals. If their own position tomorrow will be significantly closer to their heart’s desire than it is today, then they don’t need to care how their neighbor is doing.

A republic like the United States simply must defeat envy, and focus on a better future for each family. The only way that can be accomplished is by reasonably consistent, gently upward economic growth. Growth is the necessary condition for each person’s pursuit of his own happiness. A happy society is a more generous and loving society.

In this light, the U.S. — with a growing GDP from the year 2000 until today, along with a steady growth in the number (and percentage) of people employed — is better off than almost all nations in Europe. Whatever else is happening in the U.S. economy today, these are very good indicators.

U.S. output today is just about 40 percent higher than it was when President Clinton left office. Nominal GDP has grown from $10 trillion at the end of 2001 to $14 trillion at the end of January 2008. In other words, the U.S. has added to its annual output the equivalent of the whole nominal GDP of China (in 2007, $3.25 trillion). The U.S. economy today is as big as it was in 2001, plus the whole GDP of China.

At the end of 2001, when President Bush’s economic policies were just beginning to take hold, total civilian employment was just over 136 million. At the end of January 2008, it was 146 million. Some ten million new jobs! Not the greatest, but not bad; certainly not “horrible.” During the last six months, the number of employed is down by 100,000 — not as huge a decrease as everyone has been imagining. Total employment in June was 145.9 million.

I freely admit that the economy under Ronald Reagan and Bill Clinton did even better. Under Reagan (1981-89) and Clinton (1993-2001), the U.S. added more than 35 million new jobs. This was a stupendous accomplishment. Yet under President George W. Bush, the total number of employed civilians has risen much less dramatically, by ten million new jobs.

How did Reagan and Clinton do so much better? Reagan offered drastic changes in the tax rates paid by inventive, productive, and entrepreneurial citizens. (The tax revenues then received by the U.S. Treasury soared to all-time highs, most of these paid by the top half of income earners.) There followed the largest growth in new small enterprises in American history, and with them the largest single jump in the number of employed citizens. (Most new jobs are created by small businesses, not large, and by new industries — e.g., computers, cell phones, and fiber optics — not mature ones.)

When Bill Clinton was elected in 1992, he inherited a tremendous “peace dividend” from the enormous effects of the great events of 1989 and 1991, the collapse of the Berlin Wall and the change of governments in Central Europe and Russia. By contrast, when President Bush was elected in 2000, he was less than a year away from a devastating military blow against the American homeland — which knocked several American industries (transport, tourism, restaurants and hotels, financial markets, etc.) to the floor.

There is much more to be said about the current economy, and how unprecedented it is in its level of GDP and employment. But there is much bad news as well. The refusal to drill for new oil in the United States, and to build new refineries and new nuclear reactors, has left America colossally dependent on Middle Eastern powers (and also Venezuela), whose wealth supports international terrorism against the free world. We are both depriving ourselves of independence and subsidizing with enormous sums those who would like to destroy us.

This stupid policy has also led to Americans’ paying $4.50 per gallon at the pump, with prices in the near future heading toward $5 or more. Yet bad news sometimes has a few good consequences, too — not enough to take the pain away, but enough to ignite hope.

For example, the high price of oil is causing some Americans in the global economy to compare anew the cost of transport against other costs. Some are reconsidering whether it might soon be cheaper to produce closer to home. The high price of a barrel of oil is also making new forms of oil exploration and production cost-effective.

Again, $4.50-per-gallon gasoline is forcing Americans to change their driving habits, not to mention their car-buying habits. The driving patterns of teenagers are also likely to be cut back. And $4.50 gas has changed the political balance in the country. An ever-larger majority of voters is chanting: Drill now, drill here.

Look ahead. People buying futures contracts priced on the expectation that no new supplies will come on line are soon going to have to be careful lest new drilling and new refineries move from drawing boards to reality. Buyers of futures cannot afford to be wrong. They will quickly have to alter their projections about future supplies and future costs. Thus, the price of future barrels of crude can fall quickly, long before new gas starts flowing from new wells. Futures markets live and die by future expectations. The future price of gasoline can drop more rapidly than pipelines and refineries can be built.

As for buyers in a bear market, such markets have long been a textbook opportunity: when prices are low is a great time to start buying. When the market starts upward again, returns will come in quite handsomely. The inner feeling of going upward is wholly different from the inner feeling of going downward. In my many years, I have known up and I have known down. Up is better.

Published in National Review Online July 17, 2008

A Secretary, a Speaker, and a Priest

Since the arrival of my first DVD in the mail, I have been a convert, one might say, to Netflix. Opting to test the availability of films through the service, my first request was for The Ninth Day, a German film set in Dachau and Luxembourg. Not only was the request fulfilled, but the film took my breath away and kept me deathly silent.

I had read the short autobiography on which the movie is based, Priestblock 25487: A Memoir of Dachau, and was put in some awe by it. What an amazing equanimity of spirit on the part of the author, Father Jean Bernard! What ruthless honesty about his own weaknesses, and quick vignettes about all the horrors. The autobiography offers a rare glimpse of the horrors faced by the religious at Dachau. (Virtually everybody in his barrack, and in some others, was a Catholic priest, though some were clergy of other traditions, especially Protestant.) Its anecdotes relate the abominable actions of the German guards against these religious men, raging at them with insults and violence.

One night, the movie relates, Father Bernard (Henri Kremer, in the movie) is pulled from his building. One predicts that he will be hanged on a cross by a rope strung through the bonds tying his hands behind him, left suspended in agony above the snow during the icy night, as happened to others. Instead, he is told he has been released. The Gestapo in Luxembourg wants to use him to persuade his “recalcitrant” bishop to go public with a letter distancing the Catholics of Luxembourg from the Pope by making the Church the official church of the Nazi Party — and to begin by showing that the Church does best “cooperating” with the Party. Father Bernard is given nine days to complete this mission; if he fails, he must go back to the horrors of Dachau. Further, if he flees to Switzerland or elsewhere, all the priest-prisoners from Luxembourg will be put to death, perhaps on crosses.

His Gestapo handler is a young man who had come within two days of being ordained a priest before concluding that he could affect history more and make the world better by giving himself to Nazism rather than to the Church. The worldly secretary to the bishop is a willing ally of the Gestapo. The Catholic bishop has refused to have anything to do with the Nazi occupiers, and has never once gone outside his residence, lest he chance being forced to meet with them. Meanwhile, he keeps the cathedral bells tolling loudly for several minutes every day, and the people take comfort from the strength of this resistance.

You really will want to see this movie for its own sake, as well as for the light it sheds on the real human interactions between church and state under totalitarianism. You will see what faith calls for in such extremities.

On a lighter note, I picked up two summertime novels that I have found both gripping and full of information I am glad to acquire. The first is the Newt Gingrich novel (with William R. Forstchen), Pearl Harbor: A Novel of December 8. The novel is particularly good at inserting us into the Japanese side of the war, by devices that take us back into personalities, families, schools, and habits of thinking that help us to sense what was in the minds of the Japanese. It is a novel full of the historical details that historians love.

The second, Dragon Fire, is a book I have been waiting to run into for several years now. During the Clinton administration, my wife and I were invited to fly back to Washington with Defense Secretary William S. Cohen and his astute, gracious, and beautiful wife, Janet. The secretary told us how worried he was by the new biological and chemical weapons becoming available on the world market, and how easily they could fall into the hands of tiny rogue cells of operatives, and cause unbelievable destruction in American cities. At one point, as the plane flew above the clouds, he held up a pitcher and said, “Imagine this filled with anthrax. Let loose in a subway or anywhere like that, it could put hundreds into agonizing death throes.” He said he had decided to put vivid examples of what he had learned about international terrorism into a novel — it was the only way to get people to face the reality.

Sure enough, not many pages into his novel, a high official in the Defense Department is put to an agonizing death in the kitchen of his own home by the most ingenious of methods. He alone had refused to be inoculated with the vaccine required to resist anthrax poisoning. Someone must have known that, and used that knowledge. But who?

If you like to learn things even in times of relaxation, in fast-paced and vivid books, try these.

Published in National Review Online June 27, 2008

No One Sees God: An Interview

What is the point of your book?

My experience has shown me that self-knowledge has a huge impact on what one thinks about God. If God is within you, you can gradually become aware of that and reflect on its implications. If you are certain that God means nothing in your life, then you interpret your life quite differently. Steven Pinker, for instance, says he is a materialist, and in his world God has no importance. One would expect that each of us must answer the question “Who am I, under these stars, with the wind upon my face?” Whether you can find God within you depends a lot on how you answer that question.

In these matters, no one has knockdown proof. We make the most reasonable judgment we can, but practically everyone can see how easy it would be to come to the opposite conclusion. In actual life, many believers become atheists, and many atheists become believers. Each does this on the basis of evidence that makes a new and powerful impression on her.

No one catches direct sight of God. Our knowledge about him comes from weighing our own experience of life—including our experience of the natural world, the experiences of conscience, such experiences as the inner drive within us to ask questions (even from the time we were children), the pleasure of acts of insight, and reflection upon what we really do when we make judgments that something is true or good or beautiful. What suppositions are we making about existence, even in the simple act of judging that something is true or good (or better) or real—not an illusion or a fantasy?

This book begins with self-discovery but leads well beyond a narrow, constructed sense of self.

There are many books about belief and atheism flooding the marketplace today. Where does your book fit?

It will probably struggle to find the special audience it needs: readers committed to reason and liberty, able by the accident of certain human experiences to sympathize both with those who know God and with those who find nothing there, and able to see the benefits of reasoned conversation between such seemingly opposite tendencies.

At times in my life I have been driven toward atheism, have wanted to become an atheist. I was left in the dark about God, felt nothing, nada. But none of the various sorts of atheism I encountered (and these were many) seemed intellectually satisfying. All felt—to me, at least—like dodges. Any line of questioning that brought pressure on atheism was simply defined out of existence or at least treated as irrelevant. For example, the question “Why is there something, not nothing?” was ruled out as a question that cannot be answered by science, and therefore meaningless. That is much too easy. And so with other questions.

Many of the books responding to the new atheists emerge from evangelical or other traditions that root their belief in feelings, sentiments, or experiences of conversion. I have never found this approach helpful in my own case. I want to go as far as reason will take me. This is the principal difference between my book and others. I seek a reasoned path, a way rooted in reason—a path through the very structure and constitution and methods of human understanding.

To my mind, our understanding of God emerges from our questions about our own understanding.

It certainly seems like our conscience comes from a light over which we are not master, a light greater than ourselves, which often faults our own behavior down to its roots far below the surface of our rationalizations. It certainly seems as if the questioning of our own long-held assumptions, and the relentless probing of our comfortable beliefs about ourselves, comes from somewhere within ourselves—but greater than ourselves and not subject to our own self-deceptions. Thinkers since Plato have discerned this, quite rightly—you can test it in your own experience.

So mine is a book about reason’s path to God. Whether at this task reason succeeds—or fails.

The thing that makes me most curious: Why do you find atheism unsatisfying? Take the typical atheism of a university professor or of the literary world. Why doesn’t it grab you?

To me it seems a contradiction to insist that all things flow from blind chance and then to go on calling oneself a rationalist. Irrationalist on the big questions, rationalist on the things amenable to science, and something like “emotivist” on matters of practical choice and ethics. In the perennial inquiries of the human race, this mix doesn’t add up.

I can understand why atheists invent a heroic image for themselves—Bertrand Russell’s Prometheus, or Dylan Thomas’ raging against the night, or Sisyphus, or even Milton’s Lucifer refusing to “serve.” But all this seems to be striking a literary pose to cover up the emptiness of meaning in human life.

Out of a kind of commonsensical rebellion against doubletalk, which confuses the sensible fellow, the rough-hewn Lincoln sees something down to earth and matter-of-fact in nodding toward “the better angels of our nature.” In what some see as mindless bloodletting, he finds in the dead at Gettysburg a noble meaning, in keeping with the history and destiny of humankind. Sensing a touch of the divine in oneself is, in this way, and for most people down through history, the default position of the human race. For most folks, things seem to add up better that way. But it remains possible to think most people wrong.

Do you think atheism (secularism?) is on the upswing? I was surprised by the title of your last chapter: “The End of the Secularist Age.”

The idea was suggested to me by two writers, on opposite sides of most issues, who both have a knack for reading the times: Irving Kristol in America and Jürgen Habermas in Germany. Kristol observes that while secularism keeps marching through the institutions of daily life, the core of its living beliefs is spent, dead, unfruitful. Any movement that deprives most human beings of any meaning in their lives is eventually self-doomed.

Professor Habermas writes that the events of September 11, 2001, shocked him into recognizing that secularism represents a small island in the midst of a turbulent sea of religion all around the world. Even in the developed world, as in the United States, religion thrives. Certain sectors of European society seem to be an exception. And how long can they hold out?

However this may be, others have noted that secular couples almost everywhere tend to have few children (sometimes none), and thus bring a demographic crisis upon themselves. Further, secular societies seem to enervate the inner self-confidence of whole cultures and make them think that they are unworthy of survival in the face of dynamic, rapidly growing, even violent rivals. In a different vein, secular societies show a pronounced tendency toward moral relativism and have no common means of discriminating moral decadence from “liberation” or distinguishing moral progress from decline. If there is no God, there still remain rational standards. But the question “Why be rational?” gets harder and harder to explain to the wayward.

Judaism and Christianity down through millennia have proved to be adept at generating Great Awakenings in entire cultures. It is not clear that any secular society can do so.

An admirable secular humanism still thrives among us—but it does seem limited only to smallish enclaves. It is difficult to foresee it capturing multitudes. Besides, the examples of those atheist societies that have tried to fashion ceremonies, liturgies, and vast demonstrations (to make atheism discernible to the imagination and sensibility of peoples) are not encouraging. Secular humanism seems better suited to a few strong individuals and to fairly rarefied groups among the elite than to a culture as a whole. It founders on its own perception of the meaninglessness of human life. It offers only the meaning that individuals can put into it—and as easily pull out.

No One Sees God is available in bookstores August 5, 2008.

Published in First Things Online June 24, 2008

New Atheists, Old Realities

As far as I can see, the New Atheists have been slowly executing a strategic retreat. Many seem to admit that there is not now, and can never be, a knock-down proof for atheism. Many seem also to be admitting that, no matter what their skeptical friends write, belief in God is not only here to stay, but also seems to be rooted in human nature itself. It may even provide an evolutionary advantage. Thus, the line of defense to which they have more and more frequently retreated seems modest and open-minded. Their new answer to the question “Is there a God?” is perfect for a bumper sticker: “I don’t know, and you don’t know, either.”

This is a mistake. The New Agnostic holds that the burden of proof is not on him; the burden is on others to “prove” to him that there is an object “out there.”

But the evidence about God is not to be sought “out there.” It does not reside among other classifiable, sensory objects in this universe. The question about God is essentially a question about one’s own personal identity. Do you yourself, Mr. Agnostic, find evidence within your own inner life (in a way that can be replicated by others) that your identity is not fully known until you admit that you participate in a life much larger than your own, drawing you toward becoming more fully developed and greater than you are? In a Light more powerful than the light of your own conscience? The question is about you.

Those who discover such evidence can claim to know that God exists within them, not simply to believe it. They hold that to find this evidence is the norm, not the exception; it is the default position of human beings. That is why the emergence of the religious impulse is to be expected in every generation. That is why a personal tie with God keeps being rediscovered in every era in human history, in virtually every culture.

There are two chief inner experiences that lead humans to the knowledge that, in order to understand their own human nature adequately, they must come, however slowly, to recognize that they already participate in a divine nature, whose demands upon them as they currently find themselves are quite severe.

Consider first the “prison literature” of the twentieth century. In the prisons of officially atheist regimes, Fascist and Communist, there were many who were thrown into their cells at a time when they thought themselves to be atheists. Only slowly did some discover that there was an inner demand in them, a demand that they not become complicit in the lies of the regime; they must not sign their names to the lies put in front of them. On this imperative to stay honest, even at the cost of great pain, rested their entire integrity. If they had compromised that, they would have become part of the universal depravity insisted upon by the regime: “There is no truth but the truth of the Party.” They would have become like their jailers.

But why did they come to hold that this inner drive for absolute honesty was essential to their own human identity? Their senses of touch, hearing, seeing, smelling, and tasting may have ached with pain and violation. They may have been without any feeling of assistance from anybody, human or divine. Even their ability to give reasons for what they were doing might have collapsed, because the pain was so great and the terror of death so acute. The arguments of their torturers may have come to seem evident to them – and yet some deeper inner light drove them to refuse to lie.

What is the source of that light within them, which refused to let them surrender, even when their bodies could bear no more? They experienced that source as something greater than any part of their own body or mind. Yet that light seemed integral to their own self-identity.

This is the evidence that led Sharansky, Valladares, Mihailov, and an unknown number of others to perceive that they in fact lived in a spiritual community larger than their own ego, a community with all other humans struggling to preserve their integrity under threat of pain, and more than that. They also experienced by a kind of connaturality a mysterious Other (incorruptible and insistent) within them, more important than their own bodies and their own temporal life.

Such persons felt inwardly that, if they were not faithful, their moral failure would matter to that Other, in a wholly different way than it would matter to their jailers. Their moral surrender would be interpreted by their jailers as yet more evidence that everybody, just like themselves, had a price at which they would surrender. In such a surrender, their own integrity would die, and so would the real presence of God.

A second bit of evidence within myself (evidence that I participate in a wholly other, inconceivable Source of light) is my own insatiable drive to ask questions. Nothing finite satisfies me. There are always more questions to be asked. No existing concept seems final. In fact, this unrelenting drive lies at the basis of the scientific impulse. But it arises also in our intellectual lives outside of the habit of science. It arises within the habit of being faithful to reason, even in areas where science itself cannot go.

Ought I to marry this particular person? Ought I to take this job, make this work the center of my life’s pursuits? Is this the right institutional home for me, the community best designed to keep me asking questions and growing morally stronger?

One can make such choices intelligently, with good reasons. On the other hand, one may fail to anticipate realistically later twists of fortune. Later, one can blame oneself for having been more blind than one ought to have been. One can deeply regret past choices. In brief, science itself is not the only use for reason; in practical life, reason is also extremely important.

Here, some philosophers observe that people deploying practical reason live as if in the presence of an objective Observer. This Observer cannot be deceived by a person’s own self-deceptions. This Observer keeps pushing one to become more honest with oneself. And this Observer is not “out there,” but within. This Observer is sentinel not only over our scientific reasoning, but also our practical reasoning.

This, too, is evidence that we live in God, and He in us, at the very center of our identity. Within us is the Light, Judge, Merciful One, Brother, Inspirer, Prodder, Driver at the heart of our existence. Without becoming aware of this dimension of our own honesty and unlimited drive to understand, we cannot properly understand ourselves. We think ourselves smaller than we are.

“I searched for Thee everywhere, my God,” wrote St. Augustine in his maturity, looking back on his pagan, often profligate years. “When I found Thee, Thou wert within.” And later: “Thou wert closer to me than I to myself.”

The New Agnostic may not know, not yet, but a great, great number of us do know – yes, know – that the best drives within us do not come from our finite, sensory selves. We participate in them as an inner light all unbidden. Sometimes even as a torment. These inner drives are much greater than ourselves. They teach us that we are open to the Infinite.

Published in The Washington Post/Newsweek June 20, 2008

Two Public Policy Proposals: Catholic Social Thought in Practice

Here are two practical concepts for improving the welfare of all the citizens of a nation, especially the poor, the ill, and the disabled. At least twenty nations have already adopted variations of these proposals, and they seem to be benefiting enormously thereby. They have not yet been adopted in the United States, except in small pockets, even though some of the key ideas were very well put forth by Steve Forbes in his campaigns for the presidency in 1996 and 2000. The Democrats in Congress blocked them when they were put forth by President Bush in 2005. These proposals are a threat to those with vested interests in government-run social democratic programs, because they tend to achieve many more goods, with greater efficiency, at less cost, and with a far greater impulse toward personal autonomy.

Personal Medical Funds.

Do not build a government health service at huge public expense. Instead, mandate that income earners set aside some low percentage of their earnings. These savings would be deposited into a health fund owned by the earners themselves, and this fund would be portable to any future job they may move to. In case of premature death, the fund would be inheritable by any persons they have previously designated as their heirs.

These personal funds could be spent for ordinary health expenses at the owner’s discretion. However, any money he does not spend will be kept in his account as long as he lives, invested safely for modest growth – and given to his heirs if not spent. This policy makes every earner the owner of a capital fund of his own.

Mandate next that a small percentage of that personal fund, set by market rates, go for the purchase of (so-called) “catastrophic” coverage, in case of cancer or other serious illness, a personal accident, freak injuries, or the like. In the case of those with no income and no spouse – a minority of citizens – the government would pay an annual subsidy to provide similar coverage.

Ownership by individual families or individuals is important, for such ownership gives the one spending the money an incentive to take responsibility for his own choices. Medical decisions ought not to be made by the government, nor even by health professionals alone, but by the individuals involved.

A public policy along these lines would greatly reduce the size of any government health bureaucracy. It would discipline costs, and it would raise the quality of service by encouraging competition among providers. President George W. Bush took a first step in this direction by his 2003 bill to cover the pharmaceutical costs of the elderly. The government funds this program, many providers compete to provide the services, and the elderly themselves choose the provider best suited to their needs. This vigorous competition has brought costs down far more than first predicted, and quality has been enhanced.

The reasons for personal medical accounts are three: first, to give individuals ownership of their own capital fund for health; second, to give them access to many more choices; and third, to invest individuals with responsibility for how they use their medical capital.

Personally Owned Old-Age Accounts.

In a similar way, nations widely scattered around the world have mandated that income earners set aside a certain percentage of their income to pay into a pension account for old age, of which they (not government) are the owners. This program has three significant good effects. First, the government saves enormous funds by closing down most of its own Old-Age bureaucracy. Second, every citizen becomes owner of a capital fund, whose remainder after his death goes tax-free to the heirs he designates in advance. In this way, any accumulated capital fund remains in the family. Third, this new investment flow immensely strengthens the national private economy, in which most of the family funds must be invested. In poorer countries, this domestic capital investment is especially important.

The policy of personal Old-Age Accounts has by now been tried with great success by such nations as Chile, New Zealand, Lithuania, Slovakia, and more than twenty others (half the population of Latin America now has access to personal accounts).

Let me close this point with one clarifying and affecting example. In the United States today, a disproportionate number of black men die before, or not long after, they reach the age of sixty-five. But that is the age at which they become eligible for a monthly stipend (“social security”) from the government. As matters now stand, the State owns the pension plan, so when these black men die, the government keeps everything remaining in their accounts. In other words, these men lose everything that they paid into the social security fund during their entire lives. What a waste for them and their families!

Neither of these two new policy ideas promises paradise on earth. But they do seem designed to strengthen both the common good and the sense of responsibility (and well-being) of the individual person. The person and the common good are the two main normative inspirations of Catholic Social Teaching.

Published in The Catholic Thing June 17, 2008

The Adventures of Catholic Social Doctrine

When I was liberal and young, the popes of the preceding eighty years – from Leo XIII (pope 1878-1903) to Pius XII (1939-1958) – were regarded as “liberal popes,” and they were often quoted by liberal Catholics against their local bishops. One professor I know used to read a quotation from one or another of these popes, then ask the class to name the author. Most guessed Marx, Engels, or some socialist. Generally speaking, these popes were defending an alternative to socialism and Marxism. They defended private property rights against the socialists, but also against a too-narrow libertarianism. Since a regime of private property is justified by its service to the common good (in Locke and Mill, for example), sometimes the common good imposes moral burdens on those who own property, to protect the weak.

These popes also defended the natural inequality of temperaments, skills, habits, and orders of preferences among humans – both against the socialists, who preached utopian equality, and against the partisans of “laissez-faire,” who thought the law of competition fair, without noting that there are many too weak to compete on even terms (See especially Leo XIII).

I still remember the glow that blushed over the whole Christian left (and the secular left, too) upon hearing the words of the new pope, John XXIII (1958-1963).

The popes from 1930 on were crucial to the discrediting of socialism in both its forms – National Socialism and International Socialism.

They were particularly important in encouraging the build-up of Christian Democratic parties to block the rise of Socialist and Communist Parties in most of the nations of Western Europe. (It was their unexpected success that so infuriated the East Germans, who launched a propaganda assault against the then-much-loved Pius XII.) These parties held the line until the surrender of Communism in 1989 and the spontaneous tearing down of the Berlin wall. (What rapturous days those were!)

The great difficulty in the inner development of Catholic Social Doctrine, from one historical phase into another, lies not primarily in the field of theological or moral principles, which are relevant to all ages and places. These are principles for all seasons: “The wise steward brings out from his treasure both old things and new, as suits the season.”

The wise steward must adjust when mild centuries give way to centuries of icy storms, and when new institutions sweep over all before them. One such turn occurred, for example, when Europe ceased to be primarily agricultural and became chiefly urban. At that point, Leo XIII was obliged to write of New Things, Rerum Novarum.

Catholic Social Doctrine depends crucially on the adjustment of “middle axioms,” which gear unchanging first principles to changing times. It also depends crucially on an instinct for the particular and the concrete, the accurate formulation of the new facts in play, and the shifting self-interests of all the players.

On sin and self-interest, in my experience, the great Protestant theologian of my youth, Reinhold Niebuhr, had more to say, and said it better, than any Catholic theologian I have read. Except for St. Augustine, of whom Niebuhr was a close reader.

For such reasons, Catholic Social Doctrine experiences “development” in two especially acute areas. Middle axioms need to be constantly reformulated to mesh with new sorts of institutions and regimes. Further, the shifting grounds of historical change need to be noted sharply, so as to reflect reality as it is, not as it was, nor as utopians wish it were.

Catholics know in their bones that history is strewn with ironies and tragedies, strange twists, monstrous actions by deranged individuals, the lassitude of the good, the collapse of the center, the rapidly spreading infection of destructive ideas. Even saintly leaders acting with good intentions have sometimes brought about ugly consequences they did not intend.

In other words, Catholic Social Doctrine is anything but cut and dried. It is a great field for young talent, full of energy and originality. It is also a hugely demanding discipline, because any practitioner (either on the theoretical or on the practical side) must learn an immense amount in the very short period of a human life.

Down in the public arena, no one has the luxury of hindsight. Those who fear making mistakes thereby disqualify themselves from taking action.

During my lifetime, Catholic Social Doctrine has been far too much distorted by being formulated through the lens of European experience, especially feudal, class-bound experience on the one hand, and social democratic experience on the other. That lens is a bit more pink than natural color. We in America are indebted to Europe; but we also have the experience of a New World. It is our task to contribute new things to the universal patrimony of the Catholic people.

Published in The Catholic Thing June 5, 2008

American Energy and Invention

Candidate Obama, like so many lefties, seems to believe anything bad about the United States, without even submitting it to critical thinking. He said on May 19, 2008, for example, that 3% of the world’s population (i.e., in his calculation, the United States) accounts for 25% of the greenhouse gases put into the atmosphere. In the 1970s, the lefties used to talk about 6% of the world’s population using 25% of the world’s energy. Even before Obama, they were blaming America first. The left’s figures depend on what is meant by “energy.” Before the founding and development of the United States, “energy” meant the human back, beasts of burden, windmills, waterwheels, burning wood, coke, and coal, and the like. The United States is certainly not using 25% of the energy generated by those means today; at least I don’t think so, though it might be. The darn country is just so efficient.

But if we mean by “energy” only the modern sources of energy – electricity, the Franklin stove, the steam engine, the piston engine propelled by gasoline (and now by electric and/or hydrogen batteries), the processing of crude oil into gasoline, nuclear energy, the jet engine, the development of ethanol and other fuels derived from plants, and other devices – all of these except one were invented by the people of the United States, as their gift to the world. (The exception was the steam engine, invented by our cousins in Britain, and further developed here as well as there.)

In other words, the United States has invented nearly 100% of what the modern world means by “energy.” And it has helped the rest of the world to use the other 75%.

Why can’t the other peoples of the world learn how to discover, invent, and develop new kinds of energy? Why must the whole burden be placed upon the people of the United States?

And, by the way, it is not so hard to foresee the day when new kinds of batteries, also invented in the United States, will nearly entirely replace gasoline as the propellant of our cars. Think what will happen to the Middle East when oil becomes too expensive, too dirty, and just plain obsolete.

The smart-aleck remark a Saudi official recently made to the Americans traveling with President Bush – “If you want more oil, buy it!” – will one day be quoted quite quaintly alongside Marie Antoinette’s “If they don’t have bread, let them eat cake!”

Posted on "The Corner" at National Review Online, May 23, 2008

Barack Obama’s America

One of the wisest American former officials I know asked me a few nights ago: “Michael, put on your thinking cap, and tell me where the United States will be four years from now, if Barack Obama is president.” I had been trying to avoid that question in my own mind. I have tried to tell myself the old proverb (told me by my father), “God takes care of children, drunks, and the United States of America.” I have tried to imagine that Obama will not be president.

But I should try to do the responsible thing: follow the trail from Obama’s announced principles and policies to their probable effects, based on how we have learned that the world actually works.

The number one issue, orders of magnitude greater than others, is what will happen in Iran, Saudi Arabia, and other sources of worldwide terrorism — suicide bombers, haters of Israel, and would-be destroyers of the United States and her allies. What will happen in Iraq? What will happen in Iran? What will happen in Pakistan?

Our Democratic party, ever since George McGovern’s candidacy in 1972, has wished and wished, like an undisciplined child, for a benevolent world of peace, in which we could “talk to” and “reason with” those leaders whom earlier administrations had learned they could neither trust nor deal with as rational, benevolent partners. Earlier administrations had also hoped that other leaders of nations respected us, and meant us well. Events like the bombing of the World Trade Center, the attack on the USS Cole, and September 11, 2001, plus the subsequent fury and irrational cruelty of jihadists around the world, disillusioned them. But not, apparently, Obama; nor many members of the left-wing generation he represents.

The partisans of the welfare state demand peace in order to fund the state’s insatiable need to keep handing out more and more benefits. That is why left-wing statists take peace as their natural inheritance. They cannot go on without it. They do not intend to pay any price for it; there are no funds left for that.

Given the historical record of the last 200 years (and more), what can we expect from this nursery-room fantasy? An untypical, even unprecedented era of peace? Or, on the contrary, the salivating determination of enemies to celebrate our visible moral weakness, and to slay their hated enemy while we bow our heads, standing there as weak and frightened supplicants? When a head is lowered from weakness, they strike it off.

In my experience, unwillingness to fight earns one contempt, further furies of terror, and truly bitter war. But perhaps other observers trust human nature more than I.

If the United States shows signs of weakness, surrender, and a one-sided departure from Iraq, the rejoicing of those who predicted that they would in the end defeat us will profoundly strengthen their resolve for the next battle. Further, without an offensive thrust in Iraq, any military forts or airfields of ours would be sheltered in a defensive enclave — announcing to those who hate us that they should keep killing two or more Americans every day, drip, drip, drip, until the American people cannot stand it any more. Weakness once shown invites fiercer aggression.

Iran will thus have its nuclear weapon by 2012, secure in the knowledge that Americans have no heart to do battle to prevent it.

In Pakistan, forces of economic and political development will know that they can no longer count on the Americans as a last resort. They will soon — to save their families — begin to yield more and more space to jihadists, terrorists, and promoters of sharia law. Free nations by 2016 will be far weaker than now, with far less space in which to alter the direction of terrorism.

Domestic Policy

Meanwhile, if Obama keeps his pledge to raise taxes on the top 10 percent of income earners (or even on the top 2 percent), he will give them enormous incentives to alter their behavior, so as to show lower income. Since the top 1 percent of earners pay over 35 percent of all income taxes paid by all Americans, any decline in their reported income means a steep decline in tax revenues. Obama seems to have no comprehension that raising tax rates at the top dramatically lowers the revenue coming in. He will learn the hard way.

His policies on quasi-universal health care will change all the incentives in our current health system — and for the worse. Studies show that a high proportion of demands for health care are the result of personal behaviors — eating or drinking too much, not exercising enough, leading a dissipated life, not taking advantage of preventive care, spending health dollars heedlessly (because they are paid by the State, not the responsible individual).

Many older doctors will leave medical practice rather than become employees of the State, constantly regulated, badgered, and demeaned. The idea of medicine as a proud, independent, inventive profession will be profoundly wounded. In hospitals, paying benefits for patients (even if they practice irresponsible behaviors) will demand ever more dollars, which must necessarily be pulled out of research and invention. Long bureaucratic lists of those needing particular operations will force even the neediest patients to wait long months before they can get care.

Neither Obama nor his party seems to understand how incentives motivate human behavior — not force, not coercion, not mockery, not nursery-school regulation, but real possibilities of good fruits up ahead for free and responsible actions. They do not understand the wellsprings of a virtuous, free, and prosperous society. They are still entangled in the fantasies of the European Left of 150 years ago.

Thus, Obama is now the creature and the prisoner of the American far Left, which has learned nothing from the failures of socialist and statist and anti-capitalist ideas during the past hundred years. Many leftists learn nothing, know nothing, and propel themselves not with practical wisdom, but with outrage and contempt and a desire to punish those who do not agree with them.

My friend himself thought, he finally revealed, that the West had come to an epochal axial point in history. From now on, economic and political progress would come far more slowly than ever before, and a long-lasting, precipitous decline was about to begin. Overseas, and also at home.

Morally, too, virtue and character and responsibility for oneself would be mocked and discouraged. The State would take over more and more of life. Although licentiousness would be glorified on big screen and small screen (the Democrats favor the Hollywood view of the world, and vice versa), neither self-directed liberty nor self-mastery nor responsibility for the consequences of one’s own behavior would be encouraged. These would be treated as retrograde ideas. All virtue would be attributed to the motherly caring State — and to its political managers. Woe to the “right-wing” dissenters!

Well, maybe I am wrong. But that is how I see things, admittedly through a cloudy glass.

My only two suppositions are (1) that Obama will do exactly what he now says he will do; and (2) that we may dimly discern the consequences likely to flow from his words and actions, based upon what we have seen happen in other decades and other generations.

My most hopeful moments derive from imagining that Obama, as president, will be dissuaded from acting as he now says that he will. In that way, God will once again take care of those who are drunk on statist illusions, and He will once again take care of the United States, despite itself. It is when I take Obama at his word that pessimism floods over my heart.

Published in National Review Online May 15, 2008

Remembering 1968

After 40 years, the year 1968 still brings to mind elite students, major universities, rebellion, the delegitimation of traditional authorities, cultural upheaval, the lifestyle revolution, marches for civil rights, anti-war protests, the drug culture, flower children, feminism, the search for personal meaning, and a last youthful gasp of the Marxist imagination. The year 1968 served a very spicy moral goulash--but where did all these seemingly disparate ingredients come from? And why did they emerge with such force at that particular time? From my own experience of the 1960s, I would like to describe some of the arguments and insights that led a portion of the students I taught at Stanford to cast their lots with the "youth movement." They did not seem especially predisposed to do so; many of these soon-to-be radicals were unusually industrious in fulfilling their classroom duties, good-hearted, well brought up, and eager to participate in community service. During the presidential election of 1964, 51 percent of Stanford undergraduates described themselves as conservatives, and the Stanford Daily endorsed Barry Goldwater.

So, from the first, their choices were freighted with ambivalence. A few took actions that destroyed their future prospects, even their lives; yet for others the Sixties were a liberating and humanizing experience that set them on paths they still find fulfilling today. To clarify the tangle of motives that led so many to join the youth movement, I want to propose a distinction between the movement as a whole (which, given enough time, went badly) and the widely varying personal journeys of some of its participants.

In the oft-derided cant phrases of the day, many in the 1960s sought to "seek their own identity" and "find themselves." Had these vacuous slogans been replaced with the Socratic dictum "the unexamined life is not worth living," this quest for identity might have received, not instant mockery, but some degree of respect. The quest was a starting point for millions of American youths, and understanding why the question "Who am I?" became especially insistent during the years 1964-68 is essential to understanding that turbulent era. Now as then, universities are doing everything they can to block that question from being raised, and to maintain a sort of institutional repression.

Early Stirrings

In August of 1965, my wife and I, expecting our first child, drove out to Palo Alto, Calif., where my first professional position was waiting. I would be an assistant professor in a new religion department, the first Catholic in that field in Stanford's distinguished history. During the three years we were there, approximately 300 students per quarter attended my classes (usually in one large lecture class and one small specialized class)--that is, almost a thousand students per year, a good proportion of the undergraduate student body.

The youth movement reached Stanford more slowly, and at first more humanistically and democratically, than it did at many other campuses, notably nearby Berkeley. It first made its presence known in the spring of 1966, when students elected the most radical student-body president in the school's history--David Harris, later the husband of the folk singer Joan Baez. Harris had campaigned as a "humanistic" radical, meeting with students in every campus dormitory and many fraternity and sorority houses for long, relaxed conversations and easygoing late-night arguments about the near future of the university and the nation. For some students, this process provided a superb self-education. For others, as happened often in those years, early probings drifted more and more into angry activism.

"Rap sessions" like these were a constant feature of the era, and as I wrote at the time, the student conversations of the 1960s were characterized by three basic weaknesses: Too little consciousness of the irony, tragedy, and irrepressible sinfulness of human affairs; an unrealistic appraisal of the proper role of power and self-interest in all human institutions; and a strong tendency toward rosy, gauzy hope in "progress," including utopian fantasies about "new" types of human societies. These three weaknesses combined to guarantee, sooner or later, tragic outcomes.

Still, among the constituent elements of irony and tragedy are unself-conscious good intentions and a false sense of personal innocence. These flaws are typical of the young, who so easily dwell unforgivingly on the faults, failings, and all-too-visible sins of their parents, their elders, and all the institutions that they have grown up in.

Most of my students at Stanford during 1965-68 had been born just after World War II, into the greatest period of prosperity the world had ever experienced. Their parents had grown up during the Depression, and their grandparents had memories of the First World War and the hard, rugged life of a predominantly rural America. The families of these students did everything possible to make the life of the generation of the 1960s more comfortable than their own had been. No previous generation in history had been as cosseted as the one that was coming of age in 1968.

No wonder my students felt so potent, so favored by the whole universe, and so singularly virtuous. Among other things, they were receiving the education that so many of their parents and grandparents had gone without. In the United States, a new university was founded, on average, every two weeks from 1945 to 1972. Never had so many twentysomethings been enrolled in college anywhere on earth. Never had there been so many young professors, well-trained but quite unseasoned.

The older generation had learned from the dictators of the two World Wars the dangers of authoritarianism, and they often warned their children against passive, uncritical obedience. In addition, for fear of appearing to be "dictators" in their own homes, many parents were deliberately lax in applying the sort of discipline under which, as youths, they had chafed.

Developments in the nation and the world reinforced an emerging sense of worldwide freedom and possibility. New hopes and horizons had dramatically emerged from the election of the young John F. Kennedy as president in 1960. Between 1962 and 1965, the most unchanging institution of the West, the Catholic Church, had launched at the Second Vatican Council a more open and searching "aggiornamento" than any other institution on earth. The civil-rights movement, after first drawing national attention in the mid-1950s, had continued to gain strength, and by 1964 had wrested a path-breaking Civil Rights Act from the government. In effect, the United States had convicted itself of hypocrisy and set out to rectify it, first legally and then in the practices of everyday life. One older European journalist told me at the time that the American reforms in civil rights during the 1960s were the deepest and most moving social transformation she had ever witnessed.

From these events, dominant mantras emerged. The first, with so many areas of life undergoing revolution, was "The times they are a-changin'." Everything seemed new and fresh; everything seemed promising; the force of change seemed unstoppable. Second: "Don't trust anyone over 30." The leading figures of their time were confessing the nation's long history of hypocrisy with respect to blacks. What other hypocrisies remained to be exposed? Third: "If you're not part of the solution, you're part of the problem." Students who had spent their summers registering voters and organizing protests in Mississippi and Alabama came back to their campuses much moved by the poverty, violence, and sheer meanness that they had encountered. They were surprised that many of their own professors--and university administrators--were not as alarmed as they were; their elders were preoccupied with expanding their universities to accommodate far higher numbers of students than ever before, and with elbowing their way into greater national prestige.

Roots of Rebellion

The academic profession reinforced this appearance of disengagement with its supine response to the misnamed Free Speech Movement, which began across the bay from Palo Alto at the University of California, Berkeley, in 1964. Many kinds of political activity, such as students promoting non-university events or organizing on behalf of campaigns, were prohibited on campus, but much of the controversy and student anger involved a nearby piece of land, at the intersection of Bancroft Way and Telegraph Avenue, that was owned by the university. When police attempted to enforce the ban there in October 1964, students responded with sit-ins and other protests that eventually disrupted the university in December.

The very idea of a university rests on the principle that professors have superior wisdom to convey to their far less knowledgeable students, yet the Berkeley faculty, intimidated by menacing students, caved in to the demonstrators' extortion. Any moral or intellectual standing that professors might have had in the eyes of students was dashed to the ground. Similar craven surrenders occurred at some 300 other campuses over the next few years.

Then came the shock of Vietnam, which thrust a bitter choice upon privileged young men. University students were entitled to defer military service while pursuing their education, but for juniors and seniors, graduation was scarily imminent. In 1968 some common occupational exemptions and graduate-school deferments were taken away, making more young men subject to the draft. Resistance to the war grew in bitterness, even though the active resisters still made up only a minority of university students.

The college generation of 1968 had been taught by their parents that they were special and, as they grew older, would excel in national leadership. They were often described as the best-educated young people ever. Many young men imagined themselves perishing in the jungles of distant Vietnam before the dreams of their parents had been fulfilled--an indescribable waste. A small but increasing number began to burn their draft cards in defiance, declaring themselves conscientious objectors (even though many could not claim that they belonged to any "peace church").

Another factor was the event said to have ignited "the Sixties," the Kennedy assassination, which was shocking at the time but whose aftermath may have had even greater effects. Most of my students at Stanford had been 15 or 16 years old on November 22, 1963, when the young and vigorous John F. Kennedy was shot and killed as he rode in an open convertible in Dallas, Texas. The assassin turned out to be a Communist who had lived for a time in Moscow, was married to a Russian woman, and just before November 22 had made contact with the Cuban embassy during a trip to Mexico City.

These revelations boggled the left-wing mind; it could not tolerate the idea. In response, liberals went searching for "conspiracies"--right-wing conspiracies. They ended up implicating the CIA, the FBI, President Lyndon Johnson, and even the Warren Commission in both the killing itself and the imagined cover-up. For a political movement whose aim was an ever larger and more comprehensive welfare state, it was radically incoherent to teach people to distrust the government. In this sense, the American Left set the stage for the return of the party of limited government to political power.

But the roots of 1968 stretch back even farther. I had seen signs back in 1960, when I began graduate studies in the philosophy of religion at Harvard. At that time, most universities hewed closely to an ideology that was publicly described as liberal, pragmatic, realistic, and deliberately concentrated on the "descriptive," rather than the normative. Propositions deemed to be "normative" were assigned to a realm beyond the bounds of science, pragmatism, and practicality.

Students and faculty began to rebel against the kind of "political science" that depended almost solely on statistical surveys and mathematical equations--a discipline whose guiding principle seemed to be, "Unless you can put a number on it, it doesn't exist." Some professors rejected this thin gruel and led a recovery of the classical tradition of "political philosophy." The main carriers of this corrective initiative were the students of Leo Strauss ("the Straussians"--Allan Bloom, Walter Berns, Harvey Mansfield, Harry Jaffa, and many others) at the University of Chicago. They tried to think philosophically and critically about fundamental political principles, shining the light of centuries of Western wisdom on the subject. An associated movement was the renaissance of contemporary Thomistic studies, as represented by the very good quarterly at Notre Dame, The Review of Politics.

Yet the spirit of "just the facts" remained ascendant, and many students in the 1960s felt themselves choking in a miasma of materialism, logic, and technological regimentation. At the same time as the human element dwindled in their curricula, ever-increasing enrollment made their universities even more impersonal. Face-to-face contact with administrators was minimal. The most visible administrative communication arrived on computer cards marked: "Do not bend, fold, mutilate, or spindle." No one seemed interested in who the students were, or what their fears were. Almost no one spoke to their consciences. They were, they felt, not being awakened to larger questions, but processed; treated not as persons, but as things.

Thus, the campus turmoil of the 1960s was not a reaction against professors and administrators who refused to change; it was a reaction against the dispiriting changes they had made. What looked like a student rebellion against tradition, history, and the humanistic wisdom of the past was actually an instinctual revolt against an arid professoriate given to impersonal, "objective," "descriptive" discourse of the sort found in directions for assembling office equipment. Students were often confused about where to find true wisdom and how to distinguish it from the passing academic fashions just beginning to appear, such as Paul de Man's "deconstructionism" and the nebulous concept of "postmodernism." When their classwork only deepened the confusion, they began to look elsewhere for answers.

The Movement Hits the Streets

The San Francisco Bay area was an early hot spot of the youth revolution; many of the ugliest trends of the 1960s began there. The first thing that struck me when I arrived from the East Coast in 1965 was that "the movement" seemed divided into two wings, which, in amusement, I thought of as "contemplative orders" and "activist orders." In the streets one could already see lots of long beards on men wearing loose-fitting clothes and sandals, many of them with a kind of dreamy, meditative look on their otherwise hard faces. Many of the women had long, stringy hair and plain black or dark-colored dresses cinctured at their waists. The scent of marijuana was in the air, and in cafés, people whiled away the long afternoons sipping sangria.

On one level everything seemed safe enough, but not far below the surface was a mood of anger and menace. These folks were not just dissenting and dropping out; they were learning to hate the bourgeoisie that had bred them and the Establishment that looked down upon them.

As the anti-war movement grew, "flower children" began handing out daisies, irises, and other cheap cuttings to passers-by. Then came the peace medallions on black cords around the neck, tie-dyed T-shirts, arm tattoos, and peace posters in the windows. (My friends still tease me about the photograph on the back of my book A Theology for Radical Politics, which shows me in Boston in 1967 with long sideburns, a turtleneck, and beads--perhaps the first such costume to appear in the Hub of the Universe. Those who saw it thought, not admiringly, that I had "gone California.") San Francisco, like other cities, saw burnings of draft cards in front of cheering, menacing crowds of young people. Becoming more common in 1968 and especially 1969 were large "peace marches," with plenty of noise on some occasions and a double-meaning silence on others--half flower-childish gesture, half "revolution" and "power to the people." On occasion these degenerated into clashes with police in riot gear, who sometimes fought back with tear gas.

Three important dynamos of energy developed out of this subculture. First were the concerted efforts at learning how to "organize," "mobilize," and stage "happenings" and "street theater" in order to "radicalize" new recruits. These efforts became steadily more thuggish and violent over the years, as organizers learned how intimidating the mere threat of violence could be, especially on college campuses led by "reasonable," peace-preferring, conflict-avoiding deans and presidents, mostly liberal Protestants. Confronted with loud expressions of anger and the possibility of mayhem, such rational liberals in positions of authority preferred negotiation, which inevitably meant surrender. This habit won them, not respect, but heightened contempt. "Up against the wall, m-----f----r!" became an acting-out of male bravado and aggression toward authorities who had previously made young men afraid.

Second was the rebirth of feminism. The first women to join the revolution were expected to let themselves be bedded with little ceremony, to act as secretaries at meetings where men presided, to cook, and otherwise to be subservient. At first, some women took well to the open eroticism, free love, and sexual exploration to which they were introduced. But soon anger set in, and many radical women began to lash out at "male chauvinist pigs," "troglodytes," and "oppressors." Like college presidents confronted by protesters, delicate left-wing males soon learned to give in to whatever outlandish demands were made upon them by feminists. In some cases, feminist "consciousness raising" led to experiments in lesbianism and the outright rejection of men.

Third was a flirtation with what we would now call terrorism, along with talk of open warfare against the "pigs"--local police and even (as at Kent State) the National Guard. Bombings of electrical towers, ROTC buildings, scientific laboratories, and other vulnerable targets began to occur--most shockingly, at the Capitol Building, the Pentagon, and a major New York City police station. For a while the radicals got away with their experiments in violence, and because of this, it took the young an unusually long time to understand that the City of Man, as St. Augustine pointed out long ago, is built on the power to put down insurrection and impose the peace. Such honest thinking about the civilizing uses of force had always been difficult for a liberal-Protestant nation to absorb, and the new radicals had to learn the hard way about their weakness in the face of determined state power. More than one writer observed at the time that radicals almost never tried violence against conservative authority figures, who were unashamed of the legitimate use of force. They saved their wrath for liberals--and sometimes their own parents (who had been so generous to them).

Second Thoughts and Aftereffects

The "morning after" of the 1960s began while night was still falling, with educations prematurely ruptured to the detriment of later careers, ugly breaks from parents and families, precipitately dissolved relationships with close friends, lovers, or spouses, imprisonments that broke spirits, and simply the "lostness" of having wandered away from one's old life without moral gyroscopes and traditional networks of meaning. This last was indeed an ironic end to the "search for meaning" that lay at the heart of so many rationalizations for joining the "revolution."

But there was also another rationalization, at least among the males. Since we might soon be dying in the jungles of Vietnam, they thought, why not take wild risks at home? Young men experimented with dangerous drugs, took reckless motorcycle rides, drank prodigiously, taunted officers of the law, made occasional forays into criminality (in protest against the evils of the regime, of course), and thus jeopardized any chance they may have had of a life regulated by self-government. A great many perished along the way, and even those who survived inflicted much heartbreak on those closest to them. "Existential decisions" was a term one heard often in those days. Young men had a sense of holding their life in their hands and casting a fateful die. The iconic movie of the time was Bonnie and Clyde.

On questions like these, the yawning gap in the education offered by the universities of that time amounted to an administrative and professorial betrayal. The excessive esteem that academic authorities lavished on "science" and "pragmatism" and "technocracy," combined with their forgetfulness of the deeper wisdom of the ancients, reflected post-war hubris. Their training and experience had left them utterly incompetent in making convincing moral arguments for the way they lived, conducted their affairs, and shaped the culture of the university. They simply were not accustomed to thinking in such terms. Against accusations from discontented students, they could not defend themselves even to their own satisfaction. That is why they gave in so easily at moments of confrontation.

It cannot be denied that some of the poisonous ideas that today dominate the Left have their origins in the 1960s. Moreover, the lockstep in which many on the left prefer to march, isolated from other points of view, has made them particularly vulnerable to unworkable ideas, and positively allergic to those discomfiting facts that would show that their fundamental narratives--especially those about economics, politics, and culture--are ill-matched to human nature and history.

In a delicious irony, this lack of realism in the cultural revolution of the 1960s led, by reaction, to the birth of what is now called in politics and economics "the center Right." It also led to a deepened appreciation for the wisdom of our ancestors, and for modes of philosophical and moral thinking that complement what we know from science. Science makes a marvelous servant of humanity, but a very poor master.

In short, the inadequacies of the energetic and once-promising 1960s have triggered a renewed pursuit of that well-ordered human City our forebears named Sophia, "wisdom." It is a City well aware of human sin and weakness, of tragedy and irony, and of the perennial case, Humility v. Hubris.

Published in National Review Online May 5, 2008

Now Indiana and North Carolina

As the endless Democratic campaign is sailing swiftly toward crucial contests in Indiana and North Carolina on Tuesday, May 6, the seas are high and the horizon keeps bobbing up and down. It is hard for anyone to see where we are. Ten days ago those doing the cold analysis of the numbers were writing that, mathematically, Senator Obama cannot lose. But then Obama’s clever, self-promoting pastor from Chicago, the Reverend Jeremiah Wright, took to national television for three successive appearances, and almost derailed the Senator’s campaign.

Cunningly and mischievously, Reverend Wright went on and on about his peculiar racist theories. He opined that whites and blacks by nature think with different parts of their brain, and learn in different ways, so that whites tend to excel in book knowledge, logic, and measured, reasoned music, whereas blacks tend toward subjective interactions, more passionate musical beats, unique rhythmic clapping, and on and on. The Reverend seemed to insist that black brains are constructed differently – but this is the kind of crackpot thinking that once led to separate, segregated schools, as well as to deep feelings of racial inferiority. His rant was horrible stuff.

Worse, Reverend Wright accused the American government of deliberately inventing AIDS to subdue the black population, of encouraging drug traffic to destroy inner-city blacks, and of practicing worldwide terrorism equivalent to that of Al Qaeda. He blamed America for September 11, 2001. His words have outraged a large proportion of Americans, and forced many on the left to disassociate themselves from his absurd claims.

Yet worse again, this intensely self-satisfied Reverend announced in front of cameras that Obama was now acting only from political motives in separating himself from his Pastor, leader, and friend of these past twenty years. Pastors, he implied, tell the truth; politicians must dissemble.

This charge cut Obama to the quick, for it destroyed in minutes the campaign image that Obama had been cultivating all through these long, hard months: that the young Senator is a man “above politics,” who “tells the truth” and transcends “the old, outmoded methods of politics.” Indeed, these claims are the Obama campaign. His campaign has been (and is) all about him and his superior being, which by the very accidents of his life transcends the “old divisions” between black and white, Republicans and Democrats, government and citizens.

There are virtually no differences of policy between Obama and Senator Clinton. Both are to the left of their party, he only a bit more so (on the Iraq war). There are no known examples of Senator Obama playing a significant bipartisan role in the Senate, “reaching across the aisle” as Washingtonians call it. Obama has been running a campaign based on his personal identity.

So now Senator Barack Obama, son of a Kenyan father who not long afterward fled from the family, so that Barack (soon called Barry) had to be brought up by his white mother and her family, has for twenty years been nourished, brought to Christ, taught, counseled, and treated as a beloved son by the same Reverend Wright. Suddenly he must reach an anguishing decision. One can easily imagine the intense ripping apart of his soul -- in public, unavoidably so, with everyone looking on.

There was nothing for Barack to do except break from Reverend Wright and his principles openly, on national television, with as much ashen-mouthed passion as he could bring himself to show.

Now, here reporters differ. Some wrote that Obama’s face as he delivered his remarks paled from the loss of blood in his cheeks, and wore the look of a man heart-sick, defeated, deflated. He had known, of course, that this moment was coming; he and Reverend Wright had spoken of it months ago. Now at last Obama did what he had to do -- he denounced Reverend Wright and his awful principles, lest he have to give up any hope of winning the presidency. He did not quite succeed in looking manly, for he was too inwardly wounded.

Other reporters – most of those in the national press, it seems – took this ritual of denunciation on its face. They wrote that Obama looked sincere and brave, and that he had done what he had to do, so now the campaign could move on. His wife said she was proud of his courage.

However, none of Obama’s famous friends stepped forward during those five dark days to stand with Obama – not Ted Kennedy, not Oprah the television talk-show hostess, not the Obama supporters among mayors of several large cities. For the last two days Obama stood all alone. Then, after his severing of relations with Pastor Wright, his supporters started coming back. Most of the press cheered for him. But the polls started trending downwards, some of them fairly sharply so. Hillary moved ahead of Barack by five points in Indiana, and was cutting into his lead in North Carolina.

How could the polls not move? Obama’s sudden public wrestling against Reverend Wright, the great father figure described in his autobiography, revealed Obama’s unresolved questions of maturation. It is not believable that a man like Reverend Wright could for twenty long years hide his core principles from his “son.” Something here is not truthful. And that suspicion is draining Obama’s campaign of the meaning he had poured into it.

Obama stands revealed as no white knight, no imperturbable and masterly hero. He is only a man. Besides, from now until the November election, on any given day, Reverend Wright may attack Obama’s credibility again, in response to Obama’s separation from him, and Obama’s denunciation of his core principles. That, too, would be human. Nonetheless, the numbers look as though Obama cannot be stopped.

The Democrats have a habit of falling in love with relatively unknown presidential candidates, whom they briefly idealize. McGovern, Carter, Dukakis, even the obscure Governor of Arkansas, Bill Clinton.

Beyond that, most of the Americans who make their living chattering in public – professors of social science and the humanities, journalists, television and Hollywood stars, artists, street activists – favor the Democratic Party, or even movements yet more to the left. Such persons tend to have an idealized, noble picture of their own professions – and themselves. They love leaders who appear to be ideal, heroic, moral figures.

Of the seven national television networks – ABC, CBS, NBC, PBS, CNN, MSNBC, and FOX – six are governed by the dreams and perceptions of the left. Virtually all the major big-city newspapers, at least on the East and the West Coasts, set the national standards for what should be honored and what ignored. Virtually all of them are in the Obama camp.

The great Protestant theologian Reinhold Niebuhr used to refer to Protestant liberals and secular saints as “the children of light,” who imagine themselves to be locked in a struggle against “the children of darkness.” (That is, Republicans, big corporations, and business elites). Niebuhr warned that this abiding tendency to sentimentalize reality, without irony and without tragedy, is for its own part bound to end in irony and tragedy. A kind of remorseless Greek Fate unrolls, irresistibly, before our very eyes, generation after generation.

Whatever the electoral arithmetic and the metaphysical undertow may be, the only real hope of Senator Hillary Clinton is that Senator Obama should stumble so badly that Democratic Party leaders will decide that his cause in November is hopeless. Since neither candidate can win enough votes at this point to qualify as the winner, these “super-delegates” will cast the deciding votes, after all the scheduled elections are completed, early in June. They dread having to deny Obama the election after his strong performance (until now), lest they cause black voters, one of the largest blocs in the Party, to abandon Senator Clinton in anger.

Moreover, one of the most surprising facets of this year’s primaries is how sharply many in the press and the Democratic party have turned against Hillary. Although some even try to pretend that she no longer exists, she keeps fighting and fighting.

Should Senator Clinton happen to win a decisive victory in Indiana (a state that borders on Senator Obama’s Illinois), and reduce Obama’s large early margin in North Carolina (whose huge black population tilts the Democratic Party in his direction), she will have one more strong argument in favor of her candidacy. And Barack’s campaign will emerge even more battered.

But the opposite may also happen. If Barack defeats Hillary decisively in North Carolina, and also ekes out a win in Indiana, Hillary’s campaign will be severely undercut.

But who knows what ironic turns this race will take on May 6? Both candidates even now look bruised, battered -- and not a little puzzled about what on earth has hit them.

Published in Liberal May 5, 2008