Our Better Angels: Reagan was a re-Founding Father

Editor's Note: Below is the text of Michael Novak's speech at the celebration of President Reagan's 88th birthday, delivered on February 5, 1999, before the Ronald Reagan Presidential Foundation, in Simi Valley, Calif.

***

This has been the best birthday celebration I have ever attended. The stories, the memories, and the visions for the future remaining to be built have been more than touching — deeply stirring.

Aristotle said that what binds a political community together is a form of love — amicitia, friendship — a friendship among citizens. What I have observed this week among Ronald Reagan's closest associates is something more than admiration for this man, or loyalty — it is a kind of love. The way we all tell stories of him shows that. We love the guy.

One of my favorites involves that awful day when he was almost taken from us. It was the night of the NCAA basketball final game, to be played in Philadelphia. While we were horror-struck, waiting minute by minute for news of his condition, television said there was discussion about whether the championship would be canceled; the president was undergoing an operation to try to save his life. Early in the evening a report came over the television from the hospital. Asked how he was feeling, the president — so very near to death — flashed that mischievous look in his eyes and said: "On the whole, I'd rather be in Philadelphia."

The game went on that night in Philadelphia. The name of that city means "love of brothers." It is the most distinctive name of America — our whole country should have been called "Philadelphia"!

AN 18TH-CENTURY THINKER

For sure, these last three days prove again that we here have been — to cite Shakespeare — "a band of brothers. We few. We happy few." And hundreds of millions with us.

One other story that I like was told to Karen and me by Clare Boothe Luce at dinner in our home. "One thing no one has noticed," Mrs. Luce said, "is where the president gets his equanimity in the face of criticism, especially from the media. That's an occupational advantage he gained from Hollywood. Early on, an actor learns the difference between the box office and the critics. If you have box office, it's astonishing how kind you can be to critics."

Larry Arnn yesterday asked us to think of President Reagan in the context of Washington, Jefferson, and Lincoln. The president himself once said (I paraphrase): "I've been accused by my critics of having 19th-century ideas. They're wrong. I have 18th-century ideas. I learned them from our founders. I believe in them."

The best way to understand Ronald Reagan, I believe, is to steep yourself in America's heritage. Again and again and again, he sounds like America speaking — like John Adams, or James Madison.

Let me test you on this. Do you remember the president saying these words, and on what occasion?

Entertaining a due sense of our equal right to the use of our own faculties, to the acquisitions of our own industry, to honor and confidence from our fellow citizens, resulting not from birth, but from our actions and their sense of them; enlightened by a benign religion, professed, indeed, and practiced in various forms, yet all of them inculcating honesty, truth, temperance, gratitude, and the love of man; acknowledging and adoring an overruling Providence ... With all these blessings, what more is necessary to make us a happy and a prosperous people? Still one thing more, fellow citizens — a wise and frugal Government, which shall restrain men from injuring one another, shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned. This is the sum of good government, and this is necessary to close the circle of our felicities.

These are the words of Thomas Jefferson, First Inaugural. But don't they sound like Ronald Reagan?

THE REAGAN RE-VOLVERE

Compare Jefferson's words to Ronald Reagan's own First Inaugural:

...We are a nation that has a government — not the other way around. And this makes us special among the nations of the earth. Our government has no power except that granted it by the people. It is time to check and reverse the growth of government, which shows signs of having grown beyond the consent of the governed... ...[I]t's not my intention to do away with government. It is rather to make it work — work with us, not over us; to stand by our side, not ride on our back....

If we look to the answer as to why for so many years we achieved so much, prospered as no other people on earth, it was because here in this land we unleashed the energy and individual genius of man to a greater extent than has ever been done before.... [W]ith all the creative energy at our command, let us begin an era of national renewal.

Back in 1988, I had the occasion to call these two passages to President Reagan's attention. I said the following words:

Mr. President, taking Thomas Jefferson's words as your own, you made "a new beginning," and not only for the United States. Many nations are now imitating your policies. As the main source of hope for the world's poor, they too are turning from government activists to economic activists, that is, to all the people.

Historians tell us that what our framers meant by "revolution" was a turning back to founding principles — in Latin, a re-volvere — a going back to true beginnings.

Was there a Reagan Revolution? Mr. President, it was not exactly a "Reagan" revolution. It was "the American Revolution," now well into its third century, reestablished by you upon our founding principles. As the founders humbly dared to hope, Mr. President, this American Revolution heralded "a new order" of basic rights for all humanity and for all the ages. This novus ordo seclorum was conceived in liberty and dedicated to the proposition that every man and every woman everywhere is created equal. All around the world today — even in Mr. Gorbachev's USSR, if glacially — whole peoples are turning toward these shining principles.

May this revolution last forever, Mr. President, and may your name be linked with its renewal, at this time, in this age, for as many generations yet to come as God sees fit to bless America. For beginning anew the American Revolution, Mr. President, the revolution of natural liberty, the revolution that belongs to all humanity, we thank you.

It gave me great pleasure on that occasion to see that there were tears in his eyes.

THE PILLARS OF HIS LEGACY

That is the first pillar of Ronald Reagan's legacy for the future: A love for the American creed, a reverence for her heritage, an awe before the work that Providence has wrought on these shores, among this people. It does not belong to us, this work; it is God's work, this nation's history.

John Adams, our second president, noted two further foundational pillars of the American adventure, and hence of Ronald Reagan's great adventure. He called one of these "the spirit of liberty"; the other, "the architecture of government."

No need to tell this audience: Ronald Reagan loved liberty. He believed, with Thomas Jefferson, that "the God who made us made us free at the same time." Our Creator made every woman and every man not only free, but responsible. And each of us is unique, different from every other. The name of each of us is written in the mind of God, the Good Book tells us — God knows us one by one, calls us by name. Individual liberty is, then, the second pillar of the Reagan legacy. This is not merely given to us, however. Each of us must awaken to it, live up to it, not allow our inner liberty to slumber. (People can be free in principle, but asleep to it.) Isn't it true Ronald Reagan called each of us in this room — all Americans — to be better, freer, than we might have been? Those of us who remember the 1970s know that it was so.

The third pillar — or at least its roundabout name — is "the architecture of liberty." Better: opportunity. John Adams said that republics require a higher level of virtue than monarchies. But republics lack an aristocracy to remind them what ambition is and what nobility is like. So they must make up for this by inciting individuals everywhere with opportunity to better their condition, to discipline themselves, to learn sound habits.

Worldwide, opportunity today means a new approach through universal capital ownership. During the 20th century, nations pursued a receding mirage of income maintenance. In the 21st century, we should switch the emphasis to capital accumulation for all families. For example, if Social Security is privately owned, it will become a capital fund inheritable by one's descendants — instead of dissolving with death, as it does today. In the same way, individual medical accounts can become inheritable assets, so that unused portions stay in the family. The opportunity to own capital should be made universal. Estate taxes should be abolished. Worldwide, universal capital ownership is the meaning of opportunity today.

Opportunity is the greatest teacher of virtue nature affords — when rules are just, rewards are fair, and government does not distort by favoritism the open field of opportunity.

Opportunity to discover and to use our God-given potential is the third pillar.

The fourth pillar is the most challenging of all.

The fourth pillar is to spread democracy around the world. That is a far more difficult and complicated job than it may seem.

Now it so happens that on or about January 21, 1981, the newly Honorable Jeane Kirkpatrick approached me and said that President Reagan needed a new Ambassador in Geneva for the Human Rights Commission. I said, "When?" She said, "January 31." "Can I have until tomorrow to decide?" "No, but go ahead and take it."

Thus it happened that I found myself in Geneva ten days later, thoroughly unprepared in U.N. jargon or institutional tradition; I was not even a lawyer. I was the first living Reaganaut to arrive in Europe. My fellow ambassadors regarded me with intense curiosity — looked to see if I was wearing sidearms and cowboy boots.

On the airplane over I read my briefing books — huge, prepared by the Carter-administration team. But I was able to predict to my fellow ambassadors what Ronald Reagan expected of me and, in general, what my future voting instructions would be.

I knew Jeane Kirkpatrick, and I believed I knew what Ronald Reagan expected of me. Here is what he said in his inaugural address, quoting Dr. Joseph Warren of Massachusetts, who fought at Lexington and died on Bunker Hill in 1775:

Our country is in danger, but not to be despaired of... On you depend the fortunes of America. You are to decide the important questions upon which rest the happiness and the liberty of millions yet unborn. Act worthy of yourselves.

Isn't that 18th-century? Isn't that Ronald Reagan? That is why he belongs with Washington and Jefferson and Lincoln — on Mt. Rushmore. He renewed the American Revolution.

GREAT MEN'S MONUMENTS

I knew that the United States under Ronald Reagan held to three principles. First, we stand for the human rights of all (that is why we came to America). Second, we know that rights do not exist as words on paper ("parchment barriers"), but in the habits and institutions of living peoples; therefore, we concentrate on institutions. Third, the key institution is democracy — not just majority rule, but separated powers, checks and balances, protections of minorities, the rule of law, free speech and free association — a whole complicated set of ideas, including such difficult ideas as "coalitions" and "compromise." (In many languages there is not even a word for "compromise" in the good sense.)

Democracy is a very long school. Looking into the future, we need a much more detailed understanding of the complex set of habits and institutions needed to make democracy more than a word. In Russia, today. In China. In the Islamic world.

One last point. Ask yourself, why did our founders say that "the price of liberty is everlasting vigilance"? Because liberty is the most precarious of all regimes. People can freely give it away, allow it to be corrupted, squander it. That is why it can only be saved by re-volvere — turning back to first principles. Our nation needs constant rebirths of freedom — constant Reagan Revolutions.

Clare Boothe Luce said that every great man gets a single line on his monument — "Father of his country," "Freed the slaves." I don't know about that for Ronald Reagan. But in the memorial I will build in my own mind, the words will always be:

"Ronald Reagan — He defeated Communism — He led a new American Revolution — He awoke the better angels of our nature." And if I had to reduce it to one single line, I would write it in marble in my mind:

He awoke the better angels of our nature.

Published by National Review Online, June 7, 2004

An Authentic Modernity

The Ethics of Authenticity. By Charles Taylor. Harvard University Press. 142 pp. $17.95.

***

To grow up in Canada is to inherit a privileged position for understanding modernity—sufficiently distant from that hurtling spaceship of "the republic to our south," while retaining (perhaps from connections to nature, to the history of France, and to Catholicism) a sharp, intuitive sense of what it once was like to be "premodern." A Canadian can more easily remain detached from capitalism, the spirit of commerce, and the fury of markets, sheltered as he somewhat is by the residual corporatism of medieval Europe and modern socialism. Thus a Canadian tends to associate the negative aspects of modernity with capitalism, and its more positive sides with some inarticulate communitarian sense that is not capitalist.

The Canadian thinker Charles Taylor, in any case, is gaining status as the world's premier philosopher of modernity, the most judicious, the one who makes the most apt and discerning distinctions, the one who best sees both modernity's grandeur and its misery. I do not know whether he is what is called a "practicing" Catholic, or even what his spiritual disposition is towards Catholicism. He seems to know in his bones what it is like to have been a premodern Catholic, in love with the ancient philosophy of the Greeks, the Romans, schooled in the history of medieval philosophy, and well-informed about the twentieth-century French Catholic ressourcement. He retains a viewpoint larger than modernity (and thus able to judge it) while at the same time wholly committed to modernity (as one whose vocation it is to recognize in it a gift—and a challenge—from God). Analogously, the stance toward democracy in America assumed by Alexis de Tocqueville was not that of an English Protestant of bourgeois upbringing, but that of a man shaped by the history of Catholic and aristocratic France. To be in but not of the cultural world of modernity is to have a comparative advantage.

The original Canadian edition of this book, published during Taylor's sixtieth year, was entitled The Malaise of Modernity. Perhaps because the connotations of the word "malaise" are different in French and in English, perhaps because the word is virtually unusable in this context in the United States so soon after Jimmy Carter, and surely because Taylor expressly frames his book as a continuation of the inquiry nobly undertaken by Lionel Trilling in his Norton Lectures at Harvard under the title Sincerity and Authenticity, the American edition has been entitled The Ethics of Authenticity. But "ethics" is not quite the right word; in a way, "ethos" is closer, although not quite satisfactory either. What is really at stake is a fair judgment on modernity, an assessment, a fine discrimination of both its nobility and ethical allure, on the one hand, and its self-destructive, self-flattening, and demeaning tendencies, on the other.

The book consists of ten chapters of about twelve pages each, and although its argument is at times subtle, allusive, and demanding of full and total concentration, it also marches briskly along. The author inserts frequent guideposts to where he has been and where he is going.

One of Taylor's contributions is to distinguish clearly among three quite different strands of experience—individualism, instrumental reason, and subjectivism—intending to show how each of these contains both destructive and creative possibilities. Actually he has time in so brief a space to analyze only one of these strands in some detail, and asks us to use this analysis as a model for completing the other two ourselves. He points out that each of these three strands has some aspects that attract us and others that repel us:

"What I am suggesting is a position distinct from both boosters and knockers of contemporary culture. Unlike the boosters, I do not believe that everything is as it should be in this culture. Here I tend to agree with the knockers. But unlike them, I think that authenticity should be taken seriously as a moral ideal. I differ also from the various middle positions, which hold that there are some good things in this culture (like greater freedom for the individual), but that these come at the expense of certain dangers (like a weakening of the sense of citizenship), so that one's best policy is to find the ideal point of trade-off between advantages and costs.

"The picture I am offering is rather that of an ideal that has degraded but that is very worthwhile in itself, and indeed, I would like to say, unrepudiable by moderns. So what we need is neither root-and-branch condemnation nor uncritical praise; and not a carefully balanced trade-off. What we need is a work of retrieval, through which this ideal can help us restore our practice.

"To go along with this, you have to believe three things, all controversial: (1) that authenticity is a valid idea; (2) that you can argue in reason about ideals and about the conformity of practices to these ideals; and (3) that these arguments can make a difference."

How to fashion such a position is, of course, a work of reason. Here is Taylor's enduring contribution. His appeal to reasoned judgment is traditional enough, but to make this appeal against the howling winds of relativism so characteristic of modernity is to do the traditional thing in a new and "authentic" way. As Trilling has pointed out, the ancients knew very well the value of sincerity (sine cera = "true marble without addition of wax," i.e., honest work). In their way, the ancients recognized the difference between learning ethical rules by rote and truly appropriating them—making them one's own—in the concrete struggles of the agora and the battlefield. But in making things "their own" they did not have to do so in loneliness, on their own; the rules were publicly agreed upon. The full force of what we today mean by "authenticity" was, therefore, unknown before the modern period. But recognizing many hints, portents, anticipations, and incompletely self-conscious foreshadowings of our predicament even among the ancients, one hates to be apodictic about this.

To repeat, Taylor by his own lights needs to show that one can offer telling reasons for one's discriminations, and he does so with a brilliant maneuver. One thing modernity certainly requires, he points out, is an awareness of personal identity different from that of all others, an acute self-consciousness. Very well, then, how can one answer the question, "Who am I?" without searching through the stream of memory in order to select out those items that one deems "significant" to one's identity? But how can one do this without appealing to various reasons for declaring one thing "significant," and another not? In this way, the question "Who am I?" is a question of, by, and for reason. You can go ahead and just be, if you want to, but the moment you rise to the fully human activity of "living an examined life," you must invoke the rules and standards of reasoned judgment.

Taylor next shows that this effort at self-examination is always, and must be, dialogic: we learn how to understand ourselves through conversation—through being variously perceived, understood, and judged by others, and in turn learning how to perceive, understand, and judge in our own right.

A book of 120 pages can hardly do justice to the complexity of the matters being discussed, and Taylor could scarcely have succeeded if he had not in Sources of the Self (1989) earned his way through scores of other useful distinctions. Unlike Bernard Lonergan or Alasdair MacIntyre, Taylor cannot accurately be described as an Aristotelian or a Thomist; but it would also be quite wrong to overlook the ways in which he puts Aristotelian and Thomistic distinctions to work for him. Like Lonergan and MacIntyre, he understands the importance of keeping modern consciousness open to critical reason, to the eros of the pursuit of an accurate and true grasp of reality, to public claims of beauty and justice, and even to God.

Taylor draws easily and confidently on literature and art to display the contours and hidden turns of modern consciousness—oddly enough, rather like Richard Rorty. But he has a deeper and richer philosophical mind than Rorty's, not so "merely" modern, and not so limited in its appreciation of what came before the modern.

In a certain sense, moreover, Taylor is working in disguise. The followers of the philosopher Leo Strauss are fond of saying that John Locke was not as religious as he seemed; rather, he employed religious language in order to persuade religious hearers. It seems to me that Taylor is often doing the reverse. He writes as if he were less religious, and less traditional, than he is. While convincing us that he is authentically modern, and on the whole happy about that (although rightly worried), he never quite gives his whole heart, mind, and soul to modernity. That is the way it must be with ethics, even regarding authenticity. Let me put this another way. Taylor is actually trying to reach, as best he can, the truth about modernity, and to do so in a wholly modern way. He is subverting modernity from within. He sees both its dangers and its true possibilities. He recovers it for reason. His is, then, as promised, a work of retrieval.

Published in First Things May 26, 2004

Copyright (c) 1993 First Things 33 (May 1993): 40-42.

Economics as Humanism

For more than a century now economics has been advanced and practiced as a science, on the model of physics and mathematics. It was not always so. From Adam Smith’s Inquiry into the Nature and Causes of the Wealth of Nations in 1776 until well after the publication of John Stuart Mill’s Principles of Political Economy in 1848, economics was viewed as a branch of moral philosophy astonishingly underdeveloped by earlier philosophers. It seems hardly possible, yet it is true, that before the time of Adam Smith no classic author—not Aristotle, not Aquinas, not Bacon nor Descartes—had asked about the cause of the wealth of nations in any sustained and fruitful way. Such an inquiry may well have been of great social utility, had it been successfully pursued in earlier, poorer centuries. But the problems of political order and the rule of law were of such importance—neither person nor property being safe from marauders, brigands, and feuding princes, whether in Europe or in other places on the planet—that the development of economics required the prior development of politics and law.

During our own century, a school of economics much disdained by the leaders and the general run of professionals in the field (who were more and more attracted to the scientific model, and particularly to the strengths and beauties of mathematics) has restored economics as a field worthy of investigation by moral philosophy. The school is known as the Austrian School, the school of "classical liberals" or, in F. A. Hayek’s preferred description, "Whigs." Let me state the accomplishment of these Whigs starkly: As a result of the inquiries of the Austrian School, it has become clear that economics is at least as much a branch of moral philosophy and the liberal arts as it is a science.

This result was the fruit of three investigative strategies favored by the Austrian School. The first strategy was to attend to the subject in economic activities as well as the "objective" factors of production. "Why did economists fail to recognize that incentives remain relevant in all choice settings?" asked Nobel Prize-winner James M. Buchanan in 1991. "Why did so many economists overlook the psychology of value, which locates evaluation in persons, not in goods? . . . Why was there a near total failure to incorporate the creative potential of human choice in models of human interaction?" Taking advantage of cross-cultural studies of the work ethic, social trust, individual initiative, willingness to risk, patterns of cooperation, and other moral habits—together with studies in decision and game theory on the dilemmas that acting subjects typically face—the Whig economists have been able to focus attention on incentives, values, information, and choice, both private and public, including activities of deliberation, reflection, and selection.

The second of the Austrian strategies was to inquire into the concept of human action. The idea was to deepen our understanding both of economic action and of its relationships with the other sorts of human actions. Actions begin in choice, and thus Ludwig von Mises opens his classic work Human Action: A Treatise on Economics: "Choosing determines all human decisions. In making his choice man chooses not only between various material things and services. All human values are offered for option." But humans not only act, they tend to act in patterns—in economic actions as well as political, religious, and cultural—and the Whig inquiry involved at least rudimentary inquiries not only into atomic human action considered in isolation, but also into characteristic actions or habits or virtues, and thus ultimately into a theory of human character.

The third Austrian strategy was to isolate and highlight the efficient cause of economic activity, the dynamic factor in economics—the habit of enterprise. The source of creativity, invention, even revolution in the way economic activities are carried out, this habit is the engine of change in economic development. Consider the recent experience of Central Europe: Some countries tried to move from socialism to capitalism by abolishing price controls, some began to respect and protect rights to private property, and some even began to permit the private pursuit and accumulation of profit. But even all these together no more constitute an active, capitalist economy than dry wood and air constitute a fire. Socialism inculcates in its people a debilitating passivity, and a formerly socialist people might well have waited for the state to do something else, without doing anything themselves. Capitalism did not properly begin until acting subjects looked around, noticed what could be done, and seized the initiative. In Poland, for example, half a million new small businesses were begun in the first six months after the Revolution of 1989. That is what made the transformation real.

These three Austrian strategies—attending to the human subject, investigating the sources of human actions, and emphasizing the habit of enterprise—have led in the last thirty years to a new focus on "human capital." The term "human capital" calls attention to acts of insight such as the entrepreneur noticing significant points that others fail to see: it thus stresses intellectual skills. But while many people have bright ideas, only some of them have the other qualities necessary for entrepreneurship—the moral qualities, such as boldness, leadership, know-how, tolerance for risk, sound practical judgment, executive skills, the ability to inspire trust in others, and realism. Human capital, even taking into account only matters of economic significance, is a concept of broad moral range. In recent years, in fact, the most interesting developments in the field of economics have come with the new attention paid to moral factors in economic progress. For some generations, so long as traditional Jewish and Christian moral values held sway in the West, such moral factors could operate as silent partners in economic analysis, being everywhere taken for granted. Their current absence has brought to consciousness their earlier unappreciated presence, as economists have rediscovered with a vengeance the moral dimensions of human capital in both cultural and personal contexts.

Two or three decades ago, it was frequently remarked that the systems described as "capitalist" and the systems described as "socialist" were asymmetrical, for socialism named a unitary system in which one set of leaders made all the key economic, political, and moral decisions, while capitalism was the name of an economic system only, capable of being combined with any number of political and moral systems. A man might be willing to die a romantic death defending democracy, but no one is willing to die for an economic system. That would be a confusion of means and ends—and, anyway, there isn’t much romantic about capitalism. So it was said.

The truth in the aphorism is that the weakness of socialism lay in its dangerous concentration of power—opening up enormous possibilities for the abuse of power to which many socialist governments succumbed, certainly, but also stripping human capital from private citizens. Pope John Paul II has written that the fatal flaw in socialist anthropology was its atheism, but he had in mind a particular kind of atheism: the atheism that sees man as a flat creature of matter and the will-to-power only, without spirit or soul, and ultimately unfree at his core. Even without theism, many Western classical liberals had an image of human beings as free and self-determining, with all individuals living out a story of weighty moral significance not only for their personal destiny but for the culture as a whole. In short, the ultimate drama in economics is acted out in the arena of human capital.

This humanistic turn in economics, first made by the great Austrian economists of our century, seems to have gone largely unobserved outside of the field of economics, even by humanists. But if economics is not only a science, if it is also a way of looking at reality and a way of thinking (a fact suggested by recent economists’ success at borrowing insights and methods from philosophy, law, anthropology, psychology, religion, and even art), then modern economics offers enormous resources for future generations of thinkers—and the possibilities for a new synthesis are immense.

In this light, there seems to be emerging in economics something like a universal science, a science of humans qua humans, in all our variety but also in certain invariant relations to human experience. Every human being on earth is an acting subject, capable of reflection and choice, a spirited animal capable of activities and a range of consciousness no other animal matches, aware of both universal community and unique personal meaning, faced with scarcity and sensing the impulse to inquire, create, trade and barter, and better our condition.

In the twenty-first century, economics has a great deal to teach us, and much of it complementary to the wisdom we have learned down through history. It is the vocation of economics to help us to be better women and men; to make better choices; to see more clearly what our alternatives are, and their comparative costs and advantages; to invest shrewdly in our fellows and in ourselves; and to use our freedom more advantageously and wisely. Economics is a noble vocation. It is also, I am arguing, a humanistic vocation.

Published in First Things May 26, 2004

Books in Review: 'The Wealth and Poverty of Nations'

The Wealth and Poverty of Nations: Why Some Are So Rich and Some So Poor. By David S. Landes. Norton. 650 pp. $30.

***

The irony of this major study of economic development is that its author writes as a complacent secularist and yet his fundamental thesis is theological. One can see this by comparing it to a rival study, in some ways its superior in clarity and theoretical cogency, How the West Grew Rich: The Economic Transformation of the Industrial World (1986). In that book, Nathan Rosenberg and L. E. Birdzell, Jr. stress institutional relationships, such as those between political and economic structures, and between science and markets. But David Landes, professor emeritus of history and economics at Harvard, unabashedly stresses culture, especially religion, and in particular, the Judaism that lies behind Christianity. In his final summary pages we read, for example:

"If we learn anything from the history of economic development, it is that culture makes all the difference. (Max Weber was right on.) Witness the enterprise of expatriate minorities—the Chinese in East and Southeast Asia, Indians in East Africa, Lebanese in West Africa, Jews and Calvinists throughout much of Europe, and on and on. Yet culture, in the sense of the inner values and attitudes that guide a population, frightens scholars."

And what precisely do the "inner values and attitudes" shaped by religion do? Landes quotes approvingly the view of the nineteenth-century Argentine champion of freedom, Alberdi, who called the religion of the English, the Germans, the Swiss, and the North Americans "the agent that makes them what they are." The center of culture, so to speak, is cult; men aspire to what they worship.

Europe’s buoyant political and economic dynamism required three specific theological breakthroughs, Landes thinks: the Judeo-Christian respect for manual labor, the Judeo-Christian subordination of nature to man, and the Judeo-Christian sense of linear time. He emphasizes elsewhere how the authority of God, conscience, and church limits the authority of secular state claims and thus creates space for liberty.

Landes believes that European success required more than Judeo-Christian theology, though: "In the last analysis, however, I would stress the market. Enterprise was free in Europe. Innovation worked and paid, and rulers and vested interests were limited in their ability to prevent or discourage innovation." Culture is crucial, but alone it does not suffice.

Landes organizes his text into twenty-nine chapters, covering the whole of human history and every part of the world. These chapters are crowded with vignettes, anecdotes, delicious quotes from ancient chronicles and journals, shrewd distinctions, and reasons for and against various arguments put forward by economic historians of past and present. His style is unusually lively for an academic, and its tone is rather more like that of journalism of a high order than of systematic scholarship. His method is eclectic and commonsensical rather than heavily theoretical or ideological.

It is not as though Landes does not have an ideology of his own; about that, more later. But no other historian recounts quite so large a store of anecdotes, concrete empirical materials, and fragments of theory—just enough theory to be the skeleton for his vast range of concrete stories. From Landes, one gains a new sense of how steel was first made; the technological grandeur and the limitations of the great Chinese empires of hundreds of years ago; the tragedy of the intellectual and practical decline of Islam; the pride and self-enclosed world of Spain at the height of its power; and countless fascinating stories of craftsmen, organizers of great enterprises, and trades and barters of all types. It is as if all of economic history had come to a great world’s fair, and one could visit it, pavilion after pavilion, being inspired with wonder and a sense of intellectual delight. And the theme of all the exhibits is unified: What has worked, and why; and what has not.

Landes provides a great deal of evidence that Jewish-Christian Europe cultivated discovery and novelty as did no other culture. He judges that the years 1000-1500 in Europe were more inventive than any era of the preceding 4,000 years. He lists some of the crucial inventions of that era: water mills and other mechanical devices; the mechanical clock—the first digital, not analog, device; the eyeglass, greatly heightening the sense of precision and the possibilities of miniaturization; printing; gunpowder—used, not as the Chinese used it, for incendiary display but for the projection of force.

This is a book to be taken seriously, especially by non-economists. Anyone who wants to get a down-to-earth economic education enriched with endless examples will find in this text an unparalleled opportunity. Nonetheless, I see four flaws in it.

First, there is the problem of the author’s ideology, lightly disguised in his last paragraph: "We must cultivate a skeptical faith, avoid dogma, listen and watch well, try to clarify and define ends, the better to choose means." It would be naive to believe from this that Landes is without an ideology of his own. Even he admits to holding strong opinions. He is, for example, disturbingly anti-Catholic. He also takes pokes as often as he can at "classical economists," free-marketeers, and partisans of laissez-faire. His arguments reveal the principles of the sort of mixed-regime liberalism one has come to expect from Cambridge, Massachusetts, closer in his case perhaps to Paul Samuelson than to John Kenneth Galbraith.

Second, these principles prevent him from paying attention to a rival vision of economic history: that of the "classical liberals" or "Austrian school." Among the writers on economic history whom he entirely ignores are Schumpeter, Mises, Hayek, Viner, Rothbard, and Kirzner. Further, on the medieval period to which Landes pays such surprising and welcome attention, and precisely on Max Weber, he neglects to treat Randall Collins’ key volume, Weberian Sociological Theory (1986). By not paying attention to the work of various students and critics of Weber, Landes falls into the trap of never being very clear about what Weber actually said, and which precise parts of his hypothesis have been overturned or revised in the nearly one hundred years since he wrote.

Weber, for instance, is not very helpful in explaining differences in the performance of Protestant groups: why so few Calvinists have been found who actually taught what he imputes to them; why the economic performance of some Calvinist groups was long retarded; and, as Randall Collins points out, why Weber did not see that all the conditions for the rise of the spirit of capitalism that he attributes to the eighteenth century had already been met in the twelfth, as the huge tide of economic innovations and great social transformations at that time suggests.

Collins stresses the following points of Weberian theory: the specialized economic pursuits of various orders of monks; the availability of appeal to international authority in Rome regarding disputes among competing jurisdictions (emperors, kings, barons, prelates, abbots, orders, merchants, guilds, confraternities, etc.); the total economic dedication of tens of thousands of skilled celibate laborers; systems of accounts; sustained investment and continuous accumulation of capital; emphasis upon innovation and efficiency; the clearing of land and the use of mechanical power; the growth of new industries, from mining and metallurgy to machine-made textiles; and the building of libraries and the cultivation of research both practical and theoretical. All this long precedes the birth of Calvinism.

The third flaw is Landes’ "cultivated skeptical faith," which seems to be insufficiently skeptical about secularism, for while he invokes religion, he never takes it seriously enough to study it, even a little. For Landes "the invention of invention" is the crucial economic dynamic. He senses that its origin is religious, and he himself notices (without adjusting his theoretical framework) that its power is manifest well before the Reformation. But he never allows his skepticism to question secularism. Had he done so, he might have discovered, as Daniel Boorstin did in The Creators (1993), that the Jewish/Christian conviction that all people are made in the image of their Creator, and called to be like Him, has been a creative force unique in the entire modern world.

Landes might then have seen that Max Weber’s preoccupation with "hard work," a "secular asceticism," and the "logic" of bureaucratic development missed the bullseye. The truly dynamic factor in economics is creativity, serendipity, innovation, and the act of enterprise. When Weber wrote, the number of democratic republics in the world could be counted on one hand, and so he also failed to see how such republics (i.e., under limited government, as well as government by the people) interact with and alter the economic order. In general, Landes is disappointingly weak on many theoretical links that one would like to understand as clearly as possible: between markets and republics; faith and inquiry; hope and enterprise; habits and practices; institutions and laws.

Finally, in a footnote in his final chapter, Landes suggests that people turn to religious faith out of weakness, seeking comfort, whereas "science and reason are tough companions." He is quite wrong about religious faith; it is often a far tougher companion than science and reason. These, by contrast, often seem to offer a sure circle of comfort.

One wishes that in his wide reading Landes had come to detect that stubborn faith in a divine and purposeful universe whose springs run somewhat deeper than science and reason alone, and prevent them from floating on air. Such a faith, rather than the comfort-seeking religious faith he thinks he finds, would seem to be consistent with the thesis Landes already holds: that Jews and Christians have felt a deep and persistent obligation to be faithful to reason and science, come what may—and thus the obligation to build the civilization whose centuries of nooks and crannies Landes has observed teeming with joyful vitality. Landes should be more surprised by this joy. It is an important datum.

Published in First Things May 26, 2004

Controversial Engagements

On March 19, 1998, the young social historian Eugene McCarraher delivered a portion of his doctoral thesis as a lecture at the Cushwa Center of the University of Notre Dame. His subject was Michael Novak, "The Technopolitan Catholic." Though the lecture was highly critical of Novak’s work, especially after his "rightward turn," it also stressed the continuities in his thought. The Cushwa Center invited Novak to respond the following autumn, and on October 6, 1998, he delivered an account of his career from which the following article was adapted.

— The Editors (of First Things)

***

I was born in Johnstown, Pennsylvania, in 1933, and lived briefly in two other western Pennsylvania towns, (Jimmy Stewart’s) Indiana and (Andy Warhol’s) McKeesport. At fourteen, I entered the Little Seminary on the campus of Notre Dame University for my high school years, graduating in 1951. From there I went to the novitiate of the Fathers of the Holy Cross in North Dartmouth, Massachusetts, took simple vows, completed my undergraduate degree in philosophy and English literature, and was sent to Rome for theological studies. After two happy years I nonetheless began to believe that my vocation was as a layman. My superiors advised me not to make so weighty a decision on foreign soil, and brought me back to Washington to complete my theological studies at Catholic University. After eighteen months of great darkness but also inner peace, I became certain that I should not be a priest.

Thus, in early January 1960, after twelve years in religious life, having had a profound experience of religious and intellectual community, I found myself in a garret apartment in New York City working on the manuscript of a novel. I had one hundred dollars that my father had given me, plus a determination not to go to work at any job except writing. I was budgeted at $35 a week (rent took $10), and so I had three weeks to find the next check. Luckily, an assignment for a book review or an article kept arriving each month. The manuscript I was working on was not my first novel, but in June of 1960 Doubleday accepted this one for publication. The advance seemed to me a fortune. I believe it was $600, with a matching check when I would hand in the completed manuscript.

Meanwhile, I had also applied to graduate schools for further study in philosophy; naively, I sent applications only to Yale and Harvard. Yale offered me tuition but Harvard added a supplement for living expenses. I preferred Harvard for other reasons, and was there for election night of 1960 and inauguration day of 1961 when in the Law School dining room, surprised by tears gathering in my eyes, I watched on television as John F. Kennedy was sworn in as the first Catholic President. I had sent in drafts of speeches to his campaign, copies of speeches I had prepared for a young lawyer running for Congress in northern New Jersey, including one on "The New Frontier." The Democratic pols in Newark had mocked my speech when my candidate started giving it, but once JFK used the theme in his acceptance speech at the convention they said it was brilliant.

In the spring of 1961, Robert Silvers of Harper’s asked me to write the article on religion for an issue on universities to appear that fall. That article, "God in the Colleges," though it caused a lot of discussion, left my professors at Harvard not at all pleased. It was reprinted several times in various New Left publications a few years later, since in some ways it anticipated the Port Huron Statement (the founding document of the New Left) of 1962. What I wrote in that article—about the death of humanism under the onslaught of the Enlightenment—has been a permanent theme of my work, and I have reprinted that essay in later books in 1964 and 1994. These are its closing lines:

"God is dead. . . . What are these churches if they are not the tombs and sepulchers of God?" Nietzsche asked. But much of Western humanism is dead too. Men do not wander under the silent stars, listen to the wind, learn to know themselves, question, "Where am I going? Why am I here?" They leave aside the mysteries of contingency and transitoriness for the certainties of research, production, consumption. So that it is nearly possible to say: "Man is dead. . . . What are these buildings, these tunnels, these roads, if they are not the tombs and sepulchers of man?"

The greatest event of my Harvard years was meeting my wife, a painter and an Iowan who was then teaching at Carleton College in Minnesota, and who had studied with Kokoschka in Austria and Lasansky at Iowa City. She came to Boston to paint for a year during her sabbatical, and despite serious competition from two lawyers I prevailed upon her to marry me. We took part of our honeymoon as a working autumn in Rome for the Second Vatican Council, from late August 1963 until mid-January of 1964.

In late November I unexpectedly received a contract for The Open Church—an existing contract that the Time correspondent could not fulfill. Taking Lord Acton’s report on the First Vatican Council in 1870 as my model, I wrote with great intensity for the seven weeks that the contract allowed me after the Council closed in early December. It was a bitterly cold winter, and the marble rooms in our pensione had no central heating. Wearing gloves to grip my pen (for revisions) and my dictating machine (for reporting speeches at the Council), I kept as many as three secretaries busy—one transcribing dictation, one deciphering my handwriting for typing, and the third typing the revised versions of the work of the preceding day. Although I was supposed to be studying for my comprehensive exams at Harvard, I nonetheless completed the manuscript by the January 18 deadline.

My second nonfiction book, which was also published in 1964, was a collection of essays called A New Generation: American and Catholic. In it I laid out my dissatisfactions with the various philosophies of the Enlightenment that had been my diet at Harvard, and announced my intention to pick up and develop the ancient and more capacious pragmatism and empiricism of Aristotle and Aquinas. I declared that in solving the crucial problems of Americans and Catholics in America, one needs "a consistent point of view, [one that is] empirical, pragmatic, realistic, and Christian." To this day, I think I have been faithful to that vision.

Although I have published versions of this story before, certain persistent misreadings of my intellectual biography make it seem judicious to give a short account of a half–dozen of the main continuities, the undergirding, of my intellectual life these past fifty years, before turning to those areas where my thoughts have changed significantly.

Most Americans seem to believe that every single human life has value and worth. But, I wondered from the beginning, is that really true? Is that how hurricanes, cancer, Hitler, and communism have treated human beings? During my lifetime, nearly one hundred million persons have died by violence, making life seem cheap. When I was eleven years old I saw the first movie reels from Auschwitz. What if this planet is as empty of meaning as it sometimes seems? I was struck very deeply by this at least apparent meaninglessness, and I took very seriously the challenge of Albert Camus that any philosophy of the future, any ethic, must originate within it, or risk not being credible. To build a new civilization on the ashes of Auschwitz would take much hard thought.

Most of my colleagues and friends didn’t share my problem. It is not that I didn’t believe. My faith never flagged. It was only that I felt nothing, I was empty, and I could not see how to answer the problems put by Auschwitz—and by explicit nihilists, including defenders of Hitler and Stalin, not to mention by nice atheists like some of my professors at Harvard.

Some of my friends could say with Pascal, "The heart has its reasons which the reason knows not of." Others could say: "Faith is a leap; you just have to let go and leap." In my case, I have known my own heart to have many bad reasons, and to be a great deceiver. So I didn’t like these two existentialist dodges. Regarding the first, I admire but I don’t quite trust "reasons of the heart." Regarding the second, I appreciate why others are content with a "leap" of faith, but it has always seemed to me that creatures blessed with the sort of minds God gave us should be able to give a better account of the why of our faith than that.

In other words, even accepting the Christian account on its own terms, we should from within it be able to find the way out of meaninglessness. We should be able to take in the worst this century has had to offer, and show why it is reasonable to find God also in that. I put the emphasis on reasonable. Many people I have talked to over the years think that that is asking too much. But if God really does want the worship of free men standing erect (and He does), then that much we have to achieve. Fides quaerens intellectum.

The second continuity in my work has been an experience of the dark region wherein God dwells. God is best understood to be caritas, a dark and terrible form of realism best symbolized by the Cross on which He willed his Son to die. God is best understood to be Love, but not in the types of love associated with the English word "love." Latin has at least five words for love—amor, affectus, dilectio, amicitia, caritas—and then the Greek agape adds a nuance, as do certain Hebrew words such as the one we usually render "compassion" but which more literally means "moved to the very bowels." Even when we cannot see God, we can turn our wills and intellects toward Him, aim them like arrows bound to fall short, and in effect say Fiat. The fundamental prayer to God is only one word, in the teeth of any storm: yes. Ivan Karamazov swore he could never say that. Not in a world in which so many children go to sleep in tears and alone.

The greatest continuity in my work is this affirmation that the basic energy, power, and force in creation is caritas. In this otherwise vast and possibly empty series of silent galaxies, the Creator made humans in order to have at least one creature able freely to respond to Him—either with love or not. Caritas is the one energy that matters. In it, we are first related, before we are solitary. We first receive, before we act on our own. We are first empowered, before we take responsibility for our own acts. We are first endowed, before we have rights. In all these things, all humans are linked together. Creatures depend. That is the great "intuition of being" that Jacques Maritain talked about.

In 1979 I gave a lecture at Notre Dame I have never forgotten—not, at least, the reaction of the audience. It was my first public defense of capitalism. In that context, I began with the presence of caritas within all of us. "Through the work of our minds and hands," I said,

the life of the triune God expresses its own love and truth and healing power, not all at once, imperfectly and in the darkness, but yet effectively. We build up the social institutions by which human history is slowly, very slowly, transformed into God’s own image. As our God is triune—a communal God—so is our vocation communal.

I gave the lecture at a conference occasioned by a Declaration by Chicago Laypersons, fourteen years after Vatican II, on the continuing neglect of the laity in the Church. After setting forth my theological vision of the action of the Trinity in this world, and of the need to reconstruct the social order, I spoke of the need to transform our approach by grasping capitalism’s religious possibilities. The capitalist system, after all, was the system in which most Notre Dame graduates would work. There could not be a realistic theology of the laity, or theology of work, without a theology of capitalism. When the lecture period adjourned for dinner, no one would speak to me. I had violated an important Catholic taboo. Those last few moments of that lecture—the capitalism part—admittedly marked a great discontinuity in my work. And that meant, of course, that I had excommunicated myself from the Catholic left.

The third most important continuity in my work is the theme that supplies the philosophical root that unites the first two themes: our unlimited, unquenchable drive to ask questions, the eros of inquiry. This is the organ of our appetite for transcendence, the point in us where our union with the communion of persons of the Trinity is joined, like two fires becoming one. We do not see God, but we thirst for Him. We seek Him. Our relentless drive to inquire is present in every act of our awareness. Thus, in every act, the transcendence of God is present to us (by reverse image, as it were). Furthermore, to pursue this unlimited drive within us is the best way to discover the multiple aspects of our duty on earth to build up the Kingdom of truth, liberty, justice, and love. The light that emanates from this drive to understand suffuses all we do. Issuing in caritas, it is the dominant dynamic of civilization.

A fourth crucial continuity is my emphasis on the incarnational dimension in theology. Some Catholics commit their lives to an eschatological witness, some to an incarnational witness. The former (Thomas Merton, Dorothy Day) believe that the world is sinful, broken, even adversarial, and they choose to light within it the fire of the love of God, while having as little to do with the things of this world as they can. Those who choose the incarnational witness try to see in every moment of history, in every culture, and in every place and time the workings of divine grace, often in ways that are hidden like the workings of yeast buried in dough. And they lend their energies to altering that world in its basic institutions, even if ever so slightly, in the direction of caritas. Both traditions are legitimate.

Early in my life, as I will recount later, I was sorely tempted by the witness of Dorothy Day and by Baroness de Hueck with her Friendship Houses, and even by the Benedictines. I was also drawn toward becoming a missionary. Yet I gradually realized that my own vocation lay in working in the world, in intellectual life, preferably in environments in which Catholics were few. Early in this pursuit I inclined toward a vocation in political action. By 1968, teaching in Cuernavaca, Mexico, that summer with Peter Berger, I came to see that economics was an even more neglected field in Catholic thought. By about 1976, I at last recognized that a capitalist system was not in fact what I had been taught it was; that no system is, in practice, more likely to raise the poor out of poverty than capitalism; and that capitalism is a necessary (but not sufficient) condition for democracy. I began to see that grace works also in economics.

Thus, it slowly dawned on me that, just as Jacques Maritain had recognized in American political institutions the yeast of the Gospels working in history, so also Max Weber had dimly seen that the original impulses of capitalism spring from Christianity, too. These impulses had been systematically neglected by economists, who had abandoned religious and even philosophical considerations in order to model their discipline on the physical sciences. In this way, economists had lost sight of the spirit of capitalism, and neglected the human habits on which its survival depends. Simultaneously, nearly all theologians had become as adversarial toward capitalism and business as Europe’s aristocrats were; they looked down upon economic activities as vulgar and crass, if not evil. In other words, I came to see the need for a reconstruction of the world’s understanding of capitalism and, beyond that, a reconstruction of capitalism’s realities.

We need to think of capitalism in a larger and deeper way than the economists and business schools typically think of it. We need to think of it in a Catholic way. This is what Pope John Paul II does in Centesimus Annus. He describes the business corporation as a community that is a model for truths Christianity has always attempted to teach about the human person and community.

Note, however, that the underlying continuity I am stressing here is theological. I am stressing the incarnational emphasis in general, not my particular judgment about capitalism. Whatever the present model of political economy, it will not measure up to the height and depth of the Kingdom of God. It will always be inadequate. The city of man will never be the city of God.

Just the same, it is important that there be Christians who go out into this city, whatever its stage of moral and religious development, and try to incarnate the Gospels in it as Jesus incarnated God in history. No doubt, this will often enough be by the way of the Cross and rejection, as it was for Jesus. But it is only thus that great Christian civilizations have been reared in the past. In any case, the "liberal popes" from whom we learned so much—for that is how scholars in my youth described the social teaching of Leo XIII, Pius XI, and Pius XII—called millions of us precisely to this task of "reconstructing the social order."

In my earlier years, I thought the best model for this reconstruction lay in a blend of democracy with some form of socialism. Later, I came to believe that socialism in any of its forms would be futile and destructive. I saw greater hope in a more realistic effort to reform and reconstruct society through the unique combination of capitalism and democracy that we have been lucky enough to inherit in America. But my point, to repeat, is that my own strategic vision, which is incarnational rather than eschatological, has been constant throughout my life.

The fifth continuity—related to the incarnational theme just mentioned—is a sense of the importance to Christian thought of the body, the flesh, the senses. No other religion promises the resurrection of the body. No other is so lavish in its evocations of the senses, as a holiday Mass in St. Peter’s Basilica dramatizes. Catholicism, G. K. Chesterton once wrote, is a thick steak, a glass of stout, and a good cigar. He wrote this not because he was a materialist, or blind to wit and spirit, but because turning away from the body weakens our grasp of the Incarnation. That is why an important theme of the Catholic Renaissance during the past hundred years has been a recovery of the theology of the body. In the Catholic America of my youth, this recovery was badly needed. One saw it in renewed emphasis on poetry, fiction, sound, smell, and texture. One saw it especially in the early liturgical renewal. I felt it keenly in my own struggles to learn the craft of fiction.

The sixth fundamental theme in my work has been "intelligent subjectivity." By this concept, beginning with Belief and Unbelief, I have always protected the role of the tacit, the inarticulable, the well-ordered senses and passions and emotions and heart at the very center of our acts of insight and judgment. But I have also tried to show how these are, properly, acts of reason.

Some critics mistakenly read my use of "intelligent" and "rational" as if my intellectual roots were utilitarian, post-Cartesian, and merely concerned with skill or techne. In fact, my primary concern has been that "knowledge by connaturality," that "wisdom," which my parents exemplified for me, and which Jacques Maritain and Michael Polanyi first taught me how to express. To make myself understood at Harvard on these matters, then a bastion of Quinean logic where Maritain and Polanyi were rejected (or simply disregarded), I needed to spell out the working of a form of "subjectivity" that is intelligent, reasonable, and empirically present in every act of reasoned judgment. That was the main effort of Belief and Unbelief, The Experience of Nothingness, and Ascent of the Mountain, Flight of the Dove.

These six continuities—and there are others—are plainly interrelated. Every one of my books had a place in the journey whose route I announced in A New Generation in 1964, and I have never deviated from it. I do admit plenty of errors, oversights, hasty judgments, and wrong turns. According to Winston Churchill, consistency is like a helmsman in a small boat amid thirty-foot waves; the only way to keep going is to lean hard first to one side, then to the other. That is not inconsistency. That is prudent sailing. Whatever the storms of my time, from Auschwitz to Vietnam, from the fall of the Berlin Wall to welfare reform, this particular Michael has always hoped, in the end, to row his boat ashore.

The great discontinuity in my life occurred when I decided, from much evidence, that the economic and social thought of the left which I had long supported—working for John F. Kennedy, Eugene McCarthy, Robert Kennedy, George McGovern, and Henry "Scoop" Jackson—was turning out, despite our good intentions, to be injurious to the poor of the world, including the poor among whom I had grown up in Johnstown during the 1930s. It was my rule—inculcated in me by my father—never to forget where I had come from and who my family was. But I owe a lot, as well, to my "second family," the Congregation of the Holy Cross.

I can hardly give enough credit to the Holy Cross seminaries for what they taught me between 1947 and 1959 about caritas, the drive to understand, and an incarnational humanism. There my soul became in a sense a child of France. I learned to love the Jacques Maritain of Integral Humanism, François Mauriac, and Albert Camus. From the French I learned the desire to write both philosophy and fiction. I also began an intense study of the life and work of St. Thérèse of Lisieux.

St. Thérèse (1873–1897) is the teacher of the Church about the everyday exercise of caritas, in ways so humble that they mostly cannot be seen, even though their effects may be subjected to the tests of the gospel. She taught me the importance of thinking small and honoring the humble things that I at first tended to despise. For the theology of the laity and the theology of work and the theology of daily institutional life, her work has been described—by no less an authority than Hans Urs von Balthasar—as revolutionary.

The influence of Thérèse is most often visible in my work when I refer to the transformation that St. Thomas Aquinas wrought in Aristotle’s philosophy of human action. Aristotle organized his thought around the conception of phronesis or practical wisdom; Aquinas saw the potential in this concept to support a new mode of caritas. This transformation shapes the horizon within which I placed the work of Reinhold Niebuhr, on whom I had intended to do my doctoral thesis at Harvard. (I wrote and published Belief and Unbelief instead, to clear away a conceptual obstacle to understanding the theology of the person and community, by way of "intelligent subjectivity.")

By 1965, I had accepted an assistant professorship at Stanford—the first Catholic ever to be hired in the religion department—and my duties and purposes there took my writing in more practical directions. I attempted to read every word that Maritain and Niebuhr had written, including as much of their occasional journalism as I could lay my hands on. How they applied their vision to the practicalities of lay life in the world interested me greatly, but so did their theoretical framework.

As mentioned above, early on in my life, mainly through the subscription to the Catholic Worker that my father brought into our home, my heroes were Dorothy Day, Baroness de Hueck, and (later) Michael Harrington. While I was at Catholic University (1958–59), the radical writings of the sociologist Paul Hanly Furfey were added to these influences. Nonetheless, I hesitated about declaring myself a democratic socialist or social democrat, because I was unclear about the implications of that allegiance. I resolved to study economics more carefully, and to clarify in my mind questions about poverty and wealth, economic development and religion. I devoured Weber, Tawney, Fanfani, and others, as well as magazines such as Dissent. I thought it morally correct and religiously satisfying to be something of a socialist and a tart critic of capitalism. I tried hard.

It was during my years at Harvard, 1960–1965, that I first heard the term "WASP," and had to ask its meaning. At the Divinity School, I learned how loftily "mainline" Christians looked down on the vast majority of evangelicals in America. When Billy Graham came to lecture, most went to jeer (at Harvard this means making dry jokes), although some said later he did much better than expected.

One exception to the general climate was James Luther Adams, a great defender of what others at Harvard referred to as "the garbage bin of the Reformation," the Anabaptists and all the free churches that sprang from them, whose members eventually came to constitute the largest body of Protestants in America. Adams held that the free churches, more than any other social movement, taught America the practice of association that Tocqueville later described as "the first law of democracy." Leo XIII had won the sobriquet "the pope of associations," and this link to an entire world of Protestantism, a world I had never before clearly distinguished in my own mind from the Philadelphia–New England "mainline," gave me a new conceptual tool. From then on, the principle of association became a golden thread in my analysis of society, democracy, capitalism, welfare policy, and civil society. It is at the heart of my conception of the open society and the open church.

The theology of caritas, the Catholic tradition of personalism and community, and now the principle of association—all these helped me to break the horizon of left-wing socialism in which I had been formed. "A Catholic boy like you has to be a Democrat," our high school advisor, Father Peverada, had told me in the fall of 1948, when he caught me making counterarguments in favor of Thomas E. Dewey. (I have always been, in discussions, a counterarguer. My friends purposely used to advance arguments they had first learned from me, in order to trap me into arguing against my earlier self—and I always fell for their deception.) Similarly, I remember Maritain writing about wanting always to be "a man of the left." (Though in France, "right," and therefore "left," mean something very different from anything in America.) Paul Tillich added that "Any serious Christian must be a socialist." Practically everyone concurred. It took me many years to begin questioning this magnetic pull, and to figure out whence its power came.

When I found myself questioning my own left-wing commitments, which had been somewhat influential in the "radicalization" of other liberals, I was frightened. I thought something must be wrong with me. As Kathie McHale Mulherin noted in Commonweal, my turn to the radical left in politics was a matter of intellectual conviction, against my own conservative temperament; she had been my research assistant and watched it happen. Her comment surprised me when it appeared, but I came to admit that she was right. The radicals, temperamentally, were not my sort of people. When they said "power to the people," the last thing they meant was the workers of Joliet and Johnstown, the white ethnics of Mayor Daley’s Chicago, or the Moral Majority. They did not have the good of my Uncle Emil in mind.

I need to say a word about Uncle Emil. He was my father’s oldest half-brother, by an earlier marriage of my grandfather. He was a big, rough, hearty man who was missing one whole finger and part of another from accidents in the steel mill, and his language was goodhearted, loud, punctuated by laughter, and not at all suited for Sunday School. He made his own wine. His Slovak was as rough as his English. Around his humble frame house, which even then seemed like an antique in a mining town, he grew hollyhocks and a grape arbor, of exactly the sort I was later to see in Slovak villages in the Tatra Mountains whence our family came to America about 1885. He was a boisterous supporter of FDR, the Democrats, and the unions. In all my memories, Emil seems to be in a sleeveless T-shirt, although I must have seen him squeezed uncomfortably into a suit, shirt, and tie at one or another funeral or wedding.

Somewhere along the line at Harvard, I got the idea of submitting every generalization I heard about "the Americans" to a test: Did that sentence fit my Uncle Emil? Take, for instance, the 1968 newspaper reports that "Catholic ethnics support Wallace." I could agree that Emil might have admired the guts of George Wallace in his presidential run in 1968, when Wallace took on the "pointy-headed liberals." But I am certain that the Wallace crack about "running over protestors" would have disgusted Emil; and compared to Wallace, Hubert Humphrey was the proven union man. Humphrey was Emil’s kind of Democrat. Many political writers and sociologists, in those days, seemed not to know of Uncle Emil. They confused him with the migrants from the South who had moved north into the mills, and who did support Wallace. Ethnicity confused them.

In the autumn of 1970, I took a leave of absence from Stanford to accept an invitation from Sargent Shriver during his campaign to elect Democrats to Congress around the country. We visited some thirty-nine states, and spent nearly every day from August through November on the road. Years later I wrote that by election day,

I had a far better grasp of the diverse neighborhoods of America than I had ever had before. I had seen at first hand the true significance of ethnicity and localism in American life. . . . Words that I had written about the American majority—complacently drinking beer in front of television—in Toward a Theology of Radical Politics now made shame color my cheeks. I met the American people in the flesh; my literary imagination had been calumnious. But this had not been my vision only. In rejecting it, I was rejecting the leftist vision of America (or Amerika), the anti–Americanism so common among my intellectual colleagues.

This thought weighed on me as time passed. I saw many analyses of American politics and social needs go wildly wrong about the actual texture of American social reality, and decided I must write The Rise of the Unmeltable Ethnics (1972, 1996). Later, I followed up with an account of a crucial but almost totally neglected union struggle among Slavic miners in eastern Pennsylvania, The Guns of Lattimer (1978, 1996). It remains my ambition to tell the story of my grandfather’s immigration and the great Johnstown flood of 1889, which took more lives than the Battle of Gettysburg.

There is one minor continuity in my work that I should mention here. In every period of my life, I have come back to the "gap" in our society between the intellectuals and the people. I make a sharp distinction between two halves of the American elite: those who choose to support the growth of a larger, supposedly more compassionate state, and those who choose, because it is actually better for the poor, to support the growth of the private sector. I define "elite," roughly, as the top 20 percent of the population as sorted out by three measures: income, years of education, and professional status. Since World War II, about half of this elite has found a new route to power, wealth, prestige, and influence through the promotion of a larger, more activist state. The others see a better route for themselves and the country through the promotion of a limited state, and a larger, more compassionate civil society. I call the first of these "the new class" and the second "the old elite." It is good for America to have a divided elite, of roughly equal size, so that the two elites check and balance each other. In terms of political economy, the new class tends to favor the political solution, while the old elite tends to favor the free economy, a reformed welfare state (toward which we have begun to move), and a civil society with the state "off its back."

Once I started criticizing the errors of the left, I was en route to becoming a "neoconservative," a term that I at first hated. The term was invented as a sign of excommunication by the Catholic socialist (and my good friend) Michael Harrington. Harrington, who began with the Catholic Worker, committed himself to the left; he became a socialist in the way that some people become Catholic. Socialism became his religion, not only his politics. He meant the term to signify a way of life, a horizon, a way of seeing things, an ethos, an ethic, a dream, an ideal goal. He called those of us who were beginning to question the premises of his faith "neo," to suggest "pseudo" or "imitation," and he called us "conservatives," knowing full well the judgment of Louis Hartz and Lionel Trilling that "there is no conservative intellectual tradition in America." For a socialist, "conservative" is the meanest name he can think to call someone. It means outside the moral pale—greedy, money-grubbing, narrow-minded, bigoted, troglodytic—you get the picture.

Breaking ranks with the left is a phenomenon that deserves its own study. I lost the company of formerly good friends, and received letters from some who severed all contact with me or pleaded with me not to persist in my horrible mistake. I had articles returned from magazines that had once begged me to write for them, and watched hostile and ad hominem reviews replace the glowing notices of just a short time earlier. But this has been a common experience among ex-leftists; some had it far worse. I took comfort from noting that Paul Johnson had just made the same break with the socialist left in England, and Irving Kristol and Norman Podhoretz had done so in New York. There seemed to be a growing band of us: "We few, we happy few!" (The literature on, by, and about neoconservatives is by now quite extensive.)

What we had in common was a past on the far left—not necessarily at the Communist extreme, but well to the radical side of Arthur Schlesinger’s "vital center"—and a powerful intellectual conviction that the left was wrong about virtually every big issue of our time: the Soviet Union, the North Vietnamese regime, economics, welfare, race, and moral questions such as abortion, amnesty, acid, and the sexual revolution.

In my experience, people join the left out of idealism. Once they see through the deceptions of the left, and break with its powerful set of internal controls, including censorship, they come to hate it. One must fight this hatred in oneself, and try hard to remember how one fell for the left because of one’s own uncritical ideals. What defectors come to hate in the left is its pervasive lack of honesty—the constant use of euphemism and linguistic deception (in public, socialists call themselves liberals and liberals call themselves moderates), its black-and-white vision of the world, its intolerance of any questions about its own principles.

It has been said before that a neoconservative is a radical who has begun to understand economics. In studying economics, one begins to grasp why socialism cannot possibly work in practice, and why it is especially damaging to the poor. In the days of my left-wing idealism, I thought the left would help the poor. I favored the War on Poverty. But then I watched what actually happened: a 600 percent increase in births out of wedlock (especially among the poor), a 600 percent increase in violent crime (especially among the poor). Once you begin to judge the fruits of programs inspired by a socialist analysis of social reality, not as dreams but as realities, disillusion begins. I watched the most liberal city in America, New York, slide into social and financial bankruptcy, becoming less civilized and more dangerous with every year that passed. I watched North Vietnam after the war was over, and noted what happened in the prison camps and reeducation centers, and wondered where now were my friends from the antiwar days who once said they cared so much about the Vietnamese people.

Nonetheless, the great intellectual utility of being a leftist, a utility I at first missed, is that you have a clear compass for interpreting every day’s events: If x undermines business, corporations, and capitalism, x is good. If x strengthens the central state, x is good. Using this template, you can detect instantly what the proper "progressive" line is. For cultural critics and journalists, as well as activists, having such a template is a great advantage. In cases of serious doubt, you have only to wait for your favorite left-wing journal to put out the correct line.

Well, you can see why my first articles reflecting a fundamental change of mind on political and economic matters seriously disturbed my former friends. Some were accustomed to looking to me to see where the progressive line lay; in the past, I had sometimes seen it before others. They were used to regarding me as one of their minor leaders. Then, in about 1976, I published two quite tentative articles, but only after I had estimated the probable incoming fire. One was called "A Closet Capitalist Confesses," in which I expressed shame that I could no longer, try as I might, desire to be a socialist. I couldn’t find, anywhere in the world, one single example of socialism that worked in practice, in a form that I could admire. Even when I pulled photos of Sweden out of my desk drawer, I wrote, Sweden no longer held any attraction for me. Neither did Cuba. Nor any other of the romantic options held out by the left. I didn’t even admire the British health service.

My wife didn’t want me to announce this disgrace in public, but I had to be honest: Once I thought about it, it was clear to me that capitalism had been better for my Uncle Emil and other poor folks than what had befallen those in our family who were still in the Slovak Socialist Paradise or had migrated anywhere else on earth.

The other article, at greater length, was not yet ready to become positive about capitalism, but its title announced its thesis well enough: "Capitalism, An Underpraised and Undervalued System." I did not renounce my former criticisms of many aspects of capitalist reality. Admittedly, it is a bad system, except, as Churchill noted about democracy, that all other known systems are worse.

When an intelligent person loses his or her confidence in socialism, what he or she most misses is the North Star it placed in the sky, its guidance system. Once you stop believing, you feel the ideological vacuum keenly, and recognize its moral hazard. Never again do you want anything else so totalistic to take its place. Irving Kristol in his book On the Democratic Idea in America argued that the American experiment, rooted in Aristotelian prudence and a sense of human fallibility and evil, is a bracing corrective to the ideological and utopian thinking of Europe. Hannah Arendt and others have pointed out that the American experiment is the most intellectually neglected social reality. Nowadays, almost no one grasps its originality or can articulate its specific moral vision, its table of virtues. The American tradition teaches a modest and humble way of thinking, close to earth, anti-utopian. It is also internally conflicted. Its tendencies toward liberty are at war with its tendencies toward a broad egalitarianism. To begin explaining this novus ordo seclorum, my first neoconservative book properly so called was The American Vision (1978).

By the late 1970s, I realized that it is worse for the poor of America (and around the world) to rely upon the state and to pin their main hopes on political measures. And it is better for the poor of America and the world to support a limited state, in which the conditions are favorable to the growth of business (especially small business), and to prefer opportunity to handouts from the state. Characteristically, neoconservatives favor the welfare state for those really unable to care for themselves, but try to break the corrupting bond between the central state and the welfare function. There are better ways to provide welfare than what we have constructed since 1965. About the specifics of such matters, persons of good will can argue long into the night. Politics is about argument.

In brief, my conversion from a mildly socialist way of viewing reality to a more distinctively American (that is, enterprising) way did not occur all at once. I can’t quite determine whether foreign or domestic experiences generated my first doubts about the leftism I effortlessly acquired with an excellent education. After all, for at least a century the humanities have been anticapitalist for traditionalist reasons, and the social sciences have been anticapitalist for socialist reasons. Against the pressures of my education, one experience after another during the 1970s made me a Reagan Democrat even before Reagan became President. In fact, a set of themes I articulated in an article in 1978 about why I still remained a Democrat, even while moving away from statism in my principles, was field-tested by Richard Wirthlin and adopted by Ronald Reagan as his election slogan: work, family, neighborhood, peace, strength.

The collapse of communism between 1989 and 1991 persuaded even many of its former adherents of the failure of socialism. As the people of the world gradually learned the condition of the mass of people under the Soviet Union, they learned what Gorbachev had already admitted: Under a first-world military establishment, the USSR hid a third-world economy. Socialism was a fraud. This did not surprise those of us who had learned from Hayek and von Mises why socialist economics is irrational and unworkable, and from Leo XIII why a socialist anthropology is both evil and futile.

Let me return, in closing, to the theological development, the "open church," that corresponds most closely to my developing appreciation of America.

The "open church," like Karl Popper’s "open society," is utterly different from the idealized "secular city" (which even Harvey Cox has now disowned). Its dynamism springs from a fidelity to the drive to understand, in all its workings throughout human life. The gospel is best preached when it uses this dynamism, since here the divine purpose in creation and in history works its way out.

My book The Open Church was a report, both journalistic and theological, on the second and most crucial session of the Second Vatican Council, where the nature of the Church was the focus. Unlike the secular city, this open church has a vicar of Christ as its visible head, the Pope, and a highly visible body of bishops around the world in communion with the Pope. The open church, therefore, does not lack a center and a hierarchy and a visible symbol of worldwide communion. Its reason for being is to form a community around the Eucharist and the Word of God. Against the widespread urges to tamper with doctrine in the years since the Council, the criterion of the open church is analogous to that of the open society, as described by Karl Popper: the falsification principle. New proposals must be submitted to rigorous tests.

The Second Vatican Council took care to preserve an important check-and-balance. The bishops in collegiality, including the Bishop of Rome, are the authenticating body of last resort—yet with an important twist. Even all the bishops of the world together, if without the Pope, do not suffice for authentication. The Pope must concur. On other occasions, he is bound to teach and confirm the brethren, even when alone. All are bound by the Word of God as held by the whole Church at other times and places. No mere majority vote suffices.

In writing The Open Church, I did not foresee such an exemplary Pope as John Paul II. I did not foresee most of the things that would happen between 1968 and 1999. But I did suggest that they would be ironic, and that at the conclusion of the Council the jesters of the Roman fountains would be laughing at yet another generation passing through that eternal city, with high and unrealistic hopes. "All things human," was the inscription on the flyleaf of that book, "given enough time, go badly." The progressives who have dominated the American Catholic Church since that time have not yet drawn up a realistic accounting of their own failures.

A future generation may find it hard to believe that so many theologians of his time failed to see the greatness of Pope John Paul II, and the precious gifts he gave the open church. During the darkest years he helped throw the Polish church open to all comers—believers and unbelievers—for intensely vital civil discussions. He encouraged associations of all sorts to press forward with their work. The Polish church met in factories, in homes, in the private quarters of professors. There were underground newspapers, theaters, printing presses, universities, catechetical centers, liturgies. There was little that I imagined in The Open Church that Archbishop Wojtyla did not try.

It gives me no small pleasure when I look back at my life of battles and controversies to find that, at least intellectually, I have ended up trying to further the great work of Karol Wojtyla.

Published in First Things May 26, 2004

Human Dignity, Human Rights

Fifty years ago, a tangle of intellectual and diplomatic puzzles blocked the world from agreeing on a universal code of human rights. In the years 1945-1948 the world was emerging only slowly from the devastation of the war that had burned through Asia and Europe. The largest nation of all, China, was in the midst of a bitter civil war, and Communists both there and in the Soviet Union harbored worldwide ambitions. Although consciences on all sides had been shocked by the bloodshed, the newly discovered death camps, and the tens of millions of displaced persons and refugees, no one way of thinking about moral issues commanded consensus. People seemed more divided about right and wrong after the war than they had seemed before it. How, then, could they come to agreement on a short list of the rights of all men? That was the first puzzle. It might be called the conundrum of pluralism.

The second conundrum had a different origin. One side in the great war claimed to be defending the individual, whereas two of the other great protagonists—the defeated National Socialists and the triumphant International Socialists (or Communists)—marched under the banner of community, the collective, and especially the state. These two visions of the future were antagonistic—or so it seemed. How could the opposing sides possibly agree about common principles, when the principles dearest to each were to the other anathema? Still, some thought, there must be some way to protect the rights of individual persons while recognizing at the same time the many communities in which all persons are concretely embedded. This might be called the conundrum of the individual and the state.

The drafters of the Universal Declaration of Human Rights (1948) did manage to cut both these Gordian knots. While protecting the ability of diverse consciences to disagree radically about the premises and principles of ethical theory, they found a way to emphasize a number of basic findings of practical reason, to which a sufficient majority of peoples around the world had been driven, whether by the terrors of the first half of the twentieth century or the wisdom of the previous three millennia. Many had been led, under the pressure of extreme suffering, down a sort of via negativa. Having seen the awful consequences of a world without universal standards, they were now ready to agree about certain practices that must not be done, not ever again. Where agreement about the whys and wherefores was still not possible, agreement about a few practical "don’ts" was almost universally sustained. Some of these standards might be stated as goals to be striven for in societies not yet fully developed, and thus be worded in positive terms. But virtually every one of the thirty principles of the final draft rested on bitter memories of recent abuses.

Similarly, East and West were not at all in agreement about the individual and the state. Nonetheless, there are some down-to-earth social institutions, such as the family, in which all humans function. Could not a way be found to bridge the gap between the Anglo-American zeal for the term individual and the Soviet insistence on the state? And between the Western predilection for liberty and the Asian (especially Chinese) emphasis on duty?

Once it was decided that the Universal Declaration could not possibly formulate a common theory about human nature and destiny, nor a common creed, but only declare, instead, a limited but quite clear practical code, the two recently fashioned keys suggested above began to turn in the locks.

One key was to notice the sharp difference between the universality reached through the workings of practical reason and the universality possible through the work of the theoretical reason. The second key was to notice that the term person has connotations lacking in the term individual, and that such terms as social and community attach to many other referents besides state.

The next task was to find a "symphonic theme" that would make each measure of the Declaration more powerful and more meaningful by belonging to a whole, of which each was a partial but essential element. The actual drafters of the Declaration preferred to speak of finding an "architecture," a metaphor easier to visualize: its preamble as portico, its four "pillars" or axial principles, its four "rooms." But I believe the metaphor taken from music better expresses the way in which each principle of the code is intended to express harmonies, echoes, and motifs amplified by later principles. The force of the whole adds considerable meaning to each part.

The final task was to formulate each proposition of this code in a form most likely both to attract universal consent and to hold up under the pressure of events. Its drafters did not want the Universal Declaration to be discredited by subsequent events; on the contrary, they wanted its value to be enhanced by at least a modicum of prescience. Practicality in both senses was the key.

Even though the Declaration might come to be phrased positively, in terms of "rights" rather than in terms of "don’ts," one could think of these rights as reverse descriptions of practical actions that ought not to be taken. Concrete prescriptions have a more practical ring than theoretical affirmations. Besides, they are far more readily agreed to (and more difficult to oppose). This focus on practicality commended itself to diplomats, political leaders, and the public itself, and it was greatly strengthened by recent horrific memories and still-abiding fears.

Let us look more closely at the solutions to these philosophical logjams at the United Nations. In those years, there were not quite sixty countries in the United Nations, compared to today’s 185. They still had radically opposed moral visions, though, and these quickly became apparent to the committee in charge of drafting the charter of UNESCO. On this commission the philosopher Jacques Maritain did especially useful work on the particular problem of pluralism. That work was in turn appropriated by those working on the Universal Declaration.

Like others, Maritain recognized quickly that agreement on common principles—a common philosophy of human nature and destiny—was out of the question. Among representatives of different faiths, worldviews, philosophies, and ideologies, there was no one theory that all of them shared. On the other hand, Maritain knew from his long studies of St. Thomas Aquinas, who had written at a time when various European cultures were meeting more mature and vital Muslim and Jewish cultures, that practice and theory are activities of two different habits of mind, and operate under different laws and constraints. This point requires a brief excursus.

Sometimes, people do quite well in practice what they cannot explain in theory, even as people who are excellent in theory often fail in practice. Theory is curiously impersonal; anyone, from any point of view, should be able to examine a theory, and to falsify or verify it in an objective way. But practice is incurably personal; the batting grip that works for one baseball player does not work for another. In practice, coaches, not theoreticians, are the most help. (Theory may be useful to coaches, but it is not enough.) Moreover, three or four persons can engage in the same practice although each has a different reason for doing so, and a different theory underlying his practice. This observation provided the clue Maritain was looking for:

How is an agreement conceivable among men assembled for the purpose of jointly accomplishing a task dealing with the future of the mind, who come from the four corners of the earth and who belong not only to different cultures and civilizations, but to different spiritual families and antagonistic schools of thought? Since the aim of UNESCO is a practical aim, agreement among its members can be spontaneously achieved, not on common speculative notions, but on common practical notions, not on the affirmation of the same conception of the world, man, and knowledge, but on the affirmation of the same set of convictions concerning action. This is doubtless very little, it is the last refuge of intellectual agreement among men. It is, however, enough to undertake a great work; and it would mean a great deal to become aware of this body of common practical convictions.

Maritain restated the question before the committee. Instead of, How can such disparate intellectual positions be reconciled?, he asked, How much agreement can we reach regarding practices even while remaining incurably divided regarding the underlying theory for such practices? Maritain then raised two further questions that suggested an answer: Are there not some things so terrible in practice that no one will publicly approve of them? Are there not some things so good in practice that no one will want to seem opposed to them? Since the answer to both questions likely was "yes," the relatively simple yet overlooked distinction between agreement in theory and agreement in practice broke the logjam.

This distinction—taken over from the UNESCO experience by those working on the Universal Declaration—allows people to stand firm on all points of principle, avoiding the trap of moral indifferentism or relativism. A Muslim need not surrender one iota of Muslim faith, or a Christian of Christian faith. Nor need a Communist abandon Communist theory. Maritain’s approach was to ask one question only: Do you agree that the support of this practice and the prohibition of that other practice is a worthy criterion for the world community? Do you agree to declare that your nation will live under this code of practices? In the event, the Soviet Union did not sign the Declaration, but neither did it veto the action of putting it forward.

There were obvious weaknesses in the Declaration. Like the U.S. Constitution, the UN Declaration cannot of itself prevent behavior radically at odds with its principles. On the other hand, after 1975, with the publication of an extension of the Declaration in the Helsinki Accords, the Universal Declaration proved of inestimable importance to human rights activists behind the Iron Curtain. Indeed, these "mere words" were credited with being one of the most useful of all tools in the final discrediting and dismantling of the Soviet Union. The failure of the USSR to live up to this simple and elementary code of practice played a decisive role in delegitimating the regime, even in the eyes of serious Communists, during the turbulent period 1989-91. It worked much as Maritain had hoped it would.

For Maritain, whether it would work was in a sense a testable hypothesis. Even people who deny the existence of the natural law cannot help exemplifying it, he knew, because they are human beings who use practical reason every time they act. Maritain had learned from Aquinas that "natural law" is only the name for the actual working principles of practical reason. (Natural law presents no concrete ordinances, but does make principled demands regarding methods of practical inquiry.)

Charles Malik, head of the Commission writing the Declaration of Human Rights, played a role similar to Maritain’s on pluralism with regard to the concepts of person and society. The Soviet delegation was allergic to the Western use of the term individual, and the U.S. representative, Eleanor Roosevelt, was firm in insisting that the individual is prior to the state. Other delegations had difficulties with both those terms. It seemed for a time that the impasse between rival political philosophies could not be broken. However, from various quarters Malik had become familiar with interesting possibilities in the word person, employed as a substitute for individual. The Anglo-Americans could live with the substitution, and Malik was careful to point out to others all the social reverberations of the term. He made person a far more attractive term to those who feared the radical separatism and potential lawlessness suggested by individual.

A cat or a dog, even a tree, can be an individual, but only a human being (or God and the angels) can be a person. Person is far more specific to the human race; it is a far more humanistic term. What makes a person a person, rather than merely an individual, is a spiritual capacity: the capacity to reflect and choose, to be imaginative and creative, to be an originating source of action. We have two cats at home, and they merely behave. They can do no other than follow their own instincts. Our children, by contrast, do not merely follow the law of their own natures; they improvise, they think of new and crazy things, they create new personae for themselves.

Moreover, persons are reared over long years in families, and it is in families that their identities, habits, and character are established. Families further participate in whole networks of kin, neighborhood, religious tradition, and other intermediate associations, natural and civil, and in and through those relations live out a thick social identity. In this sense, societies take shape long before states do. Persons are social beings before they are aware of having their own distinctive personalities. Persons come to fulfillment only in community, and communities have as their end and purpose the raising of persons worthy of their inherent dignity. Dignity inheres in them because they are destined to be free to reflect and to choose, and thus to be provident over the course of their own lives, responsible for their own actions. A person is capable of insight, love, and long-term commitment. Such creatures are deserving of respect from other rational creatures. Their inherent nature makes civilization possible, since civilization is constituted by conversation, the art of persuasion through reason, mutual respect. Civilized persons argue respectfully. Barbarians use clubs.

These characteristics of person give rise, in turn, to the four main principles whose force is felt in every one of the thirty principles of the Declaration: Every human being without exception is worthy of dignity, liberty, equality, and brotherhood. In other words, the very term person implies a vision of a universal society, which for reasons of practicality and local autonomy, and by the natural workings of culture and history, is organized through countless local associations of varying sizes and horizons. Among these latter, of course, are states—but not the most important and not the primary social forms. Even the Soviet Union found it difficult to object to this language entirely, although its representatives did in fact object. For the USSR presented itself to the world through four different social forms: It was (it said) a union, of republics, formed by soviets (local communities), in a socialist project—the USSR. Whatever its declared differences, even in the Soviet Union smaller layers of societies, in theory at least, formed the Communist state.

From the very first words of the Preamble, then, the Universal Declaration avoids the term individual and takes care to surround the term person with references to the expanding circles of communities and associations in which, in the real world, actual persons become aware of their own capacities, responsibilities, rights, and obligations; and in which they find information about their human possibilities and their rights, in addition to moral and institutional support in vindicating them. In the Declaration are articles about the person and courts, the person and the family, the person and intermediate institutions, the person and religious or cultural traditions, the person and ethnic groups, the person and the state, the person and the international order.

Very neatly, in other words, the Universal Declaration represented concerns expressed by Latin American delegates, Asians, Africans, Middle Easterners, and others, and avoided boxing itself into the rather more partisan language of individual and state that figure so largely in both Anglo-American and Marxist discourse.

Moreover, the Universal Declaration deftly avoided another East-West pitfall by the way in which it treated the economic and social "rights" that the Soviets were most eager to get into the document, to offset the individual rights that they saw as false consciousness. The drafting committee took care to include the economic and social rights (articles 23-27). But they also took care to point out in a kind of preamble to them (article 22), and in a highly discernible change of tone and of sense, that the term "rights" was now being used in the document in a different sense. In the earlier set of rights, the state does not really have to do anything, except follow the law and otherwise stay out of the way, avoiding abuses.

In the case of the social and economic rights, by contrast, the role of the state is vastly expanded, some might say to an almost infinite degree. These rights are not spoken of as immunities from oppression by the state but as entitlements to goods and benefits from the state. Further, these goods and benefits are stated in the vague language of moving targets, as in the locution (article 22) "in accordance with the organization and resources of each state." The assumption seems to be that these are not, precisely, rights (in the way the American Bill of Rights speaks of rights) but, rather, goods—or even goals to be striven toward.

It is only a fairly rich and developed state that can provide its people the high standard of living, securities, and benefits held out as goods ("rights" in this new sense) in articles 23-26. Most nations in history have failed that test.

A small band of intrepid thinkers and doers succeeded, fifty years ago, in doing what many thought impossible, formulating a document to which all nations on earth might repair, and a document that was as responsible as anything for the "Helsinki process," which many credit with undermining the legitimacy of Communist governments in Eastern Europe. For here were official documents, signed by their own government officials and proudly published (at first) in official Communist newspapers, that informed dissidents of rights they didn’t know they had. Waving these newspaper clippings, they embarrassed their leaders and put them on the defensive.

The small band led by Charles Malik succeeded only through stubborn persistence and an infinite capacity for explanation and patient instruction, as well as brilliant tactical diplomacy in the arts of running meetings, conducting public arguments, and corralling crucial votes. The sheer human patience and generosity of spirit required to conduct this wearisome political work is awe-inspiring. (Having served as U.S. Ambassador to the Human Rights Commission twice, and once to the Bern round of the Helsinki Commission for a period of about eight weeks each, I can testify how supremely demanding such work can be.) Yet at the distance of fifty years, what is most impressive is the penetrating intelligence and rich practical wisdom of the architects of the Universal Declaration. Not least impressive was their discovery of the two hidden keys—the key to the problem of irreducible pluralism, and the key to the conundrum of the individual and the state—that unlocked age-old chains.

Copyright (c) 1999 First Things 97 (November 1999): 39-42.

Published in First Things Online May 26, 2004

The Godlessness That Failed

The collapse of communism in 1989 was one of the greatest events of human history—one of the most sudden, unexpected, dramatic, and utterly transformative. We are too close to it to be certain how to read it. Yet one characteristic of communism proved to be decisive—its particular form of atheism, and the effect of this atheism upon the morale of the people and upon their economic performance. For seventy-two years, communism in Russia waged a silent war against the human soul. Sometimes screams were heard from torture chambers deep in prisons and in detention centers, but mostly the war was fought with ideas and incessant public propaganda. Below the surface, it eroded foundations. Out of sight, it taught people to have a low opinion of themselves, as if they were incapable of nobility of soul. It ridiculed the soul’s capacity for discernment and for truth. Year after year, the silent artillery of communism leveled the inner landscapes of the soul.

A more secular way to speak of these things is to say that communism set out to destroy human capital. It set out, for instance, to eradicate centuries of learning, habits, cultures—to erase "bourgeois culture," to salt it and plow it under with lies, demonstrations, propaganda. In doing so it destroyed enterprise, investment, innovation, even the ability to distinguish between profit and loss. It wounded the habits of honesty and trust, self–reliance and fidelity to one’s word. More deeply still, it dulled the most distinctive human mark: the soul’s primordial endowment of creativity, its sense of personal responsibility, its knowledge of itself as a subject.

The denial of the dignity of the individual, the reduction of the human being to merely material elements, erases our awareness of ourselves as persons who reflect and who choose, who launch new and creative actions into history, and who accept responsibility for our actions. Unlike a horse or a cow, a human being is an acting person, an active agent—inquiring and understanding, deliberating, judging, deciding. In precisely these ways, a human is made in the image of God.

Communism aimed to objectify everything and everybody. Its fundamental premise was materialism. Human beings are—meat. Animated for a time, perhaps, but essentially no more than a sacchetto of chemicals. Instruments. Means. The "dialectical" part of "dialectical materialism" belonged to a dynamic class position of "the proletariat." The "materialist" part belonged to the people. The individual should expect to be expended, sacrificed, used up, like a thing.

Sacrificed—in this last respect, communism traded on the symbolism of Judaism and Christianity: the expectation of a New Jerusalem and the sacrifice of self for others. Communism’s materialistic theory, in and of itself, had no such resonance. Mere things do not make sacrifices for noble purposes or consider sacrifice a noble act. Thus, communism’s deepest sentiments were borrowed.

It is well known that belief in God can lead to torture, as in the awful scrutinies of heresy tribunals. It is less well known that atheism of a particular kind also leads to torture. The two routes to torture are quite different. The temptation of believers comes from moral arrogance or its mirror image, as in the case of Dostoevsky’s Grand Inquisitor, who was moved to torture by "pity," a foolish belief that most people are not as wise as he, so that it was his "duty" to keep them from liberty. This route begins in moral debility. With atheists of the Communist kind it is quite different. Here torture flows from its fundamental premises about the human being. No human has any worth apart from contributing to the Cause—to the Dialectic, to the triumph of the Party (the Vanguard of History, the Custodian of human fate). If a man will not contribute willingly to History or (it comes to the same thing) the Collective Will of the Party, he is without value and may be disposed of—indeed, is a threat to the Party, and should be disposed of.

Communist atheism denies any transcendent dimension to being, any call to which humans must freely respond, any standard of truth, evidence, moral integrity, and goodness by which humans are every moment being judged. For the Communist, all is nothingness except the Dialectic of History, before which and in whose name he prostrates himself. The Communist borrows from Christianity and Judaism a comfort, viz., that his prostration places him on the side of justice and compassion. Yet his comfort is unwarranted because it rests on ideas in which his premises forbid him to believe. For the Communist has only one moral principle: the Collective Will of the Party. All else can be done in that name: murder, torture, imprison, exterminate, assassinate. No other moral question can be scientifically raised. There is in man no internal source of dignity. Personal liberty and personal responsibility cannot be honored in theory, although of course they continued to live on among individuals. In theory, these realities are dismissed as bourgeois affectations. The Communist’s moral comforts are stolen from elsewhere.

Paradoxically, however, the Communist system of imprisonment, torture, and public confession constituted, despite itself, a via negativa that led a great many of its victims to God, and to a fresh sense of being an individual who possesses dignity. For under torture they discovered evidence for the presence of God at the core of their own being. The prison literature of our time is full of such instances.

The typical pattern, if I am not mistaken, went something like this. The KGB handbooks listed more than twenty different degrees of torture, more or less scientifically studied and refined. At some point in the proceedings, the torturer would tell his victim that there is no point in resisting, so why put everybody through the pain? "No one will ever know what happens here. It has no significance. Neither resistance nor confession, really, will affect the outcome of History. Just be pragmatic. Tell me what I wish, write what I request, and do it sooner rather than later. Why not? Bourgeois prejudices? You are too intelligent for that. No one is ever going to know what you or I do here. It will be locked up in files with millions of other files, and a thousand years from now when Socialism is truly consolidated, people will never even notice. Consider yourself a forgotten man. Be practical. There is no such thing as truth. It is only a matter of making a decision. It is a matter of will. Write down what you know is fact. I will even help you. The sooner I can go home the better for me—and for you. It is a matter of will. Be practical."

And then the light would go on in the victim’s head: my torturer is telling me that he has all the power. But he is actually confessing something else. There is something he wants from me that he does not have. So he does not have all the power. What he needs is this: that I should conform. He needs my will. He needs my denial that there is any such thing as truth. Only then will his philosophy be confirmed.

As long as I remain faithful to my own intellect and will, as long as I refuse to be complicit in his lie, then my existence unsettles him. I will not tell a lie. As long as I can hold out for that principle, then my existence shows him that his philosophy is false.

Of course, he will overpower me. He can break me with pain. He can take away my mind and my liberty with drugs. But the real power in this relationship is mine. He cannot get what he wants unless I freely give it to him. It is not enough for him to force me, to destroy me—that would be only an instant’s work. I am totally in his power—except for the sanctuary of my consciousness, my fidelity to the light. He will strip me of everything but honesty and naked will. These I cannot give him. He will have to destroy me, and then he cannot have them. Death is now my friend. I will be no use to him—or his precious Party—dead.

Along this way, very much like the way that St. John of the Cross marked out in The Dark Night of the Soul, thousands of victims came to know themselves at a depth they had never experienced before. They began to distinguish among the movements of their own souls—memory, imagination, desire, dread, understanding, will.

Moreover, when their bodies ached with pain from beatings, and from the application of electrical current, and from being contorted and held for hours in positions of excruciating pain, they learned something else. They learned that the light inside themselves, to which they were trying to be faithful, the light of truth (or at least, the will not to be complicit in a lie), cannot properly be said to be part of themselves. Their initial sense, of course, was that they were being faithful to themselves, clinging to their own minds and wills. When the pain becomes intense enough, however, one sees that one is not really suffering this for oneself. If that were so, why would one not just surrender and make the pain go away? Why wouldn’t one be pragmatic?

Rather, it seemed as though, in being faithful to the truth, and in calling up his stubborn courage of will, a man was answering to something that did not belong to himself, something that called (although it had no voice) from outside his own mind and will, something at any rate not reducible to his own mind. His own mind and will were focused in a direction running contrary to everything good for his body and his comfort and his peace. But why? Why was he running from his own self–interest, narrowly considered?

On the matter of self–interest, his torturer was certainly correct: resistance served no interest of the victim, narrowly considered. In fact, the torturer’s insistence on self–interest suggested the one line of thought that explained why the torturer was wrong.

The light in my mind (before which I am trying to be honest) is, as it were, something I participate in, and it is not reducible to me. This light approves of my liberty and grows brighter with my own acts of responsibility to it. This light seems very like what people mean—the people whom, as an atheist, I could not earlier understand—when they speak of God. And yet (as St. John of the Cross insists) in the place where we would like God to be, "no one appears." Only silence. Emptiness. Nothingness. Yet from emptiness strength emanates, and from it one feels constantly stronger. And more comforted, despite the wracking pain and weariness and tedium, than by anything one has ever before experienced. And one feels true.

In the via negativa, the voyager sees nothing, hears no divine voices, feels no mystic "presence." As it were, he has before him no more "evidence" concerning God than he did when he called himself an atheist. But he can no longer accurately call himself that. He has been led to the threshold where God dwells, by a dark and obscure knowledge that carries with it a warrant unmistakable to those who have participated in it. He may or may not be ready to say that he believes in "God," but now he has had the experiences that allow him to know what others have been talking about. Not that these are "experiences" that can be isolated, or that they are a kind of "special knowledge" given to some but not to others. They are, rather, something simpler.

In the act of fidelity to the light—the resolve not willingly to be complicit in a lie—a man has become aware of a dimension of his being he had never glimpsed before in such stark clarity. In this awareness, he is aware of a powerful personal dignity. What impresses him is its inalienability. Unless he is simply destroyed, it cannot be taken away from him without his consent. It is true that later he may weaken and give in. But he does not have to fight later, only now. He needs only to concentrate during this staccato second, one second at a time, on the dark light within.

The fall of communism forces us to confront one of the deepest lessons to be gleaned from a seventy–year plague upon the human race. Even in the emptiness, the sheer willingness not to turn away from the light, not to be complicit in a lie, leads to an experience of the emptiness in which God darkly dwells. Receptivity is all. It is as though our inquiring hearts are already God–shaped, formed in His image, so that when we try to be honest and brave, try to be true to ourselves, that effort is already a form of participation.

Before communism collapsed in 1989, it had also targeted its silent artillery on the human capital of its people, especially the human capital that suited them for personal economic initiative. In defining the nature of capitalism, however, Karl Marx made an egregious mistake. He thought that capitalism is constituted by three institutional arrangements: 1) private property; 2) a market system of exchange; and 3) the private accumulation of profit. These three institutions, however, are all pre–capitalist. They are found in biblical times (and even earlier), whereas scholars hold that "capitalism" is something very new, modern in fact, and quite different from the traditional system based on private property, markets, and profit. Max Weber dated the birth of capitalism after the Protestant Reformation (also a mistake, but indicative of the timing). During the eighteenth century, Adam Smith, David Hume, and others in Scotland and England were arguing for a new system, the defining dynamic of which was to be invention and enterprise. Capitalism applied imagination and practical intelligence to creating new goods and services not provided by earlier systems, agrarian, feudal, and mercantile.

Capitalism is most of all a set of human habits—virtues, in the old–fashioned sense, natural and learned dispositions. The virtue of enterprise consists in both an intellectual habit and a moral habit. The intellectual habit is to notice, often before others do, new creative economic opportunities, new goods to create or new ways to create them; to innovate; to invent. The moral habit is to have the realism, the practicality, the know–how, and the stubborn persistence to turn ideas into realities; that is, to make ideas work. Not everybody who has one of these two habits has the other. Enterprise requires both. Enterprise is not unlike the creative habit of the artist, who also makes to be what never was. Business leaders are not infrequently as vainglorious about their creations as any prima donna.

As David Landes of Harvard University makes clear in his 1998 study of economic history, The Wealth and Poverty of Nations, one main cause of the economic leadership of the West lies in the "joy of discovery" imparted to Jews and Christians through the teaching that each woman and each man is made in the image of God, the Creator, and is called to be a creator, too. It goes without saying that communism tried to eradicate this image in the human soul, and to strip away from society every social support that over the ages had contributed to its flourishing. Economic initiative was forbidden. The good Socialist was expected to be receptive to the Collective Will and to submerge individual creativity within it. Private property was abolished. (As late as 1986, along the banks of the dark river in the center of Moscow, huge red letters blazed at night: THE ESSENCE OF SOCIALISM IS THE ABOLITION OF PRIVATE PROPERTY.) The system of market exchange was replaced with a system of national planning, in which each month bureaucrats set the prices for more than twenty million different items, with no reference to the costs, desires, or efforts expended by individual buyers or sellers. (The epistemic problems were insoluble, as Ludwig von Mises had predicted in the 1920s.) As if that weren’t bad enough, communism cut the tie between economic effort and reward. It forbade private accumulation, and settled instead for rewarding its faithful with political favors (including living quarters, dachas, automobiles, and "official" stores).

Communism furthermore set out to abolish the ancient traditions, customs, and habits of law and morality. It wanted to dirty, distort, and bury the past so that it would be irrecoverable. It tried desperately to replace "bourgeois morality" (in reality, the morality of Judaism and Christianity, more dear to the poor perhaps than to the affluent) with "Socialist morality," in which the human person is never an end but always a means. It taught disregard for critical thinking, personal judgment, and love of truth, in order to make room for Party ideology and propaganda.

As a final affront, it withheld even the simplest goods—toilet paper, meat, oranges—so that humble citizens would have to spend hours in line every week just in order to live. In this way, they might come to feel grateful for the smallest of triumphs. They would also learn to hold themselves in contempt, as unworthy of anything at all except what was allotted to them. Shortages demean people, and communism used them as a means of social control. "Being" is not reducible to "having"; but a human being has a right to personal property, in order to be free to act. The "abolition of private property" was an abolition both of freedom and of dignity.

Thus it is that today the legal and moral traditions of Russia are a shambles. The human capital built up over centuries of religious and humanistic striving was bleached out of each successive generation—one, two, three, four generations in all—and nothing was put in its place but cynicism. Means and ends. Instrumentalism.

Few commentators have noticed this aspect of Communist destructiveness. In destroying the heritage of religion and law, and in destroying the very idea of evidence–based truth, communism destroyed the social capital on which all human progress in liberty depends. Even in a society where liberty is vital and strong, it takes a degree of heroism to act virtuously when others are not doing so. When the whole society frustrates your actions at every turn, it seems futile to act virtuously, and one must struggle daily against the temptation to despair.

Western economists themselves often take the moral and cultural sphere too much for granted. Jennifer Roback of George Mason University describes an American couple who adopted a young boy of three or so from Romania, one of those orphans brought up mass–production style, never held in human arms, fed by a bottle put in place by a mechanical apparatus. Isolated from human closeness with adults until he left the orphanage, the child has now grown to young manhood, handsome, smart, charming—but absolutely incapable of forming a human relationship, capable only of seeking his own will and his own pleasure. He fears close contact with people, pretending affection only so far as is necessary. Cleverly narcissistic, he lies, steals, cheats—whatever he needs to do to obtain whatever he desires. And all the while, smiling, he charms people by his seemingly open manner. He has already been arrested once for shoplifting, and his teachers at school, for a time in love with him, have reluctantly had to report the times he has stolen things from his classmates.

Professor Roback suggests that the totally self–centered impulse that moves this child, the total preoccupation with his own physical self–interest at the expense of all other more noble interests, sounds remarkably like what the economists conventionally discuss as "economic self–interest." She has challenged other economists to tell her in what respect the conduct of this warped and totally narcissistic young man differs from the behavior of their theoretical "economic man." This challenge infuriates the economists, she has found, but they only sputter and do not answer it.

When they have had time to reflect upon it, however, they may realize that the homo economicus of their theories is really a moral person with highly developed humanistic virtues taken from Judaism, Christianity, or some correlative tradition. For when economists write "rational," they also mean "law–abiding" and at least minimally "honest," "trustworthy," and "morally reliable." They emphatically do not mean a crook, cheat, liar, manipulator, or narcissist with whom it is impossible to have a trusting relationship. A deal has to be a deal. A partner one cannot trust brings a high cost in efficiency, and a high probability of eventual disaster. The true anthropology of capitalism, the only premise on which it can work, encodes a far richer morality than is exemplified by that unfortunate orphan.

Admittedly, an unfortunate orphan brought up until the age of three without human contact, warmth, or emotional involvement is not a fair metaphor for the ordinary people who endured the imposition of amoral communism upon them for decades. But it is a fair metaphor for the aims and practices of communism. Where there ought to be a "self" in that young man, there is a cipher. This child learned to determine his direction by negotiating his way around whatever resistance he met in getting what he wanted, like a robot bumping and bouncing away, incapable of internal self–government. Within his own personal history, there is a dialectic of resistant objects that have marked out the paths forced upon him: a kind of miniature Dialectical Materialism, blind and irresistible.

Also inhuman.

Long before World War II ended, a group of economists and philosophers in Germany began thinking about the novus ordo that would have to replace Nazism once Hitler came to the end of the line. They recognized that if they were to build a humane society they would have to construct a new political order, a new economic order, and a new moral/cultural order. To construct any one of these orders is a herculean job, but to be obliged to construct all three—and to be obliged to do so almost simultaneously—is virtually superhuman.

The hope of these "Ordo economists," as they called themselves, was that Germany had not suffered total cultural and moral damage under Nazism, given that the regime had lasted only twelve years. They further hoped that there were strong remnants of the humanistic past that could again be drawn upon, but in a more careful way. Their philosophy, and their vision of a "social market economy," became the practical guide to the "miraculous" success of postwar Germany. Their great success shows that it is not impossible to construct the three interdependent social systems—political, economic, and moral/cultural—that constitute the free society, in which free persons and free communities can flourish, and even to do so within a relatively short time. This success gives heart to all who must achieve something similar, even if yet more difficult.

Such a task, moreover, is never fully done once and for all, but must often be recapitulated. Each generation needs to rediscover why the free society is constructed as it is, and why it demands so many sacrifices and so much unrelenting effort. The free society is moral, or not at all. That is why it is so precarious. Any one generation, deciding that it is not worth the cost, can throw it over.

But the moral situation of the formerly Communist countries is far more desperate than the situation of Germany in 1945. For the moral destruction that communism wrought in Russia over seventy–two years cut far deeper into traditional institutions, practices, and associations. The damage to human capital was incalculable.

This much we know. Even under the best of conditions, it is extremely difficult to construct a free society that works, that endures, that is self–correcting. The silent artillery that communism leveled at the human spirit and at every internal nerve of human capital for more than seventy years had its effect. The transition from communism to a free society is consequently a severely demanding moral task. It is a transition to a society free from torture, assassination, extortion, and tyranny in its political system; nourishing orderly and creative enterprise and liberating the poor from poverty in its economic system; and through its cultural system rewarding the habits that make a free economy and a free polity both possible and worthwhile.

How that transition goes is perhaps the greatest issue of our time. Everything depends upon the use that humans make of the liberty with which each is endowed, while there is still time to affect the outcome. We are the subjects of this drama, not the objects.

Thomas Jefferson, no orthodox believer, put it this way: "The God who gave us life gave us liberty at the same time." It is no hindrance to our purposes to understand that liberty is the Creator’s jewel, favored by Providence. Theism is no hindrance to personal dignity. On the contrary, it is its source.

Published in First Things Online May 26, 2004, first published by First Things, June/July 2000

Pius XII as Scapegoat

From the beginning of his papacy in 1939 until well after his death in 1958, Pope Pius XII was honored with unfeigned warmth by Jewish leaders around the world. Golda Meir was uncommonly effusive in her praise of him. Trees were planted in Israel in his honor. In 1955, the Israeli Philharmonic Orchestra flew to the Vatican to give a special concert to show the nation’s gratitude. In 1940, Albert Einstein wrote a tribute in Time. At his death, tributes were universal and eloquent, especially by those Jewish groups closest to his efforts. A later generation, in contrast, has been exceedingly harsh. Why this stunning reversal? Whose interests are served? As it turns out, the spectrum of those who benefit by denigrating Pius XII is very broad.

The reversal might be said to have begun in April 1945. The instant Hitler fell, the propaganda machine of Stalinist communism turned full-bore on Pius XII, then on the Catholic bishops and priests of Poland, Hungary, Czechoslovakia, France, and Italy. The strategic aim was to prepare the way for Communist governments in the Slavic and Latin countries of Catholic Europe. More than he had feared Hitler—and with good reason, as events after 1989 demonstrated—Stalin feared the moral power of the Pope.

The attack on Pius XII took on major proportions, however, only in 1963, with Rolf Hochhuth’s surprisingly successful play, The Deputy. Even though it was roundly denounced by historians, the play drew moral attention away from Hitler and moral pressure away from Germany, especially Protestant and pagan Germany, and shifted the spotlight of moral condemnation in the direction of the Pope and the Catholic Church.

Today, a considerable number of “progressive” Catholics, not least among them former priests and seminarians, choose to beat up on Pius XII as a way of diminishing the papacy in general, and thus also the present Pope with whom they especially disagree. This is the express intention of John Cornwell, author of Hitler’s Pope: to discredit John Paul II and his ilk, that is, popes speaking as solitary moral voices (which, though he does not seem to notice it, more or less undercuts Cornwell’s case against the solitary moral voice of Pius XII). One can only wish progressive Catholics better luck next pope, even if they have trouble making up their minds about what they want in papal outspokenness.

A few Jewish spokespersons today, both in America and elsewhere, have also turned on Pius XII. For the first fifteen years or so after World War II, the effort to comprehend the sheer barbarity, madness, and evil of Hitler and his entire machinery of death ended in frustration. There followed many recriminations among Jewish groups themselves, as chronicled in Walter Laqueur’s book The Terrible Secret. The terrible secret is how long it took the public to recognize that after January 1942 the Nazis were serious about exterminating Jews. Many Jews fought and died in furious resistance, in vain. But most could not believe what was happening to them until far too late.

Why, it was now urgently insisted, didn’t someone warn them? Why didn’t someone sound the alarm? Why didn’t at least one world leader raise a voice in moral condemnation and say, “This must stop!”

Here, too, refocusing the question on Pius XII brought moral relief. Journalists and commentators of many different backgrounds (including Catholics), who had never before thought that popes counted for much, now imagined that one word from the Pope, one dramatic statement, might have had the necessary miraculous effects.

In fact, what Pius XII did say and do—especially through Vatican Radio, jammed as it was in Germany—was almost daily amplified by the BBC and other Allied radio broadcasters. Reports of atrocities were easily dismissed as war propaganda, to which the public had become inured during World War I. Appeals to Hitler would have been even more futile: the Fuehrer knew he was violating Christian moral principles; in his eyes Christianity was a religion for weaklings, and he had contempt for it. Besides, when Pius XII had pleaded with every ounce of public strength for one last peace conference in the summer of 1939, before an irreversible descent into the cauldron of war, no one took him seriously. No one even answered his summons, neither the Axis powers nor the Allies. That was the last time he had freedom of access to worldwide media.

Once the war began, Mussolini shut the Pope up in the Vatican, and every means of communication he had was censored—his mail, Vatican Radio, L’Osservatore Romano. Four different Nazi intelligence organizations, along with the more pervasive Italian ones, penetrated the Vatican. (It was easy to threaten the families of Vatican employees, virtually all of whom commuted into the Vatican gates every working day.) The Holy See, moreover, was totally dependent on the Italian government for essential services: water, sewage, electricity, telephone, telegraph, and food. Even with all this, Hitler’s unhappiness with Pius XII was such that he twice gave orders that contingency plans for occupying the Vatican be put into operation. Paratroopers were to attack suddenly and haul the Pope off to Germany. Twice his orders were frustrated by local commanders, who delayed until he was distracted elsewhere. (One told him he was assembling experts in Latin and Greek who could decide which of the archives to haul off, and this would take six weeks.)

When the Pope had full voice nobody listened. Are we to believe that when he could not be heard unless his keepers let his words go forth, then the world would have listened? For most Catholics, such reasoning is hard to understand. During the last fifteen years, for instance, there has been no lack of dramatic statements by Pope John Paul II (and Mother Teresa)—sometimes in the very face of world leaders, and before vast audiences on international television—against the systemic use of abortion and euthanasia and “the culture of death” they represent. Few listen. Why would they have listened in 1942 or 1943?

The Allies were interested in the Pope when he made propaganda for their side. They didn’t want him criticizing Communist atrocities, since Stalin was an ally; they didn’t want him condemning Allied carpet-bombing of German and Italian cities. But they did want him to condemn the German carpet-bombings of London and Coventry. They were furious when he was silent.

As prisoner in the Vatican, Pius XII was silent about many things, and on principle, not out of fear. Archbishop Sapieha of Krakow upbraided him publicly for not speaking up in late 1939 and 1940 as the intellectual leaders of the Polish Church, lay and clerical, were persecuted by the thousands, beaten, killed, thrown into concentration camps. Sapieha later recognized that moving into open rhetorical warfare would have been useless—and worse, positively inflammatory. In time he grasped the method in the Pope’s coolness and followed suit in his own style of leadership. He was young Karol Wojtyla’s protector and teacher.

The vulnerability and weakness of the papacy were nothing new. Among recent namesakes of Pius XII, two (Pius VI and Pius VII) had been jostled in rude carts to Paris for delicious humiliation by Napoleon; the chancellor of Pius IX had been assassinated on the marble stairs of his office building, while the Pope had to flee Rome for his life; Leo XIII was also driven into temporary exile in the late nineteenth century. Pius XII knew this history and knew precisely the Vatican’s vulnerabilities, but he told a boastful and threatening Goebbels to his face that he personally feared nothing, and would never leave Rome. Often described as distant and analytical, Pius also had cold steel in his spine.

A skilled reader of men, Pius had carefully diagnosed both Hitler (given to flying into destructive rages) and Mussolini (more reasonable and, even better, Italian). The Pope knew that on at least a few big things he could eventually persuade Mussolini—keeping Rome a free city, for instance—but he also knew he had been thrown into a game of wits with Hitler, in which an iron determination not to be baited out of formal neutrality might prevail against all odds. No matter how bleak everything appeared from 1939 to 1943, Pius XII judged that coolness under fire would allow him to shepherd such strengths as could be deployed to alleviate suffering.

Many people around the Pope begged him to speak out more dramatically—the ambassadors of Britain, Brazil, and France, for instance, confined by necessities of war in cramped rooms inside the Vatican walls. The Pope pointed out that he was speaking out, very strongly, in clear and unmistakable principles. More than once, he drew the portrait of the brutal jackboot of racism, unjustified violence, and the gross slaughter of human beings. He did not, of course, point out which powers the portrait described. Figuring that out did not take rocket science; the propagandists at the BBC knew instantly how to lay those papal condemnations at Hitler’s feet, and did so within hours. Hitler’s infuriated analysts saw just as quickly how the Pope intended his words to be used, but (cleverly) without formally violating neutrality. Worse, if the Nazis attacked what the Pope said, they confirmed the accuracy of the BBC’s sharp thrusts.

Among world leaders, none was more at the mercy of surrounding Axis powers for the entire period of the war than Pius XII. But none spoke as openly as he did or fed the world press with as much vital information about what was happening. He also saved a large number of Jewish lives through opening convents, monasteries, and religious houses to clandestine sanctuary, and brought face-to-face, hand-to-hand relief to millions who suffered as refugees. Pope Pius XII worked out his strategy early, learned to adjust his tactics, and never heard a convincing reason—though he heard many reasons—to do differently. He was as steady and courageous as he was cool and analytical.

A less disciplined course, Pius XII knew, might have led him to become more confrontational. Had that course resulted in even more open and draconian warfare on the Church, the very best, ablest, and bravest would have been killed or imprisoned earliest. All would have been reduced to silent servility. The Pope probably would have been spared, mocked for his helplessness, and reduced to the isolated, almost demented state in which Napoleon left Pius VII. But scores of thousands of others would have died, with no tangible gain. Hitler was yearning for the confrontation. Wiser heads trapped under Hitler’s power, with a view to the future, were not.

For some critics, all this is too subtle. They demand in retrospect an open, no-holds-barred papal condemnation of unprecedented evil. They offer nothing but speculation about what would have followed from such a statement. Indeed, to maintain the rigor of their logical position, they must concede to the papacy far greater rhetorical power than modern theories of the advanced secularization of Europe permit. Do such historians pledge that they, for instance, would heed the solemn words of a pope today, even when those words go against their own beliefs and interests? And if they wouldn’t today, why would others then?

The fury of recent attacks on Pius XII, in contrast with the almost universal esteem he enjoyed from the beginning of the war until his death, is fed by different passions than those of sixty years ago. Among those secular Jews whose chief organizing principle is the Holocaust, one hears the simultaneous assertion that all theological notions are abstruse and fanciful, and yet that a theological condemnation of the Holocaust by Pius XII would have made a difference. Others today who are bitterly opposed to the Church’s perennial position against the moral approval of homosexual acts, or against abortion or euthanasia, also seem to delight in weakening the moral authority of the papacy. At the commanding heights of culture, as the Marxists used to say, this new establishment resents the imputation that what it blesses as moral is contrary to the law of God and hence immoral. The critics of Pius XII are deflecting attention from themselves; for this new establishment, it is convenient to discredit the messenger.

The more antithetical the times to Catholic substance, the higher the prestige of the papacy seems to climb. It appears to be an office most easily injured by universal obeisance, and bravest and most useful when it runs against the grain. Two thousand years on the same spot, above the tomb of Peter, it has seen many powerful establishments rise and fall. Today’s charges against Pius XII cannot stand scrutiny.

Published in First Things Online May 26, 2004, first published by First Things, August/September 2000

Copyright (c) 2000 First Things 105 (August/September 2000): 20-22.

Defining Social Justice

Last year marked the one hundredth anniversary of the birth of Friedrich Hayek, among whose many contributions to the twentieth century was a sustained and animated put–down of most of the usages of the term “social justice.” I have never encountered a writer, religious or philosophical, who directly answers Hayek’s criticisms. In trying to understand social justice in our own time, there is no better place to start than with the man who, in his own intellectual life, exemplified the virtue whose common misuse he so deplored.

The trouble with “social justice” begins with the very meaning of the term. Hayek points out that whole books and treatises have been written about social justice without ever offering a definition of it. It is allowed to float in the air as if everyone will recognize an instance of it when it appears. This vagueness seems indispensable. The minute one begins to define social justice, one runs into embarrassing intellectual difficulties. It becomes, most often, a term of art whose operational meaning is, “We need a law against that.” In other words, it becomes an instrument of ideological intimidation, for the purpose of gaining the power of legal coercion.

Hayek points out another defect of twentieth–century theories of social justice. Most authors assert that they use it to designate a virtue (a moral virtue, by their account). But most of the descriptions they attach to it appertain to impersonal states of affairs—“high unemployment” or “inequality of incomes” or “lack of a living wage” are cited as instances of “social injustice.” Hayek goes to the heart of the matter: social justice is either a virtue or it is not. If it is, it can properly be ascribed only to the reflective and deliberate acts of individual persons. Most who use the term, however, ascribe it not to individuals but to social systems. They use “social justice” to denote a regulative principle of order; again, their focus is not virtue but power.

The term “social justice” was first used in 1840 by a Sicilian priest, Luigi Taparelli d’Azeglio, and given prominence by Antonio Rosmini–Serbati in La Costituzione Civile Secondo la Giustizia Sociale in 1848. John Stuart Mill gave this anthropomorphic approach to social questions almost canonical status for modern thinkers thirteen years later in Utilitarianism:

"Society should treat all equally well who have deserved equally well of it, that is, who have deserved equally well absolutely. This is the highest abstract standard of social and distributive justice; towards which all institutions, and the efforts of all virtuous citizens, should be made in the utmost degree to converge."

Mill imagines that societies can be virtuous in the same way that individuals can be. Perhaps in highly personalized societies of the ancient type, such a usage might make sense—under kings, tyrants, or tribal chiefs, for example, where one person made all the crucial social decisions. Curiously, however, the demand for the term “social justice” did not arise until modern times, in which more complex societies operate by impersonal rules applied with equal force to all under “the rule of law.”

The birth of the concept of social justice coincided with two other shifts in human consciousness: the “death of God” and the rise of the ideal of the command economy. When God “died,” people began to trust a conceit of reason and its inflated ambition to do what even God had not deigned to do: construct a just social order. The divinization of reason found its extension in the command economy; reason (that is, science) would command and humankind would collectively follow. The death of God, the rise of science, and the command economy yielded “scientific socialism.” Where reason would rule, the intellectuals would rule. (Or so some thought. Actually, the lovers of power would rule.)

From this line of reasoning it follows that “social justice” would have its natural end in a command economy in which individuals are told what to do, so that it would always be possible to identify those in charge and to hold them responsible. This notion presupposes that people are guided by specific external directions rather than internalized, personal rules of just conduct. It further implies that no individual should be held responsible for his relative position. To assert that he is responsible would be “blaming the victim.” It is the function of “social justice” to blame somebody else, to blame the system, to blame those who (mythically) “control” it. As Leszek Kolakowski wrote in his magisterial history of communism, the fundamental paradigm of Communist ideology is guaranteed to have wide appeal: you suffer; your suffering is caused by powerful others; these oppressors must be destroyed. We need to hold someone accountable, Hayek notes, even when we recognize that such a protest is absurd.

We are not wrong, Hayek concedes, in perceiving that the effects of the individual choices and open processes of a free society are not distributed according to a recognizable principle of justice. The meritorious are sometimes tragically unlucky; the evil prosper; good ideas don’t pan out, and sometimes those who backed them, however noble their vision, lose their shirts. But a system that values both trial–and–error and free choice is in no position to guarantee outcomes in advance. Furthermore, no one individual (and certainly no politburo or congressional committee or political party) can design rules that would treat each person according to his merit or even his need. No one has sufficient knowledge of all relevant personal details, and as Kant writes, no general rule has a grip fine enough to grasp them.

Hayek made a sharp distinction, however, between those failures of justice that involve breaking agreed–upon rules of fairness and those that consist in results that no one designed, foresaw, or commanded. The first sort of failure earned his severe moral condemnation. No one should break the rules; freedom imposes high moral responsibilities. The second, insofar as it springs from no willful or deliberate act, seemed to him not a moral matter but an inescapable feature of all societies and of nature itself. Where labeling unfortunate results as “social injustices” led to an attack upon the free society, with the aim of moving it toward a command society, Hayek strenuously opposed the term. The historical records of the command economies of Nazism and communism justify his revulsion at that way of thinking.

Hayek recognized that at the end of the nineteenth century, when the term “social justice” came to prominence, it was first used as an appeal to the ruling classes to attend to the needs of the new masses of uprooted peasants who had become urban workers. To this he had no objection. What he did object to was careless thinking. Careless thinkers forget that justice is by definition social. Such carelessness becomes positively destructive when the term “social” no longer describes the product of the virtuous actions of many individuals, but rather the utopian goal toward which all institutions and all individuals are “made in the utmost degree to converge” by coercion. In that case, the “social” in “social justice” refers to something that emerges not organically and spontaneously from the rule–abiding behavior of free individuals, but rather from an abstract ideal imposed from above.

Given the strength of Hayek’s argument against the term, it may seem odd to assert that he himself was a practitioner of social justice—even if one adds, as one must, “social justice rightly understood.” Still, Hayek plainly saw in his vocation as a thinker a life of service to his fellow men. Helping others to understand the intellectual keys to a free and creative society is to render them a great benefit. Hayek’s intellectual work was not merely a matter of his own self–interest, narrowly understood, but was aimed at the good of the human city as a whole. It was a work of justice in a social dimension—in other words, a work of virtue. To explain what Hayek did, then, we need a conception of social justice that Hayek never considered.

Social justice rightly understood is a specific habit of justice that is “social” in two senses. First, the skills it requires are those of inspiring, working with, and organizing others to accomplish together a work of justice. These are the elementary skills of civil society, through which free citizens exercise self–government by doing for themselves (that is, without turning to government) what needs to be done. Citizens who take part commonly explain their efforts as attempts to “give back” for all that they have received from the free society, or to meet the obligations of free citizens to think and act for themselves. The fact that this activity is carried out with others is one reason for designating it as a specific type of justice; it requires a broader range of social skills than do acts of individual justice.

The second characteristic of “social justice rightly understood” is that it aims at the good of the city, not at the good of one agent only. Citizens may band together, as in pioneer days, to put up a school or build a bridge. They may get together in the modern city to hold a bake sale for some charitable cause, to repair a playground, to clean up the environment, or for a million other purposes that their social imaginations might lead them to. Hence the second sense in which this habit of justice is “social”: its object, as well as its form, primarily involves the good of others.

One happy characteristic of this definition of the virtue of social justice is that it is ideologically neutral. It is as open to people on the left as on the right or in the center. Its field of activity may be literary, scientific, religious, political, economic, cultural, athletic, and so on, across the whole spectrum of human social activities. The virtue of social justice allows for people of good will to reach different—even opposing—practical judgments about the material content of the common good (ends) and how to get there (means). Such differences are the stuff of politics.

We must rule out any use of “social justice” that does not attach to the habits (that is, virtues) of individuals. Social justice is a virtue, an attribute of individuals, or it is a fraud. And if Tocqueville is right that “the principle of association is the first law of democracy,” then social justice is the first virtue of democracy, for it is the habit of putting the principle of association into daily practice. Neglect of it, Hayek wrote, has moral consequences:

"It is one of the greatest weaknesses of our time that we lack the patience and faith to build up voluntary organizations for purposes which we value highly, and immediately ask the government to bring about by coercion (or with means raised by coercion) anything that appears as desirable to large numbers. Yet nothing can have a more deadening effect on real participation by the citizens than if government, instead of merely providing the essential framework of spontaneous growth, becomes monolithic and takes charge of the provision for all needs, which can be provided for only by the common effort of many."

Published in First Things Online May 26, 2004, first published by First Things, December 2000

Copyright (c) 2000 First Things 108 (December 2000): 11-13

Another Islam

Beginning in the thirteenth century, the three monotheistic religions parted ways, with the Jewish and Christian world going in one direction and the Islamic world going in another. We are still coming to terms with that split. But the three faiths still hold more in common than we typically recognize today. For example, Islam (like Judaism and Christianity) has a powerful sense of the transcendence of God—His majesty, His greatness, His incomparability with anything else. You see this in the Muslim’s abject bow at prayer, head lowered to the ground.

There is a sentence in the Psalms that says that the whole world, the whole vastness of the stars and everything else, is to God but a grain of sand. The whole world is insignificant. That’s a way of saying how great God is. The purest single note in Islam is this greatness of God, and the appropriate human response to it is “yes.” “Islam” means submission: “Yes.”

T. S. Eliot says that the most beautiful single line in all of human poetry is in Dante: E ’n la sua volontade è nostra pace. “In His will, our peace.” That’s a Christian and a Jewish expression of the relationship of humans to God. It sounds like submission, doesn’t it? “In His will, our peace.” Sometimes we are told that Islam means “peace.” It does mean peace, if you submit to His will. And it sounds as though that’s very similar to the Jewish and Christian “In His will, our peace.” Mutual understanding can begin from there.

Difficulties in understanding arose in part because Muslim scholars interpreted Plato and Aristotle in such a way that they became convinced that God’s greatness and transcendence are so far above us that God couldn’t be bothered with this grain of sand, with this changeable world in which there are seasons, upheavals and erosions, historic transformations and individual contingencies. God is concerned with necessary things, the things that are eternal, the things that stay the same. His will is not affected by all the things that happen on our level of existence.

Focusing on God’s transcendence is Islam’s great strength. Its weakness is that it can say little about human liberty, and about how human choice affects the will of God. How can God allow for human freedom? How can God permit human choice? It’s as though medieval Muslims imagined liberty to be a zero-sum game. If humans have it, God doesn’t. If God has it, humans don’t. It’s a philosophical problem they couldn’t resolve.

Jewish, Christian, and Muslim writings thus divided, first of all, over the role of liberty in the relations between God and man. So great is God that in the Islamic view He overpowers human liberty. This suggests a kind of determinism. What God knows and does is eternal and necessary and can’t be changed, and no individual will, no knowledge of singulars or contingency, is possible to God. He doesn’t concern himself with things like us, and you can’t talk about human beings as images of God.

“Man and woman He created them,” that much is clear in Genesis. “In the image of God He created them.” For Jews and Christians, human beings are made in the image of God. For Islam, to conceive of an image of God is to fall very short of, even to falsify, His greatness. To speak of images of God is blasphemy. It marks one as an infidel—one who has not seen the point, and is in denial about the inconceivable greatness of God.

The second cause of separation flowed from the first: the tendency toward a doctrine of double truth. God is light, unchanging, eternal. Muslim scholars couldn’t discern a way, philosophically, to deal with contingency and changeable things, such as human beings changing their minds and following their own vocations. And so they had one set of truths to which philosophy led them, and another for talking about reward and punishment as the Koran does. The Koran seems to talk about the ethical life, and it allows for a certain degree of human liberty. But Muslims described that truth as “allegory”—that is, something other than philosophical truth. Their way of solving the problem was thus not to solve it, but rather to say it’s insoluble. Christians and Jews, by contrast, adopted a different solution. If “x” happens, then God eternally willed it. God knows necessary things necessarily and contingent things contingently.

The third difference between Judaism and Christianity on the one hand and Islam on the other had to do with what theologians call the unicity of intellect. Islamic thinkers believed they were following Aristotle here, but at a crucial point they were not. With Aristotle, they held that each of us has two kinds of intellect. One is the “potential” intellect, by which we are open to understand all things. We receive impressions of the world; we take things in. But then there is an active, questioning, almost aggressive intellect, which goes out raising questions, deploying logic, calculation, and abstraction. This drive was then called the “active intellect.”

Some Islamic scholars took the view that, while each of us seems to have a potential intellect of his own, there is really only one potential intellect in the world: the Divine intellect. They reasoned that when, as a fruit of inquiry or investigation, we have an insight, we come to share in an understanding that others have shared before. One difficulty with this interpretation is purely epistemological. If there is only one potential intellect for all, then it alone possesses all knowledge, and individuals would not have to discover knowledge for themselves. This is completely contrary to experience. Another difficulty is that this way of analyzing the act of understanding diminishes human liberty. It deprives human beings of their own personal acts of understanding. We are no longer creatures capable of individual insight and choice, the kind of creatures that the stories of Judaism and Christianity require.

Every story in the Bible is a story of how human beings use their will. Sometimes they say “yes” to God, sometimes “no.” King David in one chapter is faithful to his Lord, and in the next he is not. The suspense is always, “What will he do next?” And so the axis of every story in the Bible is the arena of human will. It is the most important theater of action in the world. In this arena, God offers human beings friendship: Will they accept it or not? That’s the drama of history. That drama hinges on human liberty, our capacity to say “yes” or “no.” The Jewish and Christian story is that God created the whole cosmos so that somewhere in it would be a creature with whom He could share His love or His friendship. And to human beings He offered His friendship, as to no other creature. That’s why human beings have a dignity beyond any other creature. That’s why the death of a cockroach or a fly presents no moral crisis—no wrong against the natural order, in which all things come to be and then perish. Yet the untoward death of humans is somehow a violation of the order of things.

God wants the friendship of free people, not slaves; we must be free to say “yes.” “The light shines in the darkness, and the darkness has not overcome it,” says the prologue of the Gospel of John. So freedom is at the heart of the Jewish and Christian story in a way that it is not at the heart of the Islamic story.

The Jewish and Christian story also unleashes human dynamism. In the eleventh century, the shoulder harness for horses and oxen was invented in the West. There was also the invention of the stern rudder for steering ships and the invention of mariners’ tools for plotting one’s position on the earth, which enabled men to go out on the ocean. There was the invention of eyeglasses and of magnifying glasses, as well as cogs and wheels that make watches and clocks possible. There was a sudden explosion of innovations—what’s called the first Industrial Revolution of the eleventh century—that started to make Western civilization the equal of Islam.

The fact that the people of the West believed that all humans are made in the image of God meant that they understood themselves to be called to create, to invent, to discover, to figure out how all things work. And the result was the great thrust of modern science, modern technology, the invention of a new form of political science, the process of “modernization,” and the invention of economics. Jews and Christians took joy in discovery. For them, work was a vocation; it was to be, in some sense, God-like.

In light of this history, how should Christians of good will respond to Islam today? The United States is now home for a great many Muslims. They are our fellow citizens. All of us have been thrown into a worldwide struggle for our own survival against terrorists. That’s a fresh reason—but not the only reason—why it’s our task to see if there are resources shared by Islam, Judaism, and Christianity that can help us to revisit those three ancient problems—the transcendence of God, human liberty, and truth.

Above all, we must speak for the rights of women in Islam and the poor within Islamic countries. We must help create the conditions for economic prosperity, democracy, and human rights in the Islamic world. When we speak of human rights, we cannot mean only American rights. We mean the rights of all humans, including the rights of Muslims.

We need to give voice to those rights. We would be unfaithful to ourselves if we did not. We should listen for echoes of this voice in the Islamic world. We saw the joy in Afghanistan when people were liberated from the Taliban. We see in Iran today young people taking to the streets in the name of freedom.

There is a widespread desire everywhere to have human rights declared, protected, and advanced. The world we should work for in the decades just ahead is a world in which Muslims, like every other people on this planet, are free to worship as conscience directs them; a world in which Muslims, like every other people, are free to inquire and study and write and speak; a world in which Muslims, like every other people, escape from poverty by the millions, and find abundant opportunity to employ their immense wealth of God-given talents to make a better Earth; a world in which Muslims, along with all other peoples, are free to practice the arts of democracy, civility, and all the fundamental human rights that are endowed in every man and every woman on this earth by our Creator, Who is One, and Who is Great. May His name be praised, in, by, and through liberty for all.

Published in First Things Online May 25, 2004, first published by First Things, November 2002

Copyright (c) 2002 First Things 127 (November 2002): 17-18.