All thinking about the good society, what is to be wished for in the way of life in community, necessarily depends on assumptions about human nature. All sorts of things have been assumed about human nature, and have been found persuasive or at least have been accepted as true over the course of history. We have had a long conversation in this country about class, race, ethnicity, and gender, how the moral, intellectual, and emotional qualities attributed to those in favored or disfavored categories create the circumstances of their lives, and, as they do so, reinforce an acceptance of the belief that these qualities are real, these characterizations are true. When there were no women in medical school or law school, or in higher education, it was easy to believe that they would not be able to endure their rigors. We in this country are fortunate to have a moderately constant loyalty to the idea of equality that has moved us to test the limits imposed by these cultural patterns, some of them very ancient, some of them once virtually universal and now still deeply entrenched in many parts of the world.

Of course we have not realized anything approaching this ideal. The meaning of it is much disputed—does it mean equality of opportunity or equality of outcome? Frankly, if we were to achieve either we might find that it resembled the other nearly enough to make the question moot. In any case, our failures, real and perceived, sometimes manifest as an anger with the project itself, and this distracts attention from the fact that we have made a very interesting experiment, full of implication, in putting aside traditional definitions and expectations and finding that when they are not supported culturally, which is to say artificially, they tend to fade away. We can learn from our own history that the nature of our species, and our nature as individuals, is an open question.

I do not draw any conclusions from the fact of our apparent malleability. Certainly it cannot imply perfectibility. Since we don’t know what we are, and since we have a painful and ongoing history of undervaluing ourselves and exploiting one another, we are hardly in a position to attempt our own optimization. Still, how can we find our way toward a fuller knowledge of ourselves? I have a favorite scientific fact that I always share with my students: The human brain is the most complex object known to exist in the universe. By my lights, this makes the human mind and the human person the most interesting entity known to exist in the universe. I say this to my students because I feel their most common problem is also their deepest problem—a tendency to undervalue their own gifts and to find too little value in the human beings their fiction seeks to create and the reality it seeks to represent. By means direct and indirect this problem has been educated into them.

I have a habit of browsing relatively respectable journalism to get a sense of the climate of opinion on this great subject, human nature. On CNN.com I came across an article which affirmed that liberals and atheists have higher IQs than conservatives and the religious. It explained the difference in terms of the tendency of intelligent people to act in ways that are not conventional, in the article a near synonym for “natural.” Liberal is defined for these purposes “in terms of concern for nonrelated people and support for private resources that help those people.” According to the evolutionary psychologist Satoshi Kanazawa, author of the study, which was done at the London School of Economics but with American data, “It’s unnatural for humans to be concerned about total strangers.”

These very confident statements about human nature always seem odd under scrutiny. If the study should not be taken to imply that conservatives are less religious than liberals, then it must be taken to imply that religious people are less inclined than others to feel concern for people to whom they are not related. I’m sure all of us can think of a thousand examples that argue against this conclusion. But then the article seems to have a special definition of religion. I quote: “Religion, the current theory goes, did not help people survive or reproduce necessarily, but goes along the lines of helping people to be paranoid, Kanazawa said. Assuming that, for example, a noise in the distance is a signal of a threat helped early humans prepare in case of danger. ‘It helps life to be paranoid, and because humans are paranoid, they become more religious, and they see the hands of God everywhere,’ Kanazawa said.”

Without assigning any truth value to any of this, for example, his comparison of these groups on the basis of intelligence, I found myself pondering the assumptions embedded in it. Since it is intelligence that distinguishes our species and inventiveness that has determined our history, by what standard should an unconventional act or attitude be called unnatural? How can human nature be held to another standard of naturalness than its own? Perhaps with our intelligence comes the capacity to know about and empathize with the problems of strangers, and this makes it natural for us to do so. On grounds of their intellectual and practical limitations, bears in Canada must be forgiven their apparent indifference to the fate of bears in China. Under other conditions—bigger brains, opposable thumbs, bipedalism—for all we know they might be model activists.

The article suggests that people of high intelligence actually intend to impress others through unnatural behavior, here philanthropy. This sweeps concern for strangers back into the great category of self-interested behavior, which would seem to make it natural after all, granting the assumptions of those who find self-interest at the origin of all behavior. It would be a simple fix to broaden the definition of “natural” to accommodate observed behavior, seeing that the means of doing so are so ready to hand. But supposedly this concern for strangers is liberal, and Kanazawa identifies as a “strong libertarian,” so perhaps a little tincture of self-interest has colored his conclusions.

It is characteristic of these queries into human nature that everything exceptional about us and about the situation in the world we have created for ourselves is excluded from consideration. It is as if a realistic view of the hummingbird required the exclusion of small size and rapid metabolism, or as if bees could only be understood minus their hives and their interest in pollen. There is actually some reason to worry about this kind of throwaway scientism, however transparently flawed, because versions of it are everywhere and because, whatever else it is, it is almost always presented as learned hypothesis if not outright “information” about our kind, assumptions about human nature presented as if they were objective truth and a reasonable and necessary basis for understanding reality.

Another article, this time from The New York Times Magazine, describes current thinking on the nature and function of the human brain as follows:

“[E]volutionary psychology tries to explain the features of the human mind in terms of natural selection. The starting premise of the field is that the brain has a vast evolutionary history, and that this history shapes human nature. We are not a blank slate but a byproduct of imperfect adaptations stuck with a mind that was designed to meet the needs of Pleistocene hunter-gatherers on the African savanna. While the specifics of evolutionary psychology remain controversial—it’s never easy proving theories about the distant past—its underlying assumptions are largely accepted by mainstream scientists. There is no longer much debate over whether evolution sculptured the fleshly machine inside our head. Instead, researchers have moved on to new questions like when and how this sculpturing happened and which of our mental traits are adaptations and which are accidents.”

This line of reasoning clearly assumes much, and implies much more, when it sorts human mental life into only two categories—adaptations that suit us to life on the primordial savanna and “accidents.” All sorts of creatures are suited to surviving in their environments—this should be obvious on its face. The world would be a very empty place if it were not in fact axiomatic. Our humanity consists in the fact that we do more than survive, that a great part of what we do confers no survival benefit in terms presumably salient from the Pleistocene point of view. This kind of thinking places everything remarkable about us in the category “accidental,” at least until some primitive utility can be imagined for it. If we were to step back and look at ourselves without preconception, if we were to say, for example, that we are what we do, then the fact of our biological kinship with the other creatures, which so far as I know has never been disputed, would not overshadow the indisputable fact that we are radically unique.

Every great question is very old. In the sixteenth century good John Calvin rejected “the frigid dogma of Aristotle,” using the human faculties that exist in excess of or apart from physical need to argue for the existence and immortality of the soul. He said,

“[T]he powers of the soul are far from being limited to functions subservient to the body. For what concern has the body in measuring the heavens, counting the number of the stars, computing their several magnitudes, and acquiring a knowledge of their respective distances, of the celerity or tardiness of their courses, and of the degrees of their various declinations? The manifold agility of the soul, which enables it to take a survey of heaven and earth; to join past and present; to retain the memory of things heard long ago; to conceive of whatever it chooses by the help of the imagination; its ingenuity also in the invention of such admirable arts, are certain proofs of the divinity in man.”

To say that these capacities in us are “imperfect adaptations,” accidental to our nature rather than essential to it, is to exclude them from the degree of reality enjoyed by adaptations appropriate to what is imagined as life in the Pleistocene, the period of the last great ice age. Of course it would be easy to make the case that our imperfect adaptations create the environment in which we as a species have lived for a very long time, and which have changed the terms of existence for all of life. Notably, they have given us the means to survive lesser ice ages and inclement weather generally. To put the matter another way, to begin with the assertion that we are primates after all, and on that basis to discount the vast differences between us and other primates, and to conclude on that basis that we are, when all is said and done, simply primates with a great many epiphenomenal qualities is circular reasoning to say the least. And it is always worth wondering what we really know about our cousins, the apes. Whatever else may be said about them, they also have traits that brought them through the Pleistocene. As does every creature that has been among us for ten or twenty thousand years—traits that differ from species to species as traits tend to do. And Homo sapiens sapiens is a species unto itself, which might be expected to discourage facile generalizations, even if we had secure accounts of the adaptations that brought us and our kindred through the Ice Age.

Other disciplines have adopted versions of humankind stripped to what is proposed as its essence, that is, minus its most distinctive characteristic, its complexity. In a recent column in The New York Times, David Brooks, with whom I almost never agree, traced the recent economic collapse to the fact that elaborate theoretical models were based on a stick-figure anthropology, the idea of “the perfectly rational, utility-maximizing autonomous individual.” This may seem incredible, looking back, but in fact I have had conversations with people who were entirely persuaded of the rightness of this model, and who could not be bothered with a glance at the historical record or at current affairs, or with a moment of introspection. We were awash in wealth, or something that looked like wealth, and the secret of unleashing yet more of it was adherence to the notion that markets were in some sense free, and should be even freer so that this perfect rationality and maximization of utility could have its full, beneficent effect. Simply stand out of the way, and the best of all possible worlds will emerge on its own, more or less inevitably.

This best of all worlds might not have been to one’s taste, since it seemed to move toward its fulfillment with a vigorous disregard for the fragility of the planet and the finitude of its resources, and since it was driven by a calculus of self-interest that was materialist in the strictest sense of the word. Child labor on one continent produced a plethora of cheap and disposable gadgets for another continent, which fouled and wearied the sea in their transit. No matter. There was a rationality in it all that made doubts about the value of it, objections to the destructiveness of it, sentimental and retrograde. And unenlightened. That invisible hand was shaping—who knows what, really. I can’t regret the fact that we will likely never know. It seemed to have the Midas touch, the ability to monetize virtually anything. Unlike that legendary king, who learned to lament his gift, it could also discount the worth of whatever resisted definition in its frankly mercenary terms.

Not so long ago there was a theory abroad that college professors should be paid per head of student consumer they attracted to their classes. The curriculum was to have been designed on the same basis, adjusting pedagogical supply to undergraduate demand. No more small classes in specialist fields, unless some corporation saw fit to underwrite them. No more retaining the capacity to teach in areas that might not be popular but might nonetheless have importance that in any present moment was unforeseeable. Whenever I hear monotheism or religious difference singled out as the great cause of conflict among peoples, I wish some part of the population at some time in their lives had been required to read Herodotus and Thucydides. The Commentaries on the Gallic War, that old staple of high-school Latin, could shed a little light on this very contemporary canard, a supposed insight that burst on us suddenly not because we had reached a pinnacle of enlightenment that allowed its truth to be realized at last but because whole literatures of relevant context had been, for all purposes, forgotten. How peaceful was the polytheistic world, in fact? Why did the nations so furiously rage together? Well, since we thought we knew all we needed to know about human nature, there seemed no longer to be any point in consulting human history.

One might have thought that this proposed streamlining of the institution toward economic efficiency would at least have been self-consistent. This seems to be one thing economy would require. All consequences should be harmonious if not mutually reinforcing, presumably. But we all know what sort of thing will fill a college lecture hall. Charismatic professors and unconventional topics are an important element in the university experience, to be sure. Less-demanding classes sometimes make it possible for students to take on other classes whose material is especially daunting or whose demands are especially great. And lectures that everyone talks about are a powerful leaven in a community of learning. But none of these things yield the other presumed desideratum of the University of the Invisible Hand, that is, an efficient, economically competitive workforce. So if the university were rationalized to produce the best teacher-student ratio, putting aside other considerations, it would lose the ability to produce highly disciplined graduates with the knowledge bases that would make them effective participants in an evolving world economy—and the emphasis here should certainly fall on “world.” But to point this out was only to reveal a dismal failure of comprehension. Market forces would take care of the details, even the largest ones, and reconcile them all the more elegantly if only they were left alone to do their work.

It was a simple faith.

And like other simple faiths, it seemed for some reason to predispose its believers to indignation. It produced an oddly collectivist mentality, since whatever inhibited the working out of its great single law could be thought of as reducing the prosperity and economic freedom of every one—the emphasis here on “one,” since it imagined an oddly atomized collectivity. Why should taxpayers have to support someone who teaches classical Greek to thirty students a year? Why pay an enormous tuition so that someone can teach modern Chinese to forty students a year? These inefficiencies are in effect a tax on individual wealth that could otherwise go into the great stream of utility-maximizing autonomous individual self-interest.

And what about social arrangements that might reward uneconomic choices, that would tend to shield the hapless and the feckless from the consequences of their own errors and deficiencies? By virtue of the totalism of this model of reality, they are everybody’s business, everybody’s problem. The fact that some upstanding producer/consumer somewhere is permitting her own power of rational choice to be diminished because she acts on a sentimental loyalty to her ne’er-do-well cousin or her beleaguered fellow citizens becomes, in these terms, not only foolish but actually wrong. Simple faiths tend to be driven to distraction by anomalies, and to bring an especially acerbic moralism to bear on whatever their belief systems cannot account for. If Homo sapiens sapiens is also Homo economicus, why all these deviations from the norm? If self-interest disciplines choice, why is society at every scale shot through with arrangements that seem to inhibit or defeat self-interest? One possible explanation might be that these arrangements actually describe human nature, mingled thing that it is. For this reason they are surely more to be credited as information on the subject than is any abstract theory. But no. There is instead the urge, driven by righteousness and indignation, to conform reality to theory.

This tendency has become generalized beyond the self-declared objectivity of economics. Cultural patterns replicate by analogy much more readily than they extend themselves by logic. We live in a time in which certain rather startling words have crept into American political discourse—Fascist, Stalinist, Maoist. If they have any legitimate use in this context, it is perhaps to draw attention to the recurrence of this impulse to conform reality to theory, as these ideologies all did in their time. In each of these cases there were infuriating anomalies—called cancers, parasites, bacilli—and otherwise known as elitists, dissenters, subversives, foreigners, persons perceived as foreigners or as defectives or deviants or more generally as threats to or burdens on the body politic. History being the greatest ironist, those who use the words “cancer” and “elitist” with reference to their fellow citizens now also use the terms “Fascist” and “Maoist,” having added them to the lexicon of disparagement the Fascists and Maoists put to such effective use in dividing and devastating their own societies.

The economic theory described by Brooks is often called “capitalism,” and to point to its moral and aesthetic shortcomings is therefore viewed in some quarters as unpatriotic, though, as Brooks points out, this late permutation of American economic theory has done the country catastrophic harm. Capitalism is presented as quintessentially American, though this form of it is deeply, and for some intolerably, at odds with many of our institutions, for example our venerable postal system. Noah Webster’s 1840 edition of An American Dictionary of the English Language does not include the word capitalism. It does define “capitalist” as follows: “A man who has a capital or stock in trade, usually denoting a man of large property, which is or may be employed in business.” This definition implies nothing like an economic system, let alone an ideology. I realize that my drawing attention to this fact in certain quarters might set off a hectic search for Webster’s birth certificate.

Nevertheless, it seems important to note, before we ransack 400 years of cultural development in the name of making the country more purely itself, that neither the word nor the concept is discoverable among our founding documents. It is everywhere, under the name “political economy,” in Britain at that time, as it had been for generations. But Britain was the great power from which we were attempting to differentiate ourselves. Webster’s American Dictionary offers a very general definition: “Political economy comprehends all the measures by which the property and labor of citizens are directed in the best manner to the success of individual industry and enterprise, and to the public prosperity. Political economy is now considered a science.” Again, there is no suggestion of system or ideology, and nothing that particularly associates this “science” with America. The 1840 Webster’s American Dictionary does include a definition of “socialism”: “A social state in which there is a community of property among all the citizens; a new term for Agrarianism.” And it defines “communism” as a “community of property among all the citizens of a state: a state of things in which there are no individual or separate rights in property; a new French word, nearly synonymous with agrarianism, socialism and radicalism.” So some version of the modern vocabulary was current in 1840, which makes the absence of “capitalism” all the more interesting.

I suspect we have never come up with a term to distinguish our economy from others, though it is unique in important ways and has served us remarkably well. In fact it was not a system but a patchwork of experimentation well into the twentieth century. There is a fine, old, quintessentially American word, “pragmatism,” that should serve well enough to describe the non-ideological way we went about our national life in the days of our expanding prosperity. It did not have its full modern sense in 1840—Webster defines “pragmatically” first as “In a meddling manner, impertinently,” but also as “in a manner that displays the connections and causes of occurrences.” This is the essence of it for our purposes, engagement with reality as we encounter it in the world of experience. A practical response to occurrences, mysterious as they are, demanding as they are of vigilant observation and whatever can be mustered in the way of objectivity. Once we were innovators. Once we were credited with ingenuity.

My point is that our civilization has recently chosen to identify itself with a wildly oversimple model of human nature and behavior and then is stymied or infuriated by evidence that the models don’t fit. And the true believers in these models seem often to be hardened in their belief by this evidence, perhaps in part because of the powerfully annealing effects of rage and indignation. Sophisticated as we sometimes claim to be, we have by no means evolved beyond this tendency, are deeply mired in it at this very moment, and seem at a loss to think our way out of it.

Yet there are other ideas floating around in the general culture, or fragments of information that could be the basis for other kinds of thinking, if we gave them any part of the credence we extend so willingly to the most brutally reductionist of these theories and their ilk. An article appeared recently in the Science section of The New York Times that described the discovery of stone tools on the island of Crete. According to the article, the tools are “at least 130,000 years old, which is considered strong evidence for the earliest known seafaring in the Mediterranean and cause for rethinking the maritime capabilities of prehuman cultures.” If the writer is making a precise use of the term, I would consider this discovery cause for rethinking the definition of the word “prehuman,” and therefore the word “human,” taking behavior rather than anatomy as the set of traits by which humanity should be distinguished. The article goes on to say that “the style of the hand axes suggested that they could be up to 700,000 years old,” since they “resemble artifacts from the stone technology which originated with prehuman populations in Africa.” Again, I find it a little startling to find the words “culture” and “technology” associated with creatures excluded from the category “human.” Excluded on the basis of their physical configuration and cranial capacities, of course, but apparently capable of acting effectively on complex intentions, and of sustaining within their populations a body of skills worthy of the name “technology.” This should suggest that the ability to teach and learn and to sustain skills and knowledge over generations not only preceded but formed modern man. According to the Times article, this discovery on Crete appears to be evidence that these prehumans “had craft sturdier and more reliable than rafts” and “must have had the cognitive ability to conceive and carry out repeated water crossings over great distances in order to establish sustainable populations producing an abundance of stone artifacts.” How many generations of refinement and transmission of skills would be required to produce these crafts, good enough and numerous enough over time to make this migration and colonization possible? Presumably life spans were short, and this would have accelerated the process of teaching and learning, since mastery would have to have survived despite the early deaths of most ancient artisans and sailors. In other words, it seems reasonable to assume that life among these hominids must have been quite intensely cultural and collaborative.

I am still using a journalistic source, so perhaps I am too easily impressed by talk of stone artifacts and the geological strata that yield them. All the same, comparison with the other articles I have looked at does draw attention to the fact that they proceed entirely—I think it is fair to use the word “entirely”—by inference. The ancient hominid who weathered the last ice age figures decisively in scientific understanding of the human brain—which is really to say of human nature. Yet the hominid itself is essentially hypothetical, the creature of theory. In him or her we can recognize Brooks’s utility-maximizing autonomous individual, the very creature, in the guise of the modern producer/consumer, that has haunted our economics departments for these last decades. The hominid could probably even be called perfectly rational, as animals generally are when they negotiate the conditions of their survival.

I wish to suggest that there is more than coincidence at work here. Modern theories of human nature, which are essentially Darwinist and neo-Darwinist, pare us down to our instincts for asserting relative advantage in order to survive and propagate. This dictum hangs on our essential primitivity as they understand it—assuming that our remote ancestors would have been describable in these terms, and that we, therefore, are described in them also. But it seems worthwhile to remember that this is a modern theory projected onto the deep past. Then the past, seen through the lens of this theory, becomes the basis for interpreting the present. And the observed persistence of these archaic traits in modern humanity affirms the correctness of this characterization of our remote ancestors, which goes to prove that these archaic traits do in fact persist in us. The endless mutual reinforcement distracts attention from the fact that it is all hypothetical. We know precious little about those dwellers on the savannas of the Pleistocene, and, as Brooks points out, we clearly know precious little about ourselves.

By some standards, 130,000 years ago is the blink of an eye. But it does take us back to the Pleistocene. Those prehuman colonizers of Crete seem to have left tangible information about themselves, if the report is to be believed, and this sets them apart from hypothetical prehumans. They could navigate at great distances over open water. Perhaps they had considered the heavens and had found a practical use for a knowledge of the stars. We have half-smothered the stars with our cocoon of artificial light, but the ancients seem to have watched them endlessly. To consider means, etymologically, to take account of the stars, for the purpose of making a decision. Etymologically, a disaster is a bad star. These words are from Latin, which came late into the world, but which expresses a prescientific confidence in the inter-involvement of the cosmos and humankind. This sort of thing is reckoned primitive, so why should it not be among our primal traits? Perhaps it is excluded because it looks too much like metaphysics.

I am extrapolating, too. I can’t help but wonder what history lay behind all this prehuman skill and purpose. So far as there is a visible trajectory from primitivity to present time, it suggests that creative intelligence appeared early. The neo-Darwinists would say this is all just an effect of genes seeking to propagate themselves. Of course the same might be said of every feature of all gene-bearing life, while the great interest of life lies in the fantastic differences among its forms. This is an instance in which a theory that explains everything really does explain nothing. It is rather like saying that life is an expression of the tendency of complex molecules to form in the bellies of stars. However true this may be, there is clearly a great deal more to the story. After all, human intelligence is not just a compliment we pay ourselves. It is a phenomenon of great interest in its own right. If it is rooted as deeply in our origins as artifacts suggest, in effect preexisting us by many thousands of years, and if its artifacts suggest teaching and learning, culture and cooperation, then surely we should be less invested in the low estimate of our ancestors, therefore ourselves, on which modern anthropologies depend. We should drop the pretense that we know what we don’t know, about our origins and about our present state. Specifically, we should cease and desist from reductionist, in effect invidious, characterizations of humankind.

I would like to propose a solution of sorts, ancient and authoritative but for all that very sporadically attended to. What if we were to say that human beings are created in the image of God? It will certainly be objected that we have no secure definitions of major terms. How much do we know about God, after all? How are we to understand this word “created”? In what sense can we be said to share or participate in the divine image, since the Abrahamic traditions are generally of one mind in forbidding the thought that the being of God is resolvable to an image of any kind?

But it is on just these grounds that this conception would rescue us from the problems that come with our tendency to create definitions of human nature that are small and closed. It would allow us to acknowledge the fact, manifest in culture and history, that we are both terrible and very wonderful. Since the movement of human history has been toward a knowledge and competence that our ancestors could not have imagined, an open definition like this one would protect us from the error of assuming that we know our limits, for good or for harm. Calvin understood our status as images of God to have reference to our brilliance. He said, truly and as one who must have known from his own experience, that we are brilliant even in our dreams. There is much that is miraculous in a human being, whether that word “miraculous” is used strictly or loosely. And to acknowledge this fact would enhance the joy of individual experience and enhance as well the respect with which we regard other people, those statistically almost-impossible fellow travelers on our profoundly unlikely planet. There is no strictly secular language that can translate religious awe, and the usual response to this fact among those who reject religion is that awe is misdirected, an effect of ignorance or superstition or the power of suggestion and association. Still, to say that the universe is extremely large, and that the forces that eventuate in star clusters and galaxies are very formidable indeed, seems deficient—qualitatively and aesthetically inadequate to its subject.

I have made a long and indirect approach to my subject—the human spirit and the good society. The subject was of interest to me in the first place because I have felt for a long time that our idea of what a human being is has grown oppressively small and dull. I am persuaded as well that we educate ourselves and one another to think in terms that are demeaning to us all. I mean “educate” in the widest sense. The culture is saturated with information about the expectations we have of ourselves and one another. What we are taught in classrooms is a very minor part of what we learn from ambient experience, which will teach us that learning itself is suspect, not an attempt at some meaningful vocabulary of reflection but instead an affectation, a kind of idleness or triviality that deserves the name “elitism,” and that has no purpose except to assert a claim to superiority.

Here the word “Maoist,” used so loosely these days, really does come to mind. I had a Chinese student once who wrote movingly about a colony of exiles to the frontier of Mongolia who were treated as enemies of the people because they were mathematicians, or because they played the cello. This was done in the name of democracy. I hardly need to mention to this audience that if such standards had been applied at the time of the American Revolution, our democracy would have deprived itself of that whole remarkable circle we call the Founding Fathers, and your own Mr. Jefferson would have been the first to suffer denunciation. The Constitution, to which appeal is made so often these days, could never have been written. We are profoundly indebted to the learnedness, in fact the intellectualism, of the Founders, and if we encouraged a real and rigorous intellectualism we might leave later generations more deeply indebted still. But the current of opinion is flowing in the opposite direction. We are in the process of disabling our most distinctive achievement—our educational system—in the name of making the country more like itself. Odd as the notion might sound, it is well within the range of possibility. To cite only one example, I have seen trinkets made from fragments of Ming vases that were systematically smashed by Mao’s Red Guard. If we let our universities die back to corporate laboratories and trade schools, we’ll have done something quieter and vastly more destructive.

The lowering of ourselves in our own estimation has been simultaneous with the rise of an egoism based on the assumption that it is only natural to be self-serving, and these two together have had a destructive effect on public life. To cite only one example: historically the United States has educated far more people far more broadly and at far greater length than any other civilization in history, and yet the notion is pervasive and influential that we as Americans are hostile to learning. Our colleges and universities—the greatest in the world by any reckoning—have come to be seen as anomalies because the love of learning that built them by the thousands is no longer considered a national trait, indeed, is considered a thing alien to us, despite such formidable evidence to the contrary. I know from visiting all sorts of institutions everywhere in the country that even the smallest college is a virtual Chautauqua of conversation and performance that binds it, together with its community, into national culture and world culture. I know from teaching and traveling elsewhere in the world that the role of higher education in this country is very exceptional.

Yet all this is unacknowledged as we sink deeper and deeper into the habit of mutual condescension, tending always toward mutual impoverishment, insofar as we can still consider ideas and information an essential form of national wealth. Journalism is an especially important instance of this phenomenon. The churches are, in too many cases, another. Over and above specific instances, and behind them, is a drift toward cynicism and away from mutual respect and from willingness to take responsibility for our life as a community and a culture. I know it is impossible to say this without seeming to idealize a past that was dreadful in many respects. But the difference between the evils of the past and the ameliorations of the present is the measure of the willingness of earlier generations to acknowledge and act on needed change. These reforms were made in the name of justice. Justice is reckoned on the basis of our obligations to one another. These obligations are different, lower or higher, depending on the worth we are willing to grant one another.

As I said earlier, in the great matter of human nature, we seem to be able to be persuaded of anything, about ourselves and about others, as groups and as individuals. Granting these astonishing brains we carry around with us, granting the miraculous intricacies of the nervous systems of everyone we pass on the street, we seem to find nothing that will securely anchor ourselves or our species in our estimation. The very idea of human exceptionalism is held up to scorn, as if our doings on this planet were not wildly exceptional, whatever else may be said about them.

Thomas Jefferson wrote, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” This is the kind of thinking I would like to recommend. We don’t know the nature of Jefferson’s religious beliefs, or doubts, or disbeliefs. He seems to have been as original in this respect as in many others. But we do know he had recourse to the language and assumptions of Judeo-Christianity to articulate a vision of human nature. Each person is divinely created and given rights as a gift from God. And since these rights are given to him by God, he can never be deprived of them without defying divine intent. Jefferson has used Scripture to assert a particular form of human exceptionalism, one that anchors our nature, that is to say our dignity, in a reality outside the world of circumstance. It is no doubt true that he was using language that would have been familiar and authoritative in that time and place. And maybe political calculation led him to an assertion that was greater and richer than he could have made in the absence of calculation. But it seems fair to assume that if he could have articulated the idea as or more effectively in other terms, he would have done it.

What would a secular paraphrase of this sentence look like? In what nonreligious terms is human equality self-evident? As animals, some of us are smarter or stronger than others, as Jefferson was certainly in a position to know. What would be the nonreligious equivalent for the assertion that individual rights are sacrosanct in every case? Every civilization, including this one, has always been able to reason its way to ignoring or denying the most minimal claims to justice in any form that deserves the name. The temptation is always present and powerful because the rationalizations are always ready to hand. One group is congenitally inferior, another is alien or shiftless, or they are enemies of the people or of the state. Yet others are carriers of intellectual or spiritual contagion. Jefferson makes the human person sacred, once by creation and again by endowment, and thereby sets individual rights outside the reach of rationalization.

My point is that, without the terms of religion, essential things cannot be said. Jefferson’s words acknowledge an essential mystery in human nature and circumstance. He does this by evoking the old faith that God knows us in ways we cannot know ourselves, and that he values us in ways we cannot value ourselves or one another because our intuition of the sacred is so radically limited. It is not surprising that the leader of a revolution taking place on the edge of a little-known continent, a man clearly intent on helping to create a new order of things, would attempt an anthropology that could not preclude any good course history might take. Jefferson says that we are endowed with “certain” rights, and that life, liberty, and the pursuit of happiness are “among these.” He does not claim to offer an exhaustive list. Indeed he draws attention to the possibility that other “unalienable” rights might be added to it. And he gives us that potent phrase “the pursuit of happiness.” We are to seek our well-being as we define our well-being and determine for ourselves the means by which it might be achieved.

This epochal sentence is a profound acknowledgment of the fact that we don’t know what we are. If Jefferson could see our world, he would surely feel confirmed in the intuition that led him to couch his anthropology in such open language. Granting the evils of our time, we must also grant the evils of his and the cultural constraints that so notoriously limited his vision. Yet, brilliantly, he factors this sense of historical and human limitation into a compressed, essential statement of human circumstance, making a strength and a principle of liberation of his and our radically imperfect understanding.

Excerpted from “When I Was a Child I Read Books” by Marilynne Robinson, to be published by Farrar, Straus and Giroux, March 2012. © 2012 Marilynne Robinson

Marilynne Robinson is the author of the novels Housekeeping (FSG, 1981), Gilead (FSG, 2004), winner of the Pulitzer Prize, and Home (FSG, 2008), and of three books of nonfiction: Mother Country (FSG, 1989), The Death of Adam (1998), and Absence of Mind (2010). She teaches at the University of Iowa Writers' Workshop.
