The Wheatley Institution

The Alphabet of Man

Daniel N. Robinson
September 17, 2009

It is always a pleasure to return to BYU. Tonight, however, a certain melancholy is added to this pleasure, for I have the honor to contribute to a remembrance of Truman Madsen. I have a vivid recollection of the first time I met Truman and Mrs. Madsen. In just a few minutes, I recognized Truman as a man of great depth and great goodness. Those of you who knew him well must miss him mightily.

Eternal Man sits comfortably within my taxonomy of books. Many books are thick, but somehow thin. A few books are thin, but somehow thick. Eternal Man is just this sort of book, the kind that has you pausing frequently to reflect on a seemingly innocent sentence that then turns out to be charged with meaning. The final chapter of the book begins with the passage from Sir Thomas Browne's Religio Medici. It reads:     

There is surely a piece of the Divinity in us; something that was before the elements, and owes no homage to the Sun. Nature tells me that I am the image of God, as well as Scripture. He that understands not this much hath not his introduction or first lesson, and has yet to begin the alphabet of man.

Sir Thomas Browne was 18 when he entered Pembroke College, Oxford. Completing his undergraduate course of studies in 1626, he undertook his medical education on the Continent, at Padua and Leiden. Religio Medici displays the author's extraordinary literary style, and also his frequent departures from orthodoxy. The work was placed on the proscribed list by the Pope, thus ensuring an even wider readership. He was very much a patron of "the new learning," so recently and influentially advocated by Sir Francis Bacon. Was he, then, ahead of his time, a forward-looking and sophisticated skeptic?

Alas, the record here is mixed. Indeed, the record of Sir Thomas Browne's entire age is mixed, and in ways that might well inform our own time. In 1660, when Sir Thomas was already an established and celebrated figure, the Royal Society was founded. For a number of years it was gestating in the form of what Robert Boyle called "the Invisible College." Then, with the restoration of the Stuart monarchy, the Royal Society received a royal charter and in little time could claim the most accomplished membership of any learned body in human history—Robert Boyle, Christopher Wren, Robert Hooke, Isaac Newton, John Locke. It was this society that played so central a part in the cultivation of what we now take to be the modern scientific world view. Although never elected a fellow of the Royal Society, Sir Thomas was a fellow of the Royal Society of Medicine, a comparably forward-looking and scientifically oriented body.

So much for 1660. What about 1662? (I'm not going to give you every year up to 2009.) This, too, is an important year in the calendar of human judgment. It is the year in which Rose Cullender and Amy Denny were found guilty on thirteen counts of what the law referred to as "malevolent witchcraft." Their case was heard before one of the great judges in British history, Sir Matthew Hale. The women were executed, and the widely advertised trial proceedings were taken as a model closely followed across the ocean in a place called Salem. Thus, as the Royal Society worked diligently to unearth the material facts of earthly life and the laws of their relationship, the celebrated British legal system attempted to refine the procedures for identifying witches and imposing just penalties on them. Not only was Sir Thomas Browne present at that trial, but he actually composed a notable essay acknowledging the fact of witchcraft and dismissing skeptics as closet atheists.

My point, of course, is that we surely are a less than consistent species. One further bit of evidence on this point is again supplied by Sir Thomas, this one rather amusing. In Religio Medici he offered something of a lamentation on the means by which our Creator saw fit to have us procreate. He regarded the required conduct as revolting and ridiculous, wishing instead that we might have increased our number, as he says, "like trees without conjunction." Nonetheless, he and his wife produced no fewer than a dozen children in eighteen years of marriage, so in this case reality gained the upper hand in its relentless encounters with theory and speculation. What, then, of "the alphabet of man"?

For this, let me turn from Sir Thomas Browne and his seventeenth century to Wolfgang Köhler in the twentieth century. Psychologists and psychology students here will recognize Köhler as one of the founders of Gestalt psychology. He had a profound effect on both physiological and cognitive psychology, and he certainly anticipated much of what has come to be called "cognitive neuroscience." All that, however, is not my reason for turning to Köhler. Instead, it is the book he published in 1938 titled The Place of Value in the World of Facts. 

By this time, Köhler was on the faculty at Swarthmore College, having emigrated from Germany—and away from the rising power of the Nazis. In the first chapter of this work, he reflects on what Max Weber had referred to as "die Krise der Wissenschaft." Literally, "the crisis of science," but the phrase really is better understood as the crisis within the learned professions at large, chiefly as represented within the major colleges and universities, institutes and foundations. The crisis had risen to visible proportions in the First World War, when the once politically neutral and officially "objective" undertakings of science were redirected to serve national and narrow political interests. Weber's was one of the more audible voices proclaiming the objectivity of science, what he referred to as the vocation of science. For all its failings as a serviceable guide to life, science was nonetheless the rational model of human understanding in its fact-gathering mission; at least as Weber saw it. For Weber, what the First World War had made clear was the ease, the speed with which such a limited and dispassionate mission is transformed into a blunt instrument with which to condition and control the gullible.

Weber was wrong on so many important matters that I am reluctant to bring him into tonight's discussion. I doubt many still take seriously his casual linkage between the Protestant ethic and capitalism. It seems to me that, as of now, the only unapologetic capitalists found in significant numbers are in China! I also doubt that the aspiration to develop a value-free social science is either realistic or properly informed. Value, as Weber would have it understood, must be at the very foundation of free inquiry itself, not to mention the essential ingredient in what we take to be the integrity of science. However, what is of continuing importance in Weber's reflection is just this recognition of the lack of objectivity; this recognition of the tendency of the social sciences to serve masters rather than to serve truth. What is also of value, but surely unintended by Weber himself, is the clarity and the cogency of his attempts to comprehend human nature with sweeping theoretical inventions and cherry-picked data, all focused by an ideological lens designed to conceal a veritable world of troubling exceptions. Thus, I leave Weber now, and with warranted haste.

It is at just this point that Köhler should occupy our attention. He recognizes that the crisis, if anything, is now greater in 1935 than it was fifteen years earlier when Weber was writing. He speaks of a conversation—really a debate—he has had with an editor friend, a man of letters. Köhler takes the part of the man of science, insisting that the scientific worldview is what is needed to overcome this crisis. The editor will have none of this, insisting that what science produces is a large number of half-truths and what the editor calls "false facts." Köhler finds such a claim to be oxymoronic, for how can a fact be false? The editor then makes his case without much difficulty, and I shall follow his lead.

Suppose we are confronted by that proverbial visitor from Mars, coming to Earth in order to determine the sorts of creatures who live here, thereupon returning to Mars to submit a full report. Let us say the visitor locates a book titled The Physiology and Chemistry of Human Life. He corroborates the contents through long interviews with leading scientists. Returning to Mars, he offers this summary: "A human being is a body that is 50-75 percent water. The percentage of water depends on the total amount of fat. On average, each human being contains enough sodium chloride to fill three salt shakers. In the infant stage, the average amount of potassium is between seven and eight grams."

Now, as you know, I could continue with this, listing all of the salts, the bone mass, muscle mass, as well as an inventory of reflexes and characteristic movements. There would be a limitless number of facts to report, each of them counted with great accuracy, and based on systematic and scientific study. But, of course, the question that arises is whether the Martian community, now in possession of all of these facts, has even the foggiest notion of just what a human being is! Offered as an answer to the question, "What is a human being?" this body of facts constitutes a deception—a falsehood. In the editor's terms, it's a body of false facts.

What concerned Max Weber nearly a century ago, and what troubled Wolfgang Köhler's friend in the 1930s, is now so thoroughly ingrained in our culture as to go unnoticed. It is more or less taken for granted, by persons facing the moral and social dimensions of life in the modern world, that the surest guide to the right decisions and the right attitudes will somehow be supplied by science. The behavioral problems presented by their children will be dealt with either by way of pharmacology or by way of psychology. Problems arising within a marriage will be submitted for treatment, for some kind of cure by trained counselors. And morality itself, for which there are specific brain mechanisms, of course, is a cultural phenomenon, the details of which fall within the province of anthropologists and social science. To locate the correct position to take on matters of political and social consequence, one need consult only the most recent polls, generally broken down by gender, socioeconomic status, level of education, and region. The question is simply one of finding the "norm" that one matches up with and then, by a kind of retrofit, adjusting one's perspective accordingly.

Before considering whether or not science is the right sort of guide in all such areas—indeed, in any such area—it is important to recognize that the guidance it has already provided has a surprisingly vacillating character. I want to choose a timely but of course controversial example here to keep you alert. And so I shall choose the scientific understanding of homosexuality. I say "scientific," but only in so far as the fields of psychiatry and psychology are included within the larger scientific framework.

We might begin with where matters stood, or came close to standing, about thirty years ago. Representative of the movement of thought within psychology and psychiatry at the time is a 1976 essay by Gerald Davison titled "Homosexuality: The Ethical Challenge." Recall that this appeared two years after homosexuality had been removed from the Diagnostic and Statistical Manual (DSM) of the American Psychiatric Association. Davison notes the success therapists have had in treating those homosexuals desirous of transforming their sexuality and becoming "normal." There is no hint in his article of homosexuality being in any sense "immutable," genetic, or controlled by brain mechanisms. Rather, the author appeals to fellow clinicians to weigh the possibility that what is being treated is not a disease in the first place, and that the goal of therapy in such cases may have more to do with considerations of social acceptability than considerations of mental health.

This was taken at the time as a measure of forward thinking, a liberation from psychiatry's labored "medical model," which regarded any departure from conventional attitudes and behavior as a sign of possible pathology. Recall that at this same time there were initiatives within psychiatry to employ direct brain stimulation to assist the "inveterate homosexual" to overcome his disability. I could give you a side lecture on approaches at that time, which would involve implanting an electrode and stimulating centers known to excite sexual arousal while the patient viewed pornographic heterosexual films, to the point where, after a course of therapy, the inveterate homosexual actually asked for some real-life opportunity to show that his attitudes had changed. This all went on at Tulane Medical Center, by the way, where the ever-resourceful staff and faculty, perhaps scouring the streets of New Orleans, found a woman eager to contribute to science (for a consideration, of course), who made herself available. As one of the leaders reported when I hosted a lecture by him, the patient "performed [and the adverb here is everything] admirably." So this was going on even as Davison's essay was making its way. The essay is thus usefully contrasted with that "medical model" that had held sway for so long. It surely contrasted sharply with Franz Kallmann's 1952 "classic" study of the genetic foundation of homosexuality. Kallmann not only reported a high concordance of homosexuality in identical-twin pairs—100 percent; this is an all-time record (it never works like that)—but related this to the then firmly held clinical judgment that such sexual departures from the norm were part of a larger psychodynamic malfunction; part of what Kallmann himself referred to, in quintessentially scientific terms, as "an originally disarranged sex constitution."

Well, where do matters stand in 2009? Today's litigation surrounding the issue of same-sex marriage routinely features data from published scientific articles establishing that one's sexual identity or sexual impulses or sexual conduct is—or is not—inborn, resistant to change, expressive of the function of specific neural pathways, etcetera. The influential journal SCIENCE offers an illustrative piece of research in a 1991 Simon LeVay article titled "A Difference in Hypothalamic Structure between Heterosexual and Homosexual Men." Perhaps you are interested in the main effects, in what was found poking around in the hypothalamus. Just in case you are, it was found that the relevant cell groups associated with male sexual behavior were twice as large in heterosexual men as in women, and in heterosexual men as in homosexual men. I think it is fair to say that had such a finding been available in the 1950s, it would have been taken as conclusive proof that homosexuality is not only a pathological condition but an instance of neuropathology, witnessed by the homosexual's "abnormal" cellular morphology. You see, the facts are helpless. Facts are just there. And then we come in and decide what we're going to say about them. Pick your year, pick your country, pick your political party, and I'll tell you what you are going to say about the facts. Meanwhile, the little facts just sit there, helpless, wondering what libel you will heap on them next.

If I might be permitted an aside in this connection, there are many instances in which, as a result of therapy or pastoral counseling or deep inward reflection, men and women do abandon homosexuality and undertake lives of loving and intimate association with members of the opposite sex. One can only wonder whether, in these cases, relevant portions of the hypothalamus now undergo some degree of hypertrophy. And what would we want to say about a culture deeply interested in that? Of course, just to ask the question is to expose the perspective as just the sort of simplification that Köhler's editor friend found to be so characteristic of the scientific reduction of the human condition to something other than the human condition—the scientific reduction of intimacy to something hypothalamic, you see. (In case you didn't notice, there's a lot of money in that.)

Let me emphasize here that I have neither the competence nor the conviction that would allow me to decide how best to understand homosexuality. History teaches, however, that what the law permits it encourages. With Edmund Burke, I am disinclined to jettison whole traditions and institutions that have served humanity well. With Burke, I am especially disinclined when the argument favoring a radical change rises no higher than a claimed "right" or a mere conjecture that would pose as a fact of nature. (Just to tell you, when I don't consult Burke on these matters, I might consult Dorothy Parker: "I don't care what you do, just don't frighten the horses.") I offer these remarks on the scientific understanding of homosexuality to make clear that the putative "facts" of science not only carry cultural and political weight—no matter how carefully concealed—but very often seem to be shaped and even "discovered" by way of factors that are themselves ineliminably political and ideological.

It is in this connection that the shifting status of homosexuality is again revealing. In the early 1970s, reacting to the classification of homosexuality as a treatable mental disorder, well-organized protesters appeared regularly at the offices and meetings of the American Psychiatric Association. Clearly in response to these petitions—some would say clearly in response to what was nothing less than harassment—the APA Board of Trustees agreed to remove homosexuality from the Diagnostic and Statistical Manual. It is worth noting, however, that when their action was submitted for a vote it was approved by only 58 percent of the general membership. Quite apart from the spectacle of a professional medical association essentially asking for a show of hands on such a matter, the results in 1974 make clear that a substantial number of practicing psychiatrists disagreed with the action. Alternative designations were used over the intervening years, but by 1987 the position of the APA was still at variance with what had long been accepted by, among others, the World Health Organization. Can you imagine disputes and votes of this kind taking place on the question of whether balls rolling down an inclined plane undergo acceleration?

I don't want this to be regarded as a commentary on homosexuality within a clinical, or for that matter, broadly social context. No e-mail, please! It is, instead, a reflection on the social sciences and even the medical sciences when the complexities of human behavior, the complexities of human values, are filtered in such a way as to serve what, in the end, is a political end, not a medical end. It is to abandon the mission to understand in favor of the impulse to control.

To understand an event is, among other considerations, to be able to explain it. Thus, when we claim to understand something, we make explicit an adopted position on the nature of explanation itself. Productive disputes within philosophy of science continue to focus on the nature of explanation and on the manner in which a scientific explanation is supposed to differ from what would be acceptable or expected in other and non-scientific areas of inquiry. Of course, central to these disputes is the question of just those features an event must have in order to render it suitable for scientific explanation.

Generalizations are hazardous in matters of this kind, but there is widespread agreement that events best suited for scientific treatment are those that allow repeated measurements with a view toward subsuming them under general laws.

There are good reasons to reject this model of explanation as being applicable to the human condition; applicable to important events in human history, in individual lives, in social and political contexts. Yes, complexity is surely one barrier to this form of explanation, but it is not the only barrier and it might not be the major one. The major barrier, I would submit, arises from the fact that significant psychological, social and moral engagements are highly individuated. I want to clarify this by way of an example.

I would ask you to recall the dates June 16-19, 1815. In what is now Belgium, over these several days, Napoleon's forces engaged and were vanquished by a coalition army led by Wellington. A Google search—(I know how to use Google, by the way. I do recall it was as recently as 2002 that a student told me he had "Googled" me, and I wasn't quite sure how to take that. I recall saying, "Did I know it when it happened?") A Google search of the terms "Battle of Waterloo—Facts" turns up 78 thousand hits. Scores of volumes have been written about the major participants, about Napoleon, about strategies formed and abandoned. To be sure, a sufficient number of features are there to establish that it was, in fact, a battle that took place. These surely are features common to all encounters that we classify as hostile military engagements, but beyond these features each such engagement is unique. You cannot replace Napoleon with someone else and still have the Battle of Waterloo as we understand it. And this for the obvious reason that understanding the Battle of Waterloo requires, in large measure, understanding Napoleon. It is to understand motives and aspirations, thought-processes and earlier conditions disposing one toward one set of values over another. It is, as best we can, to get into the mind and the life of a specific person and to see things as he does.

Let us narrow the scale and scope. How might we understand anyone? How might we understand you or understand me? We might begin by locating that person within a species, and then taking stock of how that species is to be distinguished from other species. Note, however, that it is not for science to legislate just which characteristics are legitimately included and which are irrelevant as we set out to understand someone. We may have good reason to include factors reasonably taken to be as important as species-membership: the characteristics of telling jokes, keeping pets, living under a rule of law, and loving God. These are characteristics, too, and it is not for science to legislate in or legislate out the discriminants we use in an attempt to understand a life, a person.

Presumably, though not all will agree on the relevance of every identifiable feature, there will be general agreement that some things true of almost all human beings are not found outside the human community. For the longest time language was thought to draw the dividing line. Aristotle was inclined to view our ability to comprehend and frame universal propositions as rendering us distinct, as far as he could tell, from nonhuman forms of life. The ability to understand and frame universal propositions, among other things, renders us fit for the rule of law.

As we pick and choose among candidate features that we are, metaphorically speaking, using to assemble the alphabet of man, the features, taken one by one, will then call upon us to assemble them in a manner that generates a coherent, accurate, and intelligible story. Assembling the alphabet is the starting point, and then we put the letters together to form propositions and then entire tales. The story is never quite complete, but it serves as an explanation—an explanation of what makes a given life distinctly human, different from others, fulfilled or unrequited; an explanation of what confers a special identity on a given historical epoch, a given culture, a given achievement, that is distinguishable from others of its kind.

Obviously, if we choose an insufficient number of features, we will have too few letters in our alphabet, and thus find it impossible to write certain words and sentences. Reductionism in science is a summons to economize. It is an explanatory strategy with a noble heritage. We refine it with an instrument we call Occam's razor—the principle of parsimony—and we take proper pride when we are able to explain the widest range of phenomena with the smallest number of causal elements.

Good results have been achieved this way. It is a great achievement to discover that the force required to impart acceleration to an object is determined by the mass of the object. We praise Newton for that discovery, which, among other things, would help us reach the moon and return us safely to the earth.

It surprises some to learn that Newton was a consistent advocate of explanations based on Aristotle's notion of final causes. Granting that celestial dynamics is governed by the law of universal gravitation, Newton, when attempting to explain the whole picture, takes recourse in his Principia to what he calls "the design and dominion of an intelligent and powerful being." Intelligent design? Clearly Newton was a victim of the "God delusion"!

It is a measure of Newton's greatness that he understood how the choice of a method and the choice of an event will come to dictate the very logic of our explanations. If all one seeks to explain is the fact of acceleration, one needs no more than the force applied to a given mass. If, instead, a larger question arises—How do we best explain the fact that the relationship is lawful at all in the first place?—then no number of repeated observations will be helpful in any way.

We must be careful in handling a razor. Occam's razor is no exception. It is sharp and, wielded carefully, it can strip away much that is irrelevant and distracting, or based on little more than prejudice and superstition. Had it been used as intended, those unfortunate women in England in the 1660s, and later in Salem, would have had a different and a kinder fate. Certain guidelines have been proposed for those who would use the razor with precision and care. But at all costs, one is to save the phenomenon, as the expression goes. What is meant is that no metaphysical presupposition, no mode of measurement or explanation should be adopted where the net effect is to lose the phenomenon of interest that got you started on the inquiry in the first instance.

But herein lies the problem, for the maxim that would urge us to save the phenomena is of very little help in establishing just which phenomena are to be identified and preserved. We are all well aware, for example, of saintly and heroic acts. We may attempt to cover these with the term "altruism," now applied indifferently across very different "phenomena." Taking "altruism" to be nothing other than some form of sharing or self-sacrifice of advantage to others, even if disadvantageous to oneself, we now have a range of phenomena all answering to the same term, altruism. With a strategy of this sort, the following account is both illustrative and inevitable.

"To investigate when chimpanzees might aid either humans or each other, researchers studied 36 chimps at Ngamba Island Chimpanzee Sanctuary in Uganda that were born in the wild. In experiments, each chimp watched a person they had never seen before unsuccessfully reach for a wooden stick that was within reach of the ape. The person had struggled over the stick beforehand, suggesting it was valued. Scientists found the chimpanzee often handed the stick over, even when the apes had to climb eight feet out of their way to get the stick, and regardless of whether or not any reward was given. A similar result with 36 human infants just 18 months old yielded comparable results."

What can one say? This is not the occasion for critical appraisal of research methods, nor am I at all skeptical about instances of non-human animals providing aid to each other. Clearly, if the phenomenon of interest is no more than handing over a stick, and if that action illustrates and even exhausts one's developed conception of altruism, then evolutionary accounts of how such behavior favors the survival of the species will become credible, if not convincing, if not total. But handing over a stick cannot be the phenomenon of interest to one who is seeking to understand the nature of moral judgment, the sense of duty guiding conduct toward others. If that is the subject of interest, then the observed overt behavior stands merely as a symptom calling for a diagnosis. We really can't say anything about it unless we know the motives and desires and judgments that, combined in such a way, render the behavior something of an imperative from the actor's point of view.

I have used the metaphor of the alphabet of man several times now, and as the very title of my remarks this evening. We can't be casual in choosing the letters that will form our vocabulary here, because we cannot be casual in assembling stories that might more fully disclose our defining nature and the possibilities that are immanent in that nature. To use Occam's razor here is too often to deny ourselves a vowel or two, perhaps a much-needed consonant. How do we gauge the size of the necessary alphabet? I answer, from the stories already told, from human history and the study of lives, of creatures like us.

One story originates with the Big Bang—which I'm fully prepared to accept. I wasn't there, it's all right, something happened. (If I may speak in a self-disclosing way, the Big Bang never really mattered much in my thinking. I don't sleep much anyway, and the Big Bang story wasn't anything that was going to keep me awake. But I'll tell you what did keep me awake: that at the moment of the Big Bang, all of the laws of physics came in with it. That was something, and nobody would have guessed that. I mean, bang, and F=ma . . . .) The Big Bang, in time, distributes cosmic material in such a way as to render life possible. We are constituted out of the stuff that makes stars and galaxies, and to that extent we seem to have the mark of the original maker in our nature. To that extent we are of nature. On the evolutionary account, a biogenetic account, very long seasons and favoring conditions would move life toward ever more complex modes of expression.

On that same account, ours is an evolved nature, but here the evolutionary part of the story comes to a screeching halt. It does not show us by what process we came to be the cultural, political and aesthetic creatures we are; the moral creatures who, in their better moments, create the conditions supportive of a perfectionist impulse. The story that we've written from life in the cave to the present time is not anticipated in any of the earliest stages or phases. All animals provide some form of shelter for themselves, but this surely is not a model of the Acropolis or the Cathedral at Chartres, neither of which was intended for shelter. Patterns of aggression are found throughout the animal kingdom, but only we are prepared to die for a principle, for a belief in something higher and more significant than our individual lives. I mention these things not to excite vanity, or to relegate the balance of creation to some secondary or unimportant status. Rather, I list just a few of the many parts of the overall story, the telling of which requires a robust and flexible alphabet.

Herodotus tells us of an encounter—probably apocryphal—between that man of legendary wealth, Croesus, and that man of legendary wisdom, Solon. Wishing to assure himself that he is as majestic as he believes, Croesus invites Solon to the palace and asks him to name the man whom Solon regards as the most fortunate. He is put off when Solon names Tellus, a man utterly unknown beyond the perimeter of his own city. Solon explains that Tellus had served valiantly in the military and was a source of great pride to his friends and family, revered in life and in death, and buried with full honors. What could be better?

Undaunted, Croesus asks who might occupy second place, only to learn of two more unknowns, Cleobis and Biton. These, we discover, are the virtuous sons of Cydippe, Priestess to the goddess Hera. Seeing that the oxen had not been yoked, and that their mother would be late in performing her temple duties, the young men yoked themselves to a cart and ran more than five miles, bringing their mother to her appointed rounds in a timely fashion.

Utterly exhausted, they found shelter under a shade tree, and they passed into a gentle sleep while, inside the temple, Cydippe supplicated Hera. Praising her sons for their many and manly virtues, she begs the goddess to allow her sons to die the most fortunate of men. Cleobis and Biton never awaken. Their young lives ended at a time when they had so fully realized the excellence of thought, of sentiment, of duty, of honor that nothing could ever again be more fulfilling.

There is something within us. It is useless to search for a name for it. If we attempt to hold it steady in consciousness, it darts away. If we count on a crowd around us to acknowledge it, by applause or earthly reward, we do run the risk of losing it. It seems to be repelled by what is merely earthly. Those of its features which we can glimpse more readily in other lives than in our own suggest at once a moral and aesthetic dimension, something of a harmony, something of a proportion and fitness. When it is sensed or felt, no matter how fleetingly, there seems to be an expansion of the very terms of life itself. Sir Thomas Browne was content to call it a Spirit. Having thus identified it, the good Doctor concluded his essay with these lines, as I conclude mine:

"Whosoever feels not the warm gale and gentle ventilation of this Spirit, though I feel his pulse, I dare not say he lives: for truly, without this, to me there is no heat under the Tropic; nor any light, though I dwelt in the body of the Sun."
