Archive for the ‘History’ Category

The Politics of Perplexity in Twenty-First Century America

Friday, July 17th, 2020

In the context of twenty-first century America, “politics” is perhaps one of the most curiously irritating words in the English language. I know from personal experience – whether from observing others, or from paying attention to myself – that there is a visceral reflex to feel something between annoyance and disgust upon hearing the word. If politics rears its ugly head, you may think something along the lines of “I’ve had enough of that, thank you!” before rapidly extricating yourself from an unwanted intrusion into an otherwise perfect day. Alternatively, I suspect many of us know people who hear the word “politics” or some related term and can immediately launch into an ambitious lecture on what is wrong and what should be done that somehow promises (implausibly) to solve all our social, political, and economic problems in one fell legislative swoop. We’re surrounded by bitter disputes – online and on television, in print and in person – over political issues, to the extent that it can be hard to stomach contemplating (much less discussing) politics without feeling a little irritated, even disgusted, with both our neighbors and ourselves.

These powerful emotional reactions should give us some pause for reflection. In theory, if not always in practice, the United States of America is a democratic republic, ruled by representative officials in the name of its citizenry. Even without considering the matter deeply, it should be clear to us that such a government cannot function if its citizens are entirely disengaged, as radical factions across the political spectrum will be left to do the politicking on our behalf. Whether we like it or not, our nation’s political life will likely remain interested in us even if we are uninterested in return. We might as well make the best of it, and get down to the business of figuring out where, exactly, we went wrong, and what might be done to repair the damage.

Since the early twentieth century, the predominant approach to teaching American students about their form of government has been what is known as political science. This perspective is primarily (though not exclusively) concerned with educating students about the practical mechanics of their government and the political dynamics of the American electorate – in short, the branches of the United States government, their differing roles and jurisdictions, group behavioral dynamics, and so forth. All of these political institutions and phenomena are generally treated as abstractions that can be measured and predicted with some degree of accuracy using scientific methodology and data analysis.

The meaning of political science must be carefully qualified and defined. Science is derived from the Latin scientia, or knowledge. The majority of ancient, medieval, and early modern political thinkers used the term political science to refer to the study of politics as a domain of the humanities. They studied politics in light of inquiries in philosophy and history: they did not, as a general rule, conceive of the art of government as something that could be understood as an institutional abstraction that operated independently of the deepest human needs and desires (such as for law and virtue), or the eternal problems that confront every human individual and society (what are justice and truth, and how do we find them?). Above all else, classical political science aimed at cultivating self-governing (moderate) individuals who would be capable of wielding political power responsibly while refraining from tyrannical injustice. Hence, in the conclusion of Plato’s Republic, Socrates teaches Glaucon that the highest end of political science is to teach the soul to bear “all evils and all goods… and practice justice with prudence in every way” (Republic, Book X, 621c).

Modern political science operates on an entirely different basis and different assumptions about human beings and political life. It begins with the premise that human beings, like all natural things, are subject to mechanical laws that render them predictable. Once these laws are understood, the political life of human beings can be mastered and directed towards progress (understood as material comforts and technological innovation) to a degree that was never remotely possible in prior eras of human history. This view of political science emerged first among certain thinkers of the Enlightenment, and became a close companion to the development of the entire field of social science in the late nineteenth century. Both modern political and social science emerged from a common intellectual project that aimed to apply modern scientific methods and insights to the study of very nearly every aspect of human communal life – economics, social dynamics (sociology), religion, sexuality, psychology, and politics, among others.

This application of human technical knowledge to endemic social problems, economic systems, and political institutions (among other domains of human life) was expected to deliver unprecedented advances that would mirror and eventually surpass the tremendous technological and intellectual achievements of the Scientific Revolution. Max Weber, one of the most brilliant social scientists of the early twentieth century, fully expected that the complementary discoveries of both natural and social science would ensure that human “progress goes on ad infinitum.” For many intellectuals in Europe and the United States in Weber’s day, human social and political life had become like a machine that could be kept in a perpetual state of inexorable forward motion. This view remains a powerful one within certain spheres of the social sciences and the general public, and has been articulated perhaps most eloquently in the public sphere by the Harvard psychologist Steven Pinker, among others, even if it is gradually declining in popularity among the greater mass of the American citizenry.

Academically, this modern scientific approach to understanding American government has had many apparent advantages that explain both its widespread acceptance and its continued influence within the academy. For one, it enables teachers to explain the structure of U.S. government in terms of the technical mechanics of government, which most students can master intuitively, regardless of their particular political views and prejudices. Similarly, it relieves teachers and students of having to focus on tiresome historical minutiae or obscure philosophical debates that bear no obvious relevance to contemporary issues: students can study their government based on recent experiences that are more easily comprehensible for them than those of, say, two hundred years ago. Above all else, contemporary political science treats the study of American government in utilitarian and mechanistic terms, thereby minimizing occasions for awkwardly passionate or unsolvable confrontations over thorny issues that touch on moral as well as historical and philosophical complexities. What many students will learn from this education is that the American form of government is perfectly reasonable, orderly, and balanced, with predictable mechanics that ensure its stability and perpetuity; in short, it makes sense. And not only does the American government operate like a well-oiled machine, but it also leaves individuals tremendous room to define themselves and act within an ever-expanding horizon of freedoms. Government exists mainly to resolve practical matters of policy and administration, leaving moral questions largely to the domain of the private sphere.

Many may rightly ask: if this model is true, then why does the American government function so poorly in practice? And why are Americans so remarkably inept at finding common ground for resolving pressing political issues? Indeed, there are alarming trends that should inspire us to doubt the viability of this interpretation. Polling conducted over the past decade consistently shows that Americans of all political persuasions are increasingly distrustful of both their governments and of their fellow citizens who hold opposing views. Rigid ideological voices have emerged among both liberal and conservative parties that insist that dialogue is impossible and compromise on any issue is a sign of political weakness, and that a candidate’s quality should be determined by ideological considerations rather than by competence and experience. As electoral politics have devolved into brutal slugging matches between increasingly extreme views, the actual levers of political power have gradually shifted into the hands of a theoretically subordinate but frequently unaccountable and inefficient bureaucracy.

The fruit of this widespread culture of distrust has been the breakdown of civic life and political order amidst frustration and mutual recrimination throughout American society. Many are understandably frustrated with a system of government that seems unable or unwilling to fulfill its most basic functions. For that matter, generations of young Americans have now grown up in the shadow of a dysfunctional government that leaves them with little incentive for acting as responsible and engaged citizens. It should be no wonder that there are now voices asking questions such as the following: if our current Constitution is a product of eighteenth-century political circumstances and ideals, should we not perhaps craft a new political system that is better adapted to our contemporary needs and values?

Perhaps these are all passing fads, and some bearable equilibrium will return in short order. I am doubtful that such an event is likely in the near future. Recent events have shown that contemporary Americans of all political stripes are divided not merely by petty partisan differences over policy decisions and electoral contests, but more importantly by fierce disagreements over fundamental questions about the nature of political life and American civic identity, and we are not remotely close to resolving these disputes. What is it to be human? What is freedom? What is justice? We have no common answers to any of these fundamental questions, nor do we seem (at least, as of this writing) to have a clear direction for amicably resolving them in the public sphere.

Yet these disputes, however unpleasant and acrimonious, provide us with a hint of where, exactly, we may have gone wrong. Far from liberating us from antiquated concerns, our modern political education (and the novel mode of thought that created it) may lie at the heart of our perplexity. Modern political science has worked tremendous wonders in allowing us to track the chimerical shifting of public whims in opinion polls or understand the psychology of group dynamics, but it has also clouded our ability to grapple with and comprehend problems that are part of the permanent condition of our species. Political institutions and policy alone cannot solve America’s most vexing problems. And we should remember that representative government depends ultimately on the qualities of both officeholders and voters to function properly; institutions abstracted from the body politic cannot rule themselves. Our Constitution, as John Adams observed in 1798, “was made only for a moral and religious people. It is wholly inadequate to the government of any other.” Adams thought that republican government could not exist without some degree of self-government among the citizenry, or else it must devolve into a mass of petty tyrants; we are, perhaps, in the process of proving his point for him.

I suspect that the root of modern American political dissatisfaction lies not so much in our continued subjection to an apparently antiquated form of government, nor merely in our frustration with the peculiar idiocies of our political parties, but rather in our own failure to accurately comprehend and utilize our form of government. In an era of change and tumult, we would do well, as the American novelist and essayist John Dos Passos put it in 1941, to “look backwards as well as forwards” as we attempt to extricate ourselves from our current political predicament. While we may face some distinctly twenty-first-century problems, our most pressing ones – justice, love, truth, goodness, and so forth – are as old as the human species. We live in troubled times: but so, too, did prior generations of Americans. I hope that, if we can find it in ourselves to turn back and reconsider the first principles of American government and its deep roots in English political life and philosophy, we may yet find a lifeline out of our current perplexity, and a firm foundation for a life of dutiful, informed, and responsible citizenship that can be passed on to future generations.

Unprecedented?

Saturday, July 11th, 2020

I have to date remained silent here about the COVID-19 pandemic, because for the most part I haven’t had anything constructive to add to the discussion, and because I thought that our parents and students would probably prefer to read about something else. I also try, when possible, to discuss things that will still be of interest three or even ten years from now, and to focus largely on issues of education as we practice it. 

Still, COVID-19 has obviously become a consuming focus for many—understandably, given the extent of the problem—and what should be managed in the most intelligent way possible according to principles of epidemiology and sane public policy has become a political football that people are using as further grounds to revile each other. I’m not interested in joining that game. Knaves and cynical opportunists will have their day, and there’s probably not much to do that will stop them—at least nothing that works any better than just ignoring them.

But there is one piece of the public discourse on the subject that has shown up more and more frequently, and here it actually does wander into a domain where I have something to add. The adjective that has surfaced most commonly in public discussions about the COVID-19 epidemic with all its social and political consequences is “unprecedented”. The disease, we are told by some, is unprecedented in its scope; others lament that it’s having unprecedented consequences both medically and economically. The public response, according to others, is similarly unprecedented: for some that’s an argument that it is also unwarranted; for others, that’s merely a sign that it’s appropriately commensurate with the scope of the unprecedented problem; for still others, it’s a sign that it’s staggeringly inadequate.

As an historian I’m somewhat used to the reckless way in which the past is routinely ignored or (worse) subverted, according to the inclination of the speaker, in the service of this agenda or that. I’ve lost track of the number of people who have told me why Rome fell as a way of making a contemporary political point. But at some point one needs to raise an objection: seriously—unprecedented? As Inigo Montoya says in The Princess Bride, “You keep using that word. I do not think it means what you think it means.” To say that anything is unprecedented requires it to be contextualized in history—not just the last few years’ worth, either.

In some sense, of course, every happening in history, no matter how trivial, is unprecedented—at least if history is not strictly cyclical, as the Stoics believed it was. I’m not a Stoic on that issue or many others. So, no: this exact thing has indeed never happened before. But on that calculation, if I swat a mosquito, that’s unprecedented, too, because I’ve never swatted that particular mosquito before. This falls into Douglas Adams’ useful category of “True, but unhelpful.” Usually people use the word to denote something of larger scope, and they mean that whatever they are talking about is fundamentally different in kind or magnitude from anything that has happened before. But how different is COVID-19, really?

The COVID-19 pandemic is not unprecedented in its etiology. Viruses happen. We even know more or less how they happen. One does not have to posit a diabolical lab full of evil gene-splicers to account for it. Coronaviruses are not new, and many others have apparently come and gone throughout human history, before we even had the capacity to detect them or name them. Some of them have been fairly innocuous, some not. Every time a new one pops up, it’s a roll of the dice—but it’s not our hand that’s rolling them. Sure: investing in some kind of conspiracy theory to explain it is (in its odd way) comforting and exciting. It’s comforting because it suggests that we have a lot more control over things than we really do. It’s exciting, because it gives us a villain we can blame. Blame is a top-dollar commodity in today’s political climate, and it drives more and more of the decisions being made at the highest levels. Ascertaining the validity of the blame comes in a distant second to feeling a jolt of righteous indignation. The reality is both less exciting and somewhat bleaker: we don’t have nearly as much control as we’d like to believe. These things happen and will continue to happen without our agency or design. Viruses are fragments of genetic material that have apparently broken away from larger organic systems, and from there they are capable of almost infinite, if whimsical, mutation. They’re loose cannons: that’s their nature. That’s all. Dangerous, indisputably. Malicious? Not really.

The COVID-19 pandemic is not unprecedented in its scope or its lethality. Epidemics and plagues have killed vast numbers of people over wide areas throughout history. A few years ago, National Geographic offered a portrait of the world’s most prolific killer. It was not a mass murderer, or even a tyrant. It was the flea, and the microbial load it carried. From 1348 through about 1352, the Black Death visited Europe with a ferocity that probably was unprecedented at the time. Because records from the period are sketchy, it’s hard to come up with an exact count, but the best estimates are that it killed approximately a third of the population of Europe within that three-to-four-year period. The disease continued to revisit Europe approximately every twenty years for some centuries to come, especially killing people of childbearing age each time, with demographic consequences that vastly exceed what a sheer count of losses might suggest. In some areas whole cities were wiped out, and estimates of the total death toll across Europe and Asia run as high as two hundred million, though the full extent of its destruction throughout parts of Asia has never been ascertained. Smallpox, in the last century of its activity (1877-1977), killed approximately half a billion people. The 1918 Spanish influenza epidemic killed possibly as many as a hundred million. Wikipedia lists over a hundred similar catastrophes caused by infectious diseases of one sort or another, each of which had a death toll of more than a thousand; it lists a number of others where the count cannot even be approximately ascertained.

Nor is the COVID-19 pandemic unprecedented in its level of social upheaval. The Black Death radically changed the social, cultural, economic, and even the religious configuration of Europe almost beyond recognition. After Columbus, Native American tribes were exposed to Old World disease agents to which they had no immunities. Many groups were reduced to less than a tenth of their former numbers. Considering these to be instances of genocide is, I think, to ascribe far more intentionality to the situation than it deserves (though there seem to have been some instances where it was intended), but the outcome was indifferent to the intent. The Spanish Influenza of 1918, coming as it did on the heels of World War I, sent a world culture that was already off balance into a deeper spiral. It required steep curbs on social activity to check its spread. Houses of worship were closed then too. Other public gatherings were forbidden. Theaters were closed. Even that was not really unprecedented, though: theaters had been closed in Elizabethan London during several of the recurrent visitations of the bubonic plague. The plot of Romeo and Juliet is colored by a quarantine. Boccaccio’s Decameron is a collection of tales that a group of people told to amuse themselves while in isolation, and Chaucer’s somewhat derivative Canterbury Tales are about a group of pilgrims heading for the shrine of St. Thomas à Becket to thank him for having given them aid while they were laboring under a plague. People have long known that extraordinary steps need to be taken, at least temporarily, in order to save lives during periods of contagion. It’s inconvenient, it’s costly, and it’s annoying. It’s not a hoax, and it’s not tyrannical. It’s not novel.

So no, in most ways, neither the appearance of COVID-19 nor our responses to it are really unprecedented. I say this in no way to minimize the suffering of those afflicted with the disease, or those suffering from the restrictions put in place to curb its spread. Nor do I mean to trivialize the efforts of those battling its social, medical, or economic consequences: some of them are positively heroic. But claiming that this is all unprecedented looks like an attempt to exempt ourselves from the actual flow of history, and to excuse ourselves from the very reasonable need to consult the history of such events in order to learn what we can from them—for there are, in fact, things to be learned.

It is perhaps unsurprising that people responded to the plagues and calamities of the past, then as now, primarily out of fear. Fear is one of the most powerful of human motivators, but it is seldom a wise counselor. There have been conspiracy theories before too: during the Black Death, for example, some concluded that the disease was due to witchcraft, and so they set out to kill cats, on the ground that they were witches’ familiars. The result, of course, was that rats—the actual vectors for the disease, together with their fleas—were able to breed and spread disease all the more freely. Others sold miracle cures to credulous (and fearful) populations; these of course accomplished nothing but heightening the level of fear and desperation.

There were also people who were brave and self-sacrificing, who cared for others in these trying times. In 1665, the village of Eyam in Derbyshire quarantined itself when the plague arrived. The villagers knew what they could expect, and they were not mistaken: a great part of the village perished, but their decision saved thousands of lives in neighboring villages. Fr. Damien De Veuster ministered to the lepers on Molokai before succumbing to the disease himself: he remains an icon of charity and noble devotion and is a patron saint of Hawaii.

The human race has confronted crisis situations involving infectious diseases, and the decisions they require, before. They are not easy, and sometimes they call for self-sacrifice. There is sober consolation to be wrung from the fact that we are still here, and that we still, as part of our God-given nature, have the capacity to make such decisions—both the ones that protect us and those sacrificial decisions we make to save others. We will not get through the ordeal without loss and cost, but humanity has gotten through before, and it will again. We are not entirely without resources, but neither are we wholly in control. We need to learn from what we have at our disposal, marshal our resources wisely and well, and trust in God for the rest.

Common Ground: A Lenten Meditation

Thursday, February 25th, 2016

My heart is broken these days as I read the political protestations by the candidates for president. The conversation seems to have descended to the level of a cockfight, with each side crowing over scoring a hit, rather than rising to a thoughtful discussion of the need to supply basic health care services and pay for them responsibly, the need to provide national security without destroying personal security, the need to help those who will responsibly use that help without wasting resources on those who willfully squander them — discussions that, if the proponents weren’t so concerned with winning, might actually provoke the creativity necessary to craft viable solutions. Conversations over religious issues often seem more about scoring points by citing the most proof texts than about seeking guidance from the Holy Spirit to discern how we Christians may help each other fulfill our baptismal vows to love our neighbors — all our neighbors — without violating our consciences. Even the debate over evolution and creation is often reduced to quips and quotes of various authors that promote neither good science nor good theology, but that do sell books.

What we have forgotten, and what we desperately need to remember, is how to become reconciled, one to the other. It’s a fitting topic for Lent, when Christians reflect on the great price God paid so that we might be reconciled to Him.

Americans seem particularly bad at reconciliation. We are great at competition, but we don’t do forgiveness well. We aspire to reconciliation from time to time — we have the incredible image of the whole of Congress standing together on the steps of the Capitol after 9/11, singing “God Bless America”. Unfortunately that dream of common cause faded all too quickly with squabbles over the nature of the threats and best ways to meet them, and there was no underlying sense of real unity to carry us through the practical realities of the ensuing politics and economics. The sense of disunity has grown over the last decade to real divisions that our leaders — political, religious, and academic — seek to exploit to their own advantage, rather than amend for everyone’s advantage.

I don’t know what the answer is. I can only offer, by way of meditation, three images that continue to haunt me: an archway at Magdalen College in Oxford, a statue and a plaque in a side chapel at the Cathedral in Rouen, and a pair of Westminster Abbey tombs.

We went to England in the summer of 1986. We were grad students with little money, but we had come by a slight windfall, we had some vacation time coming, and we had friends in England we could stay with at least part of our trip. So we packed up our 9-year-old, 3-year-old, and 15-month-old, and went to find the England of J. R. R. Tolkien and C.S. Lewis and Dorothy Sayers, Arthur Ransome, Dick Whittington, Henry II, T.H. White, Isaac Newton, and Edmund Halley.

Lewis’s England lies largely in Oxford, where he taught at Magdalen, so we boarded the train to Oxford and the hallowed groves of academe, where teachers have taught and students have studied and dreamed for a thousand years. Wandering through those grounds that were open to the public, we came upon names carved into the wall of an archway between a great square and a cloistered walk. It was a memorial to the members of the College who had fallen in the Great War of 1914 to 1918.

You find these memorials in every village in England, sometimes on the walls of a municipal building, sometimes on columns in the center of the village square, sometimes on plaques flat in the ground of the churchyard, among the gravestones of those who made it home. The lists are long; casualties were high, and some villages lost half of their male population in the trenches. Standing in the shadowed archway at Magdalen, we paused to read through the names, and realized with awe that they were divided into two groups: those who died in the service of George V of England, and those who had died in the service of Wilhelm, Emperor of Germany.

For you see, Oxford colleges have this odd notion that once you become a member of college, you remain a member of college. You can return and read books in the library, be seated at the college dining table, wear the college robes, attend the college colloquia, and when you die, be memorialized on the college walls — even when you die in the service of a political enemy. Governments rise and fall with the actions of charismatic leaders; industry bends to pragmatic ends; fashions will alter, economies will render the rich poor and the poor rich, but the underlying purpose of the academy is to seek the truth, and at Magdalen, that shared journey creates a community that cannot be easily severed, even by war.

Fast-forward twenty-three years to a different trip, and a different country.

There was a decade in the nineteenth century when the Cathedral at Rouen was the tallest building in the world, its steeple rising nearly five hundred feet above the placid waters of the Seine, which meanders through the fields and orchards of Normandy on its way from Paris to the English Channel. Bombed and broken on D-Day, the cathedral nave has been repaired and remains breathtakingly impressive. Beneath the lacy stone and stained glass lie the tomb of Rollo the Northman and a shrine containing the heart of Richard the Lionheart, at once king of England and Duke of Normandy. And, not surprisingly, since just about every church in Normandy has some memorial to Jeanne d’Arc, there is a chapel in the north transept with a modern statue commemorating her martyrdom. She is in chains, her expression calmly resigned, while stone flames lick at her stone gown.

Rouen is the end of Jeanne’s story: here she was held in a fat tower that still stands near the train station, tried by an illegal court, and burned to death in the town square as a witch at the hands of the English, who had determined that her uncanny ability to beat their well-trained armies with smaller forces must lie in a pact with the devil. One might well think the Normans should have little love for the English: Jeanne’s martyrdom marked a turning point in a century of war between France and England that left northern France devastated and economically impoverished for yet another century after hostilities ceased.

So it is with a bit of shock that the Cathedral visitor reads the plaque under the double gothic arches just behind the chapel altar: in French and English, it proclaims “To the Glory of God and to the memory of the one million dead of the British Empire who fell in the Great War 1914-1918, and of whom the greater part rest in France.” In the chapel, in the crimson and sapphire and emerald light from the restored windows above, you suddenly realize that you are in a holy space, one that can abide the fundamental tension of human relationships. The English killed Jeanne, savior of France; the English died in defense of France, and became themselves saviors of France.  The English are — however long ago driven back and exiled to their island, and however badly some of them have behaved — still part of Normandy, and they will still come, if need be, to defend it.

Across the channel, in the center of London, on the banks of the Thames, lies Westminster Abbey. Here the English have buried or memorialized their poets and painters, dreamers and scientists, princes and kings: Geoffrey Chaucer and Lewis Carroll, Isaac Newton and Robert Boyle, Charles and James and Anne Stuart.

In the north aisle of the Lady Chapel are two tombs, one stacked above the other. In the lower one lies Mary Tudor, most Catholic queen of England, who held her sister Elizabeth in house arrest to prevent a civil war, whose courts ordered Hugh Latimer and Nicholas Ridley burned at the stake on Broad Street in Oxford for their defense of a Protestant faith, and who sought all her life to serve God by bringing Him a Catholic kingdom in communion with Rome. In the coffin above her lies that same Elizabeth, most Protestant queen of England, who in her turn held her cousin Mary, Queen of Scots, under house arrest to prevent a civil war, who ordered the executions of Mary and Robert Devereux for political plots, and who sought all her life to serve God by avoiding the religious wars of the Continent and creating a peaceable England. When we visited the chapel in 1986, the plaque on the floor read: “Those whom the Reformation divided, the Resurrection will reunite, who died for Christ and conscience’ sake”. Standing among the royal tombs beneath the stone arches of the chapel, confronted by two sisters at deadly odds with one another, one realizes with some shock that, four hundred years later, what remains is the conviction that reconciliation is not merely possible: it is inevitable where there is a fundamental common goal to serve God.

The issues that divide us as Americans and as Christians are real; they are complex, and they challenge us to be the best we can be to address them. As individuals, we have limited resources of time and money and talent and intellect and emotional stamina. We cannot resolve complex problems by isolating ourselves from each other.

One of our visions in founding Scholars Online was to create a community where Christians of different backgrounds could study together, and we welcome anyone, regardless of their religious background, who shares our conviction that education requires not merely mastery of subject matter and the development of close reading and critical thinking skills, but also the formation of character that seeks to deal charitably and honestly with others. We do not require a statement of faith from our students. Our faculty comes from diverse traditions, and we all view our call to teach as a ministry. We — students, teachers, parents — do not always agree on issues, but we hold ourselves together in community, not only to serve the cause of classical Christian education, but also to serve as a model of community built out of diversity.

We start by standing together on the common ground of God’s eternal love for each of us and all of us.

[Part of this meditation appeared in a 2009 entry on the All Saints Episcopal Church (Bellevue) blog.]

STEMs and Roots

Tuesday, February 2nd, 2016

Everywhere we see extravagant public handwringing about education. Something is not working. The economy seems to be the symptom that garners the most attention, and there are people across the political spectrum who want to fix it directly; but most seem to agree that education is at least an important piece of the solution. We must produce competitive workers for the twenty-first century, proclaim the banners and headlines; if we do not, the United States will become a third-world nation. We need to get education on the fast track — education that is edgy, aggressive, and technologically savvy. Whatever else it is, it must be up to date, it must be fast, and it must be modern. It must not be what we have been doing.

I’m a Latin teacher. If I were a standup comedian, that would be considered a punch line. In addition to Latin, I teach literature — much of it hundreds of years old. I ask students, improbably, to see it for what it itself is, not just for what they can use it for themselves. What’s the point of that? one might ask. Things need to be made relevant to them, not the other way around, don’t they?

Being a Latin teacher, however (among other things), I have gone for a number of years now to the Summer Institute of the American Classical League, made up largely of Latin teachers across the country. One might expect them to be stubbornly resistant to these concerns — or perhaps blandly oblivious. That’s far from the case. Every year, in between the discussions of Latin and Greek literature and history, there are far more devoted to pedagogy: how to make Latin relevant to the needs of the twenty-first century, how to advance the goals of STEM education using classical languages, and how to utilize the available technology in the latest and greatest ways. What that technology does or does not do is of some interest, but the most important thing for many there is that it be new and catchy and up to date. Only that way can we hope to engage our ever-so-modern students.

The accrediting body that reviewed our curricular offerings at Scholars Online supplies a torrent of exhortation about preparing our students for twenty-first century jobs by providing them with the latest skills. It’s obvious enough that the ones they have now aren’t doing the trick, since so many people are out of work, and so many of those who are employed seem to be in dead-end positions. The way out of our social and cultural morass lies, we are told, in a focus on the STEM subjects: Science, Technology, Engineering, and Math. Providing students with job skills is the main business of education. They need to be made employable. They need to be able to become wealthy, because that’s how our society understands, recognizes, and rewards worth. We pay lip service, but little else, to other standards of value.

The Sarah D. Barder Fellowship organization to which I also belong is a branch of the Johns Hopkins University Center for Talented Youth. It’s devoted to gifted and highly gifted education. At their annual conference they continue to push for skills, chiefly in the scientific and technical areas, to make our students competitive in the emergent job market. The highly gifted ought to be highly employable and hence earn high incomes. That’s what it means, isn’t it?

The politicians of both parties have contrived to disagree about almost everything, but they seem to agree about this. In January of 2014, President Barack Obama commented, “…I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree. Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”

From the other side of the aisle, Florida Governor Rick Scott said, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”

They’re both, of course, right. The problem isn’t that they have come up with the wrong answer. It isn’t even that they’re asking the wrong question. It’s that they’re asking only one of several relevant questions. They have drawn entirely correct conclusions from their premises. A well-trained plumber with a twelfth-grade education (or less) can make more money than I ever will as a Ph.D. That has been obvious for some time now. If I needed any reminding, the last time we required a plumber’s service, the point was amply reinforced: the two of them walked away in a day with about what I make in a month. It’s true, too, that a supply of anthropologists is not, on the face of things, serving the “compelling interests” of the state of Florida (or any other state, probably). In all fairness, President Obama said that he wasn’t talking about the value of art history as such, but merely its value in the job market. All the same, that he was dealing with the job market as the chief index of an education’s value is symptomatic of our culture’s expectations about education and its understanding of what it’s for.

The politicians haven’t created the problem; but they have bought, and are now helping to articulate further, the prevalent assessment of what ends are worth pursuing, and, by sheer repetition and emphasis, crowding the others out. I’m not at all against STEM subjects, nor am I against technologically competent workers. I use and enjoy technology. I am not intimidated by it. I teach online. I’ve been using the Internet for twenty-odd years. I buy a fantastic range of products online. I programmed the chat software I use to teach Latin and Greek, using PHP, JavaScript, and MySQL. I’m a registered Apple Developer. I think every literate person should know not only some Latin and Greek, but also some algebra and geometry. I even think, when going through Thucydides’ description of how the Plataeans determined the height of the wall the besiegers had built around their city, “This would be so much easier if they just applied a little trigonometry.” Everyone should know how to program a computer. Those are all good things, and help us understand the world we’re living in, whether we use them for work or not.
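For the curious, here is a minimal sketch of the trigonometric shortcut I have in mind, assuming (purely for illustration; the numbers are invented) that one could pace off the distance to the foot of the wall and sight the angle of elevation to its top:

\[
h = d \tan\theta, \qquad \text{e.g. } d = 100~\text{ft},\ \theta = 30^{\circ} \;\Longrightarrow\; h = 100 \tan 30^{\circ} \approx 57.7~\text{ft}.
\]

The Plataeans, having no tables of tangents, instead counted the courses of bricks in an unplastered stretch of the wall and multiplied by the thickness of a brick to size their scaling ladders; the geometry is the same, only the measuring is more laborious.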

But they are not all that we need to know. So before you quietly determine that what I’m offering is just irrelevant, allow me to bring some news from the past. If that sounds contradictory, bear in mind that it’s really the only kind of news there is. All we know about anything at all, we know from the past, whether recent or distant. Everything in the paper or on the radio news is already in the past. Every idea we have has been formulated based on already-accumulated evidence and already-completed ratiocination. We may think we are looking at the future, but we aren’t: we’re at most observing the trends of the recent past and hypothesizing about what the future will be like. What I have to say is news, not because it’s about late-breaking happenings, but because it seems not to be widely known. The unsettling truth is that if we understood the past better and more deeply, we might be less sanguine about trusting the apparent trends of a year or even a decade as predictors of the future. They do not define our course into the infinite future, or even necessarily the short term — be they about job creation, technical developments, or weather patterns. We are no more able to envision the global culture and economy of 2050 than the independent bookseller in 1980 could have predicted that a company named Amazon would put him out of business by 2015.

So here’s my news: if the United States becomes a third-world nation (a distinct possibility), it will not be because of a failure in our technology, or even in our technological education. It will be because, in our headlong pursuit of what glitters, we have forgotten how to differentiate value from price: we have forgotten how to be a free people. Citizenship — not merely in terms of law and government, but the whole spectrum of activities involved in evaluating and making decisions about what kind of people to be, collectively and individually — is not a STEM subject. Our ability to articulate and grasp values, and to make reasoned and well-informed decisions at the polls, in the workplace, and in our families, cannot be transmitted by a simple, repeatable process. Nor can achievement in citizenship be assessed simply, or, in the short term, accurately at all. The successes and failures of the polity as a whole, and of the citizens individually, will remain for the next generation to identify and evaluate — if we have left them tools equal to the task. Our human achievement cannot be measured by lines of code, by units of product off the assembly line, or by GNP. Our competence in the business of being human cannot be certified like competence in Java or Oracle (or, for that matter, plumbing). Even a success does not necessarily hold out much prospect of employment or material advantage, because that was never what it was about in the first place. It offers only the elusive hope that we will have spent our stock of days with meaning — measured not by our net worth when we die, but by what we have contributed when we’re alive. The questions we encounter in this arena are not new ones, but rather old ones. If we lose sight of them, however, we will have left every child behind, for technocracy can offer nothing to redirect our attention to what matters.

Is learning this material of compelling interest to the state? That depends on what you think the state is. The state as a bureaucratic organism is capable of getting along just fine with drones that don’t ask any inconvenient questions. We’re already well on the way to achieving that kind of state. Noam Chomsky, ever a firebrand and not a man with whom I invariably agree, trenchantly pointed out, “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” He’s right. If we are to become unfree people, it will be because we gave our freedom away in exchange for material security or some other ephemeral reward — an illusion of safety and welfare, and those same jobs that President Obama and Governor Scott have tacitly accepted as the chief — or perhaps the only — real objects of our educational system. Whatever lies outside that narrow band of approved material is an object of ridicule.

If the state is the people who make it up, the question is subtly but massively different. Real education may not be in the compelling interest of the state qua state, but it is in the compelling interest of the people. It’s the unique and unfathomably complex amalgam that each person forges out of personal reflection, of coming to understand one’s place in the family, in the nation, and in the world. It is not primarily practical, and we should eschew it altogether, if our highest goal were merely to get along materially. The only reason to value it is the belief that there is some meaning to life beyond one’s bank balance and material comfort. I cannot prove that there is, and the vocabulary of the market has done its best to be rid of the idea. But I will cling to it while I live, because I think it’s what makes that life worthwhile.

Technical skills — job skills of any sort — are means, among others, to the well-lived life. They are even useful means in their place, and everyone should become as competent as possible. But as they are means, they are definitionally not ends in themselves. They can be mistakenly viewed as ends in themselves, and sold to the credulous as such, but the traffic is fraudulent, and it corrupts the good that is being conveyed. Wherever that sale is going on, it’s because the real ends are being quietly bought up by those with the power to keep them out of our view in their own interest.

Approximately 1900 years ago, Tacitus wrote of a sea change in another civilization that had happened not by cataclysm but through inattention to what really mattered. Describing the state of Rome at the end of the reign of Augustus, he wrote: “At home all was calm. The officials carried the old names; the younger men had been born after the victory of Actium; most even of the elder generation, during the civil wars; few indeed were left who had seen the Republic. It was thus an altered world, and of the old, unspoilt Roman character not a trace lingered.” It takes but a single generation to forget the work of ages.

But perhaps that’s an old story, and terribly out of date. I teach Latin, Greek, literature, and history, after all.

News — Spring 2015

Sunday, May 17th, 2015

National French Teachers Examination

Congratulations to Mrs. Mary Catherine Lavissière’s students Katie Cruse, Alana Ross, Micah Wittenberg, and Moriah Wittenberg! These four Scholars Online students placed with honors in the National French Test Le Grand Concours 2015. The test is offered annually by the American Association of Teachers of French to identify and recognize students achieving high proficiency in the French language.

Madame Lavissière offers courses in both French and Spanish through Scholars Online. See our Modern Languages course descriptions for more information.

Update on Summer Session Courses for 2015

We’ve added several new courses for the summer session, which runs from June 8 to August 21, 2015 (individual courses may span different periods within the session, so check your course description for exact start dates). Most summer classes are chances for students to build new skills in fun but still useful ways. Click on the course name to see descriptions of class schedules and costs, and on syllabus links to see detailed course content and assignments. Enrollment must be completed by May 31 to ensure placement in the course, and payment in full is due before students can attend chat sessions. Enrollments received after May 31 may not be processed in time for students to attend the first sessions of their course.

  • Explore the many facets of J.R.R. Tolkien’s creation in Looking at Middle-earth. Discussions will focus on Tolkien’s world-building, his use of language, his theology of “subcreation”, and his work as a philologist. Students are expected to have read The Hobbit and The Lord of the Rings.
  • Sample Shakespeare’s comedy, tragedy, and history plays, including Twelfth Night, As You Like It, The Taming of the Shrew, The Merchant of Venice, A Midsummer Night’s Dream, King Lear, Julius Caesar, Romeo and Juliet, and Richard II in Summer Shakespeare I. Students taking Scholars Online’s literature series, supplemented with Summer Shakespeare II and III, have the opportunity to study and discuss all of Shakespeare’s plays. [See the Full Syllabus for details.]
  • Gain practical writing skills with Molding Your Prose (based on an idea suggested to Dr. Bruce McMenomy by Mary McDermott Shideler). Learn to organize your ideas and improve your dialectic skills in Molding Your Argument. Both of these popular courses require short weekly writing exercises, with students analyzing each other’s work to learn to identify and improve their own writing.
  • Jump-start your academic year Physics course with Introduction to Physics, an overview of key theories and concepts of classical mechanics and modern physics that builds essential analysis and problem-solving skills. Students planning to take the combined AP Physics 1 and 2 course will be able to count lab work from this course toward their AP lab requirements. [Full syllabus]
  • NEW COURSE! In The Age of Reagan, discover how the events and decisions of the Reagan administration have shaped current political, religious, economic, and environmental policies. Students opting for the media studies component of this course will also examine how movies, TV, and ads portray cultural messages (parental guide available in the full syllabus).

Adventures in Team Teaching

Monday, July 16th, 2012

About this time last year, Dr. McMenomy approached me with an idea that was half-proposal and half-plea. The World History course for that year had both students enrolled and a textbook picked out and purchased, but did not have a teacher. As I was the only other dedicated history teacher on the Scholars Online staff, Dr. McMenomy asked me to take over. I hesitated; while I enjoy teaching anything historical, I was less delighted with the notion of taking over a class with its syllabus essentially dictated by a book I hadn’t chosen, especially a book which Dr. McMenomy acknowledged was far from ideal. Moreover, I knew that I’d need to do some preparation in August and September, which happened to be when I was a) moving and b) organizing a small conference. With this in mind, I said I had to decline. But the good doctor was politely insistent, pointing out that he was deeply opposed to canceling a class with students enrolled. He offered to lend a hand in getting the class set up: we could team-teach until I’d found my feet, he said, and then he’d let me take over. Still with some trepidation, I accepted this proposal.

Our first task was to write up summaries and quizzes for each chapter. We soon settled into the pattern of swapping off, each one of us writing up the notes and the quiz, and then leading the discussion. The other teacher would chime in with some additional comments.

This led to some interesting moments right off. Dr. McMenomy and I do not agree on everything; in fact on some subjects we stand at opposite poles. In the teaching of history our differences are not quite so pronounced, but there’s a real difference of emphasis. The doctor knows more than I about the intellectual history of the world, being by trade and inclination a classics instructor; he studies ideas, their transmission, and their influence. In contrast my attention is usually on everyday people in history—not the rulers or the scholars, but the ordinary folks who worked, fought, and struggled; the ones doing the digging and the dying, as it were. Dr. McMenomy is a master of learning and knowing what has been handed down to us, whereas I am trying to find out who and what has been overlooked; he therefore has somewhat higher regard for established authority, while I am usually cheering for the underdog.

With such differences in style—not total opposition, but different enough—the result could have been tension and conflict. Instead, the result was a creative tension. Dr. McMenomy and I have known each other for almost three decades now, in fact since I was a young child, and as a result we know each other’s standpoints and respect them. Thus any difference of opinion that might have triggered a dispute was kept in check by our long friendship. This did not keep us from discussing and even debating, but we did so with high regard for each other even as we contested each other’s points.

At first I was concerned about showing this during class. But Dr. McMenomy pointed out that it would be good for our students to see that history is not a settled issue. The truth of history is not relative—something did happen, after all—but knowing the whole of that truth is nigh on impossible, and thus history is a realm of theory and evidence. My historical theories have support and also have a few holes; Dr. McMenomy’s ideas are, being human, similarly incomplete. The Grand Unified Theory of History, he says, is that no Grand Unified Theory of History is possible. When we discussed in front of the students, we thus made it clear that history is subject to continual questioning and debate. We also showed them that it’s up to them to make up their own minds. We refused to hand down definitive answers, because any such answers would keep the students from coming to their own conclusions… and besides, any such definitive answers would probably be flawed.

The doctor and I did come to many points of agreement; it wasn’t a continual debate. We did not always agree on the sources of power, but we both agreed that power was at the heart of history, and frequently steered our discussion in that direction. We also came to agree that geography is destiny, though naturally with a few limits. We were also firmly united in our growing disdain for the textbook we were using.

The weeks went by, the classes and the discussions continued, and it dawned on us that, rather than a chore forced on us, the class had become downright fun. Dr. McMenomy never stepped back and handed the class over to me; neither of us wanted him to. The collaboration was too delightful. We each built half of the exams, reviewed each other’s work, and then sent it on to the students; for grading, we would each mark the work separately, compare notes, and then settle on a compromise where needed. As far as could be managed we kept things balanced, splitting the chapters between us and writing up extensive commentaries on each one, with discussion questions at the end to guide the class.

We noted that the commentaries were growing more and more lengthy. This was necessary; the book was continually failing to provide adequate coverage and synthesis. Sometimes it failed to provide even basic coherence, and it was riddled with errors great and small. Looking for a replacement, we discovered to our dismay that it was the best available at that grade level. There were better books, but only for college students.

At the end of the year Dr. McMenomy began overhauling our class website, which was beginning to stagger under the amount of material we’d loaded onto it, and realized that over the course of the year, we’d written over forty-five thousand words. At which point he made a new proposal to me: “Would you like to write a textbook? We’re already well on our way.”

The idea caught our imaginations. We would continue the collaborative approach, we decided: each one of us would write certain chapters, then review the other’s work. Moreover we agreed to keep the useful conversation going within the text itself: we would respond to each other’s chapters, assessing and evaluating the other’s ideas, in a note at the end of each section. Thus students, reading through the text, would learn from the book itself that there are no easy answers, that when it comes to history you can’t necessarily just look up the answer, and that you should not automatically assume that what you read is gospel truth.

Then it dawned on us that we could begin to write our book piecemeal and replace sections of the current text as we went through (starting with the most egregiously inaccurate and inadequate chapters). As students read through, they will alternate between reading our material, posted online, and reading the old book. We aspire to rewrite about a third of the book as we teach this next year’s class. The material will be posted on the class website, which you can find here. The site is undergoing some changes at the moment and will undergo more throughout the year. Gradually, we’ll replace more and more of the book material, and eventually wind up with our own, brand-new text.

This is a substantial project, and we know it will take years. It’s also a highly ambitious project—ambitious to the point of madness, maybe!

But if so, it’s a truly pleasant madness. It is a deep privilege to work with Dr. McMenomy, who for all our differences of opinion remains the wisest and most insightful man I have ever met. I believe I speak for both of us when I say that we have learned a lot from each other and from the process of teaching this class; and we hope, with some confidence, that our learning will lead to broader and better instruction, and that our students will reap the benefits.

Four Roads to Jerusalem

Thursday, April 5th, 2012

When Jesus of Nazareth entered Jerusalem in triumph, he rode; but the accounts differ as to what he was riding on, and how he got it. Take Matthew: in the First Gospel, Jesus sends his disciples for a colt and a donkey, in order to fulfill the prophecy of Zechariah, “Look, your king is coming to you […] mounted on a donkey, and on a colt” (Matthew 21.5; all verses cited from the NRSV). Yet it seems Matthew inserted that “and”: Zechariah 9.9 is actually talking about one animal, repeating the description for dramatic effect. So Matthew describes Jesus summoning two animals to fulfill a prophecy that doesn’t say what he thought it said.

The insertion of the donkey becomes all the clearer when we consult Mark 11.1-7. Here Jesus sends for just a colt. He also takes some pains in his instructions, telling the disciples what to say if anyone tries to stop them, and promising to send the animal back immediately (a nice thought, since otherwise some random inhabitant of Jerusalem would have lost a valuable animal). Jesus’ foresight pays off, as one assumes it regularly did: sure enough, someone asks the disciples about their errand, and the words of the master set everything straight.

Luke backs up Mark’s story, by and large, with just one animal; he also includes the disciples being questioned about their task. He fails to note the promise to return the animal, an odd omission, considering that Luke is in many ways the most compassionate gospel.

The Gospel of John, hurrying ahead to more important things, gives the incident almost no mention: “Jesus found a young donkey and sat on it” (John 12.14). As in Matthew, it’s a donkey, and it’s a fulfillment of Zechariah, although John’s reading of the prophet is more accurate.

As differences between the gospels go, this is absolutely trivial. No points of theology or doctrine hang on whether or not Jesus promised to return his mount. But other differences between the Gospels are not so lightweight. And even this minuscule difference in text does lead us to wonder, “Who was right?”

I am a historian by trade and training, so my first instincts are to treat this as a historical puzzle.

All such conundrums about the past boil down to sources: primary sources, which are the eyewitness or contemporary accounts, and secondary sources, which are later analyses. What we have here are four primary sources, the gospels. (Some point out that the Bible is one source; but the gospels predate the Bible as currently compiled by centuries. St. Athanasius finally listed the twenty-seven books of the New Testament in AD 367. And even the root word, biblia, is a plural.) For secondary sources, we have the enormous literature and criticism that has been built up around the gospels, from church fathers like Origen to whatever was published last week. These secondary sources hinge on the primaries, however; all they can really do is talk about the gospels and give differing opinions based on them. Secondary sources do inform me, though, that Mark is the earliest of the four, probably written between AD 60 and 70, with Matthew and Luke later in that century, and John around AD 80 or 90. Not quite eyewitnesses, the historically scrupulous will point out, but of the era, which is more than all later scholarship can say.

The secondary sources also point out that Matthew and Luke clearly borrowed from Mark. While Luke is non-specific, he says right up front that he read up on all he could find before writing out his gospel, and alludes to others making the same effort (Luke 1.1-3). So while Matthew and Luke seem to have had information that Mark didn’t, or didn’t include, we really only have two sources for the colt story: Mark and John. John seems to claim a link to “the beloved disciple,” perhaps indicating it was written at the behest and under the guidance of someone who walked with Jesus. Mark makes no such claim, but historians generally give more credence to earlier sources than to later ones. The principle is simple: the longer it has been since the events, the more likely it is that memory has faded, failed, or simply been faked.

Those who discuss the copying errors in handing down the gospels might add, however, that the older a book is, the more time it has had for people to make changes in it, historical or not; but after two thousand years, a few decades here or there probably make little difference. We have no “original” copies of Mark or John, so everything we read must rest on the hope that the scribes got it mostly right, at least in the aggregate. Studies of the even older Hebrew scriptures show that such accuracy is entirely possible.

Even when dealing with more recent and more thoroughly documented events, however, there often comes a time when historians finally have to admit they don’t know the whole truth of a matter and, aside from those most scrupulously bound to their sources, have to make a guess. Often they go on what feels most probable to them. Here we have a little help in the story of Jesus and the colt he rode in on: reassuring the owner that the animal would be returned feels like something Jesus would do. He’d think ahead, and he’d know the owner would need the colt back. And so, with a little historical technique and a little gut instinct, I suggest that of the four variants, Mark’s is the closest to the truth.

Again, this is a dramatically minor point of contention. Nothing is riding on what Jesus rode, and my assertion of Mark’s superiority on this point is essentially meaningless. The episode reveals, however, that the gospels do give different versions of the story, and that does matter. Indeed, with Matthew’s blunder in reading Zechariah, it reminds us that the Bible can, in fact, contain errors.

There are those in the world for whom that statement alone is sacrilege and heresy. Yet I must stand by it; the Bible contains mistakes, a few great stumbles and many small ones. The simple act of copying so many words by hand for so many years practically guarantees it: in handwriting my first outline for this essay, I spelled “inerrancy” with three Rs. Consider: Genesis begins with two rather different accounts of creation. There are three different versions of the last words of Christ; he may have spoken all the words given, but they cannot all have been said last. And, in the case of the colt, it’s the work of a moment to flip back to Zechariah 9.9 (in many editions it is only a few pages back, with only Malachi intervening!) to see that Matthew simply counted wrong. There are mistakes, blunders, additions, and deliberate alterations.

Personally I find this does not diminish the Bible’s power. An early printed edition of the King James Version accidentally left the word “not” out of the seventh commandment, giving us “Thou shalt commit adultery.” Yet the “Wicked Bible,” as this deeply unfortunate edition came to be known, still held the Sermon on the Mount, still contained the Greatest Commandment, still taught “For God so loved the world…” One error did not break the rest. No, the Wicked Bible and all the other changes and mistakes over the centuries teach us two things: one, the Bible should be read with care, and two, the Bible should be read.

Indeed, a multitude of versions actually carries some weight with historians, who recall that eyewitnesses to the same event can give wildly differing accounts of what happened, but will still tell you that something occurred. If we had only one gospel we could call it an invention. Having four, including two written independently, plus all the letters of Paul, we know that something of vast significance occurred in Palestine in the reign of Tiberius. We should try to figure out the truth, of course, and here, too, the multiple gospels help, like pieces of a mosaic. Mark, in its simplicity and as the earliest, seems like historical bedrock. We can look to Mark and see the outlines of a story: a man who lived and taught, and was crucified, and rose again (even if Mark sometimes says little more!). As a historian this appeals to me. As a teacher, however, I find that Matthew and Luke resonate deeply with me, Matthew with the Sermon on the Mount and all the other lessons, Luke with his parables and his care for the downtrodden. Finally, as I am a man who tries to live his life by love, there is John. I struggled with the Fourth Gospel at times: it stands apart, clearly written in an altogether different way, and sometimes with manifestly added passages. Once I even found myself thinking, “This is redundant.” The next lines I read were these: “This is my commandment, that you love one another as I have loved you. No one has greater love than this, to lay down one’s life for one’s friends” (John 15.12-13). I have taken that as a lesson that nothing is wholly redundant when it comes to the Good News.

From a historian’s standpoint, Mark is the most “accurate.” But the four gospels remind me of teaching the same US History lesson to four separate classes in my public school days. Each class, being different, required a different emphasis, and each class got the benefit of my experience teaching the ones before.

Jesus rode into Jerusalem. You have four versions to choose from if you must select just one; but you can also draw on all four and learn much. After all, the important part is not what Jesus rode or how he got it, but the part all four versions agree on perfectly: that he rode in triumphant.

Making Sense and Finding Meaning

Sunday, October 4th, 2009

My intermediate and advanced Greek and Latin classes are largely translation-based. There’s a lot of discussion among Latin teachers about whether that’s a good approach, but much of the dispute is, I think, mired in terminological ambiguity, and at least some of the objections to translation classes don’t entirely apply to what we’re doing. What I’m looking for is emphatically not a mechanical translation according to rigid and externally objective rules (“Render the subjunctive with ‘might’,” “Translate the imperfect with the English progressive,” or the like), but rather the expression of the student’s understanding of each sentence as a whole, in the context of the larger discussion or narrative.

We aren’t there to produce publishable translations: that’s an entirely different game, with different rules. For us, translations are the means to an end: the understanding is the real point of the process, but it’s hard to measure understanding unless it’s expressed somehow. The translations, therefore, are like a scaffold surrounding the real edifice, which is engagement with the text as a whole: its words, its sounds, and its various levels of meaning. That engagement is hard to pin down, but it allows us to make a genuine human connection with the mind of the author. A detached mechanical “translation”, though, is like a scaffold built around nothing, or the new clothes without the emperor. Even were artificial intelligence to advance to the point that a computer could produce a flawless rendition of a text into another language, it still would not have achieved what is essential. It would not have understood. It would not have savored the words, grasped the concepts, combined them into larger ideas, applied them to new contexts, or come to a meeting of the minds with the author.

This is not always an easy concept for students to grasp. Some fret about getting exactly the right wording (as if there were such a thing) while seeming less concerned with understanding the essential meaning. At the beginning of the year, I usually have a few students who make the (to me bizarre) claim, “I translated this sentence, but I don’t understand it.” My response is always some variation on, “If you didn’t make sense of it, you didn’t really translate it.”

We talk about making sense of the passage, but even that turn of phrase may be one of the little arrogances of the modern world. The prevalent modern paradigm suggests that the world is without order or meaning unless we impose it; Christianity, however, presupposes a world informed by its Creator with a consistent meaning that we only occasionally perceive. For us, it would probably be more accurate, and certainly more modest, to talk of finding or discovering the sense in the passage.

Whether we call it “making sense” or “finding sense”, though, it is not just the stuff of language classes. Every discipline is ultimately about finding meaning in and through its subject matter. In language and literature we look for the informing thought behind speech and writing. In history we look to understand the whole complex relationship of individuals and groups through time, with their ideas, movements, and circumstances, and what it all meant for them and what it means for us today. The sciences look to find the rationale in the order of the physical universe, mathematics the meaning of pure number and proportion, and philosophy the sense of sense itself. Each discipline has its own methods, its own vocabulary, and its own techniques. Each has its own equivalent of the translation exercise, too — something we do not really for its own sake, but to verify that the student has grasped something larger that cannot be measured directly. But behind those differences of method and process, all of them are about engaging with the underlying meaning. All real learning is. (In that respect it differs from training, which is not really about learning as such, but about acquiring known skills. Both learning and training are essential to a well-rounded human being, but they shouldn’t be confused with one another.)

From a secular point of view, this must seem a rather granular exercise with many dead ends. That each thing should have its own limited kind of meaning, unrelated to every other, seems at least aesthetically unsatisfying; it offers us Eliot’s Waste Land: a “heap of broken images”, pointing nowhere. Language is fractured, and our first great gift of articulate speech clogs and becomes useless.

Our faith offers us something else: we were given the power to name creation — to refer to one thing through or with another — as a way of proclaiming the truth of God, surely, but also, I think, as a kind of hint as to how we should view the whole world. Everything, viewed properly, can be a sign. As Paul says in Romans, “For since the creation of the world God’s invisible qualities—his eternal power and divine nature—have been clearly seen, being understood from what has been made, so that men are without excuse” (1:20, NIV); Alanus ab Insulis (1128-1202) wrote, about 1100 years later, “Every creature in the world is like a picture to us, or a mirror.” Signification itself is transformed and transfigured, sub specie aeternitatis, from a set of chaotic references into a kind of tree, in which the signifiers converge, both attesting the unitary truth of the Lord and endowing every created thing in its turn with a holy function.