Author Archive

Failure as a good thing

Friday, March 11th, 2016

People tout many different goals in the educational enterprise, but not all goals are created equal. They require a good deal of sifting, and some should be discarded. Many of them seem to be either obvious on the one hand or, on the other, completely wrong-headed (to my way of thinking, at least).

One of the most improbable goals one could posit, however, would be failure. Yet failure — not as an end (and hence not a final goal), but as an essential and salutary means to achieving a real education — is the subject of Jessica Lahey’s The Gift of Failure (New York: HarperCollins, 2015). In all fairness, I guess I was predisposed to like what she had to say, since she’s a teacher of both English and Latin, but I genuinely think that it is one of the more trenchant critiques I have read of modern pedagogy and the child-rearing approaches that have helped shape it, sometimes with the complicity of teachers, and sometimes in spite of their best efforts.

Christe first drew my attention to an extract of her book at The Atlantic here. When we conferred after reading it, we discovered that we’d both been sufficiently impressed that we’d each ordered a copy of the book.

Lahey calls into question, first and foremost, the notion that the student (whether younger or older) really needs to feel that he or she is doing well at all stages of the process. Feeling good about your achievement, whether or not it really amounts to anything, is not in fact a particularly useful thing. That seems common-sensical to me, but it has for some time gone against the grain of a good deal of teaching theory. Instead, Lahey argues, failing — and in the process learning to get up again, and throw oneself back into the task at hand — is not only beneficial to a student, but essential to the formation of any kind of adult autonomy. Insofar as education is not merely about achieving a certain number of grades and scores, but about the actual formation of character, this is (I think) spot-on.

A good deal of her discussion is centered around the sharply diminishing value of any system of extrinsic reward — that is, anything attached secondarily to the process of learning — be it grades on a paper or a report card, a monetary payoff from parents for good grades, or the often illusory goal of getting into a good college. The only real reward for learning something, she insists, is knowing it. She has articulated better than I have a number of things I’ve tried to express before. (On the notion that the reason to learn Latin and Greek was not as a stepping-stone to something else, but really to know Latin and Greek, see here and here. On allowing the student freedom to fail, see here. On grades, see here.) Education should be — and arguably can only be — about learning, not about grades, and about mastery, not about serving time, passing tests so that one can be certified or bumped along to something else. In meticulous detail, Lahey documents the uselessness of extrinsic rewards at almost every level — not merely because they fail to achieve the desired result, but because they drag the student away from engagement in learning, dull the mind and sensitivity, and effectively promote the ongoing infantilization of our adolescents — making sure that they are never directly exposed to the real and natural consequences of either their successes or their failures. Put differently, unless you can fail, you can’t really succeed either.

Rather than merely being content to denounce the inadequacies of modern pedagogy, Ms. Lahey has concrete suggestions for how to turn things around. She honestly reports how she has had to do so herself in her ways of dealing with her own children. The book is graciously honest, and I enthusiastically recommend it to parents and teachers at every level. If I haven’t convinced you this far, though, at least read the excerpt linked above. The kind of learning she’s talking about — engaged learning tied to a real love of learning, coupled with the humility to take the occasional setback not as an invalidation of oneself but as a challenge to grow into something tougher — is precisely what we’re hoping to cultivate at Scholars Online. If that’s what you’re looking for, I hope we can provide it.

STEMs and Roots

Tuesday, February 2nd, 2016

Everywhere we see extravagant public handwringing about education. Something is not working. The economy seems to be the symptom that garners the most attention, and there are people across the political spectrum who want to fix it directly; but most seem to agree that education is at least an important piece of the solution. We must produce competitive workers for the twenty-first century, proclaim the banners and headlines; if we do not, the United States will become a third-world nation. We need to get education on the fast track — education that is edgy, aggressive, and technologically savvy. Whatever else it is, it must be up to date, it must be fast, and it must be modern. It must not be what we have been doing.

I’m a Latin teacher. If I were a standup comedian, that would be considered a punch line. In addition to Latin, I teach literature — much of it hundreds of years old. I ask students, improbably, to see it for what it itself is, not just for what they can use it for themselves. What’s the point of that? one might ask. Things need to be made relevant to them, not the other way around, don’t they?

Being a Latin teacher, however (among other things), I have gone for a number of years now to the Summer Institute of the American Classical League, made up largely of Latin teachers across the country. One might expect them to be stubbornly resistant to these concerns — or perhaps blandly oblivious. That’s far from the case. Every year, in between the discussions of Latin and Greek literature and history, there are far more sessions devoted to pedagogy: how to make Latin relevant to the needs of the twenty-first century, how to advance the goals of STEM education using classical languages, and how to utilize the available technology in the latest and greatest ways. What that technology does or does not do is of some interest, but the most important thing for many there is that it be new and catchy and up to date. Only that way can we hope to engage our ever-so-modern students.

The accrediting body that reviewed our curricular offerings at Scholars Online supplies a torrent of exhortation about preparing our students for twenty-first century jobs by providing them with the latest skills. It’s obvious enough that the ones they have now aren’t doing the trick, since so many people are out of work, and so many of those who are employed seem to be in dead-end positions. The way out of our social and cultural morass lies, we are told, in a focus on the STEM subjects: Science, Technology, Engineering, and Math. Providing students with job skills is the main business of education. They need to be made employable. They need to be able to become wealthy, because that’s how our society understands, recognizes, and rewards worth. We pay lip service, but little else, to other standards of value.

The Sarah D. Barder Fellowship organization to which I also belong is a branch of the Johns Hopkins University Center for Talented Youth. It’s devoted to gifted and highly gifted education. At their annual conference they continue to push for skills, chiefly in the scientific and technical areas, to make our students competitive in the emergent job market. The highly gifted ought to be highly employable and hence earn high incomes. That’s what it means, isn’t it?

The politicians of both parties have contrived to disagree about almost everything, but they seem to agree about this. In January of 2014, President Barack Obama commented, “…I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree. Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”

From the other side of the aisle, Florida Governor Rick Scott said, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”

They’re both, of course, right. The problem isn’t that they have come up with the wrong answer. It isn’t even that they’re asking the wrong question. It’s that they’re asking only one of several relevant questions. They have drawn entirely correct conclusions from their premises. A well-trained plumber with a twelfth-grade education (or less) can make more money than I ever will as a Ph.D. That has been obvious for some time now. If I needed any reminding, the last time we required a plumber’s service, the point was amply reinforced: the two of them walked away in a day with about what I make in a month. It’s true, too, that a supply of anthropologists is not, on the face of things, serving the “compelling interests” of the state of Florida (or any other state, probably). In all fairness, President Obama said that he wasn’t talking about the value of art history as such, but merely its value in the job market. All the same, that he was dealing with the job market as the chief index of an education’s value is symptomatic of our culture’s expectations about education and its understanding of what it’s for.

The politicians haven’t created the problem; but they have bought, and are now helping to articulate further, the prevalent assessment of what ends are worth pursuing, and, by sheer repetition and emphasis, crowding the others out. I’m not at all against STEM subjects, nor am I against technologically competent workers. I use and enjoy technology. I am not intimidated by it. I teach online. I’ve been using the Internet for twenty-odd years. I buy a fantastic range of products online. I programmed the chat software I use to teach Latin and Greek, using PHP, JavaScript, and mySQL. I’m a registered Apple Developer. I think every literate person should know not only some Latin and Greek, but also some algebra and geometry. I even think, when going through Thucydides’ description of how the Plataeans determined the height of the wall the Thebans had built around their city, “This would be so much easier if they just applied a little trigonometry.” Everyone should know how to program a computer. Those are all good things, and help us understand the world we’re living in, whether we use them for work or not.

But they are not all that we need to know. So before you quietly determine that what I’m offering is just irrelevant, allow me to bring some news from the past. If that sounds contradictory, bear in mind that it’s really the only kind of news there is. All we know about anything at all, we know from the past, whether recent or distant. Everything in the paper or on the radio news is already in the past. Every idea we have has been formulated based on already-accumulated evidence and already-completed ratiocination. We may think we are looking at the future, but we aren’t: we’re at most observing the trends of the recent past and hypothesizing about what the future will be like. What I have to say is news, not because it’s about late-breaking happenings, but because it seems not to be widely known. The unsettling truth is that if we understood the past better and more deeply, we might be less sanguine about trusting the apparent trends of a year or even a decade as predictors of the future. They do not define our course into the infinite future, or even necessarily the short term — be they about job creation, technical developments, or weather patterns. We are no more able to envision the global culture and economy of 2050 than the independent bookseller in 1980 could have predicted that a company named Amazon would put him out of business by 2015.

So here’s my news: if the United States becomes a third-world nation (a distinct possibility), it will not be because of a failure in our technology, or even in our technological education. It will be because, in our headlong pursuit of what glitters, we have forgotten how to differentiate value from price: we have forgotten how to be a free people. Citizenship — not merely in terms of law and government, but the whole spectrum of activities involved in evaluating and making decisions about what kind of people to be, collectively and individually — is not a STEM subject. Our ability to articulate and grasp values, and to make reasoned and well-informed decisions at the polls, in the workplace, and in our families, cannot be transmitted by a simple, repeatable process. Nor can achievement in citizenship be assessed simply, or, in the short term, accurately at all. The successes and failures of the polity as a whole, and of the citizens individually, will remain for the next generation to identify and evaluate — if we have left them tools equal to the task. Our human achievement cannot be measured by lines of code, by units of product off the assembly line, or by GNP. Our competence in the business of being human cannot be certified like competence in Java or Oracle (or, for that matter, plumbing). Even a success does not necessarily hold out much prospect of employment or material advantage, because that was never what it was about in the first place. It offers only the elusive hope that we will have spent our stock of days with meaning — measured not by our net worth when we die, but by what we have contributed while we’re alive. The questions we encounter in this arena are not new ones, but rather old ones. If we lose sight of them, however, we will have left every child behind, for technocracy can offer nothing to redirect our attention to what matters.

Is learning this material of compelling interest to the state? That depends on what you think the state is. The state as a bureaucratic organism is capable of getting along just fine with drones that don’t ask any inconvenient questions. We’re already well on the way to achieving that kind of state. Noam Chomsky, ever a firebrand and not a man with whom I invariably agree, trenchantly pointed out, “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” He’s right. If we are to become unfree people, it will be because we gave our freedom away in exchange for material security or some other ephemeral reward — an illusion of safety and welfare, and those same jobs that President Obama and Governor Scott have tacitly accepted as the chief — or perhaps the only — real objects of our educational system. Whatever lies outside that narrow band of approved material is an object of ridicule.

If the state is the people who make it up, the question is subtly but massively different. Real education may not be in the compelling interest of the state qua state, but it is in the compelling interest of the people. It’s the unique and unfathomably complex amalgam that each person forges out of personal reflection, of coming to understand one’s place in the family, in the nation, and in the world. It is not primarily practical, and if our highest goal were merely to get along materially, we should eschew it altogether. The only reason to value it is the belief that there is some meaning to life beyond one’s bank balance and material comfort. I cannot prove that there is, and the vocabulary of the market has done its best to be rid of the idea. But I will cling to it while I live, because I think it’s what makes that life worthwhile.

Technical skills — job skills of any sort — are means, among others, to the well-lived life. They are even useful means in their place, and everyone should become as competent as possible. But as they are means, they are definitionally not ends in themselves. They can be mistakenly viewed as ends in themselves, and sold to the credulous as such, but the traffic is fraudulent, and it corrupts the good that is being conveyed. Wherever that sale is going on, it’s because the real ends are being quietly bought up by those with the power to keep them out of our view in their own interest.

Approximately 1900 years ago, Tacitus wrote of a sea change in another civilization that had happened not by cataclysm but through inattention to what really mattered. Describing the state of Rome at the end of the reign of Augustus, he wrote: “At home all was calm. The officials carried the old names; the younger men had been born after the victory of Actium; most even of the elder generation, during the civil wars; few indeed were left who had seen the Republic. It was thus an altered world, and of the old, unspoilt Roman character not a trace lingered.” It takes but a single generation to forget the work of ages.

But perhaps that’s an old story, and terribly out of date. I teach Latin, Greek, literature, and history, after all.

Reading and Christian Charity

Friday, September 21st, 2012

[This was originally posted as part of the Scholars Online website, and it remains there among the "White Papers", but I thought that putting it out on the blog would give it a little more exposure.]


Over my years as a teacher, I have had parents and students challenge me on my choice of literature on the grounds that some of it was not morally suitable. Among the works that have fallen under their scrutiny are the Iliad (for violence), the Oedipus Rex of Sophocles (patricide, incest), the Volsunga Saga (murder, incest), the plays of Shakespeare (murder, bawdry, violence, drunkenness, adultery, lying, theft, treason…the list is nearly endless), and Frankenstein and “The Importance of Being Earnest” (their authors’ lifestyles). Teaching the various pagan myths, furthermore, has been condemned on the ground that they presume false gods. Even some students who have come looking specifically for classical education have not been able to refrain from ridiculing the Greeks for their beliefs.

Most of these people have good intentions. The behavior to which they are objecting is usually indeed objectionable. We should not practice murder or incest or adultery; we should not steal to become wealthier, or kill others to enhance our personal glory; we should not embrace any of the thousand human vices that are detailed in almost any selection of literature one could pick. We should not be moved by admiration for a work to emulate its author’s bad behavior, either. And I would certainly affirm that the gods of Olympus and Valhalla are fictions, and that any inclination to worship them ourselves ought to be suppressed. So why should we concern ourselves with literature that includes them—and if we do, how should we approach them? It’s a good question, requiring a serious answer.

The case that observation elicits emulation has been made for generations, and there is something to be said for it. One is unlikely to be drawn to a sin one has never heard of. An adult charged with the care and education of children must bear this in mind. One oughtn’t expose an unprepared mind to even literary descriptions of some human activity, any more than one hands a six-year-old the keys to the car.

The countervailing argument is that some familiarity with the harsh reality of the world is necessary: children need to recognize evil to reject it. After all, we live inescapably in a sinful world, and are ourselves part of it. Our tendency to sin will not be eliminated by cultivating ignorance. We may not have heard of one sin, but we can almost certainly make up the lack in some other way out of our own flawed natures. In addition to exposing us to bad things, good literature can help teach us to recognize evil and avert it.

So these two excellent arguments stand perpetually at odds. It’s nothing new: the question rages in Plato’s Republic, and it’s still part of our public discourse relating to censorship. Unsurprisingly, Plato, who believes in the moral perfectibility of man, prefers censorship; today’s libertarian tends toward the other extreme, in a fond belief that market forces will achieve something optimized not only for economic equilibrium but moral balance. A Christian cognizant of our fallen nature, though, can accept neither extreme. The trick seems to be in ascertaining exactly where to place the boundary at any given time.

That is of course not obvious, nor is it even really clear that there is a fixed line so much as a murky zone in the middle somewhere, to be navigated with a queasy caution. It is hard to decide what ought to be read and what ought to be suppressed without relying on some kind of calculus—whether simple or elaborate—of a work’s infractions. The problem is that an incident-count is usually going to miss the point. One can promote a thoroughly repugnant ethic without resorting to obscenity or violence. It is also possible to show the light of redemption shining through the nastiest human experiences.

The shallow scoring methodology often used to define movies or books as unsuitable because of their quantities of inappropriate behavior will also rule out the Scriptures. The Old Testament objectively recounts almost every known form of sin. The Gospels are not much better on that computation: they’re full of hypocrites and adulterers and sinners of every other sort, and the narrative comes to a wholly unwarranted execution by crucifixion. Can we allow our children to read such things?

And yet—dare we allow our children not to read such things? Are we saved by the overwhelming niceness of God, or by this horrific and bloody sacrifice once offered? Weren’t the Children of Israel freed in a sequence of increasingly grisly plagues upon their Egyptian oppressors? Don’t we need to take these stories into ourselves and make them ours? We aren’t coaxed into the Kingdom of God as into a four-star hotel, by its elegant appointments and superior service—we’re driven, battered, and corralled, lifted up out of the mire of our own making because that’s finally the only place we can turn where we don’t see our own destruction. So sooner or later we must allow our children to encounter some unpleasant material. It seems to me better that they should encounter at least some of it in literature rather than in person.

Still, as responsible parents trying to raise children in the nurture of the Lord, we have to wrestle with the boundaries of where and when, and even after that we’re going to have to determine how to approach it in substance. It’s never going to be easy, safe, or comfortable. We’re going to make mistakes. We’re going to give our children some things they’re not ready for. It will hurt them. We’re going to protect them from things they really should have known. That will hurt them too.

We can lament this, but we cannot avoid it. There’s no easy answer. People who rely on simplistic formulae will achieve commensurately simplistic solutions. I know some families, for example, that simply rely on movie ratings for their film standards. PG-13 is okay; R is not. The problem is that it doesn’t work. There are some profoundly moving and powerful movies—important ones—that are rated R. There are also some vile ones that skate by with a PG or even a G rating—perhaps not those promoting ostentatious sexuality, extreme physical violence, or drug abuse, but some that plant corrupting seeds in the soul all the same (and there are sins other than sins of sex, violence, and substance abuse). The complexity of our experience cannot be reduced to a simple tally.

My thinking on this issue has been transformed by a number of things over the last decade, but by nothing as much as C. S. Lewis’s An Experiment in Criticism. I recommend it to everyone concerned with the complex balance of both what and how we read. What I’m going to propose here has to do with drawing what Lewis says to the critical community at large at least partway under the umbrella of specifically Christian thought. (While Lewis was of course a noted Christian apologist, this particular work was written for the scholarly community, and tends not to take an openly theological approach, though I think it is informed at a deeper level by his faith.) For the most part my own thinking is motivated by an awareness that when you read something, you are not just taking words into your eyes and mind: you are actually encountering, at some level, the person who wrote it.

This is a brash claim, and I’ll be the first to admit that it’s a limited sort of encounter. We don’t know what Homer looked like, or whether there were two of him, or whether one or both of him were blind or female or slaves. There are a lot of facts we don’t know—facts we would know if we were sitting down with him (or her, or them) at dinner. We don’t know, either, whether Shakespeare the writer was Shakespeare the actor or someone else entirely. And even where we have a fair amount of reliable biographical data, we still don’t know a good deal about most authors. A lot slips through the cracks.

But how is that different from any other human encounter? There are a lot of things I don’t know about the fellow I meet on the street. And yet (assuming he doesn’t approach me with obvious hostile intent) I try to give such a person a reception in Christian charity—a fair hearing, genuinely trying to understand what he has to say to me. Our Lord tells us, “For as much as you have done to one of the least of these, you have done also to me.”

There are a lot of things, for that matter, that I don’t know or understand even about those people who are closest to me. There are parts of my wife’s personality that I am only now discovering after nearly thirty years of marriage. There are parts I’m still pretty puzzled about. Maybe I’ll get them figured out eventually, but I’m not wagering on it. This is heady stuff, and should keep us humble and aware of the profound gravity (Lewis called it the weight of glory) that inheres in every single human interaction we have.

Is it reasonable to suggest that we approach reading that way too? I think it’s a thought-experiment worth trying. When we pick up a book, we are privileged to make—on some level—the author’s acquaintance. At the most basic level, we encounter persons when we read. And that imposes on us a moral obligation to listen—listen hard, sincerely, and attempt to understand what they’re trying to tell us.

Note that there are a number of things we have no obligation to do. We’re not obliged to believe everything they say—merely to hear it, and to strive to understand them. We have—and should have—our own beliefs, and others (whether we meet them on the street or through books) have no presumptive claim upon those beliefs, unless they manage to persuade us by honest argument.

At the same time, I don’t think we need to feel obliged to judge everything they say, or to condemn them for crossing this or that line. This seems to be a favorite academic pastime, and a favorite pastime too among a lot of other groups. We live in a society ruled by the iconic thumbs-up or thumbs-down. Things are apparently either to be embraced or dismissed, with no intermediate gradations of evaluation or analysis. Many watchdog groups pronounce a movie worthy or unworthy of my attention, based on whether they agree with what they think it’s propounding. Few from any part of the political or religious spectrum suggest that I sift the work’s content for myself. It’s either one way or the other.

Do we deal with people that way? Some, I suppose, do—but that’s not what Our Lord has told us to do. We believe—at least those of us who believe that God loves us all, sinners as we all are—that we need to receive not only those with whom we agree, but also those with whom we do not. We don’t receive them for the rightness of their opinions, but because of our shared humanity. We don’t give them a cup of water in Jesus’ name because of their own righteousness (or even because of ours) but because of His. And when we do so, if we have the humility for it, we can see a partial image of God in each of them, too: again, not because they are right but because they are His—even if they don’t know it.

That’s why I can read Homer and receive the raw humanity of his tale, expressed in selfish, generous, sinful, driven, glorious—and contradictory—people. I don’t have to approve of Hamlet and his decisions (many of which are repellent, I think) to find a window on some very serious truths about human nature. We can have a literary sympathy for him without approving his deeds. I don’t have to approve of Mary Shelley’s behavior to recognize that she has some important and serious things to tell me about our capacity to create and to betray. I don’t have to approve of Oscar Wilde’s lifestyle to appreciate some of his scathingly funny (and often correct) pieces of human insight.

I am not eager to found another school of literary criticism, but I cannot find in any of the currently dominant ones the slightest note of the moral burden I think we inherit as readers. I would like at least to advance the notion that there is—less as a school of critical practice, and more as a disposition of the heart—a Christian way of reading. I would like to suggest that as a paradigm for such Christian reading, we take an approach that may seem simplistic to some, daring to others; but I think it will exercise our moral capacity and force us back where we belong, humbled, upon the all-sufficient love of Christ.

If we as Christians were to read with a fundamental charity toward the author, we would achieve something of a revolution in critical thought, at least within the Church. No, the world will not listen, most likely; it seldom does. It will have its own combinations of pettiness and loftiness, and it will come to its own mix of profound and vapid perceptions. And we may not do a lot better, in terms of critical output. But we are under no obligation to be successful: we are obliged to do what is right, irrespective of its success or failure.

Herewith I present a handful of what seem to me to be the chief implications of that principle:

We must make a good-faith effort to learn what the author was trying to say.
The so-called New Criticism of the 1950s laid it down as axiomatic that authorial intention was more or less beyond recovery, and that the text itself should be scrutinized on absolute terms as a work entirely unto itself. There is of course a profound truth behind what they claimed. We can never wholly or perfectly know the mind of another. In fact, the likelihood is high we will from time to time make some rather serious errors.

But it does not make matters better to combine a profound insight with an oversight, even more profound. What the New Critics seem to have missed is the fact that, if there is no authorial intention at stake, there is really no point to reading at all: if there is no context, neither is there, in any meaningful sense, a text. The purpose of writing in the first place is lost, for an author is almost never merely weaving words into an abstract object for his own amusement: he is attempting to communicate with readers, whoever they may be. If we respect that intention and respond in charity, we have to take this seriously.

We will never completely discover that intention.
As I said above, our understanding will be imperfect. This chafes some, especially those who require pure theory. I’ve come to expect it. Reality is messy and confusing. Now we see as through a glass darkly: if we can only see God imperfectly (whose intention, at least, is perfect, and whose capacity for self-expression issued in the Logos of all creation) then surely our understanding of our fellow man will be no better. That’s unfortunate, but for now, it’s what we have, so we’d better make the best of it. We can hope that we will in the next life be united not only with God, but also with the rest of God’s creation, in a more perfect understanding.

Not everything in a work can be encompassed by the author’s intention.
Sometimes we will perceive something valuable in the text without being sure whether the author intended it or not. There are passages in the Psalms where the Hebrew word is simply unknown to us. There are passages in Shakespeare where the words seem clear, but the thought that knits them together is impenetrable. There are places in poetry and prose alike where words take on a complex of meanings, and we cannot be entirely sure of whether the author really meant all those meanings or just one. This is where the New Criticism got it right. In the overall richness of literary production, connections emerge either from the subconsciousness of the author, where murky things reside beyond the scrutiny of pure intention, or else they emerge from the innate coherence of the material itself: the author has touched a truth perhaps unwittingly, but the truth of the universe resonates with it. This is part of the literary experience, too, and it would be churlish to reject it. Christian readers, I think, can take it as a sign of the grace of God operating in and on our small creative efforts, validating them, fructifying them, and turning them to a higher purpose. I’m not sure how others take it, but that’s not my present concern.

The whole intention of a work will be greater than the sum of its parts.
We cannot evaluate a work solely by regarding the incidents of its narrative. There may be reasons to proscribe certain works because of such things, or to ban them from schools, but this is a pragmatic tactical judgment—not a real evaluation. Put somewhat more pointedly, the mere presence of a sin in a story, no matter how appalling it is, does not make the story immoral. Yes, there are stories that we can call immoral, insofar as they seem to conduce to immoral practices on the part of those who read and believe them, or (at a deeper level) because they present a lie as a truth. But most stories—and all good ones—have to account for the reality of human sin. Dramatically presenting sinful behavior in a story is not ipso facto an endorsement of the sin. A story that presumes a sinless or perfectible humanity is, in the long run, immeasurably more dangerous.

We haven’t entered into the reading process primarily to judge.
I know, the term “judge” is tossed around rather sloppily both inside the Church and at its periphery, and indignant secularists with a somewhat deficient sense of irony routinely condemn Christians for being judgmental. What I’m saying here is merely this: just as we don’t talk to people in order to tally up the conversation’s share of virtue, the goal of the process of reading is not primarily evaluative either. The goal of reading is the meeting of minds itself. That imperfect meeting, across the gaps of time, space, world-view, and personality is not a side benefit of reading; it’s what reading is about. It’s another instance of human interaction—which seems to be a large part of what God put us here for—and it should be conducted with full regard for what Lewis called the weight of glory. I don’t need to pronounce on the ultimate state of Homer’s soul (God surely doesn’t need my help to sort that out); I don’t even need to come up with a value to assign to his work. I could not possibly do so anyway. I do need to love Homer—not because of his artistic virtuosity, or even because of his own intrinsic worth as a person, but because God loved him first and loves him still.

To recognize and embrace a truth is infinitely more rewarding than rejecting something.
When we have moved away from the position of judging, we also allow all those people—imperfect as they are—to mediate God’s love and God’s presence to us, and in that very act we can turn around some of the perceived deficiencies in these works, and make of them powerful lenses. When Achilles, a proud killing machine, and yet also a deeply sensitive representative of his culture—poetic, cruel, brilliant, and vengeful—extends mercy to Priam at last, he offers not only the mercy of Achilles, but an image of the mercy of God. Does Achilles know that? No. Does Homer know it? No. Does it matter that they don’t know it?  No. It’s powerful because it comes unexpectedly, like lightning from a clear sky. What we experience there is not pure alienation and bewilderment: the great shock here at the end of the ordeal of the Iliad is the shock of recognition—like climbing Everest and finding there, waiting for you, an old friend. The part of our souls that responds to the love of Christ, mirrored among our fellow churchmen on Sunday morning, should be able to recognize it, even in glimmers half-understood, in the far reaches of time and space. The incongruity of the context can endow it with a peculiar power: a bright light shines with equal intensity by day or night, but it’s by night that we see it best.

Humility is never out of place.
The words that come most painfully to most academics are, “I don’t know.” Scarcely less shameful than not knowing something is not having an opinion on it. Being willing to admit that we don’t know something, and withholding the formation of an opinion until we do, though, can be hugely liberating. It leaves us open to perceive without bias. And if it entails an admission that we aren’t infinitely wise, so much the better. We all need to be reminded of that.

It’s easier to miss something that’s there than mistakenly to see something that isn’t.
Accordingly we should remain open to the possibility—indeed, the virtual certainty—that we’ve missed something. This is one of the reasons one can keep coming back to the same literature; it has the happy result that, as one grows older, one can find valuable new things in what one might previously have discarded.

It’s akin to the unicorn problem. It’s easy to demonstrate the existence of people or dogs—one need only point one out. It’s nearly impossible to prove that unicorns don’t exist, though, unless they are logically self-contradictory. After conducting a painstaking search, we can say with some assurance that there are no unicorns here—but that doesn’t mean that they aren’t lurking just beyond our sight. In the same way, it’s virtually impossible to show that a work lacks real literary value. I’m not sure why anyone feels called upon to try, or why some seem so eager to dismiss as many things as possible. As ever, the dismissal on this level is tantamount to a dismissal of the person behind the work. Dare we, at peril of our own souls, do that?

When a work doesn’t speak to me, really the worst thing I can honestly say about it is that it doesn’t speak to me. That’s a statement that’s as much about me as about it. I have too often had the humbling experience, though, of returning to works—sometimes after several readings—and discovering in them something I had missed before. It was many years and a dozen readings or more before Hamlet really started to make sense to me. I don’t think Hamlet really improved or altered in the interim.

Do note that this is different from perceiving a positive literary or moral fault in a work. Of the two, the literary fault is just a failure of workmanship; the moral fault is more problematic and probably more important. I think one can say that a work of literature is to be approached with caution or avoided altogether if its whole program is positively pernicious. But this is properly the domain of moral philosophy, and not in and of itself a literary judgment. Of course a literary scholar is also a moral agent, and this is not a concern that can be ruled out of bounds; nor in many cases are moral and artistic faults completely separable. I think it is always possible, too, that a work that is apparently advocating something we don’t approve of will, upon recognition of its artistic virtues, turn out not to have been saying that all along—but that is a complex and troubling line of inquiry too big for the present context.

Ridicule is not helpful to the enterprise.
Ridicule does not ennoble the one ridiculing; it does not benefit the one ridiculed; it does not helpfully inform the third party. It virtually never promotes real understanding; it seldom makes a significant distinction; it is, accordingly, at best pointless, at worst cruel, and most often (even when the object of ridicule is dead and gone, and beyond apparent harm) it sets a low example of callous disregard and uncharity, a pattern of not hearing and not receiving another genuinely. There is room for satire in the world, but it’s the form of literature most perilous for its practitioner: it needs to be conducted with an eye on the higher goal of lifting someone or something up, not merely tearing people down.

All truth is God’s truth.
If something is not true in and of itself, no amount of pious dressing will make it true. Conversely, if it is true, it needs no further raison d’être. We don’t need to apologize for every and any truth, or make it a platform for apologetics or pious polemics. Apologetics have their place, and I applaud and appreciate them: but truth, insofar as anything is true in itself, needs no further justification. The attempt to frame everything up as a case for Jesus, or to endow every story with a moral, or to force on every historical essay an evaluative pronouncement upon a culture, does not work to the glory of God. It instead tends to give the impression that truth is only worth heeding if we can somehow cash it in for platitudes, and tie it to an overtly theological point. Such a timorous view of the truth confounds the fear of the Lord: it’s fear for the Lord, and argues a fragile faith that cannot endure to look at the beauty of truth for what it is, and know that it is God’s.

And in a sense, I think, such people deprive themselves of a view of God in the very act of trying to keep their perspectives pure. For while I am very far from being a pantheist, I think (as Paul suggests in the first chapter of Romans) that the Lord has in fact hidden himself—or perhaps we might say, metaphorically, that he has left his fingerprints, to be discovered, as a channel of revelation and delight for us—throughout the weird and wonderful diversity of creation, with the divinely ironic result that even those who deny Him can convey to us an image of Him in spite of themselves.

Why Study Greek?

Thursday, September 13th, 2012

I would make them all learn English: and then I would let the clever ones learn Latin as an honour, and Greek as a treat.
— Winston Churchill (somewhat out of context).

A few years ago I wrote an entry on this blog entitled “Why Study Latin?” It was a distillation of my own thoughts about the actual benefits of learning Latin — and the benefits one ought legitimately to expect from doing so. I tried to distinguish the real benefits from other phantom benefits that might not be, in and of themselves, fully valid reasons for undertaking the study. Not everyone agreed, but in general I stand by what I said there. From my point of view, the chief reason to learn Latin is to be able to read Latin; a significant second is to gain that unique way of looking at the world that attends that ability. One has access to a number of great works of Latin literature in their original forms, therefore, and one has also an enhanced ability to think in Latinate terms.

Of course other collateral benefits might reasonably accrue, but they are neither absolutely guaranteed to the student of Latin, nor are they benefits that attend Latin study exclusively. Dr. Karl Maurer of the University of Dallas suggested that I didn’t sufficiently credit the advantages a trained Latinist would have in reading older English — and he’s definitely right that this kind of textural depth of English poetry and prose will probably elude anyone who isn’t familiar with Latin, and the way Latin education was a cornerstone of English education from about 1500 to at least 1900. I certainly don’t disagree with his claims there; I don’t think they rank as matters of linguistics as much as matters of literary development and style. They’re still not trivial, however.

Be that as it may, for a variety of reasons, some of them right and some of them wrong, learning Latin has its champions, and I hope it gains a lot more. While I don’t agree with all the reasons one might advance for Latin study, I will enthusiastically concur that it’s a terrific thing to learn and to know.

Far fewer, however, champion learning Greek so loudly. For a variety of reasons, Greek is seen as far less significant. Some of those reasons are sound: Greek does not directly stand behind a broad range of modern Western European languages the way Latin does. Many of our ideas of statecraft and polity come from Greece, but most of them came through Latin in the process. Other reasons people shy away from Greek are fairly trivial. It has an odd-looking alphabet. Its literature seems to depend on a lot of odder assumptions. Realistic, though rather defeatist, is the fact that, in general, Greek is just considered tougher to learn. Many mainstream churches no longer even require their clergy to be able to read Greek (which seems preposterous to me, but that’s another matter).

For whatever reasons, Greek is certainly studied far less at the high school level than it once was. I read a statistic a few years ago suggesting that maybe a thousand students were actually studying ancient Greek in modern American high schools at any one time. The numbers may be as high as two thousand, but surely no higher than that. I don’t know whether those numbers have risen or fallen since I read it, but I certainly see no evidence that they have skyrocketed. I do occasionally run into a Latin teacher at the American Classical League Summer Institutes who teaches some Greek, but it’s most often a sideline, and often a completely optional “extra” for before or after school. Most of those students are being exposed to the Greek alphabet and some vocabulary, but fairly few of them are receiving a rigorous exposure to the grammar of Greek as a whole. If one narrows that to those who have studied real Classical Greek, as opposed to New Testament Greek, the numbers are probably smaller still.

For me most of the reasons for learning to read Greek are similar to those for reading Latin. The chief benefit, I would still insist, is to be able to read Greek literature in its original terms. Lucie Buisson wrote an eloquent defense of Homer in Greek not long ago in this blog. You cannot acquire a perspective on the Homeric poems like Lucie’s without reading them in Greek. It’s a huge deal: something snaps into view in a way that just cannot be explained to someone who hasn’t experienced it. No translation, no matter how good, can capture it for you. Though Keats memorably thanked Chapman for something like this eye-opening experience, the fact remains that Keats didn’t have the real thing as a comparandum. Chapman’s Homer is terrific — but Homer’s Homer is better.

Beyond the immediate experience of the literary objects themselves there is the fact that Greek provides its students with what I can only (metaphorically) call another set of eyes — that is, a different way of seeing the world, with different categories of thought that run deeper than mere changes in vocabulary. Virtually any new language one learns will provide that kind of new perspective: French, Spanish, or German will do so; Latin certainly does. I would suggest that Greek provides a uniquely valuable set precisely because it is further removed from English in its basic terms.

A reasonable command of multiple languages gives us what might be likened to stereoscopic vision. One eye, or one point of view, may be able to see a great deal — but it’s still limited because it’s looking from one position. A second eye, set some distance from the first, may allow us to see a somewhat enlarged field of view, but its real benefit is that it allows us, by the uncannily accurate trigonometric processor resident in our brains, to apprehend things in three dimensions. Images that are flat to one eye achieve depth with two, and we perceive their solidity as we never could otherwise. Something similar goes on with an array of telescope dishes spread out over a distance on the earth — they allow, by exploiting even a relatively slight amount of parallax in cosmic terms, an enhanced apprehension of depth in space. (Yes, there are also some other advantages having to do with resolution — all analogies have their limits.)

I would argue that every new language one learns will thus provide another point of view, enhancing and enriching, by a kind of analogical stereoscopy, a deeper and more penetrating view of the world. And like the more widely spaced eyes, or the telescopes strung out in a very large array, the further apart they are, the more powerfully their “parallax” (to speak purely analogically) will work upon us. This, I would argue, is one of the chief reasons for learning Greek. In some of its most fundamental assumptions, Greek is more sharply distinct from English than is Latin. A few examples will have to suffice.

Greek, for example, invites us to think about time differently. Greek verb tenses are not as much about absolute time as English verb tenses are; they are more about what linguists call aspect (or aspect of action in older writings). That is, they have more to do with the shape of an action — not its intrinsic shape, but how we’re talking about it — than merely locating it in the past, present, or future. Greek has a tense — the aorist — that English and Latin are missing. The aorist is used in the indicative mood to denote simple action in the past, but in other moods to express other encapsulations of simple verb action. Greek aorist verbs in the indicative will certainly locate events in the temporal continuum, and certainly English also has ways to express aspect — things such as the progressive or emphatic verb forms: e.g., “I run” vs. “I am running” or “I do run”. But whereas the English verb is chiefly centered in the idea of when something happened or is happening or will happen, with aspect being somewhat secondary, in Greek it’s the other way around. What exactly that does to the way Greek speakers and thinkers see the world is probably impossible to nail down exactly — but it’s not trivial.

Attic and earlier Greek has a whole mood of the verb that isn’t present in English or Latin — the optative. Students of New Testament Greek won’t see this on its own as a rule. There are a few examples such as Paul’s repeated μὴ γένοιτο in Romans (sometimes translated as “by no means”, but intrinsically meaning something more like “may it not come about”). But Attic and older dialects (like Homeric Greek) are loaded with it. It’s not just an arbitrary extension of a subjunctive idea: it runs alongside the subjunctive and plays parallel games with it in ways that defy simple classification.

Greek has a voice that neither English nor Latin knows, properly speaking — what is called the middle voice. It is neither active nor passive, but tends to refer to things acting on or on behalf of themselves, either reflexively or in a more convoluted way that defies any kind of classification in English categories.

The Greek conditional sentence has a range of subtlety and nuance that dwarfs almost anything we have in English. Expressing a condition in Greek, or translating a condition from Greek, requires a very particular degree of attention to how the condition is doing what it is doing. In the present and the past, one may have contrary to fact conditions (“If I were a rich man, I would have more staircases,” or “If I had brought an umbrella I would not have become so wet”), general conditions (“If you push that button, a light goes on”), and particular conditions (“If you have the older edition of the book, this paragraph is different”); in the future there are three other kinds of conditions, one of them nearly (but not quite) contrary to fact (“If you were to steal my diamonds, I’d be sad”), called the future less vivid, and then a future more vivid and a future most vivid, representing increasing degrees of urgency in the future. All of these can be tweaked and modified and, in some rather athletic situations, mixed. If you study Greek, you will never think about conditions in quite the same way again.

Greek has what are called conditional temporal clauses that model themselves on conditions in terms of their verb usage, though they don’t actually take the form of a condition. There is something like this in English, but because we don’t use such a precise and distinct range of verbs for these clauses, they don’t show their similarities nearly as well.

The Greek participle is a powerhouse unlike any in any other Western language. Whole clauses and ideas for which we would require entire sentences can be packaged up with nuance and dexterity in participles and participial phrases. Because Greek participles have vastly more forms than English (which has only a perfect passive and a present active — “broken” and “breaking”) or than Latin (which has a perfect passive and a present active, and future active and passive forms), it can do vastly more. Greek participles have a variety of tenses, they can appear in active, middle, and passive voices, and they are inflected for all cases, numbers, and genders. All of these will affect the way one apprehends these nuggets of meaning in the language.

Those are only some examples of how a Greek sentence enforces a subtly different pattern of thought upon people who are dealing with it. As I said, however, for me the real treasure is in seeing these things in action, and seeing the ideas that arise through and in these expressions. So what is so special that it demands to be read in Greek?

Lucie already has written thoroughly enough about the joys of Homer; much the same could be said of almost any of the other classical authors. Plato’s dialogues come alive with a witty, edgy repartee that mostly gets flattened in translation. The dazzling wordplay and terrifying rhythms of Euripidean choruses cannot be emulated in meaningful English. Herodotus’s meandering storytelling in his slightly twisted Ionic dialect is a piece of wayfaring all on its own. The list goes on.

For a Christian, of course, being able to read the New Testament in its original form is a very significant advantage. Those who have spent any time investigating what we do at Scholars Online will realize that this is perhaps an odd thing to bring up, since we don’t teach New Testament Greek as such. My rationale there is really quite simple: the marginal cost of learning classical Attic Greek is small enough, compared with its advantages, that there seems no point in learning merely the New Testament (koine) version of the language. Anyone who can read Attic Greek can handle the New Testament with virtually no trouble. Yes, there are a few different forms: some internal consonants are lost, so that γίγνομαι (gignomai) becomes γίνομαι (ginomai), and the like. Yes, some of the more elaborate constructions go away, and one has to get used to a range of conditions (for example) that is significantly diminished from the Attic models I talked about above. But none of this will really throw a student of Attic into a tailspin; the converse is not true. Someone trained in New Testament Greek can read only New Testament Greek. Homer, Euripides, Plato, Aristotle, Sophocles, Herodotus, Thucydides — all the treasures of the classical Greek tradition remain inaccessible. But the important contents of the New Testament and the early Greek church fathers are open even with this restricted subset of Greek — and they are very well worth reading.

Greek is not, as mentioned earlier, a very popular subject to take at the high school level, and it’s obvious that it’s one of those things that requires a real investment of time and effort. Nevertheless, it is one of the most rewarding things one can study, both for the intrinsic delights of reading Greek texts and for some of the new categories of thought it will open up. For the truly exceptional student it can go alongside Latin to create a much richer apprehension of the way language and literary art can work, and to provide a set of age-old eyes with which to look all that more precisely at the modern world.

Computer Programming as a Liberal Art

Monday, September 3rd, 2012

One of the college majors most widely pursued these days is computer science. This is largely because it’s generally seen as a ticket into a difficult and parsimonious job market. Specific computer skills are demonstrably marketable: one need merely review the help wanted section of almost any newspaper to see just how particular those demands are.

As a field of study, in other words, its value is generally seen entirely in terms of employability. It’s about training, rather than about education. Just to be clear: by “education”, I mean something that has to do with forming a person as a whole, rather than just preparing him or her for a given job, which I generally refer to as “training”. If one wants to become somewhat Aristotelian and Dantean, it’s at least partly a distinction between essence and function. (That these two are inter-related is relevant, I think, to what follows.) One sign of the distinction, however, is that if things evolve sufficiently, one’s former training may become irrelevant, and one may need to be retrained for some other task or set of tasks. Education, on the other hand, is cumulative. Nothing is ever entirely lost or wasted; each thing we learn provides us with a new set of eyes, so to speak, with which to view the next thing. In a broad and somewhat simplistic reduction, training teaches you how to do, while education teaches you how to think.

One of the implications of that, I suppose, is that the distinction between education and training has largely to do with how one approaches it. What is training for one person may well be education for another. In fact, in the real world, probably these two things don’t actually appear unmixed. Life being what it is, and given that God has a sense of humor, what was training at one time may, on reflection, turn into something more like education. That’s all fine. Neither education nor training is a bad thing, and one needs both in the course of a well-balanced life. And though keeping the two distinct may be of considerable practical value, we must also acknowledge that the line is blurry. Whatever one takes in an educational mode will probably produce an educational effect, even if it’s something normally considered to be training. If this distinction seems a bit like C. S. Lewis’s distinction between “using” and “receiving”, articulated in his An Experiment in Criticism, that’s probably not accidental. Lewis’s argument there has gone a long way toward forming how I look at such things.

Having laid that groundwork, therefore, I’d like to talk a bit about computer programming as a liberal art. Anyone who knows me or knows much about me knows that I’m not really a programmer by profession, and that the mathematical studies were not my strong suit in high school or college (though I’ve since come to make peace with them).

Programming is obviously not one of the original liberal arts. Then again, neither are most of the things we study under today’s “liberal arts” heading. The original liberal arts included seven: grammar, dialectic, and rhetoric — all of which were about cultivating precise expression (and which were effectively a kind of training for ancient legal processes) — and arithmetic, geometry, music, and astronomy. Those last four were all mathematical disciplines: neither music nor astronomy bore much relation to what is taught today under those rubrics. Music was not about pavanes or symphonies or improvisational jazz: it was about divisions of vibrating strings into equal sections, and the harmonies thereby generated. Astronomy was similarly not about celestial atmospheres or planetary gravitation, but about proportions and periodicity in the heavens, and the placement of planets on epicycles. Kepler managed to dispense with epicycles, which are now of chiefly historical interest.

In keeping with the spirit, if not the letter, of that original categorization, we’ve come to apply the term “liberal arts” today to almost any discipline that is pursued for its own sake — or at least not for the sake of any immediate material or financial advantage. Art, literature, drama, and music (of the pavane-symphony-jazz sort) are all considered liberal arts largely because they have no immediate practical application to the job of surviving in the world. That’s okay, as long as we know what we’re doing, and realize that it’s not quite the same thing.

While today’s economic life in the “information age” is largely driven by computers, and there are job openings for those with the right set of skills and certifications, I would suggest that computer programming does have a place in the education of a free and adaptable person in the modern world, irrespective of whether it has any direct or immediate job applicability.

I first encountered computer programming (in a practical sense) when I was in graduate school in classics. At the time (when we got our first computer, an Osborne I with 64K of memory and two drives with 92K capacity each), there was virtually nothing to do with classics that was going to be aided a great deal by computers or programming, other than using the very basic word processor to produce papers. That was indeed useful — but had nothing to do with programming from my own perspective. Still, I found Microsoft BASIC and some of the other tools inviting and intriguing — eventually moving on to Forth, Pascal, C, and even some 8080 Assembler — because they allowed one to envision new things to do, and project ways of doing them.

Programming — originally recreational as it might have been — taught me a number of things that I have come to use at various levels in my own personal and professional life. Even more importantly, though, it has taught me things that are fundamental about the nature of thought and the way I can go about doing anything at all.

Douglas Adams, the author of the Hitchhiker’s Guide books, probably caught its most essential truth in Dirk Gently’s Holistic Detective Agency:

“…if you really want to understand something, the best way is to try and explain it to someone else. That forces you to sort it out in your mind. And the more slow and dim-witted your pupil, the more you have to break things down into more and more simple ideas. And that’s really the essence of programming. By the time you’ve sorted out a complicated idea into little steps that even a stupid machine can deal with, you’ve learned something about it yourself.”

I might add that not only have you yourself learned something about it, but you have, in the process, learned something about yourself.

Adams also wrote, “I am rarely happier than when spending an entire day programming my computer to perform automatically a task that it would otherwise take me a good ten seconds to do by hand.” This is, of course, one of the drolleries about programming. The hidden benefit is that, once perfected, that tool, whatever it was, allows one to save ten seconds every time it is run. If one judges things and their needs rightly, one might be able to save ten seconds a few hundred thousand or even a few million times. At that point, the time spent on programming the tool will not merely save time, but may make possible things that simply could never have been done otherwise.
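The arithmetic behind the joke is easy to make concrete. A toy calculation (the figures here are my own illustration, not anything from Adams):

```python
# A full 8-hour day spent automating a ten-second task breaks even
# once the tool has run enough times to repay the investment.
seconds_spent_programming = 8 * 60 * 60   # one working day
seconds_saved_per_run = 10

break_even_runs = seconds_spent_programming / seconds_saved_per_run
print(break_even_runs)  # 2880.0 runs to break even

# At a million runs, the net savings dwarf the initial cost:
runs = 1_000_000
net_saved_hours = (runs * seconds_saved_per_run
                   - seconds_spent_programming) / 3600
print(round(net_saved_hours))  # roughly 2770 hours saved
```

A few thousand runs is nothing for a tool used daily; at that scale the day of programming pays for itself many times over, which is the point of the paragraph above.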

One occasionally hears it said that a good programmer is a lazy programmer. That’s not strictly true — but the fact is that a really effective programmer is one who would rather do something once, and then have it take over the job of repeating things. A good programmer will use one set of tools to create other tools — and those will increase his or her effective range not two or three times, but often a thousandfold or more. Related to this is the curious phenomenon that a really good programmer is probably worth a few hundred merely adequate ones, in terms of productivity. The market realities haven’t yet caught up with this fact — and it may be that they never will — but it’s an interesting phenomenon.

Not only does programming require one to break things down into very tiny granular steps, but it also encourages one to come up with the simplest way of expressing those things. Economy of expression comes close to the liberal arts of rhetoric and dialectic, in its own way. Something expressed elegantly has a certain intrinsic beauty, even. Non-programmers are often nonplussed when they hear programmers talking about another programmer’s style or the beauty of his or her code — but the phenomenon is as real as the elegance of a Ciceronian period.

Pursuit of elegance and economy in programming also invites us to try looking at things from the other side of the process. When programming an early version of the game of Life for the Osborne, I discovered that simply inverting a certain algorithm (having each live cell increment the neighbor count of all its adjacent spaces, rather than having each space count its live neighbors) achieved an eight-to-tenfold improvement in performance. Once one has done this kind of thing a few times, one starts to look for such opportunities. They are not all in a programming context.
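The inversion is easy to see in a modern sketch (in Python here, not the original Osborne-era code): the straightforward version scans every space on the board; the inverted one does work only for live cells.

```python
from collections import defaultdict

# Cells are (x, y) pairs; `live` is the set of currently live cells.

def step_counting(live, width, height):
    """Straightforward version: every space scans its eight neighbors."""
    new_live = set()
    for x in range(width):
        for y in range(height):
            n = sum((x + dx, y + dy) in live
                    for dx in (-1, 0, 1)
                    for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
            # Standard Life rules: birth on 3, survival on 2 or 3.
            if n == 3 or (n == 2 and (x, y) in live):
                new_live.add((x, y))
    return new_live

def step_incrementing(live):
    """Inverted version: only live cells do any work, each incrementing
    the neighbor counts of its eight adjacent spaces."""
    counts = defaultdict(int)
    for (x, y) in live:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if (dx, dy) != (0, 0):
                    counts[(x + dx, y + dy)] += 1
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

On a mostly empty board, the second version's work is proportional to the number of live cells rather than to the area of the grid — which is where a speedup of the magnitude described becomes plausible.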

There are general truths that one can learn from engaging in a larger programming project, too. I’ve come reluctantly to realize over the years that the problem in coming up with a really good computer program is seldom an inability to execute what one envisions: it’s much more likely to be a problem of executing what one hasn’t adequately envisioned in the first place. Not knowing what winning looks like, in other words, makes the game much harder to play. Forming a really clear plan first is going to pay dividends all the way down the line. One can find a few thousand applications for that principle every day, both in the computing world and everywhere else. Rushing into the production of something is almost always a recipe for disaster, a fact explored by Frederick P. Brooks in his brilliantly insightful (and still relevant) 1975 book, The Mythical Man-Month, which documents his own blunders as the head of the IBM System/360 project, and the costly lessons he learned from the process.

One of the virtues of programming as a way of training the mind is that it provides an objective “hard” target. One cannot make merely suggestive remarks to a computer and expect them to be understood. A computer is, in some ways, an objective engine of pure logic, and it is relentless and completely unsympathetic. It will do precisely what it’s told to do — no more and no less. Barring actual mechanical failure, it will do it over and over again exactly the same way. One cannot browbeat or cajole a computer into changing its approach. There’s a practical lesson there, and probably a moral one as well. People can be persuaded; reality just doesn’t work that way — which is probably just as well.

I am certainly not the first to have noted that computer programming can have this kind of function in educational terms. Brian Kernighan — someone known well to the community of Unix and C programmers over the years as part of the Bell Labs group from which C and Unix emerged — has argued precisely that in a New York Times article linked here. Donald Knuth, one of the magisterial figures of the first generation of programming, holds forth on its place as an art, too, here. In 2008, members of the faculties of Williams College and Pomona College (my own alma mater) collaborated on a similar statement available here. Another reflection on computer science and math in a pedagogical context is here. And of course Douglas Hofstadter in 1979 adumbrated some of the more important issues in his delightful and bizarre book, Gödel, Escher, Bach: An Eternal Golden Braid.

Is this all theory and general knowledge? Of course not. What one learns along the line here can be completely practical, too, even in a narrower sense. For me it paid off in ways I could never have envisioned when I was starting out.

When I was finishing my dissertation — an edition of the ninth-century Latin commentary of Claudius, Bishop of Turin, on the Gospel of Matthew — I realized that there was no practical way to produce a page format that would echo what normal classical and mediaeval text editions typically show on a page. Microsoft Word (which was what I was using at the time) supported footnotes — but typically these texts don’t use footnotes. Instead, the variations in manuscript readings are keyed not to footnote marks, but to the line numbers of the original text, and kept in a repository of textual variants at the bottom of the page (what is called in the trade an apparatus criticus). In addition, I wanted to have two further sets of notes at the bottom of the page, one giving the sources of the earlier church fathers that Claudius was quoting, and another giving specifically scriptural citations. I also wanted to mark in the margins where the foliation of the original manuscripts changed. Unsurprisingly, there’s really not a way to get Microsoft Word to do all that for you automatically. But with a bit of Pascal, I was able to write a page formatter that would take a compressed set of notes indicating all these things, and parcel them out to the right parts of the page, in a way that would be consistent with RTF and University Microfilms standards.

When, some years ago, we were setting Scholars Online up as an independent operation, I was able, using JavaScript, PHP, and MySQL, to write a chat program that would serve our needs. It’s done pretty well since. It’s robust enough that it hasn’t seriously failed; we now have thousands of chats recorded, supporting various languages, pictures, audio and video files, and so on. I didn’t set out to learn programming to accomplish something like this. It was just what needed to be done.

Recently I had to recast my Latin IV class to correspond to the new AP curriculum definition from the College Board. (While it is not, for several reasons, a certified AP course, I’m using the course definition, on the assumption that a majority of the students will want to take the AP exam.) Among the things I wanted to do was to provide a set of vocabulary quizzes to keep the students ahead of the curve, and reduce the amount of dictionary-thumping they’d have to do en route. Using Lee Butterman’s useful and elegant NoDictionaries site, I was able to get a complete list of the words required for the passages in question from Caesar and Vergil; using a spreadsheet, I was able to sort and re-order these lists so as to catch each word the first time it appeared, and eliminate the repetitions; using regular expressions with a “grep” utility in my programming editor (BBEdit for the Macintosh) I was able to take those lists and format them into GIFT format files for importation into the Moodle, where they will be, I trust, reasonably useful for my students. That took me less than a day for several thousand words — something I probably could not have done otherwise in anything approaching a reasonable amount of time. For none of those tasks did I have any training as such. But the ways of thinking I had learned by doing other programming tasks enabled me to do these here.
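The same pipeline is easy to imagine in script form. The sketch below is a hypothetical reconstruction — the actual work described above was done with a spreadsheet and grep patterns in BBEdit, and the function name, the tab-separated input format, and the question wording are all illustrative assumptions — but the GIFT syntax it emits (`::name:: question {=answer}`) is Moodle's real import format.

```python
# Hypothetical reconstruction of the workflow described above:
# keep each word the first time it appears, drop repetitions,
# and emit Moodle GIFT short-answer items.
# Input lines are assumed to be "word<TAB>gloss" pairs in reading order.

def to_gift(lines):
    """Deduplicate in first-appearance order and format as GIFT items."""
    seen = set()
    items = []
    for line in lines:
        word, _, gloss = line.partition("\t")
        word, gloss = word.strip(), gloss.strip()
        if not word or word in seen:
            continue  # eliminate repetitions after the first appearance
        seen.add(word)
        # Escape characters that are special in GIFT answer text.
        for ch in "~=#{}:":
            gloss = gloss.replace(ch, "\\" + ch)
        items.append("::%s:: What does '%s' mean? {=%s}" % (word, word, gloss))
    return "\n\n".join(items)
```

Run over a few thousand "word, gloss" lines extracted from the Caesar and Vergil passages, something like this would produce a file ready for Moodle's GIFT importer in seconds.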

Perhaps the real lesson here is that there is probably nothing — however mechanical it may seem to be — that cannot be in some senses used as a basis of education, and no education that cannot yield some practical fruit down the road a ways. That all seems consistent (to me) with the larger divine economy of things.

Do you still have that old double-dactyl thing…?

Monday, May 7th, 2012

Okay…now for something a mite silly. Of the various things I’ve published in one medium or another over the years, the one that people still e-mail me asking about is not actually anything serious — but this. It’s not widely available any more, so I thought I’d put it where those who want it can find it. It may also give my students in Latin IV and Western Literature to Dante something to chuckle at. I submitted it to a list of Latinists back in 1995, in response to a double-dactyl contest that had been announced there. For those who were looking for it, here it is. For those who just stumbled on it, I hope you enjoy it. For those who consider me humorless…perhaps you’re right. For those who find it out of place in this serious context…well, flip ahead to the next item or back to the last one…


I realize that the deadline for the double-dactyl competition has come and gone. I also realize that these do not qualify as Proper Double-Dactyls because:

a) there is an irregular overlapping of the sense occasionally into the first verse, which is properly off-limits to all but the obligatory nonsense, and

b) I have dispensed summarily (though, I think, for good cause) with the placement of a name in the second line of every stanza (a concession that cost three permanent punches on my poetic license — but I suspect it’s about to be revoked anyway).

Nevertheless, they do preserve the other features of the form, and constitute a cycle, as it were, of Almost-proper Double-Dactyls, maintaining a one-to-one correspondence of stanza to book of the Aeneid, something that has not, to my knowledge, been attempted before. One wonders why.

Their propriety on other, less formal, grounds, I decline to consider, and encourage the reader to do the same. The fact that they are only slightly and/or obscurely salacious (and not at all vicious) will strike some as a virtue, others as a deficiency; it is, for the time being, an unalterable function of my own mild and retiring nature. I must accordingly leave it to my readers to pronounce on the eligibility of these nugae for admission to the elect and spiritually rarefied company of classic double-dactyls.

To do so will of course require a certain amount of imaginative energy, since the corpus comprises so few real classics. The same task has already caused some discomfort for the author. Though one is inevitably stimulated by the freedom of a new species of verse, still it is a pity that this one is itself so young and its traditions so relatively slight, and there are so few verses eligible for allusive parody. We must manufacture them by exercising the power of hypothesis to — nay, beyond — its furthest reasonable extent. Then, what wonders emerge! Who can imagine what an Archilochus could have done with so potent a form in a siege: who can doubt that he would have reduced whole poleis by suicide, making that cast-off shield of his unnecessary? What clear little rivulets might Callimachus not have fashioned on this irrational bipedal Parnassus? What ripe mysteries could not Sappho have enclosed within the ambit of the Aeolic Iggledy Piggledy?

And yet my particular undertaking here is an epic one, and of a Latinate mold as well. It presumes (for the sake of argument) all those Hellenic and Hellenistic antecedents and more. It presumes as well an entire early history of Latin double-dactyls, and invites us to suppose them as we may. It is obvious on reflection, surely, that the twelve thousand Ennian double-dactyls that never made it into Warmington’s collection would have afforded an unparalleled mine of six-syllable words, elaborately compounded by insertion, one into another. It seems similarly apparent that Lucretius could have written double-dactyls without much altering his general procedure at all. And imagine, for a moment, the Catullan hexasyllabic in all of its pumicexpolitous glory — darkly ironic and bitterly playful. What a lot we have lost to the fact that the double-dactyl was not contrived sooner. I like to think that these very verses here presented (rendered dashingly into Latin, of course) would have afforded Vergil himself a quicker and easier, if not a better, recusatio when pestered by Augustus to produce an epic. Surely the Princeps would have known better than to ask for more. And this is but the beginning. What could the Nachleben of such a work have been? Would Augustine have wept over the fourth double-dactyl? I think not. He’d have had to confess other things. Would Dante have sought another guide, or would the Divine Comedy have been much more comedic, and much less divine?

Be that as it may, it is our mortal lot to patch up as we can the deficiencies of the past, and to this mighty and thankless work I have here set my hand. Lest I appear a mere Johnny-come-lately to this particular area of historical repair, I hasten to point out that the first version of this nugatory opus had in fact been completed before I learned of the similar (and wholly admirable) efforts of some of my colleagues to render the Iliad into limericks. It seems fitting, though, that whereas that has been an accretive product of many authors’ labors (one might say an instance of traditional poetry, growing in our midst, even while we debate whether such a thing is possible), my contribution, like the poem on which it is modeled, is the product of a single vision, howso astigmatic: which is to say, I bear the blame for it entirely myself. That its relationship to its model is one of Very Free Interpretation is granted, and need not, I think, be pointed out in any critical essays; note of all other defects, real or imagined, should be carefully written down and sent to dev.null@nowhere.edu, where they will receive the attention they deserve. In conclusion, I should also warn one and all that any attempt at Deconstruction by anyone anywhere, with or without the proper credentials, will be vigorously resisted to the fullest extent permitted under the prevailing laws.

Which being said, for the amusement of those of my fellow Latinists still possessed of a sense of the absurd (which, given the state of the discipline, must be most of us), I offer the following:




Aeneas Reductus,
or,
The Epick Taym’d

            I.
Arma virumque ca-
nobody’s suffered as
pius Aeneas, the
      Trojan, has done:
so he tells Dido, that
Carthagenetical
Tyrian princess and
      bundle of fun.

            II.
“Arma virumque, ca-
cophonous noises came
down through the floor of a
      large wooden horse;
that night all Hellas broke
pyromaniacally
loose, wrecking Troy, sealing
      Helen’s divorce.

            III.
“Arma virumque, ca-
lamitous ruin has
followed me everywhere,
      run me to ground;
now I, across the whole
Mediterranean,
find myself searching for
      something to found.”

            IV.
Arma virumque, Ca-
lypso had no better
luck when she tried to keep
      arms on her man;
Dido does dire deeds
autophoneutical
(Suicide’s shorter, but
      it wouldn’t scan).

            V.
Arma virumque, ca-
priciously Juno has
fired up the blighters to
      burn all the ships;
pius Aeneas says
(labiorigidly):
“Build some new galleys, guys:
      then — watch your slips.”

            VI.
Arma virumque, ca-
no one expects to get
out when they once have gone
      down into hell;
heroes, though, packing a
patrioracular
promise, appear to come
      through it quite well.

            VII.
Arma virumque, ca-
tastrophe hatches to
cancel the wedding — a
      hitch in the plan:
Turnus, the mettlesome
Rutuliprincipal
lad, grows so mad as to
      nettle our man.

            VIII.
Arma virumque, ca-
nonical topics: a
good man, Evander, now
      enters the field;
Venus grows fretful, and
matriprotectively
calling on Vulcan, buys
      sonny a shield.

            IX.
Arma virumque, can-
tankerous Turnus tries
storming the camp — hopes to
      clean up the plains;
Nisus and Co., caught in
noctiprogredient
slaughters, are slaughtered in
      turn for their pains.

            X.
Arma virumque, (ca-
tharsis unbounded!) young
Pallas, Evander’s son
      buys it, poor pup;
Venus’s son fixes
responsibility —
sees that the prime bounder’s
      number is up.

            XI.
Arma virumque, Ca-
milla the Volscian
makes for the Latins a
      splendid last stand;
leaving a legacy
axiomatical:
“Trust no Etruscan who’s
      eyeing your land.”

            XII.
Arma virumque: can
’neas put Pallas’s
fall from his mind, sweeten
      bitter with verse? —
“But that reminds me…” — so,
semperspontaneous,
he does to Turnus two
      turns for the worse.

Copyright © 1995, Bruce A. McMenomy

Freedom to fail

Thursday, March 31st, 2011

The previous entry on this blog was about failure not being an option — and I subscribe to that. Failure in an ultimate sense is something we should never choose for ourselves: the universe or some other person may well cause us to fail, but we should not elect to fail in a final sense. Nevertheless, failure, and the freedom to fail in the short run without disastrous long-term consequences, is essential to learning. I have taught students with a whole range of abilities and inclinations over the years; there have been some who have been afraid to venture on anything, lest they fail to complete it to some arbitrary standard of perfection. Others tear into the subject with giddy abandon, making mistakes freely and without compunction. Of the two groups, it is invariably the latter that gets the job done. The students in the former group are frozen by fear or reverence for some external standard of excellence or perfection, and they really cannot or will not transcend that fear.

It may seem odd that, while I consider education to be one of the more important activities one can engage in throughout life, it’s actually the model of the game that speaks most directly to what’s going on here. The Dutch historian Johan Huizinga, in a marvelous little book called Homo Ludens, explores the notion of game and gaming in historical cultures. He identifies a number of salient features — but chief among them are two facts: first, that the universe of the game is somehow set apart, a kind of sacred precinct, and, second, that what goes on there does not effectively leave that arena. I think the same can be said of education — and, interestingly, the idea of education as a game is of long standing: the Roman word that most commonly was applied to the school was ludus, which is also the most common word for game or play.

Who doesn’t know at least one student who loves to play games, and who may be remarkably expert in them, but still has difficulty seriously engaging the subjects he or she is nominally studying? In my experience, it’s more the norm than the exception. I’ve heard people decry that fact as a sign of the sorry state into which the world has fallen — but I don’t think that’s all, or even most, of the picture. One of the things that sets games apart from other learning activities is that in a game, one is encouraged, or even required, to try things, in the relative certainty that, at first at least, one is going to make an awful mess of most of them. That’s okay. You get to do it again, and again, and again, if need be.

Within the bounds of the game, one is free to fail. Even there, one should not choose to fail: doing that subverts the game as nothing else ever could. But even if one is trying to win, failure comes easily and frequently, but without serious penalty. The consequence, though, is that students learn quickly enough how not to fail. The idea that one must get everything right the first time is nonsense. The creeping fear that one needs to score 100 on every quiz is nonsense. Even the belief that the highest grade signifies the best education is nonsense. Sure, I have had some students who got extraordinarily high grades and were very engaged with the material; I have had some students who were completely disengaged and got miserable scores. But those are the easy cases, and they are relatively few. The mixed cases are interesting and hard. I’ve had a few who operated the system in order to get good scores, but never really closed with the material. They walked away with a grade — though usually not the best grade — and little else. I wish it were possible to prevent tweaking the system this way, but it often is not. In the end, though, like the student at UCLA Christe recounted in the previous post, they achieved a real failure because they chose it: they sacrificed the substance of their education in order to win a favorable report on the education. It’s a bad trade — yet another instance of the means becoming autonomous.

I have also had other students — probably more of them than in any of the other groups — who thrashed about, and had real difficulty with the material, but kept bashing at it, and wound up making real strides, and in a meaningful sense winning the battle. Christe talked about how a baby learning to walk is taught by the unforgiving nature of gravity. That’s true enough. Gravity is exacting: its rules never waver, and so it may be unforgiving in that regard. It’s also very forgiving in another sense, however. Falling once or even a thousand times doesn’t keep you down or make you more likely to fall the next time. Every time you fall, assuming you haven’t injured yourself critically, you are free to get up again and keep on trying. And perhaps you have learned something this time. If not, give it another go.

Children learning to speak succeed with such amazing speed not in spite of but because of their abundant mistakes. They are forming concepts about the language, and testing and refining them by playing with it so recklessly. A child who learns that “I walked” is a way of putting “I walk” into the past will quite reasonably assume that “I runned” is a way of putting “I run” into the past. This may be a local and small-scale setback when it comes to identifying the right verb form for the task: it most definitely is not failure in a larger sense. It’s a triumph. Sure, it’s incorrect English. It is, nevertheless, the vindication of that child’s language-forming capacity, and the ability to abstract general principles from specific instances. He or she will eventually learn about strong verbs. But such engagement with what one wants to say, and such fearlessness in expressing it, is rocket fuel for the mind. The child learns to speak the way a devoted gamer learns a game — through immersion and unquestioning involvement, untainted by the slightest fear of the failure that invariably, repeatedly attends the enterprise.

When I first started teaching Greek I and II online about fifteen years ago, I came up with what seemed to me a rather innovative plan for the final for the course. Over the years since I haven’t altered it much, because of all the things I’ve ever done as a teacher, it seems to have been one of the most successful. Though in recent years Sarah Miller Esposito has taken Greek I and II over from me, I believe that she’s still doing roughly the same thing, too. I set the final up as a huge, exhaustive survey of virtually everything covered in the course — especially the mechanical things. All the declensions, all the conjugations, all the pronoun forms, and so on, became part of that final exam. It took many hours to complete. I eventually even gave up having other exams throughout the year. Everything (in terms of grade) could hang from the final.

Everything for a year depending on a final? For a high school student? This sounds like a nightmare. I’ve had parents balk and complain — but seldom students: not when they’ve been through it and seen the results. Here’s the trick: the student was allowed to take that exam throughout the summer, as many times as he or she wanted. It could be taken with the book in the lap, with an answer sheet propped up next to the computer; students could discuss the contents with one another, or ask me for answers (though they seldom needed to: I put the number of the relevant section in the book next to each question). The results of each pass could be reviewed, and each section could be retaken as many times as desired. The only requirement was this — the last time any given section of the exam (I think there are eighteen sections, some of them worth several hundred points each) was taken, it had to be taken under exam conditions: closed book, with no outside sources. The final version had to come in by Sept. 1. Students were free to complete it at any point prior: most of them didn’t. Why should they? They were playing the game, and improving their scores. They actually rather liked it. Especially after I was able to get these exam segments running under the Moodle, so that scoring was instantaneous and painless (frankly there’s little that’s as excruciating for a teacher to grade by hand as accented polytonic Greek), they did it a lot. They’d take each segment four, five, perhaps even ten times.

The results of this were, from a statistical point of view, probably ridiculous. It tended to produce a spread of scores ranging from a low of about 98.3 to a high of about 99.9. Nobody left without an A. “What kind of grade inflation is this?” one might ask. But the simple (and exhilarating) fact was that they all came back to class in the fall ready to perform like A students. They had the material down cold — and they hadn’t forgotten it all over the summer either. This is not just my own assessment: they went on to win national competitions, and to gain admission to some of the most prestigious universities in the country — where at least some of them tested into upper division classics courses right away. If that’s grade inflation, so be it. I like to think rather that it’s education inflation. We could use a little more of that. I don’t really take credit for it myself — it’s not that I was such a brilliant teacher. I’m not even primarily a Hellenist — I’m a Latinist. But I credit the fact that they became engaged with it as if with a game.

We live in a society with a remarkably strong gaming culture; but most historical societies have had the same thing. We have surviving games from Egypt and Greece and Rome; chess comes from ancient India and Persia, and go (probably the only game to match chess for complexity from simplicity) from ancient China and Japan. We have ancient African games, and ancient Native American games. Today the videogame industry is a multibillion dollar affair. Board games, card games, sporting equipment, and every other form of game equipment is marketed and consumed with a rare zeal. These products find buyers even in a downturn economy, because they appeal to something very fundamental about who we are. Even while the educational establishment seems to be ever more involved in protecting the fragile ego and self-image of the learner, our games don’t tell us pretty lies. They don’t tell us that we’ll win every time. They tell us we’ll fail and have to keep trying if we want to win. I really think that people savor that honesty, and that the lesson to be learned from it is enormously significant.

I know that there are a lot of things that people have had to say against games, and certainly an undue or inappropriate preoccupation with them may not be a good thing. Nevertheless, they are a genuine part of our God-given nature, and they form, I would argue, one of our most robust models for learning. In games we are free to fail: and that freedom fosters the ability to learn, which is ultimately the legitimate freedom to win. If we can extract any lesson from our games, and perhaps apply it more broadly to the sphere of learning, I think we all will benefit.

Learning and teaching…and learning

Monday, February 28th, 2011

When we first started homeschooling our kids, Christe and I generally divided our tasks according to our general areas of relative expertise — she took the more scientific and mathematical subjects, while I dealt with the more humanities-oriented ones, especially those having to do with language. But it didn’t always fall out that way, and sometimes we had occasion to cross those lines.

One of the more surprising and delightful discoveries to emerge from this process was that it offered, on occasion, an opportunity to do right what I hadn’t done terribly well the first time around. My high school math career was not a progress from glory to glory: I did pretty well in geometry, but that experience was sandwiched between twin skirmishes with algebra from which I emerged somewhat bloodied and perhaps prematurely bowed. After algebra/trig, I generally concluded that math was not for me (or that I was not for it) and I set a course that wouldn’t require me to take any more of it. I completely avoided it in college — something that I now rather regret.

But a number of years later, after college and partway through my graduate career, I found myself teaching both geometry and algebra to our kids. It was liberating to have the controlling hand on the algebra, and to realize that once I was able to see the overall rationale behind the subject, it wasn’t so hard. From here it seems painfully obvious that the whole point of algebra is to isolate the key variable for which you are solving, and simplify the expression on the other side of the equal sign as much as possible. This is the invariable task in every algebra problem. Why none of my teachers ever made that clear at the time is a mystery to me, though in all honesty, I’m not sure whether my failure to grasp it was their fault or my own. In any case, I’ve actually come to like and appreciate algebra after all these years. Do I use it in my daily work? No, generally not: its intersection with Latin and Greek is fairly slight. But I use it just enough, to solve for all manner of things, that I wouldn’t be without it.

It was not in algebra, however, but in geometry that I encountered my most humbling but exhilarating experiences as a homeschooling dad. We had a copy of the old Jurgensen, Brown, and Jurgensen geometry book — a traditional, solid member of the Dolciani family of texts that many of my generation used in high school. We did not own a teacher’s manual. Most of the problems in the book were reasonably straightforward, once you knew how to tackle geometry in general, and, as I said, I was fairly good at geometrical thinking. A minority of them, however, were considerably less straightforward, and a handful just stopped us — my high-school aged daughter Mary, her mathematically precocious younger brother David, and me — in our tracks. There were a few of them that occupied us for hours over the space of several days, while our progress through the text came to a standstill.

I would be lying if I didn’t admit that I was occasionally afflicted with self-doubt on these occasions. What, I wondered, am I doing to my kids? Don’t they deserve someone with more expertise here? And doubtless in some situations they would have benefited from that expertise. But they did ultimately become very good in geometry anyway, and they learned into the bargain another lesson that I couldn’t have predicted or contrived, but that I wouldn’t trade for anything. We did, I think, eventually come up with a workable solution in each case, but the most important lesson Mary and David got from the experience as a whole was a lesson in gumption. They learned that it was possible to be stuck — not just for them to be stuck, but for us all to be stuck — and still not give up. I wasn’t holding the right answer in a sealed envelope or a crystal box, ready to produce it when I figured they’d evinced enough character or good will. They came to realize that it just wasn’t about them. It was about it — what we were trying to learn and figure out. There was an objective reality out there that was the implacable goal of our efforts. We could get ourselves to the finish line, or we could not; but the finish line wasn’t moving. It wouldn’t come to us, no matter what. It was what it was.

Gumption is a virtue that has largely gone out of fashion of late in educational circles. Concern for self-esteem has in some places eclipsed it, I think, but it’s a bad bargain. Gumption of the sort I’m talking about is rooted in a healthy regard for objective truth, and for the fact that the world as a whole really doesn’t care about our self-esteem. By the same token, real self-esteem comes from measuring oneself against that objective reality and doing something with it. The value of learning that lesson cannot be overestimated, and only a genuine appreciation and realization of what it means will turn the passive — perhaps even docile — student into a scholar in the more meaningful sense of the term. I don’t mean a professional academic, necessarily: out of our three kids, only one of them has gone on to pursue formal academics as a career path; many people in professional academics today, moreover, aren’t really scholars in the sense I’m talking about anyway. I mean something else — I’m talking about the cultivation of a bull-terrier mind that won’t take no for an answer or be deterred from finding out. I think all three of our kids got that.

That transformation is not, of course, instantaneous, and in our kids’ case it was not entirely a consequence of wrestling with a few geometry problems. But I think they helped crystallize the process, precisely because it was not a set-up thing, contrived as an object lesson, in which the answer would emerge after one had played the game for a certain number of hours, or had demonstrated a sufficient degree of effort or frustration. In the real world, the truth is not dispensed, like a treat tossed to a dog who has done a trick. It’s won through struggle. The problems may have been contrived, but our engagement with them was genuine. We realized soon enough that if we didn’t figure the problem out, we wouldn’t get the answer. We worked on these problems together, and we worked on them separately too. In retrospect, I think that the most important thing I was able to do for my kids in homeschooling them was to model my own real response to my own real ignorance.

In a world where there is so much to know, we are surrounded by nothing so much as our own ignorance. It’s with us all the time, and if we don’t confront it honestly, we’re certainly fairly far gone in a pattern of self-delusion. Having a sane attitude toward it, and a way of dealing with it, is essential to overcoming it. The victory over ignorance, however great or small, is never assured: it’s always at stake. Sometimes you just don’t get what you were striving for. There are things we still don’t know, that people have been trying to figure out for a long time. That’s okay. Victory over ignorance is not given as a reward for diligence, but it will seldom be won without hard work. Ultimately the cold fact is that each student must take responsibility for his or her own learning. Nobody else can carry that burden. Nobody — not a parent, not a teacher, not anyone — can learn for you, any more than someone else can eat, sleep, or perform any of the other basic functions of life in your place; neither can you win a race by proxy. The student who grasps that lesson, and is willing to embrace it, despite the lack of assurances, is the one that really stands to make something of education and of life.

Autonomy of Means revisited: the Internet

Saturday, February 19th, 2011

Last May I wrote a piece for this blog entitled “Autonomy of Means and Education”. The phrasing was drawn from Charles Williams’ “Bors to Elayne: on the King’s Coins”. I’ve recently had reason to revisit the question, from a different direction.

I’ve just finished reading Nicholas Carr’s The Shallows: What the Internet Is Doing to Our Brains. Some may consider it ironic that I discovered this book at the recommendation of some friends via Facebook: it is an extended (and not particularly optimistic) meditation on how the Internet is “rewiring” our minds — making quantifiable and physically measurable changes in our brains — by the kinds of information it delivers, and the way it delivers it.

Carr’s main point is fairly straightforward, and very hard to refute from common experience: he contends that the rapid-fire interruption-machine that the Internet offers us tends to fragment our attention, perpetually redirect us to the superficial, and prevent us from achieving any of the continuous long-term concentration from which emerge real ideas, serious discourse, and, in the long view, civilization itself. Not only is it not conducive to such thinking in and of itself — it actually suppresses our capacity for such thinking even when we’re away from our computers. Carr doesn’t point fingers or lay particularly onerous burdens of blame at anyone’s door, though one is moved to wonder cui bono? — to whom is all this a benefit, and where is the money coming from? There is a curious, unquestioned positivist philosophy driving companies like Google, one that sits uneasily with how I see myself in relation to my God and to the other people in his world.

Carr supports his case with a dazzling array of synthetic arguments ranging from the philosophical to the neuropsychological. He makes a very convincing case for the plasticity of the human brain, even into adulthood — and for the notion that those capacities that get exercise tend to be enhanced through measurable growth and synaptic enhancement of specific areas of the brain. All this can happen in a remarkably short time (mere days or even hours). My own field is rather far removed from psychology, but what he says rings true with me — my ability to do almost any kind of mental activity really does improve with practice. Unused abilities, by the same token, can atrophy. That this happens is probably not very surprising to any of us; what is surprising is its extent and the objectivity with which it can be measured. I was intrigued to learn, for example, that one can identify particular developments characteristic of the brains of taxi-drivers, and that discernible physical differences distinguish the brains of readers of Italian, for example, from readers of English. We tend to think of language as largely convertible from one to another; it’s not necessarily so. Whether this has some other implications about why one ought to learn Latin or Greek is intriguing to me, but not something I’m going to chase down here.

Carr’s thesis, if it’s true, has serious consequences for us at Scholars Online. It has implications for who we are and how we do what we are doing. As a teacher who has found his calling trying to teach people to read carefully and thoughtfully, analytically and critically, with concentration and focus — via the Internet — I naturally feel torn. I like to believe that the format in which I’m pursuing that work is not itself militating against its success. His book is at the very least a strong warning that we should examine how we work and why we do what we do the way we do it.

I do feel somewhat vindicated in the fact that we have never chosen to pursue each and every new technological gewgaw that came down the pike. Our own concern has always been for cautiously adopting appropriate technology. I still tend not to direct students to heavily linked hypertext documents (which, as Carr argues, provide vastly less benefit than they promise, with substantially lower retention than simple linear documents in prose); almost anything that requires the division or fragmentation of attention is an impediment to real learning. As I have said elsewhere in my discussions of the literature program, my main effort there has always been to teach students to read carefully and thoroughly — not just the mechanics of decoding text, but the skills of interpreting and understanding its meaning.

The book is not without a few technical flaws. Carr has either misread or misinterpreted some of the points in Paul Saenger’s Space Between Words: The Origins of Silent Reading. Many of his claims about Latin and the development of the manuscript are too facile, and some are simply incorrect. Saenger points out that in Classical Latin, word order makes relatively little syntactic difference. He’s using that distinction precisely. Carr apparently takes this to mean that, as a function of the way manuscripts were written and produced in late Antiquity and the early Middle Ages, there was less concern for discrete identification of word boundaries (likely to be true), and less concern for word order in a given text (completely preposterous). Yes, it’s true that Latin syntax does not rely as heavily as English does on word order; it’s not true that word order is without significance semantically. The fact that many of our survivals from ancient sources are poetic would clearly argue against this: if you rearrange the words in a line of Vergil, you will destroy the meter, if nothing else. Word order in poetry is essential for meter (something we can verify objectively); it’s also powerful poetically. Words echo each other only if they stand in a certain arrangement; a word left enjambed at the beginning of a new line can carry potent poetical effect.

Of Horace, Friedrich Nietzsche said:

Bis heute habe ich an keinem Dichter dasselbe artistische Entzücken gehabt, das mir von Anfang an eine Horazische Ode gab. In gewissen Sprachen ist Das, was hier erreicht ist, nicht einmal zu wollen. Dies Mosaik von Worten, wo jedes Wort als Klang, als Ort, als Begriff, nach rechts und links und über das Ganze hin seine Kraft ausströmt, dies minimum in Umfang und Zahl der Zeichen, dies damit erzielte maximum in der Energie der Zeichen – das Alles ist römisch und, wenn man mir glauben will, vornehm par excellence.
(Götzen-Dämmerung, “Was ich den Alten verdanke”, 1)

To this day, I have had from no other poet the same artistic pleasure that one of Horace’s Odes gave me from the beginning. In some languages, what Horace accomplished here could not even be hoped for. This mosaic of words, where each word — [understood] as sound, as place, and as idea — exerts its influence to the right and left and over the whole, this economy in the extent and number of the signs, through which those signs receive their greatest power — that is all Roman and, to my way of thinking, supremely noble.
(Twilight of the Idols, “What I owe to the Ancients”, 1. Tr. my own.)

Nietzsche was a very strange philosopher (if that’s even the right term to describe him); I don’t hold with many of his ideas. But he was actually a pretty astute reader of Horace.

Cicero’s orations — not poetry — were similarly characterized by prose rhythms and semantic subtleties that could not possibly have been preserved were the scribes or copyists indifferent to word order. Whether we’re dealing with poetry or prose, word order is ultimately no less important in Latin than in English. It just has a different importance. Don’t let anyone tell you otherwise.

Carr also routinely refers to Socrates as an orator, which is certainly not how Socrates viewed himself. He correctly notes that Socrates eschewed writing, partly because (as is discussed in the Phaedrus, one of the weirder Platonic dialogues) the old Egyptian king Thamus claimed that it tended to weaken the memory. This is true, but it’s only one of Socrates’ reasons. He also disdained writing and oratory both because they were one-way forms of communication. What he valued (as can be found throughout Plato’s dialogues) is the give-and-take of two-way conversation: in the Greek, διαλέγεσθαι (dialegesthai) — the root of our own “dialogue” and “dialectic”. He believed that this exchange was uniquely capable of allowing people to dig out the truth.

In the Apology (which I’m now reading with some terrific students in Greek III), Socrates specifically and fairly extensively begs to be excused from having to talk like an orator. This is how the dialogue begins:

How you, men of Athens, have been affected by my accusers, I do not know; but I, for my part, almost forgot my own identity, so persuasively did they talk; and yet there is hardly a word of truth in what they have said. But I was most amazed by one of the many lies that they told—when they said that you must be on your guard not to be deceived by me, because I was a clever speaker. For I thought it the most shameless part of their conduct that they are not ashamed because they will immediately be convicted by me of falsehood by the evidence of fact, when I show myself to be not in the least a clever speaker, unless indeed they call him a clever speaker who speaks the truth; for if this is what they mean, I would agree that I am an orator—not after their fashion. Now they, as I say, have said little or nothing true; but you shall hear from me nothing but the truth. Not, however, men of Athens, speeches finely tricked out with words and phrases, as theirs are, nor carefully arranged, but you will hear things said at random with the words that happen to occur to me. For I trust that what I say is just; and let none of you expect anything else. For surely it would not be fitting for one of my age to come before you like a youngster making up speeches. And, men of Athens, I urgently beg and beseech you if you hear me making my defence with the same words with which I have been accustomed to speak both in the market place at the bankers tables, where many of you have heard me, and elsewhere, not to be surprised or to make a disturbance on this account. For the fact is that this is the first time I have come before the court, although I am seventy years old; I am therefore an utter foreigner to the manner of speech here. 
Hence, just as you would, of course, if I were really a foreigner, pardon me if I spoke in that dialect and that manner in which I had been brought up, so now I make this request of you, a fair one, as it seems to me, that you disregard the manner of my speech—for perhaps it might be worse and perhaps better—and observe and pay attention merely to this, whether what I say is just or not; for that is the virtue of a judge, and an orator’s virtue is to speak the truth.
(Plat. Apol., 17a-18a, tr. Harold North Fowler).

One of the things that struck me while I was reading the latter stretches of this book was the subject I raised last May: when a tool — any tool — becomes autonomous, we’re heading for trouble with it. We pour much of who and what we are into our tools, and the making of tools is apparently very much a part of our nature as human beings. We are homo faber — man the maker — as much as we are homo sapiens. That is, as I take it, a good thing. With our tools we have been able to do many things that are worth doing, and that could not have been done otherwise. But we must always hold our tools accountable to our higher purposes. The mere fact that one can do something with a given tool does not mean that it’s a good thing. They say the man with a hammer sees every problem as a nail. That adage still holds good. We can be empowered by our tools, but every one comes at a cost — a cost to us in terms of who we are and how we work, and what ends our work ultimately serves. There is some power in choosing not to use certain tools on certain occasions.

Autonomy of means and education

Monday, May 31st, 2010

Though not as well known as his friends J.R.R. Tolkien and C.S. Lewis, Charles Williams (1886–1945) was nevertheless an active member of the Inklings throughout most of that group’s lifetime, and displayed a powerful, if somewhat eccentric, spiritual insight. He wrote seven strange metaphysical novels that have never quite caught the imagination of mainstream readers, but which have had a fervent following among a few; he also wrote a number of plays and various works of literary analysis, and The Descent of the Dove, a history of the Holy Spirit in the church. It would be hard to imagine a more daring enterprise.

He also wrote two slim volumes of poetry. His poetic style is odd, his imagery occasionally encumbered with a kind of private symbolic vocabulary that defies casual analysis, and his points are frequently highly abstract and obscure. For all that, I personally think that these two books — Taliessin through Logres (1938) and The Region of the Summer Stars (1944) — are the pinnacle of his creative achievement. He was admired by such luminaries as W.H. Auden, who wrote a kind of homage to him on his death. But Williams’ unique power, I think, comes largely from his capacity to articulate transcendent truths that slice through every aspect of life — often drawing steely, almost brutally realistic distinctions that are nevertheless rooted in the love of Christ.

Partway through the first of those volumes is a poem entitled “Bors to Elayne: on the King’s Coins”. It is about the introduction of a money economy into an abstracted kind of Arthurian Britain (which he refers to by its older name “Logres”). From the middle of that poem comes the following passage (the dragons are the images stamped on the coins):

They laid the coins before the council.
Kay, the king’s steward, wise in economics, said:
“Good; these cover the years and the miles
and talk one style’s dialects to London and Omsk.
Traffic can hold now and treasure be held,
streams are bridged and mountains of ridged space
tunnelled; gold dances deftly across frontiers.
The poor have choice of purchase, the rich of rents,
and events move now in a smoother control
than the swords of lords or the orisons of nuns.
Money is the medium of exchange.”

Taliessin’s look darkened; his hand shook
while he touched the dragons; he said, “We had a good thought.
Sir, if you made verse you would doubt symbols.
I am afraid of the little loosed dragons.
When the means are autonomous, they are deadly; when words
escape from verse they hurry to rape souls;
when sensation slips from intellect, expect the tyrant;
the brood of carriers levels the good they carry.
We have taught our images to be free; are we glad?
are we glad to have brought convenient heresy to Logres?”

Ever since I first encountered these words more than thirty years ago, they have resonated with me — and in particular the line, “When the means are autonomous, they are deadly.” In almost every aspect of life today, we can see evidence of its truth.

It’s as true in economics, I think, as it ever was — as Williams first envisioned it. A preoccupation with money rather than actual goods and services — price as opposed to value — enables the twin banes of inflation and depression that have become all too familiar to us in recent years; it allows manipulation of currency as a tertium quid, essentially sundered from the goods and services themselves and from the human beings to whom they are meaningful or essential. In politics more broadly, I think, we daily see examples of means — offices, commissions, departments, or even whole governments, set up for noble reasons — that have, over time, become ends in themselves. They now exist less to advance the causes for which they were founded than to perpetuate themselves and to aggrandize their own power. One could make a similar argument for many unions, charitable organizations, political parties, businesses, or even schools: in short, for almost any of the human institutions that crowd and confuse our frail fallen world. The underlying pattern is the same. Things created to be means have become autonomous — ends in themselves, answerable to nobody.

I don’t want to become mired in the bog of elaborating on this politically: I have my own opinions, and so, probably, do you. Whatever your beliefs, there are probably a number of places where you can easily produce a ringing denunciation of these means-turned-ends. Your list might not be the same as mine, but there would probably be some overlap. In many cases it’s pretty clear that not only are these entities, whatever they are, no longer serving the good at which they originally aimed, but that they are actually subverting it. They stand in a kind of rebellion from their initial purposes. When it happens, we wind up spiraling downward into a kind of idolatrous service of the means rather than the end.

But I do think it’s worth looking at how this phenomenon intersects with our common goals here of enabling and supporting classical Christian education. Educational institutions, practices, and procedures are not exempt from this broad tendency, which is, after all, a reflection of our nature as fallen beings. Herewith are a handful of reflections on how that concerns us here and now.

Perhaps the most obvious case in point is the matter of grades. Grades are, like money, a medium of exchange. That’s all. They are only a medium, however, and of no intrinsic value. They presumably enable us to compare this student with that one and to come up with a kind of relative determination of their achievement, worth, and so on. From a Christian point of view, of course, that’s rather grotesquely misguided: that anyone could presume to evaluate another person’s worth in an absolute sense, when Christ died for each of us equally and entirely, is preposterous: but we may do it all the same, while masking the reality with comfortable rhetoric. It’s not about the students’ value, we say, but about their achievement. Fair enough: but people still tend to use the term as if it were evaluative of the person. Moreover, what we don’t admit nearly as freely as we should is that the grades don’t reflect the students’ achievement or learning in more than a superficial way, either.

Sooner or later — usually sooner — this dichotomy will drive us to a parting of the ways. I have had parents withdraw students from my classes on the grounds that, though (they admitted) I was providing their children with a better educational experience, in which they were learning more and understanding more deeply, they were confident that their children would get better grades from someone less rigorous. That’s probably true. I should say that I have also had parents tell me that they valued the substance of what we were delivering over the easy grade.

A grade for a course is only a way — a very reductive way — of measuring, quantifying, and talking about achievement. It is not, however, the achievement itself. It is a purely derivative good, and entirely without value on its own. Worrying about the grade in preference to worrying about the education that it supposedly represents is a bizarre substitution of the sign for the signified. It makes about as much sense as going to a restaurant on the grounds that, though the food is inferior, the menu seems better — or as the reasoning of the man who convinced himself he was losing weight by redefining the pound to be twenty ounces.

At first blush, this seems comical, but self-deception is always, in the long run, a grave matter, and contains the seeds of tragedy, in both earthly and spiritual terms. It eventually leads us to a kind of idolatry of the signifier, while disregarding the thing signified. It propagates up and down the whole hierarchy of being and of our experience, and eventually will — as it must — taint our relationship with God.

A similar phenomenon is the frenzy of attention attaching to Advanced Placement (AP) courses. Someone, somewhere, has been telling (especially homeschooled) students and their parents that they really need AP credits by the truckload to be in contention for admission to any kind of good college. U.S. News & World Report rates high schools on the basis of how many AP courses they offer; certainly the College Board itself is not going to play down the importance of a multi-million-dollar industry that is making it (another now-autonomous means to an end) more powerful every year. This is further heightened by the fact that many schools compute the grade point average (GPA) in such a way that a B in an AP class is equivalent to an A in anything else; an A in an AP class gets one a 5.0 on a four-point scale. It’s insurance against the GPA ever dipping below 4.0. One bogus marker becomes convertible with another. None of them any longer has much to do with learning.
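The weighting arithmetic behind that inflation is simple enough to sketch. The following toy Python function is purely illustrative: the flat +1.0 “AP bump” and the letter-grade scale reflect the common convention described above, not any particular school’s official policy.

```python
# Illustrative only: a common (not universal) weighted-GPA convention.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def weighted_gpa(courses):
    """Average grade points for a list of (letter_grade, is_ap) pairs."""
    total = 0.0
    for grade, is_ap in courses:
        points = GRADE_POINTS[grade]
        if is_ap and grade != "F":
            points += 1.0  # the "AP bump": a B counts like an A; an A becomes 5.0
        total += points
    return total / len(courses)

# A straight-A student taking two AP courses out of four:
transcript = [("A", True), ("A", True), ("A", False), ("A", False)]
print(weighted_gpa(transcript))  # 4.5
```

On this scheme a straight-A student never drops below 4.0, and any AP course pushes the average above it; that is precisely the insurance effect in question.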

In the increasingly frenetic pursuit of these brass rings, though, fewer and fewer seem to be stopping to consider that they really are just brass. Who is fooling whom here? One of the purposes of education, it seems to me, should have to do with cultivating the ability to distinguish the genuine article from the dross.

We’re trying to do that at Scholars Online (doubtless with limited success, but we’re trying). We offer grades because people demand them, but I confess I remain uneasy about the whole process. I’d much rather graduate class after class of people who were so excellent that no grade other than an A would be appropriate, but at that point it would lose its comparative punch. Similarly, we offer some AP courses because people want them, and because we’ve concluded that the curricula have been established on pretty solid grounds. In other cases, we’ve made the decision not to pursue AP status because the AP curriculum definition either seems intractable or pedagogically unsound, or would in effect entail a dumbing-down of what we’re doing. A majority of the students in my Senior English class go on to take the AP exam in Literature and Composition, and they normally do quite well. But it’s not required, and I don’t bill it as an AP course. After a few passes through the College Board’s review process, I determined that in order to meet their criteria, I would have to remove a good deal of the substance of the course to make room for extensive rewriting exercises that are not, to my way of thinking, the best way of spending our limited time. One can agree with that decision or disagree with it: I respect that. But that disagreement should be about the substance of the educational experience, not about whether the letters “AP” appear on a transcript — a marker with no real pedagogical value of its own.

If classical education is worth anything, it is about seeing past the superficial to the essential. Ideally it’s taking a stand against a culture of superficiality. The value of any part of your education is not, contrary to popular opinion, in its ability to lever you into a position to get more of it somewhere else, or even a job down the line. If it has no intrinsic value, scrap it. If it has that value, grab onto it and hold on tight.

As a Christian, I believe that education is for us ultimately a matter of helping us fulfill our real life goal — in Greek philosophical vocabulary, our telos — as created beings, which is to serve and to glorify God. It is to enable us to grow more fully into that personhood for which he created and redeemed us. It’s not just to get a good job, it’s not just to get more schooling.

Williams introduces Taliessin through Logres with an epigraph from Dante’s Latin treatise entitled De monarchia: Unde est, quod non operatio propria propter essentiam, sed haec propter illam habet ut sit. Translated a mite loosely, that is: “Therefore it is that the proper function [of any given thing] does not get its reason for being from its essence, but the latter from the former.” It’s a demanding, humbling perception that gets tougher and chewier the longer you think about it. But I think it’s entirely correct.

I will try to follow up on this theme more in particular in the coming weeks.