Archive for the ‘Character formation’ Category

Getting Started Asking the Right Questions

Wednesday, April 13th, 2016

Recently on a homework assignment for my Natural Science course, I asked students to identify which solar system planets it would be possible to explore with Earth-based telescopes, which with space-borne but Earth-orbiting telescopes (like the Hubble), and which would require space probes sent to the planet. The results were fairly telling about the approaches students take to open-ended questions. Several students left the question blank, because (as they explained in emails and chat) they “couldn’t find the answer” in the assigned reading. Most students explained in very generic terms that while Venus and Mars (closest to Earth) could be viewed with Earth-based or Earth-orbiting telescopes, more distant objects would require probes. None addressed the distinction between Earth-based and Earth-orbiting telescopes implied by listing the two options, and only one considered planetary characteristics other than distance, identifying Venus as a candidate for a probe because its dense atmosphere blocks any attempt to view its surface with Earth-bound telescopes.

Although I had warned the students repeatedly that some of their homework questions would require them to think through the answer and not simply look it up, at least some of my students felt this was unfair: they couldn’t be held responsible for what they could not look up and quote from the assigned reading. Most of my students were game to tackle the question even without finding the words “we can use telescopes to look at Mars but need probes for Pluto” in their reading, but they still did not study the question carefully enough to realize that they needed to weigh the advantages and disadvantages of each option, and to consider the individual planet itself, as criteria for determining the best method of observation. They immediately seized on finding a single uniform answer, which would let them complete the assignment quickly, without worrying too much about whether the situation was more complicated.

From the perspective of classical education, this is backward, because classical education, and in particular Christian classical education, is not fundamentally about finding right answers to practical questions. Its goal is to develop the skills required by a free citizen who would be responsible for discerning God’s will, then making decisions for himself and for the state. It recognizes that there is no set formula for “getting the right answer” to the questions of real life; the most important first step is to make sure we can even formulate or recognize the important questions. Then we can look at how these questions (or ones like them) have been answered before, in literature and in history. Although Greek philosophy in particular developed rigorous methods of rational analysis, classical education still depends on story to convey the complexity of real decisions in real circumstances. Greek drama and history constantly present examples in which individuals must decide among personal integrity, duty to the gods, and duty to the state. Latin literature is full of examples of individuals who sacrifice themselves to fulfill their civic duty. The Greeks and Romans of the classical period studied their own history, read their own literature, and produced often conflicting philosophical reflections because they realized that determining the right questions to ask was a citizen’s responsibility, and it was no use seeking answers until one had the right questions.

St. Clement of Alexandria, in his Stromata, calls rational philosophy God’s divine gift to the Greeks, identifying direct revelation as God’s gift to the Jews. The early Christian Fathers, like Clement, who found inspiration in and supported the study of pagan classical literature recognized the development of reason and logic as an important skill in witnessing to and defending the Christian faith — and for distinguishing important questions that were worth pursuing from questions that were merely divisive and distracting. For nearly twenty centuries, western civilization depended on these stories and philosophies to help students form the values that would let them determine the wisest course in a difficult situation, both for themselves and for those they were expected to govern: personal integrity, loyalty, charity, civic duty, duty to God. The questions such an education forces us to ask are the most important questions of our lives: who am I? what do I want? what is God calling me to do?

If classical Christian education is about questions, then, how do we get students to learn to formulate or recognize the questions they need to ask? And equally important, how do we get them to develop passion for their studies and the courage to overcome the sense that asking questions is somehow an admission of failure to study correctly?

I believe that we need to make asking questions the most important job a student has to do: not completing the homework, not skimming the reading, but thinking deeply about the ideas they encounter and formulating questions about them. Every student (and teacher) needs to come to class full of questions. We meet in discussion to bring up these questions and to hear each other’s questions. As a teacher, I usually have a stock set of questions for a given chat to get the ball rolling, but these are based on my own experience and my own values, and they are essentially “plan B” material for the chat. The most successful discussions occur when the students raise questions about their own understanding, based on their own concerns. When they voice those questions and we explore possible answers, we all find new ways of thinking about the material, and new insight into the questions that pester us.

While ultimately we need to address the important questions of our lives, we have to start somewhere, so here are some of the questions even novice students ought to be asking themselves constantly as they read both factual material and literature. Most important, to put this into practice, students should be writing down notes to bring up in class if they are puzzled or don’t have answers, or want to test their own assumptions:

Do I know what all the words mean? Does a term (even if I know its dictionary definition) seem vague or misapplied to the topic? Do all the parts of this graph or illustration make sense and can I see how they are related?

Can I follow all the steps of an example? Do I know what all the assumptions are, and how they are justified? Can I follow the calculation or reasons given for making a conclusion?

Is the author making an argument for a general conclusion or interpretation? Is the author’s claim valid? Do I understand the evidence used to support it? Does it apply to the cases given? Is it too general? too narrow? Can I think of any contradictory examples?

If several things are described, what characteristics, values, or processes are used to distinguish them? Do I understand how these distinctions are made, and can I use these methods myself to make distinctions among similar things?

If a process is described, can I follow the process? If not, why not — where do I get confused? If I do understand the process, do I accept it as valid, or do I think it skips important steps or considerations?

What does the author think are the most important points to make or take away from his or her presentation? What criteria does he or she use to make this evaluation (and do I agree)?  Can I make an outline of the important points and identify the supporting details? Does the author skip points I think are important? Can I figure out why — what values or assumptions am I making differently from those the author is making?

If I am translating a sentence or a paragraph, does my English translation make sense in English? Does it reflect something an intelligent person would say, or is it gibberish? Do I understand what each word means and how it functions in the sentence?

These questions force us to be honest with ourselves in two ways. One is making sure that we actually engage with the materials, not just pass our eyes over the reading, and that we seriously attempt to comprehend the knowledge presented. The second is making sure that we are also applying our own developing system of values to what we are learning, that we are not simply agreeing blindly with what is presented, but trying to develop ways to determine for ourselves what is right, pure, lovely, admirable, and worthy of praise.

Failure as a good thing

Friday, March 11th, 2016

People tout many different goals in the educational enterprise, but not all goals are created equal. They require a good deal of sifting, and some should be discarded. Many of them seem to be either obvious on the one hand or, on the other, completely wrong-headed (to my way of thinking, at least).

One of the most improbable goals one could posit, however, would be failure. Yet failure — not as an end (and hence not a final goal), but as an essential and salutary means to achieving a real education — is the subject of Jessica Lahey’s The Gift of Failure (New York: HarperCollins, 2015). In all fairness, I guess I was predisposed to like what she had to say, since she’s a teacher of both English and Latin, but I genuinely think that it is one of the more trenchant critiques I have read of modern pedagogy and the child-rearing approaches that have helped shape it, sometimes with the complicity of teachers, and sometimes in spite of their best efforts.

Christe first drew my attention to an excerpt from her book at The Atlantic here. When we conferred after reading it, we discovered that we’d both been sufficiently impressed that we’d each ordered a copy of the book.

Lahey calls into question, first and foremost, the notion that the student (whether younger or older) really needs to feel that he or she is doing well at all stages of the process. Feeling good about your achievement, whether or not it really amounts to anything, is not in fact a particularly useful thing. That seems common-sensical to me, but it has for some time gone against the grain of a good deal of teaching theory. Instead, Lahey argues, failing — and in the process learning to get up again, and throw oneself back into the task at hand — is not only beneficial to a student, but essential to the formation of any kind of adult autonomy. Insofar as education is not merely about achieving a certain number of grades and scores, but about the actual formation of character, this is (I think) spot-on.

A good deal of her discussion centers on the sharply diminishing value of any system of extrinsic reward — that is, anything attached secondarily to the process of learning — be it grades on a paper or a report card, a monetary payoff from parents for good grades, or the often illusory goal of getting into a good college. The only real reward for learning something, she insists, is knowing it. She has articulated better than I have a number of things I’ve tried to express before. (On the notion that the reason to learn Latin and Greek was not as a stepping-stone to something else, but really to know Latin and Greek, see here and here. On allowing the student freedom to fail, see here. On grades, see here.) Education should be — and arguably can only be — about learning, not about grades, and about mastery, not about serving time, passing tests so that one can be certified or bumped along to something else. In meticulous detail, Lahey documents the uselessness of extrinsic rewards at almost every level — not merely because they fail to achieve the desired result, but because they drag the student away from engagement in learning, dull the mind and sensitivity, and effectively promote the ongoing infantilization of our adolescents — making sure that they are never directly exposed to the real and natural consequences of either their successes or their failures. Put differently, unless you can fail, you can’t really succeed either.

Rather than merely being content to denounce the inadequacies of modern pedagogy, Ms. Lahey has concrete suggestions for how to turn things around. She honestly reports how she has had to do so herself in her ways of dealing with her own children. The book is graciously honest, and I enthusiastically recommend it to parents and teachers at every level. If I haven’t convinced you this far, though, at least read the excerpt linked above. The kind of learning she’s talking about — engaged learning tied to a real love of learning, coupled with the humility to take the occasional setback not as an invalidation of oneself but as a challenge to grow into something tougher — is precisely what we’re hoping to cultivate at Scholars Online. If that’s what you’re looking for, I hope we can provide it.

Common Ground: A Lenten Meditation

Thursday, February 25th, 2016

My heart is broken these days as I read the political protestations by the candidates for president. The conversation seems to have descended to the level of a cockfight, with each side crowing over scoring a hit, rather than rising to a thoughtful discussion of the need to supply basic health care services and pay for them responsibly, the need to supply national security without destroying personal security, the need to help those who will responsibly use that help without wasting resources on those who willfully squander them — discussions that, if the proponents weren’t so concerned with winning, might actually provoke the creativity necessary to craft viable solutions. Conversations over religious issues often seem more about scoring points by citing the most proof texts than about seeking guidance from the Holy Spirit to discern how we Christians may help each other fulfill our baptismal vows to love our neighbors — all our neighbors — without violating our consciences. Even the debate over evolution and creation is often reduced to quips and quotes of various authors that promote neither good science nor good theology, but that do sell books.

What we have forgotten, and what we desperately need to remember, is how to become reconciled, one to the other. It’s a fitting topic for Lent, when Christians reflect on the great price God paid so that we might be reconciled to Him.

Americans seem particularly bad at reconciliation. We are great at competition, but we don’t do forgiveness well. We aspire to reconciliation from time to time — we have the incredible image of the whole of Congress standing together on the steps of the Capitol after 9/11, singing “God Bless America”. Unfortunately that dream of common cause faded all too quickly with squabbles over the nature of the threats and best ways to meet them, and there was no underlying sense of real unity to carry us through the practical realities of the ensuing politics and economics. The sense of disunity has grown over the last decade to real divisions that our leaders — political, religious, and academic — seek to exploit to their own advantage, rather than amend for everyone’s advantage.

I don’t know what the answer is. I can only offer, by way of meditation, three images that continue to haunt me: an archway at Magdalen College in Oxford, a statue and a plaque in a side chapel at the Cathedral in Rouen, and a pair of Westminster Abbey tombs.

We went to England in the summer of 1986. We were grad students with little money, but we had come by a slight windfall, we had some vacation time coming, and we had friends in England we could stay with at least part of our trip. So we packed up our 9-year-old, 3-year-old, and 15-month-old, and went to find the England of J. R. R. Tolkien and C. S. Lewis and Dorothy Sayers, Arthur Ransome, Dick Whittington, Henry II, T. H. White, Isaac Newton, and Edmund Halley.

Lewis’s England lies largely in Oxford, where he taught at Magdalen, so we boarded the train to Oxford and the hallowed groves of academe, where teachers have taught and students have studied and dreamed for a thousand years. Wandering through those grounds that were open to the public, we came upon names carved into the wall of an archway between a great square and a cloistered walk. It was a memorial to the members of the College who had fallen in the Great War of 1914 to 1918.

You find these memorials in every village in England, sometimes on the walls of a municipal building, sometimes on columns in the center of the village square, sometimes on plaques flat in the ground of the churchyard, among the gravestones of those who made it home. The lists are long; casualties were high, and some villages lost half of their male population in the trenches. Standing in the shadowed archway at Magdalen, we paused to read through the names, and realized with awe that they were divided into two groups: those who died in the service of George V of England, and those who had died in the service of Wilhelm, Emperor of Germany.

For you see, Oxford colleges have this odd notion that once you become a member of college, you remain a member of college. You can return and read books in the library, be seated at the college dining table, wear the college robes, attend the college colloquia, and when you die, be memorialized on the college walls — even when you die in the service of a political enemy. Governments rise and fall with the actions of charismatic leaders; industry bends to pragmatic ends; fashions will alter, economies will render the rich poor and the poor rich, but the underlying purpose of academics is to seek the truth, and at Magdalen, that shared journey creates a community that cannot be easily severed, even by war.

Fast-forward twenty-three years to a different trip, and a different country.

There was a decade in the nineteenth century when the Cathedral at Rouen was the tallest building in the world, its steeple rising nearly five hundred feet above the placid waters of the Seine, which meanders through the fields and orchards of Normandy on its way from Paris to the English Channel. Bombed and broken on D-Day, the cathedral nave has been repaired and remains breathtakingly impressive. Beneath the lacy stone and stained glass lie the tombs of Rollo the Northman and a shrine containing the heart of Richard the Lionheart, who was at once king of England and Duke of Normandy. And, not surprisingly, since just about every church in Normandy has some memorial to Jeanne d’Arc, there is a chapel in the north transept with a modern statue commemorating her martyrdom. She is in chains, her expression calmly resigned, while stone flames lick at her stone gown.

Rouen is the end of Jeanne’s story: here she was held in a fat tower that still stands near the train station, tried by an illegal court, and burned to death in the town square as a witch at the hands of the English, who had determined that her uncanny ability to beat their well-trained armies with smaller forces must lie in a pact with the devil. One might well think the Normans should have little love for the English: Jeanne’s martyrdom marked a turning point in a century of war between France and England that left northern France devastated and economically impoverished for yet another century after hostilities ceased.

So it is with a bit of shock that the Cathedral visitor reads the plaque under the double gothic arches just behind the chapel altar: in French and English, it proclaims “To the Glory of God and to the memory of the one million dead of the British Empire who fell in the Great War 1914-1918, and of whom the greater part rest in France.” In the chapel, in the crimson and sapphire and emerald light from the restored windows above, you suddenly realize that you are in a holy space, one that can abide the fundamental tension of human relationships. The English killed Jeanne, savior of France; the English died in defense of France, and became themselves saviors of France.  The English are — however long ago driven back and exiled to their island, and however badly some of them have behaved — still part of Normandy, and they will still come, if need be, to defend it.

Across the channel, in the center of London, on the banks of the Thames, lies Westminster Abbey. Here the English buried their poets and painters, dreamers and scientists, princes and kings: Geoffrey Chaucer and Lewis Carroll, Isaac Newton and Robert Boyle, Charles and James and Anne Stuart.

In the north aisle of the Lady Chapel are two tombs, one stacked above the other. In the lower one lies Mary Tudor, most Catholic queen of England, who held her sister Elizabeth in house arrest to prevent a civil war, whose courts ordered Hugh Latimer and Nicholas Ridley burned at the stake on Broad Street in Oxford for their defense of a Protestant faith, and who sought all her life to serve God by bringing Him a Catholic kingdom in communion with Rome. In the coffin above her lies that same Elizabeth, most Protestant queen of England, who in her turn held her cousin Mary, Queen of Scots, under house arrest to prevent a civil war, who ordered the executions of Mary and Robert Devereux for political plots, and who sought all her life to serve God by avoiding the religious wars of the Continent and creating a peaceable England. When we visited the chapel in 1986, the plaque on the floor read: “Those whom the Reformation divided, the Resurrection will reunite, who died for Christ and conscience’ sake”. Standing among the royal tombs beneath the stone arches of the chapel, confronted by two sisters at deadly odds with one another, one realizes with some shock that what remains four hundred years later is the conviction that reconciliation is not merely possible: it is inevitable where there is a fundamental common goal to serve God.

The issues that divide us as Americans and as Christians are real; they are complex, and they challenge us to be the best we can be to address them. As individuals, we have limited resources of time and money and talent and intellect and emotional stamina. We cannot resolve complex problems by isolating ourselves from each other.

One of our visions in founding Scholars Online was to create a community where Christians of different backgrounds could study together, and we welcome anyone, regardless of their religious background, who shares our conviction that education requires not merely mastery of subject matter and the development of close reading and critical thinking skills, but also the formation of character that seeks to deal charitably and honestly with others. We do not require a statement of faith from our students. Our faculty comes from diverse traditions, and we all view our call to teach as a ministry. We — students, teachers, parents — do not always agree on issues, but we hold ourselves together in community, not only to serve the cause of classical Christian education, but also to serve as a model of community built out of diversity.

We start by standing together on the common ground of God’s eternal love for each of us and all of us.

[Part of this meditation appeared in a 2009 entry on the All Saints Episcopal Church (Bellevue) blog.]

STEMs and Roots

Tuesday, February 2nd, 2016

Everywhere we see extravagant public handwringing about education. Something is not working. The economy seems to be the symptom that garners the most attention, and there are people across the political spectrum who want to fix it directly; but most seem to agree that education is at least an important piece of the solution. We must produce competitive workers for the twenty-first century, proclaim the banners and headlines; if we do not, the United States will become a third-world nation. We need to get education on the fast track — education that is edgy, aggressive, and technologically savvy. Whatever else it is, it must be up to date, it must be fast, and it must be modern. It must not be what we have been doing.

I’m a Latin teacher. If I were a standup comedian, that would be considered a punch line. In addition to Latin, I teach literature — much of it hundreds of years old. I ask students, improbably, to see it for what it is in itself, not just for what they can use it for themselves. What’s the point of that? one might ask. Things need to be made relevant to them, not the other way around, don’t they?

Being a Latin teacher, however (among other things), I have gone for a number of years now to the Summer Institute of the American Classical League, made up largely of Latin teachers across the country. One might expect them to be stubbornly resistant to these concerns — or perhaps blandly oblivious. That’s far from the case. Every year, in between the discussions of Latin and Greek literature and history, there are far more devoted to pedagogy: how to make Latin relevant to the needs of the twenty-first century, how to advance the goals of STEM education using classical languages, and how to utilize the available technology in the latest and greatest ways. What that technology does or does not do is of some interest, but the most important thing for many there is that it be new and catchy and up to date. Only that way can we hope to engage our ever-so-modern students.

The accrediting body that reviewed our curricular offerings at Scholars Online supplies a torrent of exhortation about preparing our students for twenty-first century jobs by providing them with the latest skills. It’s obvious enough that the ones they have now aren’t doing the trick, since so many people are out of work, and so many of those who are employed seem to be in dead-end positions. The way out of our social and cultural morass lies, we are told, in a focus on the STEM subjects: Science, Technology, Engineering, and Math. Providing students with job skills is the main business of education. They need to be made employable. They need to be able to become wealthy, because that’s how our society understands, recognizes, and rewards worth. We pay lip service, but little else, to other standards of value.

The Sarah D. Barder Fellowship organization to which I also belong is a branch of the Johns Hopkins University Center for Talented Youth. It’s devoted to gifted and highly gifted education. At their annual conference they continue to push for skills, chiefly in the scientific and technical areas, to make our students competitive in the emergent job market. The highly gifted ought to be highly employable and hence earn high incomes. That’s what it means, isn’t it?

The politicians of both parties have contrived to disagree about almost everything, but they seem to agree about this. In January of 2014, President Barack Obama commented, “…I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree. Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”

From the other side of the aisle, Florida Governor Rick Scott said, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”

They’re both, of course, right. The problem isn’t that they have come up with the wrong answer. It isn’t even that they’re asking the wrong question. It’s that they’re asking only one of several relevant questions. They have drawn entirely correct conclusions from their premises. A well-trained plumber with a twelfth-grade education (or less) can make more money than I ever will as a Ph.D. That has been obvious for some time now. If I needed any reminding, the last time we required plumbers’ services, the point was amply reinforced: the two of them walked away in a day with about what I make in a month. It’s true, too, that a supply of anthropologists is not, on the face of things, serving the “compelling interests” of the state of Florida (or any other state, probably). In all fairness, President Obama said that he wasn’t talking about the value of art history as such, but merely its value in the job market. All the same, that he was dealing with the job market as the chief index of an education’s value is symptomatic of our culture’s expectations about education and its understanding of what it’s for.

The politicians haven’t created the problem; but they have bought, and are now helping to articulate further, the prevalent assessment of what ends are worth pursuing, and, by sheer repetition and emphasis, crowding the others out. I’m not at all against STEM subjects, nor am I against technologically competent workers. I use and enjoy technology. I am not intimidated by it. I teach online. I’ve been using the Internet for twenty-odd years. I buy a fantastic range of products online. I programmed the chat software I use to teach Latin and Greek, using PHP, JavaScript, and MySQL. I’m a registered Apple Developer. I think every literate person should know not only some Latin and Greek, but also some algebra and geometry. I even think, when going through Thucydides’ description of how the Plataeans determined the height of the wall the Peloponnesians had built around their city, “This would be so much easier if they just applied a little trigonometry.” Everyone should know how to program a computer. Those are all good things, and help us understand the world we’re living in, whether we use them for work or not.
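To make that aside concrete, here is the calculation the Plataeans could have used, as a minimal sketch in JavaScript (my own toy example with invented numbers, not anything drawn from Thucydides): given the distance to the wall and the measured angle of elevation to its top, the height follows from a single tangent.

```javascript
// Estimate a wall's height from a known ground distance and the angle of
// elevation to its top. All figures are illustrative, not historical.
function wallHeight(distance, elevationDegrees, observerEyeHeight = 1.6) {
  const radians = elevationDegrees * (Math.PI / 180);
  // Height above eye level is distance * tan(angle); add eye height back in.
  return observerEyeHeight + distance * Math.tan(radians);
}

// A wall sighted from 50 meters away at a 15-degree elevation:
console.log(wallHeight(50, 15).toFixed(1)); // ≈ 15.0 meters
```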

But they are not all that we need to know. So before you quietly determine that what I’m offering is just irrelevant, allow me to bring some news from the past. If that sounds contradictory, bear in mind that it’s really the only kind of news there is. All we know about anything at all, we know from the past, whether recent or distant. Everything in the paper or on the radio news is already in the past. Every idea we have has been formulated based on already-accumulated evidence and already-completed ratiocination. We may think we are looking at the future, but we aren’t: we’re at most observing the trends of the recent past and hypothesizing about what the future will be like. What I have to say is news, not because it’s about late-breaking happenings, but because it seems not to be widely known. The unsettling truth is that if we understood the past better and more deeply, we might be less sanguine about trusting the apparent trends of a year or even a decade as predictors of the future. They do not define our course into the infinite future, or even necessarily the short term — be they about job creation, technical developments, or weather patterns. We are no more able to envision the global culture and economy of 2050 than the independent bookseller in 1980 could have predicted that a company named Amazon would put him out of business by 2015.

So here’s my news: if the United States becomes a third-world nation (a distinct possibility), it will not be because of a failure in our technology, or even in our technological education. It will be because, in our headlong pursuit of what glitters, we have forgotten how to differentiate value from price: we have forgotten how to be a free people. Citizenship — not merely in terms of law and government, but the whole spectrum of activities involved in evaluating and making decisions about what kind of people to be, collectively and individually — is not a STEM subject. Our ability to articulate and grasp values, and to make reasoned and well-informed decisions at the polls, in the workplace, and in our families, cannot be transmitted by a simple, repeatable process. Nor can achievement in citizenship be assessed simply, or, in the short term, accurately at all. The successes and failures of the polity as a whole, and of the citizens individually, will remain for the next generation to identify and evaluate — if we have left them tools equal to the task. Our human achievement cannot be measured by lines of code, by units of product off the assembly line, or by GNP. Our competence in the business of being human cannot be certified like competence in Java or Oracle (or, for that matter, plumbing). Even a success does not necessarily hold out much prospect of employment or material advantage, because that was never what it was about in the first place. It offers only the elusive hope that we will have spent our stock of days with meaning — measured not by our net worth when we die, but by what we have contributed when we’re alive. The questions we encounter in this arena are not new ones, but rather old ones. If we lose sight of them, however, we will have left every child behind, for technocracy can offer nothing to redirect our attention to what matters.

Is learning this material of compelling interest to the state? That depends on what you think the state is. The state as a bureaucratic organism is capable of getting along just fine with drones that don’t ask any inconvenient questions. We’re already well on the way to achieving that kind of state. Noam Chomsky, ever a firebrand and not a man with whom I invariably agree, trenchantly pointed out, “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” He’s right. If we are to become unfree people, it will be because we gave our freedom away in exchange for material security or some other ephemeral reward — an illusion of safety and welfare, and those same jobs that President Obama and Governor Scott have tacitly accepted as the chief — or perhaps the only — real objects of our educational system. Whatever lies outside that narrow band of approved material is an object of ridicule.

If the state is the people who make it up, the question is subtly but massively different. Real education may not be in the compelling interest of the state qua state, but it is in the compelling interest of the people. It’s the unique and unfathomably complex amalgam that each person forges out of personal reflection, of coming to understand one’s place in the family, in the nation, and in the world. It is not primarily practical, and we might as well eschew it altogether if our highest goal were merely to get along materially. The only reason to value it is the belief that there is some meaning to life beyond one’s bank balance and material comfort. I cannot prove that there is, and the vocabulary of the market has done its best to be rid of the idea. But I will cling to it while I live, because I think it’s what makes that life worthwhile.

Technical skills — job skills of any sort — are means, among others, to the well-lived life. They are even useful means in their place, and everyone should become as competent as possible. But as they are means, they are definitionally not ends in themselves. They can be mistakenly viewed as ends in themselves, and sold to the credulous as such, but the traffic is fraudulent, and it corrupts the good that is being conveyed. Wherever that sale is going on, it’s because the real ends are being quietly bought up by those with the power to keep them out of our view in their own interest.

Approximately 1900 years ago, Tacitus wrote of a sea change in another civilization that had happened not by cataclysm but through inattention to what really mattered. Describing the state of Rome at the end of the reign of Augustus, he wrote: “At home all was calm. The officials carried the old names; the younger men had been born after the victory of Actium; most even of the elder generation, during the civil wars; few indeed were left who had seen the Republic. It was thus an altered world, and of the old, unspoilt Roman character not a trace lingered.” It takes but a single generation to forget the work of ages.

But perhaps that’s an old story, and terribly out of date. I teach Latin, Greek, literature, and history, after all.

There are no short cuts. Really.

Tuesday, January 26th, 2016

Several years ago, while hunting for something to add to Dr. Bruce’s extensive Shakespeare media collection, we ran across a short documentary called “The Hobart Shakespeareans”. It’s a profile of teacher Rafe Esquith of the Los Angeles school district, and his dedication to his fifth-grade students. Besides the normal coursework required by the state for the grade level, Mr. Esquith encouraged his students, many of them from a disadvantaged area, to put in extra hours to create an end-of-term production of a Shakespeare play (in the documentary, it’s Hamlet). Esquith has also written about his experiences in his book “There Are No Shortcuts”.

In the half-dozen years or so since I read Esquith’s book, I’ve found myself using that phrase a lot. Managers seem to think there should be a single, simple process that can solve all of their problems, but there is no shortcut to good design: you have to analyze the situation and balance security, hardware, and software requirements. Students seem to think a week’s extension to study for an exam will fix a year’s worth of neglecting homework and failing quizzes or skipping classes, but there is no shortcut to knowledge. You need disciplined study and review habits.

Some of this, I think, is the result of living in the instant-feedback, information-based age that the Internet has created for us. It’s easy to run a search engine and get some data to answer a question. Most of my students can google a website faster than I can, and then use the control-F find command to locate a term, and cut-and-paste a response into our online discussions. They are experts at rapid retrieval.

The problem is that in their haste (if they bother to actually read it all), they haven’t noticed that the string they’ve chosen doesn’t actually define the term we’re discussing, doesn’t explain how it relates to other ideas, and doesn’t show them how to apply it. They have some data, but no context for it, and no way to determine whether it is accurate, or the prejudiced opinion of an agenda-driven author. What they have is not really knowledge, and certainly not wisdom. Yet we are often content merely to commend their ability to follow a process that allows them to look something up quickly, as though this were the end, and not simply the means. We fail to push them to acquire and apply critical thinking skills to what they have found, so that they can truly evaluate its importance and implications.

Too often, students who are asked to write an essay expressing their own ideas and to justify their position with their own reasons find it much easier to look up the idea online and present the arguments as their own work. When a math or physics problem proves challenging, instead of working the solution out themselves and risking getting it wrong, they hunt for the worked-out example at some “resource” site, and copy whatever approach is presented. We used to identify blatantly copying someone else’s work — however easily available — as plagiarism, and suspend students for cheating this way, but faced with competition for college admission and lucrative jobs, students (and even their parents and their teachers) will justify these methods as “just shortcuts”, so the students can have time to do all the other things we expect them to accomplish. [At Scholars Online, we still consider presenting someone else’s work as one’s own a form of plagiarism, and our policies on cheating spell out the consequences.]

We have traded the goal of depth of knowledge and real mastery of a subject for a breadth of information so expansive that it can be nothing other than superficial — it’s hardly knowledge and it certainly isn’t wisdom. Instead of struggling with ideas until we truly make them our own, we fracture our time into pieces to cover dozens of topics, many of which will be revised or disappear before our students can actually use them. In order to meet all of the perceived educational requirements, our students must split their attention and wind up constantly multitasking, often to the point where they miss the key idea of a discussion entirely, misinterpret their assignments, and answer the wrong questions altogether.

Any athlete, any performance artist, and any farmer can tell you that when you are trying to develop real skills or produce a real, edible crop, there are no shortcuts.

There are no shortcuts that can eliminate the hours of focused training required to build muscle tone and cardiovascular endurance if you want to run a marathon, and there is no substitute for working with and accepting critical evaluation from a coach who can help you identify the behavior that slows you down. There are no shortcuts that will let you avoid hours of concentrated practice if you want to play a piano concerto perfectly and with passion, and you will still need a teacher who will help you realize and release your love for the piece. There are no shortcuts that can eliminate the labor of planting, weeding, watering, and harvesting if you want to eat your carrot crop — and you will need the experience of others who have successfully raised carrots in the same valley if you are going to realize a good harvest. Even then, you will have to wait while nature does its own work with sun and rain and soil and teaches you patience.

What we often forget is that practice and coaching are also necessary if we want to grow in knowledge and wisdom. There are no shortcuts that will let you skip the disciplined study and reflective thought required to achieve mastery of a subject, much as we would like to think there are. It takes us time to review details and develop the memory skills needed to master a new concept. We need guidance to learn how to read complex material closely and critically, and criticism to hone that perception when we stray or become distracted. We need to take enough time to follow an argument, examine its premises, research the facts it cites, and determine for ourselves whether or not the conclusion is justified.

We need teacher and peer interaction to recognize that we have not expressed ourselves clearly and we need to restate or rewrite or redraw our ideas before we can really share them. We need the support of our learning community to encourage us when we fail at all of these things from time to time. We even need to practice our ability to recognize our failures and to develop the discipline to try again, so that we can emulate the athlete who doesn’t give up after losing her first race, the pianist who doesn’t stop practicing with the first flubbed trill, and the farmer who weeds and waters and bides his time until the harvest is ready.

With our educational system’s emphasis on preparing for the technical skills needed in the twenty-first century, we have forgotten that a classical liberal arts education in critical thinking is not only the foundation on which we build those skills, but also the foundation on which we build our system of values. The average worker in the twenty-first century will hold a dozen jobs, not just one, and the skills and technical expertise for each job will require retraining. We need to emphasize not only the skills that will make retraining easier — critical thinking and self-evaluation — but also the character traits of honesty, integrity, charity, and patience that will make our children valuable citizens of their communities as well as employable members of the work force.

Why Study Greek?

Thursday, September 13th, 2012

I would make them all learn English: and then I would let the clever ones learn Latin as an honour, and Greek as a treat.
— Winston Churchill (somewhat out of context).

A few years ago I wrote an entry on this blog entitled “Why Study Latin?” It was a distillation of my own thoughts about the actual benefits of learning Latin — and the benefits one ought legitimately to expect from doing so. I tried to distinguish the real benefits from other phantom benefits that might not be, in and of themselves, fully valid reasons for undertaking the study. Not everyone agreed, but in general I stand by what I said there. From my point of view, the chief reason to learn Latin is to be able to read Latin; a significant second is to gain that unique way of looking at the world that attends that ability. One therefore has access to a number of great works of Latin literature in their original forms, and also an enhanced ability to think in Latinate terms.

Of course other collateral benefits might reasonably accrue, but they are neither absolutely guaranteed to the student of Latin, nor are they benefits that attend Latin study exclusively. Dr. Karl Maurer of the University of Dallas suggested that I didn’t sufficiently credit the advantages a trained Latinist would have in reading older English — and he’s definitely right that this kind of textural depth of English poetry and prose will probably elude anyone who isn’t familiar with Latin, and the way Latin education was a cornerstone of English education from about 1500 to at least 1900. I certainly don’t disagree with his claims there; I don’t think they rank as matters of linguistics as much as matters of literary development and style. They’re still not trivial, however.

Be that as it may, for a variety of reasons, some of them right and some of them wrong, learning Latin has its champions, and I hope it gains a lot more. While I don’t agree with all the reasons one might advance for Latin study, I will enthusiastically concur that it’s a terrific thing to learn and to know.

Far fewer, however, champion learning Greek so loudly. For a variety of reasons, Greek is seen as far less significant. Some of those reasons are sound: Greek does not directly stand behind a broad range of modern Western European languages the way Latin does. Many of our ideas of statecraft and polity come from Greece, but most of them came through Latin in the process. Other reasons people shy away from Greek are fairly trivial. It has an odd-looking alphabet. Its literature seems to depend on a lot of odder assumptions. Realistic, though rather defeatist, is the fact that, in general, Greek is just considered tougher to learn. Many mainstream churches no longer even require their clergy to be able to read Greek (which seems preposterous to me, but that’s another matter).

For whatever reasons, Greek is certainly studied far less at the high school level than it once was. I read a statistic a few years ago suggesting that maybe a thousand students were actually studying ancient Greek in modern American high schools at any one time. The numbers may be as high as two thousand, but surely no higher than that. I don’t know whether those numbers have risen or fallen since I read it, but I certainly see no evidence that they have skyrocketed. I do occasionally run into a Latin teacher at the American Classical League Summer Institutes who teaches some Greek, but it’s most often a sideline, and often a completely optional “extra” for before or after school. Most of those students are being exposed to the Greek alphabet and some vocabulary, but fairly few of them are receiving a rigorous exposure to the grammar of Greek as a whole. If one narrows that to those who have studied real Classical Greek, as opposed to New Testament Greek, the numbers are probably smaller still.

For me most of the reasons for learning to read Greek are similar to those for reading Latin. The chief benefit, I would still insist, is to be able to read Greek literature in its original terms. Lucie Buisson wrote an eloquent defense of Homer in Greek not long ago in this blog. You cannot acquire a perspective on the Homeric poems like Lucie’s without reading them in Greek. It’s a huge deal: something snaps into view in a way that just cannot be explained to someone who hasn’t experienced it. No translation, no matter how good, can capture it for you. Though Keats memorably thanked Chapman for something like this eye-opening experience, the fact remains that Keats didn’t have the real thing as a comparandum. Chapman’s Homer is terrific — but Homer’s Homer is better.

Beyond the immediate experience of the literary objects themselves there is the fact that Greek provides its students with what I can only (metaphorically) call another set of eyes — that is, a different way of seeing the world, with different categories of thought that run deeper than mere changes in vocabulary. Virtually any new language one learns will provide that kind of new perspective: French, Spanish, or German will do so; Latin certainly does. I would suggest that Greek provides a uniquely valuable set precisely because it is further removed from English in its basic terms.

A reasonable command of multiple languages gives us what might be likened to stereoscopic vision. One eye, or one point of view, may be able to see a great deal — but it’s still limited because it’s looking from one position. A second eye, set some distance from the first, may allow us to see a somewhat enlarged field of view, but its real benefit is that it allows us, by the uncannily accurate trigonometric processor resident in our brains, to apprehend things in three dimensions. Images that are flat to one eye achieve depth with two, and we perceive their solidity as we never could do otherwise. Something similar goes on with an array of telescope dishes spread out over a distance on the earth — they allow, by exploiting even a relatively slight amount of parallax in cosmic terms, an enhanced apprehension of depth in space. (Yes, there are also some other advantages having to do with resolution — all analogies have their limits.)
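Since the analogy carries some real weight in what follows, a small numerical sketch may help. This is my own illustration in JavaScript, with invented figures: two viewpoints a known baseline apart each measure a bearing to the same object, and elementary triangulation recovers a depth that neither viewpoint alone could supply.

```javascript
// Triangulate the perpendicular distance to an object sighted from both
// ends of a baseline; each bearing is measured from the baseline itself.
function distanceByParallax(baseline, leftBearingDeg, rightBearingDeg) {
  const toRadians = (deg) => deg * (Math.PI / 180);
  const tanLeft = Math.tan(toRadians(leftBearingDeg));
  const tanRight = Math.tan(toRadians(rightBearingDeg));
  return (baseline * tanLeft * tanRight) / (tanLeft + tanRight);
}

// Two eyes about 6.5 cm apart, each sighting an object at an 89-degree bearing:
console.log(distanceByParallax(0.065, 89, 89).toFixed(2)); // ≈ 1.86 meters
```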

I would argue that every new language one learns will thus provide another point of view, enhancing and enriching, by a kind of analogical stereoscopy, a deeper and more penetrating view of the world. And like the more widely spaced eyes, or the telescopes strung out in a very large array, the further apart they are, the more powerfully their “parallax” (to speak purely analogically) will work upon us. This, I would argue, is one of the chief reasons for learning Greek. In some of its most fundamental assumptions, Greek is more sharply distinct from English than is Latin. A few examples will have to suffice.

Greek, for example, invites us to think about time differently. Greek verb tenses are not as much about absolute time as English verb tenses are; they are more about what linguists call aspect (or aspect of action in older writings). That is, they have more to do with the shape of an action — not its intrinsic shape, but how we’re talking about it — than merely locating it in the past, present, or future. Greek has a tense — the aorist — that English and Latin are missing. The aorist is used in the indicative mood to denote simple action in the past, but in other moods to express other encapsulations of simple verb action. Greek aorist verbs in the indicative will certainly locate events in the temporal continuum, and certainly English also has ways to express aspect — things such as the progressive or emphatic verb forms: e.g., “I run” vs. “I am running” or “I do run”. But whereas the English verb is chiefly centered in the idea of when something happened or is happening or will happen, with aspect being somewhat secondary, in Greek it’s the other way around. What exactly that does to the way Greek speakers and thinkers see the world is probably impossible to nail down exactly — but it’s not trivial.

Attic and earlier Greek has a whole mood of the verb that isn’t present in English or Latin — the optative. Students of New Testament Greek won’t see this on its own as a rule. There are a few examples such as Paul’s repeated μὴ γένοιτο in Romans (sometimes translated as “by no means”, but intrinsically meaning something more like “may it not come about”). But Attic and older dialects (like Homeric Greek) are loaded with it. It’s not just an arbitrary extension of a subjunctive idea: it runs alongside the subjunctive and plays parallel games with it in ways that defy simple classification.

Greek has a voice that neither English nor Latin knows, properly speaking — what is called the middle voice. It is neither active nor passive, but tends to refer to things acting on or on behalf of themselves, either reflexively or in a more convoluted way that defies any kind of classification in English-language categories.

The Greek conditional sentence has a range of subtlety and nuance that dwarfs almost anything we have in English. Expressing a condition in Greek, or translating a condition from Greek, requires a very particular degree of attention to how the condition is doing what it is doing. In the present and the past, one may have contrary to fact conditions (“If I were a rich man, I would have more staircases,” or “If I had brought an umbrella I would not have become so wet,”), general conditions (“If you push that button, a light goes on,”), and particular conditions (“If you have the older edition of the book, this paragraph is different”); in the future there are three other kinds of conditions, one of them nearly (but not quite) contrary to fact (“If you were to steal my diamonds, I’d be sad,”) called the future less vivid, and then a future more vivid and a future most vivid, representing increasing degrees of urgency in the future. All of these can be tweaked and modified and, in some rather athletic situations, mixed. If you study Greek, you will never think about conditions in quite the same way again.

Greek has what are called conditional temporal clauses that model themselves on conditions in terms of their verb usage, though they don’t actually take the form of a condition. There is something like this in English, but because we don’t use such a precise and distinct range of verbs for these clauses, they don’t show their similarities nearly as well.

The Greek participle is a powerhouse unlike any in any other Western language. Whole clauses and ideas for which we would require entire sentences can be packaged up with nuance and dexterity in participles and participial phrases. Because Greek participles have vastly more forms than English (which has only a perfect passive and a present active — “broken” and “breaking”) or than Latin (which has a perfect passive and a present active, and future active and passive forms), they can do vastly more. Greek participles have a variety of tenses, they can appear in active, middle, and passive voices, and they are inflected for all cases, numbers, and genders. All of these will affect the way one apprehends these nuggets of meaning in the language.

Those are only some examples of how a Greek sentence enforces a subtly different pattern of thought upon people who are dealing with it. As I said, however, for me the real treasure is in seeing these things in action, and seeing the ideas that arise through and in these expressions. So what is so special that it needs to be read in Greek?

Lucie already has written thoroughly enough about the joys of Homer; much the same could be said of almost any of the other classical authors. Plato’s dialogues come alive with a witty, edgy repartee that mostly gets flattened in translation. The dazzling wordplay and terrifying rhythms of Euripidean choruses cannot be emulated in meaningful English. Herodotus’s meandering storytelling in his slightly twisted Ionic dialect is a piece of wayfaring all on its own. The list goes on.

For a Christian, of course, being able to read the New Testament in its original form is a very significant advantage. Those who have spent any time investigating what we do at Scholars Online will realize that this is perhaps an odd thing to bring up, since we don’t teach New Testament Greek as such. My rationale there is really quite simple: the marginal cost of learning classical Attic Greek is small enough, compared with its advantages, that there seems no point in learning merely the New Testament (koine) version of the language. Anyone who can read Attic Greek can handle the New Testament with virtually no trouble. Yes, there are a few different forms: some internal consonants are lost, so that γίγνομαι (gignomai) becomes γίνομαι (ginomai), and the like. Yes, some of the more elaborate constructions go away, and one has to get used to a range of conditions (for example) that is significantly diminished from the Attic models I talked about above. But none of this will really throw a student of Attic into a tailspin; the converse is not true. Someone trained in New Testament Greek can read only New Testament Greek. Homer, Euripides, Plato, Aristotle, Sophocles, Herodotus, Thucydides — all the treasures of the classical Greek tradition remain inaccessible. But the important contents of the New Testament and the early Greek church fathers are open even with this restricted subset of Greek — and they are very well worth reading.

Greek is not, as mentioned earlier, a very popular subject to take at the high school level, and it’s obvious that it’s one of those things that requires a real investment of time and effort. Nevertheless, it is one of the most rewarding things one can study, both for the intrinsic delights of reading Greek texts and for some of the new categories of thought it will open up. For the truly exceptional student it can go alongside Latin to create a much richer apprehension of the way language and literary art can work, and to provide a set of age-old eyes with which to look all the more precisely at the modern world.

Computer Programming as a Liberal Art

Monday, September 3rd, 2012

One of the college majors most widely pursued these days is computer science. This is largely because it’s generally seen as a ticket into a difficult and parsimonious job market. Specific computer skills are demonstrably marketable: one need merely review the help wanted section of almost any newspaper to see just how particular those demands are.

As a field of study, in other words, its value is generally seen entirely in terms of employability. It’s about training, rather than about education. Just to be clear: by “education”, I mean something that has to do with forming a person as a whole, rather than just preparing him or her for a given job, which I generally refer to as “training”. If one wants to become somewhat Aristotelian and Dantean, it’s at least partly a distinction between essence and function. (That these two are inter-related is relevant, I think, to what follows.) One sign of the distinction, however, is that if things evolve sufficiently, one’s former training may become irrelevant, and one may need to be retrained for some other task or set of tasks. Education, on the other hand, is cumulative. Nothing is ever entirely lost or wasted; each thing we learn provides us with a new set of eyes, so to speak, with which to view the next thing. In a broad and somewhat simplistic reduction, training teaches you how to do, while education teaches you how to think.

One of the implications of that, I suppose, is that the distinction between education and training has largely to do with how one approaches it. What is training for one person may well be education for another. In fact, in the real world, probably these two things don’t actually appear unmixed. Life being what it is, and given that God has a sense of humor, what was training at one time may, on reflection, turn into something more like education. That’s all fine. Neither education nor training is a bad thing, and one needs both in the course of a well-balanced life. And though keeping the two distinct may be of considerable practical value, we must also acknowledge that the line is blurry. Whatever one takes in an educational mode will probably produce an educational effect, even if it’s something normally considered to be training. If this distinction seems a bit like C. S. Lewis’s distinction between “using” and “receiving”, articulated in his An Experiment in Criticism, that’s probably not accidental. Lewis’s argument there has gone a long way toward forming how I look at such things.

Having laid that groundwork, therefore, I’d like to talk a bit about computer programming as a liberal art. Anyone who knows me or knows much about me knows that I’m not really a programmer by profession, and that mathematical studies were not my strong suit in high school or college (though I’ve since come to make peace with them).

Programming is obviously not one of the original liberal arts. Then again, neither are most of the things we study under today’s “liberal arts” heading. The original liberal arts numbered seven: grammar, dialectic, and rhetoric — all of which were about cultivating precise expression (and which were effectively a kind of training for ancient legal processes) — and arithmetic, geometry, music, and astronomy. Those last four were all mathematical disciplines: both music and astronomy bore virtually no relation to what is taught today under those rubrics. Music was not about pavanes or symphonies or improvisational jazz: it was about divisions of vibrating strings into equal sections, and the harmonies thereby generated. Astronomy was similarly not about celestial atmospheres or planetary gravitation, but about proportions and periodicity in the heavens, and the placement of planets on epicycles. (Kepler managed to dispense with epicycles, which are now of chiefly historical interest.)

In keeping with the spirit, if not the letter, of that original categorization, we’ve come to apply the term “liberal arts” today to almost any discipline that is pursued for its own sake — or at least not for the sake of any immediate material or financial advantage. Art, literature, drama, and music (of the pavane-symphony-jazz sort) are all considered liberal arts largely because they have no immediate practical application to the job of surviving in the world. That’s okay, as long as we know what we’re doing, and realize that it’s not quite the same thing.

While today’s economic life in the “information age” is largely driven by computers, and there are job openings for those with the right set of skills and certifications, I would suggest that computer programming does have a place in the education of a free and adaptable person in the modern world, irrespective of whether it has any direct or immediate job applicability.

I first encountered computer programming (in a practical sense) when I was in graduate school in classics. At the time (when we got our first computer, an Osborne I with 64K of memory and two drives with 92K capacity each), there was virtually nothing to do with classics that was going to be aided a great deal by computers or programming, other than using the very basic word processor to produce papers. That was indeed useful — but had nothing to do with programming from my own perspective. Still, I found Microsoft Basic and some of the other tools inviting and intriguing — eventually moving on to Forth, Pascal, C, and even some 8080 Assembler — because they allowed one to envision new things to do, and project ways of doing them.

Programming — recreational though it may originally have been — taught me a number of things that I have come to use at various levels in my own personal and professional life. Even more importantly, though, it has taught me things that are fundamental about the nature of thought and the way I can go about doing anything at all.

Douglas Adams, the author of the Hitchhiker’s Guide books, probably caught its most essential truth in Dirk Gently’s Holistic Detective Agency:

“…if you really want to understand something, the best way is to try and explain it to someone else. That forces you to sort it out in your mind. And the more slow and dim-witted your pupil, the more you have to break things down into more and more simple ideas. And that’s really the essence of programming. By the time you’ve sorted out a complicated idea into little steps that even a stupid machine can deal with, you’ve learned something about it yourself.”

I might add that not only have you yourself learned something about it, but you have, in the process, learned something about yourself.

Adams also wrote, “I am rarely happier than when spending an entire day programming my computer to perform automatically a task that it would otherwise take me a good ten seconds to do by hand.” This is, of course, one of the drolleries about programming. The hidden benefit is that, once perfected, that tool, whatever it was, allows one to save ten seconds every time it is run. If one judges things and their needs rightly, one might be able to save ten seconds a few hundred thousand or even a few million times. At that point, the time spent on programming the tool will not merely save time, but may make possible things that simply could never have been done otherwise.
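The arithmetic behind the joke is worth doing once. Here is a back-of-the-envelope version in Python, with hypothetical numbers of my own choosing (an eight-hour day invested, a million eventual runs):

    day_spent = 8 * 60 * 60     # seconds spent programming the tool
    saved_per_run = 10          # seconds saved each time it runs
    runs = 1_000_000            # hypothetical lifetime uses of the tool
    net_hours = (runs * saved_per_run - day_spent) / 3600
    print(f"net time saved: {net_hours:.0f} hours")   # about 2,770 hours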

One occasionally hears it said that a good programmer is a lazy programmer. That’s not strictly true — but the fact is that a really effective programmer is one who would rather do something once, and then have it take over the job of repeating things. A good programmer will use one set of tools to create other tools — and those will increase his or her effective range not two or three times, but often a thousandfold or more. Related to this is the curious phenomenon that a really good programmer is probably worth a few hundred merely adequate ones, in terms of productivity. The market realities haven’t yet caught up with this fact — and it may be that they never will — but it’s an interesting phenomenon.

Not only does programming require one to break things down into very tiny granular steps, but it also encourages one to come up with the simplest way of expressing those things. Economy of expression comes close to the liberal arts of rhetoric and dialectic, in its own way. Something expressed elegantly has a certain intrinsic beauty, even. Non-programmers are often nonplussed when they hear programmers talking about another programmer’s style or the beauty of his or her code — but the phenomenon is as real as the elegance of a Ciceronian period.

Pursuit of elegance and economy in programming also invites us to try looking at things from the other side of the process. When programming an early version of the game of Life for the Osborne, I discovered that simply inverting a certain algorithm (having each live cell increment the neighbor count of all its adjacent spaces, rather than having each space count its live neighbors) achieved an eight-to-tenfold improvement in performance. Once one has done this kind of thing a few times, one starts to look for such opportunities. They are not all in a programming context.
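To make the inversion concrete, here is a minimal sketch in Python rather than in the Basic or Pascal of the era; the function name and the grid representation (a list of lists of 0s and 1s) are my own. Each live cell pushes a count outward to its neighbors, so on a sparse board the dead cells, which are the great majority, cost almost nothing:

    # Inverted algorithm: live cells increment their neighbors' counts,
    # instead of every cell counting its live neighbors.
    def step(grid):
        rows, cols = len(grid), len(grid[0])
        counts = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c]:                      # only live cells do any work
                    for dr in (-1, 0, 1):
                        for dc in (-1, 0, 1):
                            if dr or dc:
                                rr, cc = r + dr, c + dc
                                if 0 <= rr < rows and 0 <= cc < cols:
                                    counts[rr][cc] += 1
        # Standard Life rules: birth on 3 neighbors, survival on 2 or 3.
        return [[1 if counts[r][c] == 3 or (grid[r][c] and counts[r][c] == 2)
                 else 0 for c in range(cols)]
                for r in range(rows)]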

There are general truths that one can learn from engaging in a larger programming project, too. I’ve come reluctantly to realize over the years that the problem in coming up with a really good computer program is seldom an inability to execute what one envisions: it’s much more likely to be a problem of executing what one hasn’t adequately envisioned in the first place. Not knowing what winning looks like, in other words, makes the game much harder to play. Forming a really clear plan first is going to pay dividends all the way down the line. One can find a few thousand applications for that principle every day, both in the computing world and everywhere else. Rushing into the production of something is almost always a recipe for disaster, a fact explored by Frederick P. Brooks in his brilliantly insightful (and still relevant) 1975 book, The Mythical Man-Month, which documents his own blunders as the head of the IBM System/360 project, and the costly lessons he learned from the process.

One of the virtues of programming as a way of training the mind is that it provides an objective “hard” target. One cannot make merely suggestive remarks to a computer and expect them to be understood. A computer is, in some ways, an objective engine of pure logic, and it is relentless and completely unsympathetic. It will do precisely what it’s told to do — no more and no less. Barring actual mechanical failure, it will do it over and over again exactly the same way. One cannot browbeat or cajole a computer into changing its approach. There’s a practical lesson there, and probably a moral one too. People can be persuaded; reality just doesn’t work that way — which is probably just as well.

I am certainly not the first to have noted that computer programming can have this kind of function in educational terms. Brian Kernighan — someone well known to the community of Unix and C programmers over the years, and a member of the Bell Labs group from which C and Unix emerged — has argued that it’s precisely that in a New York Times article linked here. Donald Knuth, one of the magisterial figures of the first generation of programming, holds forth on its place as an art, too, here. In 2008, members of the faculties of Williams College and Pomona College (my own alma mater) collaborated on a similar statement available here. Another reflection on computer science and math in a pedagogical context is here. And of course Douglas Hofstadter in 1979 adumbrated some of the more important issues in his delightful and bizarre book, Gödel, Escher, Bach: An Eternal Golden Braid.

Is this all theory and general knowledge? Of course not. What one learns along the line here can be completely practical, too, even in a narrower sense. For me it paid off in ways I could never have envisioned when I was starting out.

When I was finishing my dissertation — an edition of the ninth-century Latin commentary of Claudius, Bishop of Turin, on the Gospel of Matthew — I realized that there was no practical way to produce a page format that would echo what normal classical and mediaeval text editions typically show on a page. Microsoft Word (which was what I was using at the time) supported footnotes — but typically these texts don’t use footnotes. Instead, the variations in manuscript readings are keyed not to footnote marks, but to the line numbers of the original text, and kept in a repository of textual variants at the bottom of the page (what is called in the trade an apparatus criticus). In addition, I wanted to have two further sets of notes at the bottom of the page, one giving the sources of the earlier church fathers that Claudius was quoting, and another giving specifically scriptural citations. I also wanted to mark in the margins where the foliation of the original manuscripts changed. Unsurprisingly, there’s really not a way to get Microsoft Word to do all that for you automatically. But with a bit of Pascal, I was able to write a page formatter that would take a compressed set of notes indicating all these things, and parcel them out to the right parts of the page, in a way that would be consistent with RTF and University Microfilms standards.
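The parceling itself is conceptually simple, whatever the fiddliness of the output format. Here is a toy sketch of the idea in Python (the original was Pascal emitting RTF; the record format, register names, and page length here are all hypothetical, not a reconstruction of the actual tool):

    LINES_PER_PAGE = 30   # hypothetical lines of main text per page

    def route_notes(notes):
        # notes: an iterable of (line_number, register, text) records, where
        # register names one of the bottom-of-page repositories: the apparatus
        # criticus, the patristic sources, or the scriptural citations.
        pages = {}
        for line_no, register, text in notes:
            page = (line_no - 1) // LINES_PER_PAGE + 1
            bucket = pages.setdefault(
                page, {"apparatus": [], "fontes": [], "scriptura": []})
            bucket[register].append(f"{line_no} {text}")
        return pages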

When, some years ago, we were setting Scholars Online up as an independent operation, I was able, using Javascript, PHP, and MySQL, to write a chat program that would serve our needs. It’s done pretty well since. It’s robust enough that it hasn’t seriously failed; we now have thousands of chats recorded, supporting various languages, pictures, audio and video files, and so on. I didn’t set out to learn programming to accomplish something like this. It was just what needed to be done.

Recently I had to recast my Latin IV class to correspond to the new AP curriculum definition from the College Board. (While it is not, for several reasons, a certified AP course, I’m using the course definition, on the assumption that a majority of the students will want to take the AP exam.) Among the things I wanted to do was to provide a set of vocabulary quizzes to keep the students ahead of the curve, and reduce the amount of dictionary-thumping they’d have to do en route. Using Lee Butterman’s useful and elegant NoDictionaries site, I was able to get a complete list of the words required for the passages in question from Caesar and Vergil; using a spreadsheet, I was able to sort and re-order these lists so as to catch each word the first time it appeared, and eliminate the repetitions; using regular expressions with a “grep” utility in my programming editor (BBEdit for the Macintosh) I was able to take those lists and format them into GIFT format files for importation into the Moodle, where they will be, I trust, reasonably useful for my students. That took me less than a day for several thousand words — something I probably could not have done otherwise in anything approaching a reasonable amount of time. For none of those tasks did I have any training as such. But the ways of thinking I had learned by doing other programming tasks enabled me to do these here.
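For the curious, the dedupe-and-format step comes down to a few lines in a modern scripting language. Here is a minimal sketch in Python standing in for the original spreadsheet-and-grep pipeline; the tab-separated word lists and the short-answer question wording are my assumptions:

    import sys

    seen = set()
    for path in sys.argv[1:]:      # one word list per passage, in reading order
        with open(path, encoding="utf-8") as f:
            for line in f:
                if not line.strip():
                    continue
                word, gloss = line.rstrip("\n").split("\t", 1)
                if word in seen:   # catch each word only on its first appearance
                    continue
                seen.add(word)
                # Emit one GIFT short-answer item per word for Moodle import.
                # (Real glosses would need GIFT's special characters escaped.)
                print(f'::{word}:: What does "{word}" mean? {{={gloss}}}')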

Perhaps the real lesson here is that there is probably nothing — however mechanical it may seem to be — that cannot be in some senses used as a basis of education, and no education that cannot yield some practical fruit down the road a ways. That all seems consistent (to me) with the larger divine economy of things.

Adventures in Team Teaching

Monday, July 16th, 2012

About this time last year, Dr. McMenomy approached me with an idea that was half-proposal and half-plea. The World History course for that year had both students enrolled and a textbook picked out and purchased, but did not have a teacher. As I was the other dedicated history teacher on the Scholars Online staff, Dr. McMenomy asked me to take over. I hesitated; while I enjoy teaching anything historical, I was less delighted with the notion of taking over a class with its syllabus essentially dictated by a book I hadn’t chosen, especially a book which Dr. McMenomy acknowledged was far from ideal. Moreover, I knew that I’d need to do some preparation in August and September, which happened to be when I was a) moving and b) organizing a small conference. With this in mind, I said I had to decline. But the good doctor was politely insistent, pointing out that he was deeply opposed to canceling a class with students enrolled. He offered to lend a hand in getting the class set up: we could team-teach until I’d found my feet, he said, and then he’d let me take over. Still with some trepidation, I accepted this proposal.

Our first task was to write up summaries and quizzes for each chapter. We soon settled into the pattern of swapping off, each one of us writing up the notes and the quiz, and then leading the discussion. The other teacher would chime in with some additional comments.

This led to some interesting moments right off. Dr. McMenomy and I do not agree on everything; in fact on some subjects we stand at opposite poles. In the teaching of history our differences are not quite so pronounced, but there’s a real difference of emphasis. The doctor knows more than I about the intellectual history of the world, being by trade and inclination a classics instructor; he studies ideas, their transmission, and their influence. In contrast my attention is usually on everyday people in history—not the rulers or the scholars, but the ordinary folks who worked, fought, and struggled; the ones doing the digging and the dying, as it were. Dr. McMenomy is a master of learning and knowing what has been handed down to us, whereas I am trying to find out who and what has been overlooked; accordingly, he has somewhat higher regard for established authority, while I am usually cheering for the underdog.

With such differences in style—not total opposition, but different enough—the result could have been tension and conflict. Instead, the result was a creative tension. Dr. McMenomy and I have known each other for almost three decades now, in fact since I was a young child, and as a result we know each other’s standpoints and respect them. Thus any difference of opinion that might have triggered a dispute was kept in check by our long friendship. This did not keep us from discussing and even debating, but we did so with high regard for each other even as we contested each other’s points.

At first I was concerned about showing this during class. But Dr. McMenomy pointed out that it would be good for our students to see that history is not a settled issue. The truth of history is not relative—something did happen, after all—but knowing the whole of that truth is nigh on impossible, and thus history is a realm of theory and evidence. My historical theories have support and also have a few holes; Dr. McMenomy’s ideas are, being human, similarly incomplete. The Grand Unified Theory of History, he says, is that no Grand Unified Theory of History is possible. When we discussed in front of the students, we thus made it clear that history is subject to continual questioning and debate. We also showed them that it’s up to them to make up their own minds. We refused to hand down definitive answers, because any such answers would keep the students from coming to their own conclusions… and besides, any such definitive answers would probably be flawed.

The doctor and I did come to many points of agreement; it wasn’t a continual debate. We did not always agree on the sources of power, but we both agreed that power was at the heart of history, and frequently steered our discussion in that direction. We also came to agree that geography is destiny, though naturally with a few limits. We were also firmly united in our growing disdain for the textbook we were using.

The weeks went by, the classes and the discussions continued, and it dawned on us that, rather than a chore forced on us, the class had become downright fun. Dr. McMenomy never stepped back and handed the class over to me; neither of us wanted him to. The collaboration was too delightful. We each built half of the exams, reviewed each other’s work, and then sent it on to the students; for grading, we would grade the work separately, compare notes, and then settle on a compromise where needed. As far as could be managed we kept things balanced, splitting the chapters between us and writing up extensive commentaries on each one, with discussion questions at the end to guide the class.

We noted that the commentaries were growing more and more lengthy. This was necessary; the book was continually failing to provide adequate coverage and synthesis. Sometimes it failed to provide even basic coherency, and was riddled with errors great and small. Looking for a replacement, we discovered to our dismay that it was the best available at that grade level. There were better books, but only for college students.

At the end of the year Dr. McMenomy began overhauling our class website, which was beginning to stagger under the amount of material we’d loaded onto it, and realized that over the course of the year, we’d written over forty-five thousand words. At which point he made a new proposal to me: “Would you like to write a textbook? We’re already well on our way.”

The idea caught our imaginations. We would continue the collaborative approach, we decided: each one of us would write certain chapters, then review the other’s work. Moreover we agreed to keep the useful conversation going within the text itself: we would respond to each other’s chapters, assessing and evaluating the other’s ideas, in a note at the end of each section. Thus students, reading through the text, would learn from the book itself that there are no easy answers, that when it comes to history you can’t necessarily just look up the answer, and that you should not automatically assume that what you read is gospel truth.

Then it occurred to us that we could begin to write our book piecemeal and replace sections of the current text as we went through it (starting with the most egregiously inaccurate and inadequate chapters). As students read through, they will alternate between reading our material, posted online, and reading the old book. We aspire to rewrite about a third of the book as we teach this next year’s class. The material will be posted on the class website, which you can find here. The site is undergoing some changes at the moment and will undergo more throughout the year. Gradually, we’ll replace more and more of the book material, and eventually wind up with our own, brand-new text.

This is a substantial project, and we know it will take years. It’s also a highly ambitious project—ambitious to the point of madness, maybe!

But if so, it’s a truly pleasant madness. It is a deep privilege to work with Dr. McMenomy, who for all our differences of opinion remains the wisest and most insightful man I have ever met. I believe I speak for both of us when I say that we have learned a lot from each other and from the process of teaching this class; and we hope, with some confidence, that our learning leads to broader and better instruction, and our students will reap the benefits.

Freedom to fail

Thursday, March 31st, 2011

The previous entry on this blog was about failure not being an option — and I subscribe to that. Failure in an ultimate sense is something we should never choose for ourselves: the universe or some other person may well cause us to fail, but we should not elect to fail in a final sense. Nevertheless, failure, and the freedom to fail in the short run without disastrous long-term consequences, is essential to learning. I have taught students with a whole range of abilities and inclinations over the years; there have been some who have been afraid to venture on anything, lest they fail to complete it to some arbitrary standard of perfection. Others tear into the subject with giddy abandon, making mistakes freely and without compunction. Of the two groups, it is invariably the latter that gets the job done. The students in the former group are frozen by fear of, or reverence for, some external standard of excellence or perfection, and they really cannot or will not transcend that fear.

It may seem odd that, while I consider education to be one of the more important activities one can engage in throughout life, it’s actually the model of the game that speaks most directly to what’s going on here. The Dutch historian Johan Huizinga, in a marvelous little book called Homo Ludens, explores the notion of game and gaming in historical cultures. He identifies a number of salient features — but chief among them are two facts: first, that the universe of the game is somehow set apart, a kind of sacred precinct, and, second, that what goes on there does not effectively leave that arena. I think the same can be said of education — and, interestingly, the idea of education as a game is of long standing: the Roman word that most commonly was applied to the school was ludus, which is also the most common word for game or play.

Who doesn’t know at least one student who loves to play games, and who may be remarkably expert in them, but still has difficulty engaging the subjects he or she is nominally studying seriously? In my experience, it’s more the norm than the exception. I’ve heard people decry that fact as a sign of the sorry state into which the world has fallen — but I don’t think that’s all, or even most, of the picture. One of the things that sets games apart from other learning activities is that in a game, one is encouraged, or even required, to try things, in the relative certainty that, at first at least, one is going to make an awful mess of most of them. That’s okay. You get to do it again, and again, and again, if need be.

Within the bounds of the game, one is free to fail. Even there, one should not choose to fail: doing that subverts the game as nothing else ever could. But even if one is trying to win, failure comes easily and frequently, but without serious penalty. The consequence, though, is that students learn quickly enough how not to fail. The idea that one must get everything right the first time is nonsense. The creeping fear that one needs to score 100 on every quiz is nonsense. Even the belief that the highest grade signifies the best education is nonsense. Sure, I have had some students who got extraordinarily high grades and were very engaged with the material; I have had some students who were completely disengaged and got miserable scores. But those are the easy cases, and they are relatively few. The mixed cases are interesting and hard. I’ve had a few who operated the system in order to get good scores, but never really closed with the material. They walked away with a grade — though usually not the best grade — and little else. I wish it were possible to prevent tweaking the system this way, but it often is not. In the end, though, like the student at UCLA Christe recounted in the previous post, they achieved a real failure because they chose it: they sacrificed the substance of their education in order to win a favorable report on the education. It’s a bad trade — yet another instance of the means becoming autonomous.

I have also had other students — probably more of them than in any of the other groups — who thrashed about, and had real difficulty with the material, but kept bashing at it, and wound up making real strides, and in a meaningful sense winning the battle. Christe talked about how a baby learning to walk is taught by the unforgiving nature of gravity. That’s true enough. Gravity is exacting: its rules never waver, and so it may be unforgiving in that regard. It’s also very forgiving in another sense, however. Falling once or even a thousand times doesn’t keep you down or make you more likely to fall the next time. Every time you fall, assuming you haven’t injured yourself critically, you are free to get up again and keep on trying. And perhaps you have learned something this time. If not, give it another go.

Children learning to speak succeed with such amazing speed not in spite of but because of their abundant mistakes. They are forming concepts about the language, and testing and refining them by playing with it so recklessly. A child who learns that “I walked” is a way of putting “I walk” into the past will quite reasonably assume that “I runned” is a way of putting “I run” into the past. This may be a local and small-scale setback when it comes to identifying the right verb form for the task: it most definitely is not failure in a larger sense. It’s a triumph. Sure, it’s incorrect English. It is, nevertheless, the vindication of that child’s language-forming capacity, and the ability to abstract general principles from specific instances. He or she will eventually learn about strong verbs. But such engagement with what one wants to say, and such fearlessness in expressing it, is rocket fuel for the mind. The child learns to speak the way a devoted gamer learns a game — through immersion and unquestioning involvement, untainted by the slightest fear of the failure that invariably, repeatedly attends the enterprise.

When I first started teaching Greek I and II online about fifteen years ago, I came up with what seemed to me a rather innovative plan for the final for the course. Over the years since I haven’t altered it much, because of all the things I’ve ever done as a teacher, it seems to have been one of the most successful. Though in recent years Sarah Miller Esposito has taken Greek I and II over from me, I believe that she’s still doing roughly the same thing, too. I set the final up as a huge, exhaustive survey of virtually everything covered in the course — especially the mechanical things. All the declensions, all the conjugations, all the pronoun forms, and so on, became part of that final exam. It took many hours to complete. I eventually even gave up having other exams throughout the year. Everything (in terms of grade) could hang from the final.

Everything for a year depending on a final? For a high school student? This sounds like a nightmare. I’ve had parents balk and complain — but seldom students: not when they’ve been through it and seen the results. Here’s the trick: the student was allowed to take that exam throughout the summer, as many times as he or she wanted. It could be taken with the book in the lap, with an answer sheet propped up next to the computer; students could discuss the contents with one another, or ask me for answers (though they seldom needed to: I put the number of the relevant section in the book next to each question). The results of each pass could be reviewed, and each section could be retaken as many times as desired. The only requirement was this — the last time any given section of the exam (I think there are eighteen sections, some of them worth several hundred points each) was taken, it had to be taken under exam conditions: closed book, with no outside sources. The final version had to come in by Sept. 1. Students were free to complete it at any point prior: most of them didn’t. Why should they? They were playing the game, and improving their scores. They actually rather liked it. Especially after I was able to get these exam segments running under the Moodle, so that scoring was instantaneous and painless (frankly there’s little that’s as excruciating for a teacher to grade by hand as accented polytonic Greek), they did it a lot. They’d take each segment four, five, perhaps even ten times.

The results of this were, from a statistical point of view, probably ridiculous. It tended to produce a spread of scores ranging from a low of about 98.3 to a high of about 99.9. Nobody left without an A. “What kind of grade inflation is this?” one might ask. But the simple (and exhilarating) fact was that they all came back to class in the fall ready to perform like A students. They had the material down cold — and they hadn’t forgotten it all over the summer either. This is not just my own assessment: they went on to win national competitions, and to gain admission to some of the most prestigious universities in the country — where at least some of them tested into upper division classics courses right away. If that’s grade inflation, so be it. I like to think rather that it’s education inflation. We could use a little more of that. I don’t really take credit for it myself — it’s not that I was such a brilliant teacher. I’m not even primarily a Hellenist — I’m a Latinist. But I credit the fact that they became engaged with it as if with a game.

We live in a society with a remarkably strong gaming culture; but most historical societies have had the same thing. We have surviving games from Egypt and Greece and Rome; chess comes from ancient India and Persia, and go (probably the only game to match chess for complexity from simplicity) from ancient China and Japan. We have ancient African games, and ancient Native American games. Today the videogame industry is a multibillion dollar affair. Board games, card games, sporting equipment, and every other form of game equipment is marketed and consumed with a rare zeal. These products find buyers even in an economic downturn, because they appeal to something very fundamental about who we are. Even while the educational establishment seems to be ever more involved in protecting the fragile ego and self-image of the learner, our games don’t tell us pretty lies. They don’t tell us that we’ll win every time. They tell us we’ll fail and have to keep trying if we want to win. I really think that people savor that honesty, and that the lesson to be learned from it is enormously significant.

I know that there are a lot of things that people have had to say against games, and certainly an undue or inappropriate preoccupation with them may not be a good thing. Nevertheless, they are a genuine part of our God-given nature, and they form, I would argue, one of our most robust models for learning. In games we are free to fail: and that freedom fosters the ability to learn, which is ultimately the legitimate freedom to win. If we can extract any lesson from our games, and perhaps apply it more broadly to the sphere of learning, I think we all will benefit.

Failure is not an option

Monday, March 21st, 2011

When I taught my first class as a graduate assistant at UCLA, one of the students asked whether my Western Civilization section was a “Mickey Mouse” course. What he meant was, “Is this a course with a guaranteed A if I show up and do the minimal work assigned, or will I run the risk that the work I do won’t be good enough for an A?” I said no, it wasn’t a Mickey Mouse course; the history of the Western World was complex and it would take work. I would not guarantee his grade.

He didn’t show up at our next meeting, and the enrolled student printout the next week confirmed that he had dropped the class. He couldn’t risk the possibility of failure (which apparently meant anything less than a 4.0 GPA), and so he missed the opportunity to learn why the reforms of Diocletian changed the economy of the Roman Empire and influenced the rise of monasteries, or how the stirrup made the feudal system possible, or how the academic interests of Charlemagne led to the rise of universities and the very institution he was supposed to be part of. He chose to fail to get an education rather than fail to get an A grade.

When I taught my first chemistry course online, I was blessed with an enthusiastic bunch of brilliant students who tackled the rigorous textbook and beat it into submission — except for one student we’ll call Joe. Joe lacked the science and math background that would have made the course easier, and he had a learning disability that made reading anything, but especially any kind of formulae, a real trial.  By the middle of the fall semester, it was clear that Joe was in serious trouble. His mother discussed the possibility of dropping the course, but I thought I could teach any willing student anything, so I offered extra help. Joe and I agreed to meet an hour early before the rest of the class and work through the problematic material. When I realized the extent of Joe’s problems, we backed up and started over. He continued to attend the regular online sessions with the rest of the class, but I excused him from keeping up with the homework and quiz assignments while we tried to establish a foundation he could really build on.

At the end of the academic year, the rest of the class had finished the twenty-two chapters of the text. Joe had finished four.

But he really knew those four chapters. He could answer any question and do any problem from them, with more facility and conviction than some of the students who had seemingly breezed through the material months earlier. I reluctantly entered a failing grade on his report, but wrote his parents that I didn’t think the grade reflected Joe’s real accomplishments that year. He had managed to learn some chemistry. What’s more, I’d had a salutary lesson in perseverance.

What I hadn’t realized was that my lesson wasn’t over. Joe didn’t accept his failing grade as the final word. Three years later, out of the blue, I got a letter from Joe’s mother. Her son, fired with the discovery that he could actually learn chemistry given enough time, and the realization that he actually liked chemistry, had gotten a job working part time so that he could pay a chemistry student from the local college to tutor him. He applied the same dogged determination he had shown in our extra morning sessions to his self-study and, with the help of his tutor, slogged his way through the rest of our text. Kindly note that no one was giving him a grade for this work. But when he was done with his self-study, he took a community college chemistry course and passed it.

Like so many things, failure is a matter of perception. In his own estimation, Joe hadn’t failed — despite the F on his transcript. Many students would have given up early in the semester — certainly before the last withdrawal date — rather than risk a failing grade. For Joe, the grade was not a locked gate blocking his passage; it was merely a measure of how far he still had to go. The educational reality was that he was four chapters further than he had been at the beginning of the year. He took heart from the fact that he was making progress, and kept going.

Our dependence on grades frustrates the educational progress of many otherwise willing students. They take easy courses where they are confident they can do well, rather than risk lowering their grade point average by taking the course that will actually challenge them to grow intellectually. In some cases, teachers even enable the process by giving “consolation” grades rather than risking damaging the fragile self-esteem of students — but everyone, even the students, realizes that they didn’t actually earn the report. We’ve created a schizoid educational system, where even though we know that recorded grades at best inadequately reflect a student’s real accomplishments, and, at worst, distort them, we still base academic advancement and even financial rewards on those abstractions for the sake of convenience. The result is that students pursue grades, rather than education.

Real education requires discipline and serious reflection, but it also requires taking risks, making mistakes, and learning from those mistakes. I would venture that making mistakes and recovering from them is not merely a normal part of learning, but an essential of classical Christian education. We do our students an enormous disservice by making them afraid to fail to “get it right” the first time. We teach them to back down, rather than to buckle down and tackle a new topic with gumption.

Gravity is an uncompromising and unforgiving teacher. Lose your balance, and you will fall. But every child learns to walk, sooner or later, despite many tumbles along the way. We expect toddlers to fall, and we try to minimize the damage by removing sharp edges and putting down carpets. But we let them fall: how else will they learn to recognize imbalance and practice the motor skills to correct it? We teach them that such tumbles should not be a reason to give up learning to walk; we laugh and encourage them to get up and try again. Ultimately, every healthy child learns to walk, and we really don’t care how many tumbles they took, or how long it took. Parents may report the accomplishment with glee to friends and grandparents, but when was the last time anyone asked how old you were when you learned to walk? The important thing is that you didn’t give up: you chose not to fail, you are walking now, and that gives you the ability to do things you wouldn’t otherwise be able to do as easily.

The phrase “failure is not an option” comes from the movie Apollo 13. The script writers put it in the mouth of Gene Kranz, the NASA flight director at the time. He never actually said those words, but they reflected a firm conviction evidenced by Mission Control that the team would not consider failure among the possible outcomes of their efforts. They could not choose to fail if none of the other options worked — failure was simply not on the list. Of course, failure was still a possibility, but it wasn’t a choice. Their goal was to find a solution that would bring the astronauts home safely, and if none of the proposed options worked, to propose something else that might, and keep working until they succeeded.

Our goal as Christian parents is to educate our children to know God and His creation better, to love all the people He has created, and to serve Him by using the talents He has given them to show His love in that world. To accomplish that, our children need to grow intellectually and spiritually. They need to tackle many subjects, push the limits, and be willing to reveal their ignorance by asking questions. If we are doing an effective job of classical education, we will teach them how to read so closely and carefully that they recognize when things don’t make sense, and be eager to find out why.

Questioning the material won’t be an indication of students’ inability to figure it out for themselves, but a witness to their deep engagement with the content of the text, whether it is making sense of a Latin translation exercise, following a geometrical proof to its conclusion, imagining the ramifications of relativity theory, or understanding how the concept of nature influences the behavior of Hawthorne’s characters. When failure is not an option, we understand that students have committed to stay the course, even when they make slow progress by some arbitrary standard, or have to take a detour to pick up necessary skills. Students are freed to make the mistakes they need to make to learn, grow, and ultimately succeed without the prejudice of failed expectations, and we are free to recognize the true achievements in their education, whether or not that is reflected by their current grade level or GPA.