Archive for the ‘Science’ Category

Unprecedented?

Saturday, July 11th, 2020

I have to date remained silent here about the COVID-19 pandemic, because for the most part I haven’t had anything constructive to add to the discussion, and because I thought that our parents and students would probably prefer to read about something else. I also try, when possible, to discuss things that will still be of interest three or even ten years from now, and to focus largely on issues of education as we practice it. 

Still, COVID-19 has obviously become a consuming focus for many—understandably, given the extent of the problem—and what should be managed in the most intelligent way possible according to principles of epidemiology and sane public policy has become a political football that people are using as further grounds to revile each other. I’m not interested in joining that game. Knaves and cynical opportunists will have their day, and there’s probably not much to do that will stop them—at least nothing that works any better than just ignoring them.

But there is one piece of the public discourse on the subject that has shown up more and more frequently, and here it actually does wander into a domain where I have something to add. The adjective that has surfaced most commonly in public discussions about the COVID-19 epidemic with all its social and political consequences is “unprecedented”. The disease, we are told by some, is unprecedented in its scope; others lament that it’s having unprecedented consequences both medically and economically. The public response, according to others, is similarly unprecedented: for some that’s an argument that it is also unwarranted; for others, that’s merely a sign that it’s appropriately commensurate with the scope of the unprecedented problem; for still others, it’s a sign that it’s staggeringly inadequate.

As an historian I’m somewhat used to the reckless way in which the past is routinely ignored or (worse) subverted, according to the inclination of the speaker, in the service of this agenda or that. I’ve lost track of the number of people who have told me why Rome fell as a way of making a contemporary political point. But at some point one needs to raise an objection: seriously—unprecedented? As Inigo Montoya says in The Princess Bride, “You keep using that word. I do not think it means what you think it means.” To say that anything is unprecedented requires it to be contextualized in history—not just the last few years’ worth, either.

In some sense, of course, every happening in history, no matter how trivial, is unprecedented—at least if history is not strictly cyclical, as the Stoics believed it was. I’m not a Stoic on that issue or many others. So, no: this exact thing has indeed never happened before. But on that calculation, if I swat a mosquito, that’s unprecedented, too, because I’ve never swatted that particular mosquito before. This falls into Douglas Adams’ useful category of “True, but unhelpful.” Usually people use the word to denote something of larger scope, and they mean that whatever they are talking about is fundamentally different in kind or magnitude from anything that has happened before. But how different is COVID-19, really?

The COVID-19 pandemic is not unprecedented in its etiology. Viruses happen. We even know more or less how they happen. One does not have to posit a diabolical lab full of evil gene-splicers to account for it. Coronaviruses are not new, and many others have apparently come and gone throughout human history, before we even had the capacity to detect them or name them. Some of them have been fairly innocuous, some not. Every time a new one pops up, it’s a roll of the dice—but it’s not our hand that’s rolling them. Sure: investing in some kind of conspiracy theory to explain it is (in its odd way) comforting and exciting. It’s comforting because it suggests that we have a lot more control over things than we really do. It’s exciting, because it gives us a villain we can blame. Blame is a top-dollar commodity in today’s political climate, and it drives more and more of the decisions being made at the highest levels. Ascertaining the validity of the blame comes in a distant second to feeling a jolt of righteous indignation. The reality is both less exciting and somewhat bleaker: we don’t have nearly as much control as we’d like to believe. These things happen and will continue to happen without our agency or design. Viruses are fragments of genetic material that have apparently broken away from larger organic systems, and from there they are capable of almost infinite, if whimsical, mutation. They’re loose cannons: that’s their nature. That’s all. Dangerous, indisputably. Malicious? Not really.

The COVID-19 pandemic is not unprecedented in its scope or lethality. Epidemics and plagues have killed vast numbers of people over wide areas throughout history. A few years ago, National Geographic offered a portrait of the world’s most prolific killer. It was not a mass murderer, or even a tyrant. It was the flea, and the microbial load it carried. From 1348 through about 1352, the Black Death visited Europe with a ferocity that probably was unprecedented at the time. Because records from the period are sketchy, it’s hard to come up with an exact count, but the best estimates are that it killed approximately a third of the population of Europe, all within that little three-to-four-year period. The disease continued to revisit Europe approximately every twenty years for some centuries to come, each time especially killing people of childbearing age, with demographic consequences far exceeding anything a sheer count of losses would suggest. In some areas whole cities were wiped out, and the death toll across Europe and Asia together may have run as high as two hundred million, though the extent of the destruction in parts of Asia has never been fully ascertained. Smallpox, in the last century of its activity (1877-1977), killed approximately half a billion people. The 1918 Spanish influenza epidemic killed possibly as many as a hundred million. Wikipedia lists over a hundred similar catastrophes caused by infectious diseases of one sort or another, each of which had a death toll of more than a thousand; it lists a number of others where the count cannot even be approximately ascertained.

Nor is the COVID-19 pandemic unprecedented in its level of social upheaval. The Black Death radically changed the social, cultural, economic, and even the religious configuration of Europe almost beyond recognition. After Columbus, Native American tribes were exposed to Old World disease agents to which they had no immunities. Many groups were reduced to less than a tenth of their former numbers. Considering these to be instances of genocide is, I think, to ascribe far more intentionality to the situation than it deserves (though there seem to have been some instances where it was intended), but the outcome was indifferent to the intent. The Spanish Influenza of 1918, coming as it did on the heels of World War I, sent a world culture that was already off balance into a deeper spiral. It required steep curbs on social activity to check its spread. Houses of worship were closed then too. Other public gatherings were forbidden. Theaters were closed. Even that was not really unprecedented, though: theaters had been closed in Elizabethan London during several of the recurrent visitations of the bubonic plague. The plot of Romeo and Juliet is colored by a quarantine. Boccaccio’s Decameron is a collection of tales that a group of people told to amuse themselves while in isolation, and Chaucer’s somewhat derivative Canterbury Tales are told by pilgrims heading for the shrine of St. Thomas à Becket to thank the saint for his aid while they were sick. People have long known that extraordinary steps need to be taken, at least temporarily, in order to save lives during periods of contagion. It’s inconvenient, it’s costly, and it’s annoying. It’s not a hoax, and it’s not tyrannical. It’s not novel.

So no, in most ways, neither the appearance of COVID-19 nor our responses to it are really unprecedented. I say this in no way to minimize the suffering of those afflicted with the disease, or those suffering from the restrictions put in place to curb its spread. Nor do I mean to trivialize the efforts of those battling its social, medical, or economic consequences: some of them are positively heroic. But claiming that this is all unprecedented looks like an attempt to exempt ourselves from the actual flow of history, and to excuse ourselves from the very reasonable need to consult the history of such events in order to learn what we can from them—for there are, in fact, things to be learned.

It is perhaps unsurprising that people responded to the plagues and calamities of the past, then as now, primarily out of fear. Fear is one of the most powerful of human motivators, but it is seldom a wise counselor. There have been conspiracy theories before too: during the Black Death, for example, some concluded that the disease was due to witchcraft, and so they set out to kill cats, on the ground that they were witches’ familiars. The result, of course, was that rats and their fleas—the actual vectors for the disease—were able to breed and spread disease all the more freely. Others sold miracle cures to credulous (and fearful) populations; these of course accomplished nothing but heightening the level of fear and desperation.

There were also people who were brave and self-sacrificing, who cared for others in these trying times. In 1665, the village of Eyam in Derbyshire quarantined itself when the plague broke out there. The villagers knew what they could expect, and they were not mistaken: a great many of them perished (by some estimates a third or more), but their decision saved thousands of lives in neighboring villages. Fr. Damien De Veuster ministered to the lepers on Molokai before succumbing to the disease himself: he remains an icon of charity and noble devotion and is a patron saint of Hawaii.

The human race has confronted crisis situations involving infectious diseases, and the decisions they require, before. They are not easy, and sometimes they call for self-sacrifice. There is sober consolation to be wrung from the fact that we are still here, and that we still, as part of our God-given nature, have the capacity to make such decisions—both the ones that protect us and those sacrificial decisions we make to save others. We will not get through the ordeal without loss and cost, but humanity has gotten through before, and it will again. We are not entirely without resources, but neither are we wholly in control. We need to learn from what we have at our disposal, marshal our resources wisely and well, and trust in God for the rest.

Mr. Spock, Pseudo-scientist

Wednesday, April 15th, 2020

I’m one of those aging folks who still remember the original run of Star Trek (no colon, no The Original Series or any other kind of elaboration — just Star Trek). It was a groundbreaking show, and whether you like it or not (there are plenty of reasons to do both), it held out a positive vision for the future, and sketched a societal ethos that was not entirely acquisitive, and not even as secular and materialistic as later outings in the Star Trek franchise. The officers of the Enterprise were not latter-day conquistadors. They were genuine explorers, with a Prime Directive to help them avoid destroying too many other nascent cultures. (Yes, I know: they violated it very frequently, but that was part of the point of the story. Sometimes there was even a good reason for doing so.)

It also offered the nerds among us a point of contact. Sure, Captain Kirk was kind of a cowboy hero, galloping into situations with fists swinging and phasers blazing, and, more often than not, reducing complex situations to polar binaries and then referring them either to fisticuffs or to an outpouring of excruciatingly impassioned rhetoric. Dr. McCoy, on the other hand, was the splenetic physician, constantly kvetching about everything he couldn’t fix, and blaming people who were trying to work the problem for not being sensitive enough to be as ineffectual as he was. But Mr. Spock (usually the object of McCoy’s invective) was different. He was consummately cool, and he relied upon what he called Logic (I’m sure it had a capital “L” in his lexicon) for all his decision-making. He was the science officer on the Enterprise, and also the first officer in the command structure. Most of the more technically savvy kids aspired to be like him.

It was an article of faith that whatever conclusions Spock reached were, because he was relying on Logic, logical. They were the right answer, too, unless this week’s episode was explicitly making a concession to the value of feelings over logic (which happened occasionally, but not often enough to be really off-putting), and they could be validated by science and reason. You can’t argue with facts. People who try are doomed to failure, and their attempt is at best a distraction, and often worse. 

Up to that point, I am more or less on board, though I was always kind of on the periphery of the nerd cluster, myself. I suspected then (as I still do) that there are things that logic (with an upper-case or a lower-case L) or mathematics cannot really address. Certainly not everything is even quantifiable. But it was the concept of significant digits that ultimately demolished, for me, Mr. Spock’s credibility as a science officer. When faced with command decisions, he usually did reasonably well, but when pontificating on mathematics, he really did rather badly. (Arguably he was exactly as bad at it as some of the writers of the series. Small wonder: see the Sherlock Holmes Law, which I’ve discussed here previously.)

The concept of significant digits (or figures) is really a simple one, though its exact specifications involve some fussy details. Basically it means that you can’t make your information more accurate merely by performing arithmetic on it. (It’s more formally explained here on Wikipedia.) By combining a number of things that you know only approximately and doing some calculations on them, you’re not going to get a more accurate answer: you’re going to get a less accurate one. The uncertainty of each of those terms or factors will increase the uncertainty of the whole.
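
Here is a minimal sketch of the idea in Python (my illustration, not part of the original discussion; the measurements, and the worst-case rule that relative uncertainties add under multiplication, are assumptions made for the demonstration):

    # Two measurements, each known only approximately.
    a, da = 4.2, 0.1        # 4.2 +/- 0.1
    b, db = 7.3, 0.1        # 7.3 +/- 0.1

    # Arithmetic cannot sharpen what we know: for a product, the
    # relative uncertainties accumulate (worst case: they add).
    product = a * b
    uncertainty = product * (da / a + db / b)

    print(f"{product:.4f} +/- {uncertainty:.4f}")        # 30.6600 +/- 1.1500
    print(f"about {product:.0f} +/- {uncertainty:.0f}")  # about 31 +/- 1

The extra digits in the first line of output are pure noise: all the inputs entitle us to say is “about 31, give or take 1.”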

So how does Spock, for all his putative scientific and logical prowess, lose track of this notion, essential to any kind of genuine scientific thinking? In the first-season episode “Errand of Mercy”, he has a memorable exchange with Kirk: 

Kirk: What would you say the odds are on our getting out of here?

Spock: Difficult to be precise, Captain. I should say approximately 7,824.7 to 1.

Kirk: Difficult to be precise? 7,824 to 1?

Spock: 7,824.7 to 1.

Kirk: That’s a pretty close approximation.

Spock: I endeavor to be accurate.

Kirk: You do quite well.

No, he doesn’t do quite well. He does miserably: he has assumed in his runaway calculations that the input values on which he bases this fantastically precise number are known to levels of precision that could not possibly be ascertained in the real world, especially in the middle of a military operation — even a skirmish in which all the participants and tactical elements are known in detail (as they are not here).  The concept of the “fog of war” has something to say about how even apparent certainties can quickly degrade, in the midst of battle, into fatal ignorance. Most of the statistical odds for this kind of thing couldn’t be discovered by any rational means whatever.

Precision and accuracy are not at all the same thing. Yes, you can calculate arbitrarily precise answers based on any data, however precise or imprecise the data may be. Beyond the range of its significant digits, however, this manufactured precision is worse than meaningless: it conveys fuzzy knowledge as if it were better understood than it really is. It certainly adds nothing to the accuracy of the result, and only a terrible scientist would assume that it did. Spock’s answer is more precise, therefore, than “about 8000 to one”, but it’s less accurate, because it suggests that the value is known to a much higher degree of precision than it possibly could be. Even “about 8000 to one” is probably not justifiable, given what the characters actually know. (It’s also kind of stupid, in the middle of a firefight, to give your commanding officer gratuitously complex answers to simple questions: “Exceedingly poor” would be more accurate and more useful.)
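
Mechanically, rounding a result back to the significance its inputs deserve is trivial. A little Python sketch (mine, not canon; even the single significant figure is a generous assumption) of what an honest Spock might have done:

    import math

    def round_sig(x, sig):
        # Round x to the given number of significant figures.
        return round(x, sig - 1 - int(math.floor(math.log10(abs(x)))))

    odds = 7824.7                  # five significant figures, unearned
    print(round_sig(odds, 1))      # 8000.0 -- at most what the data support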

This has not entirely escaped the fan community, of course: “How many Vulcans does it take to change a lightbulb?” is answered with, “1.000000”. This is funny, because it is, for all its pointless precision, no more accurate than “one”, and in no situations would fractional persons form a meaningful category when it comes to changing light bulbs. (Fractional persons might be valid measurements in other contexts — for example, in a cannibalistic society. Don’t think about it too hard.) 

Elsewhere in the series, too, logic is invoked as a kind of deus ex machina — something to which the writer of the episode could appeal to justify any decision Mr. Spock might come up with, irrespective of whether it was reasonable or not. Seldom (I’m inclined to say never, but I’m not going to bother to watch the whole series over again just to verify the fact) are we shown even a single step of actual logical inference.

The structures of deductive reasoning (logic’s home turf) seldom have a great deal to do with science, in any case. Mathematical procedures are typically deductive. Some philosophical disciplines, including traditional logic, are too. Physical science, however, is almost entirely inductive. In induction, one generalizes tentatively from an accumulation of data; such collections of data are seldom either definitive or complete. Refining hypotheses as new information comes to light is integral to the scientific process as it’s generally understood. The concept of significant digits is only one of those things that helps optimize our induction.

Odds are a measure of ignorance, not knowledge. They do not submit to purely deductive analysis. For determinate events, there are no odds. Something either happens or it doesn’t, Mr. Spock notwithstanding. However impossibly remote it might have seemed yesterday, the meteorite that actually landed in your back yard containing a message from the Great Pumpkin written in Old Church Slavonic now has a probability of 100% if it actually happened. If it didn’t, its probability is zero. There are no valid degrees between the two.

Am I bashing Star Trek at this point? Well, maybe a little. I think they had an opportunity to teach an important concept, and they blew it. It would have been really refreshing (and arguably much more realistic) to have Spock occasionally say, “Captain, why are you asking me this? You know as well as I do that we can’t really know that, because we have almost no data,” or “Well, I can compute an answer of 28.63725, but it has a margin of error in the thousands, so it’s not worth relying upon.” Obviously quiet data-gathering is not the stuff of edge-of-the-seat television. I get that. But it’s what the situation really would require. (Spock, to his credit, often says, “It’s like nothing we’ve ever seen before,” but that’s usually just prior to his reaching another unsubstantiated conclusion about it.)

I do think, however, that the Star Trek promotion of science as an oracular fount of uncontested truth — a myth that few real scientists believe, but a whole lot of others (including certain scientistic pundits one could name) do believe — is actively pernicious. It oversells and undercuts the legitimate prerogatives of science, and in the long run undermines our confidence in what it actually can do well. There are many things in this world that we don’t know. Some of the things we do know are even pretty improbable.  Some very plausible constructs, on the other hand, are in fact false. I’m all in favor of doing our best to find out, and of relying on logical inference where it’s valid, but it’s not life’s deus ex machina. At best, it’s a machina ex Deo: the exercise of one — but only one — of our God-given capacities. Like most of them, it should be used responsibly, and in concert with the rest.

Reflections on Trisecting the Angle

Thursday, March 12th, 2020

I’m not a mathematician by training, but the language and (for want of a better term) the sport of geometry has always had a special appeal for me. I wasn’t a whiz at algebra in high school, but I aced geometry. As a homeschooling parent, I had a wonderful time teaching geometry to our three kids. I still find geometry intriguing.

When I was in high school, I spent hours trying to figure out how to trisect an angle with compass and straightedge. I knew that nobody had found a way to do it. As it turns out, in 1837 (before even my school days) French mathematician Pierre Wantzel proved that it was impossible for the general case (trisecting certain special angles is trivial). I’m glad I didn’t know that, though, since it gave me a certain license to hack at it anyway. Perhaps I was motivated by a sense that it would be glorious to be the first to crack this particular nut, but mostly I just wondered, “Can it be done, and if not, why not?”

Trisecting the angle is cited in Wikipedia as an example of “pseudomathematics”, and while I will happily concede that any claim to be able to do so would doubtless rely on bogus premises or operations, I nevertheless argue that wrestling with the problem honestly, within the rules of the game, is a mathematical activity as valid as any other, at least as an exercise. I tried different strategies, mostly trying to find a useful correspondence between the (simple) trisection of a straight line and the trisection of an arc. My efforts, of course, failed (that’s what “impossible” means, after all). Had they not, my own name would be celebrated in various Wikipedia articles describing how the puzzle had finally been solved. It’s not. In my defense, I hasten to point out that I never was under the impression that I had succeeded. I just wanted to try, and either to learn how to do it or to understand why it couldn’t be done.

My failed effort might, by many measures, be accounted a waste of time. But was it? I don’t think it was. Its value for me was not in the achievement but in the striving. Pushing on El Capitan isn’t going to move the mountain, either, but doing it regularly will provide a measure of isometric exercise. Similarly, confronting an impossible mental challenge can have certain benefits.

And so along the way I gained a visceral appreciation of some truths I might not have grasped as fully otherwise.

In the narrowest terms, I came to understand that the problem of trisecting the angle (either as an angle or as its corresponding arc) is fundamentally distinct from the problem of trisecting a line segment, because curvature — even in the simplest case, which is the circular — fundamentally changes the problem. One cannot treat the circumference of a circle as if it were linear, even though it is much like a line segment, having no thickness and a specific finite extension. (I surmised at the time that the irrationality of π was at least obliquely connected to this; in fact the connection lies elsewhere. Wantzel’s proof turns on the algebra of the problem: trisecting a general angle requires constructing the root of an irreducible cubic equation, and repeated square roots, which are all that compass and straightedge can supply, can never produce one.)

In the broadest terms, I came more fully to appreciate the fact that some things are intrinsically impossible, even if they are not obvious logical contradictions. You can bang away at them for as long as you like, but you’ll never solve them. This truth transcends mathematics by a long stretch, but it’s worth realizing that failing to accomplish something that you want to accomplish is not invariably a result of your personal moral, intellectual, or imaginative deficiencies. As disappointing as it may be for those who want to believe that every failure is a moral, intellectual, or imaginative one, it’s very liberating for the rest of us.

Between those obvious extremes are some more nuanced realizations. 

I came to appreciate iterative refinement as a tool. After all, even if you can’t trisect the general angle with perfect geometrical rigor, you actually can come up with an imperfect but eminently practical approximation — to whatever degree of precision you require. By iterative refinement (interpolating between the too-large and the too-small solutions), you can zero in on a value that’s demonstrably better than the last one every time. Eventually, the inaccuracy won’t matter to you any more for any practical application. I’m perfectly aware that this is no longer pure math — but it is the very essence of engineering, which has a fairly prominent and distinguished place in the world. Thinking about this also altered my appreciation of precision as a pragmatic real-world concept.
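
For the curious, here is what one such scheme of successive refinement looks like in practice. This little Python sketch (my illustration; the 60-degree angle is arbitrary) uses only an operation a compass and straightedge really can perform (bisection), and exploits the fact that 1/4 + 1/16 + 1/64 + … converges to exactly 1/3:

    def approximate_trisection(theta, steps):
        # Each pass bisects the remaining piece twice (dividing it by
        # four) and adds it to the running total, which converges on
        # theta / 3, since 1/4 + 1/16 + 1/64 + ... = 1/3.
        total, piece = 0.0, theta
        for _ in range(steps):
            piece /= 4.0
            total += piece
        return total

    theta = 60.0                    # exact trisection would be 20 degrees
    for n in (1, 2, 5, 10, 20):
        approx = approximate_trisection(theta, n)
        print(f"{n:2d} steps: {approx:.10f}  (error {abs(approx - theta / 3):.2e})")

Every pass is a construction you could actually carry out with real instruments, and each one demonstrably cuts the remaining error by a factor of four.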

A more general expression of this notion is that, while some problems never have perfect solutions, they sometimes can be practically solved in a way that’s good enough for a given purpose. That’s a liberating realization. Failure to achieve the perfect solution needn’t stop you in your tracks. It doesn’t mean you can’t get a very good one. It’s worth internalizing this basic truth. And only by wrestling with the impossible do we typically discover the limits of the possible. That in turn lets us develop strategies for practical work-arounds.

Conceptually, too, iterative refinement ultimately loops around on itself and becomes a model for thinking about such things as calculus, and the strange and wonderful fact that, with limit theory, we can (at least sometimes) achieve exact (if occasionally bizarre) values for things that we can’t measure directly. Calculus gives us the ability (figuratively speaking) to bounce a very orderly sequence of successive refinements off an infinitely remote backstop and somehow get back an answer that is not only usable but sometimes actually is perfect. This is important enough that pi itself can be defined as the limit, as the number of sides increases without bound, of the perimeter of a regular polygon inscribed in a circle of unit diameter.
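
Archimedes’ polygon procedure makes this concrete. Here is a minimal Python sketch of it (mine, not part of the original essay): starting from an inscribed hexagon and repeatedly doubling the number of sides, it creeps up on pi using nothing but square roots; no value of pi is assumed anywhere.

    import math

    n, side = 6, 1.0   # a regular hexagon inscribed in a circle of radius 1
    for _ in range(12):
        # Half the perimeter of the inscribed n-gon approaches pi from below.
        print(f"{n:6d} sides: {n * side / 2:.10f}")
        # Standard side-doubling identity: s_2n = sqrt(2 - sqrt(4 - s_n^2)).
        side = math.sqrt(2.0 - math.sqrt(4.0 - side * side))
        n *= 2

The successive values (3.0, 3.1058…, 3.1326…, and so on) form exactly the kind of orderly sequence of refinements described above, bounced off an infinitely remote backstop.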

It shows also that this is not just a problem of something being somehow too difficult to do: difficulty has little or nothing to do with intrinsic impossibility (pace the Army Corps of Engineers: they are, after all, engineers, not pure mathematicians). In fact we live in a world full of unachievable things. Irrational numbers are all around us, from pi to phi to the square root of two, and even though no amount of effort will produce a perfect rational expression of any of those values, they are not on that account any less real. You cannot compute pi to its last decimal digit, because there is no such digit, and no other rational expression can capture it either. But the proportion of circumference to diameter is always exactly pi, and the circumference of the circle is an exact distance. It’s magnificently reliable and absolutely perfect, but its perfection can never be entirely expressed in the same terms as the diameter. (We could arbitrarily designate the circumference as 1 or any other rational number; but then the diameter would be inexpressible in the same terms.)

I’m inclined to draw some theological application from that, but I’m not sure I’m competent to do so. It bears thinking on. Certainly it has at least some broad philosophical applications. The prevailing culture tends to suggest that whatever is not quantifiable and tangible is not real. There are a lot of reasons we can’t quantify such things as love or justice or truth; it’s also in the nature of number that we can’t nail down many concrete things. None of them is the less real merely because we can’t express them perfectly.

Approximation by iterative refinement is basic in dealing with the world in both its rational and its irrational dimensions. While your inability to express pi rationally is not a failure of your moral or rational fiber, you may still legitimately be required — and you will be able — to get an arbitrarily precise approximation of it. In my day, we were taught the Greek value 22/7 as a practical rational value for pi, though Archimedes (c. 287-212 BC) knew it was a bit too high (3.1428…). The Chinese mathematician Zu Chongzhi (AD 429-500) came up with 355/113, which is not precisely pi either, but it’s more than a thousand times closer to the mark (3.1415929…). The whole domain of rational approximation is fun to explore, and has analogical implications in things not bound up with numbers at all.
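
Both of those classical fractions fall out of a single mechanical process: the continued-fraction expansion of pi. A short Python sketch (mine; it takes Python’s math.pi as the reference value, which is itself an assumption of the illustration):

    import math

    x = math.pi
    a = int(x)                        # first continued-fraction term: 3
    p0, q0, p1, q1 = 1, 0, a, 1       # seeds for the convergent recurrence
    frac = x - a
    for _ in range(4):
        a = int(1.0 / frac)           # next term of the expansion
        frac = 1.0 / frac - a
        p0, p1 = p1, a * p1 + p0      # standard convergent recurrence
        q0, q1 = q1, a * q1 + q0
        print(f"{p1}/{q1} = {p1 / q1:.10f}  (error {abs(p1 / q1 - x):.2e})")

The output runs 22/7, 333/106, 355/113, 103993/33102, each convergent closer than the last, with Zu Chongzhi’s 355/113 an astonishingly good stopping point for the size of its denominator.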

So I personally don’t consider my attempts to trisect the general angle with compass and straightedge to be time wasted. It’s that way in most intellectual endeavors, really: education represents not a catalogue of facts, but a process and an exercise, in which the collateral benefits can far outweigh any immediate success or failure. Pitting yourself against reality, win or lose, you become stronger, and, one hopes, wiser. 

STEMs and Roots

Tuesday, February 2nd, 2016

Everywhere we see extravagant public handwringing about education. Something is not working. The economy seems to be the symptom that garners the most attention, and there are people across the political spectrum who want to fix it directly; but most seem to agree that education is at least an important piece of the solution. We must produce competitive workers for the twenty-first century, proclaim the banners and headlines; if we do not, the United States will become a third-world nation. We need to get education on the fast track — education that is edgy, aggressive, and technologically savvy. Whatever else it is, it must be up to date, it must be fast, and it must be modern. It must not be what we have been doing.

I’m a Latin teacher. If I were a standup comedian, that would be considered a punch line. In addition to Latin, I teach literature — much of it hundreds of years old. I ask students, improbably, to see it for what it is in itself, not just for what use they can make of it. What’s the point of that? one might ask. Things need to be made relevant to them, not the other way around, don’t they?

Being a Latin teacher, however (among other things), I have gone for a number of years now to the Summer Institute of the American Classical League, made up largely of Latin teachers across the country. One might expect them to be stubbornly resistant to these concerns — or perhaps blandly oblivious. That’s far from the case. Every year, in between the discussions of Latin and Greek literature and history, there are far more devoted to pedagogy: how to make Latin relevant to the needs of the twenty-first century, how to advance the goals of STEM education using classical languages, and how to utilize the available technology in the latest and greatest ways. What that technology does or does not do is of some interest, but the most important thing for many there is that it be new and catchy and up to date. Only that way can we hope to engage our ever-so-modern students.

The accrediting body that reviewed our curricular offerings at Scholars Online supplies a torrent of exhortation about preparing our students for twenty-first century jobs by providing them with the latest skills. It’s obvious enough that the ones they have now aren’t doing the trick, since so many people are out of work, and so many of those who are employed seem to be in dead-end positions. The way out of our social and cultural morass lies, we are told, in a focus on the STEM subjects: Science, Technology, Engineering, and Math. Providing students with job skills is the main business of education. They need to be made employable. They need to be able to become wealthy, because that’s how our society understands, recognizes, and rewards worth. We pay lip service, but little else, to other standards of value.

The Sarah D. Barder Fellowship organization to which I also belong is a branch of the Johns Hopkins University Center for Talented Youth. It’s devoted to gifted and highly gifted education. At their annual conference they continue to push for skills, chiefly in the scientific and technical areas, to make our students competitive in the emergent job market. The highly gifted ought to be highly employable and hence earn high incomes. That’s what it means, isn’t it?

The politicians of both parties have contrived to disagree about almost everything, but they seem to agree about this. In January of 2014, President Barack Obama commented, “…I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree. Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”

From the other side of the aisle, Florida Governor Rick Scott said, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”

They’re both, of course, right. The problem isn’t that they have come up with the wrong answer. It isn’t even that they’re asking the wrong question. It’s that they’re asking only one of several relevant questions. They have drawn entirely correct conclusions from their premises. A well-trained plumber with a twelfth-grade education (or less) can make more money than I ever will as a Ph.D. That has been obvious for some time now. If I needed any reminding, the last time we required a plumber’s service, the point was amply reinforced: the two of them walked away in a day with about what I make in a month. It’s true, too, that a supply of anthropologists is not, on the face of things, serving the “compelling interests” of the state of Florida (or any other state, probably). In all fairness, President Obama said that he wasn’t talking about the value of art history as such, but merely its value in the job market. All the same, that he was dealing with the job market as the chief index of an education’s value is symptomatic of our culture’s expectations about education and its understanding of what it’s for.

The politicians haven’t created the problem; but they have bought, and are now helping to articulate further, the prevalent assessment of what ends are worth pursuing, and, by sheer repetition and emphasis, crowding the others out. I’m not at all against STEM subjects, nor am I against technologically competent workers. I use and enjoy technology. I am not intimidated by it. I teach online. I’ve been using the Internet for twenty-odd years. I buy a fantastic range of products online. I programmed the chat software I use to teach Latin and Greek, using PHP, JavaScript, and mySQL. I’m a registered Apple Developer. I think every literate person should know not only some Latin and Greek, but also some algebra and geometry. I even think, when going through Thucydides’ description of how the Plataeans determined the height of the wall the Thebans had built around their city, “This would be so much easier if they just applied a little trigonometry.” Everyone should know how to program a computer. Those are all good things, and help us understand the world we’re living in, whether we use them for work or not.

But they are not all that we need to know. So before you quietly determine that what I’m offering is just irrelevant, allow me to bring some news from the past. If that sounds contradictory, bear in mind that it’s really the only kind of news there is. All we know about anything at all, we know from the past, whether recent or distant. Everything in the paper or on the radio news is already in the past. Every idea we have has been formulated based on already-accumulated evidence and already-completed ratiocination. We may think we are looking at the future, but we aren’t: we’re at most observing the trends of the recent past and hypothesizing about what the future will be like. What I have to say is news, not because it’s about late-breaking happenings, but because it seems not to be widely known. The unsettling truth is that if we understood the past better and more deeply, we might be less sanguine about trusting the apparent trends of a year or even a decade as predictors of the future. They do not define our course into the infinite future, or even necessarily the short term — be they about job creation, technical developments, or weather patterns. We are no more able to envision the global culture and economy of 2050 than the independent bookseller in 1980 could have predicted that a company named Amazon would put him out of business by 2015.

So here’s my news: if the United States becomes a third-world nation (a distinct possibility), it will not be because of a failure in our technology, or even in our technological education. It will be because, in our headlong pursuit of what glitters, we have forgotten how to differentiate value from price: we have forgotten how to be a free people. Citizenship — not merely in terms of law and government, but the whole spectrum of activities involved in evaluating and making decisions about what kind of people to be, collectively and individually — is not a STEM subject. Our ability to articulate and grasp values, and to make reasoned and well-informed decisions at the polls, in the workplace, and in our families, cannot be transmitted by a simple, repeatable process. Nor can achievement in citizenship be assessed simply, or, in the short term, accurately at all. The successes and failures of the polity as a whole, and of the citizens individually, will remain for the next generation to identify and evaluate — if we have left them tools equal to the task. Our human achievement cannot be measured by lines of code, by units of product off the assembly line, or by GNP. Our competence in the business of being human cannot be certified like competence in Java or Oracle (or, for that matter, plumbing). Even a success does not necessarily hold out much prospect of employment or material advantage, because that was never what it was about in the first place. It offers only the elusive hope that we will have spent our stock of days with meaning — measured not by our net worth when we die, but by what we have contributed when we’re alive. The questions we encounter in this arena are not new ones, but rather old ones. If we lose sight of them, however, we will have left every child behind, for technocracy can offer nothing to redirect our attention to what matters.

Is learning this material of compelling interest to the state? That depends on what you think the state is. The state as a bureaucratic organism is capable of getting along just fine with drones that don’t ask any inconvenient questions. We’re already well on the way to achieving that kind of state. Noam Chomsky, ever a firebrand and not a man with whom I invariably agree, trenchantly pointed out, “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” He’s right. If we are to become unfree people, it will be because we gave our freedom away in exchange for material security or some other ephemeral reward — an illusion of safety and welfare, and those same jobs that President Obama and Governor Scott have tacitly accepted as the chief — or perhaps the only — real objects of our educational system. Whatever lies outside that narrow band of approved material is an object of ridicule.

If the state is the people who make it up, the question is subtly but massively different. Real education may not be in the compelling interest of the state qua state, but it is in the compelling interest of the people. It’s the unique and unfathomably complex amalgam that each person forges out of personal reflection, of coming to understand one’s place in the family, in the nation, and in the world. It is not primarily practical; if our highest goal were merely to get along materially, we should eschew it altogether. The only reason to value it is the belief that there is some meaning to life beyond one’s bank balance and material comfort. I cannot prove that there is, and the vocabulary of the market has done its best to be rid of the idea. But I will cling to it while I live, because I think it’s what makes that life worthwhile.

Technical skills — job skills of any sort — are means, among others, to the well-lived life. They are even useful means in their place, and everyone should become as competent as possible. But as they are means, they are definitionally not ends in themselves. They can be mistakenly viewed as ends in themselves, and sold to the credulous as such, but the traffic is fraudulent, and it corrupts the good that is being conveyed. Wherever that sale is going on, it’s because the real ends are being quietly bought up by those with the power to keep them out of our view in their own interest.

Approximately 1900 years ago, Tacitus wrote of a sea change in another civilization that had happened not by cataclysm but through inattention to what really mattered. Describing the state of Rome at the end of the reign of Augustus, he wrote: “At home all was calm. The officials carried the old names; the younger men had been born after the victory of Actium; most even of the elder generation, during the civil wars; few indeed were left who had seen the Republic. It was thus an altered world, and of the old, unspoilt Roman character not a trace lingered.” It takes but a single generation to forget the work of ages.

But perhaps that’s an old story, and terribly out of date. I teach Latin, Greek, literature, and history, after all.

News — Spring 2015

Sunday, May 17th, 2015

National French Teachers Examination

Congratulations to Mrs. Mary Catherine Lavissière’s students Katie Cruse, Alana Ross, Micah Wittenberg, and Moriah Wittenberg! These four Scholars Online students placed with honors in the National French Test Le Grand Concours 2015. The test is offered annually by the American Association of Teachers of French to identify and recognize students achieving high proficiency in the French language.

Madame Lavissière offers courses in both French and Spanish through Scholars Online. See our Modern Languages course descriptions for more information.

Update on Summer Session Courses for 2015

We’ve added several new courses for the summer session, which runs from June 8-August 21, 2015 (individual courses may span different periods within the session, so check your course description for exact start dates). Most summer classes are chances for students to build new skills in fun but still useful ways. Click on the course name to see descriptions of class schedules and costs, and on syllabus links to see detailed course content and assignments. Enrollment must be completed by May 31 to ensure placement in the course, and payment in full is due before students can attend chat sessions. Enrollments received after May 31 may not be processed in time for students to attend the first sessions of their course.

  • Explore the many facets of J.R.R. Tolkien’s creation in Looking at Middle-earth. Discussions will focus on Tolkien’s world-building, use of language, his theology of “subcreation”, and his work as a philologist. Students are expected to have read The Hobbit and The Lord of the Rings.
  • Sample Shakespeare’s comedy, tragedy, and history plays, including Twelfth Night, As You Like It, The Taming of the Shrew, The Merchant of Venice, A Midsummer Night’s Dream, King Lear, Julius Caesar, Romeo and Juliet, and Richard II in Summer Shakespeare I. Students taking Scholars Online’s literature series, supplemented with Summer Shakespeare II and III, have the opportunity to study and discuss all of Shakespeare’s plays. [See the Full Syllabus for details.]
  • Gain practical writing skills with Molding Your Prose (based on an idea suggested to Dr. Bruce McMenomy by Mary McDermott Shideler). Learn to organize your ideas and improve your dialectic skills in Molding Your Argument. Both of these popular courses require short weekly writing exercises, with students analyzing each other’s work to learn to identify and improve their own writing.
  • Jump start your academic year Physics course with an overview of key theories and concepts in Introduction to Physics, a survey of the fundamental concepts of classical mechanics and modern physics, and gain essential analysis and problem-solving skills. Students planning to take the combined AP Physics 1 and 2 course will be able to count lab work from this course toward their AP lab requirements. [Full syllabus]
  • NEW COURSE! In The Age of Reagan, discover how the events and decisions of the Reagan administration have shaped current political, religious, economic, and environmental policies. Students opting for the media studies component of this course will also examine how movies, TV, and ads portray cultural messages (parental guide available in the full syllabus).

Failure is not an option

Monday, March 21st, 2011

When I taught my first class as a graduate assistant at UCLA, one of the students asked whether my Western Civilization section was a “Mickey Mouse” course. What he meant was, “Is this a course with a guaranteed A if I show up and do the minimal work assigned, or will I run the risk that the work I do won’t be good enough for an A?” I said no, it wasn’t a Mickey Mouse course; the history of the Western World was complex and it would take work. I would not guarantee his grade.

He didn’t show up at our next meeting, and the enrolled student printout the next week confirmed that he had dropped the class. He couldn’t risk the possibility of failure (which for him apparently meant anything less than a 4.0 GPA), and so he missed the opportunity to learn why the reforms of Diocletian changed the economy of the Roman Empire and influenced the rise of monasteries, or how the stirrup made the feudal system possible, or how the academic interests of Charlemagne led to the rise of universities and the very institution he was supposed to be part of. He chose to fail to get an education rather than fail to get an A grade.

When I taught my first chemistry course online, I was blessed with an enthusiastic bunch of brilliant students who tackled the rigorous textbook and beat it into submission — except for one student we’ll call Joe. Joe lacked the science and math background that would have made the course easier, and he had a learning disability that made reading anything, but especially any kind of formulae, a real trial.  By the middle of the fall semester, it was clear that Joe was in serious trouble. His mother discussed the possibility of dropping the course, but I thought I could teach any willing student anything, so I offered extra help. Joe and I agreed to meet an hour early before the rest of the class and work through the problematic material. When I realized the extent of Joe’s problems, we backed up and started over. He continued to attend the regular online sessions with the rest of the class, but I excused him from keeping up with the homework and quiz assignments while we tried to establish a foundation he could really build on.

At the end of the academic year, the rest of the class had finished the twenty-two chapters of the text. Joe had finished four.

But he really knew those four chapters. He could answer any question and do any problem from them, with more facility and conviction than some of the students who had seemingly breezed through the material months earlier. I reluctantly entered a failing grade on his report, but wrote his parents that I didn’t think the grade reflected Joe’s real accomplishments that year. He had managed to learn some chemistry. What’s more, I’d had a salutary lesson in perseverance.

What I hadn’t realized was that my lesson wasn’t over. Joe didn’t accept his failing grade as the final word. Three years later, out of the blue, I got a letter from Joe’s mother. Her son, fired with the discovery that he could actually learn chemistry given enough time, and the realization that he actually liked chemistry, had gotten a job working part time so that he could pay a chemistry student from the local college to tutor him. He applied the same dogged determination he had shown in our extra morning sessions to his self-study and, with the help of his tutor, slogged his way through the rest of our text. Kindly note that no one was giving him a grade for this work. But when he was done with his self-study, he took a community college chemistry course and passed it.

Like so many things, failure is a matter of perception. In his own estimation, Joe hadn’t failed — despite the F on his transcript. Many students would have given up early in the semester — certainly before the last withdrawal date — rather than risk a failing grade. For Joe, the grade was not a locked gate blocking his passage; it was merely a measure of how far he still had to go. The educational reality was that he was four chapters further along than he had been at the beginning of the year. He took heart from the fact that he was making progress, and kept going.

Our dependence on grades frustrates the educational progress of many otherwise willing students. They take easy courses where they are confident they can do well, rather than risk lowering their grade point average by taking the course that will actually challenge them to grow intellectually. In some cases, teachers even enable the process by giving “consolation” grades rather than risking damaging the fragile self-esteem of students — but everyone, even the students, realizes that they didn’t actually earn the report. We’ve created a schizoid educational system, where even though we know that recorded grades at best inadequately reflect a student’s real accomplishments, and, at worst, distort them, we still base academic advancement and even financial rewards on those abstractions for the sake of convenience. The result is that students pursue grades, rather than education.

Real education requires discipline and serious reflection, but it also requires taking risks, making mistakes, and learning from those mistakes. I would venture that making mistakes and recovering from them is not merely a normal part of learning, but an essential of classical Christian education. We do our students an enormous disservice by making them afraid to fail to “get it right” the first time. We teach them to back down, rather than to buckle down and tackle a new topic with gumption.

Gravity is an uncompromising and unforgiving teacher. Lose your balance, and you will fall. But every child learns to walk, sooner or later, despite many tumbles along the way. We expect toddlers to fall, and we try to minimize the damage by removing sharp edges and putting down carpets. But we let them fall: how else will they learn to recognize imbalance and practice the motor skills to correct it? We teach them that such tumbles are no reason to give up learning to walk; we laugh and encourage them to get up and try again. Ultimately, every healthy child learns to walk, and we really don’t care how many tumbles they took, or how long it took. Parents may report the accomplishment with glee to friends and grandparents, but when was the last time anyone asked how old you were when you learned to walk? The important thing is that you didn’t give up: you chose not to fail, you are walking now, and that gives you the ability to do things you wouldn’t otherwise be able to do as easily.

The phrase “failure is not an option” comes from the movie Apollo 13. The script writers put it in the mouth of Gene Kranz, the lead NASA flight director for the mission. He never actually said those words, but they reflected a firm conviction evidenced by Mission Control that the team would not consider failure among the possible outcomes of their efforts. They could not choose to fail if none of the other options worked — failure was simply not on the list. Of course, failure was still a possibility, but it wasn’t a choice. Their goal was to find a solution that would bring the astronauts home safely, and if none of the proposed options worked, to propose something else that might, and keep working until they succeeded.

Our goal as Christian parents is to educate our children to know God and His creation better, to love all the people He has created, and to serve Him by using the talents He has given them to show His love in that world. To accomplish that, our children need to grow intellectually and spiritually. They need to tackle many subjects, push the limits, and be willing to reveal their ignorance by asking questions. If we are doing an effective job of classical education, we will teach them how to read so closely and carefully that they recognize when things don’t make sense, and be eager to find out why.

Questioning the material won’t be an indication of students’ inability to figure it out for themselves, but a witness to their deep engagement with the content of the text, whether it is making sense of a Latin translation exercise, following a geometrical proof to conclusion, imagining the ramifications of relativity theory, or understanding how the concept of nature influences the behavior of Hawthorne’s characters. When failure is not an option, we understand that students have committed to stay the course, even when they make slow progress by some arbitrary standard, or have to take a detour to pick up necessary skills. Students are freed to make the mistakes they need to make to learn, grow, and ultimately succeed without the prejudice of failed expectations, and we are free to recognize the true achievements in their education, whether or not that is reflected by their current grade level or GPA.

Making Sense and Finding Meaning

Sunday, October 4th, 2009

My intermediate and advanced Greek and Latin classes are largely translation-based. There’s a lot of discussion among Latin teachers about whether that’s a good approach, but much of the dispute is, I think, mired in terminological ambiguity, and at least some of the objections to translation classes don’t entirely apply to what we’re doing. What I’m looking for is emphatically not a mechanical translation according to rigid and externally objective rules (“Render the subjunctive with ‘might’,” “Translate the imperfect with the English progressive,” or the like), but rather the expression of the student’s understanding of each sentence as a whole, in the context of the larger discussion or narrative.

We aren’t there to produce publishable translations: that’s an entirely different game, with different rules. For us, translations are the means to an end: the understanding is the real point of the process, but it’s hard to measure understanding unless it’s expressed somehow. The translations, therefore, are like a scaffold surrounding the real edifice — engagement with the text as a whole: its words, its sounds, and its various levels of meaning. That engagement is hard to pin down, but it allows us to make a genuine human connection with the mind of the author. A detached mechanical “translation”, though, is like a scaffold built around nothing, or the new clothes without the emperor. Even were artificial intelligence able to advance to the point that a computer could produce a flawless rendition of a text into another language, it still would not have achieved what is essential. It will not have understood. It will not have savored the words, grasped the concepts, combined them into larger ideas, applied them to new contexts, or come to a meeting of the minds with the author.

This is not always an easy concept for students to grasp. Some fret about getting exactly the right wording (as if there were such a thing) while seeming less concerned with understanding the essential meaning. At the beginning of the year, I usually have a few students who make the (to me bizarre) claim, “I translated this sentence, but I don’t understand it.” My response is always some variation on, “If you didn’t make sense of it, you didn’t really translate it.”

We talk about making sense of the passage, but even that turn of phrase may be one of the little arrogances of the modern world. The prevalent modern paradigm suggests that the world is without order or meaning unless we impose it; Christianity, however, presupposes a world informed by its Creator with a consistent meaning that we only occasionally perceive. For us, it would probably be more accurate, and certainly more modest, to talk of finding or discovering the sense in the passage.

Whether we call it “making sense” or “finding sense”, though, it is not just the stuff of language classes. Every discipline is ultimately about finding meaning in and through its subject matter. In language and literature we look for the informing thought behind speech and writing. In history we look to understand the whole complex relationship of individuals and groups through time, with their ideas, movements, and circumstances, and what it all meant for them and what it means for us today. The sciences look for the rationale in the order of the physical universe, mathematics for the meaning of pure number and proportion, and philosophy for the sense of sense itself. Each discipline has its own methods, its own vocabulary, and its own techniques. Each has its own equivalent of the translation exercise, too — something we do not really for its own sake, but to verify that the student has grasped something larger that cannot be measured directly. But behind those differences of method and process, all of them are about engaging with the underlying meaning. All real learning is. (In that respect it differs from training, which is not really about learning as such, but about acquiring known skills. Both learning and training are essential to a well-rounded human being, but they shouldn’t be confused with one another.)

From a secular point of view, this must seem a rather granular exercise with many dead ends. That each thing should have its own limited kind of meaning, unrelated to every other, seems at least aesthetically unsatisfying; it offers us Eliot’s Waste Land: a “heap of broken images”, pointing nowhere. Language is fractured, and our first great gift of articulate speech clogs and becomes useless.

Our faith offers us something else: we were given the power to name creation — to refer to one thing through or with another — as a way of proclaiming the truth of God, surely, but also, I think, as a kind of hint as to how we should view the whole world. Everything, viewed properly, can be a sign. As Paul says in Romans, “For since the creation of the world God’s invisible qualities—his eternal power and divine nature—have been clearly seen, being understood from what has been made, so that men are without excuse” (1:20, NIV); Alanus ab Insulis (1128-1202) wrote, about 1100 years later, “Every creature in the world is like a picture to us, or a mirror.” Signification itself is transformed and transfigured, sub specie aeternitatis, from a set of chaotic references into a kind of tree, in which the signifiers converge, both attesting the unitary truth of the Lord and endowing every created thing in its turn with a holy function.

The Right Answer

Sunday, April 26th, 2009

I start Natural Science I each year with the question “What is science?” The result is generally a lively debate in which students start by giving me one-sentence answers.

“Science is the study of nature,” Joe says.

“What do you mean by ‘nature’?” I ask.

There is consternation, silence, and eventually another attempt.  “Nature is the created world,” Joe says.

“What do you mean by ‘world’?”

It’s the reverse of the game every three-year-old plays with his parents.  Every answer Joe puts forward merely raises more questions about the meaning and limits of the terms he uses.  He keeps trying to find the easy-to-memorize, one-sentence answer that I’ll accept.  I keep pushing back, trying to get him to think about what he is actually trying to say.  Over the next ninety minutes, we’ll push into what objects really are susceptible to scientific method, what scientific method is, how we know what we know, what proof is, and why we should bother to “study” any of it.

At the end of the session, I ask my students to write down their definitions of science based on our discussion, and post them to our bulletin board so the other students can see and comment on them.  Occasionally Joe will post two sentences where he only offered one in class, in tacit recognition of some aspect he had not originally considered.  Once in a while, I get a longer, more thoughtful paragraph that actually tries to summarize the competing trends of thought in our discussion.  But inevitably, just as we are running out of time, Joe asks me what the “right” answer is.

I’m always stumped on how to deal with this.  We just spent ninety minutes exploring the most obvious factors that feed into the human race’s attempt to understand the universe in which it exists.  We’ve barely scratched the surface of all the aspects of this complex endeavor, and if Joe had actually looked at the course syllabus, he would realize that we are going to spend two years looking at how people have done whatever it is they thought of as “science” for the last 3000 years — and that’s just in the Western tradition.  (We don’t get into Chinese or Japanese or Indian efforts at all — there just isn’t enough time!)  Why should Joe have any illusions that I can state a right answer that everyone would accept, let alone one that is complete, in the remaining thirty seconds of chat available?

I recognize that Joe’s anxiety has a real basis.  He wants to know my answer, since I will be the one to give him credit for his bulletin board posting, and he wants to get a passing grade, preferably a high one, which is, after all, what others will look at and use to evaluate him when he attempts to go on to college and then on to a good job.  He is so concerned with the grading aspect of our educational process that he doesn’t stop to think about whether the string of words I might give him is really a correct definition of science; he doesn’t realize that he has no way yet to determine its correctness; and he never questions whether I should have the authority to dictate that definition.

This is only one symptom of a common but mistaken approach to education, where the grade is the goal, not the heart and soul of the subject.  In his book Celebration of Discipline, Richard Foster addresses the root of the problem (which affects much more than education in our society) when he says, “Superficiality is the curse of our age.  The desperate need today is not for a greater number of intelligent people, or gifted people, but for deep people.”

We live in a rapidly changing world, where cultures are clashing, resources for survival seem limited, competition is endless, and compassion in short supply.  We make technical advances, but we have no way to answer the question of whether we should do something simply because we can do it.  We need people who can think critically, as did the philosophers, scientists, and poets who produced the classic works that we have turned back to for centuries.  We need people who can think charitably and humbly about the effects of their actions on others, as Christ would have us do.  But how do we become those people?  How do we help our children develop into the discerning, charitable human beings we want and need them to become, if they are to serve as Christ’s ministers to a broken world?

One of the exercises any good teacher uses to help students recognize and move beyond superficiality is simply to force the student to reconsider every term in the answer.  Joe’s first attempt at an answer to an open question like “What is science?” is usually a superficial response.  It may not be factually wrong, but it is almost always incomplete, resting on assumptions and generalities Joe hasn’t examined and may not even consciously recognize he has made.

Suppose that we look again at Joe’s answer, “Science is the study of nature”.  “Science” is what we are trying to define, so we’ll leave it alone for the moment, but what *do* we mean by nature, really?  Is it only the created universe?  Are angels part of nature?  Are triangles?  Are people part of nature?  Is poetry?  Are the thought processes and electrical signals and nerve cells that produce the poetry (at least in mechanical terms) nature, and subject, by our first attempt at a definition, to scientific investigation?

When we start to examine our assumptions, we realize that a more precise definition of our abstract concept is intimately tied up with the application of that definition to specific cases.  How do we do whatever it is that we define as scientific investigation?  Is the only valid scientific method experimentation done in a lab with controls, under repeatable conditions, with machines objectively measuring factors?  Some scientists — especially physicists — would say yes.  Can field observations and the notes of a naturalist be a legitimate form of scientific investigation?  Most biologists would defend them as one.  Can we really determine how hot the photosphere of the sun must be based solely on spectral line measurements of the light it emits, or must we somehow put a thermometer into the plasma itself?  Astronomers recognize the futility of direct measurement, and would defend their deductions as accurate based on analogies to phenomena we can observe directly.  Can we use computer models of weather patterns to predict the path of a hurricane?  The federal government evacuates thousands of people on the basis of a mathematical abstraction of a storm, treating it as a legitimate application of science — amid huge controversies over the costs of the evacuation and the accuracy of the predictions.  How much of this is “the study of nature”?

When the “right” answer depends on whom you ask, you are really forced to start thinking of good reasons for any answer you propose.  Everyone has seminal moments, watershed experiences they can point to and say, “that experience taught me this”.  I can think of two from my freshman year at college which shaped the way I teach… maybe in another blog entry I’ll tell you about the second.  But the first one addresses our “right answer” problem directly.

Every freshman at Scripps College took a humanities course on the ancient world.  It met four times a week, and the entire staff rotated responsibility for giving lectures on literature, historical events, religion, philosophy, art, architecture, science, and technology.  A crucial component was the additional seminar meeting, once a week for two hours in the evening, where we studied one work or concept in depth for eight weeks.  At the end of the first semester, the two professors who had presented the literature lectures agreed to do a joint lecture and clear up a discrepancy we had noticed in their separate presentations on The Iliad.  We sighed with relief: we were finally going to get the right answer.  Dr. Palmer and Dr. Howe stood on the stage in the lecture hall, but they didn’t present the common interpretation we’d hoped for, something snappy, easy to remember, and safe to use in our exam.  Instead, they presented, and debated heatedly, two completely opposing interpretations of The Iliad.  At the end of their presentation, there was no “winner” with the right interpretation.  Then they announced that the only literature question on the exam would be the one they had just debated, and that one or the other of them would grade our exams, but we wouldn’t know which.  We couldn’t write the answer we knew the teacher thought was correct.  The only thing we could do was champion some position as best we could — Howe’s, Palmer’s, or our own, if we disagreed with both of them.

And that, of course, was the point.  They weren’t at all interested in our simply flinging back at them some “right” answer, some clipping from one of their lectures.  That would only demonstrate that we could take notes and do rote memorization.  What they really wanted was for us to think deeply about a work of literature that has touched millions of people for nearly three thousand years, reach a conclusion, and make a point — our own point, not theirs — succinctly, based on solid reasoning and factually accurate references.

We should seek no less for our students.